
Book, English, 432 pages, format (W × H): 158 mm × 236 mm, weight: 868 g

Browne / Cave / Drage

Feminist AI

Critical Perspectives on Algorithms, Data, and Intelligent Machines
Publication year: 2024
ISBN: 978-0-19-288989-8
Publisher: Oxford University Press



Feminist AI: Critical Perspectives on Algorithms, Data and Intelligent Machines is the first volume to bring together leading feminist thinkers from across the disciplines to explore the impact of artificial intelligence (AI) and related data-driven technologies on human society.

Recent years have seen both an explosion in AI systems and a corresponding rise in important critical analyses of these technologies. Central to these analyses has been feminist scholarship, which calls upon the AI sector to be accountable for designing and deploying AI in ways that further, rather than undermine, the pursuit of social justice.
This book aims to be a touchstone text for AI researchers concerned with the social impact of their systems, as well as theorists, students and educators in the field of gender and technology. It demonstrates the importance of an intersectional understanding of the risks and benefits of AI, approaching feminism as a political project that aims to challenge various interlocking forms of injustice, social inequality and structural relations of power.

Feminist AI showcases the vital contributions of feminist scholarship to thinking about AI, data, and intelligent machines as well as laying the groundwork for future feminist scholarship on AI. It brings together scholars from a variety of disciplinary backgrounds, from computer science, software engineering, and medical sciences to political theory, anthropology, and literature. It provides an entry point for scholars of AI, science and technology into the diversity of feminist approaches to AI, and creates a rich dialogue between scholars and practitioners of AI to examine the powerful congruences and generative tensions between different feminist approaches to new and emerging technologies. It features original and essential works specially selected to span multiple generations of practitioners and scholars.

The contributors are also attuned to industry-level conversations about the risks and possibilities that frame the drive to adopt AI. This collection reflects the increasingly blurred divide between the academy, industry, and corporate research groups, and brings interdisciplinary feminist insights together with postcolonial studies, disability theory, and critical race studies to confront ageism, racism, sexism, ableism, and class-based oppressions in AI.
This is an open access title available under the terms of a CC BY-NC-ND 4.0 International licence. It is free to read on the Oxford Academic platform and offered as a free PDF download from OUP and selected open access locations.




Professor Jude Browne is the Head of the Department of Politics and International Studies at the University of Cambridge and the Frankopan Director of the Centre for Gender Studies. She is the Principal Investigator (PI) on the Gender and Technology project (funded by Christina Gaw). She has published multiple books and is the inaugural winner of the Aaron Rapport Prize, as well as a University of Cambridge Pilkington Prize winner.

Dr Stephen Cave is Director of the Leverhulme Centre for the Future of Intelligence at the University of Cambridge. His research focuses on the philosophy and ethics of technology, particularly AI, robotics, and life extension. He is the author of Immortality (Crown, 2012), a New Scientist book of the year, and Should We Want to Live Forever? (Routledge, 2023), and co-editor of AI Narratives (Oxford University Press, 2020) and Imagining AI (Oxford University Press, 2023). He writes widely about philosophy, technology, and society, including for the Guardian and the Atlantic. He also advises governments around the world, and has served as a British diplomat.

Dr Eleanor Drage is a Christina Gaw Post-doctoral Research Associate at the Centre for Gender Studies, a Research Associate of the Leverhulme Centre for the Future of Intelligence, and a Research Associate at Darwin College, Cambridge. She examines how anti-racist and anti-sexist critical theory can be implemented at industry level to develop ethical and socially transformative technological products. For her work, she was recognised as one of Women in AI Ethics' Brilliant Women in AI Ethics for 2022. She has also spoken and written widely about gender, feminism, and technology for outlets such as the UN, NatWest, and IAI TV.

Dr Kerry Mackereth is a Christina Gaw Post-doctoral Research Associate in Gender and Technology at the University of Cambridge Centre for Gender Studies, and a Research Associate at the Leverhulme Centre for the Future of Intelligence (LCFI). She is a former Girdlers' Scholar and Gates Cambridge Scholar, and was recognised as one of Women in AI Ethics' Brilliant Women in AI Ethics for 2022. As of October 2022, she joins LCFI as a postdoctoral researcher on anti-Asian racism and AI, and will be a Visiting Fellow at UCL's Institute of Advanced Studies in 2023.
