Kate Crawford

Kate Crawford (born 1974) is a researcher, writer, composer, producer and academic who studies the social and political implications of artificial intelligence. She is based in New York and works as a principal researcher at Microsoft Research (Social Media Collective). She is the co-founder and former director of research at the AI Now Institute at NYU, a visiting professor at the MIT Center for Civic Media, a senior fellow at the Information Law Institute at NYU, and an associate professor in the Journalism and Media Research Centre at the University of New South Wales. She is also a member of the World Economic Forum's Global Agenda Council on Data-Driven Development.

Crawford's research focuses on social change and media technologies, particularly the intersection of humans, mobile devices, and social networks. Her work examines how AI affects aspects of human life such as gender, race, and economic status. She argues that AI systems are not neutral or objective, but rather reflect and reinforce existing systems of power and inequality. She also explores the environmental and ethical impacts of AI, as well as the historical and cultural contexts of its development. She has published on cultures of technology use and the ways media histories inform the present, and has exhibited creative works in music and art at museums such as the Museum of Modern Art in New York and the Victoria and Albert Museum in London.

Background
Crawford was previously part of the Canberra electronic music duo B(if)tek (along with Nicole Skeltys) and released three albums between 1998 and 2003. Crawford co-founded the Sydney-based Deluxe Mood Recordings record label and is a member of the Clan Analogue music collective.

Crawford has written for The Sydney Morning Herald and Foreign Policy. She was a Fellow of the Centre for Policy Development, and in March 2008 she was selected as one of 1000 Australians to attend the Australia 2020 Summit in Canberra on 19–20 April 2008.

She is a member of the feminist collective Deep Lab.

In 2017 Crawford established the research institute AI Now Institute with Meredith Whittaker. It is associated with New York University Tandon School of Engineering.

In 2019 she was the inaugural holder of the AI & Justice visiting chair at École Normale Supérieure in Paris, in partnership with the Fondation Abeona.

Writing and speaking
Crawford has a PhD from the University of Sydney. In 2006 her book based on this dissertation, Adult Themes – Rewriting the Rules of Adulthood, won the individual category of the Manning Clark National Cultural Award and in 2008 she received the biennial medal for outstanding scholarship from the Australian Academy of the Humanities.

Crawford has spoken and published academic papers on such topics as social media, government regulation of media content, the interplay between gender and mobile devices, young people and sexting, and big data. She has given keynote addresses at venues such as the 2013 O'Reilly Strata Conference and the 2013 DataEDGE conference hosted by the University of California, Berkeley School of Information. She co-authored a book titled Understanding the Internet: Language, Technology, Media, and Power in 2014.

In 2021 she published Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence through Yale University Press. Her article "Artificial Intelligence Is Misreading Human Emotion" was shortlisted for The Bragg UNSW Press Prize for Science Writing in 2022.

Art projects

 * Anatomy of an AI System: a large-scale map and essay that is intended to reveal the material, human, and ecological costs of the Amazon Echo device. It was created in collaboration with artist Vladan Joler and won the Beazley Design of the Year Award in 2019. It is in the permanent collection of the Museum of Modern Art in New York and the Victoria and Albert Museum in London.
 * Training Humans: In collaboration with artist and photographer Trevor Paglen, images used to train AI systems were exhibited at the Prada Foundation, Milan in 2019. This was intended to show that AI systems are subject to bias, can reflect existing systems of power, and may reinforce inequality. The exhibition was criticised for making use of facial images without informed consent.