Algorithmic Justice League

Olay Decode the Bias Campaign
In September 2021, Olay collaborated with AJL and ORCAA to audit Olay's Skin Advisor System for potential harmful bias against women of color. The audit revealed that the system's skin-age estimates were more accurate for lighter skin tones and for users aged 30–39; that its "best zone" and "improvement zone" recommendations were less personalized across all users; that it relied on the Fitzpatrick Skin Type and ITA skin-classification scales, which lean toward lighter skin tones; that it was designed primarily for women rather than other gender identities; and that its data consent practices could be improved. In response, Olay took steps toward inclusion by sending 1,000 girls to Black Girls CODE camp to encourage them to pursue STEM careers.

CRASH Project
In July 2020, AJL launched the Community Reporting of Algorithmic System Harms (CRASH) Project. The project originated in 2019, when Joy Buolamwini and Camille François met at the Bellagio Center Residency Program, and aimed to find individuals interested in helping to improve AI systems and reduce racial bias. In June 2021, AJL announced that it had compiled its findings from bug bounty research into a brief and submitted it to a new platform then in development. AJL states that this upcoming platform will allow people to report algorithmic harms, which in turn will help companies develop improved AI systems.

Coded Bias Film
Coded Bias is a documentary film directed by Shalini Kantayya, released on January 26, 2020. It follows Joy Buolamwini's discovery that facial recognition systems recognized white faces far more reliably than faces like her own. According to the PBS website, the film "exposed prejudice and threats to civil liberty in facial recognition algorithms and artificial intelligence."

Voicing Erasure project
In March 2020, the Algorithmic Justice League released a project titled Voicing Erasure, a piece performed by various women and informed by research led by Allison Koenecke. Automated speech recognition (ASR) systems are sophisticated machines that identify and convert spoken language into text. The project addresses racial bias in speech recognition algorithms by examining ASR systems developed by Amazon, Apple, Google, IBM, and Microsoft. In the underlying study, researchers had these five commercial ASR systems transcribe interviews conducted with 42 white individuals and 73 Black individuals in order to measure racial disparities in performance. The results showed racial disparities across all five commercial systems, with an average word error rate of 0.35 for Black speakers compared to 0.19 for white speakers. This suggests these machine learning systems rely on databases dominated by English as spoken by white Americans.
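The word error rate (WER) figures cited above are conventionally computed as the word-level edit distance between a reference transcript and the system's hypothesis, divided by the number of words in the reference. A minimal sketch of that standard metric (not the study's actual evaluation code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance
    (substitutions + insertions + deletions) divided by
    the number of words in the reference transcript."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit-distance table.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# A perfect transcription scores 0.0; one substitution plus
# one deletion over a four-word reference scores 0.5.
print(wer("the quick brown fox", "the quick brown fox"))  # 0.0
print(wer("the quick brown fox", "the quack fox"))        # 0.5
```

On this scale, the study's reported averages mean roughly one in three words was transcribed incorrectly for Black speakers, versus about one in five for white speakers.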

EPIC Champions of Freedom Awards Innovating Privacy
The National Press Club in Washington, D.C., hosted the EPIC Champions of Freedom Awards on November 3, 2021, where Joy Buolamwini of AJL was honored. She was named a 'Privacy Champion' by the previous year's awardee, Brandi Collins-Dexter, and ended her acceptance speech with a poem dedicated to a Brooklyn tenant.