
History

PhotoDNA technology has evolved into a version called PhotoDNA Cloud Services, which aims to simplify implementation and reduce its cost for qualified customers and developers. PhotoDNA Cloud Services is not available to law enforcement. Unlike the original PhotoDNA technology, PhotoDNA Cloud Services is limited to images at this time. PhotoDNA Cloud Ser

In 2016, Farid also reported that PhotoDNA had removed over 10 million CSAM images without any disputed removals.

How PhotoDNA is used

A researcher visited the National Center for Missing and Exploited Children (NCMEC) and shadowed an analyst to see PhotoDNA in action. The analyst uploaded a suspected CSAM image to PhotoDNA, and the software returned images with similar hashes from a database. Once the search had retrieved several hundred photos, the analyst marked any images that were clearly (to the analyst's eye) not a match or not part of the set. This feedback helps PhotoDNA's algorithm recognize in future queries that those photos are not accurate results, a process known as supervised learning, or training the algorithm. Finally, the analyst examined the results to determine whether further investigation was needed. For example, during this visit, the analyst's query produced a series of photos in which a child appeared to grow older across images. The analyst explained that this raised concerns because the child was likely still being abused.
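The matching described above relies on a robust ("perceptual") hash rather than a cryptographic one: two versions of the same image produce similar, not identical, hashes, so candidates are retrieved by hash distance. PhotoDNA's actual hash function is proprietary; the sketch below is only a minimal illustration of the general idea, using a simple average hash and Hamming-distance comparison, with all function names and the synthetic image invented for this example.

```python
# Illustrative sketch only: PhotoDNA's real hash is proprietary.
# This toy "average hash" downscales a grayscale image, thresholds
# each block against the overall mean, and compares hashes by
# Hamming distance instead of exact equality.

def average_hash(pixels, size=8):
    """Toy perceptual hash of a grayscale image given as a 2D list."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size  # block dimensions for downscaling
    blocks = []
    for i in range(size):
        for j in range(size):
            block = [pixels[y][x]
                     for y in range(i * bh, (i + 1) * bh)
                     for x in range(j * bw, (j + 1) * bw)]
            blocks.append(sum(block) / len(block))
    mean = sum(blocks) / len(blocks)
    return [1 if b > mean else 0 for b in blocks]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A 16x16 synthetic "image": bright upper half, dark lower half.
img = [[200] * 16 for _ in range(8)] + [[30] * 16 for _ in range(8)]
# A slightly brightened copy, as might result from re-encoding.
altered = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(altered)
print(hamming(h1, h2))  # -> 0: hashes match despite the alteration
```

Because minor edits leave the hash nearly unchanged, a database lookup can return all stored images whose hash lies within a small distance threshold, which is why the analyst's query yields several hundred candidate photos rather than exact duplicates only.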

Challenges and Criticisms

A challenge that developers had to navigate was ensuring that PhotoDNA would not interfere with the business interests of technology companies.

Some scholars question how effective image-detection algorithms such as PhotoDNA are at detecting CSAM, owing to algorithmic bias. Specifically, some argue that the algorithmic treatment of race biases these algorithms toward detecting CSAM featuring victims with lighter skin tones over CSAM featuring victims with darker skin tones.