User:Princepine11/sandbox

"How Normal Am I" is an interactive experience created by Tijmen Schep as part of the European Union's Sherpa Research Project to let people explore how artificial intelligence and algorithms perceive them. The tool, released to the public in October 2020, uses algorithms to estimate the user's age, gender, BMI, life expectancy, and overall normalcy score, and also assigns a "beauty score" based on algorithms that rank attractiveness. The algorithms used to produce the "beauty score" can be found on GitHub here and here. The algorithms used to predict age, gender, and facial expression are from FaceApiJS. The BMI prediction algorithm was created by Tijmen Schep specifically for this project.

TikTok's algorithm has been regarded as especially effective, but many wondered what exact programming made the app so good at guessing the content a user wanted. TikTok released a statement on June 18, 2020 explaining how it recommends videos to the "For You" page. In it, TikTok stated that recommendations draw on "account user preferences as expressed through interactions with the app, like posting a comment or following an account". The statement went into further detail, explaining that "user interactions", "video information", and "device and account settings" are also taken into account when curating a user's personalized For You page. These factors are then weighed "based on their value to a user": for example, a video that is watched several times will carry more weight than a video watched for only a few seconds. TikTok has come under fire for its filter bubble, as well as for "shadowbanning" users who create or promote content about the experiences of marginalized groups.
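The weighting described in TikTok's statement can be sketched as a simple weighted sum over engagement signals. This is an illustrative sketch only: TikTok has not published its actual model, and every signal name and weight below is a hypothetical stand-in for the factors named in the June 2020 statement.

```python
def score_video(signals, weights):
    """Combine engagement signals into one ranking score (hypothetical)."""
    return sum(weights[name] * value for name, value in signals.items())

# Hypothetical weights: a full rewatch counts far more than a brief view,
# mirroring the statement's claim that signals are weighed "based on
# their value to a user".
weights = {
    "rewatches": 5.0,       # times the video was watched again
    "watch_fraction": 2.0,  # share of the video actually watched
    "liked": 1.5,
    "commented": 1.0,
    "followed_creator": 1.0,
}

# A video rewatched several times outranks one watched only briefly.
rewatched = {"rewatches": 3, "watch_fraction": 1.0, "liked": 1,
             "commented": 0, "followed_creator": 0}
skimmed = {"rewatches": 0, "watch_fraction": 0.1, "liked": 0,
           "commented": 0, "followed_creator": 0}

print(score_video(rewatched, weights))  # 18.5
print(score_video(skimmed, weights))    # 0.2
```

Under these assumed weights, the heavily rewatched video scores far higher than the briefly viewed one, so it would be recommended first.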
The app released a statement in October 2020 detailing how its algorithm promotes content to users, and addressed the concept of a filter bubble directly, acknowledging that there "is a risk of presenting an increasingly homogenous stream of videos" but assuring users that it takes the issue seriously.

Facial recognition systems have been criticized for upholding, and judging people against, a binary assumption of gender. When classifying the faces of cisgender individuals as male or female, these systems are often very accurate; however, they are typically confused by, or unable to determine, the gender identity of transgender and non-binary people. These systems uphold gender norms to such a degree that, when shown a photo of a cisgender man with long hair, algorithms were split between the gender norm of men having short hair and the subject's masculine facial features. This accidental misgendering can be very harmful for people who do not identify with their sex assigned at birth, because it disregards and invalidates their gender identity. It is also harmful for people who do not subscribe to traditional gender norms, because it invalidates their gender expression, regardless of their gender identity.