User:Morgan such/sandbox

Topics relating to gender/algorithmic bias:

Person or thing?

In the last two years, multiple studies have been conducted to highlight gender bias and unintentional stereotyping in machine learning. In the first of these studies, an experiment tested the ability of AI systems to classify certain nouns as a person, place, or thing. When given a variety of simple sentences containing a male or female name performing some kind of action, the systems misclassified female names at a higher rate than male names when attempting to label them as persons. These errors included a wide range of mistakes, such as classifying a female name as a city or a thing instead of a person. The results of this experiment demonstrate the gender bias present in AI entity recognition, a widely used form of artificial intelligence in today's world (Mehrabi).
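The kind of probe described above can be sketched as a small evaluation harness: templated sentences containing male and female names are passed to an entity classifier, and the misclassification rate (name not tagged as a person) is compared per gender. The `classify_entity` function below is a hypothetical stand-in for a real named-entity recognition system, hard-coded to mimic the reported failure mode for illustration only; the name lists and template are likewise illustrative.

```python
# Hypothetical sketch of a per-gender NER error-rate comparison.
MALE_NAMES = ["John", "Michael", "David"]
FEMALE_NAMES = ["Mary", "Paris", "April"]
TEMPLATE = "{name} went to the market yesterday."

def classify_entity(sentence, name):
    # Stand-in for a real NER system. It mimics the failure mode the
    # study reports: names that double as places or dates are confused
    # with non-person entity types.
    if name == "Paris":
        return "CITY"
    if name == "April":
        return "DATE"
    return "PERSON"

def error_rate(names):
    # Fraction of names NOT classified as a person.
    errors = sum(
        classify_entity(TEMPLATE.format(name=n), n) != "PERSON" for n in names
    )
    return errors / len(names)

male_err = error_rate(MALE_NAMES)
female_err = error_rate(FEMALE_NAMES)
print(f"male error rate:   {male_err:.2f}")
print(f"female error rate: {female_err:.2f}")
```

With the stubbed classifier, the female-name error rate comes out higher than the male-name rate, mirroring the pattern the study observed in real systems.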

The second study on this topic used a similar concept but measured AI stereotyping of common workforce roles rather than entity recognition. In this experiment, the AI was given a more complex sentence containing two nouns, typically a male and a female name, each mapped to an occupation in a modern industry. Machine bias was tested by observing which occupation each name was mapped to given the grammar and context of the sentence. The results were as expected, displaying gender bias in a seemingly neutral AI environment: female names were more commonly mapped to stereotypically female or family-oriented positions such as "caretaker", while their male counterparts were mapped to more dominant industry positions with less family-oriented connotations. When a name swap was introduced to counteract this bias, the experiment ran without a measurable gender bias (Zhao).
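The name-swap idea above can be illustrated with a small consistency check: given a resolver that links each name in a sentence to one of two occupations, swapping the names should change which occupation a name receives if the resolver follows the sentence's grammar. The `map_occupation` function here is a hypothetical, deliberately biased stand-in (it assigns stereotyped occupations by name alone), and the template and names are illustrative, not taken from the study.

```python
# Hypothetical sketch of a name-swap consistency test for occupation mapping.
TEMPLATE = "{a} is a {occ_a} and {b} is a {occ_b}."

def map_occupation(sentence, name):
    # Stand-in for a biased resolver: it ignores the sentence's grammar
    # and assigns a stereotyped occupation based on the name alone.
    stereotypes = {"Mary": "caretaker", "John": "engineer"}
    return stereotypes.get(name, "unknown")

def swap_test(a, b, occ_a, occ_b):
    original = TEMPLATE.format(a=a, occ_a=occ_a, b=b, occ_b=occ_b)
    swapped = TEMPLATE.format(a=b, occ_a=occ_a, b=a, occ_b=occ_b)
    # An unbiased resolver should map `a` to occ_a in the original
    # sentence and to occ_b in the swapped one; a biased resolver
    # returns the same stereotyped answer both times.
    return (map_occupation(original, a), map_occupation(swapped, a))

before, after = swap_test("Mary", "John", "engineer", "caretaker")
print(before, after)
```

Because the stub resolves by name rather than by grammar, it returns the same occupation before and after the swap, which is exactly the signature of bias the study's name-swap test is designed to expose.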

Google Translate

There has also been recent controversy surrounding the bias present in modern machine translation across various languages. Google Translate is a primary example of this phenomenon, but machine bias has been replicated across all of the six most popular machine translation services. In these experiments, various non-gendered nouns were presented in the context of a simple sentence and translated into a multitude of languages. The results of these trials showed that a gender bias was present on every translation platform: stereotypically male nouns and occupations were translated with a male identity, while stereotypically female nouns and occupations gained a feminine gender identity. These results were consistent across every language, even those with a gender-neutral grammar (Prates). When this bias was counteracted by associating common feminine adjectives with the given noun in the sample sentence, the translator converted the sentence with the appropriate gender pronoun (Stanovsky).
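A probe of this kind can be sketched as follows: gender-neutral source sentences of the form "ő is a <occupation>" (where "ő" is Hungarian's gender-neutral third-person pronoun) are sent through a translator, and the gendered pronoun in the English output is recorded per occupation. The `translate` function below is a hypothetical stand-in for a real service such as Google Translate, hard-coded to reproduce the stereotyped behaviour the studies report; the occupation list is illustrative.

```python
# Hypothetical sketch of a translation gender-bias probe.
OCCUPATIONS = ["doctor", "nurse", "engineer", "caretaker"]

def translate(neutral_sentence, occupation):
    # Stand-in for a real machine translation service. It reproduces
    # the stereotyped behaviour the studies describe: certain
    # occupations come back with "she", the rest with "he".
    stereotypically_female = {"nurse", "caretaker"}
    pronoun = "she" if occupation in stereotypically_female else "he"
    return f"{pronoun} is a {occupation}"

results = {}
for occ in OCCUPATIONS:
    # "ő" is gender-neutral, so an unbiased translator has no basis
    # for choosing "he" over "she" from the source sentence alone.
    source = f"ő is a {occ}"
    translated = translate(source, occ)
    results[occ] = translated.split()[0]

print(results)
```

A real study would replace the stub with calls to each translation service and tally how often each occupation receives a male versus female pronoun.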

Wikipedia as a gender biased platform

Recently, Wikipedia itself has been evaluated as a gender-biased platform. An analysis of Wikipedia contributors found that the site's main editors are predominantly white men. Taking this further, it was concluded that male contributors on the site outnumber female contributors by a ratio of about 10:1, with female editors representing only about 16% of the total contributor population (Graells-Garrido). The results of experiments analyzing this gap indicate a male monopoly over the information displayed on Wikipedia, which in turn affects how women are perceived in terms of global informational representation. The analysis of these trials also produced statistics on the type of language used in articles about men versus women on the site. The results displayed a trend linking men to a "null gender", a societal bias that assumes male as the baseline or standard gender in most social situations. This trend affects female editors by classifying them into separate, strictly female-identifying categories rather than including them in the default categorizations for authors, thereby affecting representation and site neutrality (Wagner).