Algorithmic radicalization

Algorithmic radicalization (or the radicalization pipeline) is the concept that recommendation algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop radicalized, extremist political views. Within echo chamber channels, consumers become more polarized as the algorithm caters to their media preferences and confirms their existing beliefs.

Companies' response
In an internal memo from August 2019, Facebook admitted that "the mechanics of our platforms are not neutral", concluding that optimizing for engagement is necessary to maximize profit. In pursuit of engagement, the platform's algorithms have learned that hate, misinformation, and divisive politics are instrumental to app activity. As referenced in the memo, "The more incendiary the material, the more it keeps users engaged, the more it is boosted by the algorithm." A 2018 study found that false rumors spread faster and wider than true information: falsehoods were 70% more likely to be retweeted on Twitter than the truth and reached their first 1,500 people six times faster, an effect more pronounced for political news than for other categories.
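The memo's claim can be illustrated with a toy model: a ranker that scores posts purely on predicted engagement will surface incendiary material over accurate material whenever the former engages more, because veracity plays no role in the objective. The sketch below is hypothetical; the Post fields, weights, and example data are invented for illustration and do not reflect Facebook's actual system.

# A minimal, hypothetical sketch of engagement-optimized feed ranking.
# This is NOT Facebook's real algorithm; the fields, weights, and
# example posts are invented. It shows how ranking purely by predicted
# engagement can boost incendiary content, since accuracy never
# enters the score.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # hypothetical model output in [0, 1]
    predicted_shares: float   # hypothetical model output in [0, 1]
    is_accurate: bool         # known to fact-checkers, ignored by the ranker

def engagement_score(post: Post) -> float:
    # The score depends only on predicted engagement signals.
    return 0.6 * post.predicted_clicks + 0.4 * post.predicted_shares

posts = [
    Post("Measured policy analysis", 0.20, 0.10, is_accurate=True),
    Post("Outrage-bait conspiracy claim", 0.70, 0.80, is_accurate=False),
]

# The false but incendiary post ranks first because the objective
# never considers accuracy, only engagement.
feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(f"{engagement_score(post):.2f}  accurate={post.is_accurate}  {post.text}")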

Self-radicalization
The U.S. Department of Justice defines lone-wolf (self-radicalized) terrorism as "someone who acts alone in a terrorist attack without the help or encouragement of a government or a terrorist organization". Lone-wolf terrorism has been on the rise and has been linked to algorithmic radicalization via social media. In online echo chambers, viewpoints typically seen as radical are accepted and quickly adopted by other extremists, and forums, group chats, and social media further reinforce those beliefs.

The Social Dilemma
"The Social Dilemma" was a 2020 docudrama about how algorithms behind social media enables addiction, while possessing abilities to manipulate people's views, emotions, and behavior to spread disinformation. The film repeatedly uses buzz words such as 'echo chambers' and 'fake news' to prove psychological manipulation on social media, therefore leading to political manipulation. In the film, we follow Ben as he falls deeper into a social media addiction as the algorithm found that his social media page has a 62.3% chance of long-term engagement. This leads into more videos on the recommended feed for Ben and he eventually becomes more immersed into propaganda and conspiracy theories, becoming more polarized with each video.

Section 230
Section 230 of the Communications Decency Act of 1996 states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider". Section 230 shields platforms from liability for third-party content, such as illegal activity by a user. However, this protection also reduces a company's incentive to remove harmful content or misinformation. This loophole has allowed social media companies to maximize profit by pushing radical content without legal risk.