Algorithmic radicalization

Algorithmic radicalization remains a controversial phenomenon, as it is often not in the financial interest of social media companies to remove echo chamber channels.

Social media platforms learn the interests and likes of the user and modify the user's feed accordingly to keep them engaged and scrolling; the resulting personalized information environment is known as a filter bubble.
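The feedback loop described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not any platform's actual system: the action weights, function names, and data shapes are all assumptions made for illustration.

```python
from collections import defaultdict

# Hypothetical engagement weights -- assumed values for illustration only.
ENGAGEMENT_WEIGHTS = {"view": 1.0, "like": 3.0, "comment": 4.0, "share": 5.0}

def update_interest_profile(profile, topic, action):
    """Accumulate a per-topic interest score from a user's interactions."""
    profile[topic] += ENGAGEMENT_WEIGHTS.get(action, 0.0)

def rank_feed(profile, candidates):
    """Order candidate posts by accumulated interest in their topic.

    Repeatedly surfacing the highest-scoring topics is the mechanism
    that narrows a feed into a filter bubble: content the user engages
    with most is shown more, which invites further engagement.
    """
    return sorted(candidates, key=lambda post: profile[post["topic"]],
                  reverse=True)

profile = defaultdict(float)
update_interest_profile(profile, "politics", "like")    # politics: 3.0
update_interest_profile(profile, "politics", "share")   # politics: 8.0
update_interest_profile(profile, "sports", "view")      # sports: 1.0

feed = rank_feed(profile, [{"id": 1, "topic": "sports"},
                           {"id": 2, "topic": "politics"}])
print([post["id"] for post in feed])  # politics-tagged post ranks first: [2, 1]
```

Because the ranking rewards whatever the user already engages with, even this trivial model exhibits the self-reinforcing dynamic the article describes.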

Around March 19, 2024, a New York state judge ruled that Reddit and YouTube must face lawsuits accusing them of playing a role in the radicalization of the perpetrator of the 2022 Buffalo, New York, mass shooting.

In an August 2019 internal memo leaked in 2021, Facebook admitted that "the mechanics of our platforms are not neutral",[16][17] concluding that optimizing for engagement is necessary to maximize profits.

Al-Qaeda and similar extremist groups have been linked to the use of YouTube for recruitment videos and engagement with international media outlets.

One study found that YouTube's algorithmic recommendations of extremist content are driven in part by the presence of radical keywords in a video's title.

Multiple studies, however, have found little to no evidence that YouTube's algorithms steer users towards far-right content unless they are already engaged with it.

Since TikTok's inception, the app has been scrutinized for misinformation and hate speech, as such content typically generates more engagement and is therefore amplified by the algorithm.

Various extremist groups, including jihadist organizations, have utilized TikTok to disseminate propaganda, recruit followers, and incite violence.

In 2022, TikTok's head of US Security stated that "81,518,334 videos were removed globally between April – June for violating our Community Guidelines or Terms of Service" as part of efforts to cut back on hate speech, harassment, and misinformation.

For example, in early 2023, Austrian authorities thwarted a plot against an LGBTQ+ pride parade that involved two teenagers and a 20-year-old who were inspired by jihadist content on TikTok.

Another case involved the 2024 arrest of several teenagers in Vienna, Austria, who were planning to carry out a terrorist attack at a Taylor Swift concert.

The investigation revealed that some of the suspects had been radicalized online, with TikTok being one of the platforms used to disseminate extremist content that influenced their beliefs and actions.

House Democrats Anna Eshoo, Frank Pallone Jr., Mike Doyle, and Jan Schakowsky introduced the "Justice Against Malicious Algorithms Act" in October 2021 as H.R.

An infographic from the United States Department of Homeland Security's "If You See Something, Say Something" campaign, a national initiative to raise awareness of homegrown terrorism and terrorism-related crime.