Facebook content management controversies

Videos hosted by Facebook are given higher priority and prominence within the platform and its user experience (including direct embedding within the News Feed and pages), disadvantaging posts that link to the original external source.

[1][2] In August 2015, Facebook announced a video-matching technology aimed at identifying reposted videos, and also stated its intention to improve its procedures so that infringing content could be removed more quickly.

Having previously refused to delete such clips under the guideline that users have the right to depict the "world in which we live", Facebook changed its stance in May, announcing that it would remove reported videos while evaluating its policy.

[7] Two days later, Facebook removed a video of a beheading following "worldwide outrage", and while acknowledging its commitment to allowing people to upload gory material for the purpose of condemnation, it also stated that it would be further strengthening its enforcement to prevent glorification.

[13] In 2017, a video was uploaded to Facebook showing Libyan National Army (LNA) special forces commander Mahmoud al-Werfalli shooting dead three captured fighters.

Erin Saltman, Facebook's policy manager for counterterrorism in Europe, the Middle East and Africa, told BBC Arabic, "Sometimes there are very conflicting narratives of whether or not the victim is a terrorist, or whether it's a civilian, over who's committing that act. We cannot be the pure arbiters of truth."

The groups promoted dramatic weight loss programs, shared extreme diet tips, and posted pictures of emaciated girls under "Thinspiration" headlines.

[35] In Italy in 2009, the discovery of pro-mafia groups, one of them claiming Bernardo Provenzano's sainthood, caused an alert in the country[36][37][38] and prompted the government to rapidly issue a law that would force Internet service providers to block access to entire websites that refused to remove illegal content.

The segment also included an exposé of a 2006 accident, in which an eighteen-year-old student out for a drive fatally crashed her father's car into a highway pylon; trolls emailed her grieving family the leaked pictures of her mutilated corpse.

[52] Following a campaign that involved the participation of Women, Action and the Media, the Everyday Sexism Project and the activist Soraya Chemaly, who were among 100 advocacy groups, Facebook agreed to update its policy on hate speech.

[54] In June 2015, the UK National Society for the Prevention of Cruelty to Children raised concerns about Facebook's apparent refusal, when asked, to remove controversial video material which allegedly showed a baby in emotional distress.

[60] Senator Risa Hontiveros responded to the incidents with the proposal of a law that would impose "stiff penalties" on such group members, stating that "These people have no right to enjoy our internet freedom only to abuse our women and children."

[61] According to the study commissioned by Meta and carried out by Business for Social Responsibility (BSR), Facebook and Instagram's policies during Israeli attacks on the Gaza Strip in 2021 harmed the fundamental human rights of Palestinians.

BSR's report is yet another indictment of the company's ability to police its global public square and to balance freedom of expression against the potential for harm in a tense international context.

[64] In March 2019, Facebook subsidiary Instagram declined to remove an antisemitic image posted by right-wing conspiracy theorist Alex Jones, saying that it did not violate its community standards.

In Facebook's Dublin, Ireland headquarters, six individuals were determined to be "high priority" victims of the error, after the company concluded that their profiles were likely viewed by potential terrorists in groups such as ISIS, Hezbollah and the Kurdistan Workers' Party.

[81] At a conference called Techonomy, Mark Zuckerberg stated with regard to Donald Trump, "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news".

[90] Sri Lankan telecommunications minister Harin Fernando stated that Facebook had been too slow in removing content and banning users who were using its platforms to facilitate violence during the riots.

[91] In April 2019, during the aftermath of the Easter bombings, the Sri Lankan government blocked access to Facebook, Instagram and WhatsApp in an effort to stop the spread of misinformation that could lead to further violence.

Myanmar's relatively recent democratic transition did not provide the country with substantial time to form professional and reliable media outlets free from government intervention.

[103] On 6 December 2021, approximately one hundred Rohingya refugees launched a $150 billion lawsuit against Facebook, alleging that it had not done enough to prevent the proliferation of anti-Rohingya hate speech because it prioritized engagement.

[116] On 23 January 2025, Rohingya human rights activist and genocide survivor Maung Sawyeddollah filed a whistleblower lawsuit against Meta before the U.S. Securities and Exchange Commission (SEC).

[125][126][127] The Epoch Times, an anti-Chinese Communist Party (CCP) newspaper affiliated with Falun Gong, has spread misinformation related to the COVID-19 pandemic in print and via social media including Facebook and YouTube.

[128] In April 2020, rumors circulated on Facebook alleging that the US Government had "just discovered and arrested" Charles Lieber, chair of the Chemistry and Chemical Biology Department at Harvard University, for "manufacturing and selling" the novel coronavirus (COVID-19) to China.

[citation needed] In August 2021, Facebook said that an article raising concerns about potentially fatal effects of a COVID-19 vaccine was the top-performing link in the United States between January and March 2021, and that another site publishing COVID-19 misinformation was among its top 20 visited pages.

[134] In February 2022, Facebook was accused by the Bureau of Investigative Journalism and The Observer of letting activists incite ethnic massacres in the Tigray War by spreading hate and misinformation.

[135] Following the report, a lawsuit against Meta was filed in December 2022 in the High Court of Kenya by the son of a Tigrayan academic murdered in November 2021 after receiving racist attacks on the platform.

Among the changes, Facebook will discontinue internal fact-checking in favor of a "Community Notes" system similar to that of X, in which contextual addenda may be added to posts with agreement from other users.

In addition, Meta relaxed certain moderation practices to focus more on severe and illegal content. These include relaxations of certain "Hateful Content" policies to comply with "mainstream discourse", such as allowing the targeting of others by "protected characteristics" in connection with the spread of COVID-19, and allowing "allegations of mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism and homosexuality and common non-serious usage of words like 'weird'".

Human Rights Campaign (HRC) president Kelley Robinson stated that "while we understand the difficulties in enforcing content moderation, we have grave concerns that the changes announced by Meta will put the LGBTQ+ community in danger both online and off."

An example of a Facebook post censored due to an unspecified conflict with "Community Standards"
Error message generated by Facebook when a user attempts, in a private chat, to share a link to a website censored under Community Standards. Messages containing certain links will not be delivered to the recipient.