
A just-released study by an advocacy group has concluded that Facebook has not done enough to rein in toxic content posted by extremist groups with ties to far-right ideologies, such as the boogaloo and militia movements. Both have been implicated in posting "violence-glorifying content in the heat of the 2020 election."

Avaaz, which bills itself as a "global web movement to bring people-powered politics to decision-making everywhere," conducted an "analysis of the steps Facebook took throughout 2020" to combat hate and extremism on its platform.

It found that had Facebook been more hands-on in "adopting civil society advice and proactively detoxing its algorithm, it could have stopped 10.1 billion estimated views of content from top-performing pages that repeatedly shared misinformation over the eight months before the US elections."

Furthermore, Facebook has rolled back many of the emergency policies it put in place during the 2020 elections, with the algorithms designed to detect problematic content returning to the "status quo that allowed conspiracy movements like QAnon and Stop the Steal to flourish."

The study identified 267 problematic pages and groups, with a combined following of 32 million, that it said were posting "violence-glorifying content in the heat of the 2020 election."

Of those pages, 68.7 percent shared imagery and conspiracy theories linked to far-right movements. Many of today's far-right groups hold anti-Semitic ideologies or invoke classic anti-Semitic conspiracy theories.

Avaaz found that even with "clear violations of Facebook's policies," 118 of the 267 groups and pages remain active, with a total following of 27 million.

The group made four recommendations to "protect democracy": transparency, detoxing the algorithm, correcting the record, and reforming Section 230, the U.S. law that shields online platforms from civil liability for third-party content and for good-faith efforts to restrict material deemed "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable."

Avaaz also made six recommendations for the Biden administration to implement, including "adopting a national disinformation strategy."

The study did not examine problematic content on Facebook linked to far-left groups, or content that promotes terrorism.