Facebook Amps Up Its Crackdown on QAnon

Facebook, facing criticism that it hasn’t done enough to curb a fast-growing, fringe conspiracy movement, said on Tuesday that it would remove any group, page or Instagram account that openly identified with QAnon.

The change drastically hardens earlier policies outlined by the social media company. In August, Facebook unveiled its first attempt to limit the spread of QAnon, barring QAnon groups that called for violence.

But hundreds of other QAnon groups and pages continued to operate and grow on the platform, and the effort was considered a disappointment in many circles, including among Facebook employees.

On Tuesday, Facebook acknowledged that its previous policies had not gone far enough in addressing the popularity of the far-right conspiracy movement.

“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” Facebook said in a public post.

Since Facebook’s initial ban, QAnon followers had found ways to evade the rules. The movement dates back to October 2017, but it has experienced its largest increase in followers since the start of the pandemic.

At its core, QAnon is a sprawling movement that believes, falsely, that the world is run by a cabal of Satan-worshiping pedophiles who are plotting against President Trump. It has branched into a number of other conspiracies, including casting doubt on medical advice for dealing with the pandemic, like wearing masks.

On Facebook, QAnon has attracted new followers by adopting tactics such as renaming groups and toning down the messaging to make it seem less jarring. A campaign by QAnon to co-opt health and wellness groups as well as discussions about child safety drew thousands of new people into its conspiracies in recent months.

Researchers who study the group said that QAnon’s shifting tactics had initially helped it skirt Facebook’s new rules, but that the policies announced on Tuesday were likely to tighten the screws on the conspiracists.

“Facebook has been instrumental in the growth of QAnon. I’m surprised it has taken the company this long to take this type of action,” said Travis View, a host of “QAnon Anonymous,” a podcast that seeks to explain the movement.

Since QAnon has become a key source of misinformation on a number of topics, Mr. View said, the action announced by Facebook is likely to have a far-reaching impact in “slowing the spread of misinformation on Facebook and more generally across social media.”

Nearly 100 Facebook groups and pages, some with tens of thousands of followers, have already been affected by the changes, according to a survey conducted by The New York Times using CrowdTangle, a Facebook-owned analytics tool.

Facebook said that it had begun to enforce the changes on Tuesday, and that it would take a more proactive approach to finding and removing QAnon content, rather than relying on people to report content.

source: nytimes.com