Twitter Takedown Targets QAnon Accounts

OAKLAND, Calif. — Twitter said Tuesday evening that it had removed thousands of accounts that spread messages about the conspiracy theories known as QAnon, saying their messages could lead to harm and violated Twitter policy.

Twitter said it would also block trends related to the loose network of QAnon conspiracy theories from appearing in its trending topics and search, and would not allow users to post links affiliated with the theories on its platform.

It was the first time that a social media service took sweeping action to remove content affiliated with QAnon, which has become increasingly popular on Twitter, Facebook and YouTube.

Facebook is preparing to take similar steps to limit the reach of QAnon content on its platform, said two Facebook employees with knowledge of the plans, who spoke on the condition of anonymity. The company has been coordinating with Twitter and other social media companies and plans to make an announcement next month, the employees said. Facebook declined to comment.

The QAnon theories stem from an anonymous person or group of people who use the name “Q” and claim to have access to government secrets that reveal a plot against President Trump and his supporters. That supposedly classified information was initially posted on message boards before spreading to mainstream internet platforms and has led to significant online harassment as well as physical violence.

“QAnon is not conventional political discourse,” said Alice Marwick, an associate professor of communication at the University of North Carolina at Chapel Hill. “It’s a conspiracy theory that makes wild claims and baseless accusations about political actors and innocent people alike.”

Over several weeks, Twitter has removed 7,000 accounts that posted QAnon material, a company spokeswoman said. The accounts had been increasingly active and had either taken part in coordinated harassment campaigns on Twitter or tried to evade a previous suspension by setting up new accounts after an old one was deleted.

An additional 150,000 accounts will be hidden from trends and search on Twitter, the spokeswoman added. The takedowns were reported earlier by NBC News.

“These accounts amplify and enable networked harassment on a level that’s clearly against the Twitter terms of service,” Ms. Marwick said. “But this won’t stop QAnon from operating. It’s multiplatform and really good at adapting as media ecosystems change.”

In May, Facebook removed a cluster of five pages, 20 Facebook accounts and six groups affiliated with QAnon, saying they had violated its policy against coordinated inauthentic behavior.

After years of taking a hands-off approach to content moderation, Twitter has acted more aggressively in recent months to stem the flood of abuse and harassment on its platform.

Since it became a venue for disinformation during the 2016 U.S. presidential election, Twitter has cracked down on content that spreads false information or encourages harassment. In February, it introduced a ban against manipulated photos and videos, a popular method of tricking viewers and spreading disinformation. And in May, it began labeling some of Mr. Trump’s tweets, saying they contained false information or promoted violence.

Twitter’s aggressive enforcement actions have put it on a collision course with Mr. Trump, who has said that Twitter is unfairly silencing conservative voices and has encouraged regulators to crack down on the service. While the QAnon ban was applauded in many circles, some conservatives said Twitter’s move was further evidence that the company unevenly enforced its rules against Mr. Trump’s supporters.

The political attention has added to Twitter’s headaches. A wide-ranging hack last week compromised the Twitter accounts of Democratic political figures, including former Vice President Joseph R. Biden Jr. and former President Barack Obama. Twitter also faces concerns that advertisers are tightening spending during the coronavirus pandemic, and is expected to report its second-quarter earnings this week.

More than two years after QAnon emerged from the troll-infested corners of the internet, supporters of the movement, which the F.B.I. has labeled a potential domestic terrorism threat, are trickling into the mainstream of the Republican Party. Precisely how many candidates, mostly Republicans, are running under the QAnon banner is unclear. Some estimates put the number at a dozen, and few are expected to win in November.

A number of the candidates have sought to spread a core tenet of the QAnon conspiracy: that Mr. Trump ran for office to save Americans from a so-called deep state filled with child-abusing, devil-worshiping bureaucrats. According to QAnon, the president’s enemies are backed by prominent Democrats who, in some tellings, extract hormones from children’s blood.

The president has repeatedly retweeted QAnon supporters and cheered on candidates who openly support the conspiracy theory, like Marjorie Taylor Greene, a Republican House candidate in Georgia.

“A big winner. Congratulations!” Mr. Trump tweeted after Ms. Greene, whose ads have been banned by Facebook for violating its terms of service, placed first in her primary.

Some QAnon followers have diverted their attention from political causes. Recent QAnon campaigns on Twitter have focused on Wayfair, a furniture and décor company, and Chrissy Teigen, a model and cookbook author who recently said she had blocked one million accounts affiliated with QAnon. She called on Twitter to take action after she became a target of harassment.

QAnon theories share similar themes with Pizzagate, a conspiracy theory popularized ahead of the 2016 presidential election that advanced the baseless notion that the Democratic nominee, Hillary Clinton, and party elites were running a child sex-trafficking ring out of a Washington pizzeria. In December 2016, a vigilante gunman showed up at the restaurant with an assault rifle and opened fire into a closet. Social media companies fear they could be linked to similar incidents if they allow conspiracy theories to thrive on their platforms.

Facebook, Twitter and YouTube managed to largely suppress the Pizzagate conspiracy theory, but as the presidential election nears, it has appeared to rebound on those platforms and newer ones, like TikTok.

Reporting was contributed by Sheera Frenkel, Matthew Rosenberg, Jennifer Steinhauer and Kevin Roose.

source: nytimes.com