Facebook said in a Tuesday press release that it “will remove any Facebook Pages, Groups and Instagram accounts representing QAnon” from its platforms. Although it’s unclear how Facebook is defining affiliations with QAnon accounts, this announcement appears to be one of the broadest bans Facebook has ever imposed.
The new ban expands on the social network’s previous actions against the conspiracy theory and its followers. In August, Facebook announced that it had removed hundreds of QAnon Facebook Groups and Pages for “discussions of potential violence.” The company now says it will remove such Pages and Groups “even if they contain no violent content.” The move also follows Facebook’s pledge last week to promote credible information about child safety, after QAnon followers hijacked related hashtags like #SaveTheChildren.
“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” the company explained in a Tuesday statement. “For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”
Recode has contacted Facebook for comment.
Facebook has in the past struggled to enforce its rules against accounts that promote the QAnon conspiracy theory. However, in today’s press release, Facebook noted that QAnon frequently changes its messaging strategies in order to evade content moderators and that it will take time for the social network to fully scale up enforcement of this latest policy update.
“We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary,” said the statement. The New York Times reported that, as of Tuesday night, almost 100 QAnon accounts had already been affected, according to an analysis from the Facebook-owned tool CrowdTangle.
Facebook’s announcement also comes less than a week after members of the US House of Representatives voted to formally condemn the conspiracy theory. As the November 3 election draws ever closer, the social media company has faced growing criticism over its handling of misinformation and potentially violent groups that can organize on its platforms. QAnon has factored into some of those concerns, as it’s been linked to violence in the past and its adherents have at times spread dangerous misinformation, including about the recent West Coast wildfires.
Some feel that Facebook has a role in controlling the spread of QAnon but that the company’s updated policy comes far too late. As Recode’s Shirin Ghaffary has reported, before this latest announcement, Facebook seemed unwilling to take even simple steps to remove Groups and Pages that promoted QAnon, meaning people could easily find content about the conspiracy theory.
Others question whether Facebook’s updated approach will have much impact. Adam Enders, a political scientist at the University of Louisville who studies conspiracy theories, says it’s not clear how many QAnon followers there actually are, since many people who might be receptive to QAnon hold only vague conspiratorial beliefs about the “deep state” more generally.
“There’s not much movement on the needle to be had,” Enders told Recode. “The people that are part of these groups that are down the rabbit hole — true believers — they’re just going to find a different platform.”
“Removing some QAnon groups that people are pretty unlikely to be incidentally exposed to in the first place isn’t going to impact those people very much,” he added. “It’s just going to be, probably, a minor annoyance to the actual true believer.”
Groups that have been critical of Facebook, including Accountable Tech, Media Matters, and the Anti-Defamation League (which helped organize the advertising boycott of Facebook earlier this year), emphasized that the impact of the move will depend on how effective the company is at finding and removing QAnon content.
“Their announcement acknowledged several important truths — that enforcement at the individual post level cannot counter hate and disinformation; that content need not explicitly support violence to bring about real-world harms; and that without aggressive deterrence, these platforms will continue to serve as critical organizing and recruitment tools for extremist movements,” said Accountable Tech’s co-founder Jesse Lehrich in a statement.
It will take time for the impact of this shift in Facebook’s policies around QAnon to come into focus. In the meantime, many doubt that the company is truly committed to clamping down on the theory, or that it is able to, especially with a high-stakes election just weeks away.
Update, October 6, 8 pm: Updated to include additional commentary from experts and advocacy groups.