Facebook is banning all QAnon accounts from its platforms, starting on Tuesday. The move is a significant escalation for Facebook and one of the broadest enforcement actions it has ever taken.
In August, Facebook introduced a policy that removed only those QAnon-related accounts that discussed violence, a step that resulted in the termination of 1,500 pages, groups, and profiles. Facebook has since expanded that policy into a ban on all QAnon-related accounts.
A spokesperson for Facebook said the updated policy will "bring to parity what we've been doing on other pieces of policy with regard to militarized social movements."
"Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon. We're starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks," Facebook said in a statement. "Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports."
QAnon is a conspiracy theory whose followers believe that high-profile Democrats and Hollywood celebrities belong to a child-eating group that President Trump is secretly taking down. The theory is built on posts from Q, an anonymous user of an extremist message board.
Facebook said that it's "not going after individual posts," but rather whole accounts that spread the conspiracy theory.
The QAnon community has also pushed the claim that Trump is not actually sick with COVID-19 but is instead carrying out secret missions in a war.
QAnon accounts are likewise known for spreading misinformation about the coronavirus, since many followers believe the virus does not exist or is not as deadly as scientists say.
"We have to think about the QAnon networks as the rails upon which misinformation is driven. Every account, event, and page are tracks where disinformation can be spread. So it is imperative that Facebook dismantle their infrastructure. Without Facebook, they are not rendered inert, but it will make it more difficult to quickly spread disinformation," said Joan Donovan, research director of the Shorenstein Center on Media, Politics, and Public Policy at the Harvard Kennedy School.
Donovan said she wished Facebook had acted sooner.
"Of course, this all could have been done sooner, before Q factions aligned with militia groups and anti-vaxxers, to curtail the spread of medical misinformation and the mobilization of vigilante groups," Donovan said.
With the updated policy in place, Facebook now faces the difficult task of identifying accounts and enforcing the ban. After Facebook enforced the first policy in August, QAnon groups and followers changed their mode of operation to evade moderation, dropping explicit references to Q and "camouflaging" QAnon content under hashtags purportedly about protecting children.