
Facebook yanks large QAnon conspiracy group off platform


Facebook has reportedly removed one of the most popular groups on its platform associated with the viral, far-right QAnon conspiracy theory for violating its community guidelines.


The group, Official Q/Qanon, had more than 200,000 members and was removed on August 4 for specific community standards violations and for promoting fringe conspiracy theories that could lead to harm, according to Reuters.

The move follows Twitter’s decision in July to ban 7,000 QAnon-related accounts, citing the potential for the accounts’ messages to lead to harm.

A New York Times report last month said Facebook, Twitter, and other social media sites have been working together to take similar steps to moderate the growing popularity of QAnon content on each platform — an effort to quell the spread of misinformation and the potential threat of physical, real-world violence, such as 2016’s “Pizzagate” shooting.

This is not the first time Facebook has taken action against the group. In May, the company said it took down 700 accounts and nearly 800 pages for coordinated inauthentic behavior and manipulating public debate, many of which were based out of Russia and Iran.

This appears to be Facebook’s first direct, public action against a group that has used the platform in a militant style to spread misinformation and hate. However, unlike Twitter, Facebook has yet to publicly announce the group’s ban. Even with Official Q/Qanon removed, the lack of an updated Facebook policy means most of its members will simply migrate to smaller groups on the site.

Thanks to social media, QAnon conspiracies are no longer fringe ideas. The group spawned from a rumor in 2017 about supposed efforts to undermine Trump. The “Q” in QAnon refers to a single person, or group, within the administration with access to confidential government information that is said to reveal a plot against the president. Since the start of the coronavirus pandemic, the group has latched onto the public health crisis and turned it into a political debate — discourse that spreads primarily through sites like Facebook and Twitter and often reaches viral or trending status, exposing the widely debunked rhetoric to millions of eyes.

Digital Trends has reached out to Facebook for comment and will update this story when we hear back.

Facebook’s decision to remove the Official Q/Qanon group from its platform is unlikely to extinguish the group’s existence, or its prominence in popular culture. According to a Facebook statement to Reuters, the company is monitoring other QAnon groups as “it strengthens enforcement,” but without an outright ban on QAnon content, members of banned groups will merely form new ones.

Critics and experts have called QAnon members “really good at adapting” to online ecosystems, and several QAnon supporters are running for public office on platforms that represent the conspiracy theories shared within the group.

In recent months, Facebook has been on the defensive over its infamous “hands-off” approach to moderating content on its platform. When Black Lives Matter protests spread across the country, Facebook chose not to take action against President Donald Trump’s “when the looting starts, the shooting starts” post, even as other platforms did. The decision proved damaging: Over 200 advertisers announced an advertising boycott of Facebook. One advertiser said it joined the campaign after its ad was placed next to a video touting QAnon conspiracies. Facebook addressed the concerns of the public, and its advertisers, with a plan to fight hate speech, but many said it was not enough.

Meira Gebel
Former Digital Trends Contributor
Meira Gebel is a freelance reporter based in Portland. She writes about tech, social media, and internet culture for Digital…