
Facebook’s content moderators break their silence on terrifying work conditions


The moderators in charge of policing inappropriate content on Facebook have begun to speak out over work conditions that have caused them to fear for their lives and left them unable to enforce Facebook’s rules.

The conditions detailed in a shocking report by The Verge are so bad that one moderator collapsed at his desk while on the clock and died of a heart attack. Three former moderators spoke out about their experiences despite nondisclosure agreements.


The content moderators of Facebook are the unsung heroes of the platform, working long hours to ensure graphic content is policed or deleted from Facebook users’ timelines.

These employees aren’t the ones working at Facebook’s campus headquarters in Menlo Park, California, which has a burrito bar, treadmill workstations, meditation rooms, and a 9-acre rooftop garden deck. Instead, they work for Cognizant, a firm contracted by Facebook, which has significantly less glitzy offices in places like Tampa, Florida, and Phoenix, Arizona.

The worker who died at his desk last year worked the overnight shift at the Tampa site. According to the report, moderators receive only two 15-minute breaks, a 30-minute lunch break, and a 9-minute “wellness” break. On top of being overworked, content moderators are forced to view images and videos of graphic violence, child pornography, hate speech, conspiracy theories, and even murder day in and day out, all so they can delete awful content before it reaches Facebook’s billions of users.

According to Glassdoor, the average salary for a Facebook intern in the San Jose area is $6,625 per month, or $79,500 a year — a stark difference from the reported $28,800 yearly salary that content moderators make. 

Aside from moderating mentally taxing content and receiving low pay, these employees have also dealt with a bed bug infestation at their office, bodily waste appearing at workstations, frequent sexual harassment, and unsanitary bathroom conditions.

Without these employees, we would likely see unspeakable things on our Facebook timelines and (even more) fake news. In a statement, a Facebook spokesperson said the company works with its content review partners “to provide a level of support and compensation that leads the industry.”

“There will inevitably be employee challenges or dissatisfaction that call our commitment to this work and our partners’ employees into question,” the spokesperson said. “When the circumstances warrant action on the part of management, we make sure it happens.”

Cognizant did not immediately respond to a request for comment, but a spokesperson told The Verge that it takes “allegations such as this very seriously” and “strives to create a safe and empowering workplace.”

Aside from the human impact, the harrowing work conditions have left some of the moderators’ offices unable to meet their targets for enforcing Facebook’s policies. That means that more disturbing content could slip through the cracks — and into your feed.

Allison Matyus
Former Digital Trends Contributor