Facebook Moderators Face Horrors Every Day and Get Treated 'Like Nothing,' According to Former Employee

Facebook moderators are faced with some of the worst humanity has to offer, and according to a former employee, the company will not lift a finger to support them. Beheadings, child pornography, animal abuse and more are part of their daily work, and at the end of it all, the former moderator's lasting impression is that Facebook treats them "like nothing."

"How are you allowing this to happen? That young people like us are having to see these things — we're treated like nothing," the former Facebook employee asked her former employers, in an interview with BBC.

It would seem that Facebook's training only goes so far in preparing moderators for the worst of the content users post online.

"They definitely warn you, but warning you and actually seeing it are different," Sarah Katz, another former Facebook moderator, also told BBC. Despite Facebook's claims, she does not recall any counseling services being offered during her stay.

"At the time there was nothing in the way of counselling services. There might be today, I'm not sure," she noted, adding that if there was a counseling service offered by Facebook, she would have likely taken them up on it.

Counseling services might have helped the anonymous former Facebook moderator, whose mental health began to take a turn for the worse. She recalled having recurring nightmares; in one of them, she was shocked to see people taking photos and videos of others jumping off a building instead of helping them.

"It's the most important job in Facebook, and it's the worst, and no one cares about it," she said.

Facebook has published the internal guidelines its moderators use. The document runs about 8,500 words and goes into detail on what to keep off the social media platform, including how to weed out sexual or violent content and hate speech, according to Business Insider.

Monika Bickert, Facebook's head of product policy, said that psychological help is available to moderators 24 hours a day. She claimed that support systems are in place for the company's employees, should they need them.

Bickert also said that content like child pornography, extreme violence and disturbing imagery is something a Facebook moderator rarely sees.

"This work is hard, but I will say that the graphic content, that sort of content, is a small fraction of what reviewers might see," she claimed. She adds that they've been able to use technology, particularly AI, to screen out all the worst bits.

That approach, however, does not work as well for content like hate speech and fake news, according to other former Facebook workers.

"We're committed to giving them what they need to do this job well. If they're ever uncomfortable at work, there are counseling resources for them, and they can be shifted to work on a different type of content," Bickert added.
