Facebook Moderator: The Labourer who Keeps Inappropriate Content Out of Your Facebook News Feed!

**This article may contain descriptions of potentially disturbing content.**

Every user of the social media giant Facebook is likely to come across content they find objectionable, such as hate speech or racist comments.
You have probably noticed the small “Report Post” option on each post, and you may even have clicked it.

\"\"

But have you ever wondered what happens to all the inappropriate posts after you report them to Facebook?
Who handles all that disturbing content?
The answer is – Facebook content moderators.

Here’s a blog post by Facebook explaining how this mechanism works.

\"\"

What is Content Moderation?

Content moderation is the practice of monitoring user-generated submissions and applying a predetermined set of rules and guidelines to determine whether a piece of communication (a post, in particular) is permissible.
In simple terms, it means reviewing content, including disturbing photos and videos, and deciding which posts should stay up and which should be removed.
All of these tasks are carried out by content moderators.
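To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of what a rule-based first pass over posts might look like; the phrases, the report-count signal, and the decision labels are invented for illustration and are not Facebook’s actual rules. Posts that trip a rule are removed outright, while user-reported posts are escalated to a human moderator.

```python
# A minimal, illustrative sketch of rule-based moderation. The phrases,
# signals, and decision labels below are invented for this example and are
# not Facebook's actual policy.

BANNED_PHRASES = {"example hate phrase", "example graphic phrase"}  # placeholder rules

def moderate_post(text: str, user_reports: int) -> str:
    """Return 'remove', 'review', or 'allow' for a single post."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in BANNED_PHRASES):
        return "remove"   # clear rule violation: take the post down automatically
    if user_reports > 0:
        return "review"   # user-reported post: escalate to a human moderator
    return "allow"        # nothing triggers the rules; leave the post up

print(moderate_post("Just sharing my holiday photos!", user_reports=0))       # allow
print(moderate_post("this contains an example hate phrase", user_reports=3))  # remove
```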

Social media companies rely on content moderators to help filter out disturbing content and better protect their users.
In 2017, after coming under heavy criticism for failing to prevent abuse of the platform, Facebook announced that it had expanded its safety and security staff to 30,000 people, about half of whom are content moderators.

Reporters from The Verge were contacted by moderators who worked for Facebook in Tampa, Florida, through Cognizant, a contractor tasked with moderating content on the platform.
Three former Facebook moderators broke their nondisclosure agreements to reveal the poor working conditions.

According to The Verge’s report, high-performing moderators review at least 400 posts daily, including graphic imagery and hate speech.
Shawn Speagle, a former Facebook content moderator, said he was initially told that he would review high-profile social media accounts such as Disney World’s and do some data research to gain insight into which types of posts get the most interactions.

When asked about some of the content moderators had seen in the past, Michelle Bennetti, a former moderator, told The Verge that she had seen babies being abused in the queue of content she was assigned to moderate.
“… the mother was dropping the babies on the ground. This is one we saw over and over again and then choked the baby, and you hear the baby gurgling, trying to breathe, and for days, it infected my mind,” said Melynda Johnson, another former moderator. “I had to know what happened to this baby because I’d seen it over and over again, and luckily the baby was okay.”

“I have seen videos of a babysitter choking a toddler to death and giving bloody noses to babies, and it stays, and nobody does anything… The stuff that does get deleted, it does wind up back there anyway…”, said Shawn.

Tech companies like Facebook plan to remove such content using artificial intelligence, but it is unclear when this AI technology will be effective enough.

According to The Verge, moderators are judged by an “accuracy score”, a measure of how often their decisions to remove content are judged to be correct under Facebook’s ever-changing policies. If a moderator’s accuracy score falls below 98 percent, they could lose their job.
Shawn told The Verge that the policies change on a daily basis.
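As a rough, hypothetical illustration of the arithmetic behind such a score (only the 98 percent threshold comes from the report; the function name and sample numbers are assumptions):

```python
# A rough sketch of the accuracy-score arithmetic described above. Only the
# 98 percent threshold comes from the report; the function name and sample
# numbers are assumptions for illustration.

def accuracy_score(decisions_audited: int, decisions_upheld: int) -> float:
    """Fraction of a moderator's audited decisions judged correct."""
    return decisions_upheld / decisions_audited

score = accuracy_score(decisions_audited=400, decisions_upheld=393)
print(f"accuracy: {score:.1%}")                       # accuracy: 98.2%
print("at risk" if score < 0.98 else "meets target")  # meets target
```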

Based on The Verge’s interviews, moderators get two 15-minute breaks, a 30-minute lunch, and 9 minutes of “wellness time” (during which they can talk to a counselor) each day.

Shawn told the reporter that he worked in a toxic environment where higher-ups were nonchalant about problems such as sexual harassment in the workplace and people drinking alcohol and smoking weed in the parking lot. He also told The Verge that the entire office building has only one bathroom, in poor hygienic condition, for the 800 employees working there.

Cognizant told The Verge that it strives to keep its workplace clean and, in response to the report, said it “will stress the importance of keeping common workplace areas clean” and “monitor compliances”.

Hiring more moderators to remove extreme content might reduce the amount of inappropriate content on the platform, but it comes at the cost of the moderators’ physical and mental well-being.
Let us hope that one day artificial intelligence and robust algorithms can filter this content from social media so that no one has to work under such poor conditions.

Written by Teng Jun Siong

Edited by Rayvathi a/p Theivindran
