

The Unenviable Task of Being a Facebook Moderator

Oct 17, 2018

Social media’s reach into modern society is growing, and so is our dependency on it. But as its grip on society and how it functions unquestionably tightens, who is moderating what we see on our screens? To date, Facebook has a team of just 7,500 content moderators worldwide providing 24/7 monitoring of harmful material on the site. Is this number really sufficient to police the world and what it chooses to upload to Facebook?

Although the site has announced plans to more than double its workforce to a 20,000-strong team, this number still appears vastly insufficient for a social media platform that boasts over 2 billion users. These moderators are asked to scour the site for the most depraved content, ranging from beheadings to bestiality, child abuse to rape, before either removing or escalating what they have seen in accordance with Facebook’s content guidelines. However, their unenviable task is not as simple as clicking ‘yes’ or ‘no’; they must grapple with nuanced judgements about what is acceptable and what is not. For example, the site strictly prohibits nudity, but that rule would not apply to an image of a naked Holocaust victim posted during an awareness month. Moderators must navigate this moral minefield while suppressing the effects of the disturbing images that accumulate in their minds.

The need for Facebook moderation has never been greater. Not only is the social network expanding faster than its moderation can keep pace, but our reliance on our own news feeds is becoming more apparent. People no longer scroll Facebook solely as an escape or as a way to keep in touch with relatives and friends; many now use it as their primary source of news. This became particularly clear during the 2016 US presidential election, which was plagued by claims of Russian hacking and the targeting of US voters via Facebook. A moderator’s job is therefore now twofold: to protect Facebook users from harmful content, and to protect them from inaccurate news stories that may influence their political outlook.

As moderators see their workload double in both size and importance, Selena Scola, an employee of Pro Unlimited contracted by Facebook to moderate content, has brought a lawsuit against the company for exposing her to content that caused anxiety, insomnia, fatigue and PTSD. She contends that the work she was contracted to undertake directly caused these mental health issues, and that they were made worse by a lack of training, poor access to psychological assistance and negligent working conditions.

So, while Facebook moderation is a necessary occupation in the modern age, can the way Facebook goes about it be improved? The overwhelming answer from employees past and present is yes. Despite working at Facebook’s US headquarters, Scola was an outside contractor. This seems to be an intentional ploy on the part of Facebook; by outsourcing the work, Facebook are putting distance between themselves and what is actually happening, effectively absolving themselves of responsibility for the matter. A close friend of mine who freelances as a Facebook moderator echoed the same concerns about Facebook washing their hands of moderators and about the long-term effects:

 

“I’d say the biggest toll on my mental health came from the isolation of working from home constantly staring at a screen… when I worked full-time it could be days before I left the house.”

 

Although the long-term impact of such work is relatively unknown, experts have suggested that this type of work will only lead to further mental health complications for those tasked with sifting through Facebook’s user-generated content.

Two common themes emerge from Ms Scola’s account and from others who have been vocal in condemning their work as Facebook moderators: a lack of initial training and poor access to psychological help. A couple of weeks’ initial training is nowhere near enough to prepare anyone for a career spent constantly trawling through the worst content that humanity has to offer. Even if a moderator somehow manages to maintain a modicum of sanity, their resolve is likely to be tested after scrolling through hours of depravity every day. Facebook’s notoriously secretive stance on its training processes and practices does not help either. To improve the working conditions of these people, we must first have an in-depth understanding of what is currently in place and how it can be bettered.

Facebook’s director of corporate communications, Bertie Thomson, said the company recognises this work can “often be difficult” - a welcome but heavily downplayed admission of what is going on. It is evident that something needs to change in Facebook’s working environment, and in how the company deals with the issues caused by a job in which viewing such disturbing content is just a normal day at work. Even those who are struggling and know of the psychological help on offer are reluctant to engage with it, for fear that they will be reprimanded for admitting they need help coping.

As the world of social media shows no sign of slowing down, Facebook and other social media giants must take it upon themselves to better protect the staff charged with moderating our social consumption.

