ThinkOTB Agency

The unenviable task of being a Facebook moderator

October 17, 2018

Social media’s reach into modern society is growing, and so is our dependency on it. But as its grip on society and how it functions tightens, who is moderating what we see on our screens? To date, Facebook has an inadequate team of 7,500 content moderators worldwide, providing 24/7 monitoring of harmful subject matter on the site. Is this number really sufficient to police the world and what it chooses to upload to Facebook?

Although the site has announced plans to more than double its workforce to a 20,000-strong team, this number still appears vastly insufficient for a social media platform that boasts over 2 billion users. A Facebook moderator must scour the site for the most depraved content, ranging from beheadings to bestiality, child abuse to rape, before either removing or escalating what they have seen in accordance with the Facebook content guidelines. However, their unenviable task is not as simple as clicking ‘yes’ or ‘no’; they must grapple with nuanced versions of what is acceptable and what is not. For example, the site strictly prohibits nudity, but this would not apply to an image of a naked Holocaust victim posted during an awareness month. The moderators must navigate this moral minefield while suppressing the effects of the disturbing images that collect in their minds.

The need for moderation

The need for Facebook moderation in this modern age couldn’t be greater. The social network is ever expanding, and our reliance on our own newsfeed is becoming more apparent. People no longer scroll Facebook purely as an escape or to keep in touch with relatives and friends; for many it is now their top source of news. This became particularly clear during the 2016 US presidential election, when campaigns were plagued by claims of Russian hacking and the targeting of US voters via Facebook. This in turn makes a moderator’s job even more important. They must first protect users from harmful content, and second, protect them against inaccurate news stories which may influence their political outlook.

Moderator’s work

Facebook moderators have seen their workload double in size and importance. But it isn’t easy going. Ms Selena Scola, an employee of Pro Unlimited contracted by Facebook to moderate content, has brought a lawsuit against the company for exposing her to content which caused anxiety, insomnia, fatigue and PTSD. She contends that the work she was contracted to undertake directly caused these mental health issues, and that her condition subsequently deteriorated due to a lack of training, poor access to psychological assistance and negligent working conditions.

So, while Facebook moderation is a necessary occupation in the modern age, can the way Facebook does it improve? The overwhelming answer from employees past and present is yes. Despite working at Facebook’s US headquarters, Scola was an outside contractor. This seems to be an intentional ploy on Facebook’s part; by outsourcing the work, Facebook are putting physical distance between themselves and what is actually happening, effectively absolving themselves of blame on the matter. A close friend of mine who freelances as a Facebook moderator echoed the same concerns about Facebook washing their hands of moderators and about the long-term effects:

“I’d say the biggest toll on my mental health came from the isolation of working from home constantly staring at a screen… when I worked full-time it could be days before I left the house.”

Although the long-term impact of such work is relatively unknown, experts have suggested it will only result in further mental health complications for those tasked with sifting through Facebook’s user-generated content.

Working environments

Two common themes have emerged from those who work as Facebook moderators: a lack of initial training and poor access to psychological help. A couple of weeks’ initial training is nowhere near substantial enough for the work. It cannot adequately prepare workers for a career of constantly trawling through the worst content that humanity has to offer. Even if a moderator maintains their sanity, their resolve will be tested by daily scrolling through depravity. Facebook’s notoriously secretive stance on its training processes and practices does not help either. In order to improve the working conditions of these people, we must first have an in-depth understanding of what is currently in place and how it can be bettered.

Facebook’s director of corporate communications, Bertie Thomson, said the company recognises this work can “often be difficult”. This is a welcome but heavily downplayed admission of what is going on. It’s evident that something needs to change in Facebook’s working environment, and in how the company deals with the issues caused by a job where viewing such disturbing content is just a normal day at work. Even those who are struggling and know of the psychological help on offer are reluctant to engage with it, for fear of being reprimanded for admitting they need help coping.

The world of social media shows no sign of slowing down. Facebook and the other social media giants must better protect the staff charged with moderating our social consumption. Good luck to anyone who is a Facebook moderator; we don’t envy you!

It’s not all doom and gloom! Take a look at these 6 ways social media does great things.