‘Every day people would have to visit psychologists. Some couldn’t sleep or they had nightmares.’ Photograph: Niall Carson/PA

Underpaid and overburdened: the life of a Facebook moderator


Testimony from those working to keep beheadings, bestiality and child sexual abuse images off Facebook indicates that the support provided isn’t enough

“There was literally nothing enjoyable about the job. You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”

That’s how one man, who wished to remain anonymous, characterized his job as a content moderator at Facebook.

“We were underpaid and undervalued,” said the man, who earned roughly $15 per hour removing terrorist content from the social network after a two-week training course.

Pictures, videos, profiles and groups would be flagged for review either by users or by algorithms, and his team would decide whether they needed to be removed or escalated.

“Every day people would have to visit psychologists. Some couldn’t sleep or they had nightmares.”

Psychologists say that the emotional toll of looking at extreme imagery, whether violent terrorist acts or child sexual abuse, can be punishing. Workers exposed to such content should have extensive resiliency training and access to counsellors, akin to the support that first responders receive. However, testimony from those working to keep beheadings, bestiality and child sexual abuse images off Facebook indicates that the support provided isn’t enough.

“The training and support was absolutely not sufficient,” said the analyst, who worked at a company contracted by Facebook to moderate content.

Facebook is notoriously secretive about what training and support it offers its moderators, although a company spokeswoman said:

“We recognize that this work can often be difficult. That is why every person reviewing Facebook content is offered psychological support and wellness resources. We have worked with psychologists to put a program in place that is specifically designed to support people in these roles. The program includes counseling, resiliency training and support programs. As we increase our commitment to our Community Operations team, we are also deepening our investment in this important area.”

The company also said it provides continued training and assessment of moderators after the initial two weeks.

The content moderator who spoke to the Guardian said he wasn’t given any mandatory counseling, although his company offered mindfulness sessions every few months and there was access to a counsellor on request. However, he said that many of the contracted workers, recent immigrants with limited English skills who had been hired to moderate content in their native languages, chose to seek psychological help in their own time rather than ask for it internally, fearing they would lose their jobs or be sent home without pay.

There is also some suggestion that Facebook’s approach to moderation falls short of standards elsewhere in the industry. At three other organizations that assess objectionable or illegal content, the training and support on offer appear to be far more comprehensive than what Facebook’s contractors receive.

‘The training and support was absolutely not sufficient.’ Photograph: Jonathan Nackstrand/AFP/Getty Images

Both the UK’s Internet Watch Foundation (IWF) and the US’s National Center for Missing and Exploited Children (NCMEC) have teams monitoring content reported as child sexual abuse. Meanwhile, popular web forum Reddit is among the companies to hire an external consultant, the Workplace Wellness Project, to help its staff cope with viewing and removing illegal content from the platform.

Before the IWF hires an analyst, candidates are assessed for suitability by a psychologist, who asks about their views on pornography in general, their support network, their childhood and their triggers.

Once they pass this stage, they are interviewed about their work skills before going through to the final stage, where they are exposed to child sexual abuse imagery. Candidates will sit with two IWF employees and go through a sequence of images getting progressively more egregious, working towards the worst kinds of sexual violence against children. This stage is designed to see how candidates cope and let them decide whether they wish to continue with the role.

Once they accept the job, analysts undergo an enhanced background check before starting six months of training, which involves understanding criminal law, learning about the dark web and, crucially, building resilience to looking at traumatic content.

One 46-year-old mother of two has been working at the IWF as an analyst for three years. The role involves looking at up to 1,000 potentially illegal images each day, assessing whether each depicts child sexual abuse and, if so, which category it falls into, ranging from erotic posing of children through to penetrative sexual activity, bestiality and sadism.

“Newborn babies being raped is probably the most horrific thing I’ve seen,” she said.

She said she has mandatory counseling once a month to build up coping strategies for dealing with images, including learning to keep detached from victims and focusing on other details in the image, taking regular breaks, talking to colleagues and meditation. She also has an annual assessment with a psychologist specialising in trauma and has access to her counsellor whenever she likes.

It’s a similar story at NCMEC, where analysts are trained for four to six months. The mental well-being of analysts is supported by a safeguard program run by Lanae Holmes. The program starts during the interview process and extends to after the analyst leaves the organization.

For the first six months there are mandatory monthly individual and group sessions for analysts, focusing on identifying triggers, recognizing compassion fatigue and secondary traumatic stress symptoms, and finding support and coping skills. After six months, the requirement for counseling drops to twice a year, but analysts can schedule additional sessions whenever they want. Around half of all staff request additional sessions, Holmes said.

“We borrow a lot from other fields that are high risk employment, such as the work done to keep first responders healthy,” Holmes said.

As at the IWF, the program focuses on building resiliency and identifying triggers before team members develop a stress response. The center also offers support for spouses and significant others, educating them about the work and how to recognize a stress response.

“The work entails looking at the underbelly of human nature, including betrayal and cruelty. We need to offset the dark side with human connections,” said Naheed Sheikh, co-founder of the Workplace Wellness Project.

If moderators have not been trained effectively to handle a particular type of content, it can be especially distressing.

For the 46-year-old analyst at IWF, violent images wrongly reported to the foundation can be especially upsetting. “I’m trained in assessing child sexual abuse, so when a member of the public reports a gore site or beheading that can shock me because I’m not expecting it,” she said.

There’s an upside that comes with working at organisations such as the IWF and NCMEC, which work closely with law enforcement: the clear link between their work and helping victims.

“I’ve seen websites come down. I know I’ve disrupted it. That’s rewarding,” the IWF analyst said. “I’ve also had letters from victims I’ve helped,” she added, explaining that minors will often report videos or photos of themselves that have been put online as “revenge porn”.

“It’s a lovely environment. I know what to look for in terms of my stresses and I have the right support network around me,” she added.

Holmes suggested that being at a mission-driven organization makes the job more fulfilling than it might be at an internet service provider. “In contrast to the ISP folks, we employ people who are very clear that their purpose for coming to work is to help children,” she said.

For companies that lack such a clear mission, Sheikh recommends rotating front-line staff onto projects that don’t deal with emotional content. “We recommend staff also engage with projects that are more cognitively focused, for example a research-based project.”

Without such safeguards, staff turnover will be high as individuals struggle to cope with grisly content.

“You can get burnt out like with any field that draws on your emotional resources,” Sheikh said. “Particularly if the labor is not appreciated or rewarded.”

Contact the author: olivia.solon@theguardian.com
