An online decency moderator's advice: Blur your eyes

Content reviewers in Essen, Germany (Facebook)
People working in content moderation see the worst that the internet can throw at them

"When I left, I didn't shake anyone's hand for three years. I'd seen what people do and how disgusting they are. I didn't want to touch anyone. I was disgusted by humanity."

Roz Bowden is talking about her time as a content moderator at MySpace, viewing the very worst the internet could throw at her so that others didn't have to.

The job she did has become even more important as social media has spread its influence and user-generated content has become a crucial part of the internet.

Facebook now has 7,500 content moderators working around the globe 24 hours a day, and they regularly view images and videos showing depraved content, from child sexual abuse and bestiality to beheadings, torture, rape and murder.

Now one of them is suing the social network for psychological trauma after watching thousands of hours of toxic and disturbing content.

Selena Scola claims that Facebook and Pro Unlimited, the firm to which the social network contracted the work, failed to keep her emotionally safe.

She claims that she now suffers from post-traumatic stress disorder as a result of the things she has seen online.

The case is likely to shine a light on the murky world of content moderation and raise questions about whether people should be doing this kind of work in the first place.

Sarah Roberts, a University of California assistant professor who has studied content moderation for the last eight years, believes social networks could be sleepwalking into a mental health crisis.

"There are no public studies that look at the long-term ramifications of this work," she told the BBC.

"We are looking at a huge number of people - and that is growing exponentially - and collectively we should be very concerned about the long-term outcome.

"There is no long-term support plan when these content moderators leave. They are just expected to melt back into the fabric of society."

Ms Bowden was in finance before working at MySpace from 2005 to 2008 and was glad to return to her previous field when the social network job became too much for her to cope with.

"I only look at numbers now," she told a conference last year.

But she often wonders what became of the team she helped train and supervise back in the early days of social networking.

"What happened to all of these people who watched heads being blown off in the middle of the night? It's important to know."

MySpace website
MySpace was one of the first social networks, and its moderators saw content as disturbing as anything circulating today

When she started out, working the graveyard shift at MySpace, there was little guidance about how to do the job.

"We had to come up with the rules. Watching porn and asking whether wearing a tiny spaghetti-strap bikini was nudity? Asking how much sex is too much sex for MySpace? Making up the rules as we went along.

"Should we allow someone to cut someone's head off in a video? No, but what if it is a cartoon? Is it OK for Tom and Jerry to do it?"

There was also nothing in the way of emotional support, although she would tell her team: "It's OK to walk out, it's OK to cry. Just don't throw up on my floor."

And when it came to looking at the content, she had the following advice: "Blur your eyes and then you won't really see it."

Psychological help

In a blogpost last year, Facebook described its content moderators as "the unrecognised heroes who keep Facebook safe for all the rest of us".

However, it admitted that the job "is not for everyone" and that it only hires people "who will be able to handle the inevitable challenges that the role presents".

But, despite its promise of care, it outsources much of the work, even for those like Ms Scola who are based at its US offices in Menlo Park and Mountain View.

Prof Roberts thinks this is a way for the company to distance itself from blame.

"This work is often outsourced in the technology industry. That brings cost savings but it also allows them a level of organisational distance when there are inevitable cases such as this one,"

People looking at phones (Getty Images)
As more people share content online, the job of moderating it becomes more crucial

Facebook screens applicants for resilience. All its moderators receive pre-training explaining what is expected in the job and spend a minimum of 80 hours with an instructor on a replica of the system before they are let loose in the real world.

It also employs four clinical psychologists, and all content reviewers have access to mental health resources.

Peter Friedman runs LiveWorld, a firm that has provided content moderators to companies such as AOL, eBay and Apple for the past 20 years.

He told the BBC that employees rarely, if ever, use the therapy that is on offer to them.

Prof Roberts is not surprised.

"It is a pre-condition of the job that they can handle this and they don't want their employer to know that they can't," she said.

"Workers feel they could be stigmatised if they use these services."

Person looking stressed at a computer (Getty Images)
Moderators rarely use the psychological help on offer, according to one content moderation boss

LiveWorld has now racked up more than one million hours of moderation, and Mr Friedman has plenty of advice on how to do it well.

  • The cultural model around the moderator is crucial. You have to make them feel strong and empowered. Having an intern view images of child abuse could break the culture of the entire company
  • A relaxed environment, not a call centre, is important, as is management support. Knowing we are there 24/7 makes moderators better able to deal with the stuff they are seeing
  • Shifts need to be relatively short - 30 minutes to three-and-a-half hours for those looking at the nastiest content
  • It may not suit religious or culturally conservative people, who may have a harder time dealing with the kind of stuff out there
  • Instead the ideal candidate is someone who already embraces social media "so they realise that there is good and bad" as well as someone who is able to "put up their hand and say I need a break for a day, a week, a month"
  • There is a need for emotional maturity. A college student is less likely to be good than a mother

Facebook admits that content review at the scale it is now doing it "is uncharted territory".

"To a certain extent we have to figure it out as we go", it said in its blogpost.