Facebook moderator: ‘Every day was a nightmare’

Isabella Plunkett has spoken out about her experience as a Facebook content moderator

A Facebook moderator has, for the first time, given evidence to a parliamentary committee about the mental toll of the job.

The Irish parliament heard how moderators view graphic content for up to eight hours a day.

Law firm Foxglove and the Communication Workers Union, representing moderators, called for better psychological support and freedom to speak out.

Facebook said it provides 24-hour support to staff.

Isabella Plunkett has worked as a Facebook content moderator for just over two years, and still works there.

Her job is to review posts on the platform - which can contain graphic violence, exploitation, extremism, abuse and suicide.

The 26-year-old says she could not speak to her friends or family about the things she saw at work due to a non-disclosure agreement (NDA) which she had signed at the beginning of her contract.

Members of Ireland's Joint Committee on Enterprise, Trade and Employment commended her bravery in speaking out.

Isabella Plunkett gives evidence at the Irish committee

Isabella also spoke to the BBC.

“I'm here speaking out and I don't actually necessarily know in detail what I'm legally allowed to say and not to say,” she said.

“It was always clear we couldn't speak about our job, we couldn't speak about our job to friends, family... and it's definitely a workplace with a sense of secrecy.”

Facebook told the BBC that NDAs are standard practice and that reviewers can discuss any aspect of their job with doctors and counsellors.

Staff can discuss the general challenges and rewards of their jobs with family and loved ones, but not specific details of the content they are reviewing.

Mental health

“I’ve done the job for two years and I don’t think I could do it for much longer because of the strain it does cause to your mental health,” Isabella told the BBC.

“It's not like a normal job where you can go to work and go home and forget about it - the stuff you’re seeing is really ingrained in your mind.”

Isabella processes around 100 "tickets" a day - these can be videos, images or text posts on the platform. She said they often contain graphic violence, suicide, exploitation and abuse.

She works for Covalen, one of Facebook’s largest contractors in Ireland.

Isabella claims she was not allowed to work from home, unlike counterparts employed directly by Facebook who did the same job.

As a result, she says, she is exposed to more graphic content because she is in the office.

'A nightmare'

“The high priority queues - the graphic violence, the child stuff, the exploitation and the suicides, people working from home don’t get that - the burden is put on us.”

Despite having family members shielding at home, she was told to come into the office. She developed anxiety, for which she now takes antidepressants.

“Every day was a nightmare,” she said, adding that the support given was “insufficient.”

Facebook says psychological help is available to all its moderators 24 hours a day, but Isabella claims its wellness coaches are not qualified psychiatrists.

“I was seeing the wellness team but didn’t feel I got the support I needed. I can’t say I left work feeling relieved or knowing I could go home and have a good night's sleep - that’s not possible,” she added.

“It would follow me home. I could just be watching TV at home and think back to one of the horrible, really graphic tickets.”


Sub-contracted staff are given 1.5 hours of "wellness" time a week, she says, which can be used for speaking to a wellness coach, going for walks or taking time out when feeling overwhelmed.

“It’s not enough. I’m now seeing the content I view in work in my dreams. I remember it, I experience it again and it is horrible.

“You never know what is going to come next and you have to watch it the full way through because they might have violators.”

PTSD disclaimer

Some Facebook moderators are asked to sign a disclaimer before starting work, accepting that the content seen in their jobs could lead to poor mental health and post-traumatic stress disorder (PTSD).

An example of the contract, read out in the committee, said: “I understand that exposure to this content may give me post traumatic stress disorder.

"I will engage in a mandatory wellness coaching session but I understand that those are not conditions and may not be sufficient to prevent my contracting PTSD.”

A Facebook spokeswoman said: "Everyone who reviews content for Facebook goes through an in-depth training programme on Facebook's Community Standards and has access to psychological support to ensure their wellbeing.

“We are committed to working with our partners to provide support for our content reviewers as we recognise that reviewing certain types of content can sometimes be hard," she added.

"In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment.

"We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

Technical solutions

Facebook uses a combination of machine learning algorithms and human moderators to review content.

In future, it hopes to reduce the number of human moderators through machine learning.

But Isabella said this was a Facebook "fantasy", and that the systems were “not even near that stage”.

Speaking to the committee, Isabella said “people are intimidated” by the NDA process and afraid of losing their jobs.

She cited an internal communications platform at Facebook, where workers' posts were deleted when they spoke up. Facebook denied these claims and said no disciplinary action is taken against employees for raising concerns.

“People complained about the treatment and what was going on and how they felt unsafe,” Isabella told the committee. “It was clear that it was being censored because people's comments were being deleted, accounts were being disabled."

She said her experience drove her to give evidence: “I just had such a feeling that I needed to do it,” she added in her testimony. “I need to speak for the people that are too afraid, that feel they have too many responsibilities, and they can't afford to take any risks."