Facebook hate speech glitch investigated by firm

[Image: The Facebook logo seen on a smartphone. Getty Images]
The BBC raised the issue with Facebook

A few days after Janet, not her real name, reported some hate speech on Facebook, the company sent her a message thanking her for doing "the right thing".

"We removed both the group and all its posts, including the one you reported," Facebook's support team told her.

But none of that happened. The posts, and the group, stayed online. Facebook is now looking into a potential glitch in the system it uses to moderate harmful content on its network.

"We are investigating this issue, and will share more information as soon as we can," the company told the BBC.

The glitch appears to send an automated message to the user telling them content they had reported had been removed - when in fact Facebook's staff had ruled it should stay online.

"It's a huge breach of trust," said Brandie Nonnecke, from the Center for Information Technology Research in the Interest of Society (Citris).

"How big is this problem in their system?"

Facebook said it was unable to answer that question.

'I can't be the only one'

Janet shared several examples of posts that had stayed up, as well as an entire group, named: "LARGEST GROUP EVER! We need 10000000 members to Make America Great Again!".

Its 50,000 members post around 800 items each day. The posts are mostly a collection of images and video. Some, but not all, contain hateful anti-Muslim and anti-immigration rhetoric - which is what Janet, who lives in Las Vegas and runs a retail business, said she was concerned about.

"[Facebook] has been promoting themselves in my Newsfeed saying they are trying to keep our democracy safe by eliminating content that is false and divisive," she told the BBC.

"If they are sending me notices they removed the content and offensive groups but in reality are not, doesn't this go against what they say in public or to Congress?"

[Image: Facebook's response to the complaint]

The BBC agreed to keep Janet's identity a secret, due to her concern that people might target her because of the political nature of the content she decided to report.

She did, however, provide the BBC with access to her account so that the messages' authenticity could be confirmed.

In the Support Inbox, the section where users are notified about the status of content they have reported, there are at least five different instances where Facebook's support staff told Janet content would be taken down but where, at the time of publication, it remained live.

"Facebook claims to be removing this content but obviously they are not," Janet said. "I can't be the only one."

The BBC notified Facebook about the issue on Sunday. Three days on, the firm had been unable to offer an explanation. When the BBC suggested the content might still be live because the original poster had appealed against the moderator's original decision, it was told that was not the case.

Human reliance

Janet, by her own admission, reports a lot of content on Facebook. "I binge report when I'm angry!" she joked.

In her Support Inbox, there are many other instances where she was told messages she reported did not go against Facebook's guidelines and would therefore not be removed.

[Image: Facebook's Sheryl Sandberg is questioned by US politicians. Getty Images]
Facebook's Sheryl Sandberg - the removal of material from the platform has been a key topic when the firm has been questioned by US politicians

There are also examples where content has been removed as promised.

For Facebook, which earlier this year admitted it could only combat hate speech with the help of public reports, people like Janet are a necessary part of how that system works.

Because hate speech is more nuanced than, say, spam, it is much harder for a computer algorithm to detect.

According to the company's own statistics, between January and March this year, 62% of the hate speech removed by the company was flagged first by a member of the public.


Follow Dave Lee on Twitter @DaveLeeBBC

Do you have more information about this or any other technology story? You can reach Dave directly and securely through encrypted messaging app Signal on: +1 (628) 400-7370