Fixing mistakes for better content moderation: Facebook

Danny Woods
July 19, 2018

The series features an undercover reporter's findings while working as a content moderator for Facebook.

"People come to Facebook for a safe, secure experience to share content with their family and friends", Richard Allan, Facebook's vice president of public policy, said in a new documentary by the long-running United Kingdom (UK) investigative show Dispatches.

Facebook has recently committed to reducing fake news and improving privacy on its platform, which is welcome.

Facebook responded to Channel 4's report in a Tuesday blog post.

Far-right activists' Facebook pages, including those of Tommy Robinson and Britain First, received special protection from the company because they "generate revenue", a Channel 4 documentary has found.

The Sun reports that footage shot by the undercover reporter showed an unnamed Facebook moderator telling the filmmakers: "If you start censoring, people lose interest. It's all about making money at the end of the day."

It isn't the first time watchdog groups have noticed objectionable content circulating on Facebook, despite the company's efforts to clean up the site.

Mr McNamee said: "It's the really extreme, really unsafe form of content that attracts the most highly engaged people on the platform". People are debating very sensitive issues on Facebook, including issues like immigration.



"Shocking content does not make us more money, that's just a misunderstanding of how the system works".

Dispatches showed that during training sessions, moderators were shown a video of an adult man punching and stamping on a toddler.

He also said that Facebook trains contractors to ignore visual evidence that a user is under 13 unless there is an admission that the person is underage. And despite Facebook's policies stating it won't accept hate speech, Channel 4's undercover investigation still found examples of hate speech that had not been purged from Facebook: a comment aimed at Muslim immigrants that said "f**k off back to your own countries" was allowed to remain on the site. That has opened the door for hate speech, violent videos and Facebook pages from far-right groups to persist on the social network, even when the content violates company policies.

Mark Zuckerberg's former mentor has claimed Facebook permits unsafe content because it is like "crack cocaine" for web users.

"Obviously they have a lot of followers so they're generating a lot of revenue for Facebook", one moderator said of the fascist Britain First page, which was deleted in March after deputy leader Jayda Fransen was convicted of racially aggravated harassment. And that political debate can be entirely legitimate.

"We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention", Mr Allan said.


Facebook has been investigating exactly what happened so as to prevent these issues from happening again.

Facebook said it has made "mistakes", but denied accusations that it seeks to profit from extreme content.

