Study calls for Facebook to stop outsourcing content moderation
Study also urges Facebook to double the number of moderators and hire a content overseer
The NYU Stern Center for Business and Human Rights has released a report that urges Facebook to stop outsourcing content moderation.
According to the report, Facebook’s decision to outsource content moderation is the key reason the company’s efforts to moderate the platform are failing.
As a result, the center has called on Facebook to end the outsourcing arrangement and commit to bringing the work in-house.
According to the report, Facebook users, along with the company’s artificial intelligence system, flag more than three million items to be moderated daily. With Facebook reporting a 10% error rate across its roughly 20 moderation sites, that means the company makes an estimated 300,000 content moderation mistakes per day.
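In round figures, the report’s estimate follows directly from its own numbers: 3 million flagged items per day × 10% error rate ≈ 300,000 erroneous moderation decisions per day.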
While Facebook’s content moderation problems have been widely reported, Paul Barrett, the study’s principal author, wanted to highlight that although content moderation is fundamental to Facebook, the company’s choice to outsource the work to underpaid contractors in remote locations is problematic.
Barrett also cites the lack of training moderators receive in processing flagged content as a major issue.
“They never actually teach you to process what you’re seeing,” says Sean Burke, a participant in Barrett’s study. “It’s not normal seeing people getting their heads cut off or children being raped.”
To remedy Facebook’s content-moderation problem, Barrett has called on the social media company to bring its moderation efforts in-house while also providing moderators with proper salaries and benefits.
Other recommendations include doubling the number of moderators, hiring a content overseer, expanding moderation in at-risk countries, providing on-site medical services to moderators and expanding its fact-checking efforts.
In an interview with VentureBeat, Barrett recognized that the cost of implementing the suggested measures will likely serve as a major deterrent for the company, though he’s optimistic Facebook will consider some of the steps.
“It is a very ambitious ask,” Barrett said. “But my attitude is if the current arrangement is inadequate, why not just go for it and urge [the company] to remedy the problem in a big way. I don’t think Mark Zuckerberg is going to [smack himself on the head] and say, ‘Oh my god, I never thought of that!’ But I do think it’s possible the company is ready to move in that direction.”