
Leaked manuals show how Facebook decides if you see nudity or death

Leaked manuals show exactly how Facebook's 4,500 content moderators are trained to manually filter out extreme racism, misogyny, violence, pornography and other content deemed too disturbing for wider human consumption.

This segment originally aired on May 23, 2017 on VICE News Tonight on HBO.


Moderators work on a special page, called the single review tool (SRT), where they review the millions of reports flagged by Facebook users. They sometimes have as little as 10 seconds to decide whether to ignore, escalate, or delete each post.

The process is fast but far from simple: according to the manuals, moderators must assess each post both on its content and against the laws of its country of origin, which vary widely.

Facebook processes 1.3 million posts per minute, and the documents show the company is trying to develop policies on everything from profanity to cannibalism. Automated systems root out some extreme content, and Facebook has promised to hire 3,000 more moderators. That hiring may be more a necessity than an expansion: sources say current moderators move on quickly and suffer from anxiety and PTSD.