Facebook allows users to live stream acts of self-harm, post videos of violent deaths, share pictures of certain kinds of child abuse, and even upload a video of an abortion — unless it contains nudity, in which case Facebook’s army of moderators will take it down.
These are just a fraction of the rules Facebook moderators use as they battle to balance freedom of speech against real-world harm. The rules have been revealed in more than “100 internal training manuals, spreadsheets and flowcharts” reviewed by the Guardian and published on the newspaper’s website.
The so-called Facebook Files reveal for the first time the inner workings of the social network’s moderation policy, something Facebook and its CEO Mark Zuckerberg have come under increasing pressure over in recent years.
Responding to the Facebook Files publication, Monika Bickert, the company’s head of global policy management, told VICE News: “We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”
The manuals deal with issues such as violence, hate speech, terrorism, pornography, racism and self-harm. They are used by Facebook’s 4,500 human moderators and work in conjunction with the site’s automated content filters, which focus on removing the most extreme content, such as child pornography and terrorist material.
Here is how Facebook deals with certain content:
- Child abuse: Certain content depicting non-sexual child abuse, including physical abuse and bullying, is permitted on Facebook, according to the manuals. Images of such content are not actioned, while videos are marked as disturbing. Items with a celebratory or sadistic nature are banned.
- Nudity: Facebook says that sharing nudity and sexual activity in “handmade” art is allowed but digital art showing sexual activity is not.
- Abortion: Facebook allows videos of abortion, but only if there is no nudity.
- Self-harm: Facebook will allow users to live stream video of themselves self-harming because it “doesn’t want to censor or punish people in distress who are attempting suicide.” However, once it is deemed that the person can no longer be helped, the footage will be removed.
- Animal cruelty: Facebook’s rules allow users to post photos of animal abuse, such as humans kicking or beating animals, with only extremely upsetting imagery to be marked as “disturbing.” Facebook says the reason is to raise awareness of what is happening.
- Threats of violence: Threats of violence are only removed in certain cases. Threats made against a head of state, or even a candidate for head of state, are automatically removed. Also in this protected category are certain law enforcement officials, activists and journalists. However, should a user say “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” or “fuck off and die,” these are not removed because they are not seen as credible threats.
Facebook recently announced it was closing in on 2 billion monthly active users around the world, and with such size comes a huge number of problems. One document seen by the Guardian reveals that in a single week, Facebook deals with 6.5m reports relating to potentially fake accounts.
While Facebook has announced plans to hire an extra 3,000 moderators, it is clear that its systems are at breaking point. One of the areas seeing a spike in reports is sextortion and revenge porn, with one document suggesting the company dealt with 54,000 reports in a single month.
With almost one third of the world’s population using Facebook regularly, the decisions the company makes about what people can and cannot see are hugely influential — and that could be a problem. “Facebook’s decisions about what is and isn’t acceptable have huge implications for free speech,” the Open Rights Group told the BBC. “These leaks show that making these decisions is complex and fraught with difficulty.”
The publication of these documents also calls into question the lack of transparency under which Facebook operates, with some calling for the company to be more open about why it removes certain content. “Facebook will probably never get it right but at the very least there should be more transparency about their processes,” the Open Rights Group said.