"In most cases, we found this material due to advances in our technology, but this also includes detection by our internal reviewers", Monika Bickert, Vice-President of Global Policy Management at Facebook, says in the latest post shared as part of Facebook's Hard Questions series.
For the first time, the world's largest social network published its internal definition of "terrorism". Facebook was under pressure from the European Union, which had warned the company and its tech industry competitors to remove extremist content quickly or face legislation.
The new appeal process will first focus on posts removed on the basis of nudity, sex, hate speech, or graphic violence.
To support its newly detailed community standards, Facebook will also launch a series of events around the world, including in the U.S., U.K., Germany, France, India, and Singapore, where it will solicit feedback from users on the ground. The full guidelines run to about 8,500 words and spell out what is and isn't acceptable in terms of violent, sexual, or otherwise controversial content, along with hate speech and threatening language. Eventually, Facebook will extend appeals to more types of content and to people who reported posts that weren't taken down.
Facebook already publishes a set of "Community Standards", a relatively brief overview of its global rules, for public consumption.
The document is filled with striking details about very specific issues: images of cannibalism victims, for instance, may be shown only behind a warning screen and an age restriction.
For example, Facebook's policy of banning images containing female nipples (with some exceptions, such as breast-feeding), which has been criticised by the #FreeTheNipple movement, remains in place. Most of the sections are fairly self-explanatory and not entirely surprising: if you share copyrighted material, such as a video you do not own, don't be surprised if it magically disappears.
"Our enforcement isn't flawless".
"We made a decision to publish these internal guidelines for two reasons", noted Facebook's VP for global product management Monika Bickert, in a blog post. It will not make all of its moderator guides public, such as lists of hate-speech words, as releasing them could make it easier for people to game the system.
The release of these guidelines doesn't actually signal a change in Facebook policy; a policy team meets every two weeks to review potential additions and edits.