A blog on why norms matter online

Thursday, February 23, 2012

Where humor overrules hate speech and crushed limbs are "ok to show": Facebook's Content Moderation Standards leaked

Have you ever wondered on what basis (beyond the vague Statement of Rights and Responsibilities) Facebook polices content?


Wonder no more. Gawker recently published a review of Facebook's guidelines for online content, based on a leaked copy of Facebook's "Abuse Standards 6.2. Operation Manual for Live Operators" (and an update).


The Standards make for interesting reading, as they shed light on Facebook's opaque content moderation process. By not clearly publishing its standards, Facebook makes it more difficult for users to hold the company accountable. As Gawker's article put it: "It would be clear what Facebook was choosing to censor according to its policies, and what amounted to arbitrary censorship."


Facebook's moderation process can be summed up as follows:

  1. Users report pictures, videos and wall posts. 
  2. The outsourced (and underpaid) content moderation teams wade through the stream of reported items. 
  3. Using the Abuse Standards, they either
  • confirm the flag and delete the content;
  • unconfirm the flag and allow the content to stay; or
  • "escalate" the flag, thus turning it over to a higher-level Facebook employee. 


The leak of the Abuse Standards 6.2 only partly remedies Facebook's opaque content policy. Though Facebook can now be held to its policies, they still seem partly arbitrary.



Among the pictures that are not allowed are those showing
"Any OBVIOUS sexual activity [...] Cartoons/art included. Foreplay allowed (Kissing, groping, etc.) even for same-sex individuals"
Users are also not allowed to "describe sexual activity in writing, except when an attempt at humor or insult."


Also not allowed is "Digital/cartoon nudity", but "Art nudity" is fine. People "using the bathroom" are not allowed, and neither are "[b]latant (obvious) depiction of camel toes and moose knuckles".


Pictures showing marijuana use are allowed, "unless context is clear that the poster is selling, buying or growing it".



Facebook also bans "[s]lurs or racial comments of any kind", hate symbols and "showing support for organizations and people primarily known for violence." But the Guidelines caution that "[h]umor overrules hate speech UNLESS slur words are present or the humor is not evident."


Gory images depicting "the mutilation of people or animals, or decapitated, dismembered, charred, or burning humans" are forbidden. Graphic images of animals are allowed if the animal is "shown in the context of food processing or hunting as it occurs in nature." 



Deep flesh wounds are also "ok to show", as is "excessive blood". "Crushed heads, limbs, etc. are ok as long as no insides are showing".



Really interesting from an international freedom of expression perspective is the section entitled "IP Blocks and International Compliance". All of the content listed there has to be "escalated", that is, forwarded to a higher-ranking Facebook content controller for review.




Notably, Facebook instructs its moderators to confirm flagged images showing "Burning the Turkish flag", but writes that "other flags are ok to be shown burning".


Other content that should be escalated includes: 


  • child pornography/pedophilia
  • threats of school violence, credible or otherwise
  • necrophilia and bestiality
  • credible threats and indications against public figures (under certain circumstances), Law Enforcement Officers (LEO) and "[a]ny threat (credible or not) against Heads of State"
  • credible indications of past/future crime and organized crime
  • any indication of terrorist activity; and interestingly
  • poaching of endangered species

As Facebook grows in importance as an international forum for the aggregation and articulation of ideas, the leaked document amounts to what Facebook believes should be an international moral consensus on allowed content. This is problematic, as the document is not free of bias and should be vetted more carefully against international law on freedom of expression. With regard to the generally accepted exceptions to freedom of expression, however, most of the standards pass muster.


What is problematic, though, is that content regulation is outsourced to low-paid employees of third-party companies and that Facebook did not publish the document itself.


"Sunlight is the best disinfectant," US Supreme Court Justice Louis Brandeis once wrote. He was writing about accountability in the banking sector, but the saying holds true for social networks as well.


Content that violates the human rights of others will always exist. Social network providers are obliged to protect their users from such content, but at the same time they must ensure that they do not infringe freedom of expression unnecessarily.


What Facebook should now do is officially publish the Abuse Standards, clarify the moderation process, and start a vigorous debate among its users on the international standards of freedom of expression.
