Facebook’s Community Standards: Severed heads are okay, but nipples are bad (unless accompanied by a baby)

Posted: August 5, 2014

For the first time in history, unfiltered publication is possible from almost anywhere.  Amongst many other things, this means that graphic images can be shared from areas all around the world where acts of extreme violence are being committed.  This in turn means that certain organisations (principally social media companies, whose platforms are predominantly used to share such images) are cast in the role of moral arbiter as they try to establish standards of what is, or is not, socially and legally acceptable.  This raises the question of whether, and where, boundaries should be drawn for the publication and dissemination of graphic and disturbing images.

We illustrate the issue using a real-life example, which by necessity involves discussion of disturbing matters.  It is not in our view appropriate to include the image in question in this post, and anyone of a nervous disposition may prefer not to read further, as we describe what is depicted.

Whilst there are many examples appearing online, we have chosen to discuss an image purportedly from Syria. Much has been written about the tactics deployed by Islamic State (formerly ISIS) fighters in their recent advances and attacks in the Middle East.  At the end of July it was reported that members of the Islamic State had ambushed a Syrian army base and killed scores of Syrian soldiers.  The Syrian Human Rights Observatory reported that “at least” 50 of those killed had been decapitated, with the heads subsequently mounted on poles or fences.

A photograph is currently being shared on Facebook (one of numerous photos showing scenes of graphic violence from around the world) which appears to show the aftermath of the attack, depicting the severed heads of seven men placed on the posts of a fence.  In the foreground of the photograph a man stands making a celebratory gesture with a forefinger raised, as if giving a victory or warning sign.  Underneath the photograph there is text referring to skewers and food (we have chosen not to repeat the words in this post). Effectively, it is an attempt at a joke in extremely bad taste.

When a link to the photograph and accompanying text is shared by anyone, it automatically appears in the timelines of that person’s ‘friends’.  There is no advance ‘opt-out’ for friends.  In other words, a person does not have to seek out such images.  If they are shared by a ‘friend’, a person can one minute be looking at harmless photographs of friends and family in their timeline, and an instant later be viewing severed heads on a fence.

Facebook’s ‘Community Standards’ state, amongst other things, that:

“Graphic Content

Facebook has long been a place where people turn to share their experiences and raise awareness about issues important to them.  Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism.  In many instances, when people share this type of content, it is to condemn it.  However, graphic content shared for sadistic effect or to celebrate or glorify violence has no place on our site.” [our emphasis]

We should be clear in saying that the person who shared this link as we saw it was not sharing it to glorify the acts committed; far from it. It appeared to be a well-intentioned attempt by that person to condemn both the original act and the apparent celebration of it by the individual in the foreground of the photograph.  That said, it seems to us that there can be no doubt that the initial photograph was taken and posted in order to celebrate or glorify violence and for sadistic effect. 

However, in response to a complaint raised on the point, Facebook responded:

“Thank you for taking the time to report something that you feel may violate our Community Standards.  Reports like yours are an important part of making Facebook a safe and welcoming environment.  We reviewed the share you reported for containing graphic violence and found it doesn’t violate our Community Standards.”

On the one hand, allowing such graphic images to circulate is, at least arguably, a way of drawing attention to the very serious atrocities committed around the world, which in turn may well lead to something being done about them sooner.  It may be the only way of getting the true horror of what is happening in a particular place out to the world at large. Perhaps the restrictions placed on the mainstream media about what they can and cannot show dull the public’s reaction. We can see the validity of such arguments.

On the other hand, however, one may feel that (1) the fact of such events can be satisfactorily communicated without the need for such graphic imagery; (2) it is inappropriate for this content to be freely available online (including to children) and in at least some cases ‘pushed’ into Facebook users’ timelines with no advance filtering mechanisms in place; (3) it likely encourages a significant number of people to seek out and share such images for the wrong reasons; and (4) it desensitises people in general. One may also ask: if the particular image we have chosen to discuss “doesn’t violate [Facebook’s] Community Standards”, what will?

The task facing Facebook and other companies in a similar position is unenviable, and one can well understand that the role of moral arbiter is one that few companies will relish (especially where accusations of censorship might be made).  The question, however, is this: are social media companies currently setting the bar of what is permissible on their platforms at the right level, and should it be left to them to decide at all?

Whilst, after considerable argument, images of breasts are now permitted by Facebook if they depict breastfeeding or the aftermath of a mastectomy, other images containing nudity remain banned. Yet images showing graphic and horrendous violence of a kind most rational people could scarcely imagine are apparently acceptable. Is this how we want our society to evolve, and should there not be a proper debate about what is or is not acceptable? Is nudity really more damaging to society than violence? Have the lines in the sand been drawn correctly?

One last thought: if Facebook deems the depiction of graphic material thought to be in the public interest acceptable “when people share this type of content … to condemn it”, then couldn’t that be said of almost anything horrendous in the margins of society?  If so, the logical continuation of this line of argument is that it is no bad thing to allow such material to populate mainstream sites simply because some of those sharing it may condemn it. We do not envy the social media companies dragged to the forefront of the challenges and changes facing society. That said, the real question is: should those companies be the ones making the decisions, or should we the public, and those whom we elect, help them set the boundaries of what is or is not acceptable?

Gideon Benaim and Jon Oakley are from the reputation protection team at Michael Simkins LLP