Who should really be monitoring Facebook?

With a huge user base come big responsibilities, difficult questions and few clear answers for Facebook.

Facebook has again hit the headlines worldwide with an issue they’d resolved (temporarily) way back in May: namely, their decision to prevent users from posting video content featuring extreme acts of violence.

The focal point of the issue’s resurgence was the recent posting of a video showing a woman being beheaded, in what is believed to be Mexico.

In a seeming change of policy, Facebook did not remove this video and, when pressed on the matter, clarified that such videos would no longer be banned. Instead, they would be allowed so that people could share and condemn their content. Facebook would intervene only if such posts were seen to be ‘glorifying’ the acts depicted.

After this went public there was a swift response from figures in authority and the public alike. The backlash included the UK Prime Minister condemning the company, saying, “It’s irresponsible of Facebook to post beheading videos”.

Not long after that, a warning message appeared (unannounced) on Facebook videos that featured graphic material.

Graphic material suddenly featured this warning.

Whilst this was seen as Facebook trying to swiftly put the issue to rest, it seemingly wasn’t enough to quell public distaste. Condemnation and negative feedback have continued, particularly over the continued availability of the brutal execution video that had reignited the debate, which Facebook had chosen not to remove whilst, at the same time, removing images of women breastfeeding (“exposing a full breast”).

However, in the latest development (within the past hour at time of writing) Facebook themselves released an update on the matter, stating that they have “re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.”

Facebook succumbed to the mounting pressure and removed the video that had been the litmus test for their revised stance. Had they removed it earlier, they may have saved a lot of face and still been able to retain some faith in their policy.

Regardless of the specifics of this instance, it highlights that bigger questions remain outstanding.

Does allowing the distribution of such extreme content facilitate reasoned condemnation and debate or does it merely stimulate the macabre form of titillation the human condition is so often drawn towards?

More fundamentally, is it Facebook’s responsibility to decide and police such content, or should the Facebook community and its users be the bearers of those standards?

It’s important to note that this is currently more a debate about ethics than about legal obligation. After all, legal and criminal recourse is still available against users who post content that is illegal or that glorifies certain acts.

Facebook has always sought to distance itself from assuming the role of an active regulator of user content. This is often explained under the banner of freedom of speech, but to do so would also prove a major technical challenge requiring significant resources: actively policing a network with over 1 billion monthly active users would incur enormous cost. Not only that, but if, in screening, Facebook were to actively deem legally dubious content permissible, it would assume a degree of legal responsibility for it.

Ultimately, Facebook is a platform like many others and, as such, it cannot take a completely hands-off approach. Other platforms that have tried that strategy have found it difficult; torrent sites that allowed links to copyrighted material, for example. Whilst the sites themselves didn’t host the files, they did facilitate the distribution of the content. It would appear that ‘don’t shoot the messenger’ hasn’t constituted an adequate defence.

Whether or not Facebook should act as a regulator is a matter of opinion. For some, it’s censorship that restricts freedom of expression; for others, it’s Facebook’s job to protect its users against objectionable content.

It’s also a matter of child protection. Facebook allows users who are not adults: anyone aged 13 or over can set up a profile, so it may be said that there is a more pressing duty of care to protect these users from potentially harmful images. With some accounts being set up by users younger than 13, the potential for damage is even greater.

This could well be where parenting comes in. With the wealth of easy-to-use web monitoring software available these days, it is not unreasonable to suggest that parents should be able to exercise control over their children’s surfing habits on whatever device they have been given, rather than simply expecting Facebook or any other site to restrict access on their behalf.

Also, if we don’t wish to see content that is deeply disturbing to us (but not illegal), we don’t have to seek it out or watch it. Should we be able to stop others from viewing it purely because we dislike it? If so, various groups may then wish to exert pressure on what is allowed: risqué jokes that may be viewed as blasphemous or racist could be called into question, as might content that insults or criticises governments, politicians or institutions.

The debate could be deepened much further and addressed from a multitude of ethical or legal standpoints, yet questions would still remain; there is no clear right or wrong to be found.

As social media, and the regulation around it, continues to evolve, more of these questions will require answers. Facebook, for one, is keen to answer those questions itself rather than invite formal intervention by state policymakers, leaving it in a halfway house between keeping the issue at arm’s length and being seen to be proactive in maintaining standards across the board.

Whilst the uncertainty and experimentation continue, user attitudes will evolve too, and change is coming.

Will the social web find itself becoming that bit less open in the name of protection?

More importantly, should it?
