The video of the beheading of a second American journalist, Steven Sotloff, by ISIS on September 2 was not carried on social media. This follows the controversy that erupted when Twitter decided to remove the video showing the decapitation of American photojournalist James Foley last month.
A new debate has ensued about whether software firms should take editorial decisions. Reading the comments that appeared in media outlets on both sides of the Atlantic, it would seem that Twitter had crossed the Rubicon, as it were. While opinion varied on the rightness or otherwise of Twitter’s action, nearly everyone seemed to wonder how this would affect the broader media landscape.
No one in their right mind should argue with the call that Twitter took with regard to the Foley video. CEO Dick Costolo explained that the decision was taken at the request of the Foley family. Even from a news angle, the decision seems perfectly legitimate. It is pertinent to ask why one would want to watch such a video when its contents were already well known. There can be little reason apart from vicarious thrill-seeking.
The debate over Twitter's decision is largely irrelevant because it harks back to a time when traditional media held absolute power over news production and distribution. Perhaps the response to the decision is merely a cry against obsolescence.
Over the past few years, internet firms have not only been the first to "trend" newsworthy topics, they have, in several cases, positively aided news dissemination. The 2009 Iranian protests against the election of Mahmoud Ahmadinejad took their cue from social media, and were dubbed the "Twitter revolution".
This was repeated across swathes of the Arab world when the Jasmine Revolution still held the promise of ushering in substantial change. More recently, the Financial Times reported that #Ferguson had been retweeted a million times before CNN showed even a second of coverage on the incident.
There are other issues that the debate fails to consider. Unlike traditional media, whose spread is controlled by the powers that be, social media has a stickiness that persists even after all traces of the offending material have been taken down. Given this pervasiveness, the Foley video spread like wildfire and may have already, in some ways, served the Islamic State's unconscionable mission.
The debate also fails to consider how terrorist organisations have become adept at using social media to further their nefarious designs. As foreign fighters continue to join the IS "cause" and it becomes known that Foley's executioner was himself a British Muslim, Twitter's so-called "editorial" call may have, if nothing else, achieved a broader humanitarian purpose.
The critics of the Twitter decision also fail to take into account that what Twitter did was not really editorialising, but pruning. It did not issue a statement on the rightness of its decision from a news point of view. It was not censoring; it was simply acting out of respect for the feelings of the Foley family.
Writing in the Guardian, Emily Bell decried not merely what Twitter had done but launched a broader attack on algorithms deciding the priority of our news consumption. She gave the example of sociologist Zeynep Tufekci, who “noted after the riots in Ferguson that although many news items were being posted to Facebook, she initially saw none of them in her feed, just ice bucket challenges.”
Reports that Facebook has in the past run experiments on users’ data, such as the one in 2012 during which the news feeds of some 700,000 users were altered, add fuel to the fire. But Ms Bell fails to see that there is a difference between hard news and personal feeds.
What Facebook does with user data is a privacy issue, not an editorial one. An algorithm, unlike a human editor, does not have biases. Certain posts not appearing on one's feed is a question of whom one 'follows' and what one has 'Liked' in the past. Those are inputs that get fed into Facebook's system and determine what gets shown to the user in future. If you are not getting enough news on your timeline, maybe you are not showing enough interest in news for the algorithm to change its behaviour.
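The interest-based ranking described above can be sketched as a toy model. To be clear, everything in this sketch — the function, the weights, the data — is a hypothetical illustration of the principle, not Facebook's actual algorithm:

```python
# Hypothetical illustration of interest-based feed ranking.
# Nothing here reflects Facebook's real system; it only shows how past
# engagement ("Likes") can be turned into a ranking signal.

def rank_feed(posts, liked_topics):
    """Order posts by how much they overlap with topics the user engages with."""
    def score(post):
        # One point per topic the user has previously liked.
        return sum(1 for topic in post["topics"] if topic in liked_topics)
    # Highest-scoring posts appear first; ties keep their original order.
    return sorted(posts, key=score, reverse=True)

# A user whose history is all entertainment, no news:
user_likes = {"ice bucket challenge", "celebrities"}
posts = [
    {"id": "news-ferguson", "topics": {"news", "ferguson"}},
    {"id": "fun-icebucket", "topics": {"ice bucket challenge"}},
]

ranked = rank_feed(posts, user_likes)
print([p["id"] for p in ranked])  # → ['fun-icebucket', 'news-ferguson']
```

The point of the sketch is that the news story is not suppressed by any editorial judgment; it simply scores zero against this user's recorded interests, which is the distinction the paragraph draws.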
Finally, the question to ask is when does something become improper for broader consumption? In this regard Twitter and Facebook themselves are not on the same page. Twitter’s policy on nudity, for instance, is vastly more liberal than Facebook’s. Several porn stars run legitimate Twitter accounts, and post pictures that would make Facebook squirm.
One incident in particular illuminates the differences in the two websites' attitudes towards "editorialising". In late 2013, after the Indian Supreme Court's ruling on Section 377 upholding the criminalisation of homosexuality, a Sikh gentleman posted a picture of himself participating in the Global Day of Rage protest in Toronto.
The picture had him kissing another man and holding a poster that read "No Going Back", with '377' written inside the 'o' and struck through. While the picture received several Likes, Facebook removed it and suspended the gentleman's account for 12 hours. This, even though the caption to the picture clearly stated that his uncle had told him that he would have killed him "before 20" had he known the boy was gay.
We need to ask what constitutes editorialising. Is it the taking down of a gratuitously violent video, or is it the banishing of a picture that shows nothing untoward except to articulate a sexual minority's desire for its rights? That is the real debate hidden within the contours of what the traditional media is raking up.
Vikram Johri is a Bangalore-based writer. He tweets at @VohariJikram