Online comment moderation: emerging best practices

In Digital Media | 11/10/2013
Readers' feedback and perspectives can broaden a publication's coverage, inspire new stories and suggest possible sources or ways to address an issue.
WAN-IFRA offers a guide to promoting robust and civil online conversation.

Below is an extract from the WAN-IFRA report on online comment moderation. The full report can be accessed on the WAN-IFRA website.

Introduction

In many parts of the globe, online comments have become an essential ingredient of a thriving news publication: readers feel that they have a right to contribute in an online environment that is increasingly dialogue-based rather than one-way broadcasting. The ability to comment on news articles or in discussion forums offers readers the chance to debate hot topics with other readers from all over the world, hosted by their favourite news organisations.

For news organisations, online comments can be an extremely valuable resource. They provide additional detail and insight to articles from informed readers who are passionate about the subject, offer a wide range of supplementary opinions and give newsrooms a window into how their readers see both their journalism and the world around them. Readers' feedback and perspectives can also broaden the publication’s coverage by adding their own vantage points, inspire new stories and suggest possible sources or ways to address an issue.

But it’s not all a happy tale of considerate readers offering wisdom and useful information in a democratic debate on the top issues of the day. It is impossible to limit commenting to those who have something constructive to say, and discussions frequently descend into torrents of insults that are utterly irrelevant to the original article. Perhaps anonymity and distance allow consequence-free behaviour and a chance to defy social norms, or perhaps it is a function of the structure of online conversations, but comment threads on websites can frequently shock with abusive, uninformed and badly written contributions.

How to moderate these comment threads is a significant challenge for news organisations as they seek to balance providing a place for free expression and robust debate with ensuring a civil and constructive dialogue, while ideally finding value in reader input. As Mathew Ingram, senior writer with GigaOm, said in a recent article[i], “Comments from readers are probably one of the thorniest problems for online publishers of all kinds… and the methods for dealing with them are all over the map.”

The issue is further complicated by the fact that news organisations are seeing input from their readers not just on their own sites but also on their social network pages. The social networks themselves are repeatedly being forced to rethink their policies for dealing with problematic user content and to question whether they are publishers or platforms.

We spoke to online editors and community managers at 104 news organisations from 63 countries across the globe, plus a selection of experts from the corporate and academic worlds to identify key trends, opportunities and best practices.

Overview of findings

The news organisations that we spoke to could be broadly divided into two camps with regard to their attitudes to online comments: those who embrace comments from users, often as part of a wider strategy of involving readers in the publication, and those who see them as, essentially, a necessary evil.

Very few organisations (seven) did not allow comments at all, but in times of financial difficulty, a costly initiative such as comment moderation, with no immediate and obvious financial benefit, is not always a priority.

However, many organisations see comments as an essential element in fostering a real community around their publication or a niche topic. Comments are believed to increase reader engagement, both in terms of time spent on site and in terms of loyalty.

Summary of key points:

  • There was a relatively even split between those that moderate pre-publication and those that moderate post-publication: 38 and 42 respectively, with 16 adopting a mixed approach.
  • Organisations delete an average of 11% of comments, primarily because the content is offensive, contains hate speech or bad language, or is spam.
  • The subjects that attract the most comments (according to the editors) are, predictably, politics, followed by societal issues, religion, sports and opinion.
  • There was general consensus that by moderating comments, publications were not limiting their readers’ freedom of speech. Most editors believe that there are countless places online for the public to express their points of view and that it does not have to be on a specific news site, so it is up to the publication to determine the kind of conversation it wants to host.
  • There was a notable lack of awareness about the precise legal situation surrounding online commenting: who is responsible for what is said where, what exactly is illegal, and how best to deal with it. “It’s a grey area,” was a comment made on several occasions in interviews across a range of countries.
  • Real-name registration vs. allowing anonymity is a divisive issue, with no consensus on which is preferable. There is a general feeling that requiring real names leads to a better quality of conversation, albeit a smaller one in terms of volume. However, many organisations believe it is important to offer anonymity as an option to those who might not be able to speak freely under their real names.
  • Although many agree that when journalists participate, the discussion is of a higher quality, few organisations see their journalists frequently entering into conversation with readers. Some don’t believe it’s appropriate for journalists to be involved in an area which belongs to the readers.
  • The majority of publications don’t moderate their Facebook pages and other social networks as heavily as their own sites, both because the networks are not their own territory and because real-identity policies are seen to make the discussion less controversial.
  • Some news organisations highlight the ‘best’ comments or the most active commenters in some way, although many still have some way to go in making these features genuinely useful to readers.

The importance of moderation

Moderation of comments, which at its most basic level means deleting or blocking those deemed offensive or unsuitable, is widely considered to be essential. A key motivation for active moderation that we noted was the perceived need to protect the news organisation’s brand by ensuring a high quality of discussion. Vitriolic hate speech, abusive attacks directed at commenters or even just irrelevant, off-topic remarks are seen as potentially very damaging, as the following quotes explain:

“The comments are associated with your brand. It’s absolutely up to you as a newsroom to control what sort of comments you want to have. Sitting back and saying ‘those comments are stupid but what can we do about it’ is definitely not the way to go, I would say.”

-- Die Zeit, Germany

“If we got to a point where a lot of comments that were not suitable were being published it would be potentially damaging to the news website: if you’ve got a properly branded BBC news article and then all these weird and wonderful comments at the bottom that shouldn’t be there, then it’s potentially damaging to our journalism.”

-- BBC, UK

“If you have comments up that show unethical journalism it damages the brand much more. It balances out hiring good, trained people with editor level skills so they are equipped to make decisions. If it violates ethics, it damages the brand even if you have a disclaimer. The average reader is going to presume that the comment is there because you as a newspaper allowed it to be. So that intangible damage is far more costly because to build up credibility takes much longer.”

-- Gulf News, UAE

What’s more, as a study[ii] from the University of Wisconsin-Madison showed in early 2013, uncivil comments can affect readers’ perceptions of the news (in this specific case, the perceived risk of an emerging technology such as nanotechnology). This suggests that comments can in fact affect the way a news outlet’s journalism is interpreted by readers, making it all the more important to monitor them.

Moderation also often involves protecting readers from abuse and creating an environment in which they feel comfortable expressing themselves.

Jérémie Mani, CEO of French moderation agency Netino, believes that “for real freedom of speech, moderation is necessary”, citing Godwin’s Law, which holds that the longer an online discussion continues, the more likely it is to degenerate into irrelevant insults involving comparisons to Hitler and the Nazis. He added that if you don’t protect minority opinions, those who hold them can be overwhelmed by insults from the majority and feel forced to leave the conversation.

“I think it’s the obligation of the news organisation to create an environment where the type of reader that they have feels comfortable having a conversation and discussion. We want to have conversations with our readers. Our moderation system is in no way meant to silence them, it’s meant to create a safe environment where people can have intelligent conversation and can feel comfortable voicing strong opinions. I think moderation is important for everyone but you need to have that balance – you can’t just have a free-for-all.”

-- Gawker, US

“It’s about the image and the brand you’re trying to portray. I’m for absolute freedom but it gets out of control. You can get a comment section that is unusable if you have things that are bad language and just spam. These things can hurt.”

-- Al-Akhbar English, Lebanon

Some news outlets deliberately keep moderation to a minimum and regret having to get involved at all.

“Because we have a company policy based on a light touch, we tend to leave the conversation to the readers.”

-- The Straits Times, Singapore

“We should be able to give the space to readers to discuss. Some people inside the company think we should leave the discussion to the readers and not moderate at all, but then the quality gets worse.”

-- The Nation, Kenya

Conclusion

As journalism increasingly becomes a dialogue between reporters and readers, online comments and other reader input will only become more important. The challenges they pose are not going to go away, but they can be addressed, and the potential of comments to make a positive contribution to a news outlet is considerable.

Broadly, news outlets are moving through three stages in their approach to online comments. The first challenge is how to prevent offensive content from appearing on your publication, and this was clearly the first priority for those we interviewed. The editors and managers we spoke to were clear that news organisations need to maintain significant control over the content on their sites, and that it is important they are confident in the methods they establish to manage user contributions.

Once a news outlet has found a strategy to deal with this, it can move on to cultivating a robust, constructive dialogue on its site that is a draw in itself, and then finally focus on making comments a truly valuable, integrated element of the publication. Some have already embraced their readers’ input with open arms, but for many, a change in mindset from seeing online comments as a burden to seeing them as an opportunity is an essential step towards making them actually useful. It isn’t just the moderators who need to see the potential in commenting, but the editorial team as a whole.

Getting to the point where you can make best use of online comments also requires investment in resources and intelligence, which is particularly challenging at a time when many news outlets are struggling to establish sustainable digital businesses. More effective automated filters will undoubtedly be developed, but there will continue to be a need for human input in the moderation process and consequently for well-trained moderators and community managers.
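The report does not go into implementation detail, but to make the idea of an automated filter concrete, the short Python sketch below shows the kind of simple pre-filter such systems might start from: comments matching a keyword blocklist or a crude spam heuristic are held for a human moderator, while everything else is published directly. The blocklist, the link threshold and the function name are hypothetical assumptions for illustration, not taken from the report.

    # Illustrative sketch only: a minimal automated pre-filter of the kind the report
    # anticipates. Blocklist, threshold and function name are hypothetical assumptions.
    import re

    BLOCKLIST = {"offensiveword1", "offensiveword2"}  # placeholder terms; real lists are curated per publication
    MAX_LINKS = 2                                     # crude spam heuristic: too many links flags the comment

    def needs_human_review(comment: str) -> bool:
        """Return True if the comment should be held for a human moderator."""
        words = set(re.findall(r"[\w']+", comment.lower()))
        if words & BLOCKLIST:                         # possible abuse or bad language
            return True
        if len(re.findall(r"https?://", comment)) > MAX_LINKS:  # possible spam
            return True
        return False

    # A clean comment passes through; a link-heavy one is held back for review.
    print(needs_human_review("Great article, thank you."))                                   # False
    print(needs_human_review("Win now http://a.example http://b.example http://c.example"))  # True

Even a filter this crude only triages: borderline cases still end up in front of a trained moderator, which is precisely the human involvement the report expects to remain necessary.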

The benefits that many see from comments – from feedback and ideas for stories, to genuine loyalty and trust that leads to more visits and time on site – are significant and increasingly important for news organisations in times of tough digital competition.

Predictably, in many cases it is the big, well-known organisations that are leading innovation in comments. But smaller news organisations can learn from them and adopt the results of the experiments that suit them best.



[i] Making readers a part of the story – the New York Times experiments with highlighting comments, GigaOm, 30 July 2013 http://gigaom.com/2013/07/30/making-readers-a-part-of-the-story-the-newyork-times-experiments-with-highlighting-comments/

[ii] The “Nasty Effect:” Online Incivility and Risk Perceptions of Emerging Technologies, Journal of Computer-Mediated Communication http://onlinelibrary.wiley.com/doi/10.1111/jcc4.12009/abstract
