Fake news and religious hatred on WhatsApp in Karnataka elections

IN Digital Media | 19/05/2018
Real news about fake news: fake audio on WhatsApp in India, and a fake pre-election poll purportedly sponsored by the BBC, showing a big victory for the BJP.
LAURA HAZARD OWEN’s round-up in the NiemanLab newsletter

Elections in India are now fought and won on WhatsApp. Screenshot: The Economic Times


Can signing a “pro-truth pledge” actually change people’s behavior online?

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Elections in India are now fought and won on WhatsApp.” The Washington Post’s Annie Gowen and Elizabeth Dwoskin look at news-sharing (plus the spread of “fake news and religious hatred”) on the closed messaging platform WhatsApp in India, during a high-profile election in the state of Karnataka this month and ahead of the country’s national election next year. India is the Facebook-owned WhatsApp’s largest market; it has more than 200 million users there. While Facebook partnered with Indian fact-checking site Boom to fact-check news on that platform, “little has been done in this cycle to combat incendiary content on WhatsApp…Indian officials, feeling helpless to stop the spread of WhatsApp content, have resorted to shutting the Internet down in tension-filled places, with more than 70 stoppages last year compared with six in 2014, according to the Internet Shutdowns tracker portal.” All content sent on WhatsApp is encrypted.

The New York Times has a screenshot of “a fake pre-election poll, purportedly sponsored by the BBC, [that] was widely circulated on WhatsApp last week. The poll, whose origin is unknown, predicted a huge win for the Bharatiya Janata Party, or B.J.P., in Karnataka, where it currently is in the minority. Fake polls, including another one supposedly conducted by the United States Embassy in New Delhi, have been a staple of the political battle on WhatsApp.”

A fake survey on Karnataka polls has been circulating on Whats App and claims to be from BBC News. We'd like to make absolutely clear that it's a #fake and does not come from the BBC. The BBC does not commission pre-election surveys in India. #fakenews

— BBC India (@BBCIndia) May 7, 2018

A new kind of content is also spreading on the platform: fake audio messages.

U.T. Khader, an incumbent member of Karnataka’s legislative assembly, experienced the WhatsApp effect firsthand. Just before the election, Mr. Khader, a Muslim in the Congress party, was the target of what Mangalore police said was a disturbing new type of WhatsApp attack: a series of profane audio messages purporting to be an escalating exchange of threats between Hindus and Muslims over his candidacy.

In one recording, which was supposedly a phone call between two Hindu political activists, one voice harangued the other for putting a saffron-colored shawl, which the B.J.P. views as a Hindu symbol, around Mr. Khader.

“Why did you put a saffron shawl on Khader? Do you love your life or not?” the first voice said. “If I shove a knife into you, do you think Khader will come to your support?”

Later messages sounded like they came from Muslims threatening to kill the first voice in response. “Son of a prostitute, I’m warning you,” said one. “I’ll take you out.” The messages were sent to various WhatsApp groups, so they were heard by many voters.

Mr. Khader, who has represented the area for more than a decade and won with a large margin last time, said the alleged conversations were fake and recorded in a studio.

Facebook rumors lead to violence in Sri Lanka. That’s what an episode of The Daily was about this week, featuring Amanda Taub and Max Fisher. One problem: When Facebook doesn’t hire content moderators who speak local languages, posts that users have flagged as hate speech just sit there without being reviewed or taken down.

Could a pledge to promote honesty actually work? Gleb Tsipursky, a professor at Ohio State University, and a group of behavioral scientists came up with a “Pro-Truth Pledge” that “aims to promote honesty by asking people to commit to 12 behaviors that research shows correlate with an orientation toward truthfulness. For example, the pledge asks takers to fact-check information before sharing it, cite sources, ask friends and foes alike to retract info shown to be false, and discourage others from using unreliable news sources.”

Now: It looks as if many of the public figures who’ve signed the pledge are — well — the kind of people who’d sign a group of professors’ Pro-Truth Pledge; I counted a LOT of academics, and also some politicians. Still, signing it appeared to have an effect on their Facebook sharing behavior — in this one study of 21 people, anyway. Tsipursky writes up the research in The Conversation:

[We] got permission from participants to observe their actual Facebook sharing. We examined the first 10 news-relevant posts one month after they took the pledge and graded the quality of the information shared, including the links, to determine how closely their posts matched the behaviors of the pledge. We then looked at the first 10 news-relevant posts 11 months before they took the pledge and rated those. We again found large, statistically significant changes in pledge-takers’ adherence to the 12 behaviors, such as fewer posts containing misinformation and including more sources.

Facebook partners with the Atlantic Council. Facebook has partnered with the Atlantic Council’s Digital Forensic Research Lab to fight disinformation in elections around the world. “This partnership will help our security, policy and product teams get real-time insights and updates on emerging threats and disinformation campaigns from around the world,” Katie Harbath, Facebook’s global politics and government director, said in a statement. “It will also increase the number of ‘eyes and ears’ we have working to spot potential abuse on our service — enabling us to more effectively identify gaps in our systems, preempt obstacles, and ensure that Facebook plays a positive role during elections all around the world.”

Do paywalls lead to increased polarization?

Good thread on the probable effect of rising newspaper paywalls on the political polarization of the media. As traditional outlets limit access to paying subscribers, mogul-subsidized, Trump-aligned outlets will remain freely accessible & will therefore be more widely read. https://t.co/4nRHxGIk1g

— Annemarie Bridy (@AnnemarieBridy) May 16, 2018


Laura Hazard Owen is deputy editor of the Lab. @laurahazardowen


The Hoot is the only not-for-profit initiative in India which does independent media monitoring.
The new term for self-censorship is voluntary censorship, as proposed by companies like Netflix and Hotstar. The Economic Times reports that streaming video service Amazon Prime is opposing a move by its peers to adopt a voluntary censorship code in anticipation of the Indian government coming up with its own rules. Amazon is resisting because it fears that it may alienate paying subscribers.

Clearly, the run-up to the 2019 elections is on. A journalist received a call from someone saying they were from the Aaj Tak channel and were conducting a survey, asking whom she was going to vote for in 2019. On being told that her vote was secret, the caller assumed she wasn't going to vote for 'Modiji'. The caller, a woman, also didn't identify herself. A month or two earlier the same journalist received a call, this time from a man, asking if she was going to vote for the BSP.
