January 26, 2019

How Adding Friction To Group Messaging Can Help Defuse Disinformation

Disinformation and misinformation on WhatsApp have become a growing hazard to public safety in Brazil, Pakistan and India, where rumors have fueled deadly mob violence and endangered public health. As the Oxford Internet Institute’s Computational Propaganda Project has documented in its report on social media manipulation, the Facebook-owned messaging app, with over a billion users, has hosted disinformation campaigns in nearly a dozen countries around the world.

In 2018, WhatsApp and Facebook took steps to mitigate misinformation and defuse disinformation before it led to fatal outcomes, “hiring engineers to specifically focus on disinformation in elections, and it is building in new technology that will indicate whether a message has been forwarded.”

In 2019, WhatsApp is trying to add more friction to viral misinformation with a messaging limit, further restricting forwarding so that a given user can pass a given message along to no more than five other chats.

This additional friction followed controls WhatsApp added in the summer of 2018 to slow the viral spread of disinformation on its platform, including removing the “quick forward” dialog next to messages and ratcheting forwarding options down from 250 users to 20 to try to throttle viral hoaxes.

All of this is part of WhatsApp’s ongoing battle against misinformation in India, ahead of elections this May and under intense pressure from government regulators to address the demonstrated harms.

Zeynep Tufekci, a sociology professor who has been calling attention to the challenges of unfettered free speech in our digital age, praised the move:

This is a good step. Friction is good for a messaging app, especially an end-to-end encrypted one. Yes, it will slightly inconvenience some people but it will keep Whatsapp from veering more into the virality lane while preserving communication. https://t.co/LMGF43fFHP

— zeynep tufekci (@zeynep) January 22, 2019

So did former Facebook chief information security officer Alex Stamos, who noted that this “seems like a reasonable reaction to anecdata on how misinformation spreads in India, and perhaps necessary to protect [end-to-end encryption].” He made the comment in response to questions raised by veteran information security journalist Kim Zetter, who is concerned that “random censorship of messages based only on how many have been sent rather than the content makes no sense and will do more harm than good.”

Unfortunately, Kim, Facebook is highly reactive to elite media opinion and you are the only journalist I’ve seen tempering calls for WhatsApp to react. This seems like a reasonable reaction to anecdata on how misinformation spreads in India, and perhaps necessary to protect E2E.

— Alex Stamos (@alexstamos) January 21, 2019

The tensions highlighted here are precisely those raised in the essays we have published and will publish in the days ahead.

Given reports of Facebook’s plans to unify Messenger, Instagram and WhatsApp, the feature changes it makes will affect billions of people. “Code is law,” in this context, and although nations should not cede governance to the tech giants that operate the planetary platforms of our age, these kinds of changes can address the new problems the companies have introduced.

As Recode explored, however, features that help mitigate the spread of misinformation on encrypted messaging may not work for disinformation on Facebook or Twitter.

There are other ways WhatsApp could fix its misinformation problem, including “detecting patterns of massively shared content,” as Tai Nalon outlined in December 2018, based on her experience leading Aos Fatos, a Brazilian fact-checking organization.

As Nalon acknowledged, however, “for every technological solution we create to fight misinformation, a new problem seems to arise.”

Whether and how the potential benefits of a given change outweigh the harms in any given context is precisely the debate that every legislature should be having.

Alexander B. Howard is a writer, digital governance expert and open government advocate based in Washington, DC.

[Image Credit: Jeso Carneiro / Flickr]
