By Alex Howard
In 2019, we’ve learned enough about modern misinformation and disinformation campaigns to understand that democracies everywhere have a wicked problem. The challenge is that some approaches to mitigating misinformation and defusing disinformation could do more harm than good.
We need a Hippocratic oath for our body politic, much like the ones that doctors uphold when they evaluate sick patients. Today’s extraordinary platforms for expression have been weaponized by bad actors, both foreign and domestic.
One need look no further than anti-vaccine campaigns online and recent outbreaks of measles to understand the real-world impact of inaction or neglect, or the mob violence that has followed conspiracy theories spreading like wildfire through private messaging.
As we learn more about the strategies and tactics used to create and spread disinformation and the dynamics by which misinformation spreads, it’s critical for all of the different stakeholders to act deliberately, consciously, and effectively.
This panoply of problems is why I was motivated to learn more about the disinformation space over the past two years and work with experts to develop constructive policy and technical approaches to manage the problem.
As Mark Zuckerberg admitted after a year of studying what happened on Facebook, there’s no silver bullet nor singular tweak for this set of wicked problems — but there are common sense approaches that will help. The first batch of proposals is now online at DefusingDis.info, with more to come.
There is a broad diversity of perspectives to acknowledge, with conflicting views on what should be done. Some common threads run through them, however, and I’ll tease them out here.
The role of media
The first step has to be informing the public, which implicates both government and media. Over the past two years, journalism has been the primary force in this arena, as story after story documented how governments acted or reacted, what occurred on social media and messaging platforms, and what didn’t happen inside the companies that operate them.
Media companies and the journalists they employ, however, also have to take responsibility for not amplifying disinformation and not becoming unknowing pawns for intelligence services.
Journalists have not always served the public well, dismissing tweets, ephemeral messaging, and videos as the playthings of narcissistic teens. When authoritarian leaders adopt and adapt these tools, that casual treatment comes back to haunt us. When reporters amplify lies and disinformation in headlines and tweets, shorn of context or fact-checking, they don’t serve the public: they disinform them.
Fortunately, the answer to this challenge is relatively straightforward: practice journalism. Verify, add trust, then publish. Work with the people formerly known as the audience to identify and debunk disinformation, as Andy Carvin suggests. Lead with the facts. Don’t repeat the lie.
The role of government
Media, however, can’t hold bad actors accountable or enact reforms that are responsive to these problems.
The bitterly divided politics of the United States and a uniquely corrupt, compromised Trump administration have hindered government action. One of the resounding failures of the Obama administration was the president’s decision not to address the nation about what was happening as Russia’s influence campaign burned through social media and was amplified on television screens, apparently out of fear that doing so would be seen as interfering, and perhaps out of confidence that former Secretary of State Hillary Clinton would win. An extraordinary public statement from federal law enforcement and intelligence agencies didn’t break through to shape the national debate, hindered by ongoing disclosures of hacked email from Clinton’s campaign chairman.
Significant responsibility must also be laid at the feet of Senate majority leader Mitch McConnell, who refused to sign onto a bipartisan statement about what was happening. Congressional leaders should put country before party. Future presidents must use the bully pulpit more effectively to rally the public against hostile foreign actors.
The public information failures of 2016, however, are dwarfed by the outright disinformation that we see every day from the president of the United States on Twitter, unchecked and amplified by Republican majorities and partisan media. After initially denying what had happened, the technology companies that hosted this disinformation have taken steps to add friction, under intense public pressure and scrutiny. But Congress has failed to act.
Abroad, the European Union has moved ahead with a Code of Practice on Disinformation. That will help, with various European countries aggressively confronting disinformation campaigns from Russia and other actors. Whether the Senate or this president is willing to debate or enact similar measures, however, remains to be seen.
Our deregulated campaign finance system in the United States allows political groups funded by vast amounts of dark money to influence our elections without accountability. Indeed, one could argue that it has been designed to do so. It should have come as no surprise that the same architecture that has enabled billions of people to participate in media creation, consumption, and distribution created vulnerabilities for public information and opinion to be manipulated, but here we are.
Our governments should analyze our democratic systems and institutions and defend them against information attacks, as Bruce Schneier and Henry Farrell recommend.
Harvard law professor Yochai Benkler recommends adding disclosure requirements for the commercial actors behind today’s dark arts to the Honest Ads Act, which could help. “Trolling-as-a-service” can’t become our new normal.
Whether any of that happens in the 116th Congress will probably depend upon whether a severe shock to our political system catalyzes reform, as Watergate did, leading the Senate to act in defiance of its current leadership’s apparent allergy to transparency and good government.
As Renée DiResta outlined, however, there’s much more that policymakers and regulators can and should do if Congress doesn’t move.
The role of technology companies
Beyond government, the second set of actors are the corporations that created and maintain the technologies that have changed every aspect of society: how we work, shop, play, and debate.
Unfortunately, the technology companies that created these mighty engines of surveillance capitalism did not build democratic defaults and principles into them, from transparency to accessibility.
Recent shifts towards adding more friction to messaging and transparency to advertising are meaningful steps, but insufficient to the enormity of the need. Integrating more fact-checking, adding context and trust signals, analyzing patterns of activity to detect disinformation campaigns, and working with researchers and regulators will help, as Brooke Binkowski and others propose.
The personalized advertising that technology companies offer to their customers, however, may not be able to coexist with the shared public facts required for an informed public to be self-governing. The technologists at these companies can play a key role not only in whistleblowing but also in pressuring their employers to take action.
The public’s role
Finally, responsibility rests with all of us. In an age when we carry supercomputers connected to billions of people around the world, every person has great power to spread or stop disinformation and misinformation. While we can and should expect publishers and technology companies to be held to a higher standard, given the role they play, we can’t shirk our own culpability in spreading or damping misinformation, hoaxes, rumors, or conspiracy theories. Claire Wardle has given us all some basic principles and practices to improve on this account.
We need to help one another, from the senior citizens who appear to be most vulnerable to misinformation to the students whose education should include basic civics, statistics, and media literacy. Everyone should learn how to use social media responsibly. Our schools and libraries will be on the front lines of the fight against disinformation for years to come.
Finally, we need to remember that laws and technologies are created by humans and can be changed by humans. The extraordinary innovations of the last decades have permanently changed how we create, access, and distribute information. That same shift means misinformation and disinformation can flow more easily into our inboxes, screens and airwaves than ever before.
As Wardle wrote, our media ecosystems have become polluted.
If we think about this as a public health challenge, like ensuring food and drug safety, instead of a “war” against foreign propaganda, it may help us find ways to improve how we create and consume media, from inspecting it for disinformation viruses to putting out advisories on viral infections in various companies, brands, or publications. We need a modern version of the environmental movement to ensure that our children will grow up with a healthier information diet than the toxic mess that’s currently flowing across our screens.
[Image Credit: FBI]