Our information ecosystem in 2019 is polluted. Every day, misleading narratives, conspiracies, fabricated images and videos travel via our information streams online. Some of it could be described as harmless; for example, we’ve seen photoshopped images of sharks swimming up highways in the aftermath of a hurricane. But much is not harmless. It is designed to sow division between different groups of people, taking advantage of the tensions that exist based on religious, social or political beliefs or racial, socio-economic or sexual identities.
We live in an age of information disorder, an age where we see people deliberately pushing false information; this is known as disinformation. When this content is shared further by people online who don’t realize it is false or misleading, this is what we call misinformation. It’s when disinformation turns into misinformation that we have more opportunities to take action.
There is little we can do to prevent people creating or publishing disinformation. But those of us who use social media can take action to slow down its spread by becoming more aware of what we share. These agents of disinformation are only successful because we have a tendency to mindlessly share content without checking whether it is trustworthy. Without additional sharing and amplification, many rumors and conspiracies would wither and die.
Disinformation Tactics and Techniques
Our work at First Draft News – monitoring social networks and news coverage during elections in different countries across the world – has shown clear patterns of how disinformation moves through the ecosystem. Those who are pushing disinformation plant seeds in the anonymous web or closed messaging groups, hoping the content will travel through different online communities as unsuspecting people go on to share those rumors, conspiracies or fabricated content.
The end-goal of the agents of disinformation is to see the content appear in the mainstream news media. By seeding misleading or fabricated content, they hope it will travel quickly, potentially creating a trending hashtag or resulting in a retweet by an influential account. This in turn might catch the eye of journalists who now regularly turn to online sources to inform their news gathering.
Having a deliberate hoax or conspiracy featured by an influential news organization is considered a serious win for agents of disinformation. But so is finding their work the focus of a debunk, as it all amounts to coverage.
As Ryan Broderick reported in the aftermath of the ‘MacronLeaks’ data dump on the eve of the French election in May 2017, users on 4chan celebrated when mainstream news outlets began fact-checking the controversy surrounding Macron’s financial affairs, boasting that the debunks were a “form of engagement.”
Over 2018, we have seen these tactics play out in various regional contexts around the world. The platforms might be different, but the techniques are the same.
The Trumpet of Amplification
Drawing on a ‘Platform Migration’ diagram that was first created by Ben Decker, as well as the excellent work of Whitney Phillips in her report, “The Oxygen of Amplification,” I am sharing my ‘Trumpet of Amplification’. It attempts to show the journey that disinformation often takes through the Internet of 2019.
Disinformation starts on the anonymous web, on platforms like 4chan and Discord. It then moves into closed or semi-closed groups on Twitter, Facebook or WhatsApp messaging threads, then into conspiracy communities on Reddit forums or YouTube channels. Finally, it moves onto open social networks like Twitter, Facebook and Instagram.
Unfortunately at this point, disinformation often moves into the professional media – newspapers, cable and broadcast TV networks. This might be when a false piece of information or content is embedded in an article or quoted in a story without adequate verification. But it might also be when a newsroom decides to publish a debunk or provide an ‘explainer.’ Unless this is done responsibly, it can legitimize a rumor or conspiracy and spread it further than it would have otherwise traveled. Either way, the agents of disinformation have won. This was their goal in the first place.
Most newsrooms have teams dedicated to monitoring social media for tips, sources and eyewitness content. The problem is that not all of those teams have had the training to dig deeply into the provenance of the posts, images or videos they find online.
Where did they originate? Unless journalists are taught to really interrogate a source, they could miss the fact that the origin was actually a post on Discord two weeks previously. It could have been part of a coordinated campaign on 4chan to amplify certain messages, a tactic designed in a WhatsApp group or a narrative honed in conspiracy communities on YouTube.
The news industry is incredibly vulnerable. There are thousands of journalists globally, many of them independently monitoring and posting on social networks every day. Convince one journalist to publish the falsehood or fabricated content, and it gets pushed almost immediately across the wider community. Many newsrooms do not check reporting by other journalists or outlets with the same rigor, assuming there has been thorough vetting completed already.
This is all happening just as newsrooms are being stripped of resources, but competition for clicks is fiercer than ever. And when many journalists haven’t had the necessary training in the forensic analysis of digital sources or content, it’s easier than ever to succeed in fooling newsrooms.
So what can we all do to prevent deliberate falsehoods and coordinated conspiracies from creeping into the mainstream?
How the general public can responsibly use social media
In a previous post aimed at journalists, I shared 5 lessons for reporting in an age of disinformation. As I mentioned, the news industry is vulnerable and newsrooms need to do more so they are not manipulated. As users of social media, however, we can all play a role in slowing down the spread of misleading content and false narratives.
Develop good habits: do not let yourself share anything without being certain it can be trusted
Too often, we share posts on social media without thinking. We’re standing in line at a coffee shop or scrolling mindlessly during our morning commute. The minute we see something that resonates – it might make us laugh, angry or even smug – we click on the ‘share’ or retweet button, and carry on scrolling. It’s often a split-second decision. It feels like it carries little weight. But those split-second decisions can have repercussions.
When smartphones first appeared, we seized the opportunity to have these amazing gadgets in our pockets. We could interact with friends in real-time, play games, read the news and scroll through fun, engaging content. What we didn’t realize, as a society, was that these little gadgets required a new level of responsibility. Overnight, everyone had become a publisher.
Before a journalist is allowed to publish, they have to go on training sessions to learn about media law and ethics. Their work is reviewed by editors before it goes out. If they publish something incorrect, they have to issue a correction.
For everyone with a social media account, the ability to post thoughts, images or videos, or to share or retweet someone else’s, has provided hundreds of millions of people with the same power as journalists: the power to amplify certain information.
But we didn’t offer any training. As a society, we didn’t create an environment in which sharing untrustworthy information carried a social stigma. So, when we’re standing in that line for coffee, there are no repercussions to sharing without checking.
Disinformation agents are capitalizing on this fact. We must be more mindful.
Be prepared: make sure you have the skills to check out whether something is trustworthy
Before you share, make sure you learn how to authenticate the digital provenance of a post, so you can track where a piece of content originated. You might discover a piece of information on Facebook, but did it start there? Can you investigate the digital footprint of that piece of content? Or the person who posted it? How trustworthy is the author or website that is the original source of the report?
There are excellent free resources online that can help teach you how to ascertain whether something is true or not. For example, watch these short videos from Factcheck.org or Common Sense Media. For more detailed resources, check out the News Literacy Project, or our own resources at First Draft.
With just a little bit of practice, you can become a “digital sherlock,” spotting clues and piecing them together to work out whether or not something is trustworthy.
Be responsible: Don’t give disinformation additional oxygen
When so many people get their news solely from tweets, Facebook posts, Google News headlines or push notifications, the responsibility for how headlines are worded is more important than ever. It doesn’t matter if an 850-word article provides all the context and explanation needed to debunk or explain why a narrative or claim is false; if the 80-character version of that context is misleading, it’s all for nothing.
Academic research shows that simply repeating a falsehood in a headline can be problematic. Finding alternative ways to word a headline is difficult, but journalists have to be smarter about the ways we word headlines, tweets and posts by focusing on the truth. A famous example often used to illustrate this is the following: rather than regularly using headlines such as “Obama is not a Muslim,” there should have been headlines leading with the fact that former President Barack Obama is a practicing Christian.
While this is incredibly important for reporters to understand, the same applies for social media users. Think about what you’re posting and how you’re posting it.
First Draft’s work suggests that there is a tipping point when it comes to disinformation. If you report on or share niche rumors or misleading claims, you help push them further. But once a piece of content is traveling quickly and widely, it helps if multiple people and outlets share the debunk – as long as it’s worded responsibly!
Sharing falsehoods, even in the form of a debunk, can give unnecessary oxygen to rumors or misleading content that might otherwise die. So, if you see something and you haven’t yet seen anyone else share it, it’s better to do nothing. But reporting or debunking too late means the falsehood takes hold, and there’s really nothing to be done to stop it. (This becomes a “zombie rumor” that just won’t die.) Unfortunately, academics are still trying to figure out how to measure this tipping point and when a debunk will have the most impact. In the meantime, for everyday users of social media, it’s better to be cautious and responsible, and to avoid sharing content that might be misinformation.
Just because you see many people sharing or retweeting a piece of information, do you need to do the same? What might the repercussions be? Our use of social media is driven by our psychological make-up. Often, we open Facebook or Twitter to show we’re the first to know. That might be a Game of Thrones spoiler or a piece of breaking news you’ve seen flash up as a notification on your phone. It’s important that we become much more aware of why and, crucially, how we post what we do.
It might make us feel good in the moment, but when it comes to disinformation, someone might be deliberately manipulating us. Our emotions are being used against us. Agents of disinformation want us to share and retweet.
I often compare disinformation to pollution, as there are so many similarities. Consider littering. If one person drops a piece of litter to the ground, the impact is minimal. If everyone does, our streets become filthy, affecting everyone’s quality of life.
Similarly, if one person retweets a false or misleading piece of content, and no one else does, there is little impact. If many people do, there can be a serious impact. It can get picked up by journalists, and amplified even further. Society is negatively affected. We all have to take seriously our responsibility as players in today’s information ecosystem. What we share online matters.
Claire Wardle, PhD, is the co-founder and executive chair of First Draft News.