What haven’t we learned since 2016?
Since the realization that foreign and domestic disinformation played a central role in driving certain narratives during the 2016 elections, many Americans have tried to grapple with what this means, how it affects us and our communities, and what we can do about it.
This series explores the lessons we haven't learned well enough since 2016, and how we can apply them to become better information citizens and disrupt the patterns that make disinformation work.
Lesson 1: RAGE
Posting content to denounce it, either because it is false or makes you angry, amplifies it and expands its reach. The rage-engage cycle is a key part of how malign narratives gain traction on social media and cross over into traditional media.
Since the 2016 election, both quick analyses and in-depth academic studies have shown that false stories about the election spread faster and reached more people on social media than legitimate news content. In many respects, this is precisely because disinformation is designed to be polarizing so that you engage with it. It exploits the business models of social media platforms, which de facto reward divisive and inflammatory content because outrage keeps users engaged.
In many ways, President Trump is the living embodiment of a modern dilemma: pointing out that something is false and dangerous can end up giving the fire more oxygen. He seems to know instinctively what many studies have confirmed: that rage is the most powerful emotion on the internet.
His path to the White House was paved with billions of dollars of "earned media": free coverage gained by giving increasingly incendiary rally speeches that drove every news cycle. The president himself tracks this closely. Obsessed with his own Twitter engagement rates, he tests and revises purposefully inflammatory content to "improve his numbers," wanting to be the headline, the "winner," of each day's episode of the worst reality show in history. It's the ultimate race to the bottom of our public discourse.
Many of us don't follow the president on Twitter, but his false, angry, and misleading tweets litter our timelines anyway as journalists and other influential accounts retweet him, often to denounce his statements. The same is true of videos of the president making flatly false or fabricated claims at his rallies (or, these days, during his daily coronavirus briefings). Since his election, reporters and analysts have struggled with how to address his lies and outrageous statements without needlessly amplifying them in their coverage.
But he is only a snapshot of the layers of behavior that amplify malign content across social media. The rage-engage cycle is how people outraged by the conspiracy theory that the Parkland school shooting was "staged" by "actors" ended up exposing the lie to new and bigger audiences. It's how nearly everyone ended up amplifying a video that was intentionally edited to provoke a reflexive anger response, posted by a sketchy account that was later suspended. It's how the hashtag "NeverWarren" trended, driven as much by people who opposed the idea as by those who supported it. During the 2016 election cycle, both WikiLeaks and Russia's Internet Research Agency troll farm created controversial hashtags knowing they would be engaged with, and trended, by people who disagreed with them, because, again, rage is good business in the engagement economy.
Reporters and researchers covering the ever-expanding, completely bonkers QAnon conspiracy face the rage-engage challenge every day. It is important to explain to the vast majority of people who never interact with Q content that it is toxic and manipulative, that it is driving domestic terror attacks, and that it is openly supported by dozens of Republican candidates for office. But these reporters know that a byproduct of their explanations is unintentional amplification of Q content, which motivates those pushing conspiracy and hate for profit.
The very idea of "hate and conspiracy for profit" is depressing, and yet a quick Google search yields countless "social marketing strategy" blogs offering tips on how to exploit anger to build an audience. Misinformation is also big business, pulling ad revenue away from legitimate media outlets.
Malign information creates rage; rage fuels engagement, the staple of the social media business model; and engagement incentivizes the creation of more malign information while elevating the accounts that peddle it effectively. That is why social media platforms have little incentive to shut down the networks, architecture, and practices that create and disseminate disinformation.
So, what can you do about it?
Understand the role you play as an individual in incentivizing and fueling disinformation and corrosive content. Don't click the outrageous story link. Don't rage-post. Leave toxic content alone. Yes, this includes the president's tweets.
- Break the rage-engage incentive by not engaging with content when you are angry about it.
- Always ask first what the purpose of that content is.
- Be aware of the source of inflammatory content, especially videos. If you don’t know where it came from, leave it alone.
- If there is a tweet, article, post, or video you feel it is important to comment on, take screenshots instead of replying, retweeting/reposting, or linking. This breaks the amplification.
- Don’t use hashtags when you disagree with them, or else you are just helping them trend.
- Don't retweet or engage with accounts that are known amplifiers and generators of disinformation (yes, this includes the president and more than a few members of Congress). When you engage them directly to say they are wrong, you are doing their work for them by giving them an algorithmic boost.
- If you feel it is important to comment on false or inflammatory content, instead of pithy statements, try explaining the purpose of the disinformation — what it wants you to believe versus what is actually true. Giving people the “why” is powerful. It helps us question not only that one piece of content, but future content.
- Consider unfollowing, muting, or blocking people you believe to be consistent purveyors of deceptive narratives and information. Downrank and disincentivize them.
Takeaway: Choose your battles, and push back on content or narratives with methods that don't elevate the original content. Breaking the engagement model disrupts the feedback loops that sustain the disinformation economy.