June 22, 2020

What lessons haven’t we learned since 2016? Lesson 5: Are YOU Disinfo?

Since the realization that foreign and domestic disinformation played a central role in driving certain narratives during the 2016 elections, many Americans have tried to grapple with what this means, how it affects us and our communities, and what we can do about it. 

This series explores the lessons we haven't learned well enough since 2016, and what we should take from them to be better information citizens and to break the patterns that make disinformation work.


Lesson 5: If what you post contributes to disinformation narratives, you are the disinformation.

This week, there were new reports that Kremlin information operations may be shifting focus in 2020: instead of investing heavily in creating their own narratives and promoting them through established, false-identity accounts, as they did in 2016, they are identifying conspiracies and useful narratives originating with Americans and amplifying them through smaller networks of mostly disposable social media accounts. Why spend a lot of time writing original scripts when they can copy off our homework?

Central to this is an idea discussed earlier in this series: that disinformation fails if we refuse to be a part of the machinery and behavior essential to its function.

Said another way: you don’t want to be the disinformation, or the fuel in the spread of disinformation.

So it’s time to look closer at another factor here: that we need to be more aware of whether and when our own posts are feeding narratives being pushed by adversaries and malign actors, what that means, and what we should do about it.

In 2016, Kremlin information operations focused on a variety of narratives to coalesce support behind candidate Trump and divide the opposition to him. But in all the discussion of how Kremlin narratives helped Trump, far less attention has been paid, beyond headline talking points like "sought to divide," "amplified social divisions," and "preyed on social weakness," to the narratives that targeted the left and progressives. There were two major efforts.

First, Kremlin-backed campaigns amplified divisions in the Democratic Party by backing Bernie Sanders (and third-party options like Jill Stein of the Green Party) over Hillary Clinton and pushing progressives not to vote for her (either not to vote at all, or to vote for Trump or third parties). Before Senator Sanders dropped out of the race for the 2020 Democratic presidential nomination, he was again being supported by Kremlin information campaigns (which, this time, he denounced). This strategy will continue in 2020: unite the Trump base, divide everyone else.

Second, there was a major line of effort on race (racism, racial injustice, violence involving police, and so on) that left black Americans one of the most targeted groups in 2016. The years leading up to the 2016 election, with the birth of the Black Lives Matter movement and mass protests in multiple cities sparked by incidents of police violence, were fertile ground for leveraging narratives on racism and racial divisions in America. The Russian troll farm not only played in the information domain on these issues, but also reached out in person to black activists in the United States to fund and sponsor events and other work in this space.

After this outreach became public during the investigations of Russian attempts to undermine American democracy, The Guardian reached out to the American activists who had coordinated with Russian troll farm personnel to get their reactions. The range of responses is why I think about this article a lot: everything from outrage and confusion that Russia was exploiting painful personal and social issues, to a shrug and a "well, if it takes Russia to support social justice, then fine, I'll work with Russia." One end of this spectrum shows we have far more education to do to raise awareness of how adversaries work to exploit our open systems; the other end exposes the far greater challenge of people not caring what any of this means, in an ends-justify-the-means way.

In 2016, the Kremlin operations also used the tactics outlined in this week's NYT story, populating their own narratives with American-generated content, and analysis of the troll farm's activity showed that individual accounts were "rewarded" with amplification and followers for contributing to specific narratives and interacting with troll farm accounts.

The same watch/identify/amplify tactic was seen in this detailed analysis of the "release the memo" campaign in early 2018, where US far-right social media, media, and politicians aligned with Russian amplification to push an online information campaign that forced a specific policy response. When foreign actors amplify these appropriated topics and narratives, it becomes harder to tell what is foreign and what is not, but the amplification nevertheless sculpts the information domain influencing Americans.

The idea of a system of unspoken rewards that incentivizes certain behaviors and narratives online is also being replicated, by different means, on new platforms. For example, the China-based app TikTok has been accused of algorithmically censoring content viewed as unfavorable by the Chinese Communist Party, while also amplifying content seen as pro-China, encouraging otherwise unrelated social media participants to post pro-China content for a boost in following (and potentially to self-censor critical content that might disrupt that boost).

What does all this mean given the new reporting and leading up to the 2020 elections? 

For starters: read the articles linked in this post. They will really help you understand the goals and machinery of disinformation, and the tactics you don't want to be drawn into.

But then:

First, it matters what narratives you contribute to online, intentionally or not, and you should be aware not only of where your content comes from but also of where it goes.

If you write a crazy Twitter thread trashing someone and RT writes it up as an article, then it's pretty obvious you just did the adversary a favor. It is less obvious when you post a divisive video with an inflammatory explanation of its content and it is retweeted by a lot of sketchy accounts, goes viral, and shapes online perceptions. The latter is a particularly sharp tool with visual content in moments of high emotion, like national attention to racially motivated incidents of hate and violence.

Remember: disinformation is as much about context and intent as about truth. If you're writing divisive content for the disinformation machine, stop. Figure out a slower, better way to add context to the content you post. What's the real issue? What's a solution? What's a way to talk about it that doesn't demonize or exclude specific groups?

Second, block bots and disruptive or sketchy accounts that follow and promote you, even if they are agreeing with you. Disincentivize the lazy machinery of propaganda and disinformation whenever you can.

This is harder with larger accounts and takes some attention to detail. But it is important in breaking the behavioral-engineering incentives built into most social media platforms.

Third, don’t “follow back” or “friend” automatically. Don’t pass on content on more localized platforms if it comes from a source you don’t know. Remember that the way you engage with accounts and information builds networks that can be monitored and exploited by others.

Fourth, if you find yourself sharing sympathies or narratives with an identified hostile foreign actor or other malign actor, question why, and what that means. Is there a better way to make your point? Avoid tropes and blanket statements; embrace nuance.

Fifth, as a refresher from the previous columns in this series: always ask what the purpose of the information in front of you is and what it aims to achieve. Don't rage-post. Don't feel that you, too, must repost viral content that is divisive and inflammatory. Don't be a cog in the disinformation machine.

Molly McKew (@MollyMcKew) is a senior adviser to the Stand Up Republic Foundation. She is a writer and lecturer on Russian influence and information warfare. She advised the Georgian president and national security council from 2009 to 2013, and former Moldovan Prime Minister Vlad Filat in 2014 and 2015.
