What To Know About Disinformation History, Tactics, and Real-Life Examples

Not sure what’s happening with disinformation? Here’s the history, tactics, and real-life examples to help you sort fact from fiction.

In The Beginning, Was the Problem

“You are free to eat from any tree in the garden; 
but you must not eat from the tree of the 
knowledge of good and evil, for when you eat 
from it you will certainly die.”

— Genesis 2:16–17

The Genesis account tells of the snake that plays on the doubts in the back of our minds, our fears that those in power are hiding something from us. The snake doesn’t invent a new thought; he merely stokes a suspicion that Eve already has.

“You will not certainly die,” says the snake. “For God knows that when you eat from it your eyes will be opened, and you will be like God, knowing good and evil.”

The snake tells Eve the things she wants to hear. He gives her permission to believe as she wants.

Eve was the target of disinformation. The tactic lures us with the promise of a world where our every suspicion is justified and every opinion objectively right. The notion is seductive next to an indifferent, unfair, and unpleasant reality. The biblical account makes clear that disinformation has been tripping up humans for some time, which raises the question: why does it pose such an existential threat now?

Social media and the internet, mostly.

Think of the invention of social media as an incredibly helpful genetic mutation in a virus. Now viral disinformation can leap from person to person in a densely populated metropolis — the internet. Before this “mutation” in our information landscape, disinformation moved slowly. It never gained the traction it has today.

Much as SARS-CoV-2 had a gene that allowed it to spill into the human population, the internet offers a similar advantage to disinformation — and spill, it certainly has.

What Does Disinformation Look Like Now And Why?

Far too many modern disinformation efforts exist to detail all of them, but Americans must understand how social media has breathed new life into the psychological operations of our adversaries. The belief after the fall of the Soviet Union was that the use of “dezinformatsiya,” or disinformation, would end. It did not.

With our guard down, our adversaries began their most successful mission yet — sowing discord and undermining democracy. Authoritarians, much like a schoolyard bully, are insecure. People living in autocracies want democracy, too. The best way to quiet that demand is to make democracy unappealing.

That means exploiting free speech and press in democratic nations to show how the grass isn’t as green as it looks. Western nations have faced this threat by arming the public with the skills needed to identify it, forging alliances with other democratic countries, and confronting the responsible parties.

Inexplicably, the US has denied or ignored the threat. Instead of sending a message that meddling with the Americans is a poor choice, our deference and subordination have only invited more meddling and more international embarrassment. We can deny that we’re being toyed with, but the world is well aware of what is happening. America has only deceived herself.

Critically, not believing in the threat will not protect us. We must fight or perish. Our founders envisioned a well-regulated militia, manifested in the National Guard, as sufficient protection. It isn’t. Today, healthy skepticism and knowledge about how others influence our perception are essential skills needed in our diplomatic toolbox — sometimes we forget we have options besides military intervention.

Our current dysfunctional communication landscape poses an existential threat to our society. Enemies will have noted how easily the great superpower buckled under a crisis that required cooperation, trust in government, and public communication. With proper planning on their part, I shudder to think about the possibilities.

The good news is that only with our cooperation can adversaries continue to succeed. If we choose to resist, we will win. So, let’s begin your training, shall we?

A Primer on Disinformation

The brilliance of modern disinformation is that it plays on the paranoia already found in American politics. These efforts amp up at precise moments, tempting us when we are most vulnerable, and because they build upon what is already there, they’re not selling us something new. They’re selling something we already want — they’re offering the subscription.

Operatives observe us, prepare, and wait for a disaster or naturally divisive event, like patient predators ready to strike. When that moment comes, a firehose of information blasts into the American information landscape — every avenue of information, including news, social media, journalism, and research. A list of 15–20 topics would cover nearly all situations with relative certainty.

The threat of “fake news” is greatest during times of crisis. It isn’t the crisis itself that nefarious actors need; it’s the uncertainty. In a crisis, our information processing takes a predictable trajectory, so exploiting those changes makes for easy prey.

Knowing these tendencies accompany a crisis, what ways can you think of to make things worse? If you thought of it, adversaries probably have too.

Operatives present us with a custom-tailored story that affirms our existing beliefs. The story will be simple, which makes the situation easy to understand. Who could resist? We’re unlikely to challenge our current opinions in a crisis. That means whatever we already think is going to play a role in how we assess information. Instead of asking if the new information disproves something we previously assumed, we interpret the additional details under the premise that the previous beliefs are true. Of course, this happens outside of a crisis too.

Crisis Thinking Example

Assume the following evidence and details are true:

  • Local water testing shows that a chemical spill is much worse than previously reported.

  • The danger means local businesses must shut down and people must evacuate, despite being told otherwise earlier.

  • Officials thought they contained it to a small area, but now it’s clear the spill is far worse.

  • Local reporters independently confirmed the findings and sequence of events using the Freedom of Information Act.

  • Evacuation may cost you your job, and you’re worried about money.

If a person relies on existing beliefs and the first message heard, they might think:

  • There is no need to evacuate. They told us that. The damage wasn’t this bad. Why are they changing their story? I don’t think they’re being honest. Maybe they want to hurt [insert name(s) of current official(s)] or wreck the economy to swoop in while rates are low. I’m not letting that cost me my job. I’m not a sheep. I better not trust anyone.

  • There was no need to evacuate before. They can’t keep their story straight. Do they expect me to believe they didn’t know all along? They must have hidden how bad it really was to save themselves from a political scandal, but now — now the truth is out. This is outrageous. I cannot trust anyone.

Both lines of thought assume that the information from before was correct or had more meaning than it did. Rather than updating what they know about the situation, they interpret the new information in a way that affirms the earlier messages. The outcome in both instances is the inability to tell fact from fiction.

Potential thoughts assessing all information and not assuming existing belief as correct:

  • They said we didn’t have to leave! Now we do? This is confusing, but maybe they underestimated the problem or didn’t have all the information they do now. I need to keep tabs on this. We better evacuate.

Can you see potential opportunities for a nefarious actor to create problems during a crisis?

  • Someone could manufacture documents or emails and leak them with authentic documents to make it look like officials knew about the extent of the damage from the start. Bad actors could frame it to look like a failed cover-up, complete with paid or blackmailed witnesses. It could get messy if domestic politicians play along because it’s politically helpful.

  • Someone could publish computer-generated fake conversations of politicians scheming over how to make the crisis appear worse so they can use the crisis to win the next election. This could generate mistrust in political establishments, lead to disengagement in society, or trigger violence.

  • Assuming the details in the prior point, instead of simply sowing discord, the aim could be to increase the number of civilians who stay in harm’s way. This would hurt the local economy, traumatize a community, and erode public trust. If it was something extremely toxic, this could have catastrophic consequences. A similar skewing of risk perception could weaponize a natural outbreak or natural disaster against us for the same purpose.

Even if none of this is true, most Americans will have no trouble imagining these scenarios playing out in real life. Perhaps you already know who would share what on social media. Within 48 hours these stories would circle the internet.

By the time fact-checkers got their shoes on, the domestic terror threat known as QAnon would have inserted the narrative into its sacred text. People who want to believe the story will reject additional information as “fake news.”

The damage will be done.

Uncertainty Makes Us Think Like Rats on Meth

A Yale study from May 2020 found that our response to uncertainty looks a lot like the behavior of laboratory rats that chronically use methamphetamines. Uncertainty can come from a disaster, the chaos following a terrorist attack, or it can come from a highly consequential unknown, like anxiety over the outcome of an election.

A crisis is where nuance goes to die and where inconvenient details do not exist. Uncertainty may be so burdensome that it’s more distressing than the actual negative event. In a study, researchers had two groups of participants. One group knew whether they would get shocked, but the other only knew it might happen. Those who didn’t know whether they would be shocked felt more pain than those who were told definitively they were getting shocked. The shock was the same, but the experience was not.

Information made the difference. That matters. Instinct often leads people in charge of a group to fear panic. Often it’s well-meaning, usually paternalistic. Sometimes it’s condescending, but in either case, it’s not an instinct we should follow.

Panic — meant in the way popular culture speaks of it — comes from not enough or conflicting information. Genuine panic is rare. When a person urinates because of extreme fear or someone’s body freezes up, leaving them in the way of an oncoming train, that is panic. Disaster communicators stress that fear of panic is misguided, harmful, and never an acceptable reason to withhold information.

When we say panic, if we mean people stockpiling or behaving in other extreme ways, what people are doing makes sense in context. Consider how you might act if you didn’t know whether you’re being told the truth and were unsure that anyone would protect you.

In that situation, a sane response is hoarding. That’s not panic; it’s an instinct to survive. It’s the predictable consequence of a communication failure.

Putting quality information out in a timely fashion is only part of the struggle, because getting people to “update” their information is another matter — additional evidence may not stick. To understand the power uncertainty has over our behavior and thinking, let us turn to another experiment.

Researchers had people who ranged in paranoia from low- to high-levels play a card game. Divided into a low-paranoia group and a high-paranoia group, the two categories differed in how they played the card game. Upon introducing uncertainty into the game, the low-paranoia group’s behavior changed. Now they responded like the high-paranoia group. The uncertainty was neither long-term nor life-threatening. Still, it changed their behavior and reduced their ability to learn from previous rounds in the game.

Put another way, crises may make us more receptive to conspiracy theories. The addition of depression, anxiety, and increased time on social media only further increases the chances a conspiracy theory grabs someone. A crisis is to normal times as the growing season is to the rest of the year: a conspiracy theory planted during an unpredictable time has a far better chance of taking root than one planted during an uneventful time, just as a seed planted in the growing season fares better than one planted in winter.

Persisting in an information bubble where we filter out the things we’d rather not hear can leave us feeling much more strongly than we can justify. Again, we take part in skewing our own perception. These situations promote biases that affect our memory and perception of reality. Collective misremembering is a memory distortion also known as “the misinformation effect.”

This phenomenon happens when something we learn in the present affects the way we recall a memory. A common example is from Forrest Gump. Most people remember the main character saying, “Life is like a box of chocolates,” except that isn’t the quote. Still, you may think you remember it. Instead, Tom Hanks said, “Life was like a box of chocolates.”

If you had that memory wrong, there are probably others you’ve unknowingly edited.

Why Are Enemies So Focused on Disinforming Us?

The biggest producer of disinformation is working to undermine democracy and destabilize society. The Internet Research Agency (IRA), a Russian operation, is the most infamous disinformation creator. Other actors sometimes have financial motives. Fake news generates a lot of clicks.

In one instance, an entrepreneurial teen in Eastern Europe started a website that raked in thousands in US Dollars by appealing to highly partisan American audiences. One can make a lot of money if they set aside the harm they’re doing. Still, the most common goal is breaking down democracy from the inside.

When you try to name the countries that could compete with the US in a battle of military strength, the reason becomes obvious. Psyops are also cheaper than a battle, and they may be more effective. Have you ever wondered whether the CIA killed President Kennedy? That was a well-known Russian disinformation story, pushed almost immediately after JFK’s death.

Was the moon landing fake? They likely wanted to sow doubt about our accomplishments. Their home audience must have been receptive to their campaigns as well. Nearly 60% of Russians surveyed do not believe the US landed on the moon.

[Image: A cartoon implying the collusion of scientists with the US military, from the 31 October 1986 issue of Pravda.]

This narrative plays a role even today. COVID-19 conspiracies build on several past operations.

Will Covid conspiracies last just as long? 

For our sake, I hope not, but let’s look at what makes a suitable topic for disinformation and why such topics stick with us. If we can’t identify them, we can’t get rid of them.

We know operatives repurpose existing narratives and stir the pot where tension already exists. That frees them from the more arduous task of getting people to believe or not believe something specific. They follow our lead. Operatives create content designed to entice our insatiable desire to “own” our opponents with “a-ha!” stories.

Dis-informers generate evidence of a world where everyone we don’t like is wrong and everything we like is right. Buying into that fantasy comes at a high price: democracy and civil society.

Have you argued over one of these? Perhaps you feel your blood boiling just thinking about them. Disinformation may be responsible for some of it.

Disinformation, when it’s successful, leaves people unable to tell what is true. While disinformers may have candidates and political movements they support, that support or suppression has more to do with self-service than a shared ideology. (See Brexit & Cambridge Analytica) The effect of these campaigns is subtle, affecting you like radiation. When stories tell us what we want to hear, we scrutinize much less.

These techniques wage war on information, and not figuratively. A former military psyops operation that became a public influence mercenary, working for the highest bidder, employs tactics formerly used against our enemies. These weapons differ from physical ones, though: they require some level of cooperation, however small. It’s not that we know something is false; it’s that we aren’t asking. In some sense, disinformation works so well because we are eager to cooperate right now.

Why would anyone accept disinformation, you wonder? It feeds our egos, proving that we are right and our opposition is wrong — always. It scratches that arrogant itch that we all have and should not feed, lest it further erode our ability to discern fact from fiction. The experience is intoxicating, so we head back for more. Many probably don’t realize what they’re doing, but slowly it happens. You unfriend a person here and there; you stop reading this or that paper.

Before long, you live in a world of your own making. The result is a public that is far less willing or able to have meaningful dialogue. It also stokes some serious outrage because everyone feels like they’re always right and the subject of a massive conspiracy to end them. That would tick off even the meekest among us. In reality, the never-ending, massive conspiracy that is always against us (weird how we always get it right, eh?) exists only in our minds.

Now, the situation is so extreme that Americans admit to wanting each other to drop dead, which is disturbing on its own. The by-product of this preoccupation with one another is that we aren’t paying attention to real enemies. That’s the whole point, of course. In that environment, even a small challenge is catastrophic. It’s like herding angry cats — with rabies.

The US, the nation with more resources and deeper wells of knowledge than any other country, could not control a fairly straightforward outbreak. This is the power of disinformation. The US saw only a handful of SARS cases and not a single death. Disease control and prevention is something we have done for decades, something for which we were once the gold standard.

Much of that failure flowed from a loss of public trust thanks to egregious disaster communication mistakes, egos, disinformation, and a little immaturity. Hell hath no fury like an American who has never read the Constitution or books on American history if he or she suspects you mean to take his or her “freedom.”

Examples of Disinformation Aimed at the US

One of the richest resources for examples of public manipulation efforts in the US is the Senate Intelligence Report on Russian Efforts to Sow Discord. The report details the ads that appeared on social media platforms starting in 2015. The operatives chose topics based on what was divisive in the US. They adapted and understood American culture well. They lay in wait for opportune moments. After the shooting of Michael Brown in Ferguson, Missouri, the IRA turned on the disinformation firehose within hours.

Their success isn’t only thanks to the right topics and times. They also handpick the right audience. They tailor small campaigns that speak to us and provoke visceral emotional reactions. In short, they discovered what would trigger outrage and put it in front of you.

Each example pairs an image with its targeting details: a location, like a city or state; interests, like the NRA; and demographics to include or exclude, like age, sex, education, income, affiliations, religion, and more.

Advertisement #1: Don’t Shoot

Advertisement #2: Being Patriotic

Advertisement #3: Defend the 2nd

Most ads reinforce beliefs or provoke fear. Nefarious actors chose the most controversial topics. In the target details, “RUB” after Ad Spend abbreviates rubles, the Russian currency. Race, immigration, gun rights, and pseudoscience/anti-science are all popular disinformation subjects.

Congress Taps Research Groups to Investigate Disinformation Tactics & Tropes

Now that you have some idea how disinformation operatives target Americans, identifying material will be that much easier. You may even remember images and posts you suspect may have been disinformation.

How common was this exactly, and how many people saw these posts and interacted with them? The quick answer — a lot.

Facebook Ads and Stats

They didn’t just show us divisive content and impersonate Americans. Part of the effort included diverting traffic to other websites. Some were owned by the Russian IRA, but some were real websites owned by Americans that they felt would best rile up the public. Remember, disinformation wins because it relies on us indulging our worst impulses.

If some of these look familiar, don’t feel alone. I, too, recognized a lot of these, especially the ones that ran right before the 2016 election. That may have to do with living in a swing state at the time, but I’m fairly certain they influenced how I felt about social issues.

Instagram Ads and Stats

Twitter Ads and Stats

Ads and Stats for YouTube

Next time we will talk about disinformation operations that have translated into Americans taking action in real life.


Visit Hoaxlines Disinformation Database