We live in an age of widespread public deception, where aggressive forms of narrative manipulation spread despair and undermine trust in the commons.
Disinformation entrepreneurs harass and intimidate scientists in an attempt to silence them.
They create noise and stir up misdirected public outrage.
During elections like Canada’s ending today, a shadow war of false information rages on the internet, polluting our political discourse. Even after the winner is announced, we may see efforts to falsely sow mistrust in the voting process, further damaging faith in our collective ability to choose how we are governed.
For a forthcoming book titled Silencing Science, a colleague, Mark Shakespear, and I spent months speaking to experts about how to counter disinformation rooted in conspiracy theories, violent anti-government and anti-expert ideology, and flawed views of freedom and democracy.
With all the swirling lies, tribalism and name-calling around us, how do we begin to communicate and reconnect authentically again?
People are turning off and tuning out. As a result, we are closing down the public square, the platform for informed and respectful dialogue so vital to the preservation of a functioning democracy.
How can we re-create the space for higher-quality public debates?
How do we turn destructive conflict into healthy conflict and foster a better role for science in shaping constructive public conversations?
My journey of inquiry convinced me that an adversarial system has gone too far — and that each of us can play a more positive role. Here are some insights and tools we can use to reverse course.
Richard Petty is a distinguished professor of psychology at Ohio State University and a world-leading expert on persuasion. When I asked Petty what his work has taught him about why people fall for disinformation, he made six points.
We process information and disinformation in the same way. Our brains are wired to take in both as if they’re true. This stems from evolution: it’s too much work to analyze every single thing we encounter to assess whether it’s true or not.
Determining that something is false is much more work than determining that something is true. Since the default setting in our brains is to believe that information we encounter is true, coding something as false is like adding a tag to the piece of information that says “false.” Recalling information with this extra tag is harder. This is why we need to focus on what’s true, not what’s true versus what’s false.
We judge whether information is true by assessing how many people are saying the same thing, because most of the time, what a majority of people believe is true.
Everyone has their own sources of information with social media platforms tailored to their interests. Whatever you click on, you get more of. If I were to click on a social media post saying that the 2020 U.S. election was stolen, I’m now more likely to see another post that repeats this disinformation.
We assess what’s true by who said it. If a person who is credible says something, I will think it’s more likely to be true.
Repetition is another way to assess truth. This isn’t just the number of people who say something. The more often we encounter information, the more likely we are to think that information is true.
Positive reciprocity is a social norm in which someone responds to our positive actions with similar positive actions. It matters because we need to speak up, and speaking up in a public square polluted by toxic disinformation is not easy: not only do you need to counter falsehoods, you need to deal with divisiveness and intense feelings of outrage.
Propaganda works through negative emotions. The positive reciprocity that comes from listening, respect, two-sided persuasion and trust building can help.
Professor Petty has been doing research on what he calls two-sided messaging and using it to influence people with strongly held beliefs. Those who think they’re morally right and confident in what they believe are the hardest people to change.
A one-sided message simply presents a message or an argument. A two-sided message acknowledges the merits of some points on the opposing side, along with the message or argument.
An example is a study Petty did on mask wearing. Instead of just saying “Masks are great, they protect you from getting COVID and spreading it,” a two-sided message acknowledges that masks are uncomfortable and that we don’t like government telling us what to do. These are two reasons people oppose masks.
In short, Petty and his colleagues showed that when aiming to influence those most morally committed to their viewpoint, acknowledging some merit in their position was more effective than presenting only a strong one-sided argument. It’s the positive reciprocity of my acknowledging your side that opens you up to listening to my message.
This is a useful tool for countering disinformation, which is fuelled by the polarization of negative reciprocity.
Peter Kim is a professor in the Marshall School of Business at the University of Southern California. He studies dispute resolution and how trust works, stating:
“Trust and long-lasting healthy relationships are built on respect and reciprocity. Listening and showing that you value others’ views helps foster this.
“If a one-sided, black-and-white message is sent out, a one-sided, black-and-white message is likely to be sent back. But if you listen and show that you care about another’s views, they are more likely to listen to you and take you seriously. This is the difference between dialogue and domination.”
Trust can be built in small group conversations or in public speeches by considering the views of others alongside your own. When talking to someone who disagrees with you, it’s even more important to show them that you’ve considered their point of view, and that it is important to you. Even if you don’t change their mind on the issue, doing this will make them more likely to listen to you and take your views seriously.
Professor Kim believes we should pursue this pluralistic dialogue approach instead of one that involves dominating the conversation and forcing your beliefs on others. This involves building a shared social truth and moving beyond the paradigm of trying to dominate the conversation with our own personal truth.
Guy Itzchakov is an associate professor in the department of human services at the University of Haifa. He researches what he calls high-quality listening. Itzchakov’s research shows that when people are listened to well, they become less polarized and more complex in their attitudes.
High-quality listening reduces extreme attitudes and cultivates positive reciprocity. It means being in a conversation for the other person’s sake. High-quality listening consists of three dimensions: attention, comprehension/understanding and positive intention.
Attention means making some eye contact and not being distracted by your phone. Research shows that we can process speech roughly twice as fast as another person can speak, and we fill that gap with internal thoughts. People listening to someone often have thoughts in their heads that distract from what the speaker is saying. A good listener sets aside these distracting thoughts so that they can listen properly.
Comprehension/understanding relates to how humans have an innate need to feel understood; it’s the feeling that another person “gets us.” Itzchakov observed in his research that good listening is a behaviour that facilitates “felt understanding” in a speaker.
Positive or benevolent intention towards a speaker is the feeling that the listener has a non-judgmental approach. People often confuse this with agreement. However, I don’t have to agree with your opinion in order to listen to you well. I can be non-judgmental towards you even if I disagree. Being non-judgmental means that I acknowledge your autonomy and your freedom to speak your mind.
According to Itzchakov, listening is not a passive act. The listener determines at least 50 per cent of where the conversation will go through back-channel behaviour, which can be verbal or non-verbal. Itzchakov states:
“Listening is a reciprocal process. Even if we disagree, if you listen to me well, it’s highly likely that I will listen well to you and vice versa. If we create this positive reciprocity, it’s a major step to finding common ground. My research consistently found that when speakers are being listened to, their attitudes become more complex. They acknowledge that there’s another aspect. This is the core: if we want to reduce polarization we need people to acknowledge the complexity.”
Part of countering disinformation with good storytelling is something U.S. communications expert, author and political strategist Anat Shenker-Osorio calls message ordering. Starting a message with anger about a problem, or calling someone a liar, reinforces the sense that nothing is likely to change. Instead Shenker-Osorio recommends something called a value sandwich: value, villain, vision.
A value sandwich starts with a shared value. Next, it points to the problem that stands in the way of the shared value being honoured. It’s here that you name the villain and explain what they’re doing and what’s motivating them. Finally, you share the vision, the solution, the “We can fix this” message that’s consistent with the shared value.
This allows you to avoid getting mired in polarized wheel spinning. Instead you can have the conversation you want to have, not the conversation your opponent wants you to have.
Applying the value sandwich idea to disinformation, we start with what is true, call out the lie, then reaffirm what is true while providing solutions.
Shenker-Osorio believes there should be twice as much truth as any reference to what is untrue. For example, applying this to climate change disinformation, you start with: “Today across our zip codes, communities and countries we see dangerously extreme weather, from horrific droughts to torrential rains; we breathe contaminated air; and we fear for the water in our children’s cups.” This first part states what’s true.
The second part is: “But today a handful of corporations or a handful of fossil fuel billionaires or a handful of greedy oil CEOs want to spread lies about the causes and consequences of these dangerous changes to all that surrounds us. They pay off politicians, hoping to spread confusion and fuel fear about the clean energy future that is ours for the taking.”
Part 3 is: “But we know that when we act together and power our country through the wind and the sun, we can make this a place that we’re proud to leave to all our kids.”
If you work as a scientist, as an expert or for an institution that is undermined by disinformation, the work of John Cook, senior research fellow at the University of Melbourne, and Stephan Lewandowsky, a cognitive scientist from the University of Bristol, on pre-bunking and debunking is invaluable.
Debunking occurs after people have been exposed to false or misleading information. Pre-bunking takes place before such exposure. This is sometimes called inoculation because the concept is to build immunity by exposing people to a weak version of disinformation.
By pre-bunking and debunking, you can weaken misinformation in three ways: fact-based, technique-based and source-based.
A source-based statement might be: “Here is this argument coming from an organization that is a far-right think tank funded by a fossil fuel company.”
Fact-based would be: “Here is an argument casting doubt on the greenhouse effect, but here are the facts of how the greenhouse effect actually works.” This helps show how that argument is wrong.
Technique-based pre-bunking states: “Here is the technique that this misinformation uses to cast doubt on facts, and this is how the technique casts doubt.” This is probably the most appropriate technique for pre-bunking because you’re trying to provide general immunity against future encounters with misinformation, but you don’t know exactly what form it will take.
Fact-based and source-based inoculations are specific, whereas technique-based inoculation is general: it protects people against that technique in a variety of contexts where misinformation is articulated. Technique-based inoculation also lets you sidestep political triggers, because nobody likes to be misled, regardless of their politics.
We need to develop stronger counter-disinformation narratives and messages that call out the creators of disinformation, inoculating the public against this propaganda while trust in experts and reputable institutions is rebuilt.
The ideas I’ve presented here make up a tool kit empowering citizens to resist being fooled by disinformation and to push back against it. Let’s all learn and practise the arts of technique-based pre-bunking and positive reciprocity. Each of us can engage in trust building, high-quality listening, two-sided messaging and value sandwiches.
We can speak up against an adversarial system that has gone too far. We can change the tone and content of our political conversation.
At stake is nothing less than our collective grasp of reality and the survival of our democratic ethos.