This article is republished from SciLight, an independent science policy publication on Substack.
In December 2023, I moved from Washington, DC proper to the suburbs. My husband and I, and our two dogs and cat, simply needed more room than the one-bedroom condo we could afford in the city. Six months later, we’re really happy with our new home, although suburban life is quite different.
One of the ways suburban life is different is the sheer amount of solicitation. We’ve had numerous folks knock on our door to sell their wares or services.
One solicitor, in particular, still stands out to me: a young woman who knocked on my door in January, only a few weeks after my husband and I had moved in. It was pouring rain when the knock came at my door, and the dog alarms started to sound. I looked through my door’s peephole. A young woman with long black hair stood on my porch in a raincoat, holding a notepad and clipboard. I opened the door and stepped onto the porch to chat with her.
She told me that she was a city official visiting new homeowners. She then proceeded to tell me that a couple of months ago, the city had discovered the water supply was contaminated. She was there to test my water to make sure it was safe and, if it wasn’t, to give me the opportunity to have the city install new filtration devices.
My emotions took hold of me, and I felt fear. I had just moved into this city, and they didn’t alert me about a serious water pollution issue?
“If you have just 5 minutes, I can come into your home and collect the water samples needed,” the young woman told me.
The fear that this woman had stoked in me, and the fact that she wanted to get into my house so quickly, gave me pause. Surely the city would have told me about a serious water pollution issue, and if they hadn’t, then there was a serious problem here. Right?
So, I asked the young woman, “You said you were with the city, right?” “No,” she replied, “but we do work with the city.”
Odd—I was pretty certain she had said she worked for the city at the beginning of our conversation.
“I’m sorry, but I don’t think that I have time today, and I’d need to put my dogs up before letting someone in my home. Can you email or call me to schedule something in the future?”
“Sir, it’ll just take 5 minutes and I’m happy to wait for you to put your dogs away,” she replied.
The pushy behavior made me more suspicious—I knew then that I definitely would need to check with the city first. “I’m sorry, I just can’t today. Can you leave me a card with your information?”
“I usually have one, but because of the rain I forgot them all today,” she said.
“That’s ok. Feel free to stop by another time.”
I never saw that young woman again.
A quick Google search of my weird situation validated my concerns—this young woman was serving me disinformation. She did not work for the city, or even with the city. There was no water pollution concern in my city—not now, not months ago. If I had allowed her to “test” my water, the scammers’ “lab” would have sent me a very concerning report with more disinformation about the level of pollutants in my water. The disinformation’s ultimate goal: get me to purchase an expensive filtration system that doesn’t do anything. Apparently, this is a common scam that plays out across the nation.
I’m glad that my brain sent up some warning signals—otherwise I could’ve lost a good deal of money solving a problem that wasn’t real. But does everyone’s brain send out these same warning signals? No, and that’s a huge problem in a world where mis- and disinformation are on the rise, and where generative artificial intelligence will make it ever more difficult to tell fact from fiction.
Large-scale disinformation campaigns, when effective, can have enormous impact: persuading people that an election was stolen, convincing them not to take a vaccine, or leading them to believe that climate change is a hoax.
Lucky for us, psychologists have been studying human susceptibility to mis- and disinformation. By better understanding when, why, and how humans take in mis- and disinformation, psychologists can also develop strategies to combat it. One of the strategies showing promise is called “inoculation.”
Inoculating against mis- and disinformation
If inoculation makes you think of vaccines, then you likely already understand the strategy here to combat mis- and disinformation. Psychological inoculation is like a vaccine, but for your brain. Vaccines prompt your immune system to produce antibodies against a virus, so it’s better prepared to fight that virus when it enters your body in the future. A psychological inoculation triggers “mental antibodies” that train your brain to spot mis- and disinformation so you’re not persuaded by it when you come across it in the future.
Disinformation researchers Jon Roozenbeek and Sander van der Linden explain that the idea of psychological inoculation has been around since the 1960s, when there were concerns that captured American soldiers might be brainwashed by enemy troops. William McGuire, a social psychologist, suggested a “vaccine for a brainwash,” and thus the idea of psychological inoculation was born.
Roozenbeek and van der Linden have shown that inoculation strategies can be effective. In one experiment, they randomized over 6,000 participants to watch either a video about the strategies by which mis- and disinformation spread or a neutral control video. These videos are really great, and you can view them yourself. Those assigned the videos about disinformation strategies were significantly better than the control group at recognizing manipulation techniques, discerning trustworthy from untrustworthy content, and deciding what to share. The results of this study are published in Science Advances.
The nice thing about the videos used in the study linked above is that they’re short. And short videos can be distributed to lots of people—like in ads on YouTube that you’re forced to watch (if you don’t subscribe to YouTube Premium). While the videos are effective in inoculating folks, they’re not as effective as other strategies that may be more engaging.
For example, Roozenbeek and van der Linden have also used gaming as a tool for psychological inoculation. In the game, which you can play here, you take on the role of a fake news producer and master the strategies of spreading misinformation. Those who play the game are better able to identify misinformation, and the effect size for retaining that ability is larger for those who played the game than for those who watched the YouTube videos. So, the game may be more effective at helping individuals spot future misinformation than watching YouTube videos about disinformation techniques.
In another study, published in Nature by McPhedran and colleagues, psychological inoculation was shown to be more effective than false tags. False tags are used by social media sites, such as Facebook, to alert you that a post’s content may contain mis- or disinformation. In the study, participants viewed social media posts based on real content that either contained misinformation or did not. Participants could interact with the posts in several ways: share them, comment on them, respond with an emoji, or “like” them. Prior to viewing the posts, some participants received inoculation training. Those who received inoculation training were far less likely to engage with the posts in any way.
Can we scale inoculation?
For psychological inoculation to be effective at a large scale, Dr. Gordon Pennycook says that researchers need to identify two things: (a) tactics that are both prevalent and diagnostic of misinformation, and (b) tactics that are simple enough that people can learn the heuristics to identify them. Pennycook says that research like that of Roozenbeek and van der Linden has focused on (b), leaving (a) largely assumed.
For example, Pennycook says that the ability to identify a mis-/disinformation strategy, such as manipulation, may not persist in inoculated individuals outside of a lab. In a talk given earlier this year, he noted that no study has yet found that individuals inoculated in a lab setting go home and share less manipulative content than they did prior to being inoculated. This kind of real-world data will need to be collected to show that inoculation continues to be effective outside of an experiment.
Another criticism of psychological inoculation comes from Pennycook and Dan Williams, a lecturer in philosophy at the University of Sussex. Both argue that if you prime an individual to be cognizant of a specific technique, such as the use of emotional language in a headline, you may simply make them more skeptical of all information, whether that information is true or not. So, inoculation techniques may just be training people to be skeptical of any headline or social media post that uses emotional language, not to discern whether misinformation is actually present. If you want to learn more about Pennycook and van der Linden’s views, and hear them discuss these issues and their research, check out a discussion they had together.
One thing that is clear to me is that an alarm sounded in my brain when a scammer tried to convince me my water was toxic. Maybe I was inoculated because my parents never trusted solicitors when I was growing up. Or maybe my scientific training helped me critically analyze the situation I was in. Whatever it was, we need more alarms like that going off in everyone’s head when they encounter mis- and disinformation.