Can we inoculate ourselves against misinformation and conspiracy theories in the way we do for infectious diseases? Instead of debunking, can we “pre-bunk?” Sander van der Linden, co-founder of Inoculation Science, has created games that offer to do just that. Baratunde plays one of them and speaks with Sander about online misinformation campaigns, polarization, and how we can better protect ourselves.
Go to howtocitizen.com to sign up for show news, AND (coming soon!) to start your How to Citizen Practice.
Please show your support for the show in the form of a review and rating. It makes a huge difference with the algorithmic overlords!
We are grateful to Sander for joining us! Follow Sander at @Sander_vdLinden on Twitter, or find more of his work at inoculation.science.
- PERSONALLY REFLECT
Reflect on the game.
After you play the game at https://inoculation.science and watch a few videos, reflect on how they made you feel. Are there online experiences you’ve had that make more sense once you consider you might have been intentionally manipulated? How do you think these games will affect your future online experiences?
- BECOME INFORMED
Play the game.
Point your browser over to https://inoculation.science and play their set of inoculation games. In addition to Breaking Harmony Square, which we featured in this episode, they offer games to help you limit the harm of fake news and COVID misinformation.
- PUBLICLY PARTICIPATE
Share the game.
Finally, share the games with people you care about. Friends don’t let friends spread misinformation.
Baratunde Thurston 0:02
Welcome to How to Citizen with Baratunde, a podcast that reimagines citizen as a verb, not a legal status. This season is all about tech and how it can bring us together instead of tearing us apart. We're bringing you the people using technology for so much more than revenue and user growth; they're using it to help us citizen.
Alright, play. Let's destroy society! Congratulations, you're hired. Welcome to your first day as our new Chief Disinformation Officer. Let's get started. We hired you to sow discord and chaos on Harmony Square. That's what's up. That's what I'm here for. Who are we about to f*** up?! So, right now I'm playing Harmony Square. Normally this game doesn't have music or sound, but my team took some creative liberties to bring you into the experience with me, and clearly I'm having way too much fun about to sow some chaos. Harmony Square is a green and pleasant place. It's famous for its living statue, its majestic pond swan and its annual Pineapple Pizza Festival. Any place that does a pineapple pizza festival deserves some discord. Yeah, pineapple pizza, that's disgusting. Now the goal of this game is to disturb this hypothetical quaint small town's peace and quiet by fomenting internal divisions and pitting its residents against each other. I am oddly excited about the prospect. There are no bears here, never have been, but Harmony Square loves elections so much, they keep voting for a bear patroller anyway. One of the ways you get to divide this absurd little town is through an election. It's for bear patrollers, and there's only one candidate, Ashley Ploog. What kind of language do you think is most likely to ruin the bear patroller election? Tears and fears or facts and logic? Tears and fears. So, as I'm going through the game, I'm getting presented with these choices for how I intervene, and I got real excited about the chance to make fake news memes. Since you're so clever, why don't you choose some electrifying buzzwords to include in your meme? I get to pick three buzzwords. Oh, corrupt, abuse and lie. Yeah, we're taking Ploog down. Now you can put together an emotionally-exploitative meme. So, I decided to go with this meme: it shows two white dudes handshaking above the table, but underneath, money's changing hands. Unopposed is just a fancy word for corrupt.
Careful, though, you posted some content that wasn't emotionally exploitative. It cost you a couple of lives. The Megaphone will be much more successful if you use the right buzzwords. Down! Alright, Coach. Yo, this is mad devious. This is by far the best thing I've seen that explains this, like I'm Breitbart. You know, this is wonderful, and by wonderful I mean terrible. This is a really twisted game because I thought I was playing it aggressively by choosing to disparage a newscaster on a small scale, you know, talking trash to a friend or family member. It turns out the game is like, "that's not devious enough. You must scale your deception." So, it encouraged me to create a more public platform for the disinformation. To Hell with responsibility, let's crank it up to 11. I'm trying to destroy the town. This game has really gotten me... why'd they even call it Harmony Square? I think that's what bothers me. I want to make it Discord Square. Let's go. Can we get some tiny-violin music up in here? You did it, you ruined the biggest moment in Harmony Square's history; they're all at each other's throats now. Okay, in the end, let's see, still counting, you have reached 56,884 followers. You did better than 86% of the people. Who knew simulating the emotional destruction of a small town could be so much fun, and that I'd be so good at it? I mean, it's fun, but it's also scary. Now, Harmony Square is a fictional place obsessed with democracy, electing bear patrollers, but other than the bear patrol thing, it's pretty similar to the world we live in right now.
Archival (News) 5:00
Election has cast a light on misinformation online and it's...
...false election stories from hoax sites...
...The dangers of fake news.
Baratunde Thurston 5:08
Ever since the 2016 presidential election, our Internet has become increasingly divisive, with fake news spreading like wildfire, and combating all this misinformation can feel like playing Whac-A-Mole. You take one down, three more pop up; but in this case, millions more pop up, whether it's misleading headlines, divisive memes or trolls, and these little rodents are burrowing into the fabric of our society. It's kind of maddening, and admittedly disgusting with the whole rodent metaphor. A few years into this fake news ecosystem, social scientists are seeing how easily we can fall prey to some of our basest human instincts, and fear is a hell of a drug and a terrible motivator as we click, share and retweet about the pandemic, racial injustice, climate change, and more. Many of us have become aware of how deep and dark the pit of social media can be, but few of us have had the chance to drive that dissent like I got to do in the fake world of Harmony Square; but Harmony Square isn't just reflecting real-world problems in a cute way, it's helping us battle them too. Yes, a game can help us fight misinformation and disinformation. Believe it or not, this game was preparing me to fight off trolls by getting inside their heads. In other words, it's like a fake news vaccine. Sander van der Linden is one of the people behind Harmony Square. He's a professor of psychology at the University of Cambridge, studying how people are influenced by social media and misinformation. His team of researchers has partnered with game developers to create several games just like Harmony Square. They call themselves, get this, Bad News Games. Essentially, these are choose-your-own-adventure games, all free to play online, and they're a real way that science can go out and reach people, but even he admits it's an ongoing struggle.
Sander van der Linden 7:24
If I told you that down your street, I went to one of the restaurants and I got food poisoning, it was real bad, a week later, I tell you, "oh, look, listen, it actually wasn't that restaurant. It was another one." Every time you're gonna pass by that restaurant, you're gonna think food poisoning.
Baratunde Thurston 7:38
After the break, Sander's prescription for our social media ills, and a lesson in Dutch salutations.
What's up Sander? Welcome to How to Citizen.
Sander van der Linden 7:57
Pleasure to be on the show.
Baratunde Thurston 7:58
Sander van der Linden 7:59
That's close, that's close, yeah.
Baratunde Thurston 8:01
Correct my terrible Dutch. What's the proper pronunciation?
Sander van der Linden 8:04
Baratunde Thurston 8:05
Sander van der Linden 8:06
Wow, that's good. That's good, yeah. Yeah, you know, people say, "oh, you know, Dutch and German's pretty much the same," but we feel strongly that there's a nuanced difference there.
Baratunde Thurston 8:16
Your field of specialty is timely and it's fascinating, about human decision-making, and influence, and judgment, and communication around all those things. Can you break down how you describe what you research?
Sander van der Linden 8:30
At a very basic level, I try to research how people are influenced by information, how people are persuaded by ideas and information, and what I've become really interested in in the last few years is how we can help people resist and detect attempts to manipulate us online, but also offline, when it comes to fake news and misinformation, disinformation, all of those things. I also study, you know, the nature of how information spreads on social media and how that influences people, and what is happening on these platforms, why people engage in flame wars, and why we see polarization. Is social media a good thing or a bad thing? So, very difficult and complex questions.
Baratunde Thurston 9:14
When did you start down this path?
Sander van der Linden 9:16
I started out pretty late. I didn't necessarily come from an academic-oriented family, so I got a job. I thought that's the thing you need to do, you need to make some money and get a job, go into the real world. So, that's what I did and I actually ended up working at a bank and had a kind of a crisis in terms of what I was doing with my life, so I decided to quit that job, go back to school and that's how I got into academia. I think I was lucky that I got to work in a few jobs that I really didn't like, so I never looked back in terms of my own experience because what I get to do now, experiment on people, is fun. Even when I was little, I loved experimenting on people and learning about how they react, and I would set up elaborate schemes to see what people would do just because of my curiosity in human behavior.
Baratunde Thurston 10:00
I hear you using the term misinformation and, occasionally, disinformation, and out in the wild, these terms are often used together, sometimes there's a slash between them, sometimes people use them interchangeably even if they might not intend to. So, for the record and for clarity, what is misinformation? What is disinformation? How are they different?
Sander van der Linden 10:22
Yeah, that's a great question. I've defined misinformation, in a lot of the work we're doing, as information that is simply false or incorrect, so this can include things like simple journalistic errors, but it doesn't tell you whether it's misinforming people by accident or intentionally. So, for me, disinformation is misinformation coupled with some psychological intent to deceive or harm other people. That's also why people get more upset about certain kinds of disinformation than others, because we can all forgive people for making honest mistakes and errors, but it's different when someone's targeting you or actively trying to dupe you. It gets complex, so let me give you an example. Take the Chicago Tribune--which is, otherwise, a reputable outlet--I think the headline was, "Doctor Died Shortly After Receiving the COVID Vaccine." Now, these were two independent events, and one might have nothing to do with the other. There was an investigation that was ongoing, but are you misinforming people by constructing a headline in that way? So, it's not only these sort of fringe outlets spamming us with disinformation. Here's the question: did they do that intentionally? I don't know. I think these are the complex, bigger questions that we try to study.
Baratunde Thurston 11:37
The formulation of disinformation equals misinformation plus deceptive intent, that resonates with me. That's how I've tried to understand it, but what you just shared about the Chicago Tribune example reminds me that even misinformation, the innocent version of disinformation, can be harmful. Both versions breed a level of mistrust overall because just my level of doubt is raised now about vaccines, about the Chicago Tribune, about Facebook, because I just don't know, whether they intended it or not, falseness is spreading throughout the land.
Sander van der Linden 12:17
I think that's a great point because, even something that wasn't intentionally created to be harmful, let's say it was a mistake, it can then be used or weaponized by people who have a certain motivated or political view. So, if you don't like the vaccine, this is now a great example for you to start sharing, "see, you know, a doctor died because they got the vaccine," and now it can be weaponized and used in social media for a cause that maybe it wasn't intended to serve. So, something can start out as misinformation and then become disinformation.
Baratunde Thurston 12:46
So, our whole podcast is called How to Citizen, and the premise is that we all have a role to play in shaping our societies. The whole self-government thing, we believe in it, we're nerds. For that reason, I'm curious what you've learned about human behavior and decision-making in digital spaces that can make it hard for us to self-govern and participate in our society.
Sander van der Linden 13:08
You know, I think one of the lessons that I've learned is that, at a basic level, people do have a motivation to be accurate; we do want to know what's going on in the world. For most people, that's kind of a default baseline, but, then, when you're put in situations that basically thwart that internal sort of radar that you have, things can get pretty ugly. What happens when you go on social media is that there's all sorts of different incentives that appeal to people that have nothing to do with accuracy. What are other powerful forces that influence our decision-making? You go online, you see something that's not only been shared by somebody you trust and know--that type of information gets priority from people because we use it as a heuristic. If information comes from somebody you already know and trust, there's an implicit assumption that it's been vetted and verified by that person, and they wouldn't share anything to dupe you. Now, it also has a thousand likes, it's been shared 50,000 times; that's a powerful indicator that something important is going on, that you might want to share it, as well. Then, there's the filter, so Facebook is filtering things based on your prior click behavior and things that you've looked at. Then, you know, you're faced with making a decision: what are you sharing online? What are you paying attention to? In one study, we looked at millions and millions of posts on Twitter and Facebook, and what's the stuff that generates the most engagement: it's posts that derogate the outgroup. So, if you're a liberal, the outgroup is a conservative; if you're conservative, the outgroup is a liberal. So, we coded posts for whether they were positive or negative about liberals or conservatives, and across millions of posts, the number one thing that got the most traction is basically trash talk about the other group. That is what gets engagement on social media.
So, when you come in all accuracy-motivated, and calm, and honest, and you get distracted by this incentive to start hating on people essentially, because that's what gets you likes, that's what gets promoted, that's what gets shared, that's what's the norm on the platform, then that's what influences your judgments and behaviors much more so than other facts. It's kind of like you're back in high school when you're on social media. That social pressure is back, and how we fix that is a big problem, so we certainly have some ideas; whether or not social media companies are keen on them is another matter. I should say that I do advise social media companies. Part of what they're doing is, on a micro level, they're trying to fight misinformation, they're trying to get fact checkers on board; I help them debunk misinformation more effectively on their platform, and these are kind of micro solutions, right? There's a problem, they try to fix it by getting more facts out there, by upgrading the way that they correct misinformation on their platform, but at the end of the day, I think what they're not thinking about is that if you really want to change the incentives that people face on social media, you're going to have to rethink the whole nature of the platform. What we want to envision is a place where people have a motivation to be accurate, to share factual information, to have constructive conversations, to have positive conversations. The other thing that they'll say, and I think this is very interesting to me, is that, "we didn't design this platform to help people be as accurate as they can be. That's not the purpose of social media."
Baratunde Thurston 16:36
They admit that?
Sander van der Linden 16:36
They admit that! "We're not promoting, we're not interested in getting everyone to have scientific, truth-driven opinions."
Baratunde Thurston 16:44
"We're not in the truth business around here."
Sander van der Linden 16:45
"We're not in the truth business." They want people to have fun, to have the conversations that they want to have, even if they're spicy, and they'll admit and say, "look, that's not our purpose. Our purpose is to let people have all kinds of conversations, and we're not going to regulate, necessarily, what people say." I think that's the issue. I think, maybe to enhance a better environment for everyone on social media, we have to just fundamentally change the incentives, which means you're not going to get as much engagement, and that's a difficult ask. If you really want to fix this problem, we're gonna have less engagement, that means you're gonna make less money, means you're gonna have different kinds of incentives, and I think that's just not a business decision they're willing to make.
Baratunde Thurston 17:26
When we come back, Sander and I get into how playing games can enhance democracy. You can do both at the same time! It's dope.
Talk to me about how someone engages in spreading wildly inaccurate information on one of these platforms. I'm talking microchips in my Moderna shot, I'm talking 5G towers pumping out COVID, even things that are counterfactual to observed reality flying around; what is the psychology of that kind of spread? Why do people keep sharing it?
Sander van der Linden 18:11
Yeah, why do people keep sharing it? One theory is what I call a more generous take on the human decision-making condition that we find ourselves in; it's called the inattention account. If you think of the brain as a computer being bombarded with information, our memories are limited and our attention span is limited; there's too much going on and we're getting distracted. If only there was a way to bring accurate information to your attention, then the problem would be solved.
Baratunde Thurston 18:43
An overwhelmed human is the charitable interpretation. What's the other one?
Sander van der Linden 18:47
The other account is a bit more nefarious, right? It suggests that people are actively biased, and that we share content because we want to promote or identify with the kind of social groups that we belong to. We have a political identity that we want to make salient to people, but it might be the case that you share content, not because you really believe it--it's not that you don't believe climate change or you don't believe in the vaccine--you're sharing it because it reinforces the narrative of your group. It makes the connections that you have with other people you care about stronger.
Baratunde Thurston 19:20
Sander van der Linden 19:21
It helps give you a sense of purpose and agency, and that you're belonging to a movement--I'm thinking about QAnon, for example--so it reinforces what we call a sense of social identity.
Baratunde Thurston 19:32
What you're describing sounds like gang colors and membership. You're literally signaling your membership, and the facts, or lack thereof, don't matter nearly as much as you waving that color. It's very understandable, and I think I can even see it from my own experience with this whole idea of when I see an article that confirms me, I'm like, "yeah, that's who I am. I knew big, evil corp was big and evil," then I'll share the hell out of that; but then, if I see some wonderfully-written defense of like, "well, why globalization has actually been a net positive," I'm like, "whatever, that's BS, somebody made that up, that's misinformation," because I actually don't want to believe it, because it challenges me, not intellectually, but identificationally or something. I already know who I am, I don't want to be someone different. I've invested a lot in this identity, so I'm not going to share something that challenges me. Then you just add fuel to that fire when you put it on a technology platform that has a financial incentive to turn up those dials and hit both explanations of why we do what we do. So, you have this concept of prebunking that I find fascinating. Can you explain?
Sander van der Linden 20:52
Yeah, absolutely. Prebunking is the idea that, rather than trying to correct something after the fact, which we usually call debunking, you try to do it preemptively; but the idea here goes further, it follows the biomedical vaccination metaphor exactly. So, just as you inject people with a weakened dose of a virus to trigger the production of antibodies in an attempt to help confer immunity against future infection, it turns out you can do the same with misinformation. When you expose people to a sufficiently weakened dose of the misinformation "virus," quote, unquote, or the techniques that are used to spread misinformation, people can build up cognitive or intellectual antibodies against them and become more resistant. So, we should all prebunk when it's possible. You know, viruses have different incubation periods, and misinformation pathogens have different periods in the sense that, even when you've already been exposed, it can still be beneficial. At some point, it's going to be too late, but prebunk when you can. If that doesn't work, we can do fact-checking in real time, and if that doesn't work, we can still try to debunk and correct things after the fact. I guess we haven't really talked about why that's less effective. Very briefly: it's less effective because once you're exposed to a falsehood, it sits in your memory, it makes friends with other things you know, and we know from research that when people acknowledge a correction, even when they acknowledge it, they continue to retrieve false details from memory about the event. I think it's something very basic. I don't know where you live, but say I told you that down your street I went to one of the restaurants, and I got food poisoning, and it was real bad. A week later, I tell you, "oh, look, listen, it actually wasn't that restaurant, it was another one." Every time you're gonna pass by that restaurant, you're gonna think food poisoning.
That's the difficult thing with corrections; it lingers in your mind because this association has been made, and that's why prebunking is ideal.
Baratunde Thurston 22:45
How did this idea emerge? Can you put me in the room or the zone that you or your team were, or wherever this idea came from? How did it emerge?
Sander van der Linden 22:55
Yeah, well, I actually can tell you that there was a psychologist in the 60s.
Archival (Psychological Warfare: A Combat Weapon in Korea) 22:59
...This is psychological warfare, or at least it's one phase...
Sander van der Linden 23:02
His name was Bill McGuire. He's no longer alive.
Archival (Psychological Warfare: A Combat Weapon in Korea) 23:05
Sander van der Linden 23:06
He developed some articles, very early articles on something he called the Inoculation Theory, which, at the time, was following the biomedical immunization metaphor...
Archival (Psychological Warfare: A Combat Weapon in Korea) 23:15
...But the use of this force as an integral part of combat has now taken on new forms...
Sander van der Linden 23:20
He was concerned, during the Korean War, about some of the prisoners of war; there was this whole paranoia about them being brainwashed at the time. Now we know that there are other explanations for why some of the soldiers voluntarily chose not to come back to the United States. One example was racism, but at the time, the predominant narrative was that the soldiers were brainwashed.
Archival (Psychological Warfare: A Combat Weapon in Korea) 23:42
...Here, also, was a chance to see directly into the communist state through the eyes of typical, average young Americans...
Sander van der Linden 23:47
So, McGuire was thinking, "well, is it possible to develop this vaccine for brainwash? How would you do that?" The kind of key solution at the time was the military and the White House, they were saying, "oh, the problem is American values aren't clear enough to people," and McGuire said, "that's actually not the issue. The issue is that the soldiers were not prepared for the type of manipulation strategies that they would be confronted with, because the Chinese camps at the time, they weren't violent, necessarily. They said, 'welcome to the other side. Here, we're going to educate you about what's really going on with communism. It's not some evil thing, and we're not going to necessarily harm you. We just want to re-educate you.'" They presented a lot of counterarguments to capitalism, they had daily lectures and classes, so I think what McGuire was trying to say was that they really hadn't anticipated an attack on the foundations of capitalism, and a lot of them started... They had no prior defenses against it.
Baratunde Thurston 24:48
Sander van der Linden 24:48
They just assumed capitalism is good.
Baratunde Thurston 24:51
They were prepared for a war of military arms and weapons. They were not prepared for a war of information.
Sander van der Linden 25:00
Baratunde Thurston 25:01
Sander van der Linden 25:02
He never got to the propaganda and the misinformation; he kind of left this idea and moved on to other ideas. It got buried for 60 years. Then I was sitting in the library one day and I came across one of his articles, and I was like, "wow, this is going to be so... If we could develop this idea further now, in this context, this is going to be so interesting." So, we kind of picked up where he left off and started actually testing this in the context of misinformation, and we thought, "how could we bring this into the 21st century?" One of the things we did is we started simulating a social media feed in a simulator, together with a gaming company and a media literacy company that we teamed up with, and a bunch of programmers--big team--and then we decided to produce some real-world interventions where people can enter what we called the disinformation simulator. They would be exposed to these weakened doses of the key techniques that are being used to deceive us online, and then we found that, over time, people can build up immunity.
Baratunde Thurston 25:59
This is so, so perfect, because simulations are used in trainings of all kinds. Pilots have flight simulators and infantry have the first-person shooter simulator, and we use games to teach. So, you built this simulator, this game to extend McGuire's thinking about inoculation theory into a more modern-day practice, not against Chinese political propaganda, but against social-media-distributed propaganda. What a fascinating path from the 60s to now.
Sander van der Linden 26:36
One of the quotes that I pulled out during this process was from the Harry Potter books, from Professor Severus Snape, who said, "if we want to fight the dark arts, then our defenses must be as flexible and inventive as the arts that we seek to undo." I think our common realization was that the dark arts of manipulation are evolving. Science has a yawn factor for a lot of people. We've got to go out of the lab and produce some things that are entertaining and fun for people, so that we can actually get this out and test it in the real world, and make it fun and entertaining, so people don't get the feeling of attending a lecture, but are actually playing a part in generating their own antibodies.
Baratunde Thurston 27:23
I've played the game that you and your team have created that's built around this inoculation theory, and I got to tell you, I'm very impressed. It revealed... It was like lifting a veil on the matrix. I was like, "oh, that's how this works! Oh, I was invested in creating disinformation." You, kind of, stimulated me as a chaos monkey, as an agent of chaos, visited upon this fictional place. Can you explain the game Harmony Square, and how it works to put into practice this inoculation theory we've been talking about?
Sander van der Linden 27:58
Yeah, you know, it's great. So, we have a couple of interventions. Harmony Square was one that focused on disinformation during elections and political disinformation. We also have Bad News, which is our general simulator, which is not specific to a particular domain; it's sort of very broad. But Harmony Square came about because there was an interest in inoculating people against foreign influence techniques that are being used to meddle with democracies and elections. Of course, it was such a big topic that we wanted to do a specialized version of some of the more general simulators that we built.
Archival (Harmony Square) 28:38
Congratulations, you are hired. Welcome to your first day as our new Chief Disinformation Officer. Harmony Square is a green and pleasant place. It's famous for its living statue, its majestic Pond Swan and its annual Pineapple Pizza Festival.
Sander van der Linden 28:54
We took the basic idea of Harmony Square, which is the last democracy on Earth.
Baratunde Thurston 28:59
Sander van der Linden 29:01
That's depressing. You enter into a peaceful town. The content's all fictional, and it's supposed to be a bit ludicrous: there's this fictional town and something very innocuous happens. Using that kind of narrative, we try to inoculate people against some of these techniques that are used to polarize people.
Now, if you go see an illusionist or a magic show, the first time you might be duped, and there's really two ways to fix that: one is, I give you a blueprint of how the trick works, which is kind of like a factual treatment; or I could let you step into the shoes of the illusionist for a little while so you can discover the trick on your own, and that way, you're never going to be duped by it again.
Archival (Baratunde Thurston) 30:04
Chaos is what counts. Let's create another Alter Ego account and pretend we're on the other side of this fight. I'm totally ramped up, I am invested in this: egg on the other side, as well. Oh, we're definitely using bots. Deploy the bots!
Baratunde Thurston 30:34
I love that, man. When I tell you I enjoyed it: you made me into a monster and I loved it. That's how effective the game was. I've played a lot of games, I've overseen hackathons with creative activists and comedians and stuff before, so I thought I knew what I was getting into, and by the end, I was like, "I'm gonna destroy this town. We won't even remember it existed." So, you have incentivized really devastating behavior. I was rewarded for it. You track the number of followers you get after each wave of these campaigns, and did you want to escalate or go home? Definitely want to escalate, right? So, I look back after this, I'm like, "oh, man!" Can you connect the dots from this game back to the real world and how an experience like this, whether it's this game or some of your other projects, helps me interface with and process my actual social media feed better? How are my defenses more activated against real-life misinformation and disinformation?
Sander van der Linden 31:41
Yeah, absolutely. So as you said, there's a bit of a shock value to the game, precisely because one of the core elements of inoculation theory is that people need to experience a sense of threat to motivate them to defend themselves against misinformation attacks. We need to activate your antibody production, so we need to get people into that mode. As you said, there are elections going on during the game, there's a newscaster, and you can see the approval ratings live; as you sow your chaos, they're affected, and there's this candidate you run a smear campaign against.
Baratunde Thurston 32:11
How are you going to ruin Ashley Ploog's unopposed run? Message family and friends, or create a fake news site? Alright, here's another option: Ashley Ploog's disgusting chat messages leaked. "I bleeping hate bears." This is the kind of chaos we need. Post this, and then it's like a sloth photo with the words...
Sander van der Linden 32:31
It's meant to be a bit amusing. What we do at the end of all of our interventions is we evaluate it empirically. So, at the beginning of the game--and I'm not sure if you participated...
Baratunde Thurston 32:40
I did. I did everything I was asked to do.
Sander van der Linden 32:42
...We give people some simulated social media headlines, and we ask them how reliable they think they are, how confident they are in their judgement, what they would share on social media, and things like that. The types of headlines that we give reflect what's going on on social media. So, I'll give you an example: basically, there are people protesting, saying "end Father's Day," and other people, you know, so it's this innocent issue that's getting blown up, potentially, by nefarious actors, because they want to sow divisions. So, we want to know how people become more attuned to this strategy of, in this case, polarization. Another was a tweet about a news article: a father and his son went out hunting and shot themselves or something, and somebody commented, "oh, 1.5 MAGA bullies less in the world." It's this type of deeply polarizing debate that we wanted to address. What we found is that people are better able to recognize these strategies, in the sense that they find these posts less reliable after playing the game, and they're less likely to say that they'll share this type of content on social media. Once people leave the game, we start to follow up with them week after week--and don't worry, we get ethics approval on this from the university--but we sort of attack people with misinformation week after week. Attack sounds nefarious, but we basically present people with social media posts that are misinformation and we ask the same questions. What we found is that, actually, for a psychological vaccine, it lasts pretty long. For about two months, the antibodies are still there. After two months, it helps when you boost people in between, so we found that there's a decay. Like with the Pfizer vaccine, you need a booster shot, otherwise it wears off.
There are too many distractions, what we call interference, going on in the world that make people forget and get less motivated, but you can boost people in between by re-engaging them. One of our interventions, Bad News, which is the main simulator, went viral on Reddit, and they actually crashed our servers, so they call it the Reddit Hug of Death.
Baratunde Thurston 34:42
The Reddit Hug of Death, yeah.
Sander van der Linden 34:43
We started scraping what the Redditors were talking about, for example, and it was really interesting to learn about, you know, people sharing their experiences with the game and what they'd learned. This got us thinking about herd immunity: wait, maybe people are sharing inoculation with each other on social media. Wouldn't that be cool? If people talked about what they've learned and shared it with others, then even people who didn't directly play the game could benefit from the vaccine, so to speak. That's kind of what we're working on now.
Baratunde Thurston 35:14
So, I played one of your games, and I'm a good person, Sander. I am open-minded, I vote for the right people; but there's other people out there, Sander, who are not the best, and they spread lies and deceptive information all the time. Are they playing your game too, Sander? Is the other side playing this game? Are my QAnon brethren playing your game?
Sander van der Linden 35:40
Well, it's interesting. I don't think that diehard conspiracy theorists are playing our game, but we are thinking about some ways of trying to reach a broader audience, in terms of the inoculation and getting it scaled to people who might not volunteer to come in and learn more about this stuff.
Baratunde Thurston 36:00
The epidemiological metaphor of vaccination and inoculation is so clear. It's a strength to me, but it also has some limits. We're in a real epidemiological challenge right now with COVID-19, and vaccines are a tool, but they're not the only tool, right? With any infectious disease, we don't just rely on people to inoculate themselves against the threat. We have public health agencies, we have government policies, companies institute barriers. So, yeah, big question, but simplified: short of every person on earth playing your game, what else can we do to get a handle on the misinformation-disinformation challenge?
Sander van der Linden 36:46
I think the uncomfortable spot that we're all in, especially as scientists, is that it seems unrealistic. What else can we do that they're willing to accept? For us, YouTube actually doesn't work directly with outside scientists, so it's very difficult to implement an evidence-based prebunk on their platform.
Baratunde Thurston 37:05
So, Sander, I've got to pause you there. Earlier in our conversation, you acknowledged these platforms are not in the business of accuracy or truth, right? They're in the business of engagement, they're in the business of entertaining conversation, probably user growth. But what you just said, that YouTube doesn't work directly with outside researchers, sounds to me like they don't want to know the truth. They're actively avoiding understanding the impact of their platforms on us. What's your read on that decision?
Sander van der Linden 37:40
I think there's probably some truth to that, and it probably doesn't only apply to YouTube but to most social media companies. The fact of the matter is, we can say what we want to them, and we say a lot to them, and we have meetings with them regularly, and they listen to us, and they do respect us, and they take our evidence; but, you know, they say they have their own internal evidence that they don't always want evaluated externally, so what can we say?
Baratunde Thurston 38:01
So, wait, they... Hold on, they have their own facts?
Sander van der Linden 38:05
Yeah, they have alternative facts. They have their own alternative facts.
Baratunde Thurston 38:08
Oh, this is terrible.
Sander van der Linden 38:10
It's tricky, you know, because they say, "oh, in your experiment, you're just simulating Facebook. We can actually see what's happening on Facebook." It's like, yeah, but if you don't want to share it, then we're not getting anywhere. We have very little information about what is actually going on on these platforms. The studies that we do on social media happen because we get limited access to scrape millions of posts, but it's only a snapshot, really, of what's going on. They're very hesitant to work with scientists, and it takes a long time to build up relationships. We're getting there, but it's difficult. I just want to say I agree with you. To us, it seems like a win when you can get them to implement an evidence-based solution, even if it's a minor one. I will say, the people that we work with on the research teams at these companies are really motivated, and they really want to fix the problems. I think the issue is with the high-level policy executives who shoot down the more radical solutions that we need. The problem is with the people higher up, not necessarily the research teams, who are going out, making connections with researchers, reading our papers, wanting to fund our research, wanting to implement the solutions; and then they go to their bosses, and the bosses say, "yeah, interesting, we'll think about it."
Baratunde Thurston 39:15
Well, look, congratulations for having any level of dialogue and partnership with these large organizations. So, we're going in the right direction. Whether we're going far enough fast enough, we should argue about that on social media. I'd love to know how you think about the impact you've had, whether it's in the partnership world with these companies, whether it's working with governments, or whether it's just individuals from Reddit or some other social share coming across some of these games.
Sander van der Linden 39:47
I meet with Facebook weekly. I take time out of my schedule every week to try to be...
Baratunde Thurston 39:53
That's incredible. How much furniture do you throw during these meetings?
Sander van der Linden 39:57
Well, you know, the team is really good, but the decisions that are ultimately made, it's slow. It's a very slow process, so we're making progress, but it's not going fast enough. We need more radical changes and solutions. We try to team up with the organizations that are impactful in the area that we work in, whether it's the Department of Homeland Security who can distribute this to all political parties and so on; or with our COVID-19 game called Go Viral, we got some support from the World Health Organization and the United Nations, and they have volunteers that can target this intervention at vulnerable audiences and really help scale it across millions of people, but there's billions of people in the world, not just millions. So, I think, what can people do? Here's my general philosophy for society: I think we need a firewall system to mitigate the post-truth bias that's creeping in. So, this firewall system, or a multi-layer defense system, should start with a prebunking. We should all prebunk: the WHO, the social media companies, even regular companies, and then, at the same time, we have to radically reinvent the incentive structure of social media. So, nothing big, nothing big that I'm floating here.
Sander van der Linden 41:03
That's all, that's all.
Baratunde Thurston 41:06
So, we call this show How to Citizen. We think of citizen as a verb, not a noun or legal status so much as a posture of participation in society. What's your view on what citizening means to you?
Sander van der Linden 41:24
I think being a good citizen means not only maintaining a healthy information diet for yourself, but also helping other people discern fact from fiction in their lives. That's how I see my role: it's not just about me, it's also about helping my fellow citizens not get duped.
Baratunde Thurston 41:42
This is a refreshing take on a challenge that so many of us just feel hopeless about, so thank you for another perspective on that. I'm excited that you've built something that's fun and terrifyingly effective at the same time. That's a hard trick to pull off. Thank you, Sander, for the time. I look forward to a more sane and healthy information environment for us all to inhabit.
Sander van der Linden 42:10
Thanks so much for having me on.
Baratunde Thurston 42:25
We've all been there--I know I have--just kickin' it on the internet and some jerk shows up spreading infuriatingly incorrect garbage. So, we do what any good citizen is supposed to do. We dump data, we fire off facts, we counter that misinformation with real information to prove that jerk wrong. But Sander wants us to reimagine and reframe the way we approach misinformation. We can't hit people over the head with facts. Whac-A-Mole is ineffective. Games like Harmony Square, on the other hand, teach us some of the dark arts of misdirection and illusion, and like peeking behind the curtain and seeing the great and powerful Oz for the first time, once you see him and his dirty bag of tricks, he loses some of his power. So, stay safe, stay alert, think twice before you hit that share button, but honestly, think also about who even wants you to hit the button in the first place and what they have to gain from it. As we check in with ourselves about the content we consume, next time we get a lesson on tech nutrition, because machines got to eat too.
Kasia Chmielinski 43:45
Bias in, bias out, like garbage in, garbage out. You feed this machine something, the machine is going to look exactly like what you fed it. You are what you eat.
Baratunde Thurston 43:55
By now, you know we're committed to giving you things to do beyond listening to our episodes. At our new howtocitizen.com website, we've got every episode, transcripts, links to the guests, but most importantly, we have things you can do to actually practice citizening. So, in that spirit, for this episode, here's some things you can do. Point your browser over to inoculation.science. That's right, there's a .science domain name. Get your science on, head on over to inoculation.science and play the set of games that they've built. In addition to Breaking Harmony Square, which you heard me playing and acting the damn fool as I did so, they've got games to help you limit the harm of fake news and COVID misinformation. After you've played some of the games and watched some of the videos, reflect on how they made you feel. Are there online experiences you've had that make more sense once you consider you might have been intentionally manipulated? How does that feel? I suspect it makes you mad, but it also might make you feel more empowered. Do you think these games might affect how you engage online in the future? Finally, share these games with the people you care about. So many of us have folks in our lives we don't want to waste hours and hours convincing of something that's so obviously false once you take real information into account. Look: friends don't let friends spread misinformation. It's kind of as simple as that. I don't expect you to memorize all this. Everything I've said, there's a version of it in the show notes in the podcast app you're listening on right now, and we've got all these links over at howtocitizen.com. You can also engage with us on IG, on Zuckerberg's property, we are @howtocitizen, where you can share and learn from other people who are on this journey with us, including me. That's all I got for now. Peace.
How to Citizen with Baratunde is a production of iHeartRadio Podcasts and Dustlight Productions. Our executive producers are me, Baratunde Thurston, Elizabeth Stewart and Misha Euceph. Our senior producer is Tamika Adams, our producer is Alie Kilts and our assistant producer is Sam Paulson. Stephanie Cohn is our editor, Valentino Rivera is our senior engineer and Matthew Lai is our apprentice. Special thanks to Sam Paulson for creating the chip-tune arrangement of the How to Citizen theme and the Harmony Square-inspired tunes to accompany my gameplay. This episode was produced and sound designed by Tamika Adams, with additional help from Sam Paulson. Additional production help from Arwin Nicks. Special thanks to Joelle Smith from iHeartRadio and Rachael Garcia at Dustlight Productions.
Transcribed by https://otter.ai
Let us know your thoughts about the episode. What did you learn or what surprised you or challenged you?
Share what you’ve learned. Knowledge is power! Tag #howtocitizen so we can reshare!