The Darya Rose Show
May 10, 2021

Renee DiResta on how social media obscures the truth–and what to do about it

Renée DiResta discusses how misinformation spreads online, and how social media platforms can be taken advantage of to propagate false narratives. Understanding how misinformation spreads is the first step in protecting yourself from falling victim to falsehoods.

Renée DiResta is the technical research manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social and other media networks. Her areas of research include disinformation and propaganda by state-sponsored actors, as well as health misinformation and conspiracy theories. Renée has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience, conspiracies, terrorism, and state-sponsored information warfare.

Twitter @noupside

Notice: Any purchases made through my links to Amazon will result in them sending us a few cents that will certainly not cover the cost of running this show.


I'm Dr. Darya Rose. And you're listening to The Darya Rose Show, where we bring a fact-based perspective to answer all those confounding questions that come up in our day-to-day lives from achieving optimal health, to making conscious choices about your purchases and raising kids that thrive. We are here to help you navigate your life with confidence. Hello, and welcome back to The Darya Rose Show. Today, we are continuing our discussion of how do we know what's true.

And specifically we will be discussing the role of social media, which I think we all sort of intuitively understand is a massive, gigantic, enormous part of the problem, and one of the reasons this conversation has gotten so confusing, and why you can talk to two different people who have completely different realities and beliefs about what is actually true.

My guest today is one of my favorite thought leaders on the subject. Her name is Renée DiResta. She is the technical research manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social and other media networks. Her areas of research include disinformation and propaganda by state-sponsored actors, as well as health misinformation and conspiracy theories. Renée has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience, conspiracies, terrorism, and state-sponsored information warfare.

Renée regularly writes and speaks about the role that tech platforms and curatorial algorithms, that's a fancy way of saying how they decide what to show you [laughs], play in the proliferation of disinformation and conspiracy theories. She writes for lots of publications. You can specifically see her at Wired, The Atlantic, but she's all over the place. New York Times, CNN, all that good stuff, because she's absolutely brilliant and a lot of people wanna know what she has to say, because this is a really hard problem.

And frankly, there are not a lot of good answers, and what I'm hoping you'll get from today's episode is a better understanding of how this stuff works, because there isn't some magical button or wand that can tell you whether or not you can believe an information source on the internet. But when you have an understanding of how Facebook and Twitter, and even Google, feed you information, you can become a little bit more savvy, and that gives you a little bit of a defense to question what you're looking at and whether or not you can truly believe it.

And that's really the best that we can do. So I hope that this really helps, because I think that a lot of us feel the need to have our guards up. And I think we also need to educate our communities, because I don't know about you guys, but my family [laughs] and a lot of people I know are not the best at parsing what they can and can't believe on Facebook and other platforms. So this should help elucidate some of that, and hopefully you will learn a lot. So thank you and enjoy. Hi Renée. Welcome to the show.

Renée DiResta: Hey, Darya. It's great to be here.

 Why don't we get started? I'm so excited to talk to you, by the way. I've been following your work pretty closely for probably five years now. I think I started panicking in early 2016 [laughing] and I started seeing what you were doing. So why don't you start by telling us a bit about, you know, what you do and why it's so important right now?

Renée DiResta: Yeah, so I'm at the Stanford Internet Observatory. I run research over there, and we study the abuse of information technologies. So the thing that gets the most attention right now is probably the work that we do on mis- and disinformation. But we also do work on understanding how emergent technology is transforming everything from communications to the ways that people engage with each other. So that could be end-to-end encryption and what that technology is doing as people move into end-to-end encrypted chat apps instead of Facebook. We look at things like emergent AI and how artificial intelligence-generated faces and text are changing our ideas of reality or unreality.

We look at harassment, at child abuse, unfortunately at mental health issues, and at the ways in which platforms have become these de facto places that people use for everything: their political information, their health information, their mental health information, you name it, connecting with their friends, buying their food. Technology changes how we engage with each other, and we focus specifically on the misuse of that.

 So, you're trying to figure out exactly which dystopian hell we're going to live in one day.

Renée DiResta: [laughing]. That's one way to put it.

 Yeah. All that stuff sounds really scary. Like, I've seen those deepfake videos of people talking, and that face swap with Jim Carrey switching into Jack Nicholson [laughs]. I don't know if you've seen all that, but it's, uh-

Renée DiResta: Yeah, no, I've seen it all. Yep.

 So specifically how those technologies can be manipulated to impact the rest of us.

Renée DiResta: Exactly. That's our focus. And a lot of that right now has been mis- and disinformation. And that's because as social networks have become the sort of de facto place for getting news and having political conversations, that's had a pretty profound impact, of course, on our democracy and our political speech. But also we're in a pandemic, we're all in lockdown, and so health misinformation has also become a very big focus, in part because what people see on these platforms really shapes their impression of how to stay healthy in the age of COVID-19.

 Totally. And that's why I have shifted the focus of my work to include the sort of bigger picture about truth and how we know things, 'cause it doesn't matter what you talk about if nobody's sure they can believe you. And I was starting to feel very weird about just doing my normal work, and I felt like I had to take a step back and say, let's all agree on some ground rules here and on some facts. That's why I wanted to talk to you [laughs]. So why don't you tell us the difference between misinformation and disinformation, 'cause I feel like those words get mixed up a little bit?

Renée DiResta: Yeah. Misinformation is information that's false, but it's inadvertently spread. So misinformation, when somebody is sharing it with you, they often believe it to be true themselves. And the intent is usually to help inform their friends and family and their community. So there's a desire to share something because they genuinely believe it and they believe it's important for people to know. Disinformation is information with an intent to influence and there is a deceptive component to it. So the deceptive component could be in the content of the message, uh, meaning it could be false.

Sometimes it's true, but inflected in a certain way, like propaganda, but it is put out by somebody who is not what they seem to be. So the deception is in the voice or the actor. And sometimes they're using deceptive means to disseminate it: bot accounts or coordinated posting, things that spammers, for example, use. These folks are using those kinds of tactics to spread political or health messages to the public. So the difference is really that intent, that desire to either help someone by sharing this information or to deceive someone by sharing this information. So it's the extent to which the person who is spreading it is witting.

 Got it. So can disinformation become misinformation? For instance, if some financially motivated anti-vax person creates a meme and then my aunt sends it to me on Facebook, is that disinformation that became misinformation in her hands, because she believed it and she doesn't know any better?

Renée DiResta: That's such a great question. We struggle with this as academics. When we talk about it, a lot of the time we'll still use the word disinformation because of the origin; we spend a lot of time trying to understand the origin of the narrative too. And so if the origin is from this poison tree, if you will, we still treat it as a disinformation campaign. And that's in part because the entire goal is to get real people amplifying it.

If you only have your botnets or your fake accounts or your cluster of sock puppets that you control, spreading and sharing and commenting on the stuff, then you've really not done a very effective job getting that message into the public discourse. So the entire goal is to get real people to pick up and amplify and share and spread. And that's an interesting transition point.

 Yeah. So who are these jerks?

Renée DiResta: [laughs]

 Where does this come from? What are the, the big players here?

Renée DiResta: So there's a range of different folks who can run these campaigns, if we're talking about disinformation in particular, and that's because the cost to create and spread a message is virtually zero at this point. You can create content for free on pretty much any platform; you can record video; hosting is free, put it up on YouTube if you want to, share it to Twitter, post it to Facebook. There's no cost associated with any of that. And originally the types of folks who were engaging in coordinated campaigns that were overtly manipulative were entities like terrorist organizations that were using propaganda to recruit new adherents.

So extremism was a big use case for this stuff in the early days, along with state actors like Russia and China. But now, at this point, it's Iran, Saudi Arabia, Egypt, dozens of countries (I'm trying to think of the number off the top of my head) that have been implicated in state-sponsored information operations takedowns by Facebook and Twitter. So you've got the state actors. And then now you have what we call the domestic ideologues. And that's people within one country who are running these campaigns against their own countrymen, against their own fellow citizens, to achieve a particular political objective in the domestic political space.

So if we were talking about that in the context of the US, that would be Democratic or Republican groups using manipulation. It's even difficult to say Democrat and Republican at this point; just anyone along the vast political spectrum, from far left to far right, has the capability to use these tools, to use them to influence, and to use them to manipulatively influence. And that is kind of where we are today.

 And I'm curious, in the health space, it's obvious to me why Russia and ISIS would want to create propaganda, that's how they keep power. But for the more subtle health things like Plandemic, where did Plandemic come from?

Renée DiResta: So Plandemic is an interesting case. Judy Mikovits, the scientist who had a number of her papers retracted, was the central character of that film. She had a book to sell. And so she had written this book, really taking advantage of COVID in a sense, because her whole story, as she tells it, is that Fauci mistreated her and is the reason that she lost some job and had her paper retracted, and I believe she was arrested at one point. So she took the opportunity to pile on with the Fauci haters as this came up. And some of the folks who had been "whistleblowers" for Project Veritas got involved in the process of turning her into a social media figure.

So for a very long time, she had just a couple of thousand Twitter followers, really wasn't active. She was just one of these speakers on the anti-vax conference circuit that nobody outside of the anti-vax movement knew about. But what began to happen was, through this amplification network, Robert F. Kennedy Jr., who's a very prominent anti-vaxxer, and these folks from the Project Veritas community began to really do the work of trying to boost her. And they turned it into, like, book talks; basically they started booking her for a bunch of YouTube channels.

And this is where you see the interesting frontiers between one community and another. There's the kind of diehard core anti-vaccine community, but then they overlap in certain interest areas with members of other communities. So one example would be: while not all anti-vaxxers believe in QAnon, there are a lot of QAnon people who are anti-vaccine. So there are members of both communities. And what happened with some of this content, particularly with Mikovits in the weeks leading up to Plandemic, was that anti-vaxxers were sharing the content.

And then it was hopping into the QAnon community, because people who were anti-vaxxers were sharing it into their QAnon groups. So that dynamic of people as conduits for this information, actively participating in the sharing of this stuff, is a really key driver of it, particularly when it's small, before platforms are really amplifying it. And so Plandemic was unique in that, while Mikovits had been doing these interviews for a while, she had a tendency to ramble. The interviewers were a little bit far out on the conspiratorial spectrum, a little bit beyond what mainstream audiences would be receptive to. But Plandemic was a very slickly produced video by somebody who had done work for, I think it was Bernie Sanders and Tulsi Gabbard, so he understood how to make effective, emotionally resonant political pieces.


Renée DiResta: And this guy filmed a shorter interview with her, where he really kept her on target and focused her in, on making specific claims about masks, about Fauci, and then framed her as a whistleblower. And the other thing that, that he did with this video that was very effective was, he immediately framed it as this is the video the tech platforms are gonna take down, you have to help evade their censorship.


Renée DiResta: And he knew they were gonna take it down because it violated their policies about health misinformation, because it was making wild claims, completely scientifically inaccurate, about masks. It was saying that one of the ways that we could combat COVID was for everybody to go to the beach because there were healing microbes in the sand, but it just made a lot of-


Renée DiResta: ... really just inane claims. Oh yeah, there was a lot of inane stuff. They are trying to keep you indoors, because being outside is what we need right now; the sea air and the healing microbes in the sand, they are trying to keep you from them. But this kind of thing really appeals to people who are not necessarily part of the anti-vaccine conspiratorial community, but believe that big tech is censoring people. And so it's: how do you make your message appealing enough to a number of these different, highly engaged communities?

And we call this networked activism. Nobody is saying that they don't have a right to be highly engaged; this is how, in many ways, political change and political movements grow today. But the challenge for the platforms was: what do you do about this video that went viral? Because until the video came out, there was no indication that it was absolutely, demonstrably going to violate their policies on COVID misinformation. And so even having that framing, that they are going to censor this video, led to more people sharing it, being engaged, and feeling that they were somehow fighting the machine of big tech. So they-

 Got it.

Renée DiResta: ... got people off the fence, and tech took it down after something like eight million people had viewed it, which is also not the best implementation of that policy.


Renée DiResta: You're gonna have it and you're gonna take it down after eight million people have seen it? That just turns it into forbidden knowledge and-

 And more enticing.

Renée DiResta: Exactly. It makes it more appealing.

 Okay. So in that case you have the true believers and some politically motivated people, and then another kind of weird group of true believers being opportunists around this topic of COVID to get their message to a bigger group, which is always the goal, right? To get your bad information in front of the most people.

Renée DiResta: Exactly. So it's to get real people to pick it up and share it for you. Again, the anti-vaxxers are an interesting case, because the real core of the movement is very much true believers, but there are these prominent figureheads within it who literally have, uh, snake oil to sell you [laughs]. They've got-


Renée DiResta: ... colloidal silver to cure your COVID. They've got essential oils to detox you, instead of having you get your child an MMR vaccine. So there's-

 A financial motivation.

Renée DiResta: There's financial motivation. Some of them also run media properties and want you to go visit their news sites, which, you know, have heavy ad load and things. And there are people like Larry Cook, who runs an anti-vaccine organization, and who, in addition to selling these sorts of detoxes and whatnot, wants you to go to his site because he has a massive Amazon referral operation underway.


Renée DiResta: It's the combination of true believers and grifters that is-


Renée DiResta: ... what's really driving that.

 So I'm curious, then. I have training in science, and so when I saw Plandemic, within two seconds I was like, this is ridiculous propaganda. To me, it didn't look like real news. But I'm curious: why does fake news catch on, and why do people believe it?

Renée DiResta: So I think some of it is really that it's engaging. It's sensational. It's interesting. There's a lot of information swirling around us at all times. And the way that social engagement works, creating content that makes you wanna click on it, or dunks on something, or insinuates a conspiracy where there is none, is compelling. It makes people curious. And so they click on it and they engage with it. And right now we have curation algorithms that social media companies use to curate our feed, because there is so much information out there that we're not gonna have time to read or see it all.

And whatever number of people you're following, many of them are producing a ton of content. And so this is how Facebook came to rank the feed anyway. If you remember, like the olden days, I, I think we're about the same age, right? The olden days, it was-


Renée DiResta: ... reverse chronological and you could get to the end.


Renée DiResta: Reverse chronological is no longer the default, and you never get to the end. There's always something else to show you. And so that dynamic is, in some ways, what really influences what we see. It's not that the platforms are choosing to show you something because they have a particular viewpoint that they want you to see, which is a thing that a lot of the people who believe strongly in the anti-conservative bias claims think; there's no evidence for that.

They claim that Facebook is trying to surface content that it wants you to see. That's not it at all. It's trying to surface things that it thinks you are likely to engage with.

 Right. Of course.

Renée DiResta: And the best predictor for what you are gonna engage with is some combination of things you've already engaged with, things you like, things that it has intuited through your past actions that you're gonna be receptive to, but also things that people who are like you engage with. And this is where, when people start to talk about echo chambers and being nudged, Facebook having a particular perception of your political beliefs and only showing you things that reinforce them, that's because of this incentive.

They wanna keep you on site. They have to curate your feed for you 'cause there's so much information. And so the stuff that is really engaged with becomes a self-fulfilling feedback loop: a small amount of engagement gets more engagement, which gets more engagement, and that is really influencing which small percentage of the available content you then see.

 Yeah. So I remember very distinctly in 2015, 2016, when that political stuff was going on, I remember specifically seeing a lot of people in my Facebook feed sharing websites that kind of looked like ABC News, or kind of looked like CBS, or something that you would be like, oh yeah, that's a normal news site. But then if you looked harder at the URL, it would be a few letters off. Or it would literally-

Renée DiResta: Yeah.

 ... be trying to look like real news, but be fake news. I feel like that's where fake news, like the word came from, when I started [crosstalk 00:18:51]-

Renée DiResta: Yeah, yeah, yeah. It was demonstrably false stories or, you know, kind of a manipulative site that was trying to pretend to be a real news organization. That was an interesting dynamic. A lot of that was tied to economic motivation. So the early days of fake news have really gotten kind of retconned into being, like, a Russia thing. Fake news was actually a very separate piece of the information manipulation online.

There were these Macedonian teenagers who realized that they could write sensational news about the US, and they would just put up these blogs, writing these things, and target them either through ads, or a Facebook page, or posting them into groups to get them in front of the right people. And then the people would share the articles. And again, this is where that jump happens from inauthentic originators to real people picking it up and sharing it, because they believe it or they feel that their community needs to know about it.

And the motivation for these kids was money, right? It was to get people to go visit their site, and then they would earn money on the ads on the site. And that dynamic gradually got lumped into something else, which was what Russia was doing. Russia was making fake news websites and pretending to be groups of people. The stories on them weren't necessarily false; they were actually just politically inflected, inflammatory propaganda. So there was-


Renée DiResta: ... a distinction between fake news, which was stories like "Pope endorses Donald Trump" or "Megyn Kelly fired from Fox News". These stories would go viral and trend on Facebook in part because Facebook had been criticized over its human editors: somebody made the claim that the human editors were biased against conservatives, and that led to a firestorm. And in the course of that firestorm, Facebook made the decision to eliminate the human curators from overseeing trending topics. Now, what came to unfold was that most of the human curators were trying to stamp out the false news from these garbage domains.

And these garbage domains were targeting conservatives. And in a sense, if you were pulling out the garbage content, that could be misread as anti-conservative bias. In reality, it was curating away these spammy fake sites that were deliberately manipulative. But when Facebook removed the human curators from the process, those sites just trended, because again, they had solved this problem of network distribution: writing a sensational headline and getting ordinary people to share it. And so in some ways, by removing the people and going to purely algorithmic filtering, purely algorithmic curation, the problem got even worse.

 Hmm. So in that case, I feel like the line gets really blurry, and it gets really hard for people, especially people who aren't trained in science or journalism, to detect whether what they're looking at is correct or not. And I'm curious how your team makes those distinctions, and how you guys make calls like that. Is it ever hard?

Renée DiResta: So we don't do fact-checking; we're not a fact-checking org. I'll give a specific example. During the 2020 election, we worked on understanding voting-related misinformation, so very narrowly scoped. And that meant people who were saying things about the procedure and process of voting, people who were trying to suppress the vote with claims like, you vote here on this other day that's not Tuesday, that kind of thing. Within the context of trying to understand those dynamics, we would see things that appeared to be false or misleading content.

And what we tried to do was not fact-check them; we tried to flag them. So when things were appearing to get significantly larger, when a claim was beginning to snowball, there was an opportunity to pass that off to a fact-checking partner, or to a government partner if it was geographically specific. So for example, Sharpie markers in Maricopa County, Arizona, which became the story called Sharpiegate. The people best equipped to respond to that were the local and state officials in Arizona, and then the Arizona local media that occupied a position of trust within the community.

Those were the folks who needed to understand what was happening, where this narrative had come from, and where it was going. And then they, in turn, made the attempts to counter the misleading claims, which were that poll workers had handed out Sharpie markers to Trump voters so that the marking of the ballot would bleed through and render their vote useless. The election officials countered that claim, and the local news entities carried that countering, did their own fact-check, shared it to Facebook, and tried to get the truth of the matter out to the community through sources that they would be inclined to trust.

So, the Stanford Internet Observatory saying that your Sharpie marker worked fine on your ballot? First of all, we don't know anything about ballots, and we don't know anything about the particular machines in use there. And we also don't know the people of Arizona and their concerns, and so we are not the best equipped to counter that narrative. What we could do was understand that it was emerging and help other people who could counter it see it and understand it.

 So I always put myself in the position of a normal person [laughs], right? So a normal person maybe wants to stay on Facebook, 'cause that's where they get to see their nieces and nephews and pictures and birthdays and weddings and stuff. But at this point, a lot of people are aware that there's a lot of misinformation online. Is it just up to us to decide what we believe or not? I'm just curious: what does a normal person do who wants to know the truth but isn't necessarily super well trained on a specific subject? How do they know what to believe?

Renée DiResta: Yeah. This is the question, I think, that faces us today, particularly as people are really concerned about overreach in content moderation and that becoming a form of censorship, while at the same time there are certain manipulative practices that people who are financially or ideologically motivated can use to ensure that people see and receive bad information. And so I'll use the example of COVID for now. There are certain keywords where, if you type them into a search engine, there is just such a voluminous amount of information from highly reputable sources, with tons of interlinking from other reputable sources, which is how Google works. It's going to try to surface the most authoritative information to answer your question.


Renée DiResta: It also recognizes that there are certain areas that are battlegrounds, for example financial tools or products, where they wanna make sure that if you have a question about your finances, what pops up isn't some website that some spammer has used SEO manipulation to get to the top, which would lead you to make a decision that was harmful to you. And so Google calls this "Your Money or Your Life"; that's the name of their kind of search quality thinking-


Renée DiResta: ... which is to say that for health- and financial-related topics, they should have a higher standard of care in ensuring that they're returning reputable things. What happens, though, in some cases, is there are keywords and terms that just haven't really been searched for before. I wrote an article about this a couple of years back in the context of the vitamin K shot. Now, the vitamin K shot is not a vaccine; it's an injection of a vitamin.

And the intent is, because newborns are born with low levels of vitamin K, which helps clotting, receiving the shot helps prevent brain bleeds. It helps prevent hemorrhages by introducing this clotting factor and the ability to stop these hemorrhages from occurring. And so it's a relatively routine intervention, we all thought-


Renée DiResta: ... What began to happen, though, was that as the anti-vaccine movement gained traction and was producing tons and tons of content, some of the people in it started writing these articles about how there were preservatives in the vitamin K shot that were going to harm your child, and it was an unnecessary medical intervention, and it was really a vaccine, and you didn't actually know what was in it, and all of this conspiratorial-


Renée DiResta: ... and misleading nonsense. And so for a while on Google, the top search result for "vitamin K shot" was this random blog called The Healthy Home Economist, which was just spouting off absolute garbage about vitamin K shots that was not in any way best medical practice. And the reason I wrote about the vitamin K shot in particular was because there were also people in these parenting groups on Facebook who were sharing this as, like, well, I searched on Google, and here's what I found.

Or they were saying, no, I read that the medical establishment is just trying to get people used to the idea of vaccines through this vitamin K shot; you don't really need it. And then there were two kids who actually died because of bad medical advice that the parents received in these Facebook groups. So this stuff has impact. The problem is, all of the platforms have search features, and not very many of them are of the caliber of Google Search.

And not very many of them were thinking about prioritizing authoritative sources. And the problem was that if you searched for this stuff, you would see the content that existed returned to you, which was the content produced by the anti-vaccine movement. And using that example, you can imagine, in the era of COVID, where there's a new term every day, some new drug, some new vaccine brand, even just the term COVID-19 for a while, there are these sort of frontiers where, as people are searching, trying to get information, what's being returned is whatever the search engine or platform has to return to you.


Renée DiResta: So people who are producing tons and tons of content, trying to win that pride of place in search results, are not necessarily the most reputable voices that you want to be getting your medical advice from. This is one of the real challenges when we think about, and this is a very long way to answer your question, but when we get to the point about who is responsible and how people decide, sometimes the information at our disposal is not necessarily high quality. And this is where platforms have tried to figure out what to amplify.

Recognizing that it's not always going to be content from institutions, because, per the vitamin K example, the CDC wasn't out there writing up tons of stories about vitamin K; it just didn't even occur to them that it was a thing that they would need to do.


Renée DiResta: Similarly, for COVID they took a long time to update some guidance. So if you were searching for content about masks, you weren't gonna find it from the institutional authorities. And so the platforms now have to try to decide what does it mean to be authoritative? And how do we surface authoritative information, reputable information, because people are using our platforms to get information. And that is the real challenge that faces us today. Platform curation can have a life or death impact on people, and yet that is a phenomenal amount of power to have. So how should we think about what we want that system to look like?

 So it really is right now just the platforms curating and us making our best judgment call when we look at a specific website.

Renée DiResta: Exactly. And so there's a lot of questions about what should curation look like going forward. I remember maybe four years ago now advocating that we could surface reputable domains because I was like, look, when you think about your email inbox, garbage doesn't show up in your inbox, it shows up in your spam or junk folder. Now you have the freedom to go into your spam and junk folder and look through it and see if something was unfairly stuck in there.

But for the most part, there are collaborative ways in which email providers are thinking through what is a garbage domain? What is a spam domain? What is a malicious domain and filtering that out for you. And we don't see that as a form of censorship, right? We're not saying, gosh, those poor [laughs]-

 Right. [crosstalk 00:30:28].

Renée DiResta: The First Amendment rights of these poor spammers, how could they not make it into my inbox? Somebody think about the free speech of the spammers. There's, like, a different mindset.


Renée DiResta: But if you were to say the same thing and say maybe garbage domains could be deprecated, then right now that turns into a whole circling-the-drain conversation about who is to decide what a garbage domain is. And so there's, like, a paralysis that goes along with that, because somebody is gonna decide, and right now it's gonna be the platform. And there's just a lot of really fraught debates where the very notion that some domains are lower quality or have worse information is treated as a highly incendiary political claim.


Renée DiResta: And I think that's troubling.

 Yeah. What's also troubling to me, as you were talking, I was thinking about how a shocking number of people that I consider educated and smart have come to a point where they don't trust things that should be trustworthy, because a big message of the misinformation is that you can't trust the authorities, right? You can't trust the government.

You can't trust the CDC, you can't trust scientists, you can't trust institutions. And that's really scary, because then they listen to these wackos. And even if they have access to the correct information, they second-guess it, or maybe don't heed it because of some misinformation they already saw.

Renée DiResta: And that's bigger than social media. That's part of the larger problem of people losing trust in institutions and media. And that's a multi-year process, multi-decade, I think, at this point.


Renée DiResta: One of the challenges is how we think about rebuilding that trust. And I think that also is a bigger challenge than social media, but the ways in which media and institutions communicate has to change and using COVID as the example here, let's just go to the masks debacle, right? So CDC had a particular stance on masks that dated back to 2012 and SARS, I believe it was. And until they had sufficient evidence to confirm the means of spread of COVID and whether masks might be efficacious or not, they didn't move on changing that guidance.

And yet what was happening was that people on social media were chatting with each other, many very smart people who weren't necessarily epidemiologists or public health experts, but who were looking at data that was coming out and looking at mask behaviors in other countries that were dealing with the COVID pandemic, which seemed to be having lower rates of spread, and they all seemed to be using masks in their day-to-day life. And so there were just a lot of ways in which people who weren't necessarily institutional experts turned out to be right about masks.

And so when the CDC finally did update its guidance, it really looked like it was leading from behind and had dropped the ball. They could have communicated much more transparently throughout that phase, saying things like, this is why our guidance is what it is, this is the evidence that we have, and based on that evidence, here's what we think the likelihood is that masks will work or not work, and here's the threshold of confidence we have to get to before we change our guidance on this. But there's an interesting phenomenon where people expect in the age of social media that every time you open your app, your newsfeed, whatever, Facebook, Twitter, new and accurate information is gonna be up at the top of it.


Renée DiResta: And that's not the speed with which some of the scientific research is going to happen. So this was the same thing with hydroxychloroquine, or you name it: any random narrative that emerged related to what was useful for curing COVID or how COVID spread was confronted with this phenomenon, by which somebody who wanted to make a sensational claim on social media could do so. But the institutions that are tasked with reputable communication were operating on a different timescale. So rather than saying, this is our best guess right now, they chose to say nothing until they had that higher degree of confidence.

And then what fills that void is all the content put out by the people who do not feel similarly constrained and are just pushing out and saying whatever it is they want to say, and some of them are right, and some of them are wrong, but unfortunately this then falls to social media platforms to figure out how to curate it and to members of the public to decide what to trust. And what they see over and over again is a news article that went out with a headline that sounded very sure was then proven false two weeks later.

So then they distrust the medical establishment and the media establishment, because the ways in which these headlines were written made them believe one thing that then turned out to be wrong. So why should they believe the next thing they say? It becomes like a boy-who-cried-wolf dynamic.

 Yeah. It's so hard. I remember that mask debacle. I remember thinking the main argument they were using, I felt, was we don't have enough masks and the people who are the frontline workers need them, so you shouldn't have them. And I was still stocking up on N95 masks [laughing], because I know that diseases can spread that way. But yeah, everybody else was just like, masks, who needs them [laughs]. And it was over.

Renée DiResta: Yeah. That was a real challenge. I think some of it was them not knowing how it spread. Then there was this concern that there was going to be a massive run on masks, and that if they told people before they were really sure whether masks were gonna do anything, and then they found out that they needed the masks, then people would've been hoarding them. If you remember, people were hoarding toilet paper, like, there [laughs], there were a lot of interesting dynamics.

Remember the articles about the supply chain and the empty grocery store shelves in the early days of the pandemic, as there were lots of runs on things. We had N95s from, like, the fires actually-

 Oh, right.

Renée DiResta: ... yeah, we were in California. And interestingly, come to find out, okay, so I have N95s from the fires, turns out we had the vented kind, right? Which don't work for [laughs] epidemics.


Renée DiResta: Um, they filter what you breathe in, but let you breathe out just fine. So this was, like, a whole interesting education that a lot of us got very quickly. Even knowing that, I think, was a thing; it took several weeks before people began to realize that some N95s were vented.

 Okay. I feel bad, 'cause I was trying to help people make better decisions on what to trust, but it sounds like there's really not a good answer right now. We're really still in a period where it's up to our curated algorithms, which are different for each of us, to show us things, and we have to decide on our own what is and isn't credible. And somebody who's been trained in this stuff like me, or scientists, journalists, people who've been educated on how to discover the truth, probably have a better shot at it, but nobody believes us anymore [laughs].

And it gets really tricky. And I especially worry because people do turn to me. You know, I have a community, and especially my own family, but our parents are not good at this [laughs].

Renée DiResta: Right.

 They believe everything wrong. I still get autism-vaccine stuff from family members that I've told a million times: it's nothing, those papers were retracted, it's not true. And they still see a little video and they send it to me like, oh, I still worry about this. And that's one of the reasons I'm doing this series of shows: I'm trying to educate people on why we can trust scientists and why we can trust some journalists, good journalists. But beyond that, it's just really murky waters right now.

Renée DiResta: Yeah. I think that this gets at the media literacy challenge. Media literacy is currently a very big topic of conversation. My son is seven and he's in first grade, virtual first grade. And he came to me with an assignment where he had to watch a video, and then they were being asked to identify what was fact and what was opinion. And then, after the fact-versus-opinion lesson, they were getting this lesson that was very interesting, which was a video of Justin Trudeau, and the claim in the video was that he had a hippo that lived in his house, a house hippo as a pet.


Renée DiResta: And so they were asking the kids to think about, like, well, I saw this on YouTube, it must be true, and trying to get them thinking more critically about what they see on these channels, recognizing that they are gonna get a lot of their news and content from platforms that have yet to be created perhaps, but YouTube and TikTok also. And it was good to see that they were already beginning to get that to the younger generations. But at the same time, there's not as much happening for older adults who are out of school.


Renée DiResta: And so I think the question is what are the kinds of content and outreach, and who helps folks understand how the internet works today, how social networks work. One of the things that I've been consistently struck by, even beyond the question of the news and the information and the factuality, or whether it's true or lies, is actually that they don't have as good an understanding of the fact that it's curated. And so many of them really do sincerely-


Renée DiResta: ... believe that they're being censored because their friends aren't seeing all of their posts. So I've had conversations with folks about this, where I've asked on Twitter, why do you feel that you're being censored? What is it about this narrative that resonates with you? And they'll say things like, my friends don't see all my posts. And so that idea that-


Renée DiResta: ... social media platforms are curating what their friends see, that there are so many different posts to choose from, and that not everything that you put out is going to be seen by every person who follows you. I think even that phenomenon, which is something that those of us who have been on social media since it was created intuitively understand, is something that people who come to it later don't necessarily get.

 And that nobody actually cares about what Aunt Kathy says.

Renée DiResta: Right.

 Like, I know, Facebook's not censoring you [laughs], it's just that they don't care [crosstalk 00:39:32].

Renée DiResta: Yeah. That piece is also very interesting, right? The interesting phenomenon by which people think that somebody at Twitter is sitting there and cares so much about their hot take on a particular story that they are being [crosstalk 00:39:42].


Renée DiResta: It is a pretty interesting dynamic.

 Gosh, I guess I just have to keep talking loudly and saying the same thing over and over again. It's tough because, as we talked about, what you butt up against all the time is censorship, free speech, First Amendment stuff. And on the other hand, you're relying on big corporations that have a financial incentive to keep you on their platform, and we all know what kind of material keeps us on their platform. It's hard to even know where a solution is gonna come from. Nobody really wants it to be the government, and nobody can really trust the alternative.

Renée DiResta: Well, that's-

 Tricky. Yeah.

Renée DiResta: That's the question that faces us now, right? There are gonna be things that governments can't regulate in what content goes or stays on a social platform. I would hope we all agree that is not a thing for a government to decide, you know, barring the obviously illegal and the outright harmful; otherwise you don't want the government coming in and telling Facebook and Twitter how to moderate their platforms. But the idea that they're going to be moderated solely according to whatever speech is tolerated under the First Amendment is naive.

And that is because there are very different experiences of speech online than offline. Harassment, for example: the things that people are saying in a harassment brigade may all be protected First Amendment speech, and if they were happening in the real world, there'd be a particular experience of them. But the dynamic when they happen on social platforms is that it really comes to feel like a mob or a brigade targeting a person, with the goal of pushing that person out of the conversation.

And so what platforms are thinking about here is, is somebody else's speech silencing the speech of somebody who now feels that they can't participate in the space? There are these trade-offs.


Renée DiResta: If you believe that this is a virtual gathering place, is somebody's freedom of speech stifling a freedom of assembly? There are a lot of different ways in which these concerns manifest, and so that's why platforms have terms of service, where they lay down the groundwork and they say, these are the norms that we are setting for our community, and this is what you're expected to adhere to if you're a participant in this community space. But as these platforms have become extremely important and world leaders use them, then we get questions of, if Facebook or Twitter takes down Donald Trump, what does that mean for political leaders worldwide?

What does it mean to silence on the platform somebody who the people may need to hear from? So there are a lot of these questions coming up, and I think we're gonna see entities like the Facebook Oversight Board and civil society and others really begin to weigh in a lot more heavily here on-


Renée DiResta: ... how to think about what those norms and rules should be given that the platforms have a phenomenal amount of power in these spaces where massive audiences convene.

 Wow. So much to think about and so much to worry about. Actually, I haven't had a good night's sleep in a year and a half. Cool. But thank you so much for your wisdom. And I hope, if nothing else, this sheds some light for people on how this stuff goes around, and that understanding at least how it works can make you a little bit more savvy in combating its influence on us as individuals.

Renée DiResta: Yep. I agree.

 All right. Thanks Renée.

Renée DiResta: Thank you so much. It was great to chat with you.

 I hope you enjoyed today's episode with Renée. If you'd like to learn more about her work or anything that she's doing, you can go to her website, which is just, or you can follow her on Twitter @noUpside. If you're enjoying the show, it would really mean a lot to me if you wouldn't mind going over to your favorite platform and subscribing. And if you're feeling extra generous, or you really love our content, I would love it if you could give us a five-star rating. That would be really helpful, help us get more great guests like Renée, and keep you in touch with everything I'm doing over here at The Darya Rose Show. Thank you and have a great day.