Episode 234

Election Disinformation

March 1, 2022


Lisa: Hi, I'm Lisa Hernandez.

Lizzy: And I'm Lizzy Ghedi-Ehrlich.

Lisa: And we are your hosts for Scholars Strategy Network’s No Jargon. Each month, we will discuss an American policy problem with one of the nation's top researchers without jargon. And this month we have decided to focus on the impact of disinformation on U.S. elections.

Lizzy: Yeah, and this is a real, a long time SSN member and someone who is pretty well known, um, since he's in the media. So.

Lisa: Yeah, Rick Hasen is popular, and for good reasons. I mean, I learned a lot from reading his book, which is coming out on March 8th, called Cheap Speech, which we talk about a lot. It goes deep into, um, disinformation, which is what the book's about. So I was excited to get to talk to him and learn a lot more about a topic that I felt overly confident that I knew, like how to decipher this information myself, um, which I realized I'm not that adept at, considering I'm deep into cheap speech. That is most of what I consume for most of my day. Right? I'm just, like, doom scrolling, um, yeah, and have been for years and years. So it was interesting to get to relearn a little bit about the issues that we come across when we're just scrolling through the internet and taking everything as a matter of fact. Like, there's a ton of people just sharing their opinions literally at all times. Um, so it was kind of startling and a little bit scary, but he offered some good solutions that maybe people can listen to later on, on what can be done to try to limit the amount of disinformation that we're just constantly inhaling on our phones and computers.

Lizzy: And if you're listening to this week's episode, you might've just come from doom scrolling or otherwise engaging with some cheap speech. So maybe this'll give you a little bit better understanding of what that is. And then of course, you know, there's a lot going on in the world, right now, and a lot of disinformation available about it.

But it's coming up on election season for a lot of states and municipalities too. So this is something for us to think about a little bit for the future.

Lisa: Well, that is, um, tough preparation, but.

Lizzy: I know, sorry to startle you, but it's true.

Lisa: Oh, goodness. Uh, lots of attention too. But hopefully folks can learn a little bit about what can be done, what solutions are in policy actors' and courts' hands, and even just, like, our hands as individual consumers. So let's get into it. For this week's episode, I spoke to Rick Hasen, professor of law and political science at the University of California, Irvine.

Professor Hasen is a nationally recognized expert in election law and campaign finance regulation. He is the coauthor of a leading casebook on election law and the author of a new book titled Cheap Speech, coming out on March 8th. Here's our conversation.

Lisa: Professor Hasen, thanks for coming on No Jargon.

Rick: Please call me Rick. Great to be with you.

Lisa: All right, Rick. Um, so I want to talk about your new book, Cheap Speech: How Disinformation Poisons Our Politics, which comes out a week after this episode airs, on March 8th. And I wanted to ask, was there a particular event or series of events that inspired you to write it? And can you briefly share something about that?

Rick: Well, it's interesting you say that because my book project actually started as a law review article back in 2017 and I was writing it mostly during the 2020 elections. And one of the things that I was warning about in the book is the potential that speech, false speech about elections could actually cause election-related violence.

And as I was going through the edits on my book, we had the January 6th insurrection, where those who were trying to disrupt the counting of the electoral college votes in the 2020 election, where Joe Biden was going to be certified as the winner, you know, people came in. And I would say that it was cheap speech that was one of the reasons why we saw this happening.

So I had to completely rewrite my introduction and, rather than talk about what might happen in the future, look back at the lessons that we could learn from our recent past.

Lisa: That is quite a disrupting event to have during the editing process. So you mentioned cheap speech. Could you describe what cheap speech really means and the way that you define it in the book?

Rick: Yeah, that seems like an appropriate thing to do on the No Jargon podcast. So, back in 1995, a law professor, a very eminent First Amendment scholar named Eugene Volokh, wrote an article in the Yale Law Journal where he predicted the upcoming era of cheap speech. And what he meant by cheap speech was that it was going to be much cheaper for people to produce and share information.

It was going to make intermediaries like newspapers much less important. You know, if you were trying to get an idea out there in the past, say you had a great idea about some public policy issue, about immigration or abortion or whatever, you'd write up, you know, an op-ed and you'd send it to your local newspaper, and maybe you'd even try and send it to the New York Times.

And if those few places said no, you really didn't have an outlet for your speech. Now, today, anyone can post their thoughts on Facebook, on Twitter. They can make a TikTok video. And Volokh saw this coming, and he was very optimistic about the future of cheap speech. And he, you know, he asked the question: how well will our society do with the loss of these intermediaries?

His answer was, I think we will do quite well, but others may differ. And I would say, fast forward twenty-something years, and the picture is murkier. Certainly there are great advantages to the fact that everyone can get their ideas out there, but cheap speech has meant not just that it's easier to spread speech, you know, everyone has a megaphone today, uh, but it has also created lower-value speech. So it's really expensive to produce quality journalism, right? You have to hire journalists, and, you know, you have to have an apparatus of editors and fact checkers and all the things that journalists do. That's really expensive. But to produce misinformation or disinformation, false statements, you can do that really cheaply.

And some of that bad information is driving out good information. The fact that there's now, you know, what started as Craigslist and now other ways of selling things online meant that the advertising model for newspapers collapsed, and so local journalism in many ways is in a terrible position.

And so it's harder for voters to get more reliable information, easier for them to get false information. And so cheap speech is about what's happened in our society where not only is it easy to spread information, but the kind of economics of the situation has made it easier to spread disinformation.

And my focus is particularly on what that's meant for elections and American democracy.

Lisa: I want to talk a little bit about how it seems like disinformation is a part of cheap speech. Could you also define, just for listeners, what disinformation is, or what constitutes disinformation?

Rick: So in the book, I follow a convention. I mean, there's not, you know, a hundred percent agreement on these terms. Let me start with misinformation. What I mean by misinformation is false information, empirically false information, that is spread. It could be spread in any way.

You know, I'm focused a lot on false information spread either online through social media or on cable news or something like that. And disinformation is false information that is deliberately spread by someone, either for a political purpose or, um, sometimes for a financial purpose.

Uh, there's a lot of money to be made, for example, by those who spread the false claim that the 2020 election was stolen. It is actually something quite profitable for some people to do. You can fundraise off of it if you're a politician. You can sell advertising on your podcast if you are spreading this kind of misinformation, or make videos, and, you know, you can monetize this kind of misinformation. So disinformation is deliberately spreading false information.

Lisa: So this may be just a chicken-or-the-egg kind of question, but you've shared that disinformation is deeply linked to the intense polarization of our political climate in the United States. And I'm wondering, is it our increasing divisiveness that has contributed to this explosion of disinformation, or is it that all the disinformation on the internet has kind of worsened our divisiveness?

Rick: Yeah, that's a great question. And I do think that this is a feedback loop that is, polarization existed in the United States before social media became so prominent, but social media makes it easier, not only for us to get into our silos and to share information with like-minded individuals, but also to organize for political action.

The groups on Facebook that organized around the so-called Stop the Steal movement are one example. And also to fight on social media. So one of the things, uh, I've heard is, you know, you go to Facebook to make friends, you go to Twitter to fight your enemies. And there are things that people are willing to say on social media, and ways that they're willing to act, that they never would off of it.

The kind of bullying, the kind of hate speech, sometimes the misogyny, the racism, you know, the really nasty stuff that we see on social media. I think it brings out the worst in people, and I can't help but think that it's contributing to our polarization by really convincing everyone that the other side is not only wrong but, uh, inhumane and, you know, not someone worthy of trust. And so that really makes things much worse.

Lisa: Absolutely. Not someone worth trusting, and not someone worth engaging with at all. So you kind of keep getting siloed more and more into the depths of your own little pocket of the internet. Um, you are an election expert, of course, and you talked about how you were looking into the 2020 election. How has disinformation impacted, um, our behavior around voting and democracy? I want you to give some examples of how disinformation has impacted our democracy in the past couple of years. Why do you think it is thriving right now? I mean, it's easy to share, but who is sharing this disinformation as well?

Rick: Right. So, uh, let me delve a little deeper into the example I gave a few minutes ago about the January 6th insurrection. If you imagine that we had the same polarized politics that we have today but we had the technology of the 1950s, I think it's very difficult to imagine that we would have had a storming of the Capitol, because really what it took was Donald Trump and some of his allies relentlessly spreading false information that the 2020 election was stolen.

By one count in the New York Times, Donald Trump, over 400 times between Election Day and three weeks after Election Day, spread on Twitter the false claim that the 2020 election was stolen. We know that people believed this. We can see their posts on Facebook and elsewhere. We know they used Facebook to organize groups, you know, the so-called Stop the Steal movement, to try to oppose the certification of Joe Biden as president, even though he had gotten enough votes in the electoral college, and ultimately to organize and go to Washington, uh, after Trump deliberately said on social media to come to the Capitol on January 6th, that it "will be wild." He called for wild protests, and then he got exactly what he asked for.

Lisa: Yeah. I mean, it's hard to imagine a world where people weren't aware of the warning signs that there was going to be this big disruption as a result of this election. But I want to talk a little bit about the era of digital disinformation, how it started, how it has grown. And do you think we have reached its peak right now?

Um, what do you see coming in the next coming years?

Rick: So I'm not a technologist, I'm really an election law scholar, so I can't predict for you what's going to be the next thing. But look, for example, at TikTok. It wasn't a factor in the 2016 election, right? It didn't exist. It was barely a factor in 2020, and it's going to be a larger factor in the future.

The ability to spread these videos, you can imagine people spreading lots of videos about false things happening at polling places. You know, we saw some of this in 2020. I certainly don't think that we are at the bottom. I think things can get far worse. Uh, one of the things I talk about in the book are what are called deepfakes, right?

So deepfakes are digitally altered audio or video. You can imagine a video showing a presidential candidate, you know, having a heart attack, you know, collapsing, or, uh, you know, involved in some kind of sexual scandal. You can imagine all kinds of videos that would be believable enough that some people would say, well, you know, did that really happen?

And what I think is that when these videos become prevalent enough, it's not as though people are going to believe everything they see and hear. It's actually going to be quite the opposite. They're going to not believe even accurate things. And so it's going to be harder for voters to tell what's true; they're going to discount everything, and that's going to make it harder to get voters to accept reliable or trustworthy information. And so we have to think about ways to signal trustworthiness. And in the book, I propose a number of things, including, potentially, a requirement that social media companies have to label altered videos as altered. Whether they're altered for nefarious political reasons or they're altered for fun, you know, they would just have a mark showing that they're altered.

So there are certain things that we can do consistent with the First Amendment. And one of the themes of my book is that there's a lot we can't do, because of the freedom of speech protected by the First Amendment, to try to deal with the spread of disinformation. We don't know exactly what form it's going to take.

But, uh, it seems like spreading false audio and video is the next frontier, and that's already happening.

Lisa: Um, and so you mentioned the First Amendment being an issue, um, when crafting policies to combat disinformation. What are some things that courts can do to combat disinformation campaigns, other than the limiting of audio and video?

Rick: Well, let's be clear: I don't propose limiting it. I propose labeling it. There are some who have proposed limiting it, right, which I think raises some serious problems. So this is a good illustration of the issue. On the one hand, we want to have voters get accurate information.

We want them to not lose confidence in the election system because of false claims that the election was stolen. On the other hand, the First Amendment, and the theory behind the First Amendment, is that we want to protect political speech. We want to have robust debates. We don't want, in campaigns, to have the government decide what's truthful or not.

Right. So you could say it's a crime to lie about anything related to the election. You know, that would give too much discretion to some government bureaucrat. Imagine this bureaucrat is appointed by whichever president you dislike the most, and then ask the question, you know, uh, what do we do about this disinformation? Rather than have that kind of authority,

I think we can have narrower laws. And so, for example, one of the things I propose is that there can be laws against lying about when, where, and how people vote. So if you tell people they can vote by text, which is not a way that we allow voting in the United States in our elections, that could be made a crime.

I have a long discussion in the book about why that would be okay under the First Amendment. I think the courts would allow that. But something that goes broader, even something that would make it a crime to lie about the last election being stolen, to take one of the big problems we had with 2020, I think that is more constitutionally worrisome and problematic.

And that may not work out as something that would be upheld by the Supreme Court. And as I go through a number of different proposals for dealing with not just the problem of disinformation but other problems that occur because we are in this era of cheap speech, I come up, again and again, against the Supreme Court's understanding of the First Amendment as a potential impediment to some of the things that we might want to do to deal with this disinformation,

even recognizing that we would want to protect free speech. And so I think law is going to be only part of the answer to how to save our democracy from the spread of disinformation and from the other, uh, pathologies that occur because of cheap speech.

Lisa: Before we go on to the other policy solutions, or non-law solutions, that there are for protecting both our speech and our democracy, I want to see if you can give us a few examples of situations where putting a stop to disinformation can start to cross the line into unjust or unconstitutional censorship.

Rick: So right now there's talk about a law that would require social media companies to carry all political speech, or a kind of revival of the fairness doctrine. The fairness doctrine existed back in the 1970s and required broadcasters to provide equal time or to provide a right to reply.

Uh, and that applied only to broadcast journalism, to try and kind of give some kind of evenhandedness. So while I think that these kinds of laws, uh, at least some of them, are well-intentioned, I think they do run the risk of, uh, forcing private individuals or private companies to carry speech that they might not want to carry and be associated with.

So take, for example, the fact that after January 6th, when Trump was, uh, encouraging the activities that led to the insurrection, both Twitter and Facebook decided to deplatform Trump, that is, he can no longer post on these platforms. And, uh, there's a Florida law that would say that's illegal, that private companies have to carry Trump even if he's calling for violence.

And so that is really worrisome, because, you know, that would be a kind of telling people what kind of speech they must make, even when that speech is, uh, dangerous. And so I think that, uh, you know, those kinds of laws should be seen as running afoul of the First Amendment. It was kind of surprising to me, but, uh, Justice Clarence Thomas is one of the most conservative justices on the court.

He's one of the justices who believes that you can't, for example, limit spending in elections, or even require disclosure of those who spend money on elections. You know, a very deregulatory, libertarian type of position on these questions. And yet Justice Thomas has taken the view that states probably can force a company like Facebook or Twitter to carry Donald Trump's speech,

even if they find his speech something that they disagree with and don't want to spread, because they think it might spread hate or spread disinformation. So, you know, the current state of the First Amendment is very much in flux. You know, over the last, say, 30, 40 years, it was the conservatives who were lining up on the more speech, less regulation side. Uh, here we see, with the example of Justice Thomas, that it is kind of flipping. And also Justice Thomas and Justice Gorsuch, another conservative justice on the court, are both suggesting that we should change the libel laws and make it easier to sue newspapers for libel. I think that would make it much harder for newspapers and other bona fide journalistic outfits to engage in legitimate critique and analysis of what politicians are doing. So more regulation is not necessarily good. It could be, uh, even tougher, even more of a problem going forward, in terms of thinking about what it is that we really need to deal with the threats to our elections that come from this new information era, and what is actually going to be allowed by the courts.

Lisa: So how are private companies going to be forced to not ban certain speech that they don't want on their platforms? Is it because they're recognized as public platforms at this point?

Rick: So just like the New York Times or Fox News is a private company, and they get to decide what content to include or exclude, I would say that these platforms are in the same place. Twitter and Facebook are private companies; you can buy stock in them, you know. And so the argument that they can be regulated is one that says

that these companies are less like newspapers and more like telephone companies, right? So the Supreme Court has said that telephone companies, for example, could be required to carry content. Or, to take another example, shopping malls can be compelled to allow, um, solicitors to collect signatures on ballot petitions.

I think that these companies are much more like newspapers. They curate content. It's not as though, when you go onto Facebook, you see anything that anyone has posted. For example, if people try to post pornography or certain kinds of hate speech or images of violence, that is curated. And not only is there content that is excluded by these platforms, some content is promoted, right?

So you see it at the top of your feed, and some content is, uh, demoted, you know, that's going to be at the bottom. And just like the New York Times decides which news and which opinions it's going to publish, that's the same kind of thing that these private companies do. Now, I do think that there are some problems with private companies, but I don't think that we want to deal with them through regulating their speech. So let me talk about a couple of things we could do. One is, if we think these companies are too powerful, we might use antitrust law and break them up and say, you know, Facebook shouldn't be able to also own WhatsApp and Instagram.

Right? So that's one thing we could do. That's not speech related; that would just create more companies. Uh, another thing we could do, and this is what I talk about in the last part of my book, is a kind of non-legal solution. We can put pressure on these companies. So we can pressure or boycott companies that don't deplatform people who call for violent overthrow of the United States government.

Uh, you know, so there are things that can be done both legally and politically to deal with, uh, what we might see as problems with how the platforms deal with the issue of speech, without requiring that they carry certain speech or that they impose certain evenhandedness requirements.

Lisa: Hm. So are there other means of regulating social media when we're talking about how these disinformation campaigns impact elections?

Rick: So one thing that we know is that when campaigns are deciding what messages to send to voters, uh, in the old days, you know, most of those messages would be for all voters. So you'd advertise on TV and radio. It's possible you'd have somewhat different messages sent in the mail to some voters, to try and target messages to, say, women or a particular ethnic group or something like that.

Today, it's possible for campaigns to, uh, piggyback on the data that social media companies have collected and use that to, what is called, microtarget political ads. So, for example, a campaign might have a number of, uh, potential voters who fit into a certain category.

Say they're targeting young, unmarried African Americans. And so they pick a group of their supporters, or people whose, uh, email addresses or Facebook profiles they've collected, and they can share that with Facebook, and Facebook will find other people through what's called the lookalike feature.

And they will target voters who have similar patterns based on the data that Facebook has collected, and they will target advertising just to these groups. And this doesn't necessarily spread misinformation, but it lets campaigns talk out of both sides of their mouths, and really use the private data that individuals have given up by choosing to use a service like Facebook, use that private information as a way of microtargeting political messages just to these groups. And I think that this can further polarization, and it could further the kind of, uh, divisiveness that we see in our electoral politics. And so one of the things that, um, I propose in the book is that, uh, there'd be a ban on the microtargeting of political ads. Campaigns can still send out whatever they want, but they can't piggyback on the kinds of deep data that companies like Facebook are able to collect and use that data in order to send these microtargeted messages.

Now, this is another area where I think there's a big question whether, under the First Amendment as the Supreme Court currently understands it, such a law would be permissible. I think, in fact, it's likely that the conservatives would reject such a law, but I think that's because they don't recognize, uh, the true danger that we face today from this kind of ability to manipulate private information, to aim ads at discrete groups of individuals based upon data that people don't recognize they're handing over. And so if I'm right that the Supreme Court is unlikely to uphold such regulation, we might have to go to the fallback, which would be going to the social media companies, as a public, and trying to pressure them against the, uh, ability of campaigns to microtarget political ads.

Lisa: Um, so it seems like maybe policy actors don't have a full understanding of the complexities of this digital era that we live in, especially when it comes to targeted voter information or targeted political information. Would you agree with that?

Rick: Well, what I would say is that, you know, the biggest problem is that voters, who use the social media companies, do not realize how much data they give away. You know, every time that they click, they provide these companies with more information about their preferences and that information can be used as a way to try to manipulate them and really undermine the idea of free choice.

It's like the difference between being able to take a photograph of someone and use that information, say, to target an ad, versus a mind-reading machine. You know, what the social media companies have is getting much closer to the mind-reading machine.

Lisa: Absolutely. And that is quite a scary mind-reading machine.

Rick: Right. 

Lisa: Um, and I want to ask a little bit about, of course we're here at the No Jargon podcast. This is also a part of the Scholars Strategy Network, which means that we're constantly thinking about ways that scholars can use their expertise to help solve problems our country is facing.

So in addition to lawmakers and private corporations making necessary changes to fight the disinformation campaigns, can you share some steps that scholars can take as well, to help here?

Rick: Well, you know, one of the things that, um, Professor Nate Persily of Stanford has called for is legislation that would open up the social media companies for research. And I talk about this, uh, briefly in Cheap Speech. Uh, in many ways, these companies are black boxes. Facebook, especially, doesn't share much information.

They were sharing some information with a professor at New York University. They didn't like the way her research was going, and they shut her out and claimed that she had violated their rules. It's not clear that she violated their rules. But I think that scholars need to organize in order to get this information and be able to use it in order to figure out what can be done about these problems, about people losing confidence in the fairness of the election process, about people believing conspiracy theories that might affect how they choose to vote. How can these things be understood? So, for example, there's a debate as to how much disinformation actually affects people's decisions, right? That is something for which we can have a lot more research. I think we do know that disinformation can have real-world consequences. When we look at the events of January 6th, 2021, you know, that is, to me, the poster child for those who say disinformation is not really a problem, prove to me that it's a problem.

I say that without the disinformation that was spread on social media, we never would have had that.

Lisa: Thank you so much, Rick, for coming on No Jargon. It was a pleasure to speak to you. And I'm glad that we could talk a little bit about some of the solutions that we can take to hopefully better our democracy.

Rick: It was great to be with you. I appreciated the conversation.

Lisa: For more on Professor Hasen's work, check out our show notes at dot org slash no jargon. No Jargon is a podcast of the Scholars Strategy Network, a nationwide organization that connects journalists, policymakers, and civic leaders with America's top researchers to improve policy and strengthen democracy.

The producer of our show is Mandana Mohsenzadegan. If you like the show, please subscribe and rate us on Apple Podcasts or wherever you get your shows. You can give us feedback on Twitter at No Jargon Podcast or at our email address, [email protected].
