In this episode of Democracy and Destiny, host Ciara Torres-Spelliscy—law professor and Brennan Center fellow—examines the rise of “crypto bros” and the corrupting influence of unregulated digital money in American politics. She covers the conviction of former Congressman George Santos for wire fraud, identity theft, and FEC violations and later speaks with Harvard Law professor Lawrence Lessig about the dangerous intersection of AI, social media, and campaign finance. They unpack how engagement-driven algorithms polarize the electorate, the broken promises of copyright law in the digital age, and the looming threat of AI-manipulated elections.
Democracy & Destiny with Ciara Torres-Spelliscy, Episode 2 with Professor Larry Lessig
[00:00:00] This is Democracy and Destiny with Ciara Torres-Spelliscy. I have the per curiam opinion and judgment to announce on behalf of the court in Buckley against Valeo. We have a cancer within, close to the presidency, that's growing. In case 08-205, Citizens United versus the FEC, Justice Kennedy has the opinion of the court.
The First Amendment's core purpose is to foster a vibrant political system full of robust discussion and debate. There is no right more basic in our democracy than the right to participate in electing our political leaders. With fear for our democracy, I, along with Justices Kagan and Jackson, dissent. Welcome to the show.
Ciara Torres-Spelliscy: I'm Ciara Torres-Spelliscy. I'm a law professor in Florida at Stetson [00:01:00] Law, and I'm a fellow at the Brennan Center for Justice at NYU School of Law. I work on the intersection of election law and corporate law. This show was inspired by my third book, Corporatocracy: How to Protect Democracy from Dark Money and Corrupt Politicians, published by NYU Press on Election Day 2024. I realize in today's busy world, reading a 300-page book is not on everyone's to-do list, but even as a law professor, I have the time to listen to radio shows and podcasts when I'm commuting to campus or walking my dog. So here we are. This is "Democracy and Destiny."
Today's episode is about the new money in politics, especially from crypto bros. In a few minutes I will be joined by my guest, Professor Larry Lessig from Harvard Law School, who will talk about the intersection of technology, law, and campaign finance. First, let's start with pay to play today.[00:02:00]
The term pay to play comes from the radio payola scandals of the 1950s and 1960s, when record companies would pay radio stations to play their music. Hence, it was literally pay to play. Today the phrase pay to play is shorthand for all kinds of political corruption, especially when government contractors or others with business pending before the government pay bribes to public officials to get a private benefit, like a lucrative no-bid contract or approval of a corporate merger. One of the things I learned while writing Corporatocracy is that political corruption is prosecuted frequently, but the media just doesn't report it as often as other things, like celebrity news. That leaves the public with the misimpression that corrupt politicians or shady government contractors are getting away with crimes all the time.
They are [00:03:00] not. So I swore to myself that if I ever had a news-generating platform, I would highlight that political corruption can be met with serious legal consequences, including fines and jail time. So our example of pay to play today is from the DOJ. According to the Department of Justice, ex-Congressman George Santos was sentenced to 87 months in prison for wire fraud and aggravated identity theft.
Santos filed fraudulent FEC reports, embezzled funds from campaign donors, stole identities, charged credit cards without authorization, obtained unemployment benefits through fraud, and lied in reports to the U.S. House of Representatives. Former Congressman George Anthony Devolder Santos was sentenced by Judge Seybert. Santos pleaded guilty in 2024.
U.S. Attorney John Durham said, "George [00:04:00] Santos was finally held accountable for the mountain of lies, theft and fraud he perpetrated. For the defendant, it was judgment day, and for many of his victims, including campaign donors, political parties, government agencies, elected bodies, his own family members and his constituents, it's justice." "His lengthy prison sentence is a just ending for a weaver of lies who believed he was above the law," said Nassau County District Attorney Donnelly.
The Party Program Scheme.
During the 2022 election cycle, Santos was a candidate for the United States House of Representatives. Nancy Marks, who pleaded guilty on October 5, 2023, to related conduct, was the treasurer for his principal congressional campaign committee. Santos and Marks devised and executed a fraudulent scheme to obtain money for the campaign by submitting materially false reports to the Federal Election Commission.
[00:05:00] Santos and Marks agreed to submit false reports to the FEC. In fact, Santos and Marks both knew that these individuals had neither made the reported contributions nor given authorization for their personal information to be included in such false public reports... These falsely reported loans included one for half a million dollars, when in fact Santos had less than $8,000 in his personal and business bank accounts.
As a result of qualifying for the program, the congressional campaign received significant financial support.
The Credit Card Fraud Scheme.
Santos devised and executed a fraudulent scheme to steal the personal identity and financial information of contributors to his campaign. He then repeatedly charged contributors' credit cards without their authorization. Because of these unauthorized transactions, funds were transferred to Santos's campaign and to his own bank account. In furtherance of the scheme, Santos sought out [00:06:00] victims he knew were elderly persons suffering from cognitive impairment or decline.
Our next segment is Corruption Junction. I have been writing about the issue of money in politics for two decades. One way to think about this book is it is the Supreme Court's horrible Citizens United decision meets the horrifying events on January 6th, so that we are literally on the same page. Let me read a short excerpt from my book, Corporatocracy.
[Reading from Chapter 1 of Corporatocracy]
[00:14:06] Ciara Torres-Spelliscy: Now we get to the heart of the matter, which is the problem of money in politics. My guest, Professor Lawrence Lessig, is a professor of law at Harvard Law School. He is the author of several books, including They Don't Represent Us; Republic, Lost; and How to Steal a Presidential Election. You may be the only person I'm interviewing for this series who is a technology expert. So I'm glad to have you here today to speak about the state of American democracy and to help us think through how technology policy intersects with money and politics. Welcome to this section of the show, which is called Corruption Junction.
[00:14:41] Larry Lessig: It's great to be here. Thanks for having me.
[00:14:43] Ciara Torres-Spelliscy: Where did you grow up?
[00:14:44] Larry Lessig: So I grew up in the kind of Kentucky part of Pennsylvania, a town called Williamsport. I lived there from about age six until I went to college.
[00:14:52] Ciara Torres-Spelliscy: And what inspired you to study law?
[00:14:54] Larry Lessig: It was actually my uncle. In 1974, he was asked by the House to [00:15:00] be the lawyer who would explain to members of the House why Nixon was guilty of high crimes and misdemeanors.
The weekend in August before Nixon resigned, he came up to visit us. He took me for a long walk, and he said to me, you know, law is the only discipline, the only field, where your power comes through reason, not through money or through force. There was something deeply attractive about that idea.
[00:15:27] Ciara Torres-Spelliscy: So you and I have crisscrossed paths many times since 2010, when the Supreme Court decided Citizens United. There is this democracy circuit for academic, nonprofit, and philanthropic symposia where we have often been in the same room. I remember seeing you in Los Angeles, in New Orleans, in New York, and you were even kind enough to give a keynote speech at my law school.
[00:15:53] Larry Lessig: I remember. Yeah. Yeah.
[00:15:54] Ciara Torres-Spelliscy: Stetson. And then I saw that one of your latest TED Talks was in Berlin. So I have to [00:16:00] ask, do you actually enjoy all the travel that comes with your life?
[00:16:03] Larry Lessig: So actually, I used to. Before COVID, I traveled every week, and then COVID taught me I don't like to travel, because it turned out I really loved being at home.
[00:16:14] Ciara Torres-Spelliscy: What do you do to keep yourself sane during these trying times?
[00:16:18] Larry Lessig: Well, the only sanity is my kids. I think anybody who has a clear sense of just what's at stake can't help but be terrified, and especially lawyers, when you see the extraordinary transformation that the administration's trying to effect.
It's hard to explain it to people who aren't lawyers, but, you know, for those of us who see it, you can't unsee how profound and significant it's going to be.
[00:16:42] Ciara Torres-Spelliscy: So if you could go back to around maybe 1990, how would you have wanted the internet to be regulated or not regulated back then?
[00:16:51] Larry Lessig: The thing that broke the internet wasn't conceivable by anybody back then because the thing that's broken the internet is the [00:17:00] engagement business model, which is driven by AI, which is focused on changing your preferences to turn you into the sort of person who can be easily predicted so that the platforms can rent you out to the highest bidder. And they've been enormously successful in that, in developing technologies that force us to obsess with our machines and spend as much time as we possibly can on our machines so that they can rent us out.
All of that was inconceivable, you know, in the 1990s because the technology wasn't there, the machines were not powerful enough. Nobody had demonstrated quite how it worked. This is before Google had even been born, and certainly before the founders gave up on their fundamental commitment not to have advertising at the core of Google.
I think that people sometimes think that the internet is a thing that's existed, sitting in the public's mind, since the 1990s till today. But the internet's actually been very many things. And the thing we ought to understand it as today is not a [00:18:00] technology but a business model, and recognize that that business model has imposed extraordinary externalities on society, as it's turned us into extremists who hate each other so that they can sell more ads.
[00:18:12] Ciara Torres-Spelliscy: Yeah. One of the things I tell my law students is you are the product. When you are on these big social media networks, you are the product. They want to sell your information…
[00:18:24] Larry Lessig: Yep.
[00:18:24] Ciara Torres-Spelliscy: To advertisers so they can better target ads at you. And this is true also in the political realm: they wanna be able to target political messages that will appeal to you.
[00:18:37] Larry Lessig: But what's important about that is to emphasize that it's not just passive. It's not just that they're kind of understanding who we are and feeding us what we want. The really critical thing, which, you know, Stuart Russell wrote about, is that the machines are learning how to change our preferences so that they can serve us ads better. What Max Fisher's book The Chaos Machine demonstrates is that if you [00:19:00] look at the spread of social media across major democracies, you can see that the more social media spreads, the more polarized and radicalized the public becomes.
Because social media realizes that if it can turn us into right-wing or left-wing crazies, we're more reliable products. We're more easily manipulated and steered into whatever content they can rent to the highest bidder. So it's a kind of manipulation of us that the technology is effecting, not just serving us what we would actually choose to watch.
[00:19:33] Ciara Torres-Spelliscy: One of the first places I encountered your work was when I was a Columbia Law student. At the time, what had just happened was the auctioning of spectrum that was going to be used by cell phone companies. Was there a better way to distribute that spectrum that would've been more small-d democratic?
[00:19:54] Larry Lessig: Oh, absolutely.
That was a critical mistake, even though it was progress [00:20:00] over the old model for allocating spectrum. From the beginning there were skeptics about this, like Ronald Coase, who said that, you know, there's no more reason to license spectrum than there's reason to license where you live on your property.
You could sell spectrum just like you could sell property, and if you sold spectrum, then people could resell it to the people who value it the most, and you would get a kind of efficient spectrum market, just like you would get an efficient property market. But what Coase didn't recognize, 'cause it wasn't actually common or understood at the time, was that there was another way to allocate spectrum, which is to use technologies that effectively share the spectrum space and share it in a radically efficient way.
This is basically the technology that WiFi works on, but WiFi has actually been allocated to a pretty terrible slice of the spectrum. This alternative kind of ultra-wideband spectrum allocation would basically say anybody can be using the spectrum, and the technology would be smart enough to figure out how to share the use of the [00:21:00] spectrum, which would facilitate a wider group of people being able to use the spectrum than just those who are able to pay to buy access to it.
And that spectrum as a commons would've been a far superior way to allocate spectrum than auctioning it off. Of course, the government wouldn't get as much money, but the economy would be much richer, because there would be much greater competition about exactly how best to deploy and facilitate use of this spectrum resource.
[00:21:27] Ciara Torres-Spelliscy: What is the biggest flaw in the Digital Millennium Copyright Act?
[00:21:31] Larry Lessig: The biggest flaw is the decision to entrench a particular business model for copyrighted material, and that business model is the idea of facilitating the lockdown of content to authorize quote-unquote "sales." But it turns out it's not really sales. It's basically licenses, for a period of time, of access to that content, and locking up any other use, and locking up or making illegal the ability to hack or work around the [00:22:00] technical measures that have been deployed to give the copyright owners maximal control over their copyrighted material. And the reason that's a mistake is not because one should be against copyright. I think copyright's incredibly important. It's because it entrenched businesses who ultimately were not so interested in the artists. You know, at the beginning of the internet, everybody was afraid of Napster because of its ways of sharing content for free, and how are the artists gonna get paid? But I think if you talk to most artists and you ask them, how much are you being paid by Spotify for access to your content, they would tell you not very much.
You know, pennies on the hundreds of thousands of listens to that content. And that's because the system essentially locked in access to these huge players rather than facilitating much more dynamic competition and business models to facilitate artists as well as access to their work. I think the best example of this is that way back in the day, I became friends with Sean [00:23:00] Parker, who at the time was at Napster and eventually became one of the earliest Facebook investors.
In his days when he was with Napster, he was a deep advocate of changing copyright laws to facilitate greater competition in business models for building and developing artists. About a decade after I first met him, I was on a panel with him. At that point, he had become a board member at Spotify. I was astonished that he was so unreflective about this transformation, but he said, you know, I've now decided that I think copyright laws are really good. And of course, what he was saying is that the laws, the way they are, made it extremely hard for any other business to try to compete with Spotify. And of course now he had a vested interest in Spotify being the number one platform to get access to music.
And so that's not surprising if you're interested in it for the money, but if you're interested in it for the way it facilitates support and protection of artists, I'm not sure this entrenching of the power of the labels or the [00:24:00] platforms actually made any sense.
[00:24:01] Ciara Torres-Spelliscy: What is Creative Commons?
[00:24:03] Larry Lessig: We were in the middle of the copyright wars, where it seemed like there were just two sides to this war: either we were for copyright or against it. We recognized there was actually a third position, which is: not people who oppose copyright, but people who wanted to create work and wanted to assure that it could be shared and built upon freely. So rather than all rights reserved or no rights respected, this perspective was some rights reserved, and some rights given over to the public, to make it easy for the public to know how they could build upon and share the work. So we launched these licenses as a way to give people the opportunity to show through their acts that they wanted their creative work built upon and shared.
But very quickly, people rallied to the recognition that this, in fact, was what they thought about copyright.
[00:24:55] Ciara Torres-Spelliscy: Let's take a short break and we're [00:25:00] back.
[00:25:17] Larry Lessig: If you're in the education business or if you're a scholar, platforms like Wikipedia are committed to the idea that the people contributing to their platform or people who are sharing their content wanna guarantee that others are free to share that content too. These licenses provided an infrastructure for doing that and have since become the infrastructure for the Open Access publishing that has become central to the way many academic journals work and central to Wikipedia.
[00:25:46] Ciara Torres-Spelliscy: When I was at the Brennan Center, we would release our reports using Creative Commons licenses. I don't know if they still do that, but it was important to me when I was there. Facebook starts in 2004. Personalized Google [00:26:00] search starts in 2005, as does YouTube. Twitter starts in 2006. These entities created the information landscape that most Americans live in. So is the information silo problem on social media a problem for American democracy?
[00:26:18] Larry Lessig: It's a severe problem, but I really think that there were stages of that problem.
And, you know, in the early stage you would focus on the fact that these were siloed platforms. It was hard to move content between them. So it was not the old internet, where there was basically one internet and everything on it. It was like four or five separate internets that had people in their walled gardens, to prune and make flourish as best they could.
The real change for each of these platforms was recognizing the full potential of AI as a driver of content. You know, at the beginning of social media, what drove content, what made something viral was a certain artistic skill [00:27:00] of the artists who created it, like it was humans that were genius at figuring out a meme or an image or a turn of phrase that would then inspire others to wanna click and reshare it.
That creativity was extremely valuable for a whole bunch of people. You think about a place like Buzzfeed, which hired an incredible number of social media creators; what made them great was that they had a special talent, and that talent was to capture people's attention and get them to act for you by sharing the content with others, and thereby drive attention to whatever the ultimate advertising objective of the platform was.
But what social media discovered was that humans were good, but machines were orders of magnitude better. And so they increasingly deployed technology to manipulate the content, to turn the audience into a more reliable, more predictable product that they then could lease out to people [00:28:00] who wanted access to people's attention.
So, you know, Mark Zuckerberg famously said, look, I'm happy to respect privacy. I'm happy not to keep any of the data the services are gonna gather on people. 'Cause he didn't need to. Once the AIs saw the data, they learned what kind of person you were. Like, you know, it turns out we're not very different and not very complicated.
Especially if they can turn us into radicalized lefties or radicalized right-wingers. And so the platforms increasingly did that, because they had a single objective. Their objective was to maximize engagement. Now, the byproduct of that engagement has been destructive to our democracy, just like the byproduct of the fast food industry has been destructive to our collective health. The fast food industry is out there just trying to find a way to sell more potato chips. It does that by constantly experimenting with a mix of salt, sugar, and fat to find the most addictive, most compelling processed food that it can. And the ultra-processed food industry does this in an even more [00:29:00] aggressive way. As they do that, what they've done is produce a public which is incredibly sick, because this food turns out not to be good for humans, even if it's the food that's the most addictive.
The same thing is happening with social media. These are companies that are in the business of engineering attention. They do that through computer technology that increasingly learns the best way to capture and to direct attention. The most troubling fact about that capture is that the best way to do it is to basically change our preferences. And as they change our preferences, the collective consequence, the externality from this, is that we become a society that understands each other less, is more polarized, more committed to our view, and more able to get a constant feed of information that confirms our view. It's a business model whose externality is the weakening, maybe the destruction, of democracy. And when you think about it like that, you know, I don't think there's ever been a point [00:30:00] in the history of humanity when the business model of media was against democracy.
The byproduct of this business model of media is an incoherent, radically under-informed public, even though we're spending more and more of our time looking at content than we have at any time in our history. So that's a product of the business model, and the business model is enabled by the technology, but it's distinct from the technology.
[00:30:24] Ciara Torres-Spelliscy: Sort of reminds me of the yellow press, and Hearst starting the Spanish-American War.
So Facebook, Google, YouTube, and Twitter are all just tools. You could use them to share cat videos. You could use them to undermine democracy. As Ani DiFranco says, "every tool is a weapon if you hold it right." How much do you worry about online political disinformation?
[00:30:45] Larry Lessig: So I worry a lot about it, but I wanna question the framing, because by calling it a tool, it makes it seem like there's an intentional actor behind it for a political purpose. If you're telling me we're talking about [00:31:00] X and Elon Musk, I'm willing to believe you've got an intentional actor there, deploying the tool of X for a political purpose. But the biggest consequence to our politics doesn't come from particular people deploying a tool in one way or another. And it certainly doesn't come from users of this quote-unquote tool deploying it one way or the other. You've got businesses that are deploying their platform to maximize engagement, with the consequence that we are radicalized.
But their objective is just to make money. You know, Mark Zuckerberg doesn't wanna destroy American democracy. It's just that's the only way he can make money, so that's what he does. And when we use the tool, and we like the things we like and we steer away from things we don't like, and we yield to the constant push or pull of these platforms, I think it's overplaying our role to say that this is now our tool. No, we are the tool of it. So I think that we need to recognize the loss of agency that we [00:32:00] individually have, and the loss of social agency that we have over these companies, because this business model is just so incredibly lucrative to them that they literally can't resist it. I was honored to be the lawyer for Frances Haugen when she came out as the Facebook whistleblower. The Facebook Files are filled with examples of engineers recognizing the severe harm the platform was doing to different demographics of their users and recommending solutions to Zuckerberg about how to make the platform less poisonous for young girls, or less poisonous for political speech by promulgating less misinformation. And the single dimension that Zuckerberg would apply to each of these proposed changes was how it would affect engagement. And if it reduced engagement, it was not gonna happen.
Now that wasn't because he liked what it was producing, it's just he was not willing to pay the price to remove what it was producing because that price was the price of reducing engagement, which is how he measured the [00:33:00] value of his company, how Wall Street measured the value of his company. I think we need to recognize we're kind of lost in the face of this technology and this business model.
Typically, we would say, well, this is a role for government to step in and figure out how to force them to internalize the externalities they're imposing. How could we tax the business model, or make it so the business model is no longer the model they choose? But of course, we don't have agency over our government, because our government itself is captured by the influence of the extraordinary money that's inside of our political system.
So that's how we're doubly bound here.
[00:33:33] Ciara Torres-Spelliscy: I wanted to focus a little bit more on artificial intelligence. Ethan Mollick is one of my friends. There's a lot of sunlight between me and Ethan on AI. He is quite the evangelist, and I think we should pump the brakes. So should AI be trained on copyrighted work?
[00:33:51] Larry Lessig: So I think absolutely we should be free to copy. We should be free to train on all copyrighted work. That doesn't mean copyright owners [00:34:00] shouldn't get compensated, but they shouldn't be compensated through the copyright system. I think we need to create a sui generis way to assure compensation for work that AI has been trained on.
But I think it's the basic commitment of copyright and the Enlightenment that the work that's created is out there to be learned from. And, you know, we're just at a stage where the most important entity learning stuff is going to be AI. And it's really learning stuff. AI reads it. AI has done the reading.
I've interrogated AI about my own work, and it is smarter than I am about my own work. It has better insights about, like, what I'm saying and the implication of what I'm saying, and certainly better insights than anybody on my faculty. So I think we need to accept that the future is a future where this will be the medium through which intelligence is generated.
Sure, let's make sure that people who create get compensated for it. But let's not start building blockages to stop [00:35:00] the machine from understanding the knowledge or the culture we're producing. Because what that's gonna do is make sure that the only culture that gets reported, or gets folded into this, is the culture that, you know, is most commercially valuable.
That's exactly the way to reinforce the way these algorithms change preferences. In the kind of 2010-to-2015 era, the thing we were all obsessing about was Chris Anderson's work about the long tail, and how what the internet did was open up the opportunity for all this diverse content to be accessible for the first time, and this was a great success.
The great thing about Napster was this archive of the widest diversity of musical content that anybody had ever seen. Well, the latest measurements of the long tail show that what AI algorithms are doing is replicating the days of, you know, Casey Kasem's Top 40. It's because the algorithms are [00:36:00] increasingly steering us back to the very same small universe of content that we had in the days before the internet.
And that's not surprising, again, because, you know, to the extent they can turn us into boring music listeners, there's a better, clearer market to be marketing to. The thing to worry about here is that we're gonna build rules into the system that will force culture into that narrow space. And I certainly don't support that.
Ciara Torres-Spelliscy: Are you worried about AI picking up human biases and magnifying them?
Larry Lessig: Absolutely. And it's an intractable problem, but it's in part an engineering problem and in part a foundational problem. We're not gonna get rid of it, but in part it's engineering. There are clever and important ways that we can avoid and minimize some of it.
At least the difference between that bias and the bias that's built into our system right now is that we can see that bias. We can track it, and we can recognize its manifestation. [00:37:00] The bias of, like, federal district court judges is much harder to track or to understand.
[00:37:06] Ciara Torres-Spelliscy: What are positive and negative uses of AI with respect to our democracy?
[00:37:11] Larry Lessig: The positive uses are the extent to which it encourages and facilitates people having a less targeted understanding of reality. It angers Elon Musk that his AI is not sufficiently right-wing or white nationalist. There are all these demonstrations of how Grok gets reality right, and that reality is inconsistent with Elon, with Grok directly calling out Elon as one of the biggest misinformation purveyors on the internet.
So to the extent we're talking about platforms like that, it could be a complement, with an e: it could help democracy by helping us have a better foundation, a better basis, in understanding reality. The worst thing AI is doing for democracy it's already done, because it's AI [00:38:00] driving the algorithms behind social media.
Those algorithms are incredibly destructive, so I think that's a terrible use of AI. And there's a great piece by Aza Raskin, whose great sin in life was inventing the infinite scroll, and who is doing his penance by co-founding the Center for Humane Technology with Tristan Harris. Aza describes 2024 as the last human election, and by that he means that by 2028 the following is completely plausible.
So imagine people start deploying AI bots designed to just engage with people in a playful, fun way. And, you know, you could say, let's pick a target demographic: white men under the age of 30. With that target demographic, let's start deploying video-generated OnlyFans AIs. And the objective of these video-generated OnlyFans AIs is to induce this target demographic to spend as much time as they can with the AI.
Use your [00:39:00] imagination to imagine how in fact they do that. But as they do that, they develop models of the target audience that they are engaging with. Those models help them understand how they might manipulate that target audience. So they could spend three years talking about whatever, and then, come time for the election, they could begin to leverage their knowledge, their persuasive authority, their withholding of affection or their withholding of attention, in exchange for the target doing what they want it to do. You know, once you describe it, you're like, whoa, there's nothing stopping that right now. I've just described existing technologies. We're not talking about technology that tries to fool you, 'cause the people using these video-generated AIs on OnlyFans are not confused about whether this is a human on the other side; they're just enjoying the engagement, so that's why they engage. It's completely plausible that this could be being built right now, which means that it must be being built right [00:40:00] now.
There must be people doing it right now, and it's completely invisible to the political process. Like, you could be doing it, generating your huge following. You are now capable of manipulating them as you wish, and you don't have to file a single FEC report for what you're doing. And then at the very end, you use your persuasive authority, just like, you know, Taylor Swift, God bless her, tried to use her persuasive authority to turn her base against Donald Trump. When you use your persuasive authority, it's just what famous people have been doing since time immemorial. But the difference between Taylor Swift and these AIs is that there's one Taylor Swift, and her connection to your psyche is very weak and different for different people. But there'll be 10 million of these AIs, and they will know you precisely, and they will know exactly what buttons they need to push, and they will push those buttons.
And who knows what the product of that will be? It depends on the arms race [00:41:00] between one side and the other in deploying this technology.
[00:41:01] Ciara Torres-Spelliscy: Last Supreme Court term, I was expecting two blockbuster First Amendment cases about social media, Murthy versus Missouri and Moody versus Net Choice, and instead we got a punt and a remand. What did you think of Murthy versus Missouri? Should the White House be allowed under the First Amendment to pester big social media platforms to change their content moderation policies?
[00:41:25] Larry Lessig: The problem is, at that level of generality, it's hard to answer. I think what is frustrating about the particular context of these cases is they all grow out of a catastrophic national threat.
We all know that the history of the First Amendment is a history that's contingent upon whether we are facing a catastrophic national threat. So in the middle of World War I, the First Amendment looks different from what it looks like, you know, in the middle of the 1970s, and that's the way it should be. To the extent you've got a government saying, hey, what you're saying is actually leading [00:42:00] people to do things that threaten their health and we'd like you to stop, I don't have any problem with that. If instead you're saying, hey, what you're doing is advancing an ideology that I don't agree with, a set of political values that we don't agree with, or, you know, if you deploy the kind of extortion-racket strategies which the current administration is deploying against all sorts of institutions, including the press (see, for example, CBS being threatened with a $20 billion lawsuit on the basis of a legal claim that has zero plausibility in American law given the First Amendment), then I'm much more troubled by it.
Ciara Torres-Spelliscy: Do you have any thoughts on Moody versus NetChoice, which decided that editing news feeds is like editing the New York Times?
Larry Lessig: The way that case was written makes it sound like, all the way down, we're gonna have this kind of First Amendment blocking the ability to regulate the environment of newsfeeds or the environment of these social media platforms. And if that's true, then we're in trouble. [00:43:00] I don't quite think that's necessarily what it set up.
And I do think the court's gonna be open to understanding the distinctive character of the AI-driven platform content that newsfeeds represent. I do think there'll be more space for government regulation than we've seen right now, but I worry that the First Amendment's gonna become an insuperable barrier to doing sensible regulation in this context.
[00:43:24] Ciara Torres-Spelliscy: Should the courts apply the company town case, Marsh v. Alabama, to either social media platforms or something like the Metaverse?
[00:43:33] Larry Lessig: I certainly have believed that. Zephyr Teachout and I wrote an amicus brief in the context of the Texas and Florida cases about, you know, thinking more like a PruneYard standard, thinking about how we understand these as themselves kind of governmental contexts, and how they should be able to create environments that are respectful of the values that you're trying to establish.
But I don't think we're there yet.
Ciara Torres-Spelliscy: Did [00:44:00] the Supreme Court get it right in upholding the TikTok ban?
Larry Lessig: I mean, the problem with it is, on the surface, absolutely, because it was a regulation of a foreign company in the context of American political speech, and that traditionally has been completely permissible.
The challenge to it is that it wasn't really about security, and it wasn't really about foreignness. What we know from the people on the Hill, in fact from Mike Gallagher, the congressman who was successful in pushing this along, is that the only reason Congress passed this is that there was too much pro-Palestinian speech on TikTok, and they didn't like the fact that all of these terrible images of Palestinian children were spreading so broadly among the youth in America. If ever there was a case for testing the question whether the ulterior motive is the real motive, I think this should have been the case.
[00:44:49] Ciara Torres-Spelliscy: If you could defenestrate a legal concept out of American law, what do you think should be in the dustbin of history?
[00:44:57] Larry Lessig: The idea that Citizens [00:45:00] United gave us Super PACs. My friend Bernie Sanders is constantly out there saying Super PACs are the end of American democracy. True.
And therefore we have to overturn Citizens United. False. Because Citizens United did not give us Super PACs. It was SpeechNow, a D.C. Circuit case three months after Citizens United, that gave us Super PACs, and it did so on the basis of an obvious logical mistake. And that case was never appealed to the Supreme Court, because the Attorney General didn't believe it was gonna be a significant category of speech.
Once you point that mistake out and get that case before the Supreme Court, I predict the court is gonna recognize that SpeechNow was wrong and that Super PACs are not constitutionally mandated. And indeed, right now most of my work is with people in Maine, who have succeeded in getting an initiative passed that bans Super PACs in Maine,
[00:46:00] which was immediately challenged by a group funded by Leonard Leo. We recruited Neil Katyal to intervene to defend the law, and we're arguing to uphold the law. And the arguments to uphold the law are, I think, overwhelming. And I don't think the court has any reason not to uphold the idea that states and the federal government have the right to limit the size of contributions to an independent political action committee, even if they don't have the right to limit how much that committee spends. So we've gotta give people a sense, gotta give people hope. Because when you say what you gotta do is overturn Citizens United, what anybody who knows anything knows is that there's zero chance we're gonna overturn Citizens United. But when you say to the court, look, embrace the logic of Citizens United, what follows from that is that Citizens United does not entail Super PACs.
Then I think there's a fighting chance of removing 80% of the problem simply through the court correcting a lower court mistake.
[00:46:58] Ciara Torres-Spelliscy: Is [00:47:00] cryptocurrency inherently anti small d democratic?
[00:47:04] Larry Lessig: No, I don't think it's inherently anti-small-d democratic. There's a lot of fraud and, you know, manipulation and corruption that is deployed in the current infrastructure or regime of cryptocurrency. I mean, look at the president running meme coins, where he is auctioning off access to himself in the White House to the top holders of his own meme coin, which has led to his personal wealth doubling in the first hundred days of his administration.
All of that is corrupt for a million reasons unrelated to the architecture of cryptocurrency. It's corrupt because we don't know who's buying these crypto coins. There's no, you know, FEC requirement that you turn over who the owners of these are. And there's a very simple way that foreign governments can parlay a favor with the White House by buying huge amounts of this and then letting the White House know that they're the ones purchasing it, so that therefore [00:48:00] there's a favor to be returned. So I think it's used right now in an incredibly corrupt and terrible way, just like, you know, it's used to facilitate the buying and selling of illegal content, like child porn or drugs.
That's certainly true with the technology, but I don't think the technology is inherently evil or inherently bad. I think if we had a well-regulated infrastructure for cryptocurrency, it would be a valuable addition to the finance mix.
[00:48:26] Ciara Torres-Spelliscy: So my favorite Scalia quote says, “requiring people to stand up in public for their political acts fosters civic courage without which democracy is doomed.” What do you think would foster civic courage today?
[00:48:38] Larry Lessig: I think the most inspirational would be people willingly exposing themselves to criticism by engaging with people they disagree with. The most poisonous reality of politics today is that we are given the opportunity to live within our own bubbles and [00:49:00] praise ourselves, because no one within our bubbles questions what we believe, but that dynamic is deeply destructive of democracy.
So I think we ought to be encouraging and rewarding people who put themselves out there in a context where they're not surrounded by people they agree with and who work hard to try to find common ground nonetheless.
[00:49:20] Ciara Torres-Spelliscy: Well, thank you so much for being here today.
[00:49:22] Larry Lessig: Sure. My pleasure. Thanks for having me.
[00:49:29] Ciara Torres-Spelliscy: Let's take a short break. We're back. As someone who spends her time focused on political corruption, it's easy to end up with a dim view of humanity and to get dispirited about everything. But one thing that has kept me happy and sane over the past eight years is my 100-pound chocolate labradoodle. So with your indulgence, and to lighten the mood, let me share my life motto with you, which is: "Loves dogs, hates corruption." One of the bad things about living in Florida is [00:50:00] hurricane season. It's a lot to be at the tip of the spear experiencing climate change. When I first moved down to Florida, I remember asking other professors at Stetson, does Tampa get hit by hurricanes?
And the response that I got at the time was that the Tampa Bay area hadn't been hit with a major storm in 80 years. And that was true when I asked that question, but in the time that I've been here, it has been hit by three major hurricanes. One of my only regrets with having a 100-pound chocolate labradoodle is that he is simply too big to fly, but that means he is in the car with us.
One of the webpages that we use when we're traveling is called BringFido. They have a great list of dog-friendly hotels, restaurants, and parks. When Hurricane Irma hit, that was a wild experience, because the entire state ran out of [00:51:00] gas. So by the time we decided that it would be wise for us to evacuate, it was actually no longer safe to leave, because we would risk getting stuck on a highway with no gas. We were in our home when Irma hit. That was one of the scariest nights of my life. And then after the storm, we didn't have power at my house for a week, and at my law school, they didn't have power for two weeks. As a family, we now tend to get out of Florida if a forecast says that a hurricane is likely to hit nearby. What that has meant is driving out of Florida for 8 to 10 hours. Whoever is in the backseat has the 100-pound doodle snuggling them. His favorite was a hotel in Atlanta, because there was a big roomy couch that he could chill out on while we obsessively watched the news to see how the hurricane was impacting our area. I must say, if you have to experience the [00:52:00] stress of outrunning a hurricane, then I highly suggest taking a big snuggly dog along with you, so that there's something positive to an otherwise miserable experience. Okay, now back to business.
Now we get to our final segment, "The Fix Is In." Many of the problems with our democracy seem unfixable, but that is not true. These problems were created by human beings, and they can be solved by human beings. We can improve laws and practices at the local, state, and federal levels. One of the topics we discussed today was the role of technology in harming our democracy.
Concern about the power of big tech crosses party lines. If this is a topic that you care about, ask your member of Congress and your two Senators to enact laws that reinforce democracy through technology. As you're using your right to petition the government, you can also encourage your congressional representatives to [00:53:00] keep the First Amendment in mind.
You should be choosy about which platforms get your attention. If a platform has given up on fact-checking or has turned into a free-for-all of misinformation, you can choose to spend your time elsewhere. Just remember that democracy is worth defending, and a little truth goes a long way.
Thank you to my guest for joining me today. This is a production of Ciara Torres-Spelliscy, who can be found on social media as ProfCiara, P-R-O-F-C-I-A-R-A. The episode was mixed by WBAI. Our logo is by Entire World. Theme music was composed and performed by Matt Boehler. This show is based on the book Corporatocracy, published by NYU Press.
This has been Democracy and Destiny with Ciara Torres-Spelliscy.