title The one thing the Supreme Court won’t touch

description The Supreme Court is aggressive on almost everything. Except the internet.

Sean talks with Vox’s Ian Millhiser about a surprising pattern at the Court. While the Court has been eager to reshape schools, healthcare, and civil rights law, it has consistently taken a cautious, almost hands-off approach to regulating the internet. They unpack a recent case involving music piracy, the broader legal fight over who’s responsible for what happens online, and why even a highly ideological Court seems wary of breaking the digital world.

Host: Sean Illing (@seanilling) 

Guest: Ian Millhiser (@imillhiser)

We would love to hear from you. To tell us what you thought of this episode, email us at [email protected] or leave us a voicemail at 1-800-214-5749. Your comments and questions help us make a better show. And you can watch new episodes of The Gray Area on YouTube. New episodes drop every Monday and Friday.

Listen to The Gray Area ad-free by becoming a Vox Member: vox.com/members.
Learn more about your ad choices. Visit podcastchoices.com/adchoices

pubDate Fri, 24 Apr 2026 08:00:00 GMT

author Vox

duration 2417000

transcript

Speaker 1:
[00:01] What to Make of a Life is the new book from Jim Collins, bestselling author of Good to Great. Based on 10 years of research, What to Make of a Life offers transformative teachings on what it takes to navigate your way through periods of fog, make it past life's inevitable cliffs, and keep the inner fire burning bright long and late. Step into frame with What to Make of a Life, the instant New York Times bestseller by Jim Collins. Available from Harper Edge wherever books are sold.

Speaker 2:
[00:59] Learn more at adobe.com/do that with Acrobat.

Speaker 3:
[01:11] Ian Millhiser, welcome to the show.

Speaker 4:
[01:13] It's good to be here, thank you so much.

Speaker 3:
[01:15] Good to finally get you in. Just as a little bit of an introduction, Ian is a senior correspondent here at Vox. He's been covering the Supreme Court and the Constitution since 2019, I guess. Basically, Ian is our resident lawyer covering the Democracy Is Collapsing beat. Ian is the author of a new piece about what I think is a surprising pattern at the Supreme Court: for all of its aggressiveness in a ton of areas, some of which we will get into, the court seems unusually cautious when it comes to regulating the internet. So given the stakes of the internet, I thought now would be a great time to get you on the show to break it down for us. How's that?

Speaker 4:
[02:01] That sounds great.

Speaker 3:
[02:02] All right, let's do it. Okay, so this court, clearly, even if you're not a lawyer, even if this is not your beat, you are aware that this is a court that is comfortable shaking things up when they're feeling the Holy Spirit, as it were. But they do seem unusually wary of doing too much damage to the internet. Let's just start with a quote from Justice Kagan, which is in your piece. She said, and now I'm quoting, "These are not, like, the nine greatest experts on the internet." But what do you think she really meant, beyond the obvious there?

Speaker 4:
[02:36] Yeah. Well, let me contextualize the quote, and I guess this broader conversation, because like you said, this is a court that rarely asks the question, hey, do we, the nine justices, actually know what we're talking about here? They often just do things. We're in the middle of a period where the court is being very aggressive in asserting its dominance over public schools. Historically, outside of segregation, the Supreme Court has stayed very, very far away from curriculums, from the day-to-day management of schools, for the simple reason that these are nine people with law degrees. What the hell do they know about administering a public school system? We've had these two cases recently, Mahmoud and Mirabelli, where the Supreme Court has been very aggressive in saying that public schools need to be very attuned to the desires of religious conservatives, especially when those religious conservatives object to queer people, queer themes in lessons, books that even have gay characters in them. The Supreme Court has probably made it impossible to teach a book that has a gay character in it. And so that's something that's very new. The court has been tearing down independent agencies. One of the court's major projects in the Trump administration: we have all these agencies, the FEC, the FCC, the Federal Trade Commission, where the president is not allowed to fire the commissioners. And the reason why is because Congress thought it was important that some of these agencies be somewhat insulated from the politics of the moment. And the Supreme Court is tearing that institution down. So this is the least conservative court I have ever covered.

Speaker 3:
[04:34] You might call them activists in several areas. An activist court, yeah?

Speaker 4:
Yeah, you know, they are politically very far to the right, but they don't want to conserve anything. And in the context of this very aggressive court, this court that overrules major precedents all the time, they are just very, very timid about doing anything that could change how the Internet operates. And this at a moment when there's a lot of public clamor for more regulation of the Internet. Their hands-off approach to the Internet is very much out of step with how they approach virtually every other issue that has come before them.

Speaker 3:
[05:14] Let's just sort of start at the beginning, which is this recent case. Let's just assume most people are not nerds for this kind of thing and are not tracking individual cases. This is what you wrote about, right? This is the Cox Communications v. Sony Music Entertainment case. At its most basic level, what was this case about?

Speaker 4:
[05:30] Yeah. So Cox is the most recent in a series of cases where various plaintiffs have come up with very aggressive legal theories that would put internet-based companies into a serious bind if the courts were to embrace them. The facts of Cox: this was a suit brought by the music industry against an internet service provider, Cox. And essentially what the music industry wanted was for the courts to say that if you are an internet service provider and you know that one of your customers is downloading illegal music and you don't cut them off, then there are just tremendous financial consequences. A jury handed down a billion-dollar verdict against this one internet service provider. Billion. Billion with a B. Because the internet service provider wasn't cutting people off fast enough after they downloaded illegal music. The Supreme Court didn't just reject that legal theory, they nuked it from orbit. They said that a provider can be liable only in very limited circumstances: you have to show that the service provider either intended for people to pirate music using its system or that it marketed it that way. "Use our internet service and you can download the new Taylor Swift album for free." Unless the internet service provider is running ads like that, they can't be sued. I think that's the right outcome. Part of the reason why the court reached the decision it did is because the method that the music industry was using to track who was infringing was very imprecise. They might determine that someone in an entire college dorm or an entire hospital or an entire office building was illegally downloading music. But what they're asking for is for the whole hospital to lose its internet service.
I think the Supreme Court was right to say that an entire hospital or entire dorm building should not lose its internet access because one person somewhere in that hospital has decided to illegally download a Sabrina Carpenter album. But it is part of a pattern where this is the most recent of several decisions where the court has taken a very cautious approach to the internet. Like I said, this is a court which normally does not show caution very often at all.

Speaker 3:
[08:04] So let's say the court ruled the other way here and didn't toss out the verdict against Cox, right? You say that it would have severely damaged many Americans' access to the internet. Like how so? Lay that out. Like what would the consequences have been if this went the other way?

Speaker 4:
[08:19] So the way that the music industry was tracking who was illegally downloading, they apparently had some software that could pinpoint users by IP address. The problem is that IP addresses are pretty imprecise. That was the point that the Supreme Court made in its opinion: you might have multiple users in the same dorm room, the same hospital building, the same coffee shop, who are all sharing an IP address. And if the music industry had gotten its way, what would have happened is, if one person using that IP address had illegally downloaded a song, everyone else could potentially lose their access. Cox lost this lawsuit at the trial level and at the appellate level. It had to go up to the Supreme Court before the music industry lost. You know, there was another case that the Supreme Court decided a few years ago. This was the case where Kagan made that remark about how the justices aren't experts on the internet. It's called Twitter v. Taamneh. The plaintiffs were the families of victims of an ISIS attack in Istanbul, a mass murder event where ISIS killed a bunch of people. And there is a federal law which says that if you aid and abet certain acts of international terrorism by knowingly providing substantial assistance, then you can be held liable. And so the argument was, well, Twitter, YouTube, Facebook, all these folks, they're letting ISIS use their systems, letting ISIS post recruitment videos. They are helping ISIS be a more effective terrorist organization by giving them a communications platform. And therefore Twitter and Facebook and YouTube and all these sites should be liable for this terrorist attack that happened in Istanbul, Turkey. And again, the lower courts were fairly favorable to this argument.
If you read the statute, it's a pretty plausible legal theory, and the Supreme Court said, no, no, no. You can't have a rule where, if someone who's a bad actor is using a product that is available to everyone, the person who provides that product is liable. If I buy a Ford truck and I use the truck to run over my wife, the fact that I am a murderer and I used Ford's product to commit murder doesn't make Ford liable. And they said essentially the same rule should apply to the Internet. So you have this pattern of harms that in some way can be connected to something that happened on the Internet.

Speaker 3:
[11:08] I could imagine someone making the case that the Internet is a fundamentally different piece of technology, that it's not in the same category as a truck or something like that. But I'm having a hard time imagining what the persuasive legal argument would be for holding ISPs, and everyone else who provides that system, responsible for everything users do on the Internet.

Speaker 4:
[11:34] Cox and Twitter were correctly decided for that reason: the connection between the thing the company was providing, internet service, a social media site, and the harm was pretty attenuated. There is a very famous suit that is just starting to take off in California. This is what I guess I would call the drug addiction theory of the Internet. There's been a lot of popular literature and a lot of discussion lately about how these Internet companies are setting up their algorithms to be very addictive so that you keep using their product. One legal theory that has emerged is that if you can show that an Internet company, say TikTok or YouTube, has designed its algorithm to tweak the dopamine system of your brain so that you can't look away, then it should be liable for producing an addictive product, in the same way that the cigarette companies once upon a time were held liable because they were selling an addictive product. I mean, I think that's a more plausible legal theory than the ones the music industry brought or that the terrorism victims brought. But it would obviously fundamentally remake the Internet if, all of a sudden, the courts were micromanaging: well, how attractive are you allowed to make this product?

Speaker 3:
[13:11] It just seems to be the reality that responsibility is messy online. I mean, IP addresses, shared networks, dorms, hospitals, even apartment buildings. The problem is that the law wants a neat, clean theory of responsibility, but the infrastructure of the Internet just doesn't allow it.

Speaker 4:
[13:33] Right. And that's where I'll go back to Justice Kagan's comment about how we are not the nine best experts. What I think Kagan understood is that, I mean, look, I'm not going to argue that everything is great with the Internet right now. I'd love to get into why I left Twitter and why I think that social media has caused tremendous harms. I think the people who are worried that their kids aren't learning algebra because they're too busy staring at their phones are right. But what you don't want to do as a regulator, and the Supreme Court is being asked to be a regulator here, is come up with new rules that make things worse.

Speaker 3:
[14:27] Support for the show comes from Bombas. If your sock drawer could use a little love, you can upgrade to Bombas. They design their socks with a keen eye for detail, offering everything from dress socks to sports socks. The latter is made with a cushioned, sweat-wicking design that stops them from sliding down your foot, which I hate, while you stay active this spring. I wear my Bombas socks pretty much every day, and I especially wear them when I'm running, which I do all the time. We're now entering the summer down here on the Gulf Coast, which is essentially a giant, sunny sauna, and the only socks I wear at this point are my Bombas socks. Everything else gets too hot or too gross, but when I take off my shoes when I get back from a long run, my Bombas socks are still dry, still cool, as are my feet. I cannot recommend them enough. They really are great. They have more socks too, with breathable, soft, high-quality basics, including underwear and T-shirts. You can go to bombas.com/grayarea and use code gray area for 20% off your first purchase. That's bombas.com/grayarea, code gray area at checkout. Support for the show comes from Delete Me. Delete Me makes it easy, quick and safe to remove your personal data online at a time when surveillance and data breaches are common enough to make everyone vulnerable. The reality is that we're all susceptible to having our private information stolen, public figures and private citizens alike. Delete Me can help protect you and your family's personal privacy, or the privacy of your business, from doxing attacks before sensitive info can be exploited. Our colleague Claire White recently tried Delete Me.

Speaker 5:
[16:11] Delete Me is a tool that anyone online should have in their back pocket. Delete Me has saved me not only hours of removing my data from online, but has saved me hours of worrying about who has their hands on my data and what they're doing with it.

Speaker 3:
[16:27] Take control of your data and keep your private life private by signing up for Delete Me, now at a special discount for our listeners. Get 20 percent off your Delete Me plan when you go to joindeleteme.com/vox and use promo code Vox at checkout. The only way to get 20 percent off is to go to joindeleteme.com/vox and enter code Vox at checkout. That's joindeleteme.com/vox, code Vox. Support for The Gray Area comes from Found. To the small business owners out there, when was the last time you really felt like you had your finances under control? If the answer is somewhere between I can't remember and never, Found says they can help you wrangle your business finances once and for all. Found can help eliminate the clutter by giving you one platform that handles it all, including banking, bookkeeping, invoices and taxes. That means no more paying for multiple subscriptions or dealing with clunky, outdated apps. They've automated things like tracking expenses, finding write-offs and budgeting for tax time. And you can even send invoices for free and pay your contractors. Everything, all from one app. You can take back control of your business today. You can open a Found account for free at found.com. That's found.com. Found is a financial technology company, not a bank. Banking services are provided by Lead Bank, Member FDIC. You can join the hundreds of thousands who have already streamlined their finances with Found. What makes me interested in this story is that the court is not cautious in all these other areas. Like you said, schools, health care, civil rights, reproductive rights, they are perfectly fine imposing sweeping changes and letting everyone else deal with the fallout. So why so much restraint here? Why is the internet, as you put it, an institution that these justices will protect?

Speaker 4:
[18:40] So I think one possible explanation is this. The difference between Supreme Court justices and senators or other elected officials isn't that Supreme Court justices aren't partisan. They are appointed through a partisan process. They are named by a partisan president. They are confirmed by a partisan Senate. Anyone who pays attention to the courts knows the difference between a Republican judge and a Democratic judge. A Republican judge opposes abortion, a Republican judge opposes affirmative action, a Republican judge wants more expansive legal exemptions for religious conservatives. I can list off a bunch of these, and Democratic judges want the opposite of all those things. They're fairly consistent in that. But the difference between a justice and a senator is that a senator has to run for reelection. So their partisan operating system is constantly being updated. If I am an elected official and I want to win an election in 2026, I need to convince my voters that I believe what they believe in 2026, which may be different than what they believed in, say, 2018. And what I think may explain at least some of the justices' caution around the internet, and I think this explains the decision in the Moody case in particular, is that the most recently appointed Republican justice is Amy Coney Barrett. She was appointed in 2020. Clarence Thomas has been sitting on the Supreme Court since 1991. Many of these folks got their jobs long before the Twitter wars, long before the Republican Party organized around how mad it was that Donald Trump was banned by Twitter and Facebook, long before people even conceived of this as an issue, that we're going to form a politics around whether or not you can get kicked off of Twitter or TikTok or wherever. And so it was Kavanaugh and Barrett, and Roberts as well, who were in the majority in Moody v. NetChoice.
I just think that, you know, they care about the things that Republicans cared about when they were appointed to the court. And that list of issues didn't include these new Internet issues. What I'm afraid is going to happen, and I think is likely to happen, is that if Trump gets to appoint more justices, and we see more justices coming out of a Republican Party that cares very much about being able to control content on the Internet, then this libertarian approach that the court has taken in cases like Moody is going to change.

Speaker 3:
[21:35] If that were to happen, right, and the court changed in that way, and they were no longer scared to break the Internet, what would that broken Internet look like?

Speaker 4:
[21:43] It's a good question. I mean, Cox was a unanimous decision. Twitter, the terrorism case, that was a unanimous decision. So, if nothing else, Trump is going to have to replace a whole lot more justices if he really wants to come to the rescue of the music industry, and I don't know if that's going to happen or not. But where I think we're going to see things change very quickly, if the membership of the court starts to change, is that we're going to see political speech be something that is targeted. States that are trying to do things like force media outlets to publish more Republican voices, you're going to see those laws start to succeed. You know, you have two justices on the Supreme Court now, and this isn't really Internet related, but the Internet's very tied up in the First Amendment, Clarence Thomas and Neil Gorsuch, who have both called for overruling probably the most important freedom of the press case ever, a case called New York Times v. Sullivan, which says, essentially, that if a reporter makes an innocent mistake in one of their articles, they can't be sued for millions and millions of dollars for libel. If you don't have New York Times v. Sullivan, then, well, I will tell you what's going to happen, because I will tell you the facts of New York Times v. Sullivan. The New York Times published an ad that was written by civil rights activists that had some minor factual errors in it. The Jim Crow state of Alabama got a jury to say that the New York Times had to pay, in modern dollars, I think it's like $6 million, because it published this ad with factual errors, like misidentifying the song that was sung at a particular rally. I try very hard to be accurate in my reporting, and I'm sure you do as well, but I can tell you as someone who's been in this business for two decades, it is impossible to never have a factual error in anything that you're going to say. You're going to make a mistake.
If reporters are suddenly liable for minor factual errors that were completely unintentional, and that don't really impact the thrust of their story, then you can't have journalism.

Speaker 3:
[24:02] Just to stay with this for a minute, I guess. If that were to happen, and it's not totally implausible, the consequences for journalism are pretty clearly ruinous in the ways you just laid out. But what about for just private citizens, who don't work for institutions like that? What would it mean for just private citizens, who have substacks or just use social media to make political speech and content and that sort of thing? I mean, how draconian do you think it would be at that level? I mean, how enforceable could it be?

Speaker 4:
[24:33] I guess, first of all, I want to come to the rescue of journalism as a valuable public service. I mean, I think one way that—

Speaker 3:
[24:43] Please do. I'm not waving it away. I am pro-journalism. I'm just trying to understand what the consequences would be for citizens as well as for what we do.

Speaker 4:
[24:53] It's not just professional journalists who would be impacted by that. Everyone who speaks is protected by decisions like that. So take some random person who's not a journalist and who puts up a TikTok video where they say disparaging things. Let's say they don't even say disparaging things about anyone important. Let's say someone gets a bad grade and they say some mean things about their math teacher. Let's say that all of the things that they say about their math teacher are actually true, but there's one irrelevant factual error in there. They say that the teacher drives a green car when the teacher actually drives a blue car. Without New York Times v. Sullivan, that student could potentially get sued, potentially to a financially ruinous degree, by that teacher. I could apply that set of facts to any hypothetical you imagine. People who publish things that turn out to be factually inaccurate, even if it was an inadvertent mistake, anyone who does that could potentially be in trouble.

Speaker 3:
[26:02] What's interesting is that the court, as we've been discussing, is so split along these familiar ideological lines. To the extent that there is a consensus here, and I guess there still is, even if it's motivated by different things, what, as far as you can tell, is driving it? Is it free speech absolutism? Is it just market libertarianism? What is it that's driving this consensus, such as it is, on the court?

Speaker 4:
[26:28] Ten years ago when I was covering the court, there just wasn't that much of a difference between Democrats and Republicans at all on First Amendment issues. Everyone was a libertarian on First Amendment issues. And if anything, the Republican Party wanted a more expansive vision of the First Amendment than the Democratic Party, because the Republicans wanted the First Amendment to protect everything that the Democrats wanted it to protect, plus they wanted to strike down campaign finance laws under the First Amendment. So both parties had a very libertarian approach to the First Amendment, and the Republican approach was even more libertarian than the Democrats. And over the course of maybe a decade, that has been completely scrambled. And I think there's several reasons for this. One is that I think social media enabled Republicans to figure out the actual preferences of corporate America, and they didn't like them, and once they learned what corporate America's actual preference was, they sort of abandoned free market ideology altogether. I mean, it would be like, let's say you walk into a bar, and you sit down at the bar, and there's some guy at the end of the bar who is very, very loudly disparaging some identity group. I don't even care who it is. If I'm in that bar for more than a minute having to listen to that guy, I'm going to walk the hell out, and I imagine most people would as well. So a smart business is going to remove that person. They're going to say, you're chasing away my customers, you got to leave. That's part of what happened on Twitter, is that you had people who were saying things that a large number of Twitter's customers didn't want to hear, and they essentially made it known, look, it's them or me. The company said, well, there's more of you than there are of that asshole, so we're kicking that asshole out. That's one thing that happened. The second thing that happened is that advertisers care what content their ad appears next to. 
And so, if I'm, I don't know, a beer company, and I pay Twitter or Facebook or YouTube or whatever to run my, you know, my Budweiser ad, and this ad winds up appearing next to something racist, or it winds up appearing next to an image of a swastika or something like that, I'm going to be pissed, because it makes, you know, it causes people to associate my brand with something awful. And so, I'm going to call up the social media company and say, look, we're not going to give you any more ad dollars unless you fix this. And the way that the company fixed it was you ban the people who are putting up the swastika pics. And so, all of this was just free market capitalism. Milton Friedman would have loved, you know, everything that Twitter was doing during its most censorious era, because all that it was doing was, you know, meeting the needs of the market. But that antagonized the people who were being banned. They organized themselves politically. They found a party that was willing to side with them. They found other people with shunned ideologies. You know, why is RFK Jr., this weird aging hippie, with all these dumbass ideas about vaccines that have nothing to do with anything that the Republican Party has cared about prior to 2024, suddenly a cabinet secretary and a Republican administration? It's because he's part of the coalition of the shunned. You know, there are all of these people who were pissed that they were getting banned by social media sites, you know, often for different reasons. And the racists and the Nazis and the anti-vaxxers all organized and came together and realized that they had a common interest and formed a politics around it. And that became a really powerful force within one of our two political parties. And now that ideology, that new coalition that is organized around a phenomenon that didn't even exist 20 years ago, controls the government and controls our health ministry. 
So, you know, the Internet has scrambled our political coalitions in ways that I think we're just beginning to see the consequences of. And why hasn't the Supreme Court fully embraced the RFK-ification of US politics? I go back to my earlier explanation: many of them are older Republicans who formed their views before any of this happened. And so Brett Kavanaugh, who still believes what Republicans believed about the First Amendment in 2016, continues to enforce a 2016 version of the Internet and not the 2026 version of what Republicans think about the First Amendment.

Speaker 2:
[31:39] Now's the time to save on new carpet at The Home Depot. Receive 10% off your total carpet project and 12 months special financing. Plus, we'll measure your space for free. Choose from a variety of stylish, on-trend options fit for everyday life, with LifeProof, LifeProof with PetProof technology, Home Decorators Collection, and TrafficMaster carpets. Save 10% and get 12 months special financing, now at The Home Depot. Offer valid April 16th through May 3rd, 2026. Exclusions apply. See homedepot.com/licensenumbers for license numbers. Have a break, have a Kit Kat.

Speaker 6:
[32:39] Pepsi Prebiotic Cola in Original and Cherry Vanilla. That Pepsi taste you love, with no artificial sweeteners and three grams of prebiotic fiber. Pepsi Prebiotic Cola. Unbelievably Pepsi.

Speaker 3:
[33:02] Technology always moves too fast for the law and for institutions to keep up. And that's just a huge problem for society. It's true of the internet, it'll be even more true for AI. It was true of radio and telegraph, the printing press and television and all the rest of it. I mean, this is just what technology does. It moves very, very quick, and culture and law and politics do not. And the collision between those two realities creates a lot of chaos.

Speaker 4:
[33:31] There's an optimistic story to be told about what you just said. Let's take the printing press. One of the worst times in human history to live was the 200 years after the printing press was invented. The printing press led to the Protestant Reformation and all of the holy wars and conflicts that came because you suddenly had competing churches fighting for control in Europe. It led to the rise of linguistic nationalism. All of a sudden, people in France are reading French-language newspapers and they think, well, the thing that makes me a French person is that I speak the French language, and if you do not speak the French language, then maybe you do not belong here. I mean, lots of bad things happened. But I sit back now, five or six hundred years later, and I'm very happy that the printing press was invented. Books are pretty good. I think books are great. I have nothing against Protestants. It's worked out great for all of us over the long term. And, you know, it's not like there was a government solution to that problem. People had to learn how to live in a world where there was all of this access to information that hadn't previously existed. And it took us a while, but we eventually did. I don't think I buy the thesis that YouTube or Twitter or TikTok or something like that is so unlike any communications technology that has come before that the entirety of human imagination is inadequate to the problem, and that all of humanity is doomed to elect people like Donald Trump until, eventually, some Trump of the future nukes us all and ends human civilization. I don't believe that. I think the trajectory is probably going to look like the printing press: things are going to be really crappy for a while as we're trying to figure out what the world looks like when we have this level of connectivity, and we'll eventually figure out how to operate in that world.
But I mean, I'm 48 years old, I might be dead before that happens.

Speaker 3:
[35:40] That's the thing. And again, leave it to me to doomify your good story here. But the difference is how much disruption we can tolerate, right? That's always the question: all right, new technology, disruption, and then there's this lag time between when the technology is invented and how long it takes us to adapt and figure it out and live peaceably with it. But in a world with AI and computer viruses and nukes, right, and all the rest of it, our capacity to derange and annihilate ourselves is infinitely higher than it was then. That's what worries me.

Speaker 4:
[36:17] No, it is worrisome. And I mean, certainly, you know, I would put Twitter on the list with nuclear weapons of things that I would un-invent if I were capable of doing so. But at the same time, like, you are right that a thing that happens over and over and over again is that someone invents something new, it's usually disruptive, it leads to terrible short-term consequences. I wrote a book about the history of the Supreme Court where I discuss at length how awful it was to be a worker in the immediate aftermath of the Industrial Revolution, where these people were being crammed into these factories with machines that would rip their arms off and forced to work horrible hours for terrible wages, and it was miserable. I am happy now that the Industrial Revolution happened. I think that humanity has on net gained a lot from the fact that we have factories and we're able to mass produce things now. And I guess, I recognize this is unsatisfying, but we have seen this narrative play out so many times where something new is invented, it leads to terrible short-term consequences, but the long-term benefits are great. So someone arguing that this new technology is so different from everything that has come before it, that it's just going to make everything worse and we're never going to get the benefit from it, I think a person making that argument has a difficult battle ahead of them. They have a high burden of proof, if that's what they want to demonstrate.

Speaker 3:
[38:02] Do you feel like the dynamics could change? I guess the hinge factor here is who leaves the court and who gets brought on to the court. I guess that is just everything.

Speaker 4:
[38:14] My answer to any question that begins, are you optimistic about the Supreme Court, is going to be the same answer, which is: tell me who sits on it in 10 years. If Trump gets to replace a bunch more justices, or if we keep electing people like Trump, there is no set of circumstances where that ends well, period. It's just going to get more partisan, and the regulation of the internet will just become more about advancing the MAGA movement than anything else.

Speaker 3:
[38:46] So for people interested in the future of the Internet, are there any other cases coming down the pike that you're keeping an eye on, and maybe people should keep an eye on as well?

Speaker 4:
[38:56] That California case involving the addiction theory of the internet probably won't wind up in the Supreme Court for several years, if it gets there at all. But other people are going to bring that theory, and there's going to be a First Amendment challenge to it, so I think the justices are eventually going to have to weigh in on that. Then the other issue that I think is very live right now is: what are the First Amendment rights of children? There had been a case, so in the early 2000s, there was a lawsuit that struck down a federal law which required porn websites to have age gating. So if you wanted to look at porn online, you had to prove you were at least 18 before you could do it. The Supreme Court struck that down, and the court just recently handed down a decision saying states can require age gating. Maybe you can explain that decision by the fact that we just have more sophisticated software now, so it might actually be possible to put up a reliable age gate on a porn website, and that just wasn't something that we could do 20 years ago. But the first question I had when I read that porn case is, look, I don't think 15-year-olds should be looking at Pornhub. I have no problem with a law that prevents them from doing so, provided that it is effective and it doesn't infringe other people's First Amendment rights. But my concern is whether that's going to be the camel's nose under the tent that leads to young people losing more important First Amendment rights, or whether something like that can be cabined to just porn. And the next case in line is probably going to be something similar to this Mississippi case I described, where you have states that want to essentially age gate all social media.

Speaker 3:
[41:00] That's where it gets a lot thornier.

Speaker 4:
[41:02] Yeah.

Speaker 3:
[41:02] For sure. I mean, I'm in Mississippi and it is a conversation here, for sure. Well, if you're listening in or watching and you're interested in the Supreme Court, you should check out Ian's stuff on Vox. He's the best. All right, Ian, you're the best dude. Thanks for doing this. It's been a lot of fun, man. We'll do it again.

Speaker 4:
[41:18] Thanks so much for having me.

Speaker 3:
[41:23] All right. This is the end of the episode. I enjoyed it. I hope you did too. Let us know what you think. Drop us a line at [email protected], or you can leave us a message on our new voicemail line at 1-800-214-5749. Please also rate, review, and subscribe to the podcast. It helps us grow our show. This episode was produced by Thor Newwriter and Beth Morrissey, who also runs The Joint. Engineered by Christian Ayala, fact-checked by Melissa Hirsch, and Emma Munger wrote our theme music. Our executive producer is Miranda Kennedy. The Gray Area comes out on Mondays and Fridays. Find it wherever you listen to podcasts. If you like to watch your podcasts, you can do that too. Go to youtube.com/vox for video versions of The Gray Area. The show is part of Vox. Support Vox's journalism by joining our membership program today. Go to vox.com/members to sign up. If you decide to sign up because of this show, let us know.

Support for the show comes from Quince. If you've been looking in your closet recently and seeing all sorts of clothes you can't even remember buying, or worse, clothes you do remember buying and thinking, oh yeah, I can definitely pull this off, now may be a great time for a reset. Instead, you can fill that closet with durable evergreen pieces from Quince that last year after year. Quince makes high-quality everyday essentials using premium materials like 100% European linen and their insanely soft, moisture-wicking and anti-odor flow-knit activewear fabric. I've tried Quince myself. I've gotten several shirts and sweaters, they're all great, I wear them all the time. I even got a backpack for my last trip that held up really, really well. And just the other day, I actually ordered a silver chain because it was such a good deal. I'm not even a chain guy, but I'm also neck deep in a midlife crisis, so I'm trying things out. I'll keep you updated. Anyway, refresh your wardrobe with Quince. Don't wait. 
Go to quince.com/grayarea for free shipping on your order and 365-day returns, now available in Canada too. That's quince.com/grayarea to get free shipping and 365-day returns. quince.com/grayarea.