title Hour 3: Why Are You Saluting That? (feat. Ronan Farrow)

description "Him and Jemele would do NUMBERS!"



The fallout from the last hour continues, and we break down the show's reactions further, including a hilarious Greg Cote moment he didn't even realize happened. Should we bring back Dr. Fred Johnson tomorrow night as the show's NFL Draft expert? After that, Ronan Farrow joins the show to discuss his latest piece about Sam Altman, the dangers of AI, and why Altman is beefing with Elon Musk.
Learn more about your ad choices. Visit podcastchoices.com/adchoices

pubDate Wed, 22 Apr 2026 16:45:00 GMT

author Dan Le Batard, Stugotz

duration 2405000

transcript

Speaker 1:
[00:00] Now, at McDonald's, a McDonald's is $2.50, so you can get your gym gains on, or just get lunch for only $2.50. Get more value on the under $3 menu. Limited time only.

Speaker 2:
[00:11] Prices and participation may vary.

Speaker 3:
[00:12] Prices may be higher for delivery.

Speaker 4:
[00:15] Right now at The Home Depot, shop Spring Black Friday Savings and get up to 40% off, plus up to $500 off select appliances from top brands like Samsung. Get a fridge with zero clearance hinges, so the doors open fully even in tighter spaces in your kitchen, and laundry that saves you time, like an all-in-one washer dryer that can run a full load in just 68 minutes. Shop Spring Black Friday Savings, plus get free delivery on appliance purchases of $998 or more at The Home Depot. Offer valid April 9th through April 29th, US only. See store or online for details.

Speaker 5:
[00:45] LinkedIn is pretty amazing at helping you grow your small business. We cannot stop your new clients from e-mailing you at 3 AM. We can help you sell, market and hire in one place. We cannot help you be in three places at once. And while we can't help you organize your calendar, LinkedIn can help you land more clients, so you have a calendar to organize. Grow your small business on LinkedIn. Learn more at linkedin.com/smallbusiness.

Speaker 6:
[01:15] This is The Dan Le Batard Show with Stugotz Podcast.

Speaker 7:
[01:23] You've heard me lamenting the state of the media, the state of journalism. Later this hour, Ronan Farrow is still carrying out the Lord's work on behalf of journalism, calling truth to power, chasing around billionaires, and really tackling the heaviest of subject matter that divides us these days. His recent profile of Sam Altman and OpenAI, and the fight with Elon Musk, and the way that these billionaire high schoolers are getting more and more evil in the things that they crave and claim. His work is important, and I don't know if it's going to be around 10 years from now, because all of these guys are going to buy up all the news entities, and people like Ronan Farrow won't be able to do the kind of work that he's doing, and people like The Atlantic won't be able to do what just happened to our FBI director. They will just keep getting away with that stuff more quietly. So we're going to keep giving Ronan Farrow a platform here to talk about the stories, because he's telling the most important ones. He doesn't have to be doing this with his career. This is the child of Woody Allen, this is the child of Mia Farrow, or I'm sorry, Mia Farrow. It is Woody Allen's child as well, right? And his journalism career, it's just such an unusual path for him to take to keep fighting these titans with all of his time and effort. And so I want to keep celebrating the work that he is doing. But I cannot shake what happened with Dr. Fred Johnson. And it, my life slowed down. And I saw all of your faces in slow motion in a way, like I'm telling you, I've been doing this for a long time. And when people hear what it is that we've been doing, they've never heard me stunned into silence. And my face, if you're watching this on video, and all of our faces will betray, there's nothing hidden on anyone's faces.

Speaker 3:
[03:10] I feel like I was calm, cool and collected.

Speaker 7:
[03:13] Well, I want to, I urge the audio audience here to take a look at YouTube and go over there so that you can have the experience of what it is that I just combed through in the back, asking, investigatively, not unlike Ronan Farrow, get me all the faces of the exact moments of where everyone's real feelings betrayed them. And let's make the art of, let's snapshot everyone who was on this show and put their shock on the screen when Dr. Fred Johnson said he would not allow Juju to date his daughter. And all of us, just upon looking at him, were just frozen and just the perfect awkward comedy of that guy's judgment was brought in before the NFL Draft by champions to accurately judge people in a way that sniffs out bullshit. The second hand embarrassment that Zaslow talks about that he's told us before, what was the movie that you said?

Speaker 8:
[04:09] Friendship.

Speaker 7:
[04:10] The movie Friendship, that's right. The movie Friendship.

Speaker 8:
[04:12] I was on my way to London watching. I had to get, I had to pause the movie and get up and walk the aisles on the airplane for a few minutes.

Speaker 7:
[04:19] Because the movie is purposely awkward and second hand, second hand embarrassment is a thing for Zaslow.

Speaker 8:
[04:25] It made me really anxious.

Speaker 7:
[04:26] Take a look at Zaslow. Zaslow, after Fred Johnson said this, Zaslow never looked at him again. Nope. He wouldn't look at him or me. Zaslow was not making any eye contact. I felt the heat off of Zaslow's embarrassment. In my bewilderment and shock, a race bomb went off in my face and I was unprepared for it, even though I've been preparing for that all my life.

Speaker 8:
[04:49] Like I know you're saying how, you know, what the Dr. Fred Johnson's reaction was. I don't know what his reaction was. Like I wasn't looking at any monitors anymore after that. I wasn't looking at anyone in here's face in particular. I wanted to, I wanted to hide.

Speaker 9:
[05:04] I think low key we did a great piece of investigative journalism though, like Freudianly about how this process works in the NFL.

Speaker 7:
[05:13] The second hand embarrassment. Let me see this photo of Zaz again. Trista is shocked. Cody, this happened during the break. I'm not even making this up. Juju at some point reflexively, gracefully said salute and Cody said out loud, why would you salute that? And Cody just asked us during the break, not making this up. He just asked us during the break. Did I say that out loud? Because he thought it was an inner thought.

Speaker 10:
[05:38] So I don't even remember saying that.

Speaker 8:
[05:47] I don't like that.

Speaker 7:
[05:48] It was a private thought.

Speaker 10:
[05:50] I don't know. I can't even explain it because I don't recall saying it, but it's certainly what my reaction was. I admire and give JuJu all the credit in the world for being able to not respond like, what did you just say to me?

Speaker 8:
[06:06] Yeah, I got to be perfectly honest. I know that's your thing, JuJu. Salute takes a hit. Because you saluted that, salute needs to be built back up.

Speaker 7:
[06:16] Salute took a hit today?

Speaker 11:
[06:18] It did not take a hit. This is what I read into it. JuJu, you can tell me if I'm wrong or right. It was like, salute, like wow, you really came and said everything that you were thinking. Sometimes it's like, it's like you have to almost respect the blatant racism, salute.

Speaker 9:
[06:35] Right, I'm just happy to show that my therapy is real. I went to therapy and it works, salute.

Speaker 7:
[06:43] I went to journalism school. This is not how human beings react in this situation when they are supposed to have a response at the ready. Look at, I am hurt. I've been wounded. I physically, look, if I told you, if I showed you guys that picture and said, did someone just shiv me? I didn't give you any other context. I just said, was this person stabbed by something sharp? Would you say yes or no? I look like I'm, all right.

Speaker 11:
[07:11] It looks like you actually have to take down a capo, a made man of your own. Like, it's like a sadness and like, this is what I have to do.

Speaker 3:
[07:20] It looks like you have dancing swords.

Speaker 12:
[07:21] You found out he went against the family, Dan.

Speaker 7:
[07:24] Zaslow, you could not look at me either during that?

Speaker 8:
[07:27] No, this is the first I'm seeing your reaction. I was just looking like straight kind of down across the room here.

Speaker 7:
[07:34] So what are you doing? So your last few minutes of thought are just focus in. Don't, the embarrassment near your ears. Don't look at anybody. Don't make eye contact with anybody. Focus in on what? You were looking, the puppet is over there with some-

Speaker 8:
[07:50] I'm going off into nothing. I'm staring in a direction where I don't want to look at anyone. I certainly don't want to look at the monitors. I definitely don't want to see the guests. I'm afraid of every- I have terrible second hand embarrassment. It gave me a lot of odds.

Speaker 3:
[08:04] I feel like no one handled it better than me though. I feel like my face is calm, cool, collected the entire time. Do me next, sir. That didn't go over well. No.

Speaker 7:
[08:13] You tried to help there.

Speaker 3:
[08:14] I was just like-

Speaker 7:
[08:15] You did try to help.

Speaker 3:
[08:17] I honestly was trying to set him up because he's like, now that's a good man right there.

Speaker 7:
[08:20] I would let him date my daughter.

Speaker 12:
[08:24] Can I have the diagram here? Maybe I can do some diagrams and stuff. Can we put the Zazz picture back up really quick? Yeah, the Telestrator. Let's see here. Here we go. Yeah. What I'm looking at here with Zazz, I look at his eyes and I'm like, wow, this guy is nervous. Something happened here. Obviously, we know where he's looking. He's looking this way. We're just trying to make sure exactly what he's looking at.

Speaker 3:
[08:45] Oh, no. That's inappropriate.

Speaker 12:
[08:47] His focus was not on the monitors. His focus was down into the center of the table, which is where he's looking at right here. Can we go to Chris Cote?

Speaker 3:
[08:54] For some people that are offended, that's a penis.

Speaker 1:
[08:56] Yeah. That edit line just.

Speaker 7:
[08:58] We have to do better.

Speaker 12:
[09:01] I will do better. So obviously, we've got the eyes. Look at the eyes. Look how big those eyes are. He's looking at it like, oh my God, I can't believe.

Speaker 3:
[09:06] It's a late basketball. You can see how late that was up last night.

Speaker 12:
[09:09] Yeah. What did he just say? You know what I'm going to do? I'm going to try and help him. What I'm going to do is I'm going to rotate this way and I'm going to ask the guy, hey, I've got something going on over here. Let me help him with doing that.

Speaker 8:
[09:21] 3D.

Speaker 12:
[09:22] That ended up not working for him either.

Speaker 11:
[09:24] For the audio audience, it's more phallic photos.

Speaker 7:
[09:27] But it's gratuitous. That's not clever. Chris wasn't looking in that direction.

Speaker 12:
[09:32] He was about to. That's the thing. If we had the movie of it, he went like this, looked into the camera and then looked over and said, do me next.

Speaker 3:
[09:40] Tony, can you draw one of those 3D boxes? I was like those.

Speaker 7:
[09:43] We bought a telestrator for good purposes and you guys, I'm going to take it away from you and not allow it to be used anymore.

Speaker 3:
[09:50] Don't do Trista.

Speaker 7:
[09:50] Don't do Trista. It's not just that you guys are using this instrument for wrong. It's also, and I'm going to keep saying it, that you are alienating the audio audience because in a very competitive podcast industry, other audio audiences are not asking for people to not see penises for five minutes drawn on a screen. We need to do better than that. Can we do better than that? Because the Telestrator...

Speaker 12:
[10:21] It's a diagram.

Speaker 7:
[10:22] That's the best that we can do because Chris was not going to look at the left. He was scared.

Speaker 3:
[10:26] That's how you do it.

Speaker 7:
[10:27] Our leadership...

Speaker 12:
[10:29] You had your eyes closed. You don't know what you're talking about.

Speaker 7:
[10:30] Well, I wasn't even talking about my leadership. My leadership fell apart. Everyone saw that. My leadership was disarmed by racism so obvious that Juju shrugged at it because that's where I'm living every day with a bunch of people like that. And I'm shocked by it as a 57-year-old.

Speaker 12:
[10:49] You're shocked. Your eyes were closed, which you didn't see Chris Cody turn over to the left.

Speaker 3:
[10:53] I'll handle this.

Speaker 12:
[10:54] I'll ask the doctor, hey, do me.

Speaker 8:
[10:56] Yes, there's one eye and there's a circle on another eye.

Speaker 3:
[11:00] When Dan's feeling ashamed, usually he has his head more down in this direction.

Speaker 6:
[11:03] Yeah, yeah, yeah.

Speaker 8:
[11:06] What else?

Speaker 3:
[11:07] Sometimes he cries. He cries?

Speaker 8:
[11:09] What else?

Speaker 3:
[11:10] And sometimes they make a puddle. The tears make a puddle. Yeah, yeah, yeah.

Speaker 9:
[11:17] You know what would be funny is here?

Speaker 8:
[11:19] Nice.

Speaker 9:
[11:19] You know what would be funny is here? We asked Doc, bro, what were you thinking? And he was like, he was wearing a Magic City hoodie.

Speaker 12:
[11:28] You want to get him back on?

Speaker 3:
[11:32] Ronan Farrow next.

Speaker 7:
[11:33] Well, I was just going to do that as soon as you landed the last dick joke.

Speaker 3:
[11:40] Folks, listen up, quick break in the action. Are you counting down the days until payday? Instacash from MoneyLion can help you access up to $500 of your hard-earned pay early. There's no interest, no credit check, and no monthly fees. So you can manage those in-between expenses with less stress. Download the MoneyLion app and link your qualifying bank account to see what you qualify for. MoneyLion, make money easy. Instacash is subject to terms and eligibility requirements. Expedited delivery requires a Turbo fee. See moneylion.com.

Speaker 12:
[12:06] Sometimes I feel like we're all really good at handling everything around us and just ignoring what's going on in our own head. Like your phone breaks, you fix it immediately. Your car makes a weird noise. You're like, all right, let's figure this one out. But then your brain's off. Stress, burnout, not sleeping right. We just kind of go, yeah, I'll deal with it later. And later just keeps moving and moving and moving. And that's why therapy matters. Not because something's wrong, because it gives you a way to sort things out before it all stacks up. The problem is that actually getting started has always felt like a process. Finding someone, figuring out insurance, waiting weeks just to talk to somebody. And that's usually where people tap out. And that's where RULA comes in. RULA is a healthcare provider group that makes therapy easier to actually access. They connect you with licensed therapists who take your insurance and sessions can be as low as 15 bucks. You answer a few questions, find someone who fits what you need, and you can be talking to someone as soon as the next day. Thousands of guys have already used RULA to finally get the care that they need. Don't keep putting it off. Go to rula.com/dan and get started today. That's RULA, rula.com/dan. Take the first step, get connected, and take control of your mental health.

Speaker 13:
[13:10] Sports fans, all the sports are coming together. It's a great time to just sit on your couch, text your friend, hey, come over, let's watch the games. And when I do that to my friends, guess what they text me back? I got the Miller Lites. That's right. They pick up Miller Lite pretty much anywhere they sell beer, and they come over to my place. We take that first sip and we realize, man, we just made a regular old-fashioned night into a special night. Thank you, Miller Lite. And shortly thereafter, we got multiple screens on, everybody's dialed into something different, and the whole night just keeps building and building and building. That's why I reach for Miller Lite. It can take an ordinary night and take it to an extraordinary place. It's clean, refreshing, easy to drink. Proof of taste with simple ingredients. Just 96 calories and 3.2 carbs. The original light beer since 1975 and still hitting different. Cheers to legendary moments with Miller Lite. Great taste, 96 calories. Go to millerlite.com/dan to find delivery options near you or you can pick up some Miller Lite pretty much anywhere they sell beer. It's Miller Time. Celebrate responsibly. Miller Brewing Company, Milwaukee, Wisconsin, 96 calories and 3.2 carbs per 12 ounces.

Speaker 6:
[14:17] Dan Le Batard.

Speaker 7:
[14:18] What is the worst part of the life?

Speaker 6:
[14:20] Stugotz.

Speaker 3:
[14:25] The worst part of the life of what?

Speaker 6:
[14:27] This is The Dan Le Batard Show with Stugotz.

Speaker 7:
[14:40] An honest question in my bewilderment. What was worse, the racism or his judgment? I'm asking the question honestly, though. Just the judgment of having that be the answer when you're the doctor of leadership, I'm telling you, I can't write a skit like that's funnier than this, and I think this man should be our draft expert tomorrow night. Like, I think this guy should be on our livestream tomorrow night as racist draft expert and then just let it fly. I need that.

Speaker 9:
[15:11] Him and Jemele would do numbers.

Speaker 7:
[15:14] Just let it fly. Just give us your analysis. Just let's go to him as an expert every time. Just tell us based on what this guy's suit is. Tell us whether or not he's going to be a success in the league or not.

Speaker 3:
[15:25] I'm being told our video team has put together a video where we're going to hear it and see all of our reactions simultaneously.

Speaker 7:
[15:32] Oh, we're going to watch this together. Okay, audio audience, I promise you, I will stop doing things like this in the future where the funniest stuff is just on video. But I'm lying about that promise. I'm going to probably-

Speaker 3:
[15:44] Voice video the day next.

Speaker 7:
[15:46] Yeah, I'm going to get worse and worse at this as we go as I try to please both audiences and they both get mad at me for over explaining things. Let's see the video.

Speaker 9:
[15:55] Character was like, you have to be around me more or can you read me from here?

Speaker 14:
[16:00] Well, yeah, I wouldn't want my daughter dating you.

Speaker 8:
[16:03] Oh, so you guys said you can give it out.

Speaker 5:
[16:10] I told you I can give it back.

Speaker 7:
[16:15] Cody not knowing that he said out loud, why are you saluting that?

Speaker 10:
[16:20] I thought I was thinking it. But the question you asked is a fair question though, because my second or third thought was feeling bad for this guy. Like what did you just say? Why would you say that?

Speaker 7:
[16:38] You felt bad for him?

Speaker 10:
[16:39] That was like my second or third thought because this guy, not only is he a leadership coach, but he's judging a league with 70 percent black guys.

Speaker 12:
[16:51] NFL is going to find a place for him very soon.

Speaker 7:
[16:53] No, he just harmed his career.

Speaker 9:
[16:55] I just wanted to say throughout the funny, thank you all for having my back though, because that's very important. Y'all making me feel like a part of the family and I really appreciate y'all have my back like this. So salute to all of y'all.

Speaker 7:
[17:08] No, not salute. It's been devalued.

Speaker 3:
[17:09] Can we cut to Tony smiling throughout all that, speaking of having his back?

Speaker 7:
[17:13] The salute has been devalued. Ronan Farrow is a Pulitzer Prize winning journalist. You've heard me say before that he does all the best stories on the most interesting things, whether it's Harvey Weinstein, whether it's Britney Spears, whether it's surveillance, and now artificial intelligence, which I can't even imagine how scared he was as he went deeper and deeper on, you know, some things here that Elon Musk says are more dangerous than nukes. And Elon Musk finds Sam Altman, it seems like, a bit of a protege and disgusting. But I will not put words in his mouth. I will ask Ronan Farrow, thank you for being on with us. I will tell the people, the newyorker.com is still doing the kind of journalism that matters. Ronan, thank you as always for joining us. What led you to this story? Why did you decide to do this?

Speaker 15:
[18:05] Well, thank you for that plug. And it's always a pleasure to be on here and to be amongst people who get how important it is, right, that we all get quality information in this democracy. Regardless of what issues you care about and that affect your life, you need the fourth estate. It's enshrined in the Constitution for a reason. And the business model has fallen apart around this kind of enterprise journalism. There's huge headwinds. Actually, while we were finishing up this story, you saw during our fact checking with OpenAI, they went and acquired TBPN, right? We saw Elon Musk a few years ago acquire Twitter. We're now living in a kind of oligarchy where you see some of the historical hallmarks of oligarchy with respect to media and journalism. If someone doesn't like the town square, they can just acquire the town square. So thank you for the plug and I hope people do go and read this article. I hope if they care about this kind of accountability journalism about Silicon Valley, they support outlets who do it. There aren't that many. And that leads to the answer to your question. I had been working on an extensive body of reporting about Elon Musk, which we talked about, I think back in the day. And it became apparent to me that some of the dynamics that I documented around Musk, right? This larger picture of systems that had failed, of a system of government in this country that has become hollowed out, of regulation that is not powerful enough to control these guys that have such disparate wealth and such disparate power. In Elon's case, that was to do with through his Starlink technology, I was documenting how he was actually calling the shots on the battlefield in Ukraine. And national intelligence officials were saying, we can't confront this guy, he's got too much power, he's our only vendor, he's sort of supranational in terms of power. 
It became apparent to me in the course of that, that the next frontier in terms of the private sector being overempowered, having all of the levers of power in this country, shaping policy to an excessive extent and really having no mechanisms for accountability, while doing something that by their own admission is really dangerous for all of us, all of our jobs, all of our physical safety, all of our health, that's AI. And so I set out to dig into AI and eventually teamed up with a partner, my colleague Andrew Marantz at The New Yorker, who's a wonderful reporter. And we really put a year and a half of our lives into looking at OpenAI as the market leader and as an example that is, in the view of some of their critics, particularly extreme of power run amok and of a need for a real overdue conversation about accountability in AI leadership.

Speaker 7:
[20:47] I have a thousand questions here because I want to know why does Elon Musk hate Sam Altman so much? Can you explain that to our audience?

Speaker 15:
[20:56] So this actually relates to the big themes of the piece. Elon Musk was one of the big initial sources of funding for OpenAI and one of its founders. And the way this happened was that Sam Altman, who we document in this piece, had had a history of episodes in prior parts of his career where people around him alleged that he had honesty problems, right? That he couldn't be trusted, that he was saying different things to different people. Now, in Silicon Valley, there's a culture of this, right? It's an industry that is based on hype, selling hype, inflating value, long before there's a real product. But even by that standard, people started talking about this issue in their eyes with Sam. And he was departing one role where he had been followed by these claims that there was just too much lying from him to not put too fine a point on it. And he, in his latter years in that role, started getting very fascinated by AI. And look, you know, none of the portrayals in this piece are monolithic. I think Sam, in his mind, I spent a lot of time talking to the guy, also believed this was the way to change the world for the better. You know, that is the story that he tells himself and pitched others on. And the specific way he made that pitch to Elon Musk is he saw that Elon Musk was saying, AI may kill us all. Yes, it's the most powerful technology maybe that the human species has ever come upon, but also it's the most dangerous. And Elon had been running around saying and tweeting, this is more dangerous than nukes. And so Sam went to him and said, and we have the emails in the piece, I agree, you know, this is more dangerous than nukes. It needs to be controlled. We can't let a for-profit company like Google win, you know, I'm paraphrasing, but only really lightly. And he proposes a non-profit that is safety focused, that will go slow rather than focus on growth, that won't be shackled to caring about profits. 
The entire goal being we can do AI safely, because they were all afraid of a Terminator Skynet scenario where AI could go rogue, it could fall out of alignment with human interests, and it could potentially, you know, launch nukes, it could dismantle financial markets. There's all these apocalyptic scenarios. Not everyone believes in those, but the founders of OpenAI did, and that is how the initial money came in. That's how the initial talent came in. They were all convinced of a pitch from Sam Altman that this was going to be a non-profit forever that was all about going slow and safe.

Speaker 7:
[23:35] Americans have no idea right now what comes for us in terms of labor, never mind everything else as it relates to AI. We are not prepared for what it's going to do to labor, but what do you do Ronan with? Somebody throwing a Molotov cocktail at Sam Altman's home, charged with attempted murder, two suspects arrested for firing a gun toward his home, and him responding after the first incident, both the fear and anxiety around AI is justified. We're in the process of witnessing the largest change to society in a long time, and perhaps ever.

Speaker 15:
[24:09] Well, first of all, no one should be resorting to threats of vandalism or violence. It's not productive. It actually is a very effective tool for any of the targets of that violence to deflect from the real policy conversation that needs to happen. Obviously we want all of these guys and all these industries to be safe and to have a real, meaningful, deep conversation about policy here. Accountability journalism is an important part of that conversation. And in this case, there was an interesting part of the response in which Altman put up a blog post saying, well, there's been this incendiary article. And actually the DA of San Francisco used that same language of incendiary. They were very lockstep using the same language. The message being, let's all be careful about our words. But obviously, this was unrelated to the words, at least when it comes to really careful, fair journalism that vets the facts, that gets the response from the subject. I mean, Sam is quoted on every single assertion in this piece. Actually, this is a piece that OpenAI has relied on in their legal proceedings. They sent letters to the attorneys general of Delaware and California after this piece came out saying, look at this reliable piece that's reporting on, among other things, some of the stuff that Elon Musk was up to around this competition with us. And they're asserting that Elon Musk was being anti-competitive. So they're relying on it where they like. And I think there was an interesting conversation after the episode you described where people said, well, wait a second, you can't blame journalism for what, as you quoted him saying, is actually a real moment of national anxiety. That precedes this piece and that will continue. And part of the reason for that is what I just described in the kind of origin story of OpenAI. Sam Altman told everyone that AI was going to maybe kill us and certainly take a ton of our jobs. 
You know, he's in this piece saying a lot of people are going to lose their jobs. He said that publicly many times. It's not a radical statement, right? Credible economists are saying it may expose millions and millions of jobs in different sectors to disruption. And also, because of the way in which there is this immense hype bubble around AI and our entire economy is kind of draining into this massive vortex of AI spending because it's so expensive to develop. Other economists are also warning of a risk of an economic bubble that this has all of those hallmark signs. Sam Altman at times has said, yeah, a lot of people are going to lose a lot of money in terms of a potential market collapse, at least in part. So people are scared and they're anxious. And as he himself says, there's real reason for that. Of course, it accomplishes nothing productive and is only a distraction and a really unfortunate thing if there's any acts of vandalism, someone throwing a bottle at his house, anything like this, threats to any AI executives. It's just, it's not on. And I hope anyone who reads this and cares about this really gets into contacting the representatives, asking for sensible AI regulation. Europe has more safety testing on models that's mandatory. You could have better whistleblower laws, so the kinds of people that talked in this piece are more protected. There's a lot of really good sense things that we could be doing. Because right now, this is a very long piece, and I really recommend people set aside some time to look at it. 
It looks at how it's disrupting geopolitics, how it's potentially a threat in terms of safety, and how all of that and the transformation of the mission of OpenAI from this non-profit to one of the biggest for-profit companies in the world, how that should open up a conversation potentially, certainly in the eyes of a lot of Sam Altman's critics, about not just how trustworthy he is, but do we need to be more careful about who's leading and shaping this technology? That's what the conversation should be. It shouldn't be threats.

Speaker 14:
[28:15] Hey, listeners, I want to tell you a story here that actually means something to me in a very real way, and that's Chewy. Earlier this year, my wife and I had to deal with our beautiful six-year-old cat named Moe passing away pretty suddenly after a brief battle with an illness. We remain torn up over it, to be quite frank, we talk about him every day. And when we forgot to cancel our food subscription with Chewy, rather than jumping through hoops, they not only canceled our subscription, they refunded our order that showed up, that we didn't need. They told us to donate it to the nearest shelter and gave us addresses of said shelters. They sent us literature on grief. And then a couple weeks later we received a letter in the mail thanking us for being incredible pet parents and even a little stone with his name hand-painted on it, and we keep that out next to some flowers now in our home to this day. And I cannot thank them enough for the humanity in that moment. You don't really get treated that way by companies, especially ones as large as Chewy. The way that they treat people genuinely matters to me. Chewy has over 100,000 products delivered in one to two days. Chewy has everything you need to keep your pet happy and healthy. And right now, you can save $20 on your first order and get free shipping by going to chewpanions.chewy.com/lebatardshow. That's C-H-E-W-P-A-N-I-O-N-S dot chewy.com/lebatardshow to save $20 on your first order with free shipping. chewpanions.chewy.com/lebatardshow. Chewy, you guys rock. Thank you. Minimum purchase required, new customers only, terms and conditions apply. See site for complete details.

Speaker 13:
[29:47] Hey Roy, buddy, you know that energy shift when the game gets good and everybody all together, in unison, knows to stand up on their feet?

Speaker 4:
[29:56] Oh, absolutely, Mike.

Speaker 13:
[29:57] Yeah, you've been at many big time sporting events. You know that moment quite well. That's what it's like when you take your first sip of Cuervo.

Speaker 4:
[30:04] Oh, delicious.

Speaker 13:
[30:05] It's the signal that says we're not checking the time anymore, pal. It's when small talk turns into stories. Cuervo, man, it's that high five, a random stranger effect. That's right. The game is popping. You're hugging people you never met before. That's the kind of energy that Cuervo brings. It's so smooth, so delicious. That's the Cuervo effect. Keep it Cuervo.

Speaker 2:
[30:29] What are you reaching for? If you're a smoker or a dipper, you could be reaching for so much more with Zyn nicotine pouches. When you reach for Zyn, you're reaching for 10 satisfying varieties and two strengths. For a smoke-free and spit-free experience that lets you lean in. For chances to break free from your routine and a unique nationwide community. Whatever you're reaching for, reach for it. With America's number one nicotine pouch brand. Find yours wherever nicotine products are sold near you. Warning: this product contains nicotine. Nicotine is an addictive chemical.

Speaker 6:
[30:59] Dan Lebatard.

Speaker 10:
[31:01] He called me out on my own podcast. He called me full of shit, claiming that I'm faking interest in the solar eclipse.

Speaker 3:
[31:08] You do do this. You love to just get excited about everything.

Speaker 10:
[31:11] Okay, Junior.

Speaker 6:
[31:11] Stugotz.

Speaker 10:
[31:12] I had to school you and explain to you.

Speaker 8:
[31:15] He was going to take you to Augusta.

Speaker 10:
[31:17] When I was 17 years old, Alan Sherry and I used to haunt the Buehler Planetarium.

Speaker 6:
[31:23] This is The Dan Lebatard Show with Stugotz.

Speaker 7:
[31:33] Ronan, a yes or no question, actually. I'm gonna try and confine you to yes or no on this. Do you believe that in 10 years, the billionaires will have bought up your ability to do this story?

Speaker 15:
[31:46] Well, I, as an individual, and to the extent that I command a small piece of real estate in the journalism and media world, will remain as independent as I possibly can. The issue is the distribution models. I know you asked for yes or no, but if a journalist wants to get their stories out, The New Yorker magazine is an incredible, unique place. But of course, it has a family that owns the company. There are layers above it. My TV life, when I make investigative documentaries and film, those are at HBO for the most part. I have two more coming down the pike that we're premiering at the Tribeca Film Festival in a few weeks. Those are all labors of love. They're this kind of important storytelling that I hope gets meaningful information about America to people. They, to get to millions of people through television, have to go through a network. And that network is owned by a company. And if you look all the way up to the top, you can do your own research on who owns, you know, all of the platforms. I encourage people to be aware. And I guess the commitment I can give in response to your question is, I will always try to curate the working-level groups of people around these stories to be as independent as possible. And if there's ever interference in these stories, I'll walk, as I have in the past. But that gets harder and harder to do; there's less and less space to do it. So I'm really glad you asked the question. Because the answer is, for most journalists, most of the time, no, you don't have a choice. All you can do is walk that tightrope and try to stay as independent and strong as possible and not tolerate interference and go directly to viewers and listeners. This is why I'm glad to have your audience, to have you guys caring about it. Because we've all got to be finding the journalism and supporting it directly. Like, I'm going to be going on Substack and launching in the next couple of weeks. I hope people will support what I do there.
More independence is better. We all need it to get good information.

Speaker 7:
[33:48] I believe that his career is under threat because of the things that he tackles and because of the weakening of media positions, even though he's the very best and he works among and with the very best. When he pours himself into a story, it's exhaustive. When it comes to the important subject matter you've reported on, has there been anything you've reported that gives you more fear for our future than what you just reported here?

Speaker 15:
[34:13] Well, my beat is that I tend to tackle things that are scary in some systemic way. So you know, there's elements in each of them where I've thought, wow, I really want to spend a year of my life on this because I'm alarmed for all of us. But I have to say, in terms of like sweeping economic, cultural and political stakes, this one may win out. You know, it's not just the technology itself having these existential stakes that are so vast for our future. It's not just the economic layer I talked about. It's the way in which all of those are against the backdrop we started the conversation discussing. That in a post Citizens United America, this Supreme Court decision that really unleashed the flow of special interest money into politics, you have these same guys that are building this tech and warning us that it's so dangerous, really operating without meaningful regulatory and political muscle to keep an eye, to ensure people are safe, to ensure that it's not just a runaway profit motive. You just don't have right now in this country a set of political incentives and political willpower and structures to rein this in. It's very hard to see how that changes unless people really care about this, know about this and vote on this basis. Because right now what's happening is we're talking about these issues. I think people are starting to care about these issues. But also if you're running for office right now, you are dealing with a flood of AI money into politics. Super PACs devoted exclusively to striking down regulation on AI. There is a ton of muscle behind making sure that this gold rush is completely unimpeded and that no thought about safety, about re-skilling people to make sure that the job losses are counterbalanced. Any of the sensible things that I think there should be conversations about, there's just not room. 
So I would say the single scariest thing within this story is the structural issue of what is happening in terms of the power of the private sector, and tech specifically, and the disempowerment of our government and the systems that were supposed to serve as a check.

Speaker 7:
[36:31] And media. Your reporting made it feel like America doesn't have the wherewithal in its systems to impede a doomed future.

Speaker 15:
[36:43] Well, I will say that while the piece just in its bones, it's not an op-ed, but in the facts it conveys, it does kind of sound that alarm, right? That is in the facts of how America and how Silicon Valley work right now. I would say, nevertheless, I emerge from this reporting without my optimism totally shattered. I really do think the math of politics and of democracy can still work. Even though there's this flood of AI money, there's so many people who are scared to talk about this stuff and even more scared to undertake the process of trying to regulate this. There's also the simple math of getting elected. And I think that the poll numbers are clear, right? More than half of Americans now have more questions or reservations or more of an awareness of downsides than they are optimistic about AI. People are anxious, they are worried. There is a national conversation starting, particularly in recent weeks, about a crisis of integrity in AI leadership. Politicians can be made to pay attention to that. So I would say, you know, everyone at home listening, check if your candidate is getting big AI money and make sure you're calling in and saying, hey, our representatives in Washington should be working for us. And, you know, look at the issues, safety, the geopolitics, the threats to employment, all of the things that might affect you and your family. And have a conversation with your representatives and figure out ways to make your voice count right now. You know, and that's not throwing bottles. That's getting in there and trying to catalyze a mass movement that demands accountability.

Speaker 7:
[38:26] Go to newyorker.com to read more of Ronan Farrow's Pulitzer Prize-winning reporting, including his latest, "Sam Altman May Control Our Future. Can He Be Trusted?" Less than a minute if you can, Ronan, because we're out of time. You interviewed him more than a dozen times. How did that go over time? How awkward was it by the end, when you're saying, hey, you're a known liar?

Speaker 15:
[38:46] Well, first of all, I give Sam credit for actually participating, and deeply. Second of all, it was very revealing, and there's 16,000 words for anyone who wants to read it on exactly what portrait emerges. But I would say, in short, anytime you do a piece like this, it is extremely contentious. I won't get into specifics about the behind-the-scenes process here, but more often than not, I receive some variation of threats, a lot of pressure, a lot of incoming, a lot of demands to change things. All I can say is, in this case, I listened deeply, and the resulting piece is actually very generous to Sam. He may not feel that, but it is a piece where I actively took things out that were true, just because I thought he was making a good argument that maybe they were sensationalist or a distraction from the main point. I really worked hard to make sure this was forensic and measured and fair to him, and doesn't reduce him to being a simple villain of some kind. He's a complicated person, and I think the critiques of him have some validity, but really what I hope is that it opens up a bigger conversation about all of these guys running AI with our future in their hands.

Speaker 7:
[40:01] Chris Cote, go ahead and take the whole 16,000 words, shove it in AI and get a quick summary.

Speaker 8:
[40:06] No, that's happening all the time too.

Speaker 7:
[40:09] Thank you, Ronan. I appreciate it. I always appreciate the work. We'll talk more tomorrow about the fact that people are getting through online degrees in weeks now because they're just using AI. AI usage crashes as soon as school gets out because everybody's cheating their way through college. I want to play Trista's rap again for the audience here, because I want to demand that Greg Cote create a freestyle collaboration with Trista and Juju that rivals this. She does not have ghostwriters. Cameron and Mace are not writing this for her. She writes her own shit. You guys got to compete with it.

Speaker 11:
[42:18] Air Force Ones.

Speaker 13:
[42:23] Sports fans, all the sports are coming together. It's a great time to just sit on your couch, text your friend, hey, come over, let's watch the games. And when I do that to my friends, guess what they text me back? I got the Miller Lites. That's right. They pick up Miller Lite pretty much anywhere they sell beer, and they come over to my place. We take that first sip and we realize, man, we just made a regular old-fashioned night into a special night. Thank you, Miller Lite. And shortly thereafter, we got multiple screens on, everybody's dialed into something different, and the whole night just keeps building and building and building. That's why I reach for Miller Lite. It can take an ordinary night and take it to an extraordinary place. It's clean, refreshing, easy to drink, full of taste with simple ingredients, just 96 calories and 3.2 carbs. The original light beer since 1975, and still hitting different. Cheers to legendary moments with Miller Lite. Great taste, 96 calories. Go to millerlite.com/dan to find delivery options near you, or you can pick up some Miller Lite pretty much anywhere they sell beer. It's Miller Time. Celebrate responsibly. Miller Brewing Company, Milwaukee, Wisconsin. 96 calories and 3.2 carbs per 12 ounces.