transcript
Speaker 1:
[00:00] But that's what I think of when I think of these outlets. It's like they're journalism, but they're just like dull, bland garbage designed not to offend anybody. That's just not really useful. I think you'd be better off going and watching The Muppet Show for half an hour, and you'd probably come away better informed than from most of this stuff. It's just not good. It's not interesting. They're not really interested in tech, and I think it's embarrassing. I'm hoping this era ends with some sort of renaissance.
Speaker 2:
[00:43] Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and this week my guest is Karl Bode. But before we get to that, just a reminder that this month is the sixth birthday of Tech Won't Save Us. I've been doing this show for all these years now. I have interviewed over 300 amazing guests to give the many thousands of listeners to this show the insight they need into what these tech companies and these executives are doing to our world and to our individual lives, as we have to interact with these products and the way that they change the society around us in ways that are really not serving us, degrading our existence as they become more and more powerful. So I think that these conversations are really important, and many of you do as well. And if you do enjoy this, if you want to make sure I can keep doing this work, having these interviews, providing these perspectives on the tech industry, this month I'm trying to get 100 new supporters of the show over on Patreon. So if you do want to help support the show, help support the work that goes into making it, take a minute to go to patreon.com/techwontsaveus, become a supporter, and help ensure that I can keep doing this. Thank you so much. As I said, this week's guest is Karl Bode. He's a freelance reporter, and he has a newsletter called The Fine Print that I highly recommend you go subscribe to, as people do with these newsletters. By now, you've probably seen this story about Sam Altman in The New Yorker. It has been making the rounds, and there has been a lot of commentary on it. Karl wrote about the problem with how the media reports on these CEOs and how they seem to get tricked, or get the wool pulled over their eyes, time and again by these people who are basically lying to us in order to enrich and empower themselves.
Time and again, they get held up as these important figures who we should be paying attention to, who we should be giving some degree of benefit of the doubt to, or at least believing the types of things that they tell us. And time and again, it turns out that they are not really how they are being presented to us. We have someone like Elon Musk, who presented a certain image of himself and, of course, was never really that person he presented himself to be, and now he is embracing right-wing politics and having all of these detrimental impacts on our society. Now we see someone like Sam Altman, who has followed a similar mold, who was praised as this rising CEO genius, and now more and more we have people admitting that he is seemingly a compulsive liar who will do anything to gain power and influence. And is that really the kind of person who we want wielding this much power and influence in our society? I think not, and certainly Karl doesn't think so either. So we dig into this report in The New Yorker on Sam Altman, what we take from it, what we think about it. But we also extend that into a conversation about how the media reports on these CEOs and these tech companies, and the real flaws with that. There are a lot of great tech reporters out there, but there is also a lot of reporting that just serves to boost up these CEOs and reflect the narratives that they're trying to tell the world, and we're not served very well by that kind of reporting. So I think you're really going to enjoy this conversation. I always enjoy talking to Karl, and I was really happy to have him back on the show.
If you do enjoy this conversation, just as you enjoy many conversations on Tech Won't Save Us, again, I would ask you to consider going over to patreon.com/techwontsaveus, helping us meet our goal for the show's sixth birthday, so that I can keep doing this work, keep having these conversations, and keep educating you on what the tech industry is doing to our society. So thank you so much, and enjoy this week's conversation. Karl, welcome back to Tech Won't Save Us.
Speaker 1:
[04:19] Hey there, thank you for having me. Nice to see you.
Speaker 2:
[04:21] Absolutely, it's always great to talk to you. I love watching your commentary online on everything going on with the tech industry, and in particular these tech billionaires and everything that they're up to and how they're making the world a worse place. And you've been writing recently about the way that the media reports on these CEOs and these tech companies. Of course, not a new subject matter for you, but you've had some recent pieces that were really intriguing to me, especially on the back of this Sam Altman piece in The New Yorker. I figured it was a great time to actually talk about how the media reports on these CEOs and on the tech industry again, because we've certainly talked about it on the show in the past. But there are new details here, there are things to dig into. And so I guess I just want to start with a broader question. Why does tech media, and really, I guess media in general at this point, report on these tech CEOs in the way that they do?
Speaker 1:
[05:15] I think over time, they've just become an extension of marketing. As media got consolidated under the ownership of mostly right-wing, very rich white men, they have a very vested interest, and it's not subtle when their news coverage is kind of polluted with their motivations, right? So you get a lot of what I affectionately call "CEO said a thing" reporting, where they'll just mindlessly parrot whatever the CEO said. We're going to Mars. AI is going to be sentient in just three weeks if I get $1 billion. And they'll just repeat it. They won't include any context, any history. It doesn't matter if the CEO has been full of shit for 10 straight years; we won't mention that. So it's a very specific class of reporting, and I see it all the time. As I said, Elon Musk has obviously been a huge beneficiary, Sam Altman, Zuckerberg. It's just this weird pantomime. It's not really journalism. It's stenography and parroting. And they could easily call up an academic. Academics are desperate to be called up on the phone and asked questions, because they've been studying these subjects for 30 years, right? They wouldn't even include a paragraph where an academic comes in and says, that's not really very plausible, because that would require actual journalism and actual work, and that's not what this class of journalism does. And I think as things have gotten consolidated and they fired a lot of the real journalists, what's left is this weird simulacrum that's just kind of pathetic and sad, quite honestly. And I think Sam Altman has really benefited from that.
Speaker 2:
[06:42] Yeah, I completely agree with you, right? It's wild to look at some of these articles. There's a certain level where you can understand why it happens, where I feel like we have been through this era of the resources really coming out of a lot of journalism. There's the pressure to produce a lot of articles now that it's online and things are being indexed by the search engines, and you're trying to optimize for the search engines, and I guess now the chatbots or something, I don't know. So you have that degree of, I guess, the pressures that are there. And then on top of that, you have fewer resources to actually put into the reporting on many of these things. And so it just becomes easy, as you say, to kind of repeat the press releases, to repeat the statements of the CEOs. And it's wild to read some of these articles. For example, with Elon Musk talking about going to the moon or going to Mars, they can't even include the fact that he said we'd already be there by now, you know?
Speaker 1:
[07:40] Or Optimus, the endless stories about the Optimus robot, you know, when nobody's really seen anything that has a functional battery life. There are endless examples of it. It's constant. And as AI has been integrated into this, I think what these folks that own these outlets want is to create this ouroboros of click engagement, you know? Whatever gets people to click. If people click on it, it's good. They're going to automate that. They're just going to generate a bunch of ad revenue. They don't care about the ethical implications. They don't care if the tech works. They don't care how it impacts labor. They're just building this massive monolithic thing that shits out ad money for them without the pesky need to pay journalists a living wage or health insurance. So you see it everywhere. It's not been subtle. There used to be some debate. Media academics have warned about this stuff for 20, 30 years, what we were building, and it used to be that they'd at least get a skeptical raised eyebrow at some of these claims. But these days, it's so unsubtle, especially with all the authoritarian bullshit in the States, it's really not even up for debate anymore. Most of these outlets are just extensions of the extraction class. They're creating these swaddling narratives for rich people to tell themselves to feel good about what they're building, and a lot of it is just gibberish and bullshit.
Speaker 2:
[08:51] Yeah. I feel like I've had Victor Pickard on the show before, I believe.
Speaker 1:
[08:56] He's great.
Speaker 2:
[08:56] Yeah. He does really great work assessing the media industry and what would have to be done to make it better, to better serve the public and what we expect media and journalism to be doing in our societies, democratic societies in particular. Which is probably a reason for me to have him back on the show soon as well, to talk about how this is all going, because he's fantastic on these issues, right?
Speaker 1:
[09:23] His big thing is that corporate power and journalism cannot exist side by side. They have just completely different financial interests. You can sometimes get something that looks like decent journalism; I'm not saying that all corporate journalism is 100 percent awful. But his big point is that those two can't cohabitate very effectively. He's a big advocate for publicly funded media, which Trump just destroyed the last vestiges of in the United States with his assault on NPR and PBS. But I think that's true. I think we need some publicly funded, crowdsourced public media in this country that actually cares about the truth, because it's very clear a lot of these corporate outlets simply don't.
Speaker 2:
[10:02] No, I completely agree. I think we need much more public media, and certainly that's the case in the United States. But even in a country like Canada that has a major public broadcaster, I would say that it needs more resources and more funding. We need to look at more public structures in order to put more resources into journalism so that we can again get the benefits that come of it. We need this institution that holds power to account, and as it becomes decimated, as there are fewer newspapers, fewer journalists, and less resourcing going into it, it is not able to do that. We very much see that reflected in the way that the tech industry and these CEOs are reported on and have been reported on for many years. As you say, Elon Musk is someone who has benefited immensely from this type of reporting, and from the way that he can just throw out these big sci-fi ideas and have them echoed and repeated. He will get magazine profiles and be on the covers of these magazines, and everyone will be fawning over these big ideas. It doesn't matter whether he will actually deliver them. It's very important for pumping up the valuation of the companies in the big picture.
Speaker 1:
[11:11] Right. They're creating an alternate reality where history doesn't exist in many ways. He's an overt white supremacist. He says hateful, ignorant, vile shit all of the time. And you could literally go to any story about him, pluck any story off the news wires, Reuters, AP, New York Times, Washington Post. You might find a vague reference to the fact that he's controversial, but you won't find any overt mention that he's consistently full of shit, an overt racist, an unsubtle white supremacist, a fascist supporter. His incompetence in the DOGE stuff, pretending he was going to improve government efficiency and then just blowing up a whole bunch of money, stealing a bunch of data, and running off. That's relevant. If I'm a real journalist writing a story about Elon Musk, I'm going to have at least one or two paragraphs about his history of just abject failure and disgusting comments. They just memory-hole this stuff, because that's not what a lot of these outlets are interested in. These outlets are actually interested in accumulating wealth. Tech is incidental for a lot of them, I think. They don't actually care how the tech works. They don't care how OpenAI really works. They don't care how electric cars drive, how the engine works, which actual engineers worked on this. They're not interested in that stuff. They're not interested in engineering. They're not interested in tech. They're interested in accumulating money. It's obvious. Once you look at it through that frame, it becomes very clear why they behave this way, I think.
Speaker 2:
[12:29] Yeah. Well, the "CEO says a thing" articles that you're talking about feel very much like clickbait articles. It's the thing that comes out. It's a grand statement. Oh, what is Elon Musk saying now? I need to click this and see what it is.
Speaker 1:
[12:43] It's weird, because I don't even know how many people read these articles. I think they're aimed at an MBA-grad type who doesn't want to think too deeply about the ethical impact of tech. I think that's one of the target audiences. I think they're aimed at people who are just gobsmacked and easily impressed by innovation in tech, people who really want to believe that there are billionaires out there who are going to take us to the moon and solve all the problems. I think that's a comforting narrative for them to push. I think those two get fused, but I still don't know how many people actually click on this stuff. I think its primary interest is to swaddle, like I said, the extraction class in these narratives that they are good people doing noble things. American ingenuity and innovation is at the forefront, and a lot of nationalism creeps in. For an industry that supposedly prides itself on telling the truth, it's interesting how challenging it is for them to recognize when they're not doing that. You would think after 30 years as a reporter, a lot of these guys would be better at that, but I think over time, like I said, a lot of the more critical-thinking reporters get weeded out, and what's left is whoever toes the line, with exceptions. That New Yorker article, I think it took the long way home, and I think it buried the lede in some spots, but it was an excellent analysis of Sam Altman, and it came from a corporate establishment media outlet. So it's nice to see the paradigm broken up occasionally anyway.
Speaker 2:
[14:11] It does feel like there's been more permission for this kind of stuff to be written in the past few years, in a way where it might have been harder to see, I don't know, 10 years ago, right? This kind of reporting and this kind of image of the tech billionaires being presented. But because the public has swung so much against not just the billionaires but the tech industry and Silicon Valley to a certain degree as well, especially in recent years, it feels like there's more of an opening for this kind of critical reporting on tech billionaires and the tech industry. But then it's odd to see the critical stories, say, the story about a worker who died in an Amazon facility recently, where they just told everyone to keep working. These kinds of stories about the harms and the clear problems with the tech companies and their models then sit alongside these stories that are just, again, kind of the CEO says a thing or the rewritten press release kind of stories. It's weird to see these two sitting next to one another.
Speaker 1:
[15:16] Yeah, totally. I do think the public really craves the truth. I think they can see it in their lived experience. Young people trying to get into the job market can see that the promises about AI making their lives easier and four-day work weeks, they can see that's bullshit, right? They want somebody to tell them the truth. So they look to these corporate outlets, and they don't get it. So when somebody does tell the truth, I think that can elevate them above the mire, especially in the AI era, when everything's got this homogenized sameness to it. I think a lot of the reporting is going to be more and more of that. I think authenticity and truth are going to have a premium. At least I like to tell myself that.
Speaker 2:
[15:54] No, I definitely feel that way. It's interesting what you were saying, that there are a lot of journalists who just don't seem to be able to see through the statements that these companies make or the lies that these CEOs tell. To me, it feels like, on the one hand, there's this group of journalists who really want to believe in what the tech industry is selling. They want to believe in the grand narratives of transformation and the world getting better and innovation and the sci-fi futures being realized and all that stuff. There's that class of people. But then it also feels like there's another group for whom it's risky to put your neck on the line and to say, this is bullshit and this is not happening. Because what if this is the time when they're actually going to follow through and deliver something, and then you've been shown to be wrong by questioning it?
Speaker 1:
[16:49] Yeah. But you could be honest. You could just choose to be honest. You could just say, okay, I was a booster previously and this came out and it's good or it's bad. You don't have to worry about that. If you're a journalist, you just go to where the truth is. You don't have to be worrying about what your legacy is. I think that's the polluted thinking that comes from access journalism. You've commented on your show a lot about these certain tech access journalists who are very happy to be close to power and excited to be called up on the phone by CEOs. But the CEOs are picking them for a reason. They're not picking them because they're a good reporter. They're picking them because they know that reporter won't really push them. Even the ones that sell themselves as truth-to-power outsiders, like Kara Swisher, who I know you've talked about in the past.
Speaker 2:
[17:31] Absolutely.
Speaker 1:
Her entire legacy is that she's this outside-the-box, free-thinking, rugged guerrilla journalist who's giving it to the man and asking tough questions.
Speaker 2:
[17:42] She got the leather coat and the sunglasses.
Speaker 1:
She's got a pretty strong track record of being a little too cozy with these CEOs. It happened with Elon Musk, it happened with Sam Altman, and I think the industry is really quite populated with those folks, because that's what billionaires want to give their money to. The people that fund media are very rich, white, conservative men, or at best centrists. There are very few left-wing billionaires out there funding media, and the media reflects that. If we had a bunch of progressive billionaires with hearts and ethics who actually cared about the functioning of the country and weren't just purely extractive, I think you might have some media outlets that reflected that, but we most certainly don't. So I think the journalists that are left are the ones that tell stories that these people like. And they really like them if they tell those stories and pretend to be bold truth-tellers. That's a tough balance to strike: selling the public on the idea that you are simultaneously carrying the industry's water and hyping up their products but also holding them to account.
Speaker 2:
[18:46] We're being quite critical but as you've said, there are a ton of fantastic journalists reporting on the tech industry who do really great work at these publications. I don't want to make it seem like we're shitting on all of them.
Speaker 1:
[18:58] No. I mean, every outlet has half a dozen or more excellent reporters here and there. The Wall Street Journal has great reporters, Reuters, AP, New York Times, Washington Post. They all have good reporters buried in there but they are not the norm.
Speaker 2:
[19:10] Yeah.
Speaker 1:
[19:11] These voices are not mainlined above all the other gibberish that corporate power wants you to consume.
Speaker 2:
[19:17] Totally. It's like you see someone like Kevin Roose, obviously at The New York Times, who really wanted to buy into the crypto stuff until it all fell apart. Now he's a big AI booster, and he's one of their chief people explaining to their readers what the tech industry is and how it's working. It's clearly selling the industry's narrative to the public in a way that is very useful to the industry, but it's really not informing people about the reality of these technologies and how they're really working. It's putting all of that through the lens of what the industry would want people to believe, and then you have people being misled about how AI works and what it is actually doing at the moment. Of course, someone like Roose would claim that that's not what he is doing, not what his reporting actually contributes to, but that's because he sees all this in a certain way. You mentioned Zuckerberg earlier. It would always stand out to me that whenever Zuckerberg would make a big announcement, one of his first interviews would be with Alex Heath. He used to be at The Verge; now he has his own kind of thing. It was always presented as, look, I got the big Zuckerberg interview. It's like, yeah, because he knows you're not going to ask the hard questions.
Speaker 1:
[20:29] Yeah, it's not the point of pride you think it is, because they chose you for a reason. Most of the time, they don't even choose journalists anymore. They'll choose, what's his name? Lex Fridman or something like that. There's this array of fake journalist podcasts that they'll go on, because they know they're going to get peppered with softballs. But yeah, if you land a CEO interview, it's not like you're being chosen for your chops. Roose is interesting to me, because when he was talking about AI, there was one interview he did, I think with Casey Newton last year, that stuck in my craw a little bit, where they really attributed all sorts of human malice and motivations to AI, which suggested to me they don't, or at least Roose doesn't, understand how this tech really works. It doesn't think or truly understand. It's providing you a sentence structure based on massive input of what it thinks the correct answer is. There's some moment where he talked about how it would be good for mental health therapy, and then there's another point where he says it has ulterior motives and bad intent sometimes. That's a tell to me. If you want to see whether a tech journalist understands AI at all, see if they attribute human intention, because that's not what's actually happening.
Speaker 2:
[21:46] Absolutely. It feels like, on the one hand, there's this idea that you need to take seriously the statements of the tech CEOs and the tech companies, what they claim the technology is doing or becoming, because we hear from these AI CEOs how the technology is starting to think for itself, and AGI is right around the corner, maybe it's already here, the computers are starting to question things. We see these stories from Anthropic every six months about how the AI is thinking now, and they're so shocked, and all this kind of stuff, right? Because they asked it whether it can think and it told them yes or something. It's ridiculous stuff. So there's that piece of it, where you take what the companies are saying seriously and you give it weight. But then the growing number of studies about the social harms of the technologies, what it's doing to people's critical thinking skills, and all the stories about people's mental health breakdowns and the way that it is assisting people in taking their own lives or in planning school shootings and all these sorts of things, those details are not taken as seriously or addressed in the way that they should be, because what matters is what the companies say and not the growing pool of evidence that what they're saying is not reflective of reality. Right.
Speaker 1:
[23:06] Yeah, exactly. Which is a shame, because the technology really is interesting. How it works, what it can do. I mean, it's software. These are interesting evolutions in software, some of which are actually useful, and there are interesting conversations to be had about how things work, but that's not what you ultimately wind up getting. You get a lot of boosterism. You get this hyperbolic gibberish that pushes the idea, like you said, that AI sentience is just right around the corner. It's a shame. You could easily have a paragraph or two where you explain context, history, how the tech works, and throw in a quote from somebody who's a truly objective academic and has actually studied this. It's not like that's hard to do. They just choose not to in most of this particular type of journalism.
Speaker 2:
[23:47] Yeah. We've been talking about this broadly in a more abstract sense. Obviously, we've pulled out specific details, but I did want to pivot a bit more to this story in The New Yorker that we're talking about, right? That goes into Sam Altman, who he is, and I guess the presentation that we have had of him for the past few years, and how that is not really reflective of who this man seems to be, and how reporting and journalism has contributed to giving us a particular image of who Sam Altman is that is maybe not reflective of reality. I guess I want to start with, what did you make of this piece and what really stood out to you from reading through it?
Speaker 1:
[24:26] I thought it was great. Like I said, I thought it buried the lede a little bit and took the long way home to the point that this guy has consistently lied about everything all the time. One central theme through the piece is that he just tells everybody what they want to hear, all the time. He's good at that.
Speaker 2:
[24:41] I remember this really came out in Karen Hao's book as well. I think it was one of the key things that really stood out from it for people who read it, right?
Speaker 1:
[24:50] Yeah. He's like Elon Musk. He's not as severe as Elon Musk; I don't think he's got the same authoritarian trappings or white supremacist interests as Elon Musk, but he's got a lot of the same skilled opportunism: understanding media, understanding how to tell people what they were looking for to get money, which is a talent. It is a talent. But also, like Musk, he's not really an engineer. The article goes into a little detail about how he really doesn't understand the finer details of the technology he's talking about.
Speaker 2:
[25:18] I wonder if he's also asking OpenAI employees how many lines of code they have written.
Speaker 1:
[25:24] Right. Yeah, it's similar. I bet his management style is not a lot better. I'm sure we'll see more on that. But yeah, a lot of this is stuff that the board members said, and the trajectory itself is fun. OpenAI was supposed to be started as a nonprofit that was concerned about the public interest. Like, last month they signed a deal with the Pentagon to be their chief surveillance and targeting partner. And then you read their press release, and they're pinky swearing that none of this is being used nefariously, right? And you have no way to confirm that. We no longer have functioning regulators in the United States to provide oversight, and we've never had any transparency into domestic surveillance. So that trajectory is amazing: a company that was started on the promise of doing good and has ended up here in the toilet, partnering up with authoritarians to help them spy on people and target minorities in foreign countries around the world. That's an interesting story, right? And in 2023, when the OpenAI board tried to fire Altman, they made clear observations: that he was not a reliable narrator, that he had multiple financial conflicts of interest, that this is not a kid you want with his finger on the button, as one board member said. So this is stuff we knew already, but the tech press kind of broadly decided to just ignore it. At the time when those board members came out, highly critical of Altman, I remember the general tone in the tech press was that these board members were all hyperbolic cranks to be disregarded. Sam Altman is just a pioneer. He's doing things on the edge of innovation. You've got to give him a lot of slack. These board members just don't understand how things work. And now here we are, two years later, and this New Yorker article basically confirms everything those board members said.
Now, to be clear, a few of those board members might have been a little crankish, but several of them were also very reputable people who ultimately were proven right across all of this. So I'm interested in where tech journalism failed us in that arc, you know, because this is a very unsubtle, clear story again, like Elon Musk's story, and they were not there telling us the truth. So some of my questions are: why weren't they telling the truth? What were the motivations for them to downplay those concerns about Altman? And why are they still downplaying those concerns?
Speaker 2:
[27:33] Well, I remember at the time of Altman's ouster as well, they weren't initially very forthcoming with the reasons that Altman had been pushed out, and it seemed like their, for lack of a better word, their media strategy for why they had done this was not very strong. Yet you have someone like Altman who can immediately call in a ton of resources, a ton of influential people, to start putting out the counter-narrative to defend himself. And you immediately have people in the press who are open to taking his version of events, believing what he has to say, and then echoing that as the conclusive narrative for the story, right? Whereas the board is just being hit with questions, doesn't seem to be properly prepared to explain why they had really gotten rid of this guy, or is nervous to really come out and provide a full justification. And there were certainly plenty of journalists presenting the Altman side of things, but it feels like nobody did that more vigorously than Kara Swisher, right?
Speaker 1:
[28:43] Yeah. I think she called them cloddish. I forget the exact quote, but I saw the tweet the other day. Clods being cloddish was her assessment of that on Twitter at the time. It was just completely, there's no way this could be true, because Altman is a boy genius. They were really invested in Altman being a boy genius. I've sat and watched. There was one interview I saw him do with a bunch of Indian developers where, like Musk, he just spewed out a lot of stale sci-fi tropes, jumbled them together, and sounded smart. The audience was just so gobsmacked by him and so impressed, and it struck me as so phony. It really stuck with me for a long time. I don't think people are very discerning about truth. I don't think they're very discerning about what's actually innovative and interesting. I don't think they're very good judges of character or intelligence in the United States. I mean, look at the people we elect to office and the CEOs that fail upward. I find the whole thing really fascinating and sad simultaneously, watching this play out. Over and over again, too, you see these same cycles play out. There are a lot of differences between Musk and Altman, but there are also a lot of similarities in the way that they are able to exploit a lazy press, tell people what they wanted to hear, and get millions of dollars. Somebody will make the argument, oh, but they're good businessmen. They got a bunch of money, so that makes them good businessmen. But we're halfway through the story. OpenAI is very likely going to be one of the first big casualties of the AI bubble when it pops. They're likely to get gobbled up for a song by one of these bigger, larger companies. So I don't think you can say, oh, but he was a good businessman, because I don't think we've really written the end of the book yet. He was good at telling people what they wanted to hear, and he was good at accumulating a bunch of money.
Those are skills, but they're not necessarily ethical or valuable skills.
Speaker 2:
[30:34] There's still several directions that this story can go in because we have not reached the end of it. As Altman has even acknowledged, there is very likely an AI bubble, and the question is just when it pops, and we see what is going to happen from here. There was even a quote in that story from an unnamed senior Microsoft executive saying, "I think there's a small but real chance he's eventually remembered as a Bernie Madoff or Sam Bankman-Fried level scammer," in reference to Sam Altman. It's like there are even people in the industry who recognize what this guy is. But as you say, the story really shows how effectively he has played people throughout his career in order to advance himself, to enrich himself, and to empower himself. Whether that is going back to his first company, Loopt, whether that is at Y Combinator and how he used that position and eventually was forced out. Obviously, there are multiple versions of the story of what happened there. Then seeing, of course, how things have played out at OpenAI and how he has used that as a vehicle to expand his power and influence, not just in the tech industry but throughout the world. He has been very successful at using everything available to him, his network, his relationships, to his advantage. But there's a real question, given the degree to which he has stretched the truth and lied to get this company to the position that has delivered him this influence and power: what is going to happen once the air goes out of the room and everyone sees that a lot of what he claimed was not accurate, not true? I think a lot of people are already realizing that. But it feels like it takes a bit of time for the markets to catch up, or for there to be a certain event that causes there to be a shift, and all of a sudden everyone accepts or realizes or admits that they knew something was wrong, but they didn't want to say it.
Like with crypto, when it all imploded all of a sudden, obviously, there were critics throughout and I think the critics had a bit more of a voice in the crypto time and were taken a bit more seriously, depending on where you're looking and whatnot. But there were still a lot of people who bought into it, who echoed the stories, who didn't want to be too critical, too questioning until it became clear it was all a load of bullshit that was imploding. Then all of a sudden, a bunch of people were like, oh yeah, I always realized that there was something wrong here.
Speaker 1:
[33:10] Yeah, it always tracks back to the money, right? People are still making money off this hype cycle. We're still at the front end of it. I saw an announcement today that Allbirds was pivoting. The Allbirds, the shoemaker was pivoting to AI and their stock jumped 300 percent. So it's pretty clear we're still on the front end of this. On the other end, when people start suffering and all the people, all the tech guys that benefited off the front end of the hype cycle have exited their investments and all the public is eating the losses on the other side of it, then you'll see a renewed sense of skepticism and see a lot of those same access reporters like Swisher saying, I saw this coming all along. I warned about Sam Altman from the beginning. I warned about, she did that with Elon Musk too. I warned about him from the start. But clearly they did not. Clearly they were part of the mythology making that was required for them to make all this money on the front end of the hype cycle. So I think.
Speaker 2:
[33:58] It really feels with Swisher that it was almost like she had this breakdown in the relationship with Elon Musk. It totally fell apart and she needed somebody else to fill that position or that void or whatever you want to call it. Altman was there and ready to become her new favorite pupil or her new favorite CEO. She has done events with him. She has very clearly boosted the company, talked about how great he's doing. I remember when she was promoting her book, she did an event with Sam Altman where she tried to present herself as a truth teller who asks the hard questions and holds the industry to account, while a major tech CEO was pumping up her book and telling her how great she is.
Speaker 1:
[34:47] Gonzo outsider. I don't think it's entirely malicious. I think a lot of these folks really do believe in the innovative impact of technology. They want to believe. I can't always fault them for wanting to believe, but it's still journalism. It still requires skepticism and doing the hard work and having a self-awareness that you're too close to your sources, holding these events with the top billionaires. What also frustrates me is there's so many people in desperate need of platforming and elevation. There's so many academics, people doing real scientific work, so many engineers making cool shit out there. But most people can't name a single Musk company engineer outside of Musk. Most people can't name a single OpenAI engineer outside of Altman. So if you're going to be a journalist, why not spend some time elevating those folks? If you really are that in love with technological innovation, why don't you spend a little more time with the people actually doing the work, and a little less time with the extraction class of masthead weirdos that we've plunked at the top to accumulate money? Because I think, again, what it shows is that the interest is in the money, it's not in the tech. I see that time and time again.
Speaker 2:
[35:54] Definitely. You were mentioning before that there are people still making money off of this, and that's part of the reason why it hasn't imploded, and there are certainly people waiting to make their money or to cash out or what have you, and that is part of the reason this thing keeps being sustained. I feel like we've seen this in the past, where you look at the attempted WeWork IPO, and there were people really trying to get that over the finish line before it all fell apart, so they could make their money, and it just didn't make it there. Whereas there are a lot of other ones that do get there, only to crash afterward, because the people who want to make their money have finally made it and it doesn't matter anymore. I feel like this discussion of an OpenAI IPO is really important to these conversations, and to the question of where the AI bubble or the AI industry or all this money that is infused in generative AI is actually going. Because I guess part of the question is, if they are able to cash out, if OpenAI is able to get to the IPO, then is there less incentive to try to keep this thing going in the way that it did before, if they can make their money and get out at that moment?
Speaker 1:
[37:08] There's no logic to any of this. As you see with the Tesla stock failures, there's no real sane logic driving any of this. So they're going to ride this thing into the earth. They're just making money and they're going to ride it into the earth until they can't, and then they're going to move on to something else like quantum computing. They're going to do a new thing that comes out and they're going to ride that. These are not people that care about tech. They don't care about people, a lot of them. They care about money. That's what they are. It's not subtle. As somebody who writes a lot about consumer protection and government regulators, I think there's a real risk here in that we have an AI bubble hitting us at the same time as all the rampant Trump deregulation and gutting of regulatory safety agencies and consumer protections. I think we're in for a really rocky next five years, and I think the AI bubble is really going to be only one small part of that. But then again, for people who've waited for the Tesla stock to collapse, it's amazing how long this is taking to really shake out and to see any sort of real-world accountability for any of this. So I'm not going to get into the business, I don't think, of ever predicting when it'll happen. I just know that this is not sustainable when Allbirds is pivoting to AI, and all the CEOs are weird white supremacists with heads full of cottage cheese. That's not a sustainable vision for anybody.
Speaker 2:
[38:19] Yeah, when I saw the story about Allbirds pivoting to AI and how it caused the stock to jump, one of the first things that came to my mind was when all the companies were making metaverse announcements a few years ago, saying that they were planning to do things in the metaverse.
Speaker 1:
[38:36] Yeah, that's another perfect example of what I've been talking about. The pivot, the $83, $85 billion that Mark Zuckerberg spent on pretending to be interesting, and just the money they burned. They just burned through money to create the most derivative, just the most uninteresting VR. He was really confident that he could just basically buy domination of the entire video game, AR, and VR industries. The tech press was right there with him. They're like, yeah, super exciting, boss. That's really innovative stuff. You saw some skepticism of the metaverse stuff, like this looks stupid, the legs aren't there. But they were perfectly happy to accept the company's rebranding effort. That whole Meta rebranding effort came as they were dodging privacy scandals, and the press was really happy to sell the idea that Mark Zuckerberg really was revolutionizing work and play, right? This guy who hadn't innovated, I don't think he's innovated in 20 years, right? Like it takes some skill to create such a large ad monopoly, I'll grant that. But I don't think he's done anything interesting in literally 20 years. But you could pluck, again, any of a million stories from the news wires about Meta or Facebook. You won't see any of them mention the fact that he fails all the time. The stuff he makes isn't very good. They get very excited about the Meta Ray-Ban glasses.
Speaker 2:
[39:55] Oh, my God. Yeah.
Speaker 1:
[39:56] You'll see them talk, well, at least these are selling well, but I never see anybody wear those. Now they've got the reputation as the perv glasses that people are using to stalk women. It's stunning to me, with all that's happened and how clear and obvious all the failures are, that basic journalism fails like this. I'm not just talking about Business Insider, Fortune, Forbes; there's certain websites that are mostly purely business, or they're written for an MBA to feel good about themselves. But I also see this stuff across Reuters, I see it in the Associated Press, I see it in mainstream magazines. They do the same thing where they avoid contextual history, they can't be honest about the thing they're writing about. It's like they're afraid of offending anybody. They don't want to lose advertisers, they don't want to upset sources, they don't want to upset ownership. So you get this weird simulacrum of journalism. It's like a Ken doll of journalism where the genitals have been sanded off to create a smooth hump so that nobody gets offended. That's a gross generalization that I come back to a lot. But that's what I think of when I think of these outlets. They're journalism, but they're just dull, bland garbage designed not to offend anybody, and that's just not really useful. I think you'd be better off going and watching The Muppet Show for a half an hour, and you'd probably come away better informed than from most of this stuff. It's just not good. It's not interesting. They're not really interested in tech. I think it's embarrassing. And I'm hoping this era ends with some sort of Renaissance in terms of, all right, let's get back to focusing on people and what matters and building interesting stuff, and push some of these VC ghouls off into the periphery where they belong. I think they've really dominated the discourse. It's gross.
Speaker 2:
[41:39] Yeah, I think so too. I'm not going to be able to get that image out of my head now for the next little while.
Speaker 1:
[41:46] Yeah, my apologies. That's what it is. It's like a bland, inoffensive fake journalism because they're so afraid of offending people. Not offending people, exactly, but losing ad revenue, losing clicks, offending sources, offending the ownership, and you see it all the time.
Speaker 2:
[42:03] Oh, definitely. I just wanted to go back to what you were saying about the bubble and the moment that we're in. I agree with you. I don't think there's much value in trying to predict when it is going to implode, because I think that there have been multiple moments so far where it's looked like, oh, it has to go right now, it has to be on the cusp of finally imploding, but it keeps going because there are many reasons for it. There's a lot of government money going into AI. There's still a lot of hype around it. There's still a lot of people hoping to make money off of it. There's still a lot of reason for the companies to keep sustaining this. As you say, could this be a Tesla-like example, where despite all the realities that should cause the valuation of this company to be far less than it is, it just keeps being sustained because so many people are financially invested in keeping it that way, because they would lose a lot of money if it went in the other direction? Yeah, I think we still need to be watching where this is all going. We still need to be understanding how the company is operating and what's happening there. But in terms of predicting when an implosion is going to happen, it's not going to be nearly as easy as, say, with cryptocurrencies and things like that. No.
Speaker 1:
[43:21] I think a lot of it will be tied to Trumpism's fate, and the MAGA contingent's fate. As he loses power and wanes, there is going to be a resurgence of an interest in rebuilding the public trust, I think. I don't believe we're stuck in a permanent kakistocracy of just idiots and dipshits running this country into the ground. I won't allow myself to believe that that's just going to stay a permanent fixture. Maybe I'm deluding myself. But I do think eventually, on the other end of this, there's going to be a need to recognize that regulators are important. Real engineering is important. Real science, foundational science, investment, grants, restoring public media, restoring trust in cornerstone institutions. I do think eventually we're going to hit that point. And I don't think these huge idiot bubbles are going to be useful at that point. And I think you might see a slight shift away from them. I mean, America is what it is, right? You're foundationally always going to have what you see around you, which is just this mad obsession with wealth and artifice, right? I think we're a country that's really in love with artifice. The illusion of smarts, the illusion of class, the illusion of power, and I think Trump is perfectly representative of that. And I think as MAGA wanes, I think you'll start to see maybe some sea changes. But again, we've been waiting for him to wane for a decade. So again, it's hard to get into the prediction business. But based on historical evidence, I don't think we're in this permanently. And I'm hopeful, again, that there will be some sort of Renaissance that takes us in a better direction here, a little bit over the horizon.
Speaker 2:
[44:56] I think it's very important to hold on to that kind of hope, that things are going to get better instead of just believing that, okay, everything is bad now and it's going to stay bad and nothing can ever get better. That is almost like a self-fulfilling prophecy then. You need people to have hope and hope that's grounded in something, that things are going to get better and that is possible, and that there are things that can be done in order to realize that.
Speaker 1:
[45:22] Yeah. As a reporter, I wouldn't wake up in the morning if I just thought there was no point to this, that it was just going to be a slippery slope down into a perpetual horizon of Elon Musk selling me gibberish. I don't think I could make it through the day. No, I refuse to believe that. I do still hold out a hope that humanity has a better heart than what we've seen in the last decade or so.
Speaker 2:
[45:43] Absolutely. I feel the same about climate change too. It's like things are looking bad, but we need to still have hope that we can actually do this and turn things around and address this problem.
Speaker 1:
[45:53] The young people I meet don't have a choice. They're like, we don't have a choice. We have to live in this world. We're going to try to build something better. For those of us who have seen several incarnations of this level of stupidity over the last 20, 25 years, it's a much different viewpoint from a kid who's coming up into this and wants to build something, not just wants, but has to build something better for themselves. I think we can do it. It would be nice if we had a journalism that was capable of telling people the truth as a backstop, and we had a resurgent interest in funding education properly. That might be helpful.
Speaker 2:
[46:24] Yeah, instead of replacing teachers with chatbots or something.
Speaker 1:
[46:27] Right. Exactly right. Yeah.
Speaker 2:
[46:29] I did want to ask you, we've been talking a lot about Altman, we've been talking a lot about media coverage of the tech industry more broadly. As I was saying earlier, it does feel like there is a public swing against the tech industry, against AI, against these major billionaires, especially as they have cozied up to Trump and become implicated in that kind of politics. I feel like that is represented in a way by these recent attacks on Sam Altman's house in San Francisco. There was one initially with a Molotov cocktail, and then there was another one a couple of days later. It was surprising to me, though I feel like the first one happening was less surprising. Because a couple of months ago, there was a story about a guy in, I think, Tennessee, who was looking to bomb the xAI data center, and he was caught before he could actually put the bomb together or try to do it. And so I was like, okay, this stuff is out there, right? There are people who are angry about this, who want to do something about it. And so when I heard that Altman's house had been attacked, I certainly figured or felt that this was an escalation from where we have been in the past, obviously. But then to see just a couple of days later that it had been attacked again, this was like, okay, this really feels like something is shifting now, if this can happen multiple times, if this is seeming to become more regular. But I wonder what you think about that and what we've been seeing there.
Speaker 1:
[47:54] The rage against AI is white hot right now. It's wild to go out there and look at just how pissed off people are. And I don't think they're just pissed off at AI, although they have very good reasons to be pissed off at AI. With the energy consumption in the climate era, the attack on labor, the people that are dictating the coordination and trajectory of AI being generally terrible people. But I think a lot of the rage is just at the extraction class. We've had a decade of this, culminating in Trumpism, which is a grotesque, bulbous extraction class icon. So yeah, I'm not surprised. Sam Altman has made constant promises that Gen Z should be really happy that they're entering the market right now. And Gen Z is entering the market, and there's no jobs, there's very little hope, and I understand why there's anger. I understand it completely, and they should be angry. This country has been hollowed out by corruption completely. I think the young generations can see it much more clearly than older generations can, and they have every right to be pissed off, and I understand why they're pissed off. And it's not just AI; the tech industry lied constantly for the decade preceding this. AI is more recent. These are companies that spied on everybody constantly, lied about everything constantly. Facebook was grotesquely just growing into foreign markets with no concerns about whether their information platform was causing genocides. You know, Facebook also went big into India, trying to dominate the entire internet. They would offer a free version of the internet, if you remember the Free Basics stuff they tried to offer, curated by Facebook, where you could only access certain websites. These companies just engaged in repeated terrible behaviors at impossible scale, right?
And then when authoritarianism came to town, they immediately dropped everything and cozied up to it completely. Like, oh, we're not going to engage in content moderation of your racist, bigoted propaganda on the internet? Sure. They immediately cozied up to authoritarianism. So I don't understand why anybody would be surprised that the public is angry about this, right? These are vile people. The authoritarians in charge are vile people, and they've completely thrown all their ethics in the toilet to partner up. So yes, people are going to be mad about that. And I don't think this is going to be the end of it. I think it's going to escalate. I think there's been a top-down class war going on from the extraction class downward, and I think the public is starting to wake up a little bit to that. And it's going to take all sorts of colors and shapes and forms, some of them violent, some of them not, some of them smart, some of them dumb. And I don't think it should surprise anybody. If you've paid attention to US history in the last 10 years, the fact that people are violently angry should not be remotely surprising to the billionaires who bear a lot of responsibility for that anger.
Speaker 2:
[50:39] Part of me does wonder what it does to their paranoia. Like, if you look at someone like Elon Musk, it's been clear for some time that he has really been paranoid about his safety and the way that the public sees him. I remember a few years ago, there was a story that someone had been tailing a car that his son X was in, and he felt that it was someone who was looking to get him, because they thought Elon Musk was in the car and blah, blah, blah. It came out later that it was a Grimes stalker that was just trying to, I guess, find Grimes or whatever. But it's like he is someone who has surrounded himself with security. He has created a vehicle like the Cybertruck that seems designed to be impenetrable, to protect him from the threats that he feels are outside. It feels like this mindset has taken over a lot of these tech billionaires, where they have very much insulated themselves from the world that exists around them, in their closed-off communities with their securitized vehicles, in their private jets and their exclusive terminals and their exclusive areas where they go, so that they don't need to interact with the public. Part of me wonders, seeing actual attacks against tech billionaires, and of course Sam Altman isn't the first one, whether this further pushes them into the arms of the security state and supporting the repressive crackdowns on people's rights and things like that.
Speaker 1:
[52:12] Absolutely. Yeah, that's what's absolutely going to happen. They're going to use it as evidence that the AI detractors are hyperbolic weirdos and zealots, and the more violence there is, the more it actually works for them and against any kind of progress, because they will just treat it, you see it already, they will just treat this as outliers. These radical extremists don't understand what we're building here. We're innovators. Why are they so unreasonable? You shouldn't listen to what they say. That's going to be the play. I mean, you already saw it in his responses to The New Yorker article, right? You'll notice with Sam Altman pretty consistently that he never really takes ownership of anything he's done. You'll see it in headlines especially. He tries to do this thing where he's not personally responsible for anything that's happening. He pretends to be on your side a little bit, right? His company failed to create functional suicidal-ideation guardrails in his chatbot, but he won't honestly own anything. Legally, for many reasons, he can't. But yeah, I think they will absolutely take violence as a cue to harden up their security. I was surprised his house was that findable to begin with, quite honestly. I would have thought that a lot of these guys would have built underground compounds long ago.
Speaker 2:
[53:18] He does have one of those as well.
Speaker 1:
[53:20] Yeah. I know Musk does. I know Zuckerberg does. I think they're only going to harden. These are not people that are empathic and open to the plebs' concerns. You know what I mean? If they were, we wouldn't be in this situation. I think they're just going to get harder. They're going to get tougher. They're going to get more militaristic, and they're going to increasingly frame critics of what they're building as outliers and radicals. Much like the authoritarians do with any criticism of them, they're ideologically going to align. But in a country that's just desperate for an infusion of money into functional resources and infrastructure, I don't think it's going to be a sustainable project for them to continue just being purely extractive. But we'll see. It's going to be an interesting decade here. I'm just not entirely sure what the other side of this looks like. Just as a curious person, it's fascinating. It's a terrible time to be alive in many ways. It's a very stupid time to be alive, every morning you wake up, and the untold horrors are very dizzying. But for anybody that's interested in history or human beings, I find it fascinating, somewhere between stunned and stirred.
Speaker 2:
[54:29] Yeah. No, it absolutely is. I feel like we already see that with, say, Marc Andreessen's Techno-Optimist Manifesto, where he's very clearly calling out the enemies: the Luddites and the communists and the safety people and all this stuff. Then there are the longtermist ideas that these people echo, where it's very clear that they don't seem to see much value in the lives of regular people.
Speaker 1:
[54:51] No. I always like that they hate the humanities so much. Elon Musk and all of these guys hate the humanities, but if they bothered to study the humanities, they wouldn't make such stupid decisions all the time. They're just so mad. The humanities shouldn't exist. We should eliminate the humanities. But if they'd actually understood the novel that they just read, they wouldn't have screwed up so much. They're a very weird sect. It is a religious sect of people obsessed with money. Tech again is incidental to their goals. It could be any other industry. It just happens to be tech right now because that's where all the investment money is. But in another era, it could easily be a different industry that they use as their spearhead.
Speaker 2:
[55:31] Definitely. I feel like we see the religious angle of it more and more with the AI moment and stuff. Yeah.
Speaker 1:
[55:37] Who is that CEO, Karp?
Speaker 2:
[55:39] Yeah, Alex Karp.
Speaker 1:
[55:41] His statements are just so bizarre and radical. Yet the next day you'll read an article about him from a reputable news organ as if he didn't say anything insane at all. It's like, oh, what wonderful little nuanced, exciting innovation is he building for us? There's no reference to the fact that he's demonstrably quite mad.
Speaker 2:
[56:00] Yeah. That wonderful innovation from Palantir that we love. Karl, I have one final question for you before I let you go. It's like a two-part question, I guess. But we've been talking about the way that the media reports on the tech industry, and we've been talking about Sam Altman in particular. So I wonder, do you think that this renewed attention on Altman's predilection for lying, along with the way that people are clearly turning against him, do you think that this is going to have actual real consequences for this man who does seem to have been so empowered these past number of years? Do you think that there's any prospect or hope of the media really changing the way that it reports on this industry as public opinion keeps seemingly turning against it?
Speaker 1:
[56:48] I think there'll be accountability for him eventually. I think this stuff adds up. I remember that article a few years ago where the board said this, and then it happens again, because he's not going to change his stripes. These guys can't change their stripes. They're going to keep making these mistakes, and each time, I think it adds up in the public consciousness that, oh, these are not reliable narrators and I can't really trust them. In the short term, I don't think anything changes for him. You've got that IPO looming. I think there's going to be a lot of hype. There's going to be a ton of CEO-worship journalism around both the SpaceX and OpenAI IPOs. It's going to get dumber, I think, for a while still. But I do think, ultimately, there will be accountability when the money crash happens. And especially if the US economy really tanks out and people really struggle with expensive gas and expensive services, which I think could potentially get much worse with the destruction of our regulators, like I said. So if they are confused by the anger headed their direction right now, I'm not sure they've seen anything yet. I think it's going to get worse if people's independent, direct, immediate realities are more painfully impacted, which I think they will be. As for tech journalism, I think we have to untether it from corporate power and advertising. I think that's a priority. I think if we're building anything in the new age, we have to publicly fund and crowdsource journalism. You have to stop giving money to shitty corporate outlets and start giving money to independent reporters, start giving money to worker-owned news outlets. I think we have some control and agency over what we're sharing, what we're consuming, what outrage bait we're clicking on and retweeting or reskeeting. I think it's important that the public develops a sort of immune response to bad actors and trolls.
So there's a lot we can do to reshape things. I think it's possible to rebuild media, but right now it's not looking great. The layoffs have been immense. The consolidation is immense. You've got deals like the Warner and Paramount mergers. Post-Trumpism, it could be something else, but a functional media that serves the public interest is something we're very much going to have to fight tooth and nail for in this country. Because it's very clear that corporate power does not want an informed electorate. So that's where we stand. It's an ugly landscape, but I like to believe that winning some of these fights is possible.
Speaker 2:
[59:04] Yeah, like we said before, I think we need to hold on to that hope, because that is really important. And otherwise, I think there's no prospect of even trying to have those wins, right? We need to believe that it's possible for it to become possible. Karl, it's always great to talk to you and to get your insights on all this. Thanks so much for coming back on the show.
Speaker 1:
[59:21] Yeah, I always like to talk to you and I appreciate all the work you do. Thank you.
Speaker 2:
[59:26] Karl Bode is a freelance reporter and writes The Fine Print newsletter. Tech Won't Save Us is made in partnership with The Nation magazine and is hosted by me, Paris Marx. Production is by Kyla Hewson. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters and help us meet our goal for the show's sixth birthday by going to patreon.com/techwontsaveus and making a pledge of your own. Thanks for listening. Make sure to come back next week.