transcript
Speaker 1:
[00:01] This episode is brought to you by Caraway. Now, if you're like me, you care deeply about what you put into your body. But the longer that I've been on this health journey, the more I've realized just how many toxic things end up in our home environment, not just microplastics and food additives, but in all kinds of consumer products, cosmetics, even our cookware. And this is where Caraway comes in. Caraway makes beautifully designed and elegant non-toxic cookware, free from PTFE, PFOA, and other questionable chemicals. And honestly, once you make the switch, it's pretty hard to go back. I've been using their cookware at home, and what stands out is how easy everything feels, food releases cleanly, the cleanup is fast, and they come in a bunch of cool colors to match your kitchen aesthetic. Caraway has over 100,000 five-star reviews for a reason, and now it's your turn. Their cook set is a favorite. It can save you up to $190 versus buying the items individually. Plus, if you visit carawayhome.com/richroll, you can take an additional 10% off your next purchase. This deal is exclusive for our listeners, so visit carawayhome.com/richroll, or use code Rich Roll at checkout. Caraway, non-toxic cookware made modern.
Speaker 2:
[01:24] I'm not worried about AI destroying humanity. I'm worried about Sam Altman running an AI company that he will lead to destroy humanity. If this technology goes wrong, it can go quite wrong. Today, we're speaking with Nick Bilton, former New York Times columnist and Vanity Fair special correspondent. I was a reporter in Silicon Valley covering tech, and I do believe that the tech elite are evil. The fear that we're all going to be killed by AI is actually part of their fundraising. You've got Sam and Elon and all these people out there being like, we're going to die, we need more money. The bucket loads of cash come in, and it's all nonsense.
Speaker 1:
[02:02] They've crafted a certain public image tailored to what people seem to like.
Speaker 2:
[02:08] I also think that there's another thing. The biggest lesson I've learned in my life is that, Nick, it's so great to meet you.
Speaker 1:
[02:17] Thank you for doing this. I'm so excited to talk to you.
Speaker 2:
[02:19] Thank you for having me. This is very, very exciting.
Speaker 1:
[02:22] You're somebody who I've been reading for so many years, at least going back 10 or 11 years with your reporting in the New York Times. And you've covered Silicon Valley and the tech titans. And there is an obsession that we have with these billionaires, the technocratic class, these people who are lording over the devices and the apps that we're all using every single day. And these are people that you have spent time with, who are also, for the most part, masterful storytellers and myth makers. What can you share about what you know about people like Steve Jobs or Jeff Bezos or Zuck or these people, that is a common thread with respect to their myth-making superpowers, that we'd all be better off for understanding?
Speaker 2:
[03:21] Well, I do think that there is a through line. They're all obsessed with their own self-image, their legacy, and they are all obsessed with telling a story. And you can see that through one very simple metric, and that is how many people are on the communications teams of these companies: hundreds and hundreds and hundreds. Ironically, Elon fired all of his communications folks because, in his mind, he could do it better, but they are always telling a story. And look, Elon's a perfect example. Let's take The Boring Company for a little storytelling lesson here, right? He was driving to work one day, there was a bunch of traffic, and he tweets, this is years ago, he tweets, there should be tunnels under the ground where you can drive so that there's no traffic in LA. The next thing you know, he gets to the office at SpaceX and the tweet's getting a million responses. And he's like, let's start The Boring Company and we're gonna build these tunnels. I went to SpaceX for a meeting a few months later and there was a dirt patch across the street with one of these big tunnel boring machines. And I said to someone I was meeting with, what is that? And he's like, oh, it's just a little show Elon does, to show that he's gonna build these tunnels underground, it's never really gonna happen, you know? And yeah, they've done some, but this was years ago, right? There are no tunnels under LA. I drove on the freeway to get here.
Speaker 1:
[04:59] There is that sort of small network of tunnels under Vegas, right? Yeah, but it doesn't really seem to alleviate the traffic problem, it's sort of a novelty.
Speaker 2:
[05:07] It's also incredibly claustrophobic, and people have panic attacks in them all the time. But the point is, that's a great story, you know? And when you look at these guys, they do this over and over and over again. And I think that they, and then they start to believe their own story. You know, I always, I got to spend, as a reporter covering tech, I spent time with all of them for very long periods of time. And there was always this, you know, you could see this thing happen. Like, I knew Zuck and Jack Dorsey and all those guys, and Kevin Systrom, all of them, when they were just starting out. And then there's this kind of shock of like, whoa, how'd that happen? I'm a billionaire. And then there's this like, I'm a billionaire, and I'm the one that makes the decisions and screw you. And they just believe the story that gets told about them, or the story they tell about themselves. I mean, I wrote the book on Twitter, and one of the greatest quotes someone gave me was, the greatest brand, sorry, the greatest product that Jack Dorsey ever made was Jack Dorsey. And I think that sums up a lot of these people in Silicon Valley. They create this product that is themselves, often taken from Steve Jobs.
Speaker 1:
[06:28] And his reality distortion field.
Speaker 2:
[06:30] His reality distortion field. And I have a funny story about that. And he, and then they believe the story. And then sometimes the story breaks them, which is what happened to Bezos.
Speaker 1:
[06:41] So tell that Jobs story, because then I have a follow up related to Jack and these other guys.
Speaker 2:
[06:47] So, when I was first starting out as a reporter at The Times, I was writing a story. It was about the iPhone and something related to it, that's not important. And I called Apple PR and I said, hey, I'm writing this story, and the PR woman says, give me a second. And then she puts me on hold, comes back, and says, Steve's going to call you in a little bit. Now, the guy who ran the PR department was this guy named Steve Dowling. So, I thought it was Steve Dowling who was going to call me. And instead she was like, no, it's Steve Jobs. And I was like, Steve Jobs is going to call me? And, you know, I'm a young reporter at The Times. I'm like, is this how it works? So, I hung up the phone and he didn't call. And I went out for dinner that night. And I get this weird 415-whatever phone call. And I answered, I'd had a couple of drinks. And it was like, hey, it's Steve Jobs. And I was like, oh shit. So I go outside, I'm trying to take notes on my phone. He was on the phone with me for 45 minutes, an hour. He wouldn't get off.
Speaker 1:
[08:04] Wow.
Speaker 2:
[08:04] And he was telling me that my story was wrong and this, that and the other and so on. And I remember I went in the next day and I wrote the story the way I had been told by him and I published it. And then afterwards I was like, wait, I wasn't wrong.
Speaker 1:
[08:20] You just got schnuckled.
Speaker 2:
[08:21] I just got completely played. And John Markoff, who had been a tech reporter at The Times for many, many years, came over to me and said one word, no, three words, four words really: the Steve Jobs reality distortion field. And I was like, what do you mean? He goes, it's what he does. He can change the way you feel about something. And it happened, I interacted with him a lot, and it would happen every time. And I got good at realizing, oh, the reality distortion field is coming on, and then he stopped calling me because I wouldn't listen to him anymore. But he was a master at making you believe the thing that he wanted you to believe. And I think a lot of these tech titans looked at him like, that's who I want to be when I grow up. And what's fascinating from a storytelling perspective is they chose which part of him they wanted to tell as their story. So the assholes picked the asshole and made themselves more justifiable. The product people picked the product. And I think he was more influential in the way these people think about themselves than anyone in human history, quite frankly.
Speaker 1:
[09:40] Yeah, these people become billionaires and it has this warping effect on their perspective on the world, because they can create seismic shifts in society by dint of a simple decision. And understanding that story is more important than truth, being able to wield a story in service to whatever you want people to believe, can literally change everybody's perspective on you, dating all the way back to Jack Dorsey's origin story with Twitter. I had a version of that kind of reality distortion experience. I had Jack on the podcast, this was very early on. I went to his house. I basically spent the better part of a day with him. And I left that experience just enamored with this guy. I was so charmed by his charisma and his warmth. And I thought, if anybody should be running Twitter, it's this guy. This is a guy who has created a world in which he carves out time to think deeply about these difficult decisions. And I just couldn't understand why people were having such issues with him, because I had gone into his environment and sort of marinated in the Jack Dorsey experience. And now, looking at where Jack is now, and these other people, I reflect back on that and ask, was I played? Was that genuine? I still don't know. I like to see the best in people and believe the best in people. But I can't imagine what it's like to be a billionaire wielding such unbelievable influence on the world. And I think what happens is, with that wealth and that power comes this sense that, well, you were so successful here that you must be right about all these other things, this galaxy-brain kind of mentality that takes over, that then spills into perspectives on politics and culture and all these areas that have nothing to do with being a successful entrepreneur, necessarily.
Speaker 2:
[11:55] Well, you just summed up everyone in Silicon Valley. Every single successful person, every billionaire that I've ever met, and I've met them all, they all think, oh, I'm the billionaire who is this successful, who clearly knows what they're doing in this arena, so I know what I'm doing in every single one. And I could cite a million instances: people saying, I'm the world's expert on real estate, I'm the world's expert on COVID-19. And then people like Elon, like Sam Harris is a good friend of mine, and he got into a debate with Elon about COVID-19, and Sam proved to be right, and Elon stopped talking to him. And that's the mentality. If you're not in my yes-man circle, then you're out. And so it becomes this reinforcing-
Speaker 1:
[12:48] Yeah, a self-reinforcing cycle. And all the way to, hey, is democracy really working? Maybe we should go back to some kind of monarchistic society. When you see this Dark Enlightenment movement and these very influential people like Naval Ravikant, these gurus of Silicon Valley who have such outsized influence as a result of becoming successful in technology, that is deeply concerning to me, because they do wield so much influence. If these people say something online, people pay attention and listen. And now here we are, being lured into the web of these crypto-state secessionists, where everything is in question, and these people know better than any expert in their respective field.
Speaker 2:
[13:48] Yeah, and Jack Dorsey is a perfect example, because Jack, as I researched for the book, changed the story of how Twitter was founded hundreds of times, quite literally. At first, he came up with it on a slide in a park with Biz Stone. Then it was, oh, actually, I came up with it when I was a kid in my bedroom listening to police scanners. Oh, actually, it was in Oakland when I was a babysitter with blue hair. And it all changed. The reality is, he came up with one sliver of the idea, as did other people, specifically Noah Glass, as the book talks about, and Noah was written out of history. The same thing is true with Square. Jack didn't come up with Square, a friend of his did, and he took the idea, and so on and so forth. We just saw it recently. Square just laid off almost half of their workforce, and Jack cited AI as the reason. Well, actually, if you go and look, if you dig a little further than, oh, the almighty tech guru is way ahead of the curve, his employee base had doubled in size in the last few years for no reason other than the company's not run well. And so the narrative was, let's get rid of half of the employees and say we're doing it because of AI. It's literally back to what it was a few years ago.
Speaker 1:
[15:20] It was just management, poor management decisions.
Speaker 2:
[15:23] And the stock went up because of it. And another example: look at the Tesla stock. Elon can talk about how he is gonna have a million taxis in two years, and there's like 12 of them.
Speaker 1:
[15:40] Yeah, it doesn't matter if it's true or not. It never bears out, just like The Boring Company story. He can just say, oh, we're gonna have full self-driving by this day, and he can move the stock price immediately. And every time you look through the rearview mirror six months later on things that he says, these things never end up bearing out, but it doesn't matter. And this is mirrored in the Trump administration. It's just rhetoric, and just move on. And there never seems to be any kind of reconciliation or accounting for these statements.
Speaker 2:
[16:11] What I think is so-
Speaker 1:
[16:12] Even like self, like a normal person would think, yeah, I say these crazy things and they never happen. Maybe I should reflect a little bit on my relationship with truth and veracity. And like, what does that say about me? Like these people don't do that. It doesn't matter. They don't care.
Speaker 2:
[16:28] The thing that is so shocking to me, and I do not understand it, I don't know if you have an answer: we are in a world where, for the first time in human history, you can have a conversation with machines that can tell you what is right and what is wrong. And for the most part, with the exception of hallucinations and so on and so forth, they can actually detail it. So when you go and ask Claude or Gemini or ChatGPT, whatever you talk to, about the history of Square's employees and the numbers and the revenues and blah, blah, blah, it will show you. But no one asks these questions. No one goes and asks, is it realistic that Tesla is going to have a million self-driving cars in this period of time, or humanoid robots in your house, and so on and so forth. No one wants to do that. They just take it as given, and the stock goes flying through the roof, and everyone's like, all hail Elon. And Mark Zuckerberg too, with social media. What I don't understand is, why don't people ask questions? Why do they just take it as gospel?
Speaker 1:
[17:43] I think it has something to do with the power of storytelling. Take Zuck: these people are so image conscious, they've crafted a certain public image tailored to what people seem to like, and that image then becomes, you know, captivating to people. For some reason, and maybe this is related to monarchism, it's like, oh, a leader that I can look up to and believe in, and that will fuel my cognitive dissonance. It doesn't matter what's true. I like that guy. And for some reason, my affinity for this person makes me feel like I'm in proximity to them.
Speaker 2:
[18:22] I also think, I think you're completely right, but there's another thing. If I were to name the biggest lesson I've learned in my life as a professional over the last 30 years, it's that people are just people. They all get headaches and they all have anxiety and they all want to be loved.
Speaker 1:
[18:47] And they all want to believe that they're good people doing the right thing and acting on behalf of the betterment of society.
Speaker 2:
[18:55] And they want to believe that. And I think what happens is, because of media, people become more and more famous and so on and so forth. And we look at these people and, it's why people get so nervous when they meet celebrities or they meet leaders. They're just people. I never get nervous when I go and interview someone, because I'm like, you're just a person. You probably woke up this morning and you didn't sleep well last night because your dog was sick, or your kids. All these things that we all experience, but they make it look like it doesn't happen. And so they create this bullshit narrative around who they are and what they are, that they're gonna live forever, or that because I'm a billionaire, I'm an expert on everything, and it's all nonsense. And I think once we get to a point where people realize it's nonsense, maybe that's when it changes, but I don't know if people want to believe that it's nonsense.
Speaker 1:
[19:49] Yeah, I don't see that happening.
Speaker 2:
[19:51] I think that they believe that these are the gods.
Speaker 1:
[19:54] Right, and now we have AI. So take OpenAI, for example: the storytelling around that, and the cognitive dissonance of the titans at the helm of these respective AI companies, who are obviously spinning a yarn and doing some very advanced storytelling around how we should think about this, as they accumulate power and participate in this race towards AGI, et cetera.
Speaker 2:
[20:23] Well, what's fascinating about the storytelling around AI is two things. One is that the fear that we're all going to be killed by AI is actually true. Like, I genuinely worry about it, but it's also part of their fundraising. And so, you know, you've got Sam and Elon and all these people out there being like, we're going to die, we need more money to make sure we don't. And the bucket loads of cash come in. It's all for them. It's a fundraising mechanism. And the other thing is, I did a story for Vanity Fair a couple of years ago about AI and creativity and how it was going to replace jobs and so on and so forth. And one person, a very clever thinker on this stuff, said to me, I'm not worried about AI destroying humanity. I'm worried about Sam Altman, or someone else, running an AI company that he will lead to destroy humanity. And the reason for that is because they are so obsessed with being the first person in the story to create AGI that they put all the other things aside, and the goal is about them. It's not about the AI, it's about them as the leader of the AI company. And what's crazy is, as I wrote in Hatching Twitter, this was a white box on a screen that you could type 140 characters into. Look what it did to the world. It's the reason Donald Trump is in office. He ran for office many, many times before. It's the reason we have all the culture we live in today, and that was just a white box with 140 characters. Now there's this new box that we can type into that is way more powerful, and we're all just doing it without thinking about what happened the last time. And all these people don't care, because they want to be the famous inventor, the creator of the last invention.
Speaker 1:
[22:21] There's lip service to safety considerations, et cetera, but as far as I can tell, there doesn't seem to be much evidence that any real effort or deep work is going into ensuring that these things have guardrails on them and that they serve the public interest. It's just this race forward based on fear and lack: we have to do it before China, et cetera. It's important, I'm the one who's in charge, you should give your money to me, you should trust me. We have no reason to trust any of these people. And all the evidence suggests that this is going to damage us in ways that we can't even begin to imagine. And yet it's so convenient and helpful in the short term, and entertaining in a far more advanced way than Twitter's 140 characters, which lulled us into this world of social media that, I think it's pretty clear to everybody right now, has deranged us in ways we couldn't have imagined when you wrote Hatching Twitter.
Speaker 2:
[23:26] It's deranged us in so many ways. And it's, you see it with podcasts, funnily enough.
Speaker 1:
[23:36] Sure, because the incentive structure.
Speaker 2:
[23:38] The incentive structure, like Tucker Carlson, like the guests have gotten crazier and crazier. Megyn Kelly, the anger about, you know, fighting with people that she was once close with, or Ben Shapiro doing the same thing. It's like they're all, they all just come at each other. And then the, you know, the Pod Save America guys, they like go even deeper on the left because it's, the incentive structure is the more insane and intense and scary and the more I pull back the covers on the real thing, the more views I get and the more crazy we get as society. And it just, it just is this, this spiral.
Speaker 1:
[24:17] I think about this all the time. Podcasting has become all about, they're lying to you, or, the number one expert who's gonna tell you the thing no one wants you to know. And unless you're platforming conspiracy-addled people, or the person with the hottest, craziest take, the extremism across the board on all sides of the spectrum, you have no chance at garnering eyeballs and getting attention. And the incentive structure is such that if you want that, then this is what you need to do, and that's where the dollars follow. And as somebody who's been doing this show for a very long time, from the very early days until now, it's like, where do we even sit with this anymore? There's a reason why I do this show. There's a purpose and a meaning behind it that I care about. And I see these other people and these other podcasts that have gotten really huge, and they're influencing the next generation of podcasters. And it has nothing to do with asking yourself, why am I doing this in the first place? What am I hoping to accomplish? It's just, here's how you get big, here's how you get attention. It's like, attention for what aim, for what purpose?
Speaker 2:
[25:33] Couldn't agree with you more. And it's like, you know, I thought, I always thought that the debate about whether Hitler was a good person or a bad person wasn't a debate, that guy.
Speaker 1:
[25:43] You know, this is where we're at. And you know, from 140 characters to this, and people like, well, maybe we should have that conversation.
Speaker 2:
[25:50] Yeah, Joe Rogan saying, oh, that's interesting about Hitler. You're like, what are you talking about? How are these things even a debate? How are they even a conversation? There are way more important things that we should be talking about that are real. And it all comes down to the incentive structure. And this, to me, is where, you know, I do believe that the tech elite are evil, quite honestly. I think that they know exactly what these things do. You know, Facebook knows, there's so much that has leaked about things that they know are bad for kids and for society. And the same with Instagram and Twitter and all these social platforms, and yeah, TikTok. And they do nothing to try to stop it.
Speaker 1:
[26:48] Nothing.
Speaker 2:
[26:49] And to me, look, I don't know if there is a God in the universe, or there's something, but if there is, they're not going through those pearly gates. The things that they have done with intent, and the unintentional things they've then done nothing about. Look at Snap, for example. There was a period of time when the whole point of Snap was that people could have disappearing messages. It was a great intention, right? We want to make it so that you're not posting things on the internet that can come back and haunt you later. And then fast forward to a year ago: the number one cause of death among teens in the United States is fentanyl. You know one of the top places they buy it? Snap. And there had to be a congressional hearing for them to even make any changes. There are always unintended consequences, but our responsibility is to fix them. We never know what they are with technology. And then at the same time, there are consequences we know exist. One example is kids should not be using social media like this. It is just so unhealthy. I don't have it on my phone because I can't stop looking at it, and I'm a grown adult who's written about it for 20-something years. And these companies hide all that data. I just think it's completely and utterly evil.
Speaker 1:
[28:11] As some of you know, I am in a very different season of training than I've ever been in before. I'm rebuilding slowly, intentionally, after this spinal fusion surgery that I underwent this past May. And I'm learning what it means to be patient with my fitness and how to prioritize sustainability over intensity. And I've got to say that Whoop, specifically my new Whoop 4.0 wearable, has been an enormously helpful companion in this process. It's a screenless, wearable health and fitness coach that gives me personalized insights into my sleep, my recovery, my strain, and my overall health, helping me understand what my body is actually ready for on any given day. And that awareness is what is helping me stay focused and consistent, which is essentially everything right now. I do have some meaningful goals ahead. I am very intentional about getting back to pain-free running and hopefully lining up for the New York City Marathon to celebrate my 60th birthday in the fall. And Whoop is helping me make the best decisions, moving me most expeditiously toward those moments with greater results and intention. So I would suggest that you check it out. Go to join.whoop.com/roll for one month free of Whoop. Here is the dilemma. When you choose headphones, you usually have to decide: do you want to be fully immersed in what you're listening to, or do you need to stay aware of what's going on around you? Most earbuds force you into one camp or the other. But Shokz has figured out how to bridge that gap with the new OpenFit Pro. It's their first open-ear headphone with open-ear noise reduction. What does that mean? It means you can actually focus on your podcast or your music without being completely sealed off from the world. If you're running or riding a bike, you get that situational awareness that actually matters for safety. They're super comfortable.
They've got incredible battery life, up to 50 hours with the case. And crucially, the sound is just superior because it's optimized for Dolby Atmos and powered by this tech called Shokz Super Boost that provides really dynamic, distortion-free audio. And for even more options, all Shokz headphones are worth checking out. Visit shokz.com and use code RichRoll to receive an exclusive offer on your purchase. You're somebody who, when you put a book out, I read it right away. I think people would be surprised that I actually don't read that much unless it's a book written by somebody who's coming on the podcast. That monopolizes most of my reading, and it kind of crowds out the opportunity for pleasure reading. But both of your books I read immediately when they came out, and you have this unbelievable talent for novelizing nonfiction and creating these books that read like thrillers. The book on Twitter and the Ross Ulbricht book about the Silk Road were just amazing works. Thank you. And you're somebody who has a lot of irons in the fire. You do all of it. You make documentaries, you produce, you write, you've done scripted television, you do long-form journalism for Vanity Fair. You're even in the process of writing the book and the screenplay for this Martin Scorsese movie with The Rock about the Hawaiian mafia. This has to be perhaps the most high-profile project you've ever worked on.
Speaker 2:
[32:03] Yeah, I mean, I definitely have ADHD, if you haven't noticed, and I've kind of used it as a superpower to do all different kinds of things. There's a theme, which is storytelling, of course. And then there's obviously some subtext to it, which is somewhat technology, but not necessarily technology as in devices, because the mafia project you mentioned is set in the 1970s. But yeah, I just love to tell stories and I love to take on a new challenge. And I don't say that trying to be hyperbolic or anything, but if I've never done something before, to me, that is the most exciting thing, because I'm going to figure out how to do it and I'm going to do it differently, because I'm not taking the traditional route. And so, yeah, I've done journalism and podcasts and movies and docs and scripted TV, all of it. And it's just my ADHD at work.
Speaker 1:
[33:00] I mean, I have ADHD, I can only do one thing at a time. Yeah, like how does that operate for you?
Speaker 2:
[33:06] It's funny, my wife came in the other day into my office and I had two laptops open and I was like working on them. She's like, what are you doing? And I was like, and she took a picture and sent it to a bunch of friends, but I was like, I can work on two projects at once. And she was just like, you're out of your mind.
Speaker 1:
[33:23] Yeah, that's interesting. Mine works in the opposite direction. I can do very well if I'm just put blinders on and focus on one thing. And my wife is somebody who's doing lots of things at the same time and I'm just always marveling at that. Like, I don't understand how that works.
Speaker 2:
[33:37] Do you procrastinate?
Speaker 1:
[33:41] I do, but I can lock into a flow state pretty easily if I'm super into the project that I'm working on and I can lose hours and hours and hours. And I really don't want any interference. Like, I just want to be, like, I want the world to be, to disappear and just to be immersed in whatever I'm working on. But I can't, like, gear shift very easily.
Speaker 2:
[34:02] Yeah, gear shift, I don't procrastinate. Not a procrastinating bone in my body.
Speaker 1:
[34:06] Yeah, you don't seem like somebody who has writer's block, something that ever, how do you think about that?
Speaker 2:
[34:11] No, I mean, well, I think in the age of AI, writer's block doesn't exist anymore. It's gone. There's no longer a blank page, because you've always got something to help you. But yeah, if I find the gears starting to grind, and I'm getting a little bored, I just am like, okay, next project, and move that over and work on that. And one of the things I learned, interestingly, and look, I don't know what's learned and what's innate and what we find along the way, but I remember at the Times, when I was a technology columnist, I would be on deadline sometimes and I'd have to run home or run into the office. And I started writing columns on my phone on the subway or something like that, and then I would finish them. And I just learned that I could write anywhere, you know, it just doesn't matter. I'm just kind of zoned in, and then I pick it up on something else. And I just kind of parlayed that into all the different kinds of writing I do on all the different projects.
Speaker 1:
[35:21] You didn't start out with the idea of becoming a writer, did you? I mean, your origin story into the New York Times is super interesting.
Speaker 2:
[35:28] I think I'm probably one of the least likely people to have ended up as a columnist of the New York Times when I was there. There was a world I was going to end up in jail, quite honestly. I mean, it's literally like I grew up in England. My parents got divorced and I moved to Florida and immediately got in with a lot of bad kids and my dad was off doing his own thing. And there was a world I was going in a completely different direction. And I had a moment where I'd run away from home, I was a teenager and I was working at Jack's Burgers in the mall in Florida. And I feel like this is literally the defining moment of my life, quite honestly, was I walked outside of Jack's Burgers, I was taking the trash out and there was a homeless guy going through the trash and waiting for the next trash bag. And I literally was like, wow, that could be me if I don't pull my shit together.
Speaker 1:
[36:25] And you were, what, 17 or something like that?
Speaker 2:
[36:27] 16, 17 years old. And I walked back inside and I was like, you know what? I'm done. I'm going a different direction. And that was it. And I stopped talking to all the kids I knew in Florida, most of whom are in jail or dead now. And I was like, I got to figure it out. And my GPA was a 1.9 in high school. You have to have a 2.0 to graduate. And I literally was like, I've got four months to get from a 1.9 to a 2.0. I got to a 2.1. So, you know, I did pretty good.
Speaker 1:
[37:00] By the time you graduated. I mean, what was going on at home? What kind of chaos was happening that you were, you know, having such a difficult time?
Speaker 2:
[37:07] It was just total chaos. You know, my parents were divorced. They got married too young. My dad was off dating and, you know, didn't want to deal with me and my sisters. And we were a lot. I was a lot. I was getting in trouble all the time. I got arrested like nine times. It was, you know, not for anything crazy, like stealing and drinking and fighting and, you know, complete nonsense. But still like-
Speaker 1:
[37:31] Was drugs and alcohol part of that, or just truancy and, you know, punk rock against the world and stuff?
Speaker 2:
[37:38] I was lucky that I never got into drugs, and I never have. I just saw what they could do. But I drank a little bit, you know. It wasn't- Right. It was more just, I just probably was looking for attention or direction or something. And, you know, there was a culture shock of moving from England to Florida. And then I remember all these kids that I became friends with, and they were similar. Like, they lived in New York and their parents got divorced or whatever, or they got in trouble in New York and their parents were like, we're moving to Florida. And so you get this pirate ship that forms, and I got on the pirate ship and-
Speaker 1:
[38:23] But you have this moment of reckoning.
Speaker 2:
[38:25] I have this moment of reckoning, which 30 years later is still, or whatever, however many years it is, is still literally fresh in my mind like it happened yesterday. And I was like, I've got to, I've got to figure this out, otherwise I'm, I'm screwed. And I mean, there are kids that, they're men now, but who I was very close with, I think like eight of them are dead. One of them is on death row for murder. A bunch of them, you know, just one got, one was arrested for bank robbery with his mom. Like these were the people I was friends with.
Speaker 1:
[39:04] It's a crazy sliding doors story. I mean, because your success is so insane in comparison to where you could have been. This sort of inciting incident led to a real transformation as a result of just you coming to it yourself, not because you got locked up. It wasn't anything all that dramatic, right? It was just like a mindset switch.
Speaker 2:
[39:27] Well, what I think is fascinating about it is two things. One is I think we all have an imposter syndrome, and I've always been of the mentality that the people who do have it are the ones that work harder and appreciate it more and so on. The ones that feel like, oh, I deserve to be here, they often don't work harder. They don't push. They just are like, this is where I'm meant to be. So every time I've ended up in all these different roles, as a director on a documentary for HBO, writing a movie for Martin Scorsese, as a columnist for The New York Times, writing stories that are breaking, you know, changing laws or leading to congressional hearings, things like that, every time I'm like, whoa, how did I end up here? And the mentality I always had was someone's gonna tap me on the shoulder and be like, you've gotta go. And so I've got to do all this stuff that I want to do until that happens. And it's always been this kind of driving force. But the other thing is, when I think about that homeless guy, that homeless guy is probably dead, had no idea that he impacted someone else's life the way he did. And it always makes me think that every single solitary thing we do, we just don't realize, but it's all filtering out.
Speaker 1:
[40:46] It has meaning.
Speaker 2:
[40:47] Yeah, it has meaning. It has purpose. And whether it's up to us or somebody else, it does.
Speaker 1:
[40:53] So you get this 2.0.
Speaker 2:
[40:55] I get 2.1.
Speaker 1:
[40:56] 2.1.
Speaker 2:
[40:56] What do you do with that? So the only school I could get into was an art school, a school of visual arts in Savannah, Georgia. They had a campus. And I mean, I literally couldn't write very well, because I hadn't really studied in school. So I made a comic book of my life story, and that was my college application. And they felt it was creative enough. That was something I always had, I was always very creative. I was lucky to have that. And so it got me into art school. I spent a year in Georgia, and then I transferred to the New York campus. And then the ADHD kicked in, and I kept switching majors. I wanted to be a fine artist and learn how to paint photorealistically. And that took like six months, and I was like, oh, am I just gonna do this for the rest of my life? And then I found graphic design, and I was like, oh my God, this is amazing. And then I started reading a lot. I fell in love with books. My mom always read. I remember when I was a kid, if she was blow-drying her hair, she'd have a book on her lap. And I became a voracious reader, but I still never wanted to write. I ended up as an art director. I designed the very first Britney Spears doll. I just bounced around to all these different things and eventually became an art director at the New York Press and then the New York Times. And the goal that I had, I'd read all these war photography books and these war books, and I was like, I'm gonna be a war photographer. That's it. So that was my goal in getting to the New York Times. So I get to the New York Times, and there are two things that happened. The first thing is I end up in the business section as the art director, but because I knew tech so well and I'd always understood tech, it was just this thing that I could understand.
I was always suggesting story ideas and, like, arguing with David Pogue that his thesis was wrong or whatever. And I also became good friends with a lot of the editors there because I would help them fix their iPhones and their iPads and stuff, because they didn't know how to. And at the same time, I'm at the Times and my goal is, I have this portfolio and I want to get it to the photo editor, this woman Michele McNally, and then I'm going to go off and pursue my dream of being a war photographer. I was interning for the printer for James Nachtwey. Do you know who James Nachtwey is? He's the most unbelievable war photographer of all time. His photos are just breathtaking. He had all these printers, and I interned for one of them. And this was it. This was the goal. So one day I finally get Michele McNally to go out for lunch with me. And I sit there and I give her my portfolio, this big orange book, and she opens it up. And she flips through, she doesn't say a word. And I'm watching, and she closes the book and she goes, you're a good photographer. And she's like, you know, you'd probably be a good war photographer. She goes, but I'm not hiring you to be a war photographer. And I was like, why not? And she goes, cause you're not fucked up enough. And I was like, what do you mean? She goes, all of these guys and these women that do this, she's like, they're adrenaline junkies. Most of them are alcoholics or drug addicts. They need the adrenaline. They can't live in normal society. And she's like, you know, some of them have beautiful houses and they sleep on the floor, because that's where they're comfortable. And after that, I was like, okay, well, what's next? And that was when I became a writer.
Speaker 1:
[44:30] What's interesting about that is you could have then lobbied her by, you know, making your case for how chaotic your upbringing was and how fucked up you truly are.
Speaker 2:
[44:41] No, I'm up.
Speaker 1:
[44:41] And then maybe you would have been hired. You're like, you don't know, I had a one-point-whatever GPA and, you know, got arrested nine times as a kid. I'm plenty fucked up.
Speaker 2:
[44:50] Yeah, I think I'm the kind of person who goes off my intuition, and my intuition told me, maybe she's right. Maybe she's right. Yeah.
Speaker 1:
[45:07] And then you kind of exploit this white space because this is, what is this, the early 2000s when kind of the online aspect of journalism is brand new.
Speaker 2:
[45:17] Yeah, so it's the early 2000s. At the New York Times, we had a research lab up on the 27th floor, I believe. It was like five, six people, and the goal was to build prototypes for what the future of journalism might look like, to then inform the newsroom and so on. And so it was 2005 or 2006. No, sorry, it was a little later, it was around 2009. And we had an idea that there was going to be something like the iPad coming out. So we took a screen apart and made it look like a touchscreen that you could interact with the news on, and so on and so forth. And I started doing a lot of public speaking about the future of media and things. And again, with imposter syndrome, being like, why'd they invite me? And I remember I got a job offer to go work at Google in the Google News group. And I went out for lunch with the editor of the business section at the Times, who I'd become close with, and I told him, oh, I'm going to leave. And he said, it's a shame we can't keep people like you. And he's like, all of the people at the paper, they're all brilliant and they're geniuses, but none of them are interested in tech. All they want to do is write for the print paper. And I said, what are you guys doing with that tech blog, the Bits blog, by the way? And he said, well, no one wants to write for it. We get one post a month. And I said, and I don't even know why I said it, it was literally like a puppeteer came and made my mouth work, I was like, I would do it. And he said, really, would you really be interested in doing it? And I was like, sure. And he goes, well, why don't you send me a proposal of what you would do? So I sent it to him thinking it was just gonna go to him. He sent it to the entire tech department. And I kind of pissed them off because I'd said how bad their coverage was on the blog. And the deal, they said, let's try it for a month. And that's how it all started.
I was 33 years old. It was the first time I'd written a professional word in the pages of The New York Times.
Speaker 1:
[47:23] It's so unusual. What an unlikely story for The New York Times to basically green light this guy who actually was not a writer to be a writer for the most prestigious publication in the world simply because nobody else wanted to do it. And it was sort of treated like this bastard stepchild, which is all the more insane when you reflect on how tech obsessed we are. Like people just can't get enough of tech journalism.
Speaker 2:
[47:49] I think it was that, you know, I remember this really funny moment when I first became a reporter. At the Times, you go to this onboarding. Even if you've been an employee in another department, you still go through it. So I was now starting officially as a reporter, and you go to someone on the masthead's house for, like, a dinner, and you meet all the people who have been there their entire careers and lives, and you get to ask some questions. But everyone goes around the room and they kind of introduce themselves, and it's, you know, I'm Bob so-and-so, and I graduated magna cum laude from Harvard. And it got to me. It's like, what do I say? I got kicked out of art school. And I think what it was is, there was no scenario, five years in the future or two years in the past, where they would have ever hired me. But they were in a situation where they couldn't hire the people they wanted, you know, those people were going off and starting their own blogs. And the other people were like, I only want to be on the front page of the New York Times. I don't care about the internet. And so I just found this sliver of a doorway that I could squeeze through by accident. And then it turned out I was good at it. I knew how to do it for some reason.
Speaker 1:
[49:17] And that was it. So when people come to you, or young people come to you, and ask for career advice, how do I get into this? How do you think about how you were able to get those third doors open and create opportunities for yourself, in a way that's translatable for the younger generation thinking about a career path similar to yours?
Speaker 2:
[49:40] Well, I think there are two things. One is I would say you have to be somewhat fearless and willing to make mistakes. Like, I truly do not, I mean, this is honestly my superpower: I do not give a shit what people think of me. Like, I care. I'm a nice person. I want them to know that I'm a good person. But if they don't like my writing, I don't care. If they don't like that story, I don't care. Like, I'm just my-
Speaker 1:
[50:10] How do you square that with the imposter syndrome?
Speaker 2:
[50:12] The imposter syndrome is I'm not supposed to be here. The I-don't-care is a different part of it. Like, the imposter syndrome is someone's gonna tap me on the shoulder and say, it's time for you to leave, you're in the wrong building. And the I-don't-care is, there'll be another column next week-
Speaker 1:
[50:32] I'll find another building or-
Speaker 2:
[50:33] Yeah, or it's more, like, you know, I watch people who want to be writers and they have one idea and that's it. And they think, like, that's the idea. That is never gonna happen, you know? Everyone I know who is a successful writer in Hollywood has 50 things they're working on at once. You know, 50 different ideas, and one of them, if they're very, very, very lucky, will get made. And I think that, as a journalist, or even an author, you can't obsess that it's perfect. There's no such thing as perfect. There's 90% done, and you can spend the rest of your life on the next 10%. And for me, I'm okay with 90%. Like, okay, let's get it out and we'll do the next one. We'll do the next one. We'll do the next one. And my goal is to just get better and learn more and put these things out there in the world. And I think that they're two very different things. Does that make sense?
Speaker 1:
[51:35] Yeah, I understand. So not being so precious about your work, holding it loosely, having a healthy relationship with expectations around, like, what's going to happen with it or how people receive it. And I suspect that that indoctrination had a lot to do with the fact that in your early columnist days, you were churning out so many articles, you just had to do it and move on and not get too caught up in, you know, turning these things into jewel boxes.
Speaker 2:
[52:06] Yeah, that's exactly right. I remember there was an editor who told me at one point when I became a columnist, he said, there is a column of newsprint every Thursday that is going to have your name on it and the 1,200 words you write. And if you don't file it, you won't have a job. And that was it. So I literally had to deliver, and it wasn't just that, I was writing stories and so on and so forth. And so I did the math at some point, it was well over a million and a half words I'd written. And I barely remember any of it. Like, I remember a few stories here and there, or some themes and things like that, but you just gotta write it and publish it and move on to the next one. And sometimes you get beat up for it, and sometimes people love it, and that's just it. And the advice I always give to people is two things. One is you're gonna get a better education in writing from reading than you are from someone explaining the structure of Hemingway's first opening page and his repetition of words and all these things like that. You're gonna get a better education from reading and understanding, and that's literally how I've learned how to write in every single form. And people like Cormac McCarthy said the same thing, that writing is reading and iterating on what people have done before you. And so my books read like novels, even though they're narrative non-fiction, because I love novels. That's all I read is novels. And so I want to tell a story. And then they also read a little like movies, because I love reading screenplays and writing screenplays. But that's all. I didn't go to school for writing. I literally got kicked out of art school.
Speaker 1:
[54:04] But you have a visual mind. I have a very visual mind, and when I'm reading your books, it's like I just see the movie, which made me think, how come these books haven't been turned into movies yet? They must have gone through 20 cycles of development at this point. But how come they're not up on the screen yet?
Speaker 2:
[54:21] Well, it's a longer conversation about Hollywood and how broken that industry is. But yeah, I actually just learned something. I have two boys that are nine and ten, and one of them has really severe dyslexia. My wife was helping him with some of the phonetics of reading, and I realized I can't do phonetics. When she was saying things, I was like, oh, that's how you do it. And what I realized is I also have some form of dyslexia. I don't imagine words in my head. I can't picture a word, but I can picture an image and then describe the image. That's just the way my brain works, and so that's how I write. I imagine the movie and then I tell you what's happening in the movie, or I imagine the scene. It's all visual, and then it's put into language.
Speaker 1:
[55:19] As a writer, your talent is really storytelling, like this reverence for how to tell a story well, which is reflected in the books that you write, but also is mirrored in these tech moguls that you've done these deep dive profiles on and their relationship with storytelling. So I want to spend a few minutes talking about how you think about storytelling and its importance and how we should all be thinking a little bit more in depth about storytelling and how it operates in our own lives.
Speaker 2:
[55:54] Well, I think everything we do is a story, right? I'm telling you a story. You're telling me a story based on what you're wearing, what I drive, where I live, the way I talk to people. Everything we do is a story, and there are different ways to approach it. In different mediums, you tell the story differently. One of the things I find fascinating about nonfiction versus fiction, even if it's a short story or a news article, is that in a news article you have what's called the lede, which is your way into the story. Then you have the nut graf, the second paragraph, which tells you what the whole story is about. And at the end, you have what's called the kicker, which is the best part of the story, right? It's the part you leave with. In fiction, it's completely the other way around. You start with the best part of your story, and then you kind of go through it, and at the very end you tell us what it's about, and so on. And I think with all the different kinds of storytelling, you have to approach it differently. One of the things I've found the most challenging as a writer is screenwriting. I think it's the hardest form of writing there is. There's no more difficult form, because you can't use exposition. Every scene has to ask a question that another scene answers, and then ask another question, and so on and so forth. Every character is trying to get in the others' way. Every voice has to sound distinct and different. There are all these rules, and you're showing, you can never tell. You can never tell someone's interior and what they're thinking. And so I find it all very fascinating how the different forms of storytelling work for our brains to be able to understand them. And that, to me, is one of the most fun parts of moving between all these different projects.
Speaker 1:
[57:47] Yeah, in screenwriting, every line of dialogue, every setup, every kind of slug line has to reveal something about character and advance the story and illustrate the themes. And you have to do it with such incredible economy. Like it has to be distilled down to its very essence.
Speaker 2:
[58:08] Yeah, so The Godfather is one of the best examples, because the theme is about family and the characters are all very, very different. And what's so incredible about The Godfather, and why it's cited as one of the best movies, is you've got the Godfather, right? And then you've got his four kids. And each kid is a different facet of the Godfather's personality: one's a goofball, another is holding it together, another is cocky and thinks he's in charge, and there's the love, and so on and so forth. And each character drives the story forward. So it's all those things that are happening. And if you have one single line of dialogue that isn't right, your entire feeling about these characters changes. And everyone has to have agency, and there are obstacles. It's a really, really challenging form of writing. And I think it's why there are so few good movies in Hollywood, quite honestly.
Speaker 1:
[59:13] Fewer and fewer.
Speaker 2:
[59:14] Fewer and fewer. The other thing I would say is, I don't think there are a lot of great writers out there. And that's not a diss on society or people or whatever. It is just such a difficult thing to do really, really, really well that there are only certain brains that can do it. Cormac McCarthy, for example.
Speaker 1:
[59:40] You couldn't teach what that guy does.
Speaker 2:
[59:41] You cannot teach it. And when you read the few interviews that he did, he always said, I don't know where it comes from. And I think some of the greatest artists say this. Chris Martin had this great line in a documentary where he said, this song came to me from wherever songs come from. And Cormac McCarthy was like, there's something in my subconscious that told me to do this. And I think the people who are the greatest artists are the ones that are most in tune with that. And it doesn't mean the people that aren't shouldn't write, they should do the things they want to do. But the greats, there are so few of them.
Speaker 1:
[60:25] Getting out of their own way, opening this channel to the subconscious, and kind of sitting in this space of allowing it to show up and flow from hand to page.
Speaker 2:
[60:38] Yeah, Rick Rubin talks about that specifically. And I still don't know if I believe there's a point to all this, or we're just some little accident or a simulation or whatever, but I do believe that there is something in the universe that makes these things, these stories, happen. Whether it's a collective consciousness that, you know, some people have a little pinprick doorway into, I do truly believe that that is really where a lot of this art comes from.
Speaker 1:
[61:14] Well, despite the fact that movies seem to be not so good these days, we're never going to reach a point where we lose our appetite for great storytelling. But it is an interesting cultural moment in that we're kind of in this period of time in which we've lost our reverence for the great novel. This is something I talked about with James Frey when I had him on here. I had Bruce Wagner, do you know Bruce Wagner? I had him on, he writes these incredible transgressive novels that are really just fantastic. And it's like nobody's reading these books, you know? We're not in that era, and you and I are probably around the same age, when these people were like rock stars, you know, and everybody couldn't wait for their next book. That doesn't seem to be the case anymore.
Speaker 2:
[62:07] I think there are a few things that have happened as far as books. Funnily enough, the book industry has not shrunk to the degree that people believe it has. It's actually grown in some years. And there are a couple of reasons. One is that audiobooks have opened up a whole new genre of reader. And most of those, you know, it's the werewolves and the vampires and all that stuff, you get these kind of subcultures that have risen up as a result of the ability to write books like this. And the other thing that's happened, I mean, I think one of the worst things that's happened, is actually BookTok on TikTok, because I think they're driving a lot of the sales to some of the worst writing out there. And people will call me an asshole for saying this, and they have before, but it really saddens me when-
Speaker 1:
[63:04] Meaning like just sort of low-rent genre fiction.
Speaker 2:
[63:07] Look, we all love a good crime novel or a romance novel, whatever. But I believe with every ounce of my being that the reason we are supposed to write these stories is to make people think. As storytellers, our job is to hold a mirror to society through a story. Ayn Rand said this. She said that to her, a novel is a way to make society think about things. I think she pushed people to try to think in a certain way, but to me, that is the whole point of what we're doing.
Speaker 1:
[63:45] Sure, to elucidate some truth about human nature or the world to help us make sense of why we're here.
Speaker 2:
[63:53] It's why we consume these stories, it's why we want the emotion. Whenever you're pitching a story in Hollywood, people are always like, what's the character's emotional drive? At first, you hear that and you're like, what are you talking about? It's set in space, it's great. Then you realize it could be set anywhere, it doesn't matter. It's about the characters and how we relate to the characters. Which is why there's never a great movie about billionaires, because no one can relate to being a billionaire, or most of us can't. I think what's been great is there are more readers today who consume in different forms, but at the same time, the sad part is that the greats are not read like they once were.
Speaker 1:
[64:42] There isn't that monoculture moment where some genius just drops some work that lands like a thud and rocks everybody.
Speaker 2:
[64:51] Correct.
Speaker 1:
[64:52] But I still think that there is something perennial about writing books, that it has withstood this gestalt where everything we consume was sort of uploaded in the last 24 hours. Like, there's a staying power. If you write a great book, it can make an impact in a way that other forms of media still can't.
Speaker 2:
[65:18] I completely agree. I wish that people would put down their phones a little bit more, stop doomscrolling, and just read a book. I mean, I'm a voracious reader. At night, I put my phone aside and I pull out my Kindle, and I try to go through a book a week at least, if I can. I read a lot of older stuff, mostly from the 1950s, '60s, '70s, a lot of old sci-fi, because I just feel like that was the height of it. But I find it so much more rewarding than even most TV these days.
Speaker 1:
[66:02] This episode is sponsored by Rivian. When I think back on some of my fondest memories from childhood, 100 percent of them happened outdoors, on mountains, in lakes and oceans, getting muddy in the local creek, riding my bike around the neighborhood. Basic good stuff that leaves me thinking a lot about what kind of world we're leaving behind for the next generation. And this, in a nutshell, is what Rivian is all about. They're an all-electric vehicle company founded on a simple idea, keep the world adventurous forever. I've been around RJ, the CEO, and his kids, and it's so clear to me that this is his animating purpose. But he's not just thinking about them, he's making decisions based upon what our kids' kids' kids will inherit, which I love. And that philosophy is just deeply embedded in everything Rivian builds. These are zero-tailpipe emission vehicles. Without sacrificing power or performance, the interiors use thoughtful, sustainable materials that feel premium and intentional. And the first 10,000 miles are powered by 100% renewable energy with a growing charging network doing the same. It's not about choosing between exploring the world and protecting it. Rivian is like a passport to both. Meaning that when I'm driving the vehicle Rivian loaned to me, I'm not just driving through the world I love, I'm driving for it. Which is a pretty special feeling I want everyone to experience. One of the things I hear constantly from people in this community is, I have an idea, but I don't know where to start. To which I say, whatever you imagine is holding you back, exists only in your imagination. Let's say you have an idea to start a coaching practice, perhaps a creative project, a business or a course, whatever it is, a great way to turn imagination into reality is by giving it a home online. That is where Squarespace comes in. 
Squarespace is the all-in-one website platform designed to help you stand out and succeed on the world wide web we call the internet. You can claim your domain, build a beautiful site and run your business all in one place. It's actually insane how easy it is, especially when I think about my past experiences working with designers for months at great cost. With Squarespace, you can start with their AI design partner Blueprint or choose from a library of award-winning super stylish templates and customize everything with simple drag and drop editing. No design skills, no coding required. And if you're offering services, consultations, coaching, events, Squarespace has built-in tools for scheduling appointments, sending invoices, and collecting payments. It's everything you need to turn an idea into something real. Head to squarespace.com/richroll for a free trial, and when you're ready to launch, use offer code Rich Roll to save 10 percent off your first purchase of a website or domain. We have this fascination with these tech titans. We can't get enough of interviews with them. And there is a weird-
Speaker 2:
[69:19] We hope we're gonna learn something.
Speaker 1:
[69:20] Yeah, it's like this, oh, maybe I could be that person someday, or what would it be like to be a billionaire? This lurid kind of fascination with how these people live their lives lures us into kind of a romantic relationship with them, while also, that's the cognitive dissonance piece. We know that Sam Altman is most likely steering us off a cliff, and yet, you know, we're like, what's Sam Altman doing? What's he up to? Is he a genius? What's happening? And then we're using OpenAI every single day. It's insane. And the history of humanity is to just barrel forward and break things and deal with the repercussions later. But the repercussions in this context are so exponentially beyond anything that our species has ever faced, and yet we're still really not course-correcting for this.
Speaker 2:
[70:11] Well, I think it's the first technology in human history that can wipe out human history. I don't believe the nukes could have done that. Maybe there's a world where they could have, but all of the studies that I've read, all the research, State Department reports, and so on and so forth, say you only need 150 people to survive for society to flourish. Society can come back from that. And the predictions were always that we'd still have a billion people on the planet. Maybe Antarctica looks like Los Angeles, but the planet still survives most of that. And the same with chemical weapons. They tried them in World War I, and the reason it didn't work out was the wind, because the wind would blow the chemical weapons back in the Germans' faces, and so that didn't work. AI is, in my opinion, the first technology that could literally wipe us off the face of the planet. And I think that where we've been very lucky until now is that whenever something goes wrong, it is not catastrophic. Hundreds of thousands of people died in Hiroshima and Nagasaki, and then we were like, wow, nuclear bombs are really dangerous. We should try to make sure that doesn't happen again. And the same with other technologies and so on and so forth. The question with AI is, will it be too late once we realize, oh, that was a bad idea?
Speaker 1:
[71:44] What's your suspicion on that?
Speaker 2:
[71:47] My suspicion is that if you look at technology in terms of warfare, every new technology we build is one that can be used for both good and bad, right? In the early days, when we were living in caves and stuff like that, when we realized we could kill something with a rock, we could kill one person, maybe a few, and then you were probably going to get killed with someone else's rock. Then guns come along and you can kill more people, maybe a few with the ones that you stick the bullet in in the old days. Then you get to machine guns, and you can now kill hundreds, and so on and so forth. Each new technology allows us to kill more and more. So now we get to nukes, we're at hundreds of thousands. I believe AI is in the billions, if used correctly by someone nefarious. And so the questions are: what is that number, and who is the person that ends up using it? Because what's happening with technology, and this is why I say people don't think small enough, is that fewer and fewer people can create the thing that can destroy the world, and that group is even smaller than it was before. And so my prediction is that we will have an instance where it is used in a catastrophic way. And the question is, does it kill 100,000 people or does it kill 5 billion people? After that, there will likely be safeguards put in place, but the question is how many people die in the process.
Speaker 1:
[73:37] And what is your sense of how it would kill people? Like, is it just creating a meltdown at the local nuclear facility? Or like, what is the means by which people are going to die as a result of it?
Speaker 2:
[73:50] Well, if you ask me to put my screenwriter hat on for a second, like, yeah, that's the classic, like, you know, the power gets shut off. There's a state department report that says if the power gets shut off in America, 95% of society is dead within a year. A lot more people, you know, hundreds of thousands are dead in the first few weeks. Like, you cut your finger, there's no medicine that's coming to you. The water gets dirty, you die of, you know, diphtheria. There's all these different things that happen.
Speaker 1:
[74:20] Literally just turning off the power grid.
Speaker 2:
[74:21] Yeah, just turning off the power grid. Literally nothing else, super easy. There's that great saying that there are nine hot meals between society and anarchy. I think we're down to about four hot meals at this point. I don't think we'd make it to nine.
Speaker 1:
[74:42] We're on a hair trigger. You know, after COVID, everybody's ready to pounce. I don't think it would take that many days. And we've learned that there will be no comity among men. We will just immediately pivot to antagonism.
Speaker 2:
[74:58] Completely, 1,000%. And we never think it all through. I was working on an apocalyptic movie right before COVID, so I was doing lots of research into all the things that could go wrong. And I found out about COVID in China before it was a mainstream thing. I was like, oh my God, this is going to be really bad, we have to get food. And I stocked the basement. My wife and my sister, everyone thought I was out of my mind.
Speaker 1:
[75:27] Like pre-NBA suspending its games.
Speaker 2:
[75:30] Way, way before. Like two months before. And it was just luck. It wasn't like I was some foresightful genius. It was just literally, oh my God, that could be really bad. And I'd been living in my own head about all the things that could go wrong in society. But you know what I didn't do? I didn't get toilet paper. I didn't get an extra can opener. I didn't get masks, or extra bottled water and whatnot, because I didn't know. So you can't even plan for the worst-case scenario. I had a lot of chips and pasta. But I think the other thing is, there are things that an AI could do today that could kill billions of people. Like, for example, have you heard the stories about the bank robberies where they fake the senior manager's voice and they call to do a transfer? Have you heard about these?
Speaker 1:
[76:23] I haven't heard about this. I've heard versions of this.
Speaker 2:
[76:25] I think it was somewhere in Europe, Italy or Sweden or something like that. There was a transfer that was done because they faked the boss's voice using AI and told him to transfer 50 million dollars, or whatever it was, to another account. So imagine that you have an AI that says, okay, we are going to poison the food supply or poison the water, and it calls, oh, can you ship this there, or it has them change some numbers or something like that. There are so many scenarios where, just using a social engineering hack from an AI, you could do things. There's all the drone warfare stuff. There's a million things that could go wrong, and we can't think of them all. So how do we plan for them?
Speaker 1:
[77:23] Yeah, I have no idea. I have no idea. But on the subject of AI, screenwriting, and pandemics, the inciting incident for me reaching out to you, and it took us a while to get our schedules aligned for you to come here, was on the heels of me listening to Scott Z. Burns' Audible series. I guess it was a podcast, a limited series, sort of like an audiobook or an audio documentary, in which Scott Z. Burns, legendary screenwriter, responds to this seeming desire out in Hollywood and in the world to come up with a sequel to the movie Contagion. Contagion had this second life during COVID because there was so much fidelity in that movie to what we all experienced. I remember watching it at the very beginning of the pandemic, and I've now watched it again since then. And I was like, this movie is incredible. This is a great documentary. It's just unbelievable. And everybody wants a sequel to it. Scott's like, yeah, but I just can't see any compelling reason to create a sequel to this. I can't think of a premise that would be worthy of the time and investment that it would take to create a movie. And he goes on this journey thinking, well, what if I use AI to come up with a valid premise? And it's this really engrossing kind of deep dive into what AI can do in terms of creative storytelling. Steven Soderbergh is part of it. And there's one point where Scott reaches out to you, and you're sort of the AI optimist in this equation. So explain your role, and how you use AI as a writer.
Speaker 2:
[79:09] Well, I will say that I consider myself a technology realist. I can see the good and the bad in all of it, and I think every technology does have good and bad. Some technologies have more bad than good. Social media, for example, is one of those. Cars are great, but 1.2 million people die every year, and we're still driving cars to this day. Nuclear bombs, nuclear power, you can just go through all of them, and there's always a good and a bad. So I do see the good and the bad. As far as AI goes, you know, massive job loss, the potential end of humanity.
Speaker 1:
[79:54] What? It's like we're actually having this conversation.
Speaker 2:
[79:57] But just put that aside for a second. However, I also think it allows us to tell better stories quicker, to think about new ways of telling stories, and enables people to tell more stories. I believe Hollywood is about to enter its MP3 moment, where anyone with a computer or an iPad or whatever it is will be able to make a movie that looks like Mission Impossible in their bedroom. There will be good parts of that, that we can all tell those stories, and there will be bad parts of that. And to me, you know, I had a writers room at Netflix for a show I was doing right when ChatGPT came out, and after the room was done, I was like, oh, could I make my own writers room? And I've done that, and I do that for my own projects. And it's not like I've replaced people's jobs, because I wouldn't be able to hire my own writers room for my own projects, but I think there are incredible benefits to it. I use it, I mean, I tried to count the other day, I think I use it like 5,000 times a day. I have agents that are helping me write screenplays and fact-checking books and all these different things at the same time. And I needed a website. I was doing some work on a script, and whenever you use Claude and things like that, the quotes are always straight quotes, and they should be curly quotes. I couldn't find a website that would curl them that didn't have ads, so I just used Replit and I made my own website, curlmyquotes.com. I couldn't have done that before. I think that it enables you to do more in less time and tell better stories.
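The straight-to-curly-quote conversion he describes is a small, well-defined text transformation. Here is a rough sketch of the kind of context rules a tool like curlmyquotes.com might apply; this is an assumption about the approach, not the site's actual implementation:

```python
import re

def curl_quotes(text: str) -> str:
    """Convert straight quotes to typographic (curly) quotes using simple context rules."""
    # Apostrophes inside words: don't -> don't with a right single quote
    text = re.sub(r"(\w)'(\w)", "\\1\u2019\\2", text)
    # Quotes directly after start-of-string, whitespace, or an opening bracket open a quotation
    text = re.sub(r'(^|[\s(\[])"', "\\1\u201c", text)
    text = re.sub(r"(^|[\s(\[])'", "\\1\u2018", text)
    # Anything left over is treated as a closing quote
    return text.replace('"', "\u201d").replace("'", "\u2019")

print(curl_quotes('"Don\'t stop," she said.'))
```

Real typographic conversion has more edge cases (leading apostrophes like 'tis, nested quotes, primes for measurements), which is part of why a dedicated tool is handy.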
Speaker 1:
[81:48] So in the audio documentary, which I think is called What Could Go Wrong, you enter the picture and you counsel Scott on how he can create his own writers room, essentially, by giving birth to a bespoke series of AI bots, each of which has its own personality, its own backstory, its own degree of expertise. So Scott creates this writers room. There's a virologist and there's a conspiracy theorist, and he's got his studio head and his agent, and these people are all in communication with each other. There's a whole ecosystem that gets created. And this is something that you've done, as you alluded to in what you just shared, which is absolutely fascinating. I'd never thought of AI in that context. I use it as a research tool, but as somebody who's writing a book right now, I'm very reluctant to share my actual prose with an AI. I want to write my book, I want it to be a human-created book. I want to take advantage of AI as a research tool, but I don't ever want to be accused of AI having its fingerprints on anything that I've written myself.
Speaker 2:
[83:06] So I'm not going on to Claude and saying, write this book for me, because it couldn't do it, it just couldn't. But I do go on to Claude. One example of the way I use AI is, I just finished this book with Dwayne Johnson on The Company, the Mafia in Hawaii, and we have 5.5 million words of research for the book. Newspaper articles from the 1940s all the way to the 1980s that were all taken from slides.
Speaker 1:
Like microfiche.
Speaker 2:
[83:39] Microfiche and so on and so forth. I have a researcher whose name is Nick, we call him Nick 2.0. We went to the National Archives, we got all the court documents, boxes that had never been opened in 50 years, tens of thousands of pages of court documents. When I wrote American Kingpin, there was no AI. So what I would do is, I had an Excel spreadsheet that had all the dates and the times and everything, but I would have to remember what to search for. Like, oh, he was wearing orange, so I'd search for the color, and then I would piece it all together. What I did with this book, and what I do with the screenplays I'm working on that are all based on some sort of reality, is I use Cursor, which most people use for programming; I use it for both programming and writing. And I have my own agents that are specifically designed for what I do. And then I can have them take the microfiche, turn it into text, and so on and so forth. And then I can ask questions like, oh, what were some of the interesting things that happened during the trial? Tell me the story about the murder of Monty and Fuzzy. And it gets all those things, and then I can go read those parts of the transcripts, but I don't have to go through 5.5 million words of research. And the other thing I do is while I'm writing. Do you know what a TK is? In journalism.
Speaker 1:
[85:05] Oh, like you'll get to it later, you move on.
Speaker 2:
[85:07] Yeah, so in journalism, whenever you're writing on deadline and you need to fill something in later, you write TK. Two letters next to each other. And it's the greatest thing I've ever learned in my life, because there's basically no word in the English language that has a T and a K next to each other. So right before you go to press, you can search for TK, and if it's a fact like the number of floors of a building or the guy's job title or whatever, you fill those in. Usually I would go and do find-and-replace and do the research and so on. So what I do now is, as I'm writing, I'll write, this guy's walking down the street on TK street and he runs into TK guy, and so on. I write it and it's all in my style. And then I give it to one of my agents and I say, go fill in the TKs from all the research. And it just goes and does it. That alone has saved me a week's worth of work. So I'm still doing the writing, but it's filling in the little details. Or I'll remember from my reading the dialogue between two characters, and I'll write it, and then I'll say, go check it and make sure it's right, and fix it if I made a mistake. And it does the same with the screenplay. I'm writing the screenplay, but I'll say, and this is one way a lot of people I know use it, this paragraph is too much exposition, give me 10 versions of how to take the exposition out. It gives them to me, and then I rewrite it again, and that's it. So it's just helping me think. It's not writing the book. It's not writing the screenplay. It's just helping me come up with things. And then at the end of it, one thing I did with the recent book was I uploaded the whole book and I said, are there any characters that need closing out? Are there any moments that don't flow? Give me a full critique on it.
And it gave me a bunch of notes and I was like, great. And I went back and fixed them.
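The TK convention he describes is easy to mechanize: because the letter pair almost never occurs inside English words, a simple pre-press scan can flag every unresolved placeholder. A minimal sketch of such a check; the bare-TK tagging style and the sample draft are assumptions for illustration:

```python
import re

def find_tks(draft: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs for every line still containing a bare TK placeholder."""
    return [
        (i, line)
        for i, line in enumerate(draft.splitlines(), start=1)
        # \b word boundaries keep words like "catkin" from matching
        if re.search(r"\bTK\b", line)
    ]

draft = (
    "He walked down TK street at dawn.\n"
    "The tower had 47 floors.\n"
    "She worked as a TK at the bank."
)
for lineno, line in find_tks(draft):
    print(lineno, line)  # flags lines 1 and 3
```

Handing the flagged lines plus the research corpus to an agent, as he describes, then turns each hit into a targeted lookup rather than a manual find-and-replace pass.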
Speaker 1:
[87:03] Can you create bots for the characters, like the Dwayne Johnson character? And then test it, asking: is this something this person would say, based upon all of these 5,000, you know, pages of research that you have fed it?
Speaker 2:
[87:16] Well, so there's a couple of things I did that are like that. So one thing I did was I had an agent, and they're super easy to make, they're really not hard. You can just go to Claude and say, tell me how to make an agent that does X, or skills or whatever. It's not that hard. But one thing I did that was really fascinating recently was, I was writing about these mobsters specifically, and there were some parts I just didn't understand from a mentality standpoint. So I had an agent that created the characters, and then I could interview the characters. They're dead.
Speaker 1:
[87:56] Yeah.
Speaker 2:
[87:56] So I can't like-
Speaker 1:
[87:57] That's fucking wild.
Speaker 2:
[87:58] So I'm like, what's it like to bury someone alive? What does that feel like? And it's looking at the information I have, the interviews I've done, and so on and so forth. And then it's also looking at research online and everything. And so I'm literally having a conversation with a dead person. Another example: part of the book and the movie takes place during World War II. There's this amazing opening of the book, I'm not going to give it away, but it's a very visual moment during Pearl Harbor. I wasn't at Pearl Harbor, right? I could go read a few books and so on. Instead I said, I want you to be a person who was there at Pearl Harbor on the day the bombs dropped. I'm going to ask you questions: what does it smell like? What does it look like? What are you hearing? Where are the planes coming from? All these questions. And it gives it to me. And then I go write based on that quote-unquote interview with the robot.
Speaker 1:
[88:58] That's wild. It isn't a binary thing. I mean, that's incredibly helpful and powerful. And in the Scott Z. Burns audio documentary, it's like the premise that AI comes up with for this Contagion 2 is a fucking banger. Yeah, it's great. I would watch this movie and he was not coming up with that on his own.
Speaker 2:
[89:17] But I think the misnomer is this: it's that AI will be able to write the next Cormac McCarthy book. I do believe that is going to happen.
Speaker 1:
[89:31] You do.
Speaker 2:
[89:32] I do believe that. I think that we are a long...
Speaker 1:
[89:34] That's terrifying to me.
Speaker 2:
[89:35] Which is terrifying. I think we're a long way away from it. But I do believe we're gonna... Maybe not a long way. Maybe a couple of years. I don't know. And I haven't wrapped my head around what that means. I do have some thoughts and we can talk about it. But for now, you still need a human to write those stories. You still need a human to direct the AI. Because these LLMs have been created on the entirety of everything written in the past X number of hundreds of years. And I mean this in the kindest way possible: most people can't tell a fucking story. And most people can't write, and they're really bad at it. And so it is being trained on everything, including that. And most of it is that, you know. It's not looking at the best writers. It's looking at all of it. And so-
Speaker 1:
[90:36] Treating them equally.
Speaker 2:
[90:37] And it's treating them equally. In fact, it's giving more weight to the worst ones, because there's more of them. And so you still need a human to say, that's a terrible idea, that's cheesy. There's a thing that all of the AIs do when you ask them to write a scene in a screenplay. One of the classic things in a screenplay is, you come into the scene late and you leave the scene early, which means you never walk in the door and you never walk out the door. You come in mid-argument. It can be four lines of dialogue. When you ask an AI to write a scene, even if you tell it to come in late and leave early, it always tells you the thing that you already know. It doesn't know how to do any of that stuff yet, and I think it's a while away until it does.
Speaker 1:
[91:22] I mean, I take comfort in that, but as we progress towards an AI that can write like Cormac McCarthy, or in an indistinguishable fashion, the question is, what is the human role in the midst of this? There was a bunch of hullabaloo around these clips that were shared on social media, where it was like a fight scene with Brad Pitt. And it looked pretty realistic, right? But you still look at it and you're like, yeah, I don't give a shit, because I know this is fake and these aren't the real people. And as human beings, part of the attraction to that type of storytelling is the human element, knowing that there was somebody behind it with a creative inspiration to create that. The question is, does that matter when we are looking at something that's indistinguishable or not? And you don't think it does?
Speaker 2:
[92:19] No, I don't think it does. I don't think people-
Speaker 1:
[92:20] You think that's sort of a romantic idea that humans are hanging on to. Because there is, I do feel like right now, at least, and obviously we're in early days, you know, authenticity is at a premium. Like we're sort of tired of the AI slop, and when you see something that you know is real or a person that you can trust, like that has value.
Speaker 2:
[92:41] Let me ask you a question. So you like Cormac McCarthy too, right? We'll use him as an example. If I came along and I said, I found his unpublished novel, it's amazing, you've got to read it. And you read it and you're like, this is amazing. And then afterwards I said, it was written by an AI. Would you feel differently?
Speaker 1:
[93:00] Yeah, I don't know.
Speaker 2:
[93:01] Like you would still have enjoyed the novel. And I think that-
Speaker 1:
[93:04] But then I would be disappointed, I think, upon hearing that news in the aftermath.
Speaker 2:
[93:09] You'd be disappointed in the aftermath. But my point is that you will appreciate the art equally, whether it's written by an AI or the real Cormac McCarthy. And when you look at that fight scene that everyone was sharing, it still had a little bit of AI in there. Eventually it won't. It just won't. And I think that once we get to that point, if it's a good story, people won't give a shit if it's made by a person or not. And they won't give a shit if it's a real actor or if it's a fake actor. There will be some people in society that will say, I will only con- I can imagine books in the bookstore that say written by a human. Like, I'm not condoning any of this in any way, shape, or form. If I could build a time machine and go back to 1960, when people wrote on typewriters, I would be happy to. But I just see that this is the future.
Speaker 1:
[94:03] It does seem inevitable to me, though, that we're in this recursive loop of degradation, because these AI tools are only as smart as what we feed them. As the Internet is increasingly populated with AI-generated material, it's like we're making facsimiles of facsimiles of facsimiles, and we're not seeding it with the best of what humanity has to offer; we're seeding it with less human creativity and inspiration. Cormac McCarthy is a one of one. He comes along and writes in a way no one else does, and that elevates the human spirit. We need those people to come along and refresh how we think about literature, or choose your art form or whatever the specialty is. And if those people don't, and they're not feeding the AI, then we're just training it on what exists, and it becomes this lowest-common-denominator thing, and that can't help but degrade it. We just careen towards idiocracy.
Speaker 2:
[95:05] I think we've already careened towards idiocracy. We're past that point. I'm not condoning any of this, I want to say that. I said this on a podcast last year and all these people were mad at me, but I'll say it again, I don't care: I think Colleen Hoover is a terrible writer, right? And she had 1.7 books on the New York Times Bestsellers because of TikTok and so on and so forth. And I don't hold it against her. I'm not saying I'm Cormac McCarthy in any way, shape, or form. I can list a million other terrible writers. But what bums me out is there are so many incredible books by incredible writers that make you think, and that's not the stuff that's on the top of the New York Times Bestsellers, because we go to this lowest common denominator now. And my hope, and this is literally just my hope, is that what AI can do is help guide us back to something that is not slop. And if it's going to be slop, well, we don't need AI slop, we've got human slop, right? It's not going to change, it's whatever. But there is a scenario, and this is my hope, where AI allows us to make better stories that push us to think more, and so on and so forth. The question is, is Anthropic going to say, oh, we're only going to use the top 1% of writers, and then we're going to train the AIs on that, and then they're going to be more creative, and so on and so forth? Or what's going to happen? I don't know. I do believe that no matter what, the most creative people will always have a job of telling stories, whether they are coming up with an idea and telling an AI to write the book or whatever it is. I truly do believe that. I think it's going to be like the art world, where you have 50 people that make a living and the rest of them paint flowers in their bedroom, and that's it. But I do believe that that's the case.
But I think that to say that we're not going to consume AI content because it's garbage, we consume a lot of garbage today.
Speaker 1:
[97:26] Storytelling is not going away, but the thing that worries me the most in this shorter term window of rapid AI advancement is the incentive structure behind good storytelling because we are already in a post-truth world. And a great story using AI and visuals that are indistinguishable from reality has the capacity to manipulate the masses into, you know, name your idea, right? So this is something that is easily weaponized. Like we don't know what's real anymore. And we don't know that person that we recognize who's saying that thing, whether they actually said it. And even if we're told it's fake, the research shows that we still kind of believe it, you know, even if we're told it's AI. And what is this doing to the human mind and to the capacity for the human animal to maintain coherent societies? Like, I just see, like, this is like, yeah, this is how we're gonna destroy ourselves, I think, in the short run.
Speaker 2:
[98:37] I completely agree. And that's the biggest worry. You know, in the attack on Iran recently, Iran was using just basic AI tools like ChatGPT, or Gemini, I don't know which ones they were using, but they were using these basic AI tools to make fake videos of them bombing Israel, destroying it. And then they were putting that on television for everyone in Iran to believe that they're winning the war. And there was a news clip, I'm on a text thread with a bunch of screenwriters, and one of the guys was like, oh my God, did you see, Iran is agreeing to all of the US terms, and it was a CNN clip of Jake Tapper. And the only reason I could tell it was AI was because Jake's hair looked too good. But it was an AI clip.
Speaker 1:
[99:29] And it's like enough to convince your very smart screenwriting friend. Yeah. This is just terrifying to me.
Speaker 2:
[99:35] It's terrifying. But technology cometh, and then technology cometh to take on the other technology.
Speaker 1:
[99:44] Yeah, I know, but this is not Napster.
Speaker 2:
[99:46] It's not, but the only way that we survive this is if, alongside the technology that is used to create it, there are other technologies used to fight the bad ones.
Speaker 1:
[99:57] Sure, but that is analogous to the detection of performance enhancing drugs. Like the advancement is always ahead of the detection system and the correction system.
Speaker 2:
[100:08] 1,000%, and I'm not saying, I'm not being an optimist in this. I'm just, I know that's where we'll end up. Again, it's the worry of how much bad happens before we figure out how to solve the-
Speaker 1:
[100:21] And what is your sense of the timeline here?
Speaker 2:
[100:23] Well, the problem is, what's fascinating is, I've always been, when I was a kid, I loved computers so much I used to go on a weekend to the local corner shop in England and get the new coding magazine and write basic code and stuff. And I've always been obsessed with tech and just something about it that I'm so- I mean, I'm fascinated about technology from a human standpoint. It is the only thing I think that really truly separates us from other creatures, and other creatures make art and music and so on and so forth. Technology is the one thing that we do that really truly separates us, and yet it is inevitable that it will be the downfall of us and we can't stop ourselves. And I'm fascinated by that. And it is also the other thing I'm fascinated by is that we are obsessed with technology because we want to be able to do things quicker and easier and so on and so forth and advance more. And yet when the technology destroys us, the thing that we immediately go to is back to these cavemen that will beat each other up for food and so on and so forth. So there's this crazy dichotomy with the way we as humans work and I'm fascinated by it. And if right now there was some nuclear attack and the power went out, you and I would be out in the street with baseball bats when we haven't had a meal in four days. And we literally go to that lowest common denominator as people. We go to our animal instinct. So that I'm fascinated by. And as far as the AI question goes, I do believe that there's a scenario where it could be used for good. But I also know that there's a lot of scenarios where it will be used for bad. And the thing that's crazy is I've been covering-
Speaker 1:
[102:20] That's already true.
Speaker 2:
[102:21] It's already true.
Speaker 1:
[102:22] It's gonna get worse though. There are remarkable advancements that are happening like disease prevention and like cures and like just incredible shit.
Speaker 2:
[102:29] Yes. But there will be bad, lots of bad. And we haven't really seen the real bad yet. We just haven't. It hasn't come about. There's little video clips and this, that, and the other. There's people who get scammed and so on and so forth. We haven't seen the real bad. It's coming. It's just inevitable. And what's been so insane to witness for me, as someone who has been using technology their whole life and has been writing about it for more than two decades, is how quickly it's happened. I had this moment recently. One of the things I do when I do pitches is I create these AI images that I walk people through as I'm pitching the story. I find it a great way to tell a story, and I use Midjourney for it. I started using it about a year ago for this. And I went to the bottom of my Midjourney feed, and the first images were so horrifically bad compared to what we can do today. And that's in like a year. I have never seen a technology grow as fast as this. It's astounding and it's exponential, and it's gonna continue as it gets smarter. And so the question is-
Speaker 1:
[103:38] Well, it becomes self-improving.
Speaker 2:
[103:40] It becomes self-improving. It's on a daily basis. Every single day, there's this competition between DeepSeek and Gemini and Anthropic and OpenAI and so on to quickly ship more models and more tokens and more, you know. And the question is, what goes wrong, and when, and how do we prepare for it? And there's no answer to those questions.
Speaker 1:
[104:04] Yeah, and it is an interesting quirk of the human animal that we know all of this and yet we're like, well, we're just barreling forward anyway, cause that's what we do.
Speaker 2:
[104:16] That's the fascinating part.
Speaker 1:
[104:17] That sort of informs the argument that like maybe human beings are just intended to be the sex organs of this new form of life. And that's our, that's ultimately like the end game of our entire purpose. And it doesn't really matter whether we survive or not.
Speaker 2:
[104:34] I, as you know, it's interesting. I heard the story years and years ago from a friend who was at the dinner where Elon Musk and Larry Page got into the fight about, but do you know the story?
Speaker 1:
[104:45] I don't think so.
Speaker 2:
[104:46] They were at Larry's house. Elon was sleeping on his couch, and they were talking about AI way before any of us were talking about it. And Larry Page had allegedly said to Elon that, you know, robotics are the future of humanity. Like the next iteration of humanity, of evolution, sorry. And Elon got really mad. He was like, what do you mean? And Larry was like, well, that's it. It's just evolution. And they started arguing, and Larry Page accused Elon of being a speciesist. And I remember at first hearing that and being like, what a psycho Larry Page is. And then you kind of see it all and you're like, well, maybe that's it. I don't know.
Speaker 1:
[105:36] And here we are, and Elon is creating the robots. Basically, they stopped producing the Model S and the Model X so that they could allocate resources toward their robotics department.
Speaker 2:
[105:49] And it's like, we know the future. We know what it looks like.
Speaker 1:
[105:54] But we're like, yeah, it's going to be bad, but here we go.
Speaker 2:
[105:57] Yeah. And what I find so fascinating, I'm sure you've had this conversation, is that exact moment. Well, you just did, you laughed. So when I talk to people, we're at a dinner party or something, me and my wife, and it'll come up, and I'll tell them all my theories of what it is. And then there's a silence, and then there's a laughter. You know how when someone almost gets hit by a car, they laugh right afterwards? There's a silence and this laughter, and then people will be like, so what do you guys want to do for dessert? And it's like-
Speaker 1:
[106:23] There's a powerlessness.
Speaker 2:
[106:24] There's a total powerlessness that we have. And so it comes out in us grinding our teeth at night or whatever, however it comes out. But it's so fucking fascinating that we know it's going to go wrong. And I know it's going to go wrong. And after this, I'm going to go home and I'm going to sit at my laptop and I'm going to talk to an agent. And it's like-
Speaker 1:
[106:50] Yeah, and that's just a weird cognitive dissonance kind of thing at the same time. Because what's the other version of it?
Speaker 2:
[106:55] I go live in the woods and wait for someone else to do it?
Speaker 1:
[106:58] You're balancing these polarities of existential dread while also being kind of amazed that we're alive at this period of time when we're giving birth to this thing. And it's like you're watching a movie: what's going to happen next? It's exciting. But how old are your kids?
Speaker 2:
[107:14] Nine and 10.
Speaker 1:
[107:15] Nine and 10, okay, so my kids are older, but your kids being younger makes it an even more pressing case. Like, how do you think about the world that they're going to inherit? And what do you say to young people who are wondering, where should I place my attention? What should I be doing right now to prepare for this thing that none of us can really imagine, what it's going to look like and what the skill sets are that we're going to need?
Speaker 2:
[107:43] I truly don't know the answer to that. I truly don't, I don't know. If I were me-
Speaker 1:
[107:47] And if you don't-
Speaker 2:
[107:49] If I were me, okay, and I was just coming out of art school at this point, I would have dropped out of art school, probably, to be quite frank. But I truly do believe that the most important thing that humans do is tell stories. And I would continue to try to figure out how to tell stories, good stories that have a positive impact on society. Quite frankly, I know this may sound like bullshit, but it's truly what I believe. So to me, if I were coming out today, from art school, or trying to get a job, or trying to be a writer, my number one goal would be: how do I tell stories that try to make sure society doesn't end up in the worst catastrophic place? Whether that's, I wouldn't have done this, but, like, becoming an influencer who talks about the good and the bad and the da-da-da-da-da. The only thing we can do to control it is to tell stories about what it could be and make people think about it. And so my answer is that, you know, and it's why I'm still doing it now. The mafia book that I'm writing with Dwayne, sure, it's about the mafia, but it's about colonialism. It's about right and wrong, and what happens to people. It's about much bigger things. And I think that, as a species, story is how we learn. And the only way to try to save the species is to tell better stories.
Speaker 1:
[109:38] I think that chuckle we have when we're confronted with this reality is a reflection of our discomfort with uncertainty. And the truth of the matter is that the world has always been tremendously uncertain. The human mind likes to fabricate rules and create structures that foster the illusion of certainty and predictability, but nothing really is certain. It's just that with AI, the uncertainty factor is through the roof all of a sudden, and we're fundamentally wired to be uncomfortable with that. But in terms of the anxiety levels of the adults who are trying to make sense of this, perhaps there's something to be said for just examining your relationship with uncertainty, you know? Trying to find some peace in that and understanding that it's been uncertain all along.
Speaker 2:
[110:38] I totally agree. I mean, it's why people were drawn to religion, though not as much anymore, because it created boundaries around uncertainty, right? There are rules and there's a reason, and you have to trust in God and Jesus and whatever else the religion you believe in says. And as societies, we've pulled away from that at a time when we probably need it more than ever. And I think it's a really astute point to say, like, you have to be okay. It's like that saying in AA about being okay with not being okay.
Speaker 1:
[111:16] Yeah, as somebody who's been in AA for a long time, it's all about powerlessness. We really don't have control over all the things we think we do. We can control our behavior and our reaction or response to the things that happen. And given an acceptance of that powerlessness, how do we find peace, happiness, meaning, et cetera?
Speaker 2:
[111:40] So, I read this Nietzsche quote that was actually really interesting. Here it is: what if some day or night a demon were to steal after you into your loneliest loneliness and say to you, this life as you now live it and have lived it, you will have to live once more and innumerable times more? Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him, you are a god and never have I heard anything more divine? And the point that he's trying to make is, you know that you are living the life you're meant to live if you would be okay living it over innumerable times. And so that to me is like, I am so fucking lucky that I ended up telling stories, because it's what I'm meant to do. I don't know why. It literally gives me a calmness. It's why I love writing books. Most people hate writing books. I fucking love it. There's nothing I enjoy more than the challenge of telling a story in 110,000 words. If you told me I have to write books every day for the rest of, like, civilization, I'd be the happiest person ever. And I think that's really what it's all about. If you're comfortable with the fact that you are living the life you're meant to live, then that's what it's all about. That's the meaning. That's man's search for meaning. That is it. And it could be, maybe you love branding, you love running, you love climbing towers, you love podcasting, whatever it is, it's about finding that thing. And that to me is what it's all about, until the AI kills us all.
Speaker 1:
[113:39] I think that's beautifully put. I would only add to that, that perhaps amidst this uncertainty or being in the eye of this AI hurricane, that maybe we can appreciate the moment a little bit more. Like, if this is truly, everything's transient, but now it just is like, you know, it's on a shorter timeframe, right? Like, okay, well, if this is gonna go away or things are gonna look very different in an unpredictable fashion, like in a very short period of time, let's try to really appreciate the lives that we have right now. And like yourself, I feel extremely lucky to be able to do what I do and sit across from people like yourself and learn from them.
Speaker 2:
[114:19] No, I completely agree. And I think one of the biggest... Somebody said this to me when the iPhone first came out, so 17 years ago now, and they had a very prescient point, which was: my biggest worry is not the screen and this and that. It's that we will never have moments where we just sit anymore. And boy, were they right. And one of the things I've really been trying to do, and it's hard, it is so hard, is just put it away. Even if it's for five minutes. People used to sit on their porch and just think, you know? And now you're at a urinal and you're like, oh, I can read an email. I honestly think this is one of the worst inventions in humanity. I wonder, if Steve Jobs were still alive, whether he'd be like, oh, I did a great thing, or, I did a terrible thing. You have to fight against the technology, even though we all need to use it to be in the society we live in.
Speaker 1:
[115:28] I think that's a good place to end it. But I would ask you this final question. How is it collaborating with The Rock on this project?
Speaker 2:
[115:37] He's just a sweet guy. He's really thoughtful, always thinking about other people. He's funny. He doesn't send texts, he sends voice notes. And so you get these long voice notes from Dwayne, and they're fun. He's telling you his thoughts and his...
Speaker 1:
[115:54] That tells me that he appreciates the fact that he holds a certain stature. Like if you get a voice memo from him, you're gonna save that thing. It's his voice, he's speaking to you.
Speaker 2:
[116:04] Well, I think it's also-
Speaker 1:
[116:06] As opposed to like just texting.
Speaker 2:
[116:07] He also works out like three hours a day or something.
Speaker 1:
[116:10] So he's just-
Speaker 2:
I think he's in the gym. He's in the gym, and rather than texting, he's lifting 400 pounds with one arm and sending you a voice note. But he's great, he's been an amazing collaborator, and he really got in there. We did a lot of the interviews together, like with the former mob bosses.
Speaker 1:
[116:26] And Scorsese.
Speaker 2:
And Scorsese, that was a surreal moment. I was with him in New York recently and we were breaking the story and batting around ideas and talking about Goodfellas and this, that, and the other. And he's 83, but he's got more energy than me. You know, it's interesting. That's an example of people doing the thing they should be doing, and they're really good at it. It's been really fun to be a part of that.
Speaker 1:
[117:02] I was watching the Scorsese documentary, and there's that part where they're talking to his childhood friends, and they describe how he was literally storyboarding movies as this little kid. And everyone's like, what are you doing? Clearly this guy is doing exactly what he's supposed to be doing. He is living the fully expressed version of who he was always meant to be.
Speaker 2:
[117:24] I think that's what it's all about. It's about finding that thing, and it doesn't matter what it is. I mean, you don't want to be a serial killer even if you feel meant to be, but it's about finding your place on the stage in this play. And that's it. Because we don't know. We will never know. And AI is never going to tell us why we're here. We're never going to figure that out. So we just have to go with the thing that we're comfortable with in that moment. And that's usually the thing that we want to be doing. And that's what it's all about.
Speaker 1:
[118:00] I think that's a good place to end it, man.
Speaker 2:
[118:02] Yeah.
Speaker 1:
[118:02] Beautiful way to conclude this conversation. Thanks, Nick.
Speaker 2:
[118:06] Thank you for having me. It's a lot of fun.
Speaker 1:
Yeah. Very cool. The movie, obviously, it's going to be a couple of years before it's out.
Speaker 2:
[118:12] It's going to be a couple of years. Yeah. The book is coming out next year.
Speaker 1:
[118:15] Yeah.
Speaker 2:
So I have a podcast I'm doing with Dick Costolo and Paul Kedrosky. Dick was the former CEO of Twitter. It's called The Nick, Dick and Paul Show. We talk about all sorts of stuff like this every week.
Speaker 1:
[118:26] Nice. All right, man. We'll come back and share some more when the book is coming out.
Speaker 2:
[118:30] Yeah. Love to.
Speaker 1:
[118:31] All right. Thanks, Nick.
Speaker 2:
[118:32] Thank you.
Speaker 1:
[118:32] Peace.