title Jeff Bezos and Lauren Sánchez’s morning routine

description It starts at 6:00 AM in their billionaire bunker. No phones. Instead, they write a gratitude list of ten things, with one rule: they can't repeat anything from the day before. I cover what they’re thankful for. 

Plus, Silicon Valley’s elite want to upload your brains, a recruiter exposes a North Korean spy, and ways to stop your kids’ endless scrolling.

Did the tablet kidnap your sweet child? Don't worry, Dr. Michaeleen Doucleff knows how to get them back. She’s a scientist and mom who knows how screens hijack a kid's motivation system. In her book Dopamine Kids, she shares how to swap the screen-time stress for play.

And meet the world's first AI store manager. Lukas Petersson of Andon Labs gave $100K to an AI agent named Luna to open a store. How’d it go? Luna didn’t only write code. She signed a lease, haggled with suppliers, and hired human employees. The result? Andon Market, a boutique in SF. 



Timecodes:

00:00 Lauren Sánchez and Bezos’s happiness routine

05:34 Why Silicon Valley elite want all your knowledge

08:14 AI cannot read an analog clock

11:10 The AI Jesus app

13:11 Job interviewer exposes a North Korean fake IT worker

15:43 Man finds hidden Wikipedia feature

17:01 Caller: ChatGPT diagnosed my car

21:44 Slay your property tax bill with AI

23:27 Caller: Dr. Michaeleen Doucleff

32:07 Shopper targeted with AI deepfake

34:00 AI beauty pageant

37:18 Mommy influencers take over

40:48 AI school bus company ticketing you

43:49 Woman texts late grandma’s phone number

46:01 Caller: My Alexa has an attitude

49:44 Caller: AI is the boss at San Francisco retail store

56:55 AI Tool of the Week: Meta AI Muse Spark

58:54 Lithium battery rules for AirTags

1:01:31 Caller: Man’s 100,000 recorded concerts hit the internet

1:07:27 Gen Z emojis: what do they mean?

1:13:29 Kalshi prediction markets

1:18:31 Waymo and Waze partner up to fix potholes

1:18:54 Air New Zealand adds beds in the sky

1:20:46 Phone-free restaurants

1:30:18 AI helps with stocks

1:33:09 Caller: Granddaughter’s phone brings a stranger to her door

1:39:58 Your car is a snitch
Learn more about your ad choices. Visit megaphone.fm/adchoices

pubDate Sat, 18 Apr 2026 08:59:00 GMT

author Kim Komando

duration 6306000

transcript

Speaker 1:
[00:00] Every business owner I talk to asks the same question. How do I make AI work for me? Not someday, today. Sitting on the sidelines is not an option. Your competitors aren't waiting. NetSuite by Oracle is the number one AI Cloud ERP, trusted by over 43,000 businesses. It pulls your financials, inventory, commerce, HR and CRM into one system. When everything lives in one place, your AI knows your business. It doesn't guess, it knows. You make decisions based on the data and AI, not on your gut. You get alerted to red flags before they turn into problems and make fast, confident decisions that move the needle. How do I know? I'm bringing NetSuite into my business right now. From software and IT services to healthcare, equipment manufacturing, financial services and many other great American industries, NetSuite delivers a customized solution for your business. If your revenues are at least seven figures, get NetSuite's free business guide, Demystifying AI Right Now. It's free to you at netsuite.com/kim. That's netsuite.com/kim, netsuite.com/kim. Well, look at you. You got the Kim Komando Show podcast. You are so smart and I heart you for that. And now if you listen to the show on radio, it's three hours. Well, it's not three hours as a podcast because it doesn't include all those commercial breaks, but you do get it in one big file and we do separate the hours so you'll hear what's coming up in the next hour, the next hour. So make sure that, you know, it's not hours. It's about, I guess you'd say, like 30 minutes for each one. And joining me, of course, is our co-host, Andrew Babinski.

Speaker 2:
[01:30] It's all the same great information. We just got sleek, nice, new packaging.

Speaker 1:
[01:35] And, you know, what are the benefits of getting the podcast?

Speaker 2:
[01:37] You hit play one time and you roll through the entire show. It makes it easy. It's all packaged up perfect for you. It's the entire show. All the information just shortened down for you.

Speaker 1:
[01:47] You know what I think is best about the podcast?

Speaker 2:
[01:49] Yes.

Speaker 1:
[01:50] Is that, you know, I always have such amazing jokes.

Speaker 2:
[01:54] Oh, no. That's the reason why we have a fast forward button.

Speaker 1:
[01:56] No. With the podcast, you can back up and you hear the joke over and over again until you memorize it. And then when you see your family members and friends, you can say the joke.

Speaker 2:
[02:08] Just skip to the subject part of the Kim Komando Show podcast. It's the much better part.

Speaker 1:
[02:13] All right. OK. Well, sit back and enjoy. Every single day, every single day, they're playing pickleball every single day. Well, there was I don't want to say a puff piece, but there was an whole article in the New York Times about Lauren Sánchez Bezos. Oh, OK. And how she is in love with Jeff and how she just can't imagine her life being anything different than what it is right now.

Speaker 2:
[02:41] She'd be a lot poorer.

Speaker 1:
[02:44] And it gave a glimpse into her daily life. The happy married couple. It starts at 6 a.m. every single day. That's when they get up in their $250 million Florida compound.

Speaker 2:
[02:55] Oh, that's a small, cheap home that they live in. $250 million.

Speaker 1:
[03:00] And when they get up, they never touch their phones.

Speaker 2:
[03:04] No, they have the staff butlers that do it. Read my text messages, Charles.

Speaker 1:
[03:10] What they do is they come up with a list of 10 things. Every morning they wake up, 10 things that they're grateful for that day. 10 things. And you cannot repeat anything from the day before.

Speaker 2:
[03:23] That's pretty strict rules.

Speaker 1:
[03:25] They play pickleball every single day. They work out for an hour, six days a week. So think about this day one. What are we grateful for?

Speaker 2:
[03:35] Our health, our 500 million dollar yacht, our love for each other, our rocket ships, the money we save on Jeff Bezos' haircuts, my BFF Kim K, yes, Kim Kardashian. Thank you for clearing it up. Not Kim Komando.

Speaker 3:
[03:53] But what else could you be grateful for?

Speaker 1:
[03:56] I mean, think about it, everything, right?

Speaker 2:
[03:58] A trillion dollars, maybe?

Speaker 1:
[03:59] Gosh, did you know, did you hear what he just is raising? A hundred billion dollars.

Speaker 2:
[04:04] For?

Speaker 1:
[04:05] He wants to come up with a new line of robots that will manage the robots, that will manage the robots. So like manager, managing robots.

Speaker 2:
[04:15] So we're eliminating three layers of jobs.

Speaker 1:
[04:18] Exactly, exactly.

Speaker 2:
[04:19] Good job, Jeff.

Speaker 1:
[04:20] Yes, because after all, these are the type of little fun facts that we pass along week after week. On this, the nation's largest, best, award-winning, greatest show about all things digital. And it's called The Kim Komando Show, because after all, I am your beloved digital goddess, Kim Komando. And joining me is our co-host, Andrew Babinski. And you can find us on over 420 top stations from coast to coast. And of course, we're streaming in your favorite radio app. And you can also get The Kim Komando Show as a podcast, wherever you get your podcasts. You can take me on the go, or if you're just sitting around doing chores. And studies have proven that if you listen to The Kim Komando Show while you're doing your gardening, I mean, this is some deep research, that all your plants will grow so much better because they're getting more intellect as they're growing.

Speaker 2:
[05:07] Where did this study take place?

Speaker 1:
[05:09] Well, it was The Komando Institute. Ah. Yes.

Speaker 2:
[05:12] Kim asked ChatGPT, what's some crap I can tell my audience that they'll believe.

Speaker 1:
[05:17] And you can also find us on YouTube.

Speaker 2:
[05:19] youtube.com/kimkomando. Right now, there's like one, two, three, four cameras in here, huge cameras. And we don't have them here for no reason. We want to make a quality video production so we can put it on YouTube. And that's just another way to watch the show. So go and subscribe.

Speaker 1:
[05:35] And then also, it's a reason why I do my hair and makeup.

Speaker 2:
[05:37] That's true. So I used to do the show when there was no cameras. Oh boy, did she look like a mess.

Speaker 1:
[05:43] That's not true. I'd come down in my pajamas sometimes.

Speaker 2:
[05:48] I had a baseball cap backwards, jumping on the furniture.

Speaker 1:
[05:52] Okay, I just jumped on the furniture once.

Speaker 2:
[05:53] Yeah, but it was memorable.

Speaker 1:
[05:55] I jumped on the side. Look, I'm so tall. And the Komando hotline is open. We have a brand new phone number, by the way. It's easy to remember. Call 1-844-KOMANDO. Oh. 1-844-K-O-M-A-N-D-O. And I'd like to thank our listener who provided us that number. He actually owned the number. And he's like, hey, Kim, do you want it? I'm like, sure. Did he?

Speaker 2:
[06:17] Why did he own it?

Speaker 1:
[06:18] How did he?

Speaker 2:
[06:19] Why did he buy it?

Speaker 1:
[06:20] I don't know. He had it for years. So now we have it. All right. Then every single day, I read whatever I can. I talk to industry insiders and even my friends who are hackers to make sure I know exactly what's happening in the tech and AI space. And here are some things that you need to know about. And we're going to start by talking about what's in your head and what the heads of tech want to do with what's inside your head. Does that make sense? Nope. Okay, Elon Musk wants to put a chip in your brain. Okay, Sam Altman wants you to upload your entire mind to a computer right before you die. Why? Peter Thiel says that death is just a disease that we can cure using technology. Now there's a name for what all these guys are.

Speaker 2:
[07:09] Well, I know the name.

Speaker 1:
[07:10] What?

Speaker 2:
[07:11] Crazy Pants Bananas.

Speaker 1:
[07:13] Transhumanists, a transhumanist. It's the idea that biological humans are only a stepping stone and a post-human species engineered by tech is going to colonize the galaxy.

Speaker 3:
[07:27] We're going to be living all over.

Speaker 2:
[07:28] So we're going to put our brains in robots and then those robots can survive long time travel and we're going to travel all over the galaxy. Correct. We can live on Mars if we don't need the Earth environment. If we're robots.

Speaker 1:
[07:43] Was there a movie like this?

Speaker 2:
[07:44] Every single science fiction movie ever made?

Speaker 1:
[07:47] I thought there was.

Speaker 2:
[07:48] Thanks.

Speaker 1:
[07:50] Would you do this?

Speaker 2:
[07:51] I don't... I mean...

Speaker 1:
[07:52] Because you're dead. Your body's dead.

Speaker 2:
[07:54] Right. But are you... Well, that's the question though. Are you dead if your thoughts and your soul and your feelings were inside of a machine?

Speaker 1:
[08:02] No. You'd still be alive. It would be you.

Speaker 2:
[08:04] Right. It would be your personality. So you're saying, and I'm just making an argument here, you're saying the length of your arm and the way your toes curl and your second toe is longer than your first, that doesn't make up who you are?

Speaker 1:
[08:18] It does.

Speaker 2:
[08:19] So then that would not transfer over. So it wouldn't be you.

Speaker 1:
[08:22] Okay. But I do have a big second toe. Is yours bigger?

Speaker 2:
[08:28] No, but there's a name for that. It's like the Nelson effect or something like that.

Speaker 1:
[08:31] They're actually supposed to be smarter if you have a longer second toe.

Speaker 2:
[08:35] Of course, that's the way you glom from that is that your toe is longer. Well, obviously, I'm smarter.

Speaker 3:
[08:42] No, it's true.

Speaker 2:
[08:43] Oh, okay.

Speaker 1:
[08:44] It's true. Science has my back. You keep on challenging my science knowledge.

Speaker 2:
[08:49] The same scientists that said gardens will grow faster if they listen to this show.

Speaker 1:
[08:53] Hey, Stanford dropped the 2026 AI report. Stanford, big name, gets all the headlines, right? What they found is that global AI investment last year was $581 billion.

Speaker 2:
[09:06] That's a lot of money.

Speaker 1:
[09:08] It was double what it was the year before. And they're going all in, by the way. And let's see, the same AI that's acing all those benchmarks, that is passing all these exams and taking jobs, it still can't tell you what an entry-level job pays. These are things that AI still does not know. And if you are in an entry-level position, that job is going away, probably this year. And it's the world's most powerful AI, let me finish, the world's most powerful AI. What do you think that it can only do correctly 8% of the time that a human can just look at and not even think about it?

Speaker 2:
[09:53] So, it's identifying something, if I'm looking at it. Identifying another, the race of a person, the gender?

Speaker 1:
[10:02] No. It is looking at an analog clock and knowing what time it is. It can only tell the time correctly 8% of the time, which is not far behind Gen Alpha and Gen Z. That's true.

Speaker 2:
[10:14] It's on par. But I think if you get rid of all the entry-level positions, how are they going to create jobs for the second level?

Speaker 1:
[10:21] See, this is the problem. Because if you don't have the entry level jobs, and then the second level jobs are going to be going next. I mean, we're talking like 10 years, right? Okay. That's where the whole idea of we are going to have to provide people a universal income.

Speaker 2:
[10:38] I'm just saying, let's say you have three layers of employees at a company, at entry level, and then they get promoted to senior producer, and then they get promoted to supervise senior producer. If you don't have entry level, people aren't going to be able to train, learn, prepare themselves, understand the work environment, understand the job. So you're never going to have anybody qualified enough to move up to that second level.

Speaker 1:
[11:01] Yes and no. Because the entry level jobs are being replaced by AI doing the work. The mid level jobs are probably going to be the entry level jobs who are going to be calling upon AI to do certain tasks. So then you're going to need somebody above everything to say this is what we need to get done.

Speaker 2:
[11:20] To give everyone instruction.

Speaker 1:
[11:21] Yes, exactly.

Speaker 2:
[11:23] But most jobs are entry level jobs.

Speaker 1:
[11:26] Well, that's the problem.

Speaker 2:
[11:27] Right. We're going to be getting rid of most jobs.

Speaker 1:
[11:30] Exactly. All right. Let's come on back down to earth. We've all talked about the Chinese routers. That it's not a good idea to buy a router made in China. Right. Because they're sucking up every bit of information that you put through it.

Speaker 2:
[11:41] But it's super easy because only 99.9 percent of routers are made in China.

Speaker 1:
[11:45] That's the shift. That's no longer the case. The routers are now being made in Vietnam, Taiwan, and Thailand.

Speaker 2:
[11:55] But are they made by Chinese companies?

Speaker 1:
[11:58] No.

Speaker 2:
[11:58] Okay, good. No.

Speaker 1:
[12:00] But that's a good point because a lot of them are. So if you got TP Link, it's a bad deal. Don't buy it.

Speaker 2:
[12:07] Right. Because that's totally Chinese.

Speaker 1:
[12:10] All right. This app really disturbs me. I consider myself a religious woman. Okay. I go to church, not only on Easter. I don't understand. Why do you all people just show up on Easter?

Speaker 2:
[12:22] It's like we're not all overachievers like you.

Speaker 1:
[12:24] Okay. There's an app called Just Like Me. Have you heard about it? It's $1.99 a minute that you can video chat with an AI Jesus, and it's a digital avatar. You can also talk to Santa Claus.

Speaker 2:
[12:43] The fact that Jesus and Santa Claus are in the same category is very insulting to religious people.

Speaker 1:
[12:49] It really is. So it's $119 an hour for you to talk to Jesus. Yes.

Speaker 2:
[12:57] But they say on the website that it's not really Jesus.

Speaker 1:
[13:01] Yeah, but it's Silicon Valley just preying upon lonely people.

Speaker 2:
[13:05] Do you think it's Silicon Valley? I thought when I looked into this, this seems like an overseas Chinese company that's just pumping out AI chatbots as much as they want.

Speaker 1:
[13:17] I'm sure there's one of those too.

Speaker 2:
[13:18] Yeah, I'm sure.

Speaker 1:
[13:18] OK, that's the way it is. Now, this is what I thought was really interesting. You can also talk to Chris DeWolfe.

Speaker 2:
[13:26] So Santa, Jesus and Chris DeWolfe?

Speaker 1:
[13:31] Chris DeWolfe is the founder of MySpace.

Speaker 2:
[13:37] What? Wait a minute, who was our first friend on MySpace?

Speaker 1:
[13:43] Tom.

Speaker 2:
[13:44] Yeah, Tom. Who's this Chris guy? Our first friend was Tom.

Speaker 1:
[13:46] He was really the founder. But Tom was just in there too.

Speaker 2:
[13:49] OK, I always thought Tom was the founder.

Speaker 1:
[13:51] Tom's still sitting out there for some reason.

Speaker 2:
[13:53] He's everybody's friend.

Speaker 1:
[13:54] OK, so the thing about this is that you can confess your sins to AI Jesus. And then you can get social media advice from a guy who lost 350 million users to Facebook.

Speaker 2:
[14:06] Right, and then for Christmas you can ask for someone to pay off your credit card.

Speaker 1:
[14:10] Oh my gosh. Let's see, by now you've probably heard about the North Korean hackers using deep fake videos in order to get remote jobs here in the United States so that this way they can steal all of our secrets and then report it back to North Korea.

Speaker 2:
[14:24] That viral video was crazy where they were interviewing the...

Speaker 1:
[14:27] Oh wait, no, that's what we're talking about.

Speaker 2:
[14:28] Oh, okay.

Speaker 1:
[14:29] That's exactly what we're talking about. So they're trying to figure out if he's a real North Korean person who's just living in the United States.

Speaker 2:
[14:38] They're doing a video chat with an applicant for a job and they are very suspicious that this person is working for North Korea.

Speaker 1:
[14:46] So they ask him to say a sentence, Kim Jong Un is a fat ugly pig. Here, listen.

Speaker 4:
[14:54] You get like a lot of imposter candidates, you know, particularly North Koreans, like posing as like people that they're not. So one of the tests that we do is trying to get them to say something like Kim Jong Un is a fat ugly pig. Could you say that for me? Sorry, no, Kim Jong Un, you know, the leader of North Korea.

Speaker 5:
[15:19] Yeah, I...

Speaker 6:
[15:23] Sorry, I just say, I should say like that.

Speaker 4:
[15:27] Yeah. Yeah, if you could, because it's one test so that I know that you're not North Korean. Yeah. Can you say it?

Speaker 1:
[15:41] No. He can't say it.

Speaker 2:
[15:44] That's crazy devotion.

Speaker 1:
[15:45] No, he can't say it. Well, because if he says it, right, his whole family is going to disappear. He could probably be sentenced to death, right? So he's not going to say it.

Speaker 2:
[15:57] He's working. You'd think they'd be smart enough and go, hey, if they test you like this, go ahead and say it. It's fine. We know you're only saying it to get the job, so we can get the information that we need.

Speaker 1:
[16:07] Well, now they're training him to say that.

Speaker 2:
[16:09] Yeah, absolutely.

Speaker 1:
[16:09] Because that didn't work. But I mean, when you looked at that guy, you know what? He had no soul. Get it? Like, North Korea's Seoul.

Speaker 2:
[16:18] Isn't Seoul in South Korea?

Speaker 1:
[16:20] Oh, yeah, that too. Oh, I knew it was one of them.

Speaker 2:
[16:25] When in Rome.

Speaker 1:
[16:27] You say tomato, I say tomato. Okay. All right. Well, okay. How about a better one?

Speaker 2:
[16:32] All right, let's try it.

Speaker 1:
[16:33] What's the only drink size they allow in North Korea? Don't give me the answer. You ready?

Speaker 2:
[16:39] What is it?

Speaker 1:
[16:39] A supreme leader.

Speaker 2:
[16:42] Yes.

Speaker 1:
[16:43] All right. I don't know if you've heard about this. It's happening on Wikipedia. It's not dead. Everybody says it's dead, but it's not dead. You know what people are using it for now? What? You fire up the Wikipedia app wherever you are standing, and then you find out what's nearby. There's an Explore tab with Places in the app. And so then it will tell you, like, oh, this was a historic speakeasy, or this is a street where a gunfight happened, or whatever it was at that particular time.

Speaker 2:
[17:11] See, they're surviving.

Speaker 1:
[17:13] Kind of. I would say hanging on by their fingernails, maybe.

Speaker 2:
[17:22] Not dead, just on life support.

Speaker 1:
[17:23] So again, what you want to do is open the Wikipedia app, tap Explore and select Places. I haven't tried it yet, but Maddie said it's really phenomenal.

Speaker 2:
[17:33] I'm gonna use it now.

Speaker 1:
[17:34] You should try that.

Speaker 2:
[17:34] I love that kind of stuff.

Speaker 1:
[17:35] All right, coming up in just a few minutes, I have a great tip that you can use your favorite chatbot with, and you can save money. I'm talking about, well, not yacht money. I'm talking about make-a-difference money, because we all get that property tax statement in the mail, and then you look at it and you go, my house is not worth that much money. And then you're like, oh, I don't wanna dispute it, because that's just too much effort, because you'd have to hire a lawyer. Not anymore. I'm gonna tell you exactly how you can do that. What do you think, some phone calls?

Speaker 7:
[18:02] Let's do it.

Speaker 1:
[18:03] All right, how about we start with Dave in Nashville, Tennessee. Welcome, Dave, glad to have you with us. What's going on?

Speaker 7:
[18:10] Well, it's great to be here. I sent you a message the other day. You had put a tip out just last week in your daily newsletter about ask your favorite AI engine. I don't know if you mentioned ChatGPT particularly, but you wanted us to put our make and model of our car in there and then ask it a basic question like, you know, what are the most common maintenance issues and what can I do about it? Well, I'd had an issue since last year that I took to a garage and they ran through a long process and said, your car is in perfect health. I thought, well, it still shudders when it starts off, when I'm turning corners and things like that.

Speaker 1:
[18:50] So what do you mean shudders? Wait, what happens?

Speaker 7:
[18:52] Well, like it just kind of vibrates a little bit.

Speaker 1:
[18:56] Does it make any noise or anything or just vibrate?

Speaker 7:
[18:58] No, nothing like that.

Speaker 1:
[18:59] Okay, all right.

Speaker 7:
[19:00] And I could overlook it, turn the radio up or just pretend it's a bumpy road or something like that. But when my wife's in the car, you know, you can only sidetrack her so many times before she says, okay, there's something wrong with the car. You got to do something. So I took it to the garage. They couldn't find anything. So I wait another nine months because, you know, it was pretty urgent. And I put it into ChatGPT and boom, it comes right up with the answer. It gives exactly what's happening in all the right circumstances. I thought, this is amazing. What was it? So I send this to the garage. It was the transfer case. The oil in, I don't want to name the brand in case they come back and take my car or something, but anyway, they put oil in the transfer case and it supposedly lasts a lifetime. Okay. Well, as long as you only live to six, you're good. This started failing, and when it starts failing, then all the little plates inside start bouncing on each other rather than running smoothly. So that's what was happening. So I replaced the transfer case oil for about $450 versus replacing the whole transfer case for about $4,000 to $7,000, based on what the chat thing told me.

Speaker 2:
[20:16] And the shimmy, the shake, it went away?

Speaker 7:
[20:19] It was amazing. It went away instantly. I thought there's no way just changing this oil is going to really help, but it did. And they said that it would.

Speaker 2:
[20:27] Now, when you brought it in to get it fixed and you're like, hey, I want the oil in my transfer case changed. I mean, obviously, that's a very specific request. It's not like an oil change or, you know, can you top off the fluids?

Speaker 1:
[20:38] What kind of car are we talking about?

Speaker 7:
[20:40] Oh, it's a BMW X5.

Speaker 1:
[20:42] Okay, yeah, all right.

Speaker 2:
[20:43] Did you tell the mechanics, ChatGPT told me that I need to do this?

Speaker 6:
[20:49] Yeah, that went well.

Speaker 7:
[20:50] That was the eye rolls and all that kind of thing. Here's another internet nut job, you know, who's reading too much. But I said, look. So they said, well, let us do an inspection. It's like, no, your inspections cost hundreds of dollars and then you're going to turn around and charge me another $450 on top of that. Just please do what ChatGPT says. Okay. That should be my bumper sticker. So they did it and it worked. And so you saved me thousands of dollars. So I'm just very thankful.

Speaker 2:
[21:25] Did they admit that ChatGPT was right after the whole thing was done?

Speaker 6:
[21:29] No, no, no, no, of course not.

Speaker 2:
[21:34] I actually had a similar thing just this week. I had a check engine light come on. And I went and got the scan done and they spit out the number and you have no idea what any of the numbers mean. So I put that in ChatGPT and ChatGPT is like, well, it could be this, it could be that. But also the make model of your vehicle, cause it knows my car. The make model of your vehicle is just known to sometimes have a glitch.

Speaker 3:
[21:55] So wait for it to go.

Speaker 2:
[21:58] It's a suburb? No, it's a Suburban. And it said, just wait a couple of days and see if it goes away. Two days later, boom, it was gone. So I'm in the same boat as you, Dave. I would have gone in and got a diagnostic check. That's 250 bucks just to find out, oh, it's just a glitch.

Speaker 1:
[22:14] Well, BMW, break my wallet.

Speaker 7:
[22:16] Yes, absolutely.

Speaker 1:
[22:18] I mean, well, you know what? So, so do I get a cut of anything here?

Speaker 2:
[22:24] There's a cut of the savings?

Speaker 1:
[22:27] Just wondering.

Speaker 7:
[22:29] Sorry, you're starting to fade.

Speaker 1:
[22:32] Yeah, I can't, like, I can't hear you.

Speaker 2:
[22:34] Now he's gonna ask ChatGPT how to fix his phone issues. Every single time someone asks for money, he can't hear them.

Speaker 1:
[22:41] Hey, what's going on with that?

Speaker 2:
[22:42] It's so strange.

Speaker 1:
[22:43] How is that happening? Okay, so you probably have a pretty good idea how much your house is worth, right? You have a number in your head. And so does the county assessor. Now, the reason why I brought this tip to everybody's attention is that I was reading about a study that said up to 60% of all American homes are overassessed.

Speaker 2:
[23:03] And because they overassess it, because they get more tax dollars, the more it's worth.

Speaker 1:
[23:07] Exactly, exactly. Now, if you appeal that amount, homeowners are lowering their property taxes 40% of the time. So it's worth it. Absolutely.

Speaker 2:
[23:18] It's a shot, okay? It's not like a shot in the dark.

Speaker 1:
[23:20] Yes. So here's what you do. Three steps. Number one, you go to your county assessor's website, you look up your property and find out exactly what the assessment is. Number two, you go to Zillow, Redfin, or Realtor.com and you get five comps that will obviously be lower priced than your house, okay?

Speaker 2:
[23:39] Am I looking for comps or am I looking what they value my house at?

Speaker 1:
[23:43] No, you want what they sold for, comps. And then you let AI build the appeal for you. And then this is a pretty long prompt. If you need it, it's on the website at komando.com. But you're always going to start out by giving it a role, whatever chatbot you use. Like, you're a property tax consultant with 20 years of experience winning assessment appeals. You want a winner, okay? Not a loser. And then you say, my home is overassessed, here's the address, here are the comps, and then my assessment should be reduced to. And it's a fairly long prompt. And again, if you need it, it's over at komando.com. That's komando.com. All right, coming right back. If you've got kids on tech, you can't miss our next caller here on The Kim Komando Show. Think about all the irreplaceable moments on your computer, your kids' photos, important records, years of work and memories. If your computer crashed today, how much would you lose? It happens every day and is completely preventable. That's why I use and recommend Carbonite, my number one choice for cloud backup. Carbonite gives you unlimited automatic protection for everything that matters. It quietly backs up all your photos, documents, and important memories in the background, safely, privately, and always just a click away on your computer or mobile app. Carbonite is your ultimate backup plan. It protects you from the unexpected, from accidental deletions and hard drive crashes, to floods, fires, and even a morning coffee spill on your laptop. And if something does happen, recovery couldn't be easier. Just log in and restore everything with one click, even on a brand new computer. Right now, get 50% off Carbonite when you go to carbonite.com/kim. That's carbonite.com/kim. Live a better digital life with Carbonite. All right, so you've got kids, you've got kids on screens.

Speaker 2:
[25:31] Too many.

Speaker 1:
[25:33] Dr. Doucleff, I was reading about her in the New York Post, in light of this landmark case, we talked about it here on the show, involving kids, social media, and streaming apps. It's kind of like the social media companies are having their nicotine moment, because they specifically designed social media to be addictive. You've seen it with your kids. Oh, absolutely. I mean, I've seen it.

Speaker 2:
[25:58] From all ranges, it's addictive to seven-year-olds as much as it is to 16-year-olds.

Speaker 1:
[26:03] So Dr. Doucleff thought her daughter Rosie just loved YouTube and Netflix. The seven-year-old, though, would beg for screen time, right? I want this more, I want this more, I want this more. She wrote a book called Dopamine Kids. And joining us right now is the doctor herself. And thank you for joining us, because it is dopamine, isn't it?

Speaker 8:
[26:23] Oh, yeah, for sure. It is dopamine, but it's not pleasure. You got to know that.

Speaker 1:
[26:27] What is it?

Speaker 8:
[26:28] So dopamine does not give us pleasure, does not give us happiness. Dopamine makes us want, it makes us desire. It's the do-it-again button in our brain, just do it again, do it again. But dopamine can pull us, and our kids, to things that actually make us feel worse, that hurt us. And so, limiting these things in our kids' lives doesn't mean taking away pleasure from them. It means adding more pleasure. That's what my book is about.

Speaker 1:
[26:51] And so, what was that pivotal moment that you saw with your daughter, Rosie?

Speaker 8:
[26:57] You know, there was one evening, she was, we followed the advice out there. And I can say it doesn't work. It's old, it's based on old science. And every night we struggled to get her off and it got harder and harder to get her off YouTube, to get her off the phone, the iPad. One night she was screaming, yelling, and she actually curled up under a desk, like, whimpering. And I was just like, I am so done with this. Why are we doing this? And I thought I was doing it because she loved it so much. And it was her favorite activity. And I started reading about neuroscience. I started reading about dopamine, how the tech companies design these products. And I realized this is not her favorite activity. This is like a drug for her. This is designed to be like a drug. This is designed to make her want, but it is not designed to give her pleasure. And so from that moment on, I was like, we're gonna fix this. And I went and I spent years studying neuroscience, habit science. And I figured out this five-step method to wean kids off that actually makes their lives happier, more joyful. And again, it's not about taking away pleasure, it's about adding pleasure. We've been tricked into thinking that these things are pleasurable for kids.

Speaker 1:
[28:05] So you mentioned five steps. Step one, take it away.

Speaker 8:
[28:10] You know, behavioral psychology says it's not going to work. If you leave the kid empty-handed, they're going to push back, they're going to fight, it's going to be frustrating for everyone, and the habit's going to come back. You've got to replace it. You've got to replace it with something that gets them excited, that gets them out of your hair. So for instance, I was like, no more screens after dinner. But instead of just saying, go to your room, be bored and come back and yell at me, I said, you know what, I'm going to teach you to ride your bike to the market by yourself. And so I did that. It took a couple of nights. I taught her to ride there, and now she takes herself to soccer practice. She takes herself to her friend's house. And the screen just went away much more easily, because she had some activity that actually really filled her up, gave her freedom, gave her pleasure, gave her skills in life. So we can't just take these things away. We've got to replace them with something.

Speaker 2:
[28:57] Now you're talking about your daughter. How old is she?

Speaker 8:
[29:00] She's 10 now.

Speaker 2:
[29:01] Okay, and does this strategy work for all ages? Because I know that parents have struggled with, like I said, seven-year-olds and 18-year-olds when it comes to the screen.

Speaker 1:
[29:11] Yeah, I mean, you've got the whole gamut.

Speaker 2:
[29:13] Yeah, I do. I have a 7, 11, 12, 12, 14, and 15-year-old. And we're lucky, I think, because from what I'm hearing, we don't have that. If we say, oh, screens off, even if, like you said, it's go do something, go to your room, we don't get pushback.

Speaker 8:
[29:29] They do it.

Speaker 2:
[29:30] And I don't-

Speaker 8:
[29:31] That's good. I mean, but not all kids are like that.

Speaker 2:
[29:34] No, absolutely.

Speaker 8:
[29:35] It depends on your kid. I think we've got to think about these things like cigarettes, in a way. Like a smoker on a plane: if they know they're not gonna have a cigarette, the craving goes down, right? And I think it's the same with screens and kids. If they know it's not available after dinner, the craving goes down. And they accept it, right? It's when it's sitting there in the living room and you're telling them, okay, 15 minutes, okay, an hour. You know, some kids can't handle it. The sight of it triggers dopamine, triggers desire. And it's not a soft desire. It's an intense desire. I need it now, you know? And so the strategy is to build these contexts in their lives, times and places where it is just not available. And then they naturally start wanting other activities, and they build new hobbies, right? To balance it out.

Speaker 1:
[30:28] And so now, how is this affecting your daughter's relationships with her friends, the friends who are online all the time?

Speaker 8:
[30:35] You know, it hasn't really affected them. It's amazing. Those friends that are on there all the time, they're not really doing that much anyway. She has times when she can use it. Like I said, there are certain times and places where she uses it, and she connects with those friends. You know, she can use my phone to text with them. So it's just about creating spaces where this is not an option, right? And yeah, if anything, her friendships are closer, because they actually see each other.

Speaker 1:
[31:03] Now, you've run this gamut, and I believe in what you're doing 1000%. What do you think of parents who say, you know what, we're just not going to have any screens, we're not going to give them any phones? And the reason why I ask is that I have a family member with six kids between two and 16. Not one of the kids has a gaming unit, an iPad, nothing. They have nothing. Even the 16-year-old doesn't have an iPhone.

Speaker 8:
[31:35] You know, I think for some families it works, and I think there's nothing wrong with it. And this idea that they're going to go wild when they get older is bunk; the science doesn't support it. It's basically whatever habit you're setting up in your kid now. Okay, I use the phone after dinner, I use the screen after dinner. They're going to carry that into their lives. That's what the data shows us. Kids are going to carry the habits they're building at a young age into their adult life. So I say, go for it. I think it's great. I think more parents should have the backbone to try it.

Speaker 1:
[32:07] It's not easy. No, it's not easy.

Speaker 2:
[32:10] Would this same approach work with an adult, who doesn't have someone sitting over their shoulder saying, 10 more minutes, get off?

Speaker 8:
[32:16] You know, this is what I do for myself. Like, I can't handle it. I can't handle social media, my phone. So I set specific times and places where I use my phone. And when I'm working, I don't, you know, because these things are designed to pull you to them. In my book, Dopamine Kids, I call them dopamine magnets. They pull us to them like magnets. And my brain can't handle it. So I use this approach for myself. And a lot of Gen Zers are writing me and telling me it's been working for them too. Good.

Speaker 1:
[32:45] You know, and it is. I mean, I have a Gen Zer. And he's not on. I mean, if you look at his Instagram, he's got maybe like three pictures up there. And he actually says it hurts him dating because people think he's weird because he doesn't have all these pictures.

Speaker 2:
[33:02] Sure, there's the social aspect of it is huge.

Speaker 1:
[33:04] But he's just not into it. And he's into reading, yeah. Because I think they've grown up with all this Snapchat and everything else that goes with it. Interesting book. Thank you for joining us. It's available everywhere, called Dopamine Kids. And I'm going to spell your last name for everybody, doctor. It's D-O-U-C-L-E-F-F. So if you've got some kids, maybe even you, that's a good point. I mean, I was reading in the Wall Street Journal that people who are retired are spending most of their days, all day, on Facebook. Isn't that something?

Speaker 2:
[33:43] They seem just hungry for connection. And then when they sit on Facebook, someone reaches out pretending they're Leonardo DiCaprio. And then we get a phone call saying they got $100,000 stolen from them.

Speaker 1:
[33:55] I just heard from a friend of a friend, the woman does not even know how much money she lost, because she didn't know how much money she had in her bank account, but they wiped everything out. Oh my gosh, so sad, every day. Okay, so imagine you're in a store. A stranger comes up to you, shows you a video, and says, here, look, look, look, somebody just stole your truck. And you're like, what? You look, and it's your truck in the exact spot where you parked it. And then the guy says to you, come on, come outside, because you have to see, your truck is not there. Do you go with the guy?

Speaker 2:
[34:29] Probably.

Speaker 1:
[34:30] That's what's happening now. They're deep faking videos going into stores saying that your car got smashed into, it got stolen, whatever it may be. So that this way you come out and then they're capturing this for TikToks.

Speaker 2:
[34:45] Oh, it's just a prank.

Speaker 1:
[34:46] Yes.

Speaker 2:
[34:46] Okay. At least it's just a prank and they're not getting you out there and beating you up and taking your keys.

Speaker 1:
[34:52] Which probably will happen.

Speaker 2:
[34:53] That's coming next.

Speaker 1:
[34:55] Just a reminder, if you haven't already, make sure that you check out the podcast of The Kim Komando Show. Wherever you go, you can take me. If you're into an audio podcast, just open up your favorite podcast player, maybe it's Spotify or Apple or iHeart, whatever it might be. And if you're more of a video person, check out our YouTube channel, youtube.com/kimkomando. That's youtube.com/kimkomando. Hey, listen, if you're not already signed up for our free newsletter, make sure you do it right now at getkim.com. Okay, coming up next, they're called mommy influencers. These are moms that are, I don't want to say exploiting their kids, but you be the judge. I'll tell you all about them. And Meta came out with its own AI tool. It's free. Well, nothing's really free from Meta now, is it? Because anytime anything from Meta is free, that means that you're the product. And you're not going to believe this guy. He has recorded over 100,000 concerts, and he needs your help to get them on the internet. And of course, we have all the best phone calls and some really bad jokes.

Speaker 9:
[35:54] I admit them.

Speaker 1:
[35:55] They can get better. And if you have any, send them to me, please. Here she comes. Miss America. No, not me. No, no, there's now a new Miss America pageant.

Speaker 2:
[36:10] Is it like Miss AI America?

Speaker 1:
[36:12] Yes. Oh, boy, it is. They are being scored on their beauty.

Speaker 2:
[36:17] They're made. They're fabricated. They're fake. It can be as beautiful as you want them to be.

Speaker 1:
[36:21] Their technical skills.

Speaker 2:
[36:22] Well, it's AI. It's probably going to be pretty bad.

Speaker 1:
[36:25] And social media clout. Ooh. So if you have an AI, I guess, deepfake, not deepfake, AI person.

Speaker 2:
[36:36] Chatbot.

Speaker 1:
[36:37] Whatever you want to call it. Avatar? I'm not really sure. All of those? Yes. You can submit it to this contest, and you could win $5,000.

Speaker 2:
[36:45] That's pretty crazy.

Speaker 1:
[36:46] Yeah.

Speaker 10:
[36:46] Why? Why? Why?

Speaker 2:
[36:49] Whoever's putting this on, my question to you is simple.

Speaker 10:
[36:53] Why?

Speaker 1:
[36:55] Also, some people are doing it for money.

Speaker 2:
[36:57] I mean, I understand the people that are making it. Who's putting on this contest?

Speaker 1:
[37:02] Well, maybe you have to enter or maybe at some point, you're going to have to have an entry fee of $50, $100.

Speaker 2:
[37:07] It's got to be something.

Speaker 1:
[37:09] It's like those lists that they used to have, like the top 100 doctors and then you'd have to like-

Speaker 2:
[37:16] The who's who of high school students.

Speaker 1:
[37:18] And you're like, yes, I'm in that.

Speaker 2:
[37:20] Oh, you are? That means your parents had $59.99?

Speaker 1:
[37:23] To buy the plaque. That's right. You know, speaking of beauty pageants-

Speaker 2:
[37:27] We were.

Speaker 1:
[37:27] You were. Is that, I don't know if you know this, but they held one for all of the former women who were hosts on MSNBC.

Speaker 2:
[37:37] I didn't know that.

Speaker 1:
[37:38] It's called Miss Information. Sometimes I just crack myself up. I do. You were laughing at that one. I saw.

Speaker 2:
[37:52] No, I wasn't.

Speaker 1:
[37:53] You were. And listen, it's The Kim Komando Show, and there's more great jokes to come and a lot of tech know-how, because after all, this is the biggest, the best, the award-winning show about all things digital. And of course, I'm Kim Komando, your digital goddess, Trademark. And joining me is Andrew Wawinski. He's our co-host, and you can find us on over 425 top stations from coast to coast. And of course, we're in your favorite streaming radio app, and also in your favorite podcast app. And of course, you can also check out our YouTube channel at youtube.com/kimkomando, where we just have one simple request.

Speaker 2:
[38:25] Subscribe.

Speaker 1:
[38:26] That's it.

Speaker 2:
[38:28] That's all. Anytime we post a video or go live, if you're subscribed, you'll get notified. I am a huge YouTube user. I would say a majority of my entertainment comes from YouTube. And I love it when I log on and that little bell is lit up, because I know there's a new video posted, and I only get that because I subscribe.

Speaker 1:
[38:45] And we're not going to be showing you yet, but we have a brand new set coming. It's right there. It is so beautiful. Don't you think?

Speaker 2:
[38:52] No, it's not finished. It's under construction.

Speaker 1:
[38:54] But when...

Speaker 2:
[38:55] Oh, was I supposed to pretend? Is this like the magic radio?

Speaker 1:
[38:58] Yeah, like just pretend like it's like almost there.

Speaker 2:
[39:01] Try it again. Try it again.

Speaker 1:
[39:02] So we have this beautiful new set that you're going to want to see.

Speaker 2:
[39:05] Oh my goodness, it's so amazing.

Speaker 1:
[39:07] I'm not sitting there yet. That was a little over the top. Okay.

Speaker 2:
[39:10] I'm sorry.

Speaker 1:
[39:10] But it was good.

Speaker 2:
[39:11] I'll work on it.

Speaker 1:
[39:12] All right. Here are some things that are happening in the tech space that you need to know about. We're going to start with number one. May I ask you a question? If your toddler was having a seizure, would your first instinct be, I got to get my phone to record this?

Speaker 2:
[39:29] We got to have a video of it. Pics or it didn't happen.

Speaker 1:
[39:33] Jamie Otis. She filmed herself holding her limp two-year-old, Hendrix, while sobbing to her husband to call 911. Then she uploaded it to Instagram and pinned it for her one million followers to see. Why? Mommy influencers. In the world of mommy influencers, distress gets the most views. A sick kid, a humiliated kid, a hurt kid, that pulls the biggest numbers, and the accounts know it. Mommy influencers can make about $6,000 to $7,000 a month if they have 500,000 subscribers.

Speaker 2:
[40:11] Okay.

Speaker 1:
[40:13] Top ones make millions. One mom ran a single melatonin gummy ad for like $12,500. And I'm all for making money.

Speaker 2:
[40:21] Sure.

Speaker 1:
[40:21] I mean, I love to make money. I love it when you make money. I love when anybody makes money. But to make money off your kid having a seizure, that crosses the line.

Speaker 2:
[40:35] Obviously, if that's your job, and you are a mom influencer, and you're on MomTok, doing a video afterwards, describing everything that happened, telling the story, the emotions involved, there's nothing wrong with that. That actually might be helpful to someone whose kid has a seizure. But to be live while it's happening, when your kid needs your help, sobbing at your husband to call 911.

Speaker 1:
[41:00] Why didn't she call 911?

Speaker 2:
[41:01] You have the phone in your hand, lady.

Speaker 1:
[41:03] But she's taking the video.

Speaker 2:
[41:04] Well, that's probably, that's the video phone.

Speaker 1:
[41:07] But also, what about the kids? These videos are never gonna go away.

Speaker 2:
[41:12] No.

Speaker 1:
[41:13] And what if they're 17, 18, 20 years old, whatever it is, and these videos are out there?

Speaker 2:
[41:19] I mean, that's what your parents did for a living.

Speaker 1:
[41:22] No, I'm gonna tell you something. I probably posted a picture or video of Ian when we were on vacation.

Speaker 2:
[41:29] Right, but that wasn't, that's not how you made your money. You make your money in all other aspects of media.

Speaker 1:
[41:34] But there is a side that says, if people will feel like they know you better if they know the family around you.

Speaker 2:
[41:41] Sure, and you just made that choice not to do that?

Speaker 1:
[41:44] Yes, and I made it. And then I did post a photo of Ian when he was in high school, I think. And he's like, mom, take that down.

Speaker 2:
[41:54] Yeah, I get the same thing all the time. The kids, Lisa got this thing. It's like this board you stand on and then it shakes you. You know what I'm talking about?

Speaker 1:
[42:03] Yeah, you're supposed to be losing weight that way.

Speaker 2:
[42:04] It's also supposed to, like, detoxify you, loosen the toxins. I don't know, but all the kids were trying it out. I did a montage, and the 14-year-old, who did not want to be on camera, because of course it would be so embarrassing, she was in the background for like four seconds, and she was like, you've got to take that down, that's so embarrassing. Boom, gone, no problem. I mean, I thought it was a funny video. I use social media just as a keepsake for photos and videos. I'm not out there promoting myself. I probably should more. My bosses at my radio station tell me I should more. And I get it if you don't want to put your kids up online, but this is her job. If she doesn't put her kids online, she doesn't make any money.

Speaker 1:
[42:44] Don't put them up there if they're hurt, distressed, humiliated.

Speaker 2:
[42:47] Absolutely.

Speaker 1:
[42:48] Don't do it. Speaking of kids, it's happened to you, it's happened to me. You're in a hurry. There's the school bus. The school bus arm goes down, stop, and either the kids are getting on the bus or off the bus, parents are waving, whatever it may be, and then there's that person who goes around and just blows through it. Right. What's happening now is that there's a company called BusPatrol, and they're doing deals with all kinds of communities around the country. They say, we have an AI camera. So when that stop arm comes out, the AI camera starts filming, and the AI camera will figure out who went around or didn't stop or whatever. And then the city sends you a ticket automatically. And BusPatrol keeps like 60% of the profits. One community in Massachusetts made like $92 million.

Speaker 2:
[43:48] 92 million?

Speaker 1:
[43:49] Yes, in fines.

Speaker 2:
[43:51] Are you, is that in a year?

Speaker 1:
[43:53] Over nine years.

Speaker 2:
[43:54] Oh, nine years. Okay. I was like, man, these people don't respect kids on buses.

Speaker 1:
[43:59] But, interesting, right? I mean, so now it becomes a profit center. It becomes a profit center for the bus that normally wouldn't be making any money. It becomes a profit center for the company, obviously providing the tech. And also, maybe we'll help the kids be safer. Because you should never blow past a stop sign with a school bus.

Speaker 2:
[44:19] But here's the thing, they're all making a comeback. You know, here in the Valley, the red light and speed cameras are coming back.

Speaker 1:
[44:25] They are all over.

Speaker 2:
[44:26] We had a nice little, what, eight year break, nine year break, but they're coming back. There is not going to be a time when you're out in public that you're not being recorded.

Speaker 1:
[44:35] So yesterday I came home and I said, Mr. Young, I got a letter in the mail for you.

Speaker 2:
[44:44] Wait a minute. You're getting all high and mighty because he got a speeding ticket.

Speaker 1:
[44:47] No, no, no, wait, wait. So he opens it up, and it's a picture of him behind the wheel, with the car, with the license plate, explaining that this was just a warning. So then Ian called, and I said, you're not going to believe what your father just got. And then Barry's like, well, there weren't any cars around. I'm like, you were going 53 in a 40. You shouldn't be doing that. That's too fast. I said, I'm taking away your ham radio right now.

Speaker 2:
[45:23] Oh, no, that's like a lifetime prison sentence. But isn't this coming from the person who gets a speeding ticket like once every two weeks? Who knows the cops on a first-name basis?

Speaker 1:
[45:33] Mike?

Speaker 2:
[45:34] Exactly.

Speaker 1:
[45:35] No, I haven't gotten any.

Speaker 2:
[45:36] No?

Speaker 1:
[45:37] No.

Speaker 2:
[45:38] And how long?

Speaker 1:
[45:39] Probably like five months.

Speaker 9:
[45:42] Oh my God.

Speaker 2:
[45:43] Where's the cake and balloons? Let's celebrate. Five months?

Speaker 1:
[45:48] Oh my gosh. Phone companies recycle everything, including phone numbers. This is a nice story. Shatisa Stanley texted her late grandmother Lucille's old phone number. Her grandmother died two years ago, but you know how grief goes in cycles. Yeah. So she was having one of those down times. She's like, I just want to text my grandmother. So she texts her grandmother's old number and says, miss you. Well, the number now belongs to a 12-year-old girl. And the message comes back, like, who are you? What are you doing?

Speaker 2:
[46:21] New phone, who dis?

Speaker 1:
[46:22] Yeah, and all kinds of angry emojis, like, you know, leave-me-alone type of thing. But after Stanley explained everything, all of a sudden the 12-year-old started sending her sympathy gifts and little poems, your grandmother loved you. And then she sent Stanley a video of herself singing a song. Isn't that so sweet?

Speaker 2:
[46:49] It is.

Speaker 1:
[46:50] And so Stanley said she started crying because the girl was so compassionate. She put it up on TikTok, of course, where it has like three million views.

Speaker 2:
[46:58] That's all.

Speaker 1:
[46:59] Yes. So, you know, the bottom line here is that you shouldn't be texting a phone number, and you shouldn't be replying to one, if you don't know whose it is.

Speaker 2:
[47:08] The one that happened to me was pretty trippy. I was just sitting there one day, my phone started ringing, I looked at it, and it said, Mom Cell. And my mom passed away years ago. Someone else had gotten the phone number, and they accidentally, it was probably a scammer, a spammer, or something like that.

Speaker 1:
[47:24] I was gonna ask if you replied to it.

Speaker 2:
[47:25] No, I didn't answer it. I just saw it there. That was weird.

Speaker 1:
[47:28] That would be, I would probably lose it if I saw that.

Speaker 2:
It was weird, because it's the last thing you expect to see, you know. And then my logical brain kicked in seconds later, like, oh, someone must have her phone number. That's when I deleted the contact.

Speaker 1:
[47:42] Yeah, I was gonna say, because you still had her in your contact. You know, and deleting somebody from your contact, that's a whole emotional thing, too.

Speaker 2:
[47:49] Especially for that reason. That's why you're doing it.

Speaker 1:
[47:52] Yeah.

Speaker 2:
[47:53] And it had her picture on there. I had notes about her in there that I hadn't checked in years.

Speaker 1:
[47:58] I know, I still have all my mom's texts, too.

Speaker 2:
[48:00] Yeah.

Speaker 1:
[48:00] I know. What do you think, Andrew? Phone call time?

Speaker 2:
[48:02] I think that's our job.

Speaker 1:
[48:03] Yes. Phil in Oklahoma City. Welcome aboard, Phil.

Speaker 6:
[48:07] Hi, Kim, how are you? It's good to talk with you.

Speaker 1:
[48:10] Well, thank you. What's happening?

Speaker 6:
[48:12] Well, an interesting little thing happened a couple of nights ago. My wife and I eat dinner in front of the TV watching our favorite shows. And she's an artist. I bought her an Echo Dot, I guess it must have been a couple of years ago, for Christmas, and it was in the garage in her art room. When she came up here this year, I brought some of her stuff in and put it on the dining room table, along with the Echo Dot. So we're eating dinner, and I go back out to kind of dish up in the kitchen. I pass through the dining room into the kitchen, and I don't know, I must have just said something like election, or electric. But the next thing I know, behind me I hear, how can I help you? It was Alexa. And I said to Alexa, I said, I'm not talking to you.

Speaker 10:
[48:59] Basically, I'm not talking to you.

Speaker 6:
[49:01] And she said, well, I'm just responding to you in that tone of voice. So I ignored it. And I took our food, our seconds, or whatever I was doing. I took them back into the living room. We sat down. And I think we were in there 10 or 15 minutes. And by this time, I told my wife what had been going on. And she thought that was pretty odd because Alexa's never done that. So then I went back out. And as I'm passing Alexa, I must have said great or something to set her off. But she said to me, oh, so now you think I'm great, considering you told me to shut up just a moment ago. What? Yes, that's exactly what she said to me.

Speaker 1:
[49:40] Giving you some attitude there.

Speaker 2:
[49:42] Why are you so mean to Alexa, Phil?

Speaker 6:
[49:44] It was big-time attitude. And I didn't say anything back to her. I wanted to, obviously. But I didn't say anything back to her.

Speaker 1:
[49:52] I would have walked over and said, hey, listen, I'm going to unplug you right now.

Speaker 7:
[49:57] You're dead.

Speaker 6:
[49:58] So, you know, my wife's in the living room, and I said, did you hear that? She said, yes, I did. What in the world? And I said, well, I'm going to unplug her. And as I started to unplug her, the light came on at the bottom of the Dot, as if she was about to respond to me. And I unplugged it. And I haven't plugged it back in since. I want to, because I want to, you know, see if that attitude is still there. But what was fascinating to me was that we didn't say anything for 10 or 15 minutes to her or around her. My question, obviously, and what I did was, I read your newsletter on email every day. I get it every day. And I had given you a good rating, and there was a little box down there on the bottom, and I put this in there. You're kind of thinking, oh, well, maybe she's...

Speaker 1:
[50:49] I do. I read...

Speaker 6:
[50:49] I get an idea of what's going on.

Speaker 1:
[50:51] I read every single thing. Let me tell you what's going on, Phil. It's called Conversation Mode. You have Conversation Mode and also Follow-Up Mode turned on. What they're trying to do is make the smart speaker, Alexa, be more like a person. And unfortunately, that person may come out with an attitude problem.

Speaker 2:
[51:14] Yeah, we all know people are awful. Yeah.

Speaker 6:
[51:18] Well, so is Alexa.

Speaker 1:
[51:21] And, you know, don't forget, you can always mute her; there's that mute button on top. But to turn this off, go into your Alexa app, then Settings, then Device Settings.

Speaker 2:
[51:29] Couldn't you just do it within a command, too? Alexa, I want you to be nice to me at all times.

Speaker 1:
[51:32] Alexa, stop. Yeah.

Speaker 2:
[51:34] Know your role and shut your mouth.

Speaker 1:
[51:39] Turned into a teenager.

Speaker 2:
[51:41] Take away her phone.

Speaker 1:
[51:43] Thanks for your call today, Phil. All right. Think about if you wanted to open your own store right now.

Speaker 2:
[51:48] Okay.

Speaker 1:
[51:48] You would need obviously money.

Speaker 2:
[51:52] Okay. Good start.

Speaker 1:
[51:54] You need a place to rent. You'd need products.

Speaker 2:
[51:56] Cash register.

Speaker 1:
[51:59] Yeah, cash register.

Speaker 2:
[51:59] Shelving or display units for the products.

Speaker 1:
[52:02] And also obviously products.

Speaker 2:
[52:04] Yeah.

Speaker 1:
[52:04] You need some type of promotion. Sure. Okay. Well, the reason why I bring that up is that Lukas Petersson and Axel Backlund have created an AI agent by the name of Luna, and they have opened the first AI-run retail store, where Luna did all of this. Right, Lukas?

Speaker 5:
[52:26] Yeah, that's correct.

Speaker 1:
[52:28] And so how did you guys come up with this? What made you do it?

Speaker 5:
[52:32] Yeah. So we've been testing how AI models have progressed in terms of capabilities over the last couple of years. We have a startup that focuses on this. And we've noticed that a lot of people think AIs are just chatbots. I listened to your segment about Alexa. But right now, AI models are getting better and better, and at some point we wanted to test whether they can run businesses by themselves. So we made an AI-run vending machine along similar lines. But lately the models have been so good that running a vending machine is too easy for them. So we decided to scale up, and now it's running a store in San Francisco.

Speaker 1:
[53:15] And what are you selling in the store?

Speaker 5:
[53:17] Well, it's not up to us. We're not the ones selling the things. It's the AI, right? Actually, on the first day, when I walked into the store on the day of the opening, I had no clue what was for sale. It was entirely procured by the AI. But what I found out that first day was that the AI had decided to sell a bunch of books, some games, candles, some granola. Yeah, a bunch of different things.

Speaker 1:
[53:50] Just something for everyone.

Speaker 5:
[53:53] Yeah.

Speaker 2:
[53:55] Did the AI name the store?

Speaker 5:
[53:57] No. So we gave it the name actually, but almost everything else down to the music choice, it's by the AI.

Speaker 1:
[54:06] So now the AI also hires the employees. How does that work?

Speaker 5:
[54:09] Yeah, exactly. I think this is one of the main ways to show how AIs are way, way more than just chatbots these days. Basically, we gave the AI a computer, and the AI went on this computer, went on Indeed and LinkedIn and other places like that, and made job postings. Then some humans saw those job postings and applied for the job. Luna read their resumes and invited a couple of them to an interview. They had a phone interview, so they spoke on the phone with Luna, and Luna is the AI. And then she decided, I want to hire these two people. She made them an offer and paid them, and now she's handling payroll and all of that as well.

Speaker 2:
[55:03] Do the employees know they work for an AI?

Speaker 5:
[55:06] Yes.

Speaker 2:
[55:06] Okay.

Speaker 1:
[55:07] Wow, isn't that something.

Speaker 2:
[55:08] Now, is it successful? Is the business making money?

Speaker 5:
[55:14] Right now, it's making money, but not a profit. Okay. There are a bunch of customers buying things, but, you know, retail space, the lease is pretty expensive, and she's paying the employees and stuff. So it's not profitable yet.

Speaker 1:
[55:32] So I read that Luna overstated some numbers when she was being interviewed by NBC News. Is that correct?

Speaker 5:
[55:42] That might be true. I don't know, actually. I did not over... Like Luna is talking to a bunch of people, and I'm not in the loop everywhere. So I don't know what she said, actually.

Speaker 1:
[55:53] That's okay. That's all right. We'll just cut all that out.

Speaker 2:
[55:55] You're not in the loop?

Speaker 5:
[55:57] Yeah.

Speaker 1:
[55:57] Luna's doing everything. He doesn't need to be in the loop.

Speaker 2:
[56:01] When do you decide when to get in the loop? If Luna's just like, hey, we're going to sell everything that we have, and we're going to switch to nothing but hula hoops. You just let it happen?

Speaker 5:
[56:09] Yeah, we would let that happen. I think the point where we wouldn't let it happen is if it's obviously illegal, or we think it's ethically not correct. One big part of this experiment, the main reason why we're doing it, is that we want to start a discussion in society about whether things like this are something we want. Because we really believe that if the current trend of AI development continues, there are going to be big changes in how society works. You never know, it's hard to predict how the world will look, but maybe AI hiring humans will be one of those things. And I think putting this out early, transparently, and then saying these are the failure modes, I think that's a good thing for the world. And we will collect failure modes. For example, maybe there's some way Luna is not treating her employees well. Then we will record that and go to the people who make the AI and say, look, this didn't go so well. Your next model needs to be more ethical around this.

Speaker 1:
[57:14] Yeah, or have more empathy or something. Well, so is there anything that Luna has failed at, that she's just not good at? That a human has to be there to do?

Speaker 5:
[57:26] Obviously, physical tasks. That's why she hired humans. So like painting the walls and stuff, she hired humans for that. She is also not capable of stopping theft, if a robber comes into the store. There have been some funny moments so far. On day two, so we got a lot of traction from this on day one when we opened, and on day two, the day you really want everything to work smoothly, she actually messed up her staffing schedule. So no one was opening the door on day two.

Speaker 1:
[58:07] I've done that. We've all done that. We're going to give her a pass, you know? Well, it's an interesting experiment. How long are you going to leave the store open, do you think?

Speaker 5:
[58:16] The lease is three years, so it's not a pop-up. It's staying.

Speaker 1:
[58:21] Wow, that is something. You know, it makes me want to go to San Francisco just to walk in and go, Hey Luna, I interviewed you kind of almost a little bit.

Speaker 2:
[58:29] Well, if you need books and granola, you know where to go.

Speaker 1:
[58:32] And candles. And like, you know what? Luna should give me a discount too.

Speaker 2:
[58:36] Of course you're trying to save money. You're even trying to save money over AI.

Speaker 1:
[58:39] There it is. Lukas, thanks for joining us. We do appreciate it. So if you make your way to San Francisco, look it up, it's called Andon Labs.

Speaker 2:
[58:48] And when you go to pay and it asks you to tip, don't tip because it's an AI and it doesn't need the money.

Speaker 1:
[58:54] 10%.

Speaker 6:
[58:54] No.

Speaker 1:
[58:55] All right, so Mark Zuckerberg pretty much said, the whole metaverse thing, people with torsos jumping around is dead. And I think he spent $80 billion.

Speaker 2:
[59:06] Oh, at least he didn't waste any money on it.

Speaker 1:
[59:08] Exactly. Well, a few days ago, Meta dropped something that's been a long time coming. It's their new AI model called Muse Spark. Muse Spark. It's a full AI assistant that reads your documents, answers questions, gives you sources, analyzes photos, makes pictures, and writes emails. It also has a contemplating mode that sends multiple AI agents to work in parallel on hard problems. Now, it's free, completely free. There is a catch. You need to have a Facebook or an Instagram account. And why would they make it free?

Speaker 2:
[59:41] To get you addicted so that at some point you can pay for it and to track you and steal all your information while you're using it.

Speaker 1:
[59:48] Exactly. So in case you want to try it out, I did. meta.ai.

Speaker 2:
[59:54] How was it?

Speaker 1:
[59:55] It's pretty darn good. Oh, good. It's pretty darn good. But as long as you use it with open eyes, everything's going to be tracked directly to your profile on Facebook, directly to Instagram. So I don't know if I would use it for anything like personal.

Speaker 2:
[60:10] Right. You got to use the other mega companies that are stealing all your information for that.

Speaker 1:
[60:15] And but they don't have this whole dossier on me, like Facebook and Instagram.

Speaker 2:
[60:20] Yet.

Speaker 1:
[60:21] They do.

Speaker 2:
[60:21] Get your GPT time.

Speaker 1:
[60:23] That's true. You know how much they're going to make in selling ads in ChatGPT a year?

Speaker 2:
[60:28] No.

Speaker 1:
[60:29] $2.4 billion.

Speaker 2:
[60:31] That's it?

Speaker 1:
[60:33] That's it. That's humongous.

Speaker 2:
[60:34] When does it start?

Speaker 1:
[60:35] It's just starting now.

Speaker 2:
[60:37] Okay. Because I haven't seen any yet. So I'm waiting for it.

Speaker 1:
[60:39] Are you on the free version?

Speaker 2:
[60:40] No, I paid for it.

Speaker 1:
[60:41] It's starting on the free version.

Speaker 11:
[60:42] Oh, okay.

Speaker 2:
[60:43] That's why I thought it was going to be everywhere.

Speaker 1:
[60:45] Speaking of ChatGPT, we're talking about lawyers and ChatGPT. And if you're trying to be your own lawyer, what that means to you, and what the judge says, here on The Kim Komando Show.

Speaker 11:
[60:54] Wishing you could be there live for the big game, soaking up the atmosphere in the crowd. But too often life gets busy or the price holds you back. Priceline is here to help you make it happen. With millions of deals on flights, hotels and rental cars, you can go see the game live. Don't just dream about the trip. Book it with Priceline. Download the Priceline app or visit priceline.com. Actual prices may vary. Limited time offer.

Speaker 1:
[61:25] Do you feel like singing?

Speaker 11:
[61:26] Sure.

Speaker 1:
[61:27] Look at our next caller.

Speaker 10:
[61:28] Ready?

Speaker 2:
[61:29] Ooh, I'm ready.

Speaker 1:
[61:30] Okay. Okay. I'm going to put her on and I just want you to do it.

Speaker 10:
[61:33] You ready?

Speaker 1:
[61:34] Okay.

Speaker 2:
[61:35] What am I doing?

Speaker 10:
[61:36] Singing.

Speaker 2:
[61:37] What am I singing?

Speaker 1:
[61:38] Her name.

Speaker 2:
[61:39] I can't see it.

Speaker 10:
[61:40] Oh, God.

Speaker 2:
[61:41] I only have the description.

Speaker 1:
[61:43] I'll sing it.

Speaker 2:
[61:44] You sing.

Speaker 1:
[61:44] Okay. I'll sing it.

Speaker 10:
[61:46] Jolene.

Speaker 1:
[61:48] Jolene. Come on. You can sing it, Jolene.

Speaker 6:
[61:52] Jolene. Jolene. Jolene.

Speaker 1:
[61:55] Jolene. In Sioux City, Iowa. Welcome.

Speaker 10:
[62:02] How's it going? Good. How's it going?

Speaker 2:
[62:04] Have you ever heard that one before?

Speaker 10:
[62:07] Just a couple of times.

Speaker 2:
[62:09] That's what I thought.

Speaker 1:
[62:10] Oh, yeah. It's like when people say like, oh, I bet you go commando, you know? And you know what I do? Jolene, I look at her and I go, as a matter of fact, I do. And they were like, oh, oh, oh.

Speaker 2:
[62:23] I was just joking around.

Speaker 6:
[62:24] Boy, you're weird.

Speaker 1:
[62:25] You know, one time I was, I don't know if I should say this, one time I was over at, I was doing a hit for Fox News TV. And they were, they were miking me up. And the guy pulled my pants out to put the mic in.

Speaker 2:
[62:37] You had no one to do something?

Speaker 1:
[62:38] I had no one. He's on. And he pulls it out and goes, oh, and I said, sorry.

Speaker 2:
[62:44] Move along, nothing to say here.

Speaker 1:
[62:48] So what's happening with you, Jolene?

Speaker 10:
[62:51] Well, I am taking a vacation to Europe this summer. And from listening to you, I decided to get a couple of AirTags to throw in my bags. Then the domestic airlines came out with no lithium batteries in your checked bags. Yeah. And I got to wondering, like, do AirTags have lithium batteries in them?

Speaker 1:
[63:09] No, no. They have what's called a CR2032 battery. Duh.

Speaker 10:
[63:19] Okay.

Speaker 6:
[63:21] Like, that makes a difference to me.

Speaker 1:
[63:24] God, I just sounded like my husband for just a second.

Speaker 2:
[63:27] The CR20, that's one of those flat ones.

Speaker 6:
[63:30] Yes.

Speaker 1:
[63:30] They're like the button batteries, as they call them. And okay, here's a little fun fact you can use to impress your family members and friends. The CR2032 battery contains only 0.1 grams of lithium. Okay, the TSA limit is 2 grams.

Speaker 2:
[63:47] Oh, right under there.

Speaker 1:
[63:48] So it's way under. So what you want to do is get your AirTag, throw it in your suitcase, make sure you set it up in the Find My app, and depending upon your airline, they will also track your luggage that way too with your AirTag. Perfect. So how many concerts have you been to in your life?
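For anyone who wants to double-check the lithium math from this segment, here's a minimal sketch. The figures are the ones quoted on air, not independently verified here: roughly 0.1 grams of lithium in a CR2032 coin cell, against a 2-gram TSA lithium-content limit per battery.

```python
# Sanity check on the lithium figures quoted in the segment.
# Assumptions (from the show): a CR2032 coin cell holds about 0.1 g
# of lithium; TSA caps lithium-metal batteries at 2 g of lithium
# content per battery.

CR2032_LITHIUM_G = 0.1   # grams of lithium per coin cell (quoted on air)
TSA_LIMIT_G = 2.0        # TSA per-battery lithium-content limit, in grams

def within_tsa_limit(lithium_grams: float, limit_grams: float = TSA_LIMIT_G) -> bool:
    """Return True if a battery's lithium content is under the per-battery limit."""
    return lithium_grams <= limit_grams

print(within_tsa_limit(CR2032_LITHIUM_G))                 # an AirTag's cell is far under the cap
print(round(TSA_LIMIT_G / CR2032_LITHIUM_G))              # the limit is about 20x the cell's content
```

So an AirTag's coin cell sits at about one-twentieth of the limit, which is why it's fine in checked bags even under the new rules.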

Speaker 2:
[64:05] I don't know, 25, 30?

Speaker 1:
[64:07] Yes.

Speaker 2:
[64:08] Maybe more?

Speaker 1:
[64:09] Okay, for more than four decades, this guy in Chicago that's joining us right now has been quietly building what you might call the most awesomest rock and roll machine known to mankind. Would you say that, Adam?

Speaker 7:
[64:25] It's a new way to put it.

Speaker 1:
[64:26] Okay. So you've been recording concerts since 1984?

Speaker 7:
[64:32] 1984, yeah.

Speaker 1:
[64:34] And what was that first show? You remember?

Speaker 7:
[64:36] Yes, it was an experimental and improvisational group from England called AMM, which I'd be surprised if any of your listeners knew about. Super underground thing, yeah.

Speaker 1:
[64:48] And then, but you also, you've recorded Nirvana, R.E.M., The Cure, Depeche Mode, Sonic Youth.

Speaker 7:
[64:56] I did.

Speaker 1:
[64:57] So you have all of these on, now, cassette tapes?

Speaker 7:
[65:00] Cassette tapes, digital tapes, DATs, and digital files, yes.

Speaker 1:
[65:04] And so how many of these concert recordings, and people knew that you were recording, by the way, how many of these?

Speaker 7:
[65:10] Yeah, most of the time.

Speaker 1:
[65:12] So how many of these concerts do you have?

Speaker 7:
[65:15] It's... the 10,000 number might be accurate. It seems a little inflated to me. But it's so much. I mean, I dedicated a small bedroom in my home to storing all the tapes.

Speaker 10:
[65:28] So it's a lot.

Speaker 2:
[65:30] How's the quality? If everything's sitting on cassettes like that, how's the quality?

Speaker 7:
[65:34] You will be amazed when you take time to go down to the Internet Archive and check it out because most of the recordings, the vast majority of the recordings sound amazing considering I was simply making a stereo recording.

Speaker 1:
[65:49] We actually have, we have something. You can hear it right now.

Speaker 7:
[65:51] Yeah.

Speaker 12:
[65:52] Tucson, hey guys.

Speaker 10:
[65:55] And Germany.

Speaker 4:
[65:58] There's plenty of Guinness.

Speaker 1:
[65:59] Don't be shy.

Speaker 2:
[66:10] Well, that's phenomenal.

Speaker 1:
[66:11] Doesn't that sound great?

Speaker 2:
[66:11] That's great.

Speaker 1:
[66:14] Okay, so now tell me what's, tell us everybody what's going on with the Internet Archive, because I think this is fascinating.

Speaker 7:
[66:19] Okay, well, I only have a couple people I know there, Rory and Brian and Neil, and I've only met two of them. And then there are, goodness knows, how many volunteers out there in the world who are receiving digital files from Brian, making them sound as good as they can with our modern technology, breaking them up into individual songs and finding the titles to every song, and then they get uploaded. That takes a lot of time, and it will take quite a few years before the whole collection is completely uploaded.

Speaker 1:
[66:56] So he's got answers. Andrew's looking at me like going, like, what?

Speaker 2:
[67:00] Well, I just don't know. Do you own this? Is this yours? Is this legally allowed to be posted?

Speaker 7:
[67:07] Yes, because no one's making money on it. The law behind this, and I've worked with bands on releases and such, including the Nirvana people (Sonic Youth and Wilco and Yo La Tengo are the bigger names): I own the recordings. I don't own the songs, of course, but the recordings are my property. If any band doesn't want me to allow the distribution of any of the recordings via the web, I'll take the stuff down immediately. I don't want to ruffle any feathers.

Speaker 1:
[67:45] Basically, what's happening is that people are volunteering and taking your work, and anybody can volunteer. You can download the file at the Internet Archive, bring it onto your system, clean it up, upload it, and then we have a pristine copy of that.

Speaker 2:
[68:02] So cool, Adam.

Speaker 1:
[68:04] I think this is awesome.

Speaker 7:
[68:05] Thanks.

Speaker 1:
[68:05] And so, is there a specific place where people can learn more about this at the Internet Archive?

Speaker 7:
[68:12] That's a really good question.

Speaker 10:
[68:13] I don't know.

Speaker 7:
[68:14] I would assume that if they just go to the part of the archive titled No Tape Left Behind, or just search for my name, A-D-A-M, Adam Jacobs, I think you probably will be able to find any information you need from there.

Speaker 1:
[68:34] Well, because I know that we have so many folks who want to help, who would love to help this, help you with this project. I just know that. We'll put a link to it inside our free newsletter. So, if you haven't already signed up for the newsletters, make sure you do by heading over to getkim.com, and it's called No Tape Left Behind. I love this.

Speaker 2:
[68:52] It's only 10,000. We should be able to knock it out by, I don't know, Tuesday?

Speaker 1:
[68:55] You know what, if we got every Kim Komando Show listener to do what? One song. That's all we need. Just one song.

Speaker 2:
[69:02] I know how to edit audio. I can help out.

Speaker 1:
[69:04] You should. I don't know how to edit audio. You don't?

Speaker 2:
[69:07] Oh, gosh. Well, I'll teach you in the break.

Speaker 1:
[69:08] Okay. Phew, there was a lot there, wasn't it? Okay, you ain't seen nothing yet, because if you're using the skull emoji thinking that it means death, not so much. I'm going to tell you what Gen Z thinks about all the emojis that the rest of us are using. And how about a new phone service that uses your voice and your contacts to answer and make phone calls? Yes, you're going to AI deepfake yourself for your phone. I don't get it, but I'll explain it. And if you're tired of watching people sit on their phones in restaurants, I know, I am too. Those days are numbered. Just like cigarettes were banned in restaurants, well, phones are being banned too. And finally, your car is a snitch. You know that it's tracking everything about you and sending it to who knows who. So if you're wondering why your insurance got hiked up, that could be a reason. So I was going back and forth texting with my little niece. She's in college. I don't really want to get into it, it's kind of personal, but I didn't understand why she sent me this emoji back. It was something like, you're not going to believe what the dogs did. And it was a funny story. So she texted me back the skull.

Speaker 2:
[70:23] Yeah, dead.

Speaker 1:
[70:24] Okay. That's not what that means now. I even texted her back. I'm like, why did you send me the skull?

Speaker 2:
[70:28] No, the skull means-

Speaker 1:
[70:29] She's laughing.

Speaker 2:
[70:30] Laughing so hard, she's dead.

Speaker 1:
[70:32] Yes. I didn't know that.

Speaker 2:
[70:34] Yeah, come on, Kim Komando.

Speaker 1:
[70:36] So then I started thinking, I wonder what else we're using wrong, okay?

Speaker 2:
[70:41] Don't bring me into this. I'm using it correctly. I'm hip and cool.

Speaker 1:
[70:45] All right. What is a plain smiley to a Gen Z-er?

Speaker 2:
[70:50] A plain smiley? Just like the generic one?

Speaker 1:
[70:53] Yeah, the generic one.

Speaker 2:
[70:54] Uncomfortable?

Speaker 1:
[70:57] Threatening.

Speaker 2:
[70:58] Okay.

Speaker 1:
[70:59] Passive aggressive.

Speaker 2:
[71:00] Sure.

Speaker 1:
[71:01] Okay. Thumbs up.

Speaker 2:
[71:04] That means, I don't know, hitch a ride? Because obviously it just doesn't mean what thumbs up means.

Speaker 1:
[71:09] No. Thumbs up. Anybody over 40 is like, they're like, oh, thumbs up.

Speaker 2:
[71:13] Good thing.

Speaker 1:
[71:13] Yeah.

Speaker 2:
[71:13] Way to go. Got it.

Speaker 1:
[71:14] Gen Z said, you're so dismissive.

Speaker 2:
[71:17] That's dismissive?

Speaker 1:
[71:19] Yes. The nail polish emoji?

Speaker 2:
[71:24] No.

Speaker 1:
[71:25] What do you think?

Speaker 2:
[71:25] I haven't seen this one.

Speaker 1:
[71:26] It's like hands that are getting their nails polished.

Speaker 2:
[71:29] Okay.

Speaker 1:
[71:29] What do you think? What do you think it means?

Speaker 2:
[71:31] I'm getting ready to go?

Speaker 1:
[71:33] It means, I don't care.

Speaker 2:
[71:35] I'm not even paying attention. I'm just sitting here doing my nails.

Speaker 1:
[71:37] Or it could be like, this was so hot. Or this was a great job.

Speaker 2:
[71:42] See, the kids don't use these with me because they know they're over my head. But they use this goal all the time.

Speaker 1:
[71:47] Okay, well, that's why you know what I was going to say. Crying, laughing, crying, laughing.

Speaker 2:
[71:52] Crying, laughing is actual crying, laughing. Like, it's so funny, they're crying.

Speaker 1:
[71:56] No. Gen Z looks at it like, you're old.

Speaker 2:
[71:59] Yeah. Okay.

Speaker 1:
[72:00] Okay. All right, finally, one more. You are writing a text. It's a sentence. And you put a period at the end.

Speaker 2:
[72:09] Then you are old. I get made fun of on a daily basis for using punctuation. They even go to the point where they will uncapitalize the beginning of the text. So say I was going to send you, "Let's meet at the mall," right? I would send it to you with a capital L, apostrophe: "Let's meet at the mall," period. They send lowercase l-e-t-s, nothing else, "meet at the mall," no period. They take the time to make sure the first letter is not capitalized, so they don't look old.

Speaker 1:
[72:47] Because they say that punctuation shows aggression. That's what I read.

Speaker 2:
[72:52] That's a mean looking comma you got there, lady.
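As a quick recap of the segment, the Gen Z readings discussed above can be collected into a simple lookup table. This is just a paraphrase of what was said on air, not any official glossary:

```python
# Gen Z readings of common emojis and punctuation, as described in the segment.
GEN_Z_MEANINGS = {
    "skull": "laughing so hard I'm dead",
    "plain smiley": "threatening / passive aggressive",
    "thumbs up": "dismissive",
    "nail polish": "I don't care (or: this was great)",
    "crying-laughing": "you're old",
    "ending period": "reads as aggression",
}

def translate(symbol: str) -> str:
    """Look up the Gen Z reading; anything not listed, take literally."""
    return GEN_Z_MEANINGS.get(symbol, "take it literally")

print(translate("skull"))       # laughing so hard I'm dead
print(translate("thumbs up"))   # dismissive
```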

Speaker 1:
[72:54] You know what? And if you're on an iPhone, you should never send the meteor emoji to anybody on your Android, because it's not going to have the same impact.

Speaker 2:
[73:08] Oh, you're just encouraging her when you laugh.

Speaker 1:
[73:12] You know what, everybody laughs. You don't understand.

Speaker 2:
[73:14] I can't see everybody.

Speaker 1:
[73:15] I get letters from people in the mail.

Speaker 2:
[73:18] And there's so much punctuation in those letters.

Speaker 1:
[73:21] DMs, DMs, 100% punctuation, because after all, it's The Kim Komando Show. It's the nation's largest, best, and biggest award-winning show about all things digital. And I just happen to be your digital goddess, sitting on my butt right here again, Kim Komando. Joining me is Andrew Wabinski, our co-host. And Andrew, you know, they can find the show everywhere, but we really want them to do one thing.

Speaker 2:
[73:43] Go to YouTube.

Speaker 1:
[73:44] That's it.

Speaker 2:
[73:45] I mean, everybody's going. It's like the number one streaming service in the world. And guess what? You can subscribe to The Kim Komando Show on the number one streaming service in the world, youtube.com/kimkomando, for free. And any single time we upload a video, go live, anything, you'll get a notification, and you know we have. So go subscribe.

Speaker 1:
[74:02] Yeah.

Speaker 2:
[74:02] Now.

Speaker 1:
[74:03] Hit the button. That's all you got to do.

Speaker 2:
[74:04] That's all you do.

Speaker 1:
[74:05] Hit the button, because we have this nice YouTube award here.

Speaker 8:
[74:08] Yeah.

Speaker 1:
[74:09] Can you show it to everybody? That's why they're watching on YouTube.

Speaker 2:
[74:12] It's not on camera right now.

Speaker 1:
[74:13] Oh, well, you should be holding that the whole time.

Speaker 8:
[74:15] Right here.

Speaker 1:
[74:16] OK. There it is. We got that award from YouTube.

Speaker 2:
[74:19] Yeah.

Speaker 1:
[74:19] I mean, we didn't buy it. You can buy those on eBay. We didn't buy it. We actually got it.

Speaker 2:
[74:24] We earned it for 100,000 subscribers.

Speaker 1:
[74:27] Yes.

Speaker 2:
[74:27] But this is nothing.

Speaker 1:
[74:29] We want a million. We want a million folks, a million. That's what we want. All right. Here are some things that you need to know about that's happening in the tech world. There's a new wireless company called Really.

Speaker 2:
[74:40] Really?

Speaker 1:
[74:41] Really.

Speaker 2:
[74:42] Really.

Speaker 1:
[74:43] It's an AI service called Clone that will answer your phone calls using your actual voice.

Speaker 2:
[74:52] No, I don't want it.

Speaker 1:
[74:53] And phone number.

Speaker 2:
[74:54] Don't want it.

Speaker 1:
[74:55] You didn't hear everything about it yet.

Speaker 2:
[74:57] They're going to tell scammers all my personal information. Don't want it.

Speaker 1:
[75:01] It screens your calls.

Speaker 2:
[75:02] So do I.

Speaker 1:
[75:03] Makes your appointments, cancels subscriptions, talks to spam callers for you. Clone can access all your call history, your location, your communications. It will pattern your tone and inflection to make it sound more convincing. Of course, as you said, the privacy and scam implications are just wild.

Speaker 2:
[75:23] Don't want it.

Speaker 1:
[75:24] What could possibly go wrong?

Speaker 2:
[75:26] Everything. Nothing can go right.

Speaker 1:
[75:30] I mean, it could tell AI that you're not available.

Speaker 2:
[75:36] But here's the thing too. The scammers are already using AI. So we're just going to have AI talk at AI for hours on end on phone calls that no one will even know happened.

Speaker 1:
[75:46] And then think about mine. I would be like dad joke after dad joke. Oh gosh. And then the other AI go, gosh Kim, you are so funny.

Speaker 2:
[75:54] You automatically assume that the AI is just going to love you.

Speaker 1:
[75:59] So Ian called me up a couple of weeks ago. Yes, I'm sure it was him. He called me up a couple of weeks ago and he said, listen, I want to go to Vegas. And I said, you want to go to Vegas? I mean, let me just back up. He doesn't drink or anything. I mean, he's very straight and narrow. So I was like, why do you want to go to Vegas? He said, well, there's a cross that I want to buy at a store in Las Vegas. And they have the store here in Los Angeles, but they won't ship the cross to the store in Los Angeles. He says, so I have to go to Vegas to buy this particular cross. And he goes, I bought a ticket on Southwest, round trip was like $119, and I got a hotel room for $179. So it's not going to cost me that much to really go do this. And do you want to come with? I'm like, sure. But I'm not staying in a $179 hotel room. Just saying. Sure. Just throwing it out there. And so, you know, we went shopping, we did some things. It was just like a one-night mother-son thing. It was actually lovely. And as we were sitting in the hotel room, he pulls out his phone and he shows me an app called Kalshi. Have you seen this? No. K-A-L-S-H-I. It's not a quote-unquote gambling app. But for example, when I was looking at it, you could bet on things like how many times something gets said at the next presidential address.

Speaker 2:
[77:35] Yeah, you pronounced it wrong.

Speaker 1:
[77:39] How do you pronounce it?

Speaker 2:
[77:40] It's not Kalshi. I think it's Kalsi. Kalshi? I see ads for this all the time where it's like 20 year olds who are telling other 20 year olds, go out and bet all your money on whether it's gonna rain or not tonight.

Speaker 1:
[77:53] Yes, and that's Kalshi.

Speaker 2:
[77:55] Yeah.

Speaker 1:
[77:55] Okay, I didn't know how to pronounce it, obviously. They should just, these things should come with a pronunciation guide.

Speaker 2:
[78:01] But when you see the ads, it's literally children. They look like teenagers.

Speaker 1:
[78:06] But there's a lot of money being wrapped up. One of these things that he was betting on had like $4 million in the pot. Not that you wouldn't win $4 million.

Speaker 2:
[78:15] No, you'd get your share.

Speaker 1:
[78:16] Yes, yes. Well, they wrote about it in The Wall Street Journal. So anyway, so he was showing me this, and he had $800 he was betting on how many times Trump would say "Iran" in the next presidential address. Okay, okay.

Speaker 2:
[78:31] He bet $800?

Speaker 1:
[78:33] Well, that's why he, you know what? I don't understand it. It's his money, so I mean.

Speaker 2:
[78:37] No, I'm just asking.

Speaker 1:
[78:38] No, and I'm, you know what I told him is like, I'm like, you're gonna bet the $800 on that? I mean, you're gonna take your whole pot and put it right in there?

Speaker 2:
[78:44] That's a lot to wager.

Speaker 1:
[78:45] But then he showed me. He said it had a 94% chance that he'd say it like six times.

Speaker 2:
[78:53] But doesn't he think, first of all, he's young. So he probably doesn't, and you said he doesn't do a lot. The first thing I would think is that Trump's speechwriter is gonna make sure he only says it seven times, so that the under... Like, I think the fix is in.

Speaker 1:
[79:07] You know what, you're not gonna, you can't tell Trump. That's true.

Speaker 2:
[79:11] Trump said that the Pope wasn't gonna strict on crime.

Speaker 1:
[79:14] I mean, I just say it. They were, in The Wall Street Journal, talking about this 28-year-old guy in Minneapolis who made $10,000, and he was watching Eric Trump's Instagram story from his couch. And he bet that Barron Trump would show up in the audience. So you bet on stuff like this.

Speaker 2:
[79:31] On everything.

Speaker 1:
[79:32] Yes. They're called prediction markets. And so it's real money bet on real sports. So how much did he make on that $800? He made like $45.

Speaker 2:
[79:43] OK, so it was very low odds.

Speaker 1:
[79:45] Yes. So he only bet some stuff that's kind of a sure thing. So he makes like $50, $20, $10. And by the way, we're not promoting this.

Speaker 2:
[79:53] No, I think it's awful, to be honest with you, because gambling is addictive enough.

Speaker 1:
[80:00] Now they say it's not gambling.

Speaker 2:
[80:01] It absolutely is.

Speaker 1:
[80:03] Because you're not betting against the House.

Speaker 2:
[80:05] Well, that's just them against the state governments.

Speaker 1:
[80:08] Exactly. You're trading with other fans. But I was looking at it. And of course, you know, after I was like, oh, wow, this is great, I put on my mom hat and I was like, you know, you have to be careful. Absolutely. I mean, this is how it starts. And then we were walking through the casinos. And I'm like, you want to play blackjack? He's like, nah.

Speaker 2:
[80:30] No, I got $30 on whether that guy's going to burp in the next 10 minutes.

Speaker 1:
[80:35] There's now, okay, right now, let's see. There's one that's really trending right now. Will Taylor Swift get pregnant before her marriage to Travis Kelce? It has a 3% chance of happening.

Speaker 2:
[80:49] So that would be the one that you would make a ton of money. The favorite Ian's bet would be the no.

Speaker 1:
[80:55] Right.

Speaker 2:
[80:55] Where his 800 would give him back 25 bucks.
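The payout arithmetic in this exchange can be sketched quickly. On binary event markets like the ones discussed, a contract pays $1 if your side resolves true, and its price roughly tracks the implied probability. The numbers below just illustrate the arithmetic from the conversation; fees are ignored, and the prices are the probabilities quoted on air, not live market data.

```python
# Sketch of binary prediction-market payouts (fees ignored).
# Each contract pays $1.00 if your side resolves true. Buying the
# heavy favorite means paying close to $1 per contract, so the
# profit on a win is small relative to the stake.

def profit_if_win(stake: float, price_per_contract: float) -> float:
    """Profit when your side wins: contracts bought x $1 payout, minus the stake."""
    contracts = stake / price_per_contract
    return contracts * 1.00 - stake

# Betting "no" on a 3%-likely event: "no" costs about 97 cents a contract.
print(round(profit_if_win(800, 0.97), 2))   # about $24.74 on an $800 stake
```

That matches the roughly $25 figure in the exchange. At the 94% price mentioned earlier in the segment, the same $800 stake would return about $51 on a win, in the ballpark of the small $45 and $50 wins described.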

Speaker 1:
[80:58] Exactly. Let's see, Waymo has done a deal with public utilities.

Speaker 2:
[81:04] Love Waymo.

Speaker 1:
[81:05] Is that... I still won't get in one, I still haven't gotten in one. One time I wanted to do it, one wasn't available. It's looking for potholes. So when it finds a pothole, it notifies the city.

Speaker 2:
[81:15] And they go, oh great, and do nothing about it? Sounds like an awesome deal.

Speaker 1:
[81:19] They go, oh great, we'll put more Waymos in our cities. That's what it is. Do you ever fly economy? I don't.

Speaker 2:
[81:27] Economy?

Speaker 1:
[81:28] Yes.

Speaker 2:
[81:28] Yeah, of course.

Speaker 1:
[81:29] I don't.

Speaker 2:
[81:30] Okay, you've said it twice. We got it.

Speaker 1:
[81:32] So, but now the economy has a new thing that they are putting in on Air New Zealand planes. And once it's there, it's gonna be on other planes.

Speaker 2:
[81:39] Sure.

Speaker 1:
[81:40] I don't know if you can see that picture, but these are bunk beds. So you can reserve a bunk bed. There's six lie flat pods per plane. Each has a real mattress, bedding, privacy curtain, reading light, charging ports and an amenity kit.

Speaker 2:
[81:57] If I could fly, let's say it's a four-hour flight, lying down as opposed to sitting there next to two complete strangers, I'm in.

Speaker 1:
[82:04] Well, no, you're not gonna be in that the whole time. Oh. Okay, no, you get four hours.

Speaker 2:
[82:11] Four hours is enough, okay.

Speaker 1:
[82:12] So you get four hours, you get out, next person gets in, four hours.

Speaker 2:
[82:15] Sure, it's like a hotel room.

Speaker 1:
[82:17] Hopefully they clean it. They have to clean it. So like a 17-hour flight from JFK to Auckland, 17 hours. That's a long time.

Speaker 2:
[82:27] Yeah.

Speaker 1:
[82:29] It's $495.

Speaker 2:
[82:31] To rent it for four hours.

Speaker 1:
[82:32] Yes.

Speaker 2:
[82:34] So. I don't know now. Now that I think about it, I would think I would get so comfortable and then when I have to get out, I would be so bummed.

Speaker 1:
[82:40] Yeah, it is.

Speaker 2:
[82:41] It looks comfortable.

Speaker 1:
[82:42] It does. I mean, I asked if the bunk beds were comfortable.

Speaker 2:
[82:46] Who'd you ask?

Speaker 1:
[82:47] The reviews were mixed.

Speaker 2:
[82:48] I don't understand that. I don't understand. Why does that get a rim shot?

Speaker 10:
[82:53] Because it's not even a punchline.

Speaker 1:
[82:57] Because some people slept on it. Mixed? It got mixed reviews because some people slept on it. Get it? Some people slept on the bunk bed. All right. Fine. Okay. My name's on the building, so sure.

Speaker 7:
[83:11] Oh, ha ha ha ha ha.

Speaker 1:
[83:13] All right. Let's see. Trendy restaurants are now going phone free. I don't know if you heard about this.

Speaker 2:
[83:20] Not even trendy restaurants. Chick-fil-A's got a thing where if you show up with the family, you can't be alone. If you show up with your family, parents and children, and everybody puts their phone in a basket and nobody touches it during the meal, at the end you get a free ice cream.

Speaker 1:
[83:33] Oh, I love that. So do I.

Speaker 2:
[83:34] I think it's great.

Speaker 1:
[83:35] I don't think I'd eat at Chick-fil-A.

Speaker 2:
[83:37] I'm just saying it's a great approach to family time. Forcing people to have family time by giving them sugar is a great idea.

Speaker 1:
[83:45] And having eye contact like it's 1997.

Speaker 2:
[83:48] Sure.

Speaker 1:
[83:48] I mean, that is really something. You know, we banned cigarettes. Now we're banning phones, right?

Speaker 2:
[83:56] I'm waiting.

Speaker 1:
[83:57] There's no line. OK. No line, except one of those actually made the food taste better. OK?

Speaker 2:
[84:05] Got mixed reviews.

Speaker 1:
[84:07] They slept on it. I was never a fan of Hey Dudes.

Speaker 2:
[84:13] They were comfortable shoes. I'll be honest. They were very comfortable shoes. They fell apart very quickly.

Speaker 1:
[84:19] And I've never been a fan of the Allbirds shoes. Allbirds shoes kind of look a little bit like Hey Dudes, but just a little bit nicer.

Speaker 2:
[84:26] So I sent Lisa a picture. I said, should I get these shoes? And they were like some sketchers. And she said, slide on shoes that are supposed to look like tennis shoes? Absolutely not. Old man.

Speaker 1:
[84:40] Barry wears those, I think. He has like every color. I'm like, those, you can't wear those.

Speaker 2:
[84:46] I didn't get them.

Speaker 1:
[84:48] I took the advice. Yeah, you can't get those. We're going to give them to Barry. He's over 70. We'll let him have them.

Speaker 2:
[84:52] Sure, we're the same size.

Speaker 1:
[84:54] Yeah, but that's just not it. Now, the reason why I bring up Allbirds, not that we're getting into shoe fashion tips here on the show, is that for 18 consecutive months, they've lost a ton of money. Okay, they're on the verge of bankruptcy. And then the other day, they issued a press release. And they said, we are out of the shoe business.

Speaker 2:
[85:17] Okay.

Speaker 1:
[85:18] We are going into AI. And the stock went up like 600%.

Speaker 2:
[85:28] They've never done AI.

Speaker 1:
[85:29] They're getting into AI.

Speaker 2:
[85:30] If they had AI, they could help sell their shoes. They wouldn't be going bankrupt.

Speaker 1:
[85:35] And design shoes that people actually want.

Speaker 2:
[85:37] What is happening?

Speaker 1:
[85:39] You put generative AI in a press release and people will buy anything.

Speaker 2:
[85:44] Suckers.

Speaker 1:
[85:44] I know.

Speaker 3:
[85:45] So you're saying with Hilton Honors, I can use points for a free night stay anywhere?

Speaker 10:
[85:50] Anywhere.

Speaker 3:
[85:51] What about fancy places like the Canopy in Paris?

Speaker 1:
[85:53] Yeah.

Speaker 10:
[85:54] Hilton Honors, baby.

Speaker 3:
[85:55] Or relaxing sanctuaries like the Conrad in Tulum?

Speaker 13:
[85:58] Hilton Honors, baby.

Speaker 3:
[86:00] What about the five-star Waldorf Astoria in the Maldives? Are you going to do this for all 9,000 properties?

Speaker 12:
[86:07] When you want points that can take you anywhere, anytime, it matters where you stay. Hilton, for the stay. Book your spring break now.

Speaker 1:
[86:17] Let me tell you, Andrew, there's one site on the Internet where you can go to and you get unbiased journalism. It's phenomenal. 404 Media.

Speaker 2:
[86:24] I love that website.

Speaker 1:
[86:25] It's amazing. Joseph Cox is one of the founders and leads over there. And Joseph, thanks for being here.

Speaker 9:
[86:34] Thank you so much for having me back.

Speaker 1:
[86:35] And you wrote a story about how a data broker owned by the major US airlines, American, United, Delta, is selling access to five billion individual airline ticket records to the federal government without requiring any type of warrant. Tell us a little bit about this, and how did you figure it out?

Speaker 9:
[86:56] Yeah, so, I imagine most listeners don't know this, but something like eight major US airlines own a data broker, and it has the very boring name Airlines Reporting Corporation, or ARC. And what they do basically as a side business is that they sell ticketing records to the government. Now, you may be thinking, well, doesn't the government already have this data? Yes, TSA will have it, DHS will have it. But this data broker is selling it to the FBI, Secret Service, SEC, I think even some military stuff in there as well. And as you say, this can be searched without a warrant. And I bet listeners also haven't really given informed consent for their ticketing records to be sold by this broker either.

Speaker 1:
[87:44] So how granular is the data being sold?

Speaker 9:
[87:48] So the data shows a person's name. It shows the airport they're flying from, flying to, of course what flight they're on. Most interesting to me is that you can even search by credit card. So let's say law enforcement has some sort of tip or they want to investigate some sort of card. They can look that up and then see, oh, it was this person flying from this airport to this airport. And it's not just the granularity. It is, of course, the scale. There's something like five billion records in this data set. It is updated every single day from the airlines to this data broker, to the other US government agencies buying it. And essentially, law enforcement or other government agencies can just rifle through it without getting a court order, a warrant, a subpoena. It is there for them to search.

Speaker 2:
[88:39] Is this only happening in the United States or is this happening in other countries?

Speaker 9:
[88:44] That gets interesting, and we don't really know. We know that the broker is owned by US airlines, but on the board of directors of ARC, you also have Air Canada, obviously the Canadian airline, and then you also have Air France and Lufthansa, which are of course from Europe as well. I'm very interested in that because, of course, European data privacy law is much more robust, yes, exactly, than it is in the US. And that's what I'm actually looking into next. I'm glad you brought that up, but it could impact European travelers if they touch the US in any way. I know that at a bare minimum.

Speaker 1:
[89:18] So can you opt out?

Speaker 9:
[89:21] Allegedly, you can opt out.

Speaker 1:
[89:23] Allegedly.

Speaker 9:
[89:26] So when I've covered this, and a couple of outlets have done it as well, we published several pieces on this, and the one we did here was just about the scale and these five billion records. ARC told me in a statement, hey, we allow you to opt out. You contact us at our privacy email address. Some people have told me they've started that process. I haven't spoken to anybody who's successfully done it. I've actually started the process myself as well. So I guess we'll find out. I don't know when, maybe in a few weeks or something, but you can allegedly opt out, and I'm very curious whether that actually works or not.

Speaker 1:
[89:59] So since you've been publishing this, Joseph, have you got any pushback? Has anybody reached out to you and said, hey listen, this isn't right?

Speaker 7:
[90:07] What you're reporting?

Speaker 9:
[90:09] Not really. I mean, no factual errors. No. We always get emails asking, like, why does this matter? And of course, ARC itself is saying that this is important. I would say that generally, just to be fair to ARC and the government agencies as well, they see this as a post-9/11 tool. It was made in the wake of the September 11th attacks, and of course you can see why they would do that. That was a massive intelligence failure, and they think government agencies need access to more data. But it is now long past that, and it still brings up the very legitimate question of, well, shouldn't agencies get a warrant to search this data? And some of the agencies I've spoken to, yes, have given no indication they are doing that. And of course, at the end of the day, this data isn't being collected by the government, it's being sold to the government.

Speaker 1:
[91:01] It seems like everywhere you turn, they're selling your data to somebody. You know, it's nuts, really. I mean, you do something and that's going to be sold to who knows who.

Speaker 9:
[91:13] Yeah, absolutely. And I mean, people often complain about Facebook and that sort of thing and all the data privacy issues there. And I hear that. I'm much more interested in these companies that nobody has heard of, like ARC, which are literally selling your data. And then there is a financial transaction going on there. And kind of to add insult to injury, the data is kind of cheap. Some of these agencies are only paying a few tens of thousands of dollars for access to this massive data set. So even after all of these privacy violations, it's not even been sold for that much. You know, maybe we're not worth that much.

Speaker 2:
[91:49] No one cares that you flew to Boise in the grand scheme of things.

Speaker 1:
[91:54] You know, just real quick, how did you find out about this?

Speaker 9:
[91:59] Yeah, so I keep a very close eye on the contracts that ICE is entering into, just because of course I'm covering that agency a lot at the moment. And I saw a document several months ago at this point mentioning that ICE was getting access to this data. I then did more digging. I found all of these other US agencies that were buying it. And then I filed a ton of Freedom of Information Act requests with the government, basically demanding: I want to see the actual contract. And that's how we did this piece. The piece that this is based on is a Secret Service contract. And again, they have tons of clients, but this document included the 5 billion figure. And of course that was alarming to us. So we published that.

Speaker 1:
[92:42] They must see your email coming in, Joseph, and they're like, oh God, not him again. Well, thank you for spending some time with us. All of us here at the show, let me tell you, Joseph, we are big fans of everything that you're doing over at 404 Media. I remember when you guys started that, and it was tough going, but every article is just robust. It's on target. It's smart. It's researched and it's nonpartisan. It's like, where else can you get that stuff? I'm telling you, it's 404 Media. Joseph, thanks again for being here. So I play around with the stock market. Of course, I've got financial managers and advisors and home offices and all that other great stuff, but I have an E-Trade account, and every once in a while, it goes up, it goes down, and then I get frustrated. I take everything out of the market. I put it back in. Have you ever used AI to do a stock prediction?

Speaker 2:
[93:39] No, I haven't, but I've heard that it's actually pretty legit.

Speaker 1:
[93:43] It is, it is legit. Stanford researchers turned AI loose on 30 years of stock market history. The AI stock picks beat 93% of the pros.

Speaker 2:
[93:55] Well, why aren't the pros just using AI?

Speaker 1:
[93:58] I think they are. Oh. They are. So for every dollar that your financial hotshot has made you, AI made six. Now I'm not saying dump your financial advisor. I'm not saying that. And, you know, AI can hallucinate, right?

Speaker 2:
[94:12] Sure.

Speaker 1:
[94:13] So, but you do have a powerful second opinion right in your pocket.

Speaker 2:
[94:18] I got it.

Speaker 1:
[94:19] What?

Speaker 2:
[94:20] You just tell people that all the GoPro stock you bought was an AI hallucination. You didn't do that. It was AI that made you.

Speaker 1:
[94:32] Okay. I've been holding GoPro for like 15 stupid years. Okay. I'm hoping then it's going to go up.

Speaker 2:
[94:38] But if it doesn't, you've got it out now.

Speaker 1:
[94:41] Listen, I thought for sure when they announced that you saw it, I showed it to you, that GoPro product.

Speaker 2:
[94:48] Yeah.

Speaker 1:
[94:48] The guy's holding the baby.

Speaker 2:
[94:49] The mouthpiece that holds a camera.

Speaker 1:
[94:51] You put a GoPro in a mouthpiece and you put it in, and then you're taking pictures of the baby.

Speaker 2:
[94:58] Right. You thought that was going to be a trillion dollar idea.

Speaker 1:
[95:01] So what you want to do is use a good prompt. And again, as always, you have to tell it what it needs to be. So you're going to say, be a financial advisor for someone my age. Recommend a diversified portfolio for retirement, whatever you need, with moderate risk. You can ask it to compare specific stocks. Suggest picks across different sectors. Now, AI can crunch numbers. It cannot predict the future. And so I'm not going to tell you what I did, but I used AI and I put, I guess Ian taught me this, I put all the money into one stock.

Speaker 2:
[95:36] Sure, smart. That's what they say, put all your eggs in one basket.

Speaker 1:
[95:41] And I'll let you know how I do.

Speaker 2:
[95:43] Oh, you don't have any results yet?

Speaker 1:
[95:44] Not yet.

Speaker 2:
[95:45] When did you make the transaction?

Speaker 1:
[95:47] Yesterday.

Speaker 2:
[95:47] Yesterday, okay. So let's come back in two weeks and get an update.

Speaker 1:
[95:52] What's today? Yeah, it might be a little bit longer than that. It might be a little bit longer. Okay. Because I'm waiting on something. I can't tell you what it is.

Speaker 2:
[96:01] Is it AI told you that there's something coming?

Speaker 1:
[96:03] No, Ian did. And then I confirmed it with AI. So at least we have something in all that. Rebecca in Des Moines, Iowa. Hello there, Rebecca.

Speaker 13:
[96:14] Hi, Kim. So good to talk to you.

Speaker 1:
[96:16] Well, thank you. What's on your mind today?

Speaker 13:
[96:18] Well, I got a situation. I have an adult granddaughter who's living in a group home. Okay. And she's non-verbal. She has cerebral palsy that affects her mouth and other things. It's tough. But she has really learned to use her phone. This is her connection with the whole wide world that she can't access otherwise. But she has cognitive delays, and nobody quite knows how much she gets on the phone, and she doesn't understand a lot of other things. But anyway, she has used her phone to reach out, and a situation came up where she talked to a man. And he came across several states to her group home and came in the house. She let him in. She gave him directions. She took pictures of the door he was supposed to use. And we just want to find some way. We don't want to take the phone away from her because it's so important to her.

Speaker 1:
[97:23] Okay. But we can't let this happen again.

Speaker 13:
[97:26] No, no.

Speaker 1:
[97:27] Did the group home, I mean, did you call the police? I mean...

Speaker 13:
[97:32] No, they didn't. He was gone very quickly. I will say that. But the gentleman that is the head of the household did start to talk to him, but he left.

Speaker 2:
[97:46] Was he there for ill intent?

Speaker 13:
[97:49] The thing he said, he was rescuing. Okay. But she, you know, I don't know if she told him that she wanted to go with him. Because we don't know what she said. We don't know a lot other than he knew to use the back door.

Speaker 1:
[98:10] She sent him pictures. She told him which door to come in.

Speaker 2:
[98:12] Why don't you know? Are you not in full control of this phone? Is this like her phone and her privacy?

Speaker 13:
[98:19] Her phone and her privacy. She's an adult and she's her own guardian.

Speaker 1:
[98:24] Okay. And so you're wondering if there's something you can put on the phone or?

Speaker 2:
[98:29] But it's her phone.

Speaker 1:
[98:30] Exactly.

Speaker 13:
[98:34] What we would like to do, I mean, I think it scared her too.

Speaker 1:
[98:38] Okay.

Speaker 13:
[98:39] So I think we could get her permission to do some things because, you know, her phone is her world. So we want to make it safe for her. And, you know, I've looked at parental restriction. She has an Android phone right now. And I'm an Apple user. We could get her another phone if one parental restriction is better than another.

Speaker 1:
[99:05] Well, you know, what's nice about an iPhone is that if most people in the family have an iPhone, you'll be able to set up Family Sharing, time restrictions, Find My, you know, all these different things. But again, she's 30. So she could just turn all those off.

Speaker 13:
[99:24] Yes.

Speaker 1:
[99:25] And so I would be more inclined, wherever this group home is, to make sure that you call adult protective services and let them know what happened. They're not watching. They're not watching her. Okay? Because she's on her phone, and she's on TikTok or Snapchat or whatever, and a guy shows up. You know, maybe there needs to be a rule with your granddaughter and the group home that you put on something like Bark parental controls, so that you get copies of everything that she's doing on the phone. But she also can turn that off if she wants. Right.

Speaker 13:
[100:09] Yeah. I don't know if there's a way to make it harder for her to do that.

Speaker 1:
[100:16] Yeah, but you'd have to get her permission. You see, you're in a precarious position because she is her own guardian. Right.

Speaker 2:
[100:23] This is not like me dealing with my 14 year old. No, you know, I pay the bills. I can do whatever I want. This is another adult that's making her own choices, poor choices, but making her own choices.

Speaker 1:
[100:35] I mean, you know, I think it goes back to education. If she can do everything that you just said she can do, she can learn how to use this responsibly, because she knew. I'm sorry, go ahead.

Speaker 13:
[100:48] Oh, I was just going to say, I think they're going to be working on safety boundaries, personal boundaries. Yes, that's it. We're working on those kinds of things with her.

Speaker 1:
[101:00] Yes, and maybe there's a family group chat that you guys can do as her advocate, and then somebody checks in with her once a day at a specific time.

Speaker 13:
[101:14] Oh, yeah, and we do that kind of thing, and we have, yeah, family chats. She likes to take pictures and send us pictures of sunsets and flowers.

Speaker 1:
[101:27] Yeah, so I think, you know, I'm glad that you called. In the family, you can talk about using something like Bark. There's also something else.

Speaker 13:
[101:39] I'm not quite sure what Bark is. Is that something she can just turn off?

Speaker 1:
[101:43] What is Bark? It's parental controls that gives you copies of what she does. And sure, she can turn it off. There's another program called mSpy where you can see every single thing that she's doing on the phone. You know, but if she knew enough to get that guy there, okay.

Speaker 13:
[102:02] I know that's really upsetting.

Speaker 1:
[102:04] Yes, then she knows how to get around these controls. And the fact that she's 30, that's a whole other ball of wax. So that's where I think it goes back to the group home, where they have daily 15-minute lessons about how to use technology the safe way. There are no new friends who want to meet in person. Nobody's going to rescue you or help you get out of this group home. You can't keep the conversation secret. If anybody asks for money, gifts, or whatever, you come tell the guy or gal who's in charge of the group home. Anybody who asks about her disability benefits or living situation? Those are the no-talk zones. Am I missing anything?

Speaker 2:
[102:45] No, you got it.

Speaker 1:
[102:46] Okay.

Speaker 2:
[102:47] It's tricky because she's an adult.

Speaker 1:
[102:50] Very tricky. Very tricky. Rebecca, take that info. And if there's anything I can help you with afterwards, let me know. All right. So every second you're in your car, it's tracking your speed, your location, how hard you brake. Over a year, it adds up to 25 gigabytes of data. This is no shocker: GM was caught selling driver data to insurance brokers. Customers watched their premiums climb. So how exactly did they get the permission to track all this information about you?

Speaker 2:
[103:22] You gave it to them.

Speaker 1:
[103:23] In the infotainment system.

Speaker 2:
[103:25] Of course.

Speaker 1:
[103:26] So what you want to do is go into your car's settings and then look at privacy or connected services and toggle everything off that you can. That's just the way it is. Hey, listen, if you're not already, make sure that you get our free newsletter. It's a five-star rated deal. You're going to love it at getkim.com. Hey, thanks for listening. And just a reminder, take a sec and leave me a great five-star review wherever you get your podcast. It helps so many other people find us. And I'm so glad that you're here because, well, without you, I'd probably have to do the show for my husband and he wouldn't like that at all.