title Using ancient philosophy to cope with your modern problems

description Philosopher Meghan Sullivan says during tough times, ancient wisdom can serve as a guide. From politics to religion to AI, she poses big questions to help you find out what the good life means today.

TED Radio Hour+ subscribers now get access to bonus episodes, with more ideas from TED speakers and a behind-the-scenes look with our producers. A Plus subscription also lets you listen to regular episodes (like this one!) without sponsors. Sign up at: plus.npr.org/ted

See pcm.adswizz.com for information about our collection and use of personal data for sponsorship and to manage your podcast sponsorship preferences.

NPR Privacy Policy

pubDate Fri, 17 Apr 2026 07:00:00 GMT

author NPR

duration 2997000

transcript

Speaker 1:
[00:00] This message comes from Whole Foods Market. Save on tropical flavors during the Savor the Tropics event with yellow sale signs throughout the store. Stock up on juicy pineapples and mangos, grab Huli Huli chicken, and finish with Mango Yuzu Chantilly Cake at Whole Foods Market.

Speaker 2:
[00:17] This is the TED Radio Hour. Each week, groundbreaking TED Talks, our job now is to dream big, delivered at TED conferences, to bring about the future we want to see, around the world, to understand who we are. From those talks, we bring you speakers and ideas that will surprise you, you just don't know what you're going to find, challenge you, we truly have to ask ourselves, like, why is it noteworthy? And even change you. I literally feel like I'm a different person.

Speaker 1:
[00:45] Yes. Do you feel that way?

Speaker 2:
[00:48] Ideas worth spreading. From TED and NPR, I'm Manoush Zomorodi. Today on the show, what do you and Greek philosophers have in common? And how can ancient ideas help us manage our modern woes?

Speaker 3:
[01:07] Certainly, the last decade or so in our lives have been times of great upheaval. We've dealt with the COVID epidemic. We now deal with wars around the world. We've dealt with huge changes in technology, which have upended our ways of life.

Speaker 2:
[01:21] This is Notre Dame philosophy professor Meghan Sullivan.

Speaker 3:
[01:25] And philosophy thrives when things are disrupted. It is totally not an accident that great figures like Socrates came on the scene in eras of civil war and major disruption. Because that's when people start to wake up a little bit and realize like, oh my gosh, maybe this is not the good life.

Speaker 2:
[01:44] Okay, so let's go back 2400 years to the earliest days of democracy. There was chaos, confusion, and some big questions.

Speaker 3:
[01:57] So you have to remember the Athenians were inventing democracy. I mean, we think democracy is this very old form of government, but the Athenians had really no model for it. Everything in Athens is decided by public votes among the male citizens, and debate and discussion are everywhere. Like if we think we live in a media-debate-saturated culture, we couldn't hold a candle to the ancient Athenians.

Speaker 2:
[02:23] Enter Socrates. He's a soldier turned teacher, and he decides that instead of all this debating, young people need more in-depth conversations. He asks them big questions about what makes a life worth living. What is truth? What is courage? But this push to get these young Athenians to think bigger picture, this did not please the government.

Speaker 3:
[02:49] So 399 BC, Socrates is on trial in Athens, because he's always going around poking at the powerful people in Athens and making them look foolish by asking them a bunch of difficult questions and causing them to realize that they don't really understand the things that are coming out of their mouth. The Athenian government tells Socrates he needs to stop. In particular, they tell him that he's corrupting the youth of Athens, like he's making the young Athenians not really believe in the system anymore. Socrates refuses. Socrates said, I care much more about the truth and figuring out the truth about this world than I care about pleasing the powerful people in Athens.

Speaker 2:
[03:32] And things don't go well for Socrates because of that. Rather than be cast out, he agrees to be killed on principle.

Speaker 3:
[03:40] So when Socrates allows the Athenian government to kill him over the questions he's asking, that's the birth of philosophy as we know it.

Speaker 2:
[03:49] Which then brings us to Socrates' top student, who was of course, Plato.

Speaker 3:
[03:54] Plato looks around and thinks, oh my gosh, if a democratic government is willing to put somebody as important as Socrates to death, democracy is broken. Like democracy is a joke. And we were so confused and screwed up by our upbringing, that instead of realizing that this better life was possible, we killed the guy who was sent to set us free. And he thought, philosophy is not only gonna help you become a better person, but he radically thought that philosophy would help us overthrow terrible forms of government and build a form of government that would set the human race free. And that was his greatest ideal.

Speaker 2:
[04:34] So Plato becomes a teacher himself. He starts the Academy. It is thought to be the first university in Western civilization. And then we come to another student. His most famous student essentially starts what we now call virtue ethics, and that was Aristotle.

Speaker 3:
[04:52] Absolutely. And Aristotle looked at his teacher and thought, this is too radical, like this has gone too far. And Aristotle famously broke away. He started his own school, the Lyceum, where he was gonna teach people virtue ethics in a much more systematic and less politically radical way. So Aristotle famously thought that the entire pursuit of our lives is a form of happiness that he calls eudaimonia, or flourishing, the good life. That we have a goal that's in front of all of our lives, which is to achieve this kind of flourishing. And to do it, we have to live in healthy communities. But we also have to learn how to exercise our powers of reason and our powers of self-control. And this is the kind of thing that you can learn how to do. He literally taught a class on how to achieve happiness in ancient Greece. And he did not think that it was a matter of starting a political revolution, so much as it was a matter of learning to train your habits. And if you think about Plato as the great political revolutionary of ancient Greece, Aristotle, and I mean this with huge respect, is like the great self-help philosopher of ancient Greece, where he really gave people a lot of advice for how to flourish where they're planted.

Speaker 2:
[06:10] Philosophy professor Meghan Sullivan believes that when times are tumultuous, ancient questions and principles can still guide us. And so on the show today, how to probe the past to figure out what a good life means for you. With Meghan, it starts in her classroom, where she pushes her students to think and question exactly that.

Speaker 3:
[06:34] The thing that ultimately got Socrates killed, which was his real power, was that he gave young people a moral imagination. Like they would have thought, along with their parents, that there was kind of only one way Athens could go. There's only one way the world works. And Socrates, by asking them questions, by teaching them philosophy, helped them realize they had a lot more options than they might have believed they had based on who was in power. I see that same dynamic playing out today. I teach a lot of young people. I teach this really big course on the good life at Notre Dame, which has a lot of Notre Dame freshmen in it. And a lot of them have been fed this diet of visions of the good life from the high school system and the AP and SAT exam system that we inflict on all of our teenagers. And then they're exposed to visions of the good life from TikTok and social media, which tell them they need to be looksmaxxers, and which tell them they need to attain certain kinds of social perfection. And those are the only visions of the good life at all that they get in their formative years. And those kinds of limitations in our imagination, those are some of the hardest forces for us to overthrow if we actually want to achieve flourishing. Socrates certainly realized that. If you read his speeches, he didn't want to tell people exactly how they should live their lives, but he wanted to help them realize that they had options that they were not even considering.

Speaker 2:
[08:17] So let's talk about that. Let's talk about how you're taking these ancient concepts and trying to make them relevant and interesting to today's college student. You've written a book called The Good Life Method. No one would disagree with the premise that we all want to live a good life. But how do you begin that conversation with your students? Because I assume that it might have been easier to start that conversation maybe 10 years ago, but is it different today?

Speaker 3:
[08:49] I think it's actually easier to have this conversation today than it might have been when I was a college student. The young people that I teach right now, the young people who were in middle school and high school during the pandemic, they are maybe a little bit more tenderized by the world than, I don't know about you, than I was when I was their age. When I was their age, I was probably more likely to just do whatever a powerful adult told me to do. Whereas now I think this generation is maybe a little bit more skeptical, and skepticism is a good place to start doing philosophy. Cynicism is hard for philosophy, but skepticism we can work with.

Speaker 2:
[09:29] That's such a good point because I'm just thinking back to the 90s when I was in college and the big topic was Bill Clinton's dalliances. Oh yeah. There were not big conversations over what is democracy, I'll tell you that. It was taken for granted. Of course.

Speaker 3:
[09:42] Of course democracy will always work.

Speaker 2:
[09:47] So as you say, they're tenderized, are they fragile? How do you begin to make them, I guess not give up or just be like, I don't know, it's beyond my control.

Speaker 3:
[10:00] So my approach, which is inspired by Socrates, is to ask them questions that force them to think and then really listen to their answers. One of Socrates' great gifts as a teacher is he really did take young people seriously. Socrates thought that all of us have a certain kind of dignity, this ability to think through these questions for ourselves. And I try to really bring that into my teaching. And so I structure my class around 10 big questions that I think we all have to wrestle with if we try to figure out what's true about the good life. And this appears in the book too. I start with what I think is the easiest of the 10 questions and then move up the ladder each week to a harder one. Hopefully the students clear each level.

Speaker 2:
[10:51] I'm intimidated by this and I'm not a college student.

Speaker 3:
[10:54] You should be intimidated. The easiest question, the question that we start with, and you're gonna laugh at me when I say this given the conversation that we've been having, I think the easiest question is how should we relate to people who disagree with us about politics? Like how should we have political discussions? And we start with that question because it's a good way to help students realize that there's a difference between just repeating ideas that are currently socially acceptable and really trying to figure out what's reasonable to believe and what's true. That is the easiest question. And from there, the second topic we do is the role of money in a good life. How much money do you need? What should you do with your money? How much time should you spend trying to earn money? And then we move on up from there. So we do politics, money, we do moral responsibilities. Basically, what do you need to apologize to people for? How do you know if you've screwed up versus something bad just happened to you? We do work in the good life. And then we go away for spring break and then things get super existential. So we ask, I ask the students if they're going to practice any religion when they grow up and what it is to even have a religion. We talk about the role of suffering in the good life. The fact that no matter how well you are doing, inevitably you're going to suffer. But even worse, the people that you love are going to suffer. And how are you going to make sense of that? And how are you going to deal with the fact that there are certain kinds of horrible things that happen to us in life that we have no control over? And then, of course, Manoush, right before I send them home for summer break, I make them write essays about the fact that they are going to die. 
They're going to die actually, you know, not that far in the future in the great scheme of things, and that everybody that they've loved and ever known is also going to die and that we're so finite.

Speaker 2:
[12:48] That's fun. Nice, Meghan.

Speaker 3:
[12:50] How does that change their vision about what makes this limited time that they have in life valuable?

Speaker 2:
[12:58] Coming up on the show, how different kinds of love fit into a good life, and the conversations that Meghan is having in Silicon Valley about where philosophy and ethics fit into AI. More with philosophy professor, Meghan Sullivan. I'm Manoush Zomorodi and you're listening to the TED Radio Hour from NPR. We'll be right back.

Speaker 1:
[13:31] This message comes from Betterment. You know when you sell a stock or any investing asset and start to feel the dread of getting a surprise tax bill? Betterment's Tax Impact Preview Tool shows you the estimated tax impact of the sale so you can make informed tax smart investing decisions. Get started today at betterment.com. Investing involves risk, performance not guaranteed. Betterment is not a tax advisor, nor should any information herein be considered tax advice. Please consult a qualified tax professional. This message comes from Clorox Professional. Three out of four US states have environmental purchasing policies in place. Clorox EcoClean disinfecting wipes can help facilities reach their sustainability goals. cloroxpro.com/cloroxecoclean.

Speaker 2:
[14:17] Support for this podcast and the following message come from Active Campaign, the autonomous marketing platform. You know that feeling when you open your marketing tool and instead of marketing, you spend an hour wrestling with the drag-and-drop builder? Active Campaign built Active Intelligence for exactly that moment. Describe what you want to accomplish. It builds the campaign, writes the copy, and maps the automations across email, SMS, and WhatsApp. Customers save an average of 10 hours per week and make email campaigns eight times faster. Learn more at activecampaign.com. Hey there, I want to let you know about our short-form video series that we've been doing with some TED Radio Hour guests. Please check them out. You can find them and follow me on Instagram at ManoushZ. That's M-A-N-O-U-S-H-Z on Instagram. It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, we're talking to philosophy professor Meghan Sullivan. Meghan teaches a wildly popular undergrad class at the University of Notre Dame called God and the Good Life. And one of the questions she asks her students is, where does love fit into your life? Here she is on the TED stage.

Speaker 3:
[15:41] Most major philosophers and nearly every major world religion put the virtue of love at the center of the good life. But what exactly does it mean to practice this virtue? To get my students thinking about this, I give them a thought experiment. Suppose I had a pill, and if you took it, it would cause you to experience love for absolutely anyone you met. Would you take it? I've asked this of thousands of students, and the answer I overwhelmingly get from my very earnest, very Catholic freshmen is no. They wouldn't take the love everyone pill. And in hearing their answers, I start to get some insight into how they're thinking about this virtue. One thing that's kind of funny is we have all this complex thinking about the virtue of love, but when it comes to hate and resentment, those are easy. We can absolutely cultivate those. In fact, our current politics, the internet, it has us taking a hate everyone pill just about voluntarily every day. So, I put this question to them, and it sets off this conversation. Okay, love is essential to your happiness. I have a way of giving you the thing that will solve that problem for you. It will cause you to experience that love for everyone. You'll just have it automatically, and you're telling me you won't take it. So, what am I missing here? And it always sets off this just amazing conversation about helping my students realize what they think really goes into love.

Speaker 2:
[17:18] What do they think really goes into love? Why don't they want to take the pill? I'm kind of in.

Speaker 3:
[17:23] Oh, it's the best. One category of answer is students who tell me that if they took the pill, they would be doing something wrong or unfair to the people they currently love. So, they're like, oh my gosh, if my best friend found out I had taken the love pill and now I cared about everybody the way I care about her, she would feel super betrayed. So, one idea that we explore is like, does love require exclusivity? And we're not talking about romantic love or sexual love here. We're talking about friendship, the kind of non-sexual form of love that figures like Aristotle think is also an essential element of the good life. One of the most profound answers I ever got from the love pill thought experiment is I had this guy in class, we can call him Chris, I'll change his name to protect his identity. But I had this guy in class, he was a total philosophy bro, he loved argument, was the kind of guy that would go after another student in the class if they made a logical mistake in one of their answers. When Chris raised his hand to tell me why he wouldn't take the love everyone pill, I half expected him to say something like, Professor, losers don't deserve my love. But instead, he said something that was actually pretty deep. He said, Professor, I sleep with my cell phone across my bedroom at night. Sometimes it goes off in the middle of the night, and I wake up and I think, oh my God, something's happened to my mom. I feel sick to my stomach until I can get to the phone and answer it and know that she's okay. Feeling that way about everyone, that would be unbearable for me. I just remember thinking, one, oh my gosh, Chris, I had no idea that you were capable of feeling vulnerable in this way. But also, he's, I think, on to a very profound point about why a lot of us intellectually, we realize that love is essential to the good life, but when it comes down to our day-to-day life, we are super cautious. 
We realize that love makes us really vulnerable to the world and to other people in ways that we're oftentimes not comfortable with. And when you love someone, they could seriously hurt you, or something terrible could happen to them, and then you'd be crushed. Most virtues make you stronger. Love is this virtue that, weirdly, its strength comes from making you weaker. And we've got to be okay with that.

Speaker 2:
[19:54] Where does religion fit into this these days? Because you became a Catholic as an undergrad yourself, but as we know, organized religion in many of these institutions is not at the center. Notre Dame, I guess, is different. And does that almost make it easier to have these conversations about ethics? Does that sort of put you all on a level playing field in some way, as opposed to out in the world where people are believing all kinds of different things, so it's hard to even have a starting point, the same language or vocabulary to have these conversations?

Speaker 3:
[20:31] Yeah. I think if there's a big mistake that we've made in our country in the last few decades, it's that universities have been kind of afraid or tepid about organized religion. If we really believe that people go to college to learn how to care for their souls, to learn why they're here, and to learn how to ask these big questions that are going to help them navigate the rest of their lives and the political life that they share with us, then young people not only benefit from, but frankly deserve, the opportunity to explore these full-blown traditions. I think it would have saved me a lot of money in therapy down the road if I had been able to have more open and searching conversations about what role religious faith plays in a good life, with teachers that I trusted and cared about when I was young. I think that's something we do really well at Notre Dame that I wish had been a bigger element of my education growing up, and I wish more young people had the experience of.

Speaker 2:
[21:40] How do you begin to model this idea? I mean, Catholicism is hardly a perfect religion. No religion is, but... Meghan, you didn't know this?

Speaker 3:
[21:52] I didn't know this. Catholicism... Look, there are questions, yep.

Speaker 2:
[21:56] So how do we begin to... How do you talk about religion to your students? And accept the fact that there is misogyny and pedophilia and all kinds of other issues that go on, and yet you still want to be part of the church.

Speaker 3:
[22:14] One of the things that I love to do for my students is I will take passages from the Christian Bible, the Gospel, where I think Jesus is asking a philosophical question, the same way Socrates asked his students philosophical questions, and just show my students that they are allowed to debate Jesus on this. I want them to realize that if you really are interested in this faith tradition, you better be ready to debate, because there are going to be big questions coming at you from left and right and center if you decide that you're going to be a Catholic when you grow up, and you're going to have to get comfortable wrestling with the messiness.

Speaker 2:
[22:57] For those who don't choose organized religion?

Speaker 3:
[23:00] Yeah, I got in trouble. I'm probably going to get in trouble again for saying this. My students in my God and the Good Life class, the big project that they work on is they compose this big philosophical essay that we call their philosophical apology. Not their apology like, I'm sorry for my views, but apology in the ancient Greek sense of apologia, like a defense of their views and a reasoned, story-based argument for why they see the world the way that they do. Inevitably, I have some students in that class who take the opportunity of having this really great philosophy class, and they write what I would call their atheist coming-out essay. It turns out, you know, maybe they have been a part of Catholic schools their entire life, but for years, they have not really been buying it. And now they finally have this great philosophy class, and they've had the opportunity to read.

Speaker 2:
[23:59] You pushed them over the edge, Meghan.

Speaker 3:
[24:01] I know, this is gonna get me fired. This is gonna get me totally fired, but it's true. And you know what, Manoush? I help those students write those essays. I don't tell them what to say. Obviously, I disagree with them. I'm Roman Catholic. But I think one of the ways that you care for the souls of young people, especially college-aged students, is by helping them just open up what they've been thinking about and worried about for a really long time, but maybe have not had the words to express. And I think it's a huge gift if I can send a young person home for the summer or for Christmas break back to their families, with the language and ideas to have a serious conversation with people they love about what their concerns and skepticism are. And so I think if you want to play this philosophy game, as Plato says, you got to let the arguments blow you where they will. Plato famously thinks that logic is like wind that kind of pushes you in a particular direction as you start to follow it. And so you've got to help students realize at this point in their life, where's the wind blowing them?

Speaker 2:
[25:14] I am so curious as to where capitalism fits into all of these conversations, because I can imagine that someone would say, I do have a moral compass. I do want to do the ethically right thing. But this is how capitalism works. Or maybe they, you know, we see so many students now going to top universities to what? Go into financial services or to work at companies where they're optimizing the algorithm. And that not only consumption, that side of capitalism, but the accumulation of wealth as the goal, certainly in the age of AI too. So how do you talk about the system that often doesn't seem to make much room for some of these more sort of altruistic ideals?

Speaker 3:
[26:07] Yeah, it's super interesting. I told you, Manoush, we teach a unit on work. And in that unit, we read Aristotle, but we also read Karl Marx. And most of my students at Notre Dame have worked hard their whole life, but they've never actually had work. Like they've never traded their time for a wage the way that a lot of labor under capitalism trades time for a wage. In fact, students will write these essays. I laugh so hard, but they'll write essays about Karl Marx, who famously thought that wage labor under capitalism is totally alienating. And my students will tell me how serving as the treasurer of the ultimate frisbee club has caused them to realize how alienated they are from their species-being. I was like, oh my gosh, Karl Marx is throwing up in hell right now. If you think that you're part of labor and not the bourgeoisie, you've really not understood what he's talking about. But if I check back in with them, now I've had lots of students graduate from Notre Dame and graduate from the class that I still am friends with and keep in good touch with. Once they get their first job, then they realize what Marx is talking about. They're like, oh my gosh, it can be such a grind to have to trade away a third to a half of your life to just try to earn money to pay your rent. I think these are definite questions you want to prepare people in college to be able to navigate. Maybe one of the things that I can give my students is not talking them out of those career paths right away, but helping them develop the skills and virtues that are going to accompany them when they're 30 and they realize that they're not on the right path.

Speaker 2:
[27:50] I mean, I don't want to be intellectually snooty about it. I just remember a young couple who I live next door to, who were consultants for financial services firms, big ones. And I was like, so, you know, did you think you wanted to do this? They were like, oh no, this is how it happens. You have huge student loans. You get, you know, plucked by a big company and they're going to pay you money like you've never seen before. And you think, well, I'll pay off my loans. I'll help my parents out with their, you know, bills. And then once that's all taken care of, I can do what I really want to do. And I'll figure that out along the way. And then they're like, and then it's five years in and you have a mortgage and then maybe you even have a kid. And actually you like going on really nice vacations and the loan maybe still isn't paid off. And then there's your life for the next 10, 20, 30 years. And some people, you know, they love it. And so that's fine. But there are other people who I think feel really trapped.

Speaker 3:
[28:51] Oh, absolutely. I'm in this phase of life. I'm 43. So I'm in this phase of life where everybody I know is going through this great rethinking. And in our current system, the winners get richer and the losers get poorer with each successive cycle. And we sort of realize this is what Marx was on about too. My gosh, you make a bet when you're 18 years old, you take out a bunch of debt to become a computer science major, making this kind of calculation that that's going to be a very lucrative job when you're 23, 24. And then it turns out that powerful artificial intelligence hits the economy, and there are actually no jobs for computer programmers anymore. But how on earth were you supposed to know that when you were 17, 18 years old? And now you've got this big college debt you've got to pay off, and you've also been training for a career that doesn't exist. And capitalism is utterly merciless in helping you figure out what you're supposed to do next. Philosophy is not going to be able to take the risk out of the equation. I think the best philosophy can do, which is still pretty good, is helping you realize that you might have gotten set on a particular option. And it can maybe help you ask some questions that cause you to see side quests or paths that you didn't know were available before.

Speaker 2:
[30:15] Yeah, I guess I would add and develop judgment.

Speaker 3:
[30:18] Yes. Well, and also develop responsibility too, be able to know when they're the one that has screwed something up and need to ask for forgiveness or need to take a step back and redial. That's another kind of skill that you learn from a really good ethics education that I think we look around right now and we think, man, I wish there were a whole lot more people in government who had developed that virtue before they got access to all this power.

Speaker 2:
[30:46] So speaking of power, I understand you're actually joining us from San Francisco right now where you've been talking to leaders in the tech community, right?

Speaker 3:
[30:55] Absolutely. Yeah. And I'm in California. I'm working on a big project that tries to bring virtue ethics into debates about artificial intelligence and I will sometimes be out in these meetings here in the Bay Area and I'll tell people, I'll ride into town saying, oh, hey, I'm the virtue ethics philosopher here to help you have a conversation about what kind of world we want to build with artificial intelligence.

Speaker 2:
[31:19] Well, at the risk of being disappointed, I do want to ask, how's it going?

Speaker 3:
[31:22] Oh, you know what? I sometimes tell these stories about crazy things people tell me. I was at a meeting a few weeks ago where somebody told me, you know, there are only 300 people on planet Earth who matter anymore, and they're the people who are making frontier models. I was like, I think you need to touch grass, as my students say. You've totally lost your sense of perspective and reality. But for every person that tells me something like that, actually, most of the folks I find out here, they're idea people. They're curious about philosophy. They're curious about religious questions. They're fun to talk to. They realize that we are living through a pretty crazy episode in human civilization. You know, if 10 years ago I'd asked a philosophy student, what makes humans different from animals? Like, what's special about being a human being? And they wrote an essay telling me that what's special about being a human being is the fact that we can think abstract thoughts and that we can do reasoning and that we can be creative, I would have given them an A on that paper. I would have been like, that's a great answer. Well, Manoush, I have a piece of software on my iPhone right now that I don't know if it can think, but it can definitely do logic way better than I can. It can be creative. It can make creative memes and pictures. It has a lot more knowledge than I will ever have in my life. It is not a person. It does not have the dignity that I have. But there is this fascinating question we are all faced with right now. What is special about me if it's not those things that I just named? If there are things that software can do now?

Speaker 2:
[33:11] When we come back, more of my conversation with philosophy professor Meghan Sullivan. Do chatbots and AI companions have a place in a good life?

Speaker 3:
[33:20] AI is a field day for philosophers because it definitely feels like we've been training for these questions our whole lives.

Speaker 2:
[33:27] I'm Manoush Zomorodi and you're listening to the TED Radio Hour from NPR. Stay with us.

Speaker 1:
[33:47] This message comes from Clorox Professional. Clorox EcoClean disinfecting wipes clean and kill 99.9% of germs on surfaces with a plant-based active ingredient and less plastic waste. cloroxpro.com/cloroxecoclean.

Speaker 2:
[34:03] Support for this podcast and the following message come from Active Campaign, the autonomous marketing platform. You know that feeling when you open your marketing tool and instead of marketing, you spend an hour wrestling with a drag and drop builder? Active Campaign built Active Intelligence for exactly that moment. Describe what you want to accomplish. It builds the campaign, writes the copy, and maps the automations across email, SMS, and WhatsApp. Customers save an average of 10 hours per week and make email campaigns eight times faster. Learn more at activecampaign.com.

Speaker 1:
[34:40] This message comes from Mint Mobile. If you're tired of spending hundreds on big wireless bills, bogus fees, and free perks, Mint Mobile might be right for you with plans starting from $15 a month. Shop plans today at mintmobile.com/switch. Upfront payment of $45 for three-month, five-gigabyte plan required. New customer offer for first three months only. Then full price plan options available. Taxes and fees extra. See Mint Mobile for details.

Speaker 2:
[35:10] It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. Today on the show, what it means to live a good life with Notre Dame philosophy professor, Meghan Sullivan. Meghan has been spending time out in Silicon Valley, working with technologists to consider how or if ethics fit into their business models.

Speaker 3:
[35:31] There are two parts of the conversation that we're hosting out here, where Notre Dame is now getting much, much more active in debates about AI ethics. One part of the conversation is working with the frontier labs and the tech companies that are really curious about what philosophers and theologians think about artificial intelligence. And you're right, some of the companies, I won't name them on this podcast, they just care about making money. They do not care about anything else. And they're, you know, they're hopeless. You don't want to take meetings with them. But there are other companies where the people that work there and lead these companies are actually pretty curious and worried about what they're doing. They want to make money, but they also want to do good in the world. They want to leave a great legacy. They have children; the people that run these companies think about the next generation. And they realize that they've been spending so much time developing AI at a breakneck pace that they have not paused to ask some of these bigger questions about why we're making it and what society should look like with it in it. But another, probably bigger, component of what a university like Notre Dame is trying to do is this: I disagree with the premise that the only people on planet Earth who matter are the people who are making the AI. In fact, I think those companies would be making a terrible mistake to believe that only the developers matter, that the strong will do what they will and the weak will suffer what they must, because at the end of the day, they only have a business model if we use their products. We are not just passive, non-playable characters that AI is happening to. We have agency. It's not only bots that have agency, Manoush, we have agency.
We have the ability to bring as users, to bring our preferences and values and ethics to this question about what kind of AI we are going to use, how we are going to let it be deployed in our universities, in our schools, in our workplaces. And one of the most important things that ethicists can do is wake people up to their personal agency and help them reclaim this idea. You as a user are allowed to ask some pretty interesting and profound questions about whether or not you want that AI product on your iPhone, whether or not you want to give it to your children, how you're going to vote for AI policy regulation the next time you vote in a local election. And one of the things that we want to do is help users and help folks who kind of feel like they've been left behind by this big AI wave, realize that they're a part of the conversation. Like you're relevant and your values are going to direct the kind of world that we build with this.

Speaker 2:
[38:19] Yeah, it's funny, the word values has, I feel like, been co-opted in many ways, and we don't link it to consumer choice often enough. You know, it's like, whatever dishwasher I buy, who cares? But when you're talking about an app that is controlling and taking in information from millions, billions of people, it is a values decision.

Speaker 3:
[38:43] Absolutely. And what are you going to give all of your information to? You know, what are you going to put into your email? What are you going to put in front of your children? It's totally in the interest of politicians and very powerful corporate leaders to make you feel like you have no choice, like you just have to accept what they are putting in front of you. But at the end of the day, one of the upshots of capitalism is that if enough users decide, I don't want this, I do not value your product, I value this other product instead, I want things to go this way rather than that way... You know, there's power in the purse. There's power in a democracy to vote, and there's power in capitalism to say, not my money. And so as long as we have that shred of freedom, one of the most important things we can do right now, I think, is remind the vast sea of users that they have moral agency, that they have a vision of what a good life is for them and for their children, of what kind of local community they want to live in and what kind of country they want to live in. They still have some choice, and they've got to figure out what their vision of the good is now that this technology is going to be a part of the rest of our lives.

Speaker 2:
[40:01] I mean, it's not easy, Meghan. I would love to go off the grid some days, but-

Speaker 3:
[40:05] No, me too.

Speaker 2:
[40:07] Not going to happen. It is this constant sort of balancing between making your way in this world and paying your bills and trying to do what you think is the right thing to do. It is hard.

Speaker 3:
[40:20] I'll tell you what, Manoush. I've been having such fantastic philosophical conversations about this issue, and as I said, AI is a field day for philosophers because it definitely feels like we've been training for these questions our whole lives. Here's a hot take. I think we are going to look back and realize that one of the biggest mistakes we have made in this era of AI, in the last three years, is giving AIs human personalities: making them seem like they are people that could have virtues or vices, that could make jokes, that could be like Elon Musk if you use X, or that could be super moralistic if you use Claude. I think we should never have tried to give this software anything that resembles human personality. It would have been so much better if it felt a whole lot more like checking a card catalog at a library than engaging with an artificial person. And I think philosophers are going to look back and think that was a weird decision we made about these products in 2022, one that sent us down this big rabbit hole for 10 years and maybe set us back morally in ways we could have avoided.

Speaker 2:
[41:38] I am reading and talking to people who are having relationships, they call them relationships, with chatbots. Is it love? I don't know. Maybe platonic love.

Speaker 3:
[41:50] It's not love.

Speaker 2:
[41:52] Well, some of them think it is, right? There's a sense that they're getting something out of the feedback they're getting from the AI chatbots. And I would have scoffed at a lot of this, but there was a very beautiful article in the New York Times, I don't know if you read it, about an older woman who refused to leave her very remote home, and her family was worried about her. She would not leave, and they couldn't live out there; they didn't have jobs they could do out there. And so there was a bot on a stand that came to be with her, and her health got better. She was happier, she was laughing. It encouraged her to get out of the house and go to yoga once a week. And after reading that, I just thought, well, who am I to judge?

Speaker 3:
[42:42] Yeah, I think we should draw a distinction between ways in which we practice loving and what it means to really love. So, when I was a young adult, I was obsessed with the Harry Potter books. I read all of them; even the last ones came out, I think, right when I was starting graduate school. I read all the Harry Potter books, and I was really invested in Harry Potter's life. And watching Harry Potter's friendships with Ron and Hermione develop helped inform my ideas of what it means to be a caring friend and to be courageous. Human beings, since the dawn of fiction, have practiced human social virtues with fictional characters. And it's totally reasonable that as AI becomes a part of our imagination and our media, we will practice these virtues with artificial intelligence. So there's one way of looking at what's happening with that older lady, where she's just reading an interactive book with her AI that's helping spark her imagination, that's entertaining her, that's giving her ideas, and that's totally healthy. I mean, I guess I wish that she was spending more time with her family and loved ones, but it's healthy to approach AI the way that we approach great works of fiction. There is a difference, though, between Harry Potter and most commercial AI, in that Harry Potter was not trying to sell me anything or charge me a subscription; Harry Potter was not taking anything from me. One of the things we need to be careful about with AI is that it is a product. The AI that you put in front of your elderly relative, the way that that company makes money is by extracting data from her or by trying to dominate her attention or control her in various ways. And that's something that we should be cautious about. But even more profoundly, I might have learned some things about friendship by reading the Harry Potter books. Harry Potter was not my friend. He's incapable of being a friend, because he doesn't exist. He is not a self.
And in the Greek tradition of philosophy that I really love, Aristotle famously says that the essence of love is that when you love someone, you experience another self. You get into the mind and the inner life of another person who has a self that's there to access, who has their own will and preferences and idiosyncratic ideas and personality, and you enter into their life. When you really love someone, you get into their brain and their inner life, their soul. AI does not have a soul. AI is just a reflection of what the company wants from that product and of the ways that you interact with it and your preferences. This is one of the reasons why I think we need to be so careful about putting vulnerable young adults or elderly people into situations where their social needs are seemingly fulfilled by AIs rather than by other people: the AI is only ever going to be a reflection of what they want. It's not going to be a reflection of another self that cares about them.

Speaker 2:
[46:12] I think that makes sense. I mean, I think what you're saying is that AI is constantly anticipating. And some of these companies are saying, well, we won't just give them, we won't just give the user what they think they want. I mean, not all of these. Obviously, some of them are absolutely happy to shovel whatever the user wants and lead them down roads that maybe they could be steered away from. However, there are those who are saying, we will work in tandem with family or whatever to support this person, to bring out the best in them, to help them flourish. And part of me thinks, okay, so maybe that is the good part of AI, the humanistic AI, and yet I still feel uncomfortable with it.

Speaker 3:
[46:56] Yeah. I mean, look at it this way. I mean, the crucial distinction that we have to keep front and center, is that this is a tool and not a person. Think about somebody, a person, that you really love, Manoush, and your history of recent interactions with them. I have a very dear friend who I had coffee with yesterday. We spent half of the coffee arguing because she just wouldn't agree with me about whether this other friend had done something bad. And I found myself frustrated during the coffee, and it's just like, oh my gosh, just agree with me that I am right and you are wrong, that this is how we should think about this situation. She was not having it. AI is not going to disagree with me in the way that a real friend would. If I started to have a conversation with Claude or Gemini about what went down and whether or not I am the hero or I was the person that did the jerky thing, the AI, if you've engaged with AI recently, of course it's going to agree with me. It will say literally anything to keep me using the AI.

Speaker 2:
[48:03] You're totally right.

Speaker 3:
[48:05] My mistake. Whereas my real friends will never do that. You're probably G-rated on TED Radio Hour, but there is a famous Reddit thread called AITA, fill in the blank. Am I the bad person? It's not quite that; it starts with an A. But great, genuine friends, genuine people that we love, will totally tell us when we are being the jerk. AI will never call you out on your shenanigans. AI will do anything to improve and protect your self-image. That is one of the biggest reasons why we know it is not capable of loving us back. Because real love is challenging and frustrating and maddening, but it's somebody that's capable of actually understanding and engaging with our souls rather than just our self-image.

Speaker 2:
[49:00] Yeah, so that brings us back to where we started our conversation in the classroom. And I guess I'm wondering whether you feel that these big philosophical questions are going to become even more important in higher ed.

Speaker 3:
[49:17] Yes.

Speaker 2:
[49:18] As we see that AI can deliver information more readily, learning is going to have to change; the emphasis now has to be on these greater ideas, bigger ideas.

Speaker 3:
[49:31] I think there are two really mistaken views about higher education that I still hear all the time, but that I personally really want to challenge. One, and you named it, Manoush, is the idea that the point of an education is just to transfer knowledge from an older person to a younger person. Socrates, in Plato's Symposium, another one of the dialogues, actually makes fun of this idea of education. The metaphor is like trying to pour all the liquid in your brain into the younger person's brain, just transferring that knowledge. And that is not the point of an education. In the era of artificial intelligence, if we're just thinking about transferring information efficiently, it is pretty clear that AI and technology can do that far more efficiently than any kind of in-person school or university could. So if that's the point of education, we are toast. Luckily, that's not the point of education. One of the most important things that we can do in an education, especially in the era of AI, is not think that we are transferring knowledge, or that we are stamping students with our values. The point of an education is to give young people, or anyone, frankly, who decides to pursue higher education, the space and coaching and opportunity and experiences that help them care for their own souls. And this is really the heart of Greek virtue ethics, this idea that education is not transferring something; it is giving somebody the power to wake up and care for their own soul, to ask their own questions, to think very seriously about what they are called to in the good life. This, I think, sounds Dead Poets Society, it sounds very idealistic, but again, for 2,400 years in our civilization, this idea has animated some of the most beautiful educational institutions that human beings have been able to create. And I still believe in the deep magic.
Like, I still, I'm so grateful that people created those spaces and opportunities for me when I was a young person. And I think the most important thing that we can do for this next generation, it's something that AI could never do. It's giving them those human opportunities to figure out why the heck they're here.

Speaker 2:
[52:06] That was Notre Dame philosophy professor Meghan Sullivan. Her book is called The Good Life Method: Reasoning Through the Big Questions of Happiness, Faith, and Meaning. You can see her full talk at ted.com. Thank you so much for listening to our show this week. This episode was produced by Katie Monteleone and James Delahoussaye. It was edited by Sanaz Meshkinpour and me. Our production staff at NPR also includes Matthew Cloutier, Fiona Geiran, Phoebe Lett, Rachel Faulkner White and Harsha Nahata. Our executive producer is Irene Noguchi. Our audio engineer was David Greenberg. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Helen Walters, Roxanne Hai Lash and Daniella Balarezo. I'm Manoush Zomorodi and you have been listening to the TED Radio Hour from NPR.

Speaker 1:
[52:57] This message comes from Fixable, a podcast from TED. Hear our expert hosts offer unfiltered advice to help you solve any work issue the right way, from negotiating with confidence to giving thoughtful feedback. Find Fixable wherever you listen. This message comes from Mint Mobile. If you're tired of spending hundreds on big wireless bills, bogus fees and free perks, Mint Mobile is for you. Shop plans at mintmobile.com/switch. Taxes and fees extra. See Mint Mobile for details.

Speaker 2:
[53:29] This message comes from Charles Schwab with their original podcast, Choiceology. Choiceology is a show about the psychology and economics behind people's decisions. Download the latest episode and subscribe at schwab.com/podcast.