
Andrew Marantz doesn’t want you to give up on the internet

2023/1/30

How to Be a Better Human

Chapters

The episode explores how the internet and social media algorithms influence our emotional responses and behaviors, often leading to extreme reactions and antisocial tendencies.

Transcript

You're listening to How to Be a Better Human. I'm your host, Chris Duffy. I grew up in New York City, and so one of the big differences between my childhood and friends who grew up in other places is how little time I spent in cars. Starting in around fifth grade, I could walk over to a friend's house or hop on the bus on my own. I remember the first time I took the subway on my own, my dad secretly followed me the whole way, and he rode in the car right behind me. I actually didn't know it at the time. He only told me that he did this recently.

But pretty soon after that, he got comfortable with me on the train and I was comfortable. And if I wanted to go somewhere, I would just take the subway regularly all over the city. I have so many memories of sitting calmly on the train as it whirs along and I'm peacefully reading my book.

So it was kind of surprising when as an adult living in a different state, I owned my first car and I finally started driving to work regularly. I was really surprised by road rage and not just other people's road rage, but my own. Because I cannot think of a time when I ever cursed someone out or flipped someone off while I was walking down the sidewalk or on a subway. I just can't imagine that. But inside of a car?

I would often just find myself boiling with rage. I remember this one time, a car cut me off at a crowded intersection and I had to slam on my brakes to avoid crashing. And I was so angry at this person for doing this dangerous move and for being such a fool. And I was yelling curse words at them and they were yelling curse words at me. And then I pulled alongside the car ready to throw my middle finger out the window at them. And I saw that this person that I was so furious at was an elderly woman. And I just instantly felt so embarrassed and...

And listen, to be clear, she was still cursing me out. She was still furious. She was flipping me off. But I just felt like, what am I doing? It was probably a 30 second interaction maximum, but it felt so bad. And part of the reason it felt so bad is I was kind of shocked to discover that I had that inside of me. There are situations where for each of us, we find it very difficult to be our best selves. In fact, situations that bring out the absolute worst in us.

Journalist Andrew Marantz has spent years studying the way that social media and the internet often put us in those situations where we are metaphorically cursing out the person in the car across from us. So what does that mean for our culture, for our society, for democracy? And is there anything we can do to change that situation for the better?

Here's a clip from Andrew's TED Talk. Facts do not drive conversation online. What drives conversation online is emotion. See, the original premise of social media was that it was going to bring us all together, make the world more open and tolerant and fair, and it did some of that. The social media algorithms have never been built to distinguish between what's true or false, what's good or bad for society, what's prosocial and what's antisocial. That's just not what those algorithms do. A lot of what they do is measure engagement.

clicks, comments, shares, retweets, that kind of thing. And if you want your content to get engagement, it has to spark emotion. So we've ended up in this bizarre dynamic online where some people see bigoted propaganda as being edgy or being dangerous and cool, and people see basic truth and human decency as pearl-clutching or virtue signaling or just boring. And the social media algorithms, whether intentionally or not, they have incentivized this.

Because bigoted propaganda is great for engagement. Everyone clicks on it. Everyone comments on it, whether they love it or they hate it. We will be right back with more from Andrew Marantz after this quick break. And I promise you, the conversation that we are going to have is going to be fully optimized to spark your emotional engagement. So do not go anywhere.

How to Be a Better Human is brought to you by Progressive Insurance. What if comparing car insurance rates was as easy as putting on your favorite podcast? With Progressive, it is. Just visit the Progressive website to quote with all the coverages you want. You'll see Progressive's direct rate, then their tool will provide options from other companies so you can compare. All you need to do is choose the rate and coverage you like. Quote today at Progressive.com to join the over 28 million drivers who trust Progressive.

Progressive Casualty Insurance Company & Affiliates. Comparison rates not available in all states or situations. Prices vary based on how you buy.

Hello, hello. I'm Malik. I'm Jamie. And this is World Gone Wrong, where we discuss the unprecedented times we're living through. Can your manager still schedule you for night shifts after that werewolf bit you? My ex-boyfriend was replaced by an alien body snatcher, but I think I like him better now. Who is this dude showing up in everyone's old pictures? My friend says the sewer alligators are reading maps now. When did the kudzu start making that humming sound?

We are just your normal millennial roommates processing our feelings about a chaotic world in front of some microphones. World Gone Wrong, a new fiction podcast from Audacious Machine Creative, creators of Unwell, a Midwestern Gothic Mystery. Learn more at audaciousmachinecreative.com. Find World Gone Wrong in all the regular places you find podcasts. I love you so much.

I mean, you could, like, up the energy a little bit. You could up the energy. I actually don't take notes. That was good. I'm just kidding. You sounded great. So did you. And we are back. Today, we're talking with journalist Andrew Marantz about how social media and the internet are built for emotion.

Hi, I'm Andrew Marantz. I'm a staff writer at The New Yorker magazine, and I also wrote a book called Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation. So obviously you study some pretty dark stuff. You have done a lot of research into worlds that many of us would avoid. How did you get into this in the first place? Why is this how you decided to spend your time? Yeah, I...

wonder that a lot. What I started with was kind of noticing, okay, it seems like the way we communicate and understand the world is increasingly through these algorithmic social platforms. Seems like those algorithmic social platforms are really gnarly places to be. And this is before even getting into any of the empirical social science of it. It just feels like

personally, anecdotally, like it's sometimes really gross to hang out on the internet. And then my kind of narrative reporter instinct was to say, okay,

Rather than, you know, just talking to experts or coming up with a kind of polemic screed about why the Internet is bad, why don't I go try to find the people who are making it bad and see if I can hang out with them and specifically watch them do what they do? So part of it was talking to them, but part of it was actually sitting next to them as they did it and saying, OK, if you're going to try to use social media to break democracy, can I be there while you do it? And still, weirdly to me, a lot of them said, yeah, sure.

So many of the parts of your book that are fascinating and gripping, and that read like, how can this be real, are those moments where you're right next to someone who is doing something that I think many people, regardless of where you sit on the political spectrum, would think of as objectionable and really deeply problematic. And yet they're kind of just happy to be getting attention in any way. And so you feed into that as a journalist, of like, great, I'll give you access, because that's more attention. Who cares if it's going to make me look bad? Right.

Obviously, as a journalist, mostly you're focused on documenting this stuff and getting it out in the open. But you must have also had ideas about how we can make the Internet less of a gross, troubling, scary place. Yeah. First, I wrestled all the time with the ethical conundrum of having a transactional relationship with people who I fundamentally distrusted, somebody who doesn't think our

democracy should exist or doesn't think I should exist as a Jew or doesn't think, you know, trans people should exist. Like, that was very, very uncomfortable for me to be playing into,

as you say, giving them attention that they craved. And there's no one-size-fits-all answer to that ethical conundrum, right? When do you give those people attention and when do you not? And so my editors and fact-checkers and I, and people in my life, we would just constantly try to gauge, OK, when does the value of, as you say, exposing or showing patterns or informing readers about how this stuff works, when does that outweigh the cost of

essentially entering into an attentional transaction with this person who I fundamentally think is kind of a bad faith player. And you've been doing this for so long, too, that, you know, you started this reporting at a time when people were like, oh, come on, what people say on the Internet doesn't matter. The Internet isn't real life. And then obviously we've all seen that what people say on the Internet often has very real consequences in the real world. And the whole idea that the Internet isn't real life has kind of fallen apart a little bit, even if there are elements of truth in that.

sentence. Yeah, yeah, yeah. I think we're thankfully sort of beyond the

is Twitter real life or not debates of, you know, 2016 or 17. I think there is still truth to, let's say, the idea that maybe you can win this or that political campaign by ignoring what people on Twitter want you to do and paying more attention to what constituents on the ground want you to do. Right. So there are still versions of the Internet is not real life that you could kind of salvage. But the notion that, yeah, as you say, when I started doing this stuff in 2014, 2015,

Yeah, I did get a fair amount of people saying, okay, so there's some fringe people who have a website somewhere, like who really cares? And yeah, you don't get that much of that anymore. Are there things that the regular person can do to make the internet less of a cesspool or less dangerous or even on the positive side, like a friendlier, nicer place that they enjoy being more?

Sometimes what people will ask me for is like, OK, what are the five commandments of Internet life that will equip me for any situation? And like anything, the nostrums, the rules of thumb are not going to get you all that far. So going back to what we were just talking about, 2015, 2014, the kind of basic logic of don't feed the trolls had not yet

been mainstreamed. So you got a lot of this, and you still see some of it, amplification along the lines of, can you believe this awful person said this thing? And that, I think, is one good place to start: you do not always have to amplify everything that you strongly dislike or strongly like or have an emotional reaction to. I think it's surprisingly alluring, the temptation to say,

I saw this thing and it freaked me out and I want everyone to know about it. And I just think it's important to step back and remember that was how

Donald Trump ran for president, not even in 2016, but he tried to launch a run in 2012, essentially around being a birther, you know, not believing that Barack Obama was born in the United States, a thing that nobody had to pay attention to, right? Like con man, business guy says ridiculous thing does not inherently have to be a story. It was only a story because it incited and enraged enough people that

they felt we have to tell people that this guy is saying this outrageous thing. The idea of, like, "you could always not" as the first rule of the internet, I love it. I mean, I think it's really funny. And obviously, like, instantly anyone who's been online realizes, oh yeah, that is an important thing that people could know. It also makes me think,

I used to work in an elementary school. You are the parent of two young children. And sometimes when I am out in the real world or when I'm interacting with people online, I just realize how all of us, all adult humans, are also just small children, because the number one thing that a kid does is, if you give them a big reaction, they're going to keep doing that thing, whether it's a positive reaction or a negative reaction. Like, oh, I got attention. That's a thing that I could do.

So I wonder if you as a parent of young kids see the connection between how you interact on social media and how you interact with your sons. Yeah, I see it all the time. I'm the parental figure in my house and the parental figure in the house that is social media is Mark Zuckerberg or Elon Musk. And so they are making paternalistic choices, whether they admit it or not, about what the

users of their platform are doing. I call them in my book the new gatekeepers. And I call them that sort of pointedly because that's the last thing they want to be called, right? The people who run social media platforms, they want to be seen as liberators. They want to be seen as, we're taking down the gatekeepers, we are disrupting, we're innovating. And so they don't take responsibility for the power that

they have, which as anyone who has seen the Spider-Man movies knows is against the rules. These people have this immense power and responsibility to be shaping people's behavior and they are shaping people's behavior, whether through commission or omission, whether through intention or recklessness, they are shaping people's behavior.

If you are a bizarre, negligent, erratic parent, the way Elon Musk is both at Twitter and apparently in life, you can't shape behavior in a coherent way. So you're giving people all kinds of constant, contradictory, frenetic informational signals about, and direct incentives about, what they should be doing. With children, it is all about attention and dopamine and these tiny feedback loops. And as we all know,

The whole premise of the business model of social media is a giant dopamine slot machine. Some of the recent parenting stuff I've seen in my own life is...

Attention is the big one, but there are different kinds of attention, and there are also different ways to get to the root cause behind it. Right. So we've entered the toilet-word phase of my five-year-old's life. And social media has never left that phase. Exactly. He's playing the slot machine and seeing, when I say one of these seven words you're not supposed to say on the playground, it gets a big laugh. He's going back to that again and again.

That's kind of all you need to know about the basic mechanism of the thing. And then we are so attuned as human beings to finding the patterns that will make us feel more loved and accepted that that has become one of the biggest business models in the world. If the only lever I have is disciplinary carrots and sticks, if all I can do is say, I will punish you if you say the naughty word or I will reward you if you don't.

Both of those are extremely limited, right? Because I'm not touching the root cause of why this is happening. I also can't say be a different kind of person than you are, right? I can't just say have a fully developed frontal cortex and don't be interested in what the kids on the playground think of you, right? That's also not realistic. But sometimes, and it doesn't happen all the time, I can get to a root cause and say, okay,

Why were you so interested in getting this kind of attention? And sometimes he will say the kids were kind of ganging up on me. And then you're at the level of root cause instead of at the level of do I take away your granola bar or do I not? And the very direct parallel with social media stuff is that we often start and end the conversation at the

OK, this person is saying toilet words or this person is saying Nazi words or this person is, you know, doing this behavior that we find distasteful. Do we ban the account or not? Do we freeze the account or not? Do we set a rule around it or not? And those are all valid questions, but they are not getting anywhere near anything like a root cause.

So in your work, in your journalism and in your book, you often have sat with individual people who are bad actors, right? Whether that person is as extreme as a neo-Nazi or, you know, extreme on a different end, someone who is deliberately putting disinformation out online, trying to spread things that they know are false because they know it'll get attention or achieve an end. So there obviously are real bad actors out there.

But one of the big things that I think changed the way that I see social media in general is this idea you talk about in your book, which is that overall, even if everyone was a good actor, that social media is designed to prioritize certain types of emotions and not others.

Yeah, people will say they'll try to make some argument about human nature or they'll say, well, you know, human beings gravitate toward things that are on the extremes or human beings want to be emotionally stimulated or some sort of vague thing. And what that leaves out is that not all emotions are created equal and not all emotions are equally incentivized. These social media algorithms that we have come to see as normal and default have

made very specific choices about how to boost certain things in the algorithm. And the most basic choice they've made is to boost things that are emotionally engaging around what social scientists refer to as high arousal emotions, things that make your blood boil, things that make your heart rate increase. These are literally measurable in a lab. Fear, rage, excitement. Some of them are positive, some of them are negative.

Part of the reason my book is called Antisocial is not just because I was with some bad people, but because there are prosocial and antisocial feelings, meaning prosocial things that bring us together, antisocial things that drive us apart. It just so happens that there are more high arousal emotions that tend toward antisocial ends, as we have now seen. So you don't get extra points on the board.

because you made someone think or made someone feel something. The only way you get extra points on the board is if somebody takes an action in response to your post. They retweet it, they like it, they share it, they dislike it, they hate retweet it, whatever.
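To make that "points on the board" mechanic concrete, here is a minimal, purely illustrative Python sketch of the kind of engagement-based ranking being described. It is not any platform's actual code: the Post fields, the equal weighting of actions, and the rank_feed function are all hypothetical. The point is only that the score counts actions, with likes, shares, and angry replies all treated alike, and has no input at all for truth, accuracy, or social value.

```python
# Illustrative sketch only -- not any real platform's ranking code.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int = 0
    shares: int = 0
    replies: int = 0  # angry "hate retweets" and comments count the same as praise


def engagement_score(post: Post) -> int:
    # Every action adds to the score; there is no signal here for
    # "true", "false", "prosocial", or "antisocial" -- only engagement.
    return post.likes + post.shares + post.replies


def rank_feed(posts: list[Post]) -> list[Post]:
    # The highest-engagement posts rise to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("calm, accurate explainer", likes=12, shares=3),
        Post("outrage bait", likes=40, shares=55, replies=120),
    ])
    print([p.text for p in feed])  # the outrage bait ranks first
```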

The things that make you more likely to take an action are high arousal emotions. So if somebody listens to us talking right now and they have a very strong emotion of, I feel edified and engaged and connected and I feel part of a community, those are really good pro-social emotions that I hope we can try to foster in people. But those are not emotions that are strongly associated with

I will necessarily take an action. I mean, maybe they will tell a friend, maybe they will, you know, mention it to someone, but it doesn't necessarily mean that they're going to smash that like button. If you are feeling really keyed up and like, can you believe that this

jerk said this outrageous thing, that's what makes you smash the button. And so it's really that simple. It's just the mechanic of the thing is built around emotions that on average are not good for us. So what can we do if we're online and we're noticing that we're in a high arousal state? Like, how do you channel your emotions in a way that's productive and doesn't just give in to the negative antisocial parts of this? I'm just going to fully give the dad answer to all of these. Take a breath.

You know, stretch your body and wait till your body feels safe. There are giant structural things that are beyond any one person's capacity to change single-handedly. So that's the demobilizing part. The mobilizing part is: you are the coin in the slot machine. It's you, your attention. So you can choose where that attention goes or doesn't go. It's hard to choose because these

supercomputers are designed to scramble your brain. But you can step back and disengage, and sometimes that means not doing anything. Sometimes that means closing the laptop or throwing your phone under your bed. But sometimes it just means having a two-second break between the thing you feel compelled to do and the thing that you...

end up doing. And also sometimes it means just knowing how something works. Like, the way I sort of think about narrative journalism is, I don't always think that I can shed light on something and automatically change it. You know, it's not necessarily, I showed that this person was innocent and then, you know, they were released from prison. It's great when that happens. But other times it's just being aware, in the same way that being aware of, you know,

reading Kitchen Confidential and understanding how they make fish in a big restaurant just might make you think twice before you order the fish. When you have just a basic awareness of how this stuff works, you just move through it differently. Yeah, it doesn't necessarily mean that you

throw your phone in a river or disengage or delete everything. You just kind of feel less like a cog and more a little bit like your hand is on the levers of the machine, just in your own personal way. I actually would strongly encourage everyone to throw their phones into a river. I want to make sure you have a plan for how to listen to podcasts once that phone is gone. So, you know, make sure you've downloaded them on your computer or on some other device.

Okay, and while you are doing that downloading so that you can safely destroy your phone while still finishing this podcast, we are going to take a quick break for some podcast ads. Warmer, sunnier days are calling. Fuel up for them with Factor's no prep, no mess meals. You can meet your wellness goals thanks to this menu of chef-crafted meals with options like Calorie Smart, Protein Plus, Veggie Vegan, or Keto. And Factor has fresh, never frozen meals, which are dietician approved and ready to eat in just two minutes.

That sounds like a dream come true. I cannot wait. So no matter how busy you are, you will always have time to enjoy nutritious, great-tasting meals. Make today the day that you kickstart a new healthy routine. What are you waiting for? Head to factormeals.com slash betterhuman50 and use code betterhuman50 to get 50% off your first box plus 20% off your next month. That's code betterhuman50 at

factormeals.com slash betterhuman50 to get 50% off your first box, plus 20% off your next month while your subscription is active.

Support for this show comes from Brooks. My friends at Brooks sent me amazing new Go 16s that I have been running with. The Go 16 has made me feel lighter and more energetic when I run out the door in the morning. They have soft cushioning through a technology called the Nitrogen Infused DNA Loft V3. It offers just the right softness.

There's also engineered air mesh on the upper side of the shoe that provides the right amount of stretch and structure. It'll turn everyday miles into everyday endorphins. That sounds good, right? Let's run there. Visit brooksrunning.com to learn more. So the number one thing that has to happen here is social networks need to fix their platforms. So if you're listening to my voice and you work at a social media company or you invest in one or, I don't know, own one,

This tip is for you. If you have been optimizing for maximum emotional engagement and maximum emotional engagement turns out to be actively harming the world, it's time to optimize for something else. Okay, so that was another clip from Andrew's TED talk, and he was addressing the responsibility that tech companies have to bring about change.

But what about the role of the government? What policies would be helpful in dealing with these issues? You're not saying that we should just be having the equivalent of prohibition where we ban all social media and we just say that's going to work. But obviously, we have limitations and we have laws around alcohol, and that helps mitigate some of the harm. So what are some of the policy changes that you think people should push for? People who think that they are flummoxed by what to do here, I just want to

uplift you and tell you you're right to feel that way, because I've been thinking about this for a few years and I am right where you are. Look, I have views about what the FTC could do, what the SEC could do. I think that Meta is too big a company. I think Amazon is too big a company. I'm personally in favor of taking antitrust action against companies that are that big. That

kind of thing, which I would advocate for, I think still in a way isn't thinking big enough because a way to get at how big this problem is, right, is that with alcohol or cigarettes or pharmaceuticals or food or cars, I feel that the government has a strong, robust role to play in regulating those things, making them safe.

There are times when the government has totally failed at that. There are times when it's been moderately more successful. We can find policy fixes to a lot of those things. I think there are two slippery slopes. There's the slippery slope of doing nothing and letting the most powerful social communication tools in history become garbage fires of bigotry. And I think there's also a slippery slope of what if the government gets too involved in legislating and regulating which speech should exist. And we're getting to a point

I think a slightly dangerous place where people who take one seriously have a hard time taking the other seriously. And so you see a lot of kind of free speech absolutist stuff. Again, Elon Musk famously, his reasoning for taking over Twitter was I'm a free speech absolutist. And I think that any speech that's legally allowed in the United States should be allowed on Twitter.

Then about five minutes into owning the company, he realized that what everyone had told him was true, which is that's an incoherent mission statement. As a content moderator, you cannot run a social media company that way. And a lot of the chaos we're seeing is a result of that. I've also learned from you, this is a phrase that I've learned from you, the idea that freedom of speech does not mean freedom of reach.

And that being allowed to say whatever you want does not mean that whatever you say has to be promoted through a microphone to millions of people who don't know you. Exactly. Exactly. So it's just so simplistic as to be disingenuous, I think. The question is not whether anyone can say whatever they want. The question is, how much do the algorithms amplify and promote it in ways that are invisible to the average user? So.

That whole canard of, well, if you love free speech, then you'll give me as a plutocrat a free pass when I do nothing to prevent my thing from becoming a garbage fire. That whole cascade of logic, I think, has been discredited, at least to me. Not everyone feels that way. But we've come a long way from, again, the beginning of when I started covering this. It was much easier to get away with saying stuff like that. I hear what you're saying. And I think it's a really important point that there are two

competing concerns, right? That we don't want to pretend that free speech just on its own will work. And that we also don't want to pretend that, like, massive regulation of what you're allowed to say can work, right? But obviously,

there is a middle ground between those two. And as someone who's thought about it a lot, I'm curious to hear what policies you think need to change, or, even if you don't have a specific proposal, what is it that you think needs to be tweaked that's not getting tweaked in the right way? So I would propose breaking up companies that are too big, or not allowing further acquisitions in ways that seem against the spirit of antitrust. Section 230, which

makes it hard to hold these companies legally liable, that's another area where policymakers can tweak things. The reason I say it's not big enough is that I really think we should be cautious about investing too much power in any government entity to make recommendations about what speech should exist. So where does that leave us in terms of policy recommendations? I think it's kind of like the climate thing where

There are a lot of incentives that can be set at a governmental level. There are a lot of things that can be done from a corporate level. But ultimately, there's a huge sort of ethical shift that needs to happen, which is we need to stop burning dead carbon in the ground and we need to start finding entire new economies that can power us. We really need to move beyond

high arousal, emotional, algorithmic social media, full stop. So the things that people should advocate for, responsible government regulation of social media and all that stuff, I think that's all good to think about. But I would actually encourage people to think even bigger than that. Another thing that you have obviously looked at a lot is extremism and how people become extremists online and get radicalized.

This show is, you know, people listen all over the world. I'm sure there's lots of people listening who are outside of the U.S. Online extremism is happening across the globe. It's not just a U.S. issue, even though these companies are often based in the U.S. How do you see the global picture of online extremism? As long as we have a high arousal slot machine based information system, there will be people who will be radicalized by it in various ways. It seems like

Kanye West, Elon Musk, Donald Trump, a lot of people have exhibited similar symptoms of what you could call, you know, algorithmic brain poisoning. I spent a lot of time with people who were brought down a particular rabbit hole of male supremacy, white supremacy, antisemitism, whatever the case may be. And it was very easy to see that as a fringe thing. But when I saw Kanye West going down this path,

I could almost immediately guess, almost like a kind of bingo card: oh, I wonder if he's going to start talking about the JQ. The JQ is short for the Jewish question. Oh, there he goes, he started talking about the Jews. Like, he says out loud everything he's thinking. Given what he had been looking at before, it was not very hard for me to see where the algorithms would push him next.

Personally, that breaks my heart as a Kanye fan. But systemically, it's not clear to me how you can get that to stop happening at scale. Let's take it out of the realm of it being someone who is kind of distant and famous. And instead, you know, millions of people have this experience where someone that they love, a friend or relative,

starts to say some things, it may not be as extreme as, like, overt antisemitism, but they found things online that are troubling or that you disagree with. You feel like they're kind of becoming more extreme with what they're engaging with online. What can we do if we have a person in our lives, and it feels like they're starting on that path, but they're not at the end? Which is obviously a much more complicated problem. I think the first step is to

try to meet people where they are, try to listen. And listening obviously doesn't imply agreement or acceptance or excuse-making, but try to listen so that you actually know what they're saying, because often

you hear your, let's say, teenager say something and it freaks you out, and your immediate response is a kind of anxious, turtle-going-into-its-shell response. And you just sort of say, I know I don't like what you just said. And so, you know, stop it, or I'm going to show disapproval, or I'm going to forbid you from going to that, you know, YouTube channel or whatever. I feel like anyone who's tried that as a parent has had

not that much success with it. Let's just say in this example, your teenager says to you, I've been reading some really interesting things about demographics and birth rates and the future of European civilization. I would understand if you recoiled from that and said, I don't like that. Stop talking about that. The question for me is, what about that stuff is the person finding interesting or engaging with or what need is it meeting for them?

Some of it is all the most basic stuff, loneliness, longing for community, wanting to feel seen, wanting to feel like your identity is being reinforced. And some of it is just actual intellectual curiosity gone wrong. We can't really legislate that away or regulate that away or even on a person to person level forbid people from asking those questions.

I would like to see people providing better answers to those questions. What about your life is making it hard for you to make meaning and find community? What is causing you to feel alienated? What does it mean to have a personal and group identity that is constructive and not destructive, and positive-sum and not zero-sum? And the Internet is a big place. So some of that stuff is going to be really boilerplate and, you know,

It's a small world after all kind of stuff, but some of it can get pretty weedsy. And I think for some people, that would fill a vacuum that is currently out there. Something that you talk about in your book is how a lot of the really dangerous stuff online starts out kind of as a joke. And as a comedian, right, I see this all the time, of people being like, oh, don't take it so seriously. It's just a joke.

And you talk about how a lot of the extremist groups online start out by saying something and being like, it's just a joke. It's just a joke. It's just a joke. Laugh about it. And then bit by bit, you're like, OK, well, you hear those ideas a lot in jokes. What if it wasn't a joke? And then, oh, I'm just being ironic. And then it gets more serious the deeper you go down the rabbit hole.

And, you know, I'm saying this as a comedian. I don't want people to be wet blankets who can't take a joke or laugh about anything. But I wonder if there is a way to have these conversations where you kind of get a preview of what's coming down the road, because, in my uninformed opinion, it seems like that would stop people. As in, if you're like,

this joke leads to you being a neo-Nazi. People would be like, hold on, that's not where I want to go. But if you take all those steps, all of a sudden you are surrounded and you realize, oh, I'm friends with all these people who believe these things. Maybe I believe that too. And it takes a different kind of person to step out of that. The comedy thing is a great example, right? All these people say, it's just a joke. Why are you being a wet blanket? Why don't you get a sense of humor? And then

I mean, not to take it there, but that was also the response to Hitler, right? Why can't everyone see that he's just a clown? Like, he's literally being mocked by Charlie Chaplin. Why doesn't everyone get a grip? This guy is never going to have real power. I mean, that was the discourse in American media in 1936. So, you know, not everyone is Hitler. Not everyone is a Nazi.

Leaving yourself open to the possibility that some people are makes these things very confusing. I spent a lot of time in the book with the Proud Boys, a street gang of "Western chauvinists," as they call themselves, which is another bad thing to be, in my personal view. And they were started by a guy who, you know, said he was a professional comedian and said, you know,

why can't everybody see that this is a big joke? We're talking about memes and songs from Aladdin and why is everyone taking this so seriously? And then flash forward four years and Donald Trump is on a debate stage telling the Proud Boys to stand back and stand by. So it's very easy to seem like a pearl clutching alarmist. But leaving that possibility open in your mind, I think is a really important exercise. What would it mean

If you were to take seriously the idea that this joke is a step down a potentially slippery slope that could get really scary, what would that even mean? Again, to be really clear, I don't think that means that you ban the speech. And I don't even think it necessarily means that you don't want people telling those jokes. I think it just changes how you reflect on it and how you would encourage people in your life to respond to it. Do you have any sort of, like,

positive takeaway for the future of social media? I feel like you don't think that it's just doomed. So I'm curious, what is your positive takeaway for the future of social media or for the future of the Internet? You can create a social media for yourself that is based around the stuff you want it to be: seeing your friends' baby pictures and feeling good about that, being in touch with people you've lost touch with, all the stuff that they advertise to you, because it's the thing that people want.

You can create that for yourself. It's just harder than you think it is because there's always other stuff trying to suck you back in. OK, so, Andrew, we're coming to the end of our show. And one final question for you. So the show's called How to Be a Better Human. I'm curious, what is one way that you yourself are trying to be a better human right now?

I'm trying to meditate more. I feel like a lot of this stuff just boils down to what are you actually doing with your attention on a moment by moment basis? And you can train yourself to be better at that by just sitting and actually doing it. And it's very easy for me to tell myself that I don't have time to do that.

because I'm too busy, you know, catching up on The White Lotus or whatever, but it's something that I'm always getting better at finding time to do. And I'm always glad when I do it. Thank you so much, Andrew. I really appreciate it. Yeah, yeah, yeah, for sure. That is it for today's episode of How to Be a Better Human. Thank you so much for finishing the episode. Especially if you really did throw your phone into a river, you went the extra mile, and I just want to let you know that does not go unnoticed.

Thank you to today's guest, Andrew Marantz. His book is called Antisocial. I'm your host, Chris Duffy, and you can find more from me, including my weekly newsletter and information about my live show dates, at chrisduffycomedy.com. How to Be a Better Human is brought to you on the TED side by Anna Phelan, who's posting baby pictures of herself, Whitney Pennington-Rogers, who is supportively liking every single one of your posts, and Jimmy Gutierrez, who's posting only the very finest memes.

This episode was fact-checked by Julia Dickerson and Erica Yoon, who are strongly advocating for an emoji react button that says, sources needed. From PRX, our show is brought to you by Morgan Flannery, who is taking a deep breath and stepping away from her feed, Rosalind Tortosillas, who is turning off notifications, and Jocelyn Gonzalez, who keeps telling me that a million downloads isn't cool, but a billion downloads sure is.

And of course, thank you so much to you for listening to our show and making this all possible. We will be back next week with more episodes of How to Be a Better Human. PRX.