I'm John Walczak, host of the new podcast Missing in Arizona. And I'm Robert Fisher, one of the most wanted men in the world. We cloned his voice using AI.
In 2001, police say I killed my family and rigged my house to explode before escaping into the wilderness. Police believe he is alive and hiding somewhere. Join me. I'm going down in the cave. As I track down clues. I'm going to call the police and have you removed. Hunting. One of the most dangerous fugitives in the world. Robert Fisher. Do you recognize my voice? Listen to Missing in Arizona every Wednesday on the iHeartRadio app, Apple Podcasts, or wherever you get your favorite shows.
In the early morning hours of September 6, 2016, St. Louis rapper and activist Darren Seals was found murdered. That's what they gonna learn. On for death, on for nothing. Every day Darren would tell her, all right, ma, be prepared.
They are going to try to kill me. All episodes available now. Listen to After the Uprising, The Murder of Darren Seals on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. From iHeart Podcasts comes Does This Murder Make Me Look Gay?,
9-1-1, what's your emergency? Mastavati is dead! Featuring the star-studded talents of Michael Urie, Jonathan Freeman, Frankie Grande, Cheyenne Jackson, Robin de Jesus, and Kate McKinnon as Angela Lansfairie. Lick them, lick those toesies. Listen to Does This Murder Make Me Look Gay? as part of the Outspoken Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hey guys, it's Andrea Gunning with some big Betrayal news. I have been on location with some of the people you heard in Season 2, Ashley Avea and their family, to shoot a docuseries for Hulu. I'll let you know when the docuseries is available on Hulu later this year. We're also excited to announce that Betrayal will become a weekly series starting this summer.
Thanks to your support of this podcast, we'll be able to bring you many real-life stories of betrayal, making this community even stronger. So if you've been thinking about sharing your story, now is the time. Email us at BetrayalPod at gmail.com. That's BetrayalPod at gmail.com.
I want to share some news that affects parents and children everywhere. Our second season of Betrayal focused on families destroyed by child sexual abuse material, also called CSAM. The National Center for Missing and Exploited Children has reviewed over 322 million images and videos of child sexual exploitation. It's hard to wrap your head around that.
It's why we couldn't stay away from the topic last season. It's also been a big issue in Washington recently. Betrayal producer Carrie Hartman has been following developments. Carrie, I know you watched the hearing. What did you see?
Yeah, I watched it. It was fascinating. The Senate Judiciary Committee, they subpoenaed five CEOs of some of the biggest tech companies, Discord, Snap, Meta, X, you know, formerly Twitter, and TikTok. And the committee wants to advance several bills that address online safety for children. And this hearing, it got a ton of publicity. And at the beginning, Senate Judiciary Chair Dick Durbin explained
how the committee was feeling. These apps have changed the ways we live, work and play. But as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Your carefully crafted algorithms can be a powerful force on the lives of our children. Today, we'll hear from the CEOs of those companies.
Their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk. But the tech industry alone is not to blame for the situation we're in. Those of us in Congress need to look in the mirror.
This was a major issue for two New York Times reporters that you talked with earlier this season. Yeah. Why don't we actually revisit that interview with Gabriel Dance and Michael Keller? We spoke with people who said that as early as 2000, tech companies knew this was a very serious problem and were doing nothing to solve it.
In 2009, when scanning technology was introduced, we knew that it could be effective in helping stem the problem. Still, tech companies were not using it. I would say, if you talk with most technology policy people about whether companies face real pressure to act, their answer would be no.
Technology companies don't have that much pressure to get rid of harmful content on their platform because Section 230 of the Communications Decency Act shields technology companies from any liability for content that users post.
Can you explain more about what Section 230 does? Okay, so Section 230 means any lawsuit seeking to hold a tech company liable for damages
won't go anywhere. They have immunity. So if Facebook, Discord, Snapchat, or X is storing or transmitting images of CSAM, for example, parents can't hold the company responsible. And try to imagine if it was your child's photo, and if that child was tricked into sending it.
But Section 230 was passed almost 30 years ago, back in 1996. No one back then could have imagined TikTok or Instagram, or even sextortion. People still had their photos developed at the drugstore.
And I have to tell you how real this is. I mean, this happened to a close friend of mine, to her child. You take a vulnerable kid and a savvy adult with no conscience and no barriers. Right. So why was there a hearing now? It seems in recent months that frustration with tech's immunity is just getting bigger on both sides of the aisle. And look, this isn't the first time Congress has summoned tech leaders for a shaming session.
But I was really curious. Was this more than a shaming session? So I reached out to Politico technology reporter Rebecca Kern. She was in the room for this whole thing, and she shared some of her thoughts. Oh, interesting. I've been covering efforts in Congress to regulate social media companies and how they handle kids' online safety issues.
Typically, there's a lot of posturing from the senators. But in the room, it was very palpable, the emotion, because this time the committee members invited families whose children have died as a result, they say, of content they've been exposed to on the platforms.
A number of children have committed suicide over cyberbullying, and over a new phenomenon that I know you guys have covered in the podcast called sextortion, where organized criminal groups create fake accounts that pose as other children, coax illicit images out of children, and then extort them financially. Oh my gosh. Yeah. And the committee chair, Dick Durbin, co-sponsored the Stop CSAM Act.
That bill would hold platforms responsible if they host CSAM or make it available. And you're probably thinking, well, who would make those images available? But haven't you ever searched for something? Say you just took up skiing recently, so you want to see more images of skiing. The platform's algorithm then recommends more of that content because it thinks you like it.
Well, it does the same thing with nefarious and dangerous content. And Senator Ted Cruz went after Meta on exactly that point. Mr. Zuckerberg, in June of 2023, the Wall Street Journal reported that Instagram's recommendation systems...
were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material. In other words, this material wasn't just living on the dark corners of Instagram. Instagram was helping pedophiles find it by promoting graphic hashtags, including #PedWhore and #PreteenSex, to potential buyers. Instagram also displayed the following warning screen to individuals who were searching for child abuse material: These results may contain images of child sexual abuse. And then you gave users two choices, get resources or see results anyway. In what sane universe is there a link for see results anyway? How did Mark Zuckerberg respond to that? There's no good answer for that. But here's what he said.
Well, because we might be wrong. We try to trigger this warning, or we tried to, when we think that there's any chance that the results might be wrong. Here's more from Rebecca Kern. Tech companies will admit, and it is for sure not something they want on their platforms, they don't want to be hosting CSAM, and they take great efforts to remove it. And I will give them credit. They invested
millions of dollars into AI and machine learning to detect it early. But it's still there, and it gets spread across multiple platforms. These companies are self-policing and self-reporting, but we're depending on them to find it and shut it down. It's interesting that you bring that up, because a senator from Rhode Island, Senator Sheldon Whitehouse, commented on exactly that issue. We are here in this hearing today because, as a collective, your platforms really suck at policing themselves. In my view, Section 230 is a very significant part of that problem. Listen, there were great soundbites from senators, but that doesn't translate to policy, right?
Rebecca Kern pointed out that Section 230 served an important purpose, at least for a while. We wouldn't be leading the globe in these innovations without Section 230 and allowing them to flourish without lawsuits. But...
A lot of other senators are saying, okay, we allowed them to flourish and grow. Now we need to rein them in. And we're an outlier in the whole globe. Europe has been able to pass regulations and hold them accountable. And so a lot of people said it's time to take away this quote unquote sweetheart deal that we have given to tech companies.
I'm John Walczak, host of the new podcast Missing in Arizona. And I'm Robert Fisher, one of the most wanted men in the world. We cloned his voice using AI.
In 2001, police say I killed my family. First mom, then the kids. And rigged my house to explode. In a quiet suburb. This is the Beverly Hills of the Valley. Before escaping into the wilderness. There was sleet and hail and snow coming down. They found my wife's SUV. Right on the reservation boundary. And my dog, Blue. All I could think of is him gonna snipe me out of some tree.
But not me. Police believe he is alive and hiding somewhere. For two years, I've traveled the nation. They won't tell you anything. I'm going down in the cave. Tracking down clues. They were thinking that I picked him up and took him somewhere. If you keep asking me this, I'm going to call the police and have you removed. Searching for Robert Fisher. One of the most dangerous fugitives in the world.
Do you recognize my voice? Join the hunt. An exploding house, a family annihilation, and a disappearing act. Listen to Missing in Arizona every Wednesday on the iHeartRadio app, Apple Podcasts, or wherever you get your favorite shows.
New from Double Asterisk and iHeart Podcasts, a 10-part true crime podcast series. Emergency 911. There's a fire in my apartment. This car is on fire. In the early morning hours of September 6, 2016, St. Louis rapper and iconic Ferguson activist Darren Seals was found shot dead. Every day Darren would tell her, they are going to try to kill me.
In 2016, a young man was killed on this block. I'm a podcast journalist. And I'm a former state senator, Maria Chappelle-Nadal. I was in the movement with Darren, and I've spent two years with co-host Ray Nowosielski investigating his death. Even if I did want to tell you something, that's a dangerous game to play. The FBI did this to him. They've been following him for months. That's enough proof right there. All episodes available now.
Listen to After the Uprising Season 2, The Murder of Darren Seals, on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. ...ghouls and girls, and welcome to Haunting, Purgatory's premier podcast for all things afterlife. I'm your host, Teresa. We'll be bringing you different ghost stories each week, straight from the person who experienced it firsthand.
Some will be unsettling. When she was with her imaginary friend, she would turn and look at you, and you felt like something else was looking at you too. Some unnerving. The more I looked at it, I realized that the thumb looked more like a claw, like a demon. Some even downright terrifying. The things that I saw, heard, felt in that house were purely demonic. But all of them will be totally true.
Listen to Haunting on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Did any comments stand out to you while you were watching? There were a lot of them, but this one from Amy Klobuchar kind of got me. When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of over 700 planes.
So why aren't we taking the same type of decisive action on the danger of these platforms when we know these kids are dying? She has a point, right?
When everyone is worried about their own physical safety, boom, it's done. Exactly. And I got to tell you about another moment that really brought the room down. And that was when Meta CEO Mark Zuckerberg testified that social media doesn't really do any harm to kids. With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being. I take this very seriously.
Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. Did he say that with a straight face? He did. And there was some laughter. I mean, it was one very short moment of levity, but, you know, it's just so absurd. You don't have to be a social scientist or a psychologist to understand that social media impacts kids a lot.
Was there anyone there defending the work of technology companies? I mean, there are ways they've enriched all of our lives. Can you even remember life before Amazon? Life before Amazon? You mean going to a store and having to wait in line?
No, of course not. But all kidding aside, some senators did mention that and praised these companies for adding some value to society. But this hearing wasn't set up for pushback. It was really about these tech companies being told that draconian measures are coming if they don't do a better job.
But outside of this hearing, there is an advocacy group for the tech industry called NetChoice, and they are pushing back pretty hard. They have filed several lawsuits against states that are tired of waiting for the federal government to do something. Can you give me an example?
Sure. There's one where NetChoice is suing the Ohio Attorney General over the Social Media Parental Notification Act. This law requires companies to obtain parental consent before individuals younger than 16 can use platforms like Facebook, Instagram, YouTube, and Snapchat. So NetChoice does not support any of these bills being pushed by the Judiciary Committee. What do they support?
Well, free speech is what they hang their hat on. Free speech, free speech all the way. But one thing that they did promote that will be familiar to our Season 2 listeners is holding child abusers accountable by prosecuting more of them. You know, far too many reports of CSAM offenses are not investigated or prosecuted. We talked about this, Andrea, they're triaged, right? There's not enough law enforcement to go after all the people that are breaking these laws. And when they are able to go after them, they can prosecute them and at least put them away for some kind of prison time.
But despite NetChoice's pushback, there was some movement on one of the bills, called KOSA, or the Kids Online Safety Act.
Now, this bill wouldn't repeal Section 230. So we asked Rebecca Kern, what would it do? That one specifically would hold tech companies accountable by imposing a duty of care for them to make sure that their recommendation systems, their algorithms, do not recommend quote unquote harmful content. That is the key word. How do you define harmful?
For them, they're saying it's suicide content, it's eating disorder content. And Rebecca pointed out that some groups are worried about KOSA moving forward.
Progressive LGBTQ groups are saying, we're worried that this bill also empowers state attorneys general to sue over harmful content, and over how they would define that content, maybe trans content or LGBTQ content that these communities would want to see on the platforms. Some conservative attorneys general may want to take that down. So they said this could have an inadvertent negative impact on certain vulnerable youth.
While the CEOs were on the hot seat, and, you know, in the day before they were called to the hearing, they did make some concessions that are worth mentioning. Here is X CEO Linda Yaccarino. X supports the Stop CSAM Act.
The Kids Online Safety Act should continue to progress, and we will support the continuation to engage with it and ensure the protection of freedom of speech. And, you know, Snap CEO Evan Spiegel also came out in support of KOSA. And look, it's not everything, but maybe it's a start.
Here's Politico's Rebecca Kern again. These are the constant battles these platforms have to deal with, between privacy, which is such a strong protection in our country, and free speech and other protections, and safety. And there's, you know, no real mandate to put safety first. Do you think Section 230 has a chance of being repealed?
I asked Rebecca that question, and she seemed pretty doubtful. You know, it's not just the law passing, but it's the lawsuits that would follow, and how many years would it be caught up in court? I can't help but wonder, did this hearing make a difference? If you're asking, will it create more safety for children online, I think there is a reason for hope. There was some movement we've never seen before, but people need to keep applying pressure because that does make a difference.
Thank you to Politico's Rebecca Kern for her insight. And thanks to our listeners for your support of Betrayal. Remember, if you want to share your story for the new weekly series of Betrayal coming this summer, email us at BetrayalPod at gmail.com. That's Betrayal, P-O-D, at gmail.com. Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group in partnership with iHeart Podcasts. The show was executive produced by Nancy Glass and Jennifer Faison.
Hosted and produced by me, Andrea Gunning. Written and produced by Carrie Hartman. Also produced by Ben Fetterman. Associate producer, Kristen Malkuri. Our iHeart team is Allie Perry and Jessica Kreincheck. Audio editing and mixing by Matt Dalbecchio. Betrayal's theme composed by Oliver Baines. Music library provided by My Music. And for more podcasts from iHeart, visit the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.