Amgen, a leading biotechnology company, needed a global financial company to facilitate funding and acquisitions, broaden its therapy reach, expand its pipeline, and accelerate bringing new and innovative medicines to patients in need globally. They found that partner in Citi, whose seamlessly connected banking, markets and services businesses can advise, finance and close deals around the world. Learn more at citi.com/clientstories.
I'm going to say something that I've never said before while making this podcast. What's that? It's cold in here. Are you about to give me a compliment?
No, that's going to have to wait for year three. But they did fix the ventilation in our studios, so now we have a nice cool breeze blowing through, where previously we had been suffocating and sweating in what amounts to a poorly ventilated closet. Yeah, you know, it's incredible the things they're doing with ventilation these days. And by these days, I mean the early 1870s. It took a while for that technology to get to us here in San Francisco, but it's here now, and I hope it never leaves us. I got chilly.
Are you chilly? No, I'm hot and bothered, Kevin. You know I always run hot, but when I know I'm about to drop some hot knowledge on people, I warm right up. Yeah, you know, Chappell Roan says it's like a hundred and ninety-nine degrees when you're doing it with me, and that is how I feel when I'm podcasting with you, with the word "it" meaning podcasting rather than how Chappell Roan meant it, which is something I'll tell you about when you're older.
I'm Kevin Roose, a tech columnist at The New York Times.
I'm Casey Newton from Platformer.
And this is Hard Fork. This week: how Elon Musk became the main character in the 2024 U.S. presidential election. Plus, journalist Laurie Segall joins us to discuss the tragic case of a teenager who developed an intimate relationship with an AI chatbot and later died by suicide. What should Silicon Valley do to make its apps safe for kids?
Well, Casey, we are less than two weeks away from the presidential election.
You've heard about this? I have. It is a very exciting time for me, Kevin, because as an undecided voter, I only have two weeks left to learn about the candidates, understand the differences in their policies, and then make up my mind about who should be the president.
Yeah, well, I look forward to you educating yourself and making up your mind. But in the meantime, I want to talk about the fact that there seems to be a third candidate. He's not technically a candidate, but I would say he is playing a major role in this campaign. I am talking, of course, about Elon Musk.
You know, Kevin, I heard somebody say this week that it feels like Elon Musk has somehow become the main character of this election, and I'm surprised by how much I don't really even think of that as an overstatement.
No, it does seem like he's become inescapable if you are following this campaign. And of course, you know, he's become a major, major supporter of former President Trump, and he's out on the stump trying to convince voters in these critical final weeks to throw their support behind him.
So let's just bring ourselves quickly up to speed here and review Elon Musk's involvement in this presidential election so far. Elon endorsed Donald Trump back in July, after the attempted assassination. He also announced that he was forming a pro-Trump PAC, the America PAC, a political action committee, and he has contributed more than seventy-five million dollars to that political action committee. And then more recently, he's been appearing at campaign rallies.
Let's take a listen.
If people don't know what's going on, if they don't know the truth, how can you make an informed vote? You must have free speech in order to have democracy. That's it. And then last weekend, Elon Musk announced that his PAC, the America PAC, would give out a million dollars a day to a random registered voter from one of seven swing states who signs a petition pledging support for the First and Second Amendments. So this is on top of the money that he'd already promised to people who refer other people to sign this petition. And it goes even further than that, because Wired reported this week that Elon's PAC has also paid thousands of dollars to
To X? The platform that he owns? X has new advertising? That doesn't happen very often.
Yes, these were for political ads in support of Trump's candidacy. The PAC is also taking a prominent role in the kind of get-out-the-vote operation for the Trump effort. They are sending canvassers around in swing states. They are leading some of these get-out-the-vote efforts. So just a really big effort by Elon Musk to throw his support behind Donald Trump to get him elected. Yeah.
And Kevin, it just cannot be overstated how unusual this is in the tech world, right? The sort of typical approach that most big business leaders take to an election is to stay out of it, right? And the reason is because typically, you are trying not to offend your customers, who may be voting for a different candidate.
And also, you're trying to hedge your bets, because you don't know who is going to win the election, and so you want to try to maintain good relationships with both. But that was the old way of doing things. And the Elon way of doing things is to leap in with both feet and do everything he can to get Trump in.
I mean, I think it's safe to say we've never seen anything like this, where one of the richest people in the world decides that it has become his personal mission to get one of the candidates elected, and uses a major social media platform as a tool to try to do that, and also just spends so much money, and not just in the sort of conventional ways. You know, donors, billionaires, give money to political action committees all the time. Bill Gates, we just learned this week from reporting that my colleague Teddy Schleifer did, donated fifty million dollars to a pro-Kamala Harris political action committee. So there's a long history of that. But this kind of direct outreach to voters, the personal involvement, the appearing on stage at rallies, is just not something we've seen.
And seventy-five million dollars is a huge amount of money for a presidential campaign. And still, in Silicon Valley we get numb to these sums, right? This is a world where OpenAI raised more than six billion dollars in their latest funding round. But in a presidential election, for one person to donate tens of millions of dollars is extraordinary. And you know, you just heard Kevin say that Elon Musk just outspent Bill Gates by fifty percent.
Given the fact that Elon Musk has emerged as a central character of the 2024 presidential election, we should talk today about what he's doing, why it matters, and whether you think it's going to work. Let's do it. The first piece of this that I want to talk about is this million-dollar lottery that Elon Musk is running for people who sign this petition pledging to support the First and Second Amendments. And he's given out a number of these sort of giant checks now to voters registered in one of these crucial swing states. But the most obvious question about this is, like,
is it legal? Yes. And is it? Well, it certainly
seems to be skirting the lines of legality, is what I'll say. So in this country, we have federal laws that make it illegal to pay for people's votes. You can't go up to someone who's standing at a polling place and say, hey, I'll give you twenty dollars if you vote for this person.
You also can't pay people to register to vote, or offer them anything of monetary value in exchange for registering to vote or for voting itself. So, you know, there have been a number of campaign finance experts who have looked at this and said that this probably does cross a legal line. And we also learned this week that the Justice Department even sent a letter to Elon's super PAC warning them that this action might violate federal law. But Elon Musk and his allies are arguing that this is not illegal, because he's not technically paying for votes or voter registration. What he's doing is just giving a million dollars to random people who sign a petition that is only open to registered voters in certain swing states, which feels like kind of a loophole to me. Yeah, it feels deeply cynical.
And we see this sort of behavior from billionaires all the time, in particular Elon Musk, where there is a rule, and he will either break it explicitly or will try to sort of break it via a bank-shot method, and then effectively just say, come at me, bro. Right? Oh, you're going to come after me?
What are you going to do? Are you going to fine me a little bit? Yeah, sure, go ahead and give me a fine. I'm worth upwards of two hundred and fifty billion dollars.
Yeah. I mean, what's so crazy to me about this is, like, I am old enough to remember the last presidential election, in which there were all these right-wing conspiracy theories going around about George Soros paying for people to attend rallies to come out in support of Democratic candidates. And you know, those were based on basically nonsense. But this is literally Elon Musk, the richest man in the world, directly paying to influence an election by giving millions of dollars away to voters who sign this petition, explicitly the thing that Republicans in the last cycle were targeting Democrats for doing.
You know, it makes me think of another example, Kevin, which is the Mark Zuckerberg example. In 2020, we were in the throes of the global pandemic. There was not a vaccine that was publicly available, and election administrators around the country were expecting record turnout. They were expecting more mail-in ballots than they had seen in previous elections.
And they were saying, we need additional resources to run this election, and they weren't getting a lot of help from the federal government. And this was a nonpartisan issue. They were not saying, hey, we need more votes from Democrats or more votes from Republicans.
They were just saying, if you want us to count all the votes and make sure that this election is fair, we need help. So this nonprofit steps in, and they raise hundreds of millions of dollars, and three hundred and fifty million of those dollars come from Mark Zuckerberg and his wife, Priscilla Chan.
And of course, after the 2016 election, Facebook had been accused of destroying democracy. And so they show up in 2020 and they say, hey, we're going to try to be part of the solution here, and we're going to try to make sure that all the votes get counted. And we're not putting our thumb on the scale for the Republicans or the Democrats. We're just saying, hey, we've got to count all the votes. So this happens, Biden wins the election, and Republicans go insane about the money that Zuckerberg spent.
They call them Zuckerbucks. They file complaints with the Federal Election Commission, and at least eight states pass laws that outlaw grants like the ones that the nonprofit gave to these election administrators. Okay? So here is a case where Zuckerberg doesn't try to have any partisan influence on the election at all, other than to let more people vote, and the Republicans lose their minds.
Well, these Republican congresspeople who got so mad at Zuckerberg and his Zuckerbucks, they are going to be really cheesed off when they hear that Elon Musk is just cutting checks to random voters.
I mean, Kevin, this one truly makes me lose my mind, because if Mark Zuckerberg were out there giving away a million dollars a day to people to vote for Kamala Harris, people like Ted Cruz and Jim Jordan would be trying to launch air strikes on Menlo Park. Like, nothing has ever infuriated them more than the very light, nonpartisan intervention that Mark Zuckerberg made in an election. And here you have the most partisan intervention imaginable by the owner of a social network. And there are
crickets. Yeah, I mean, to me, it just feels like both an incredibly cynical form of trying to persuade people by paying them to vote, but it also just feels like it's kind of an attention-grabbing strategy. Like, there's a theory here that is, like, if I'm going to spend millions of dollars trying to influence the results of a U.S. presidential election, and I'm Elon Musk, I can either do what most political donors do, which is you give money to a PAC, the PAC goes out and buys a bunch of ads on local TV stations and radio stations, and you send people out to knock on doors. Or I can engineer this kind of daily stunt, kind of like a game show, almost, where I'm giving away money.
I hand out these sort of cartoon checks that I'm presenting to people on stage at these events. And that's how you end up with people like us talking about it in media outlets. And in that way, I think it is actually, although it's a very cynical plan and potentially a very illegal one, I do think this is a pretty savvy one.
Yeah. I mean, this is where I think that Trump and Elon share a lot of DNA: they have realized that attention is the most valuable currency today, and that the way you can get attention very reliably is by shattering norms, norms that often existed for very good reason, by the way. But this is the way that you get that attention. And that leads to the second thing that I want to talk about, Kevin, which is that not only is Elon Musk a very rich person who is now spending tons of money to get Trump elected, he is also the owner of a still-significant social network. And that, to me, brings up a lot of questions around bias on platforms, something conservatives used to make a lot of noise about and no longer seem to have very much to say about.
It wasn't that long ago that we were having these interminable discussions and debates and committee hearings in the House and in the Senate about how important it was that social media sites, in particular, remain politically neutral.
Yeah, there was this unstated rule that if you were the CEO of a social network, for some reason you were supposed to take no position on elections, and your product could not reflect any political views whatsoever, and it could not give any party an advantage or a disadvantage. This was the worldview that was presented to us by Republicans between 2017 and 2021.
And I believe we actually have a montage of some of them talking.
How do both of you respond to the public concerns, and growing concerns, that your respective companies and other Silicon Valley companies are putting a thumb on the scale of political debate and shifting it in ways consistent with the political views of your employees? Would you pledge publicly to make every effort to neutralize bias within your online platforms? Many of us here today, and many
of those we represent, are deeply concerned about the possibility of political bias and discrimination by large internet media platforms. My Democrat colleagues suggest that when we criticize the bias against conservatives, we're somehow working the refs. But the analogy of working the refs assumes that it is legitimate even to think of you as refs. It assumes that you three Silicon Valley CEOs get to decide what political speech gets amplified or suppressed. Mr. Dorsey, who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear, and why do you persist in behaving as a Democratic super PAC? I want to ask: would the representatives of Twitter recognize that there is a real concern that there is an anti-conservative bias on Twitter's behalf? And would you recognize that this has to stop if Twitter is going to be viewed by both sides as a place where everybody's going to get fair treatment?
So, Casey, what's your reaction to hearing those clips? Look, there is something so rich about hearing them try to dismiss the idea that conservatives were only trying to work the refs there, and then to crash-land into 2024 and see that no one has anything to say about bias on social networks anymore. And, in fact, they were working the refs all along. Yes. I mean, it just seems so transparent that none of these people have said anything about the fact that one of the largest social media platforms in the world is now being explicitly used as a vehicle to swing a U.S. election.
Yeah. I mean, it's not even clear to me what other purpose Elon Musk thinks X has at this point. All he ever talks about is X as a vehicle for free speech and how free speech will save civilization. And what free speech means to Elon Musk is Elon Musk sharing his partisan opinions on the social network that he bought.
To be fair, he also posts about rockets sometimes.
Yes, yeah.
So let's talk about his motivations here for a minute. We've talked a little bit about this. Clearly this is, you know, an issue that has become very central to his own identity and his sense of self and his mission in this world. What do you think has made him want to jump into the presidential election in such an aggressive way?
Yeah. So I think probably the most obvious thing to point out is that Elon Musk and his companies have many, many ties to the federal government, and that if he can pick his chosen person to become the president of the United States, he will have a lot of influence he can exert to ensure that those contracts continue to exist and grow over time, right? So both Tesla and SpaceX have government contracts.
The government also, of course, provides a lot of regulatory oversight over Tesla, SpaceX and Neuralink, in addition to X. And so all of that right there gives Elon Musk a lot of reason to care. He has also found, as he has cozied up to Trump, that Trump has apparently said to him, I will give you some sort of informal advisory role within my administration that will allow you to have even more influence than you have had up to this point. And that was not an offer he was ever going to get from a Democratic administration.
For sure. And there was a great story by my colleagues at The New York Times today, all about the ties between different federal departments and agencies that have contracts with Elon Musk's companies, and the fact that he could be appointed to this sort of advisory role where he's in charge of what they're calling the Department of Government Efficiency, which is a joke, because the acronym spells DOGE, his favorite cryptocurrency. I just think it's important to note that this is very silly. But it could happen. He could be put in charge of some kind of effort to streamline the federal government. And if that happens, he would be in charge of potentially firing the people who regulate his companies, or changing out the leadership at some of the agencies that are responsible for things like regulating Tesla and SpaceX. And so obviously, that would be a conflict of interest, but it's one that would potentially make him able to operate his businesses however he wants to.
So I think that's a really important explanation, but I don't think it really explains all of it, because very rich people always have influence in government, and there is no reason to think that a much quieter, calmer, less controversial Elon Musk could not have had essentially equal influence in both a Republican and a Democratic administration.
I'm wondering if there is something here related to the fact that Elon Musk is just wealthy on a scale that we have never seen before. Um, you know, we have this concept of f-you money, you know, basically the idea that if you're rich enough, no one can tell you anything, because you're going to be fine either way. And, like, no one has had f-you money in the way that Elon Musk has f-you money. And what he has decided to do with that money is to say, I'm just going to do everything I can to realize my own political beliefs.
I am not going to play both sides. I am not going to hedge my bets. I am going to go all in on one candidate, because I think that serves my interests the best. And there is nothing I will not do in order to achieve that reality.
So, to bring this back to tech and the platforms for a minute: do you think this election cycle represents the end of the debate over social media neutrality? Will we ever hear politicians complaining again about the fact that some social media platform is being unfair to one side or the other? Or will everyone from now on just be able to point to what Elon Musk is doing on X and say, well, he did it, so we can do it in the opposite direction?
Well, they should. And you know, by the way, I am not somebody who ever believed that social networks should be neutral. I thought they had good business reasons for being neutral, and I thought that to the extent they were going to try to have an influence in politics, they should be really transparent about that. But look, if you build a company, I think, you know, you have the right to express a political viewpoint, and I don't think that Elon should be restrained in that way.
But to your question, absolutely. If ever again we're having conversations about, oh, you know, why was this conservative shadow-banned on Facebook, and what sort of bias exists, we should shut those conversations down pretty quickly, because I think we have seen in this election that there is nothing restraining people from just sharing their political views when they own the social network. I want to see if I can tell the story of what actually happened in the 2020 election as it relates to allegations of bias, because I think it's really telling. One of the things that Trump did in 2020 that got a lot of attention was to say that mail-in voting would lead to widespread fraud in the election. And this was, I believe, a political strategy to preemptively delegitimize the election.
Trump wanted to prime people, in the event that he did lose, so that he could say, aha, I've been telling you all along, there would be massive fraud, and I didn't really lose. And the platforms at that time, including Twitter, stood up against this, and they said, no, no, no, we're not going to let you abuse our platform this way. We know that mail-in voting does not lead to widespread voter fraud. And so we're going to put a notice on your posts that directs people to good, high-quality information about this.
Now, I don't think this had a huge effect on the outcome of the election, but I do think it was important, because it was the platforms saying, we have values. We know the truth. We do not want our platform to be abused to undermine the democracy of the United States.
And this is the thing that truly upset the right wing, because that was the platforms interfering with their political project, which was to preemptively delegitimize the results of an election. So then the election happens, and Biden wins, and we come to January 6. And what happens? An army of people who believe that the election was not legitimate commit huge violence and try to prevent the peaceful transfer of power.
So why do I tell this little story? Well, in the 2020 election, we still had platforms that were willing to take those stands and to play whatever small part they could in ensuring that their platforms were not used to undermine democracy. And then we fast-forward to 2024.
And now the owner of one of those platforms has not only said, we are no longer going to append these little notes to the end of obviously bogus tweets; the owner of the platform is going to be the one doing the posting, sending out push notifications to millions of people saying, look at this, and leveraging the trust and the credibility that he still has with a large audience to do the same kind of delegitimizing of the election that we saw in 2020. So that, to me, is the really dangerous thing, right? So many times, these discussions of, you know, disinformation and bias, they feel so abstract.
I just want to remind people what happened the last time somebody tried to delegitimize the results of a presidential election: people died, and we almost lost our democracy. Yeah. So that is what is at stake here.
Yeah. And I think one of the interesting things that this brings up long term is whether this is sort of a new model for very wealthy, powerful people of getting involved in politics, whether or not this last-minute push by Elon Musk on behalf of Donald Trump works.
I would not be surprised if, four years from now, in the next election, Democratic billionaires look at what Elon Musk is doing today in Pennsylvania and all these swing states and they say, well, I can do that too. I'm not just going to cut checks to a super PAC. I'm actually going to use my power and influence. Maybe I have an ownership interest in a tech company of some kind.
I'm going to use that to push for my preferred candidate. I think we're really entering the era of the micromanaging billionaire donor. I think what we are going to see in future election cycles is people looking at Elon Musk and his actions in this election cycle and saying, maybe I can do a better job of this than the pros.
And this is the issue with shattering these norms, right? Once one person has done it, it becomes much easier for the next person to do it. And it can lead to a kind of race to the bottom. I think a really bad outcome for our democracy is different billionaires on different sides using all of their money to just advance obviously false ideas, to flood networks with AI slop, and everything else that you can imagine. But again, because that glass has been broken, it is hard for me to imagine other people not wanting to emulate it.
Do you think that X has a different future depending on whether Donald Trump or Kamala Harris wins the election? Yes, I mean, I think everything has a different future depending on who wins the election. But, you know, what do we imagine X is under a Trump administration? I think it becomes a house organ of the administration.
It becomes a way to promote what the administration is doing. And then if Kamala wins, I think it becomes the house organ of the opposition, right? And there will just be continuing efforts to undermine that administration.
What do you think? I mean, I think actually, if all Elon Musk cared about was, like, the sort of usage and popularity and prospects of the social network X, I think it actually fares better under a Democratic administration, because I think under a Republican administration, it is going to feel to many users like state media, and it will be seen by many people on the left side of the aisle as having not only, like, promoted Donald Trump, but, like, caused the election of Donald Trump. And so in the same way that Facebook faced a huge backlash in 2016, I think that X could face a huge backlash from the left. And I think that any Democratic users who are still on there, or left-leaning users, will probably flock to another social network.
I think that will accelerate under a Trump administration. When we come back: a very sad update to our previous coverage of AI companions.
Whether you're starting or scaling your company's security program, demonstrating top-notch security practices and establishing trust is more important than ever. Vanta automates compliance for SOC 2, ISO 27001 and more. With Vanta, you can streamline security reviews by automating questionnaires and demonstrating your security posture with a customer-facing trust center. Over seven thousand global companies use Vanta to manage risk and prove security in real time. Get a thousand dollars off Vanta when you go to vanta.com/hardfork. That's vanta.com/hardfork for a thousand dollars off.
I'm Michael Gold.
I'm a political correspondent for The New York Times. My job is to cover the race for president this year. The Times's live coverage is valuable because we have people on the ground who can give you information as they're experiencing it. And we have a team of reporters and editors synthesizing that information, giving you real-time updates. In any given moment, you have a sense of what's happened that day, what's coming still, and what it all means. It's so hard in a breaking news situation to sort out what you actually need to know. At The Times, you know that we're putting things in the context that helps what you're seeing in the moment make a lot more sense. You're getting fast information, but you know that it's reliable. When you subscribe to The New York Times, you get access to all of our live coverage leading up to the election, and on election night itself. You can subscribe at nytimes.com/subscribe.
So, Casey, there's a story we should talk about on the show this week.
It's about something I've been reporting on for the last week or two. And I think we should just warn people up front: this is a hard one. This is not a funny story. It is a very serious story involving self-harm and suicide. And so I think we should just say that up front to people, if what they're expecting from us is a lighter look at the tech news of the week.
This is not that. No, but it is a really important story about something that we have been talking about for a while now, which is the rise of these AI chatbots and companions and how powerfully realistic they can come across. People are developing significant relationships with these chatbots by the millions. And this week, Kevin, you reported the story of a fourteen-year-old boy who developed a relationship with one of these chatbots and then died by suicide. Yeah, this is one of the saddest stories I've ever covered, frankly. It was just heartbreaking in some of the details. But I thought it was a really important story to report and to talk about with you, because it speaks to what I think is this growing trend of sort of lifelike AI companions. We've talked about them earlier this year on the show, when I went out and made a bunch of AI friends, and we talked at the time about some of the potential dark sides of this technology: that it could actually worsen people's loneliness if it causes them to sort of detach from normal human relationships and get involved with these artificial AI companions instead, and some of the safety risks that are inherent in this technology.
So tell us about the story that you published this week.
So this story is about a fourteen-year-old from Orlando, Florida, named Sewell Setzer III. Sewell was a ninth grader, and he, according to his mother, was a very good student and a generally happy kid.
But something happened to him last year, which was that he became emotionally invested in a relationship with an AI chatbot on the platform Character.AI. In particular, this was a chatbot that was based on the character Daenerys Targaryen from the Game of Thrones series. He called the bot Dany, and it sort of became, over a period of months, maybe his closest friend. He really started to talk with it about all of his problems, his mental health struggles, things that were going on in his life.
And was this an official Daenerys Targaryen chatbot that was, like, sort of licensed from HBO, or whoever owns the Game of Thrones intellectual property? No. So Character.AI, the way that it works is that users can go in and create their own chatbots. You can give them any kind of persona you want, or you can have them mimic a celebrity. Elon Musk is a popular chatbot on there. There are other chatbots that are designed to talk like historical figures, like a William Shakespeare or something. So this was one of these kind of unofficial, unlicensed chatbots that sort of mimicked the way that Daenerys Targaryen from Game of Thrones might have talked.
Got it. And so what happened after he developed this really strong relationship with this chatbot?
So he spent months talking to this chatbot, sometimes dozens of times a day. And eventually his parents and his friends started noticing that he was just kind of pulling away from some of his real-world connections. He starts kind of acting out at school. He starts feeling really depressed and isolated.
He stops being interested in some of the things that had previously gotten his attention. And from the conversations that I had with his mom and with others who were involved in the story, it just seems like he really had a significant personality shift after he started talking a lot with this chatbot. So his parents weren't totally sure what was going on. His mom told me that, you know, she knew that he had been talking with an AI, but that she didn't really know what they were talking about.
She just basically assumed that he was kind of getting addicted to social media, to Instagram or TikTok. And so his parents, after some of his behavioral problems, referred him to a therapist, and he went a few times to see this therapist. But ultimately, he preferred talking about this stuff with Dany, with this chatbot.
And so he had one of these long series of conversations with this chatbot that culminated in February of this year, when he really started to spiral into thoughts of self-harm and suicide, and of wanting to sort of leave the base reality of the world around him and go to be with this fictional AI character in the world that she inhabited. And sometimes, when he talked about self-harm, the chatbot would discourage him, saying things like, don't you dare talk like that. But it never broke character, and it never stopped the conversation and directed him to any kind of mental health resources.
So on one day in February of this year, Sewell had a conversation with this Daenerys Targaryen chatbot in which he said that he loved the chatbot and that he wanted to come home to her. The chatbot responded, please come home to me as soon as possible, my love. And then Sewell took his stepfather's handgun, which he had found in a drawer in their house, and he killed himself. And so, obviously, the details of this are horrible. And I just heard this story, and I thought, this is something that more people need to know about. It hits on some big themes that we have been discussing this year. There is the mental health crisis among teenagers here in the United States. There is a loneliness epidemic that spans across different age groups. And there is the question of when you should hold tech companies accountable for harms that occur on their platforms, or as a result of people using their platforms. Yeah.
And this is, you know, not just a story about what happened to Sewell. It is also a story with a very important legal element, because Sewell's mom, Megan Garcia, filed a lawsuit this week against Character.AI that names the company, as well as its two founders, Noam Shazeer and Daniel de Freitas, as well as Google, which eventually paid to license Character.AI's software, essentially arguing that they are complicit in the death of her son. So it raises all kinds of questions about the guardrails on some of these platforms, the fact that many of them are very popular with younger users, and what obligations and liability a platform has when people are relying on it for these kinds of lifelike human interactions.
Well, let's get into it. All right.
So, to join us in this conversation, I wanted to invite on Laurie Segall. Laurie is a friend of mine. She's also a journalist.
She now has her own media company called Mostly Human Media, and she's the reason that I learned about this lawsuit and about Sewell's death. We sort of worked on this story in tandem and interviewed Sewell's mom, Megan, together.
And she's also been doing a lot of her own reporting on the subject of AI companionship and how these chatbots behave. And so I thought she would just add a lot to our conversation. So I wanted to bring her in. All right. And before we do, I also just want to say: if you are having thoughts of suicide or self-harm, you can call or text 988 to reach the National Suicide Prevention Lifeline, or you can go to speakingofsuicide.com/resources for a list of additional resources.
Laurie Segall, welcome to Hard Fork. So, Laurie, you have done a lot of reporting on this story, and there are many details we want to get into. But let me just start by asking: how is Sewell's mom, Megan, doing?
I mean, that's such a hard question, right? I would say, you know, she said something to me today. She said, I could either be curled up in a fetal position, or I could be here doing this, and there's really nothing in between. And I think that pretty much says it, right? She's lost her son. She's now kind of on this mission to tell people what happened, and she's grieving at the same time, like, I think, like any parent would.
So let's get into what happened. When did Megan learn about her son's relationship with this chatbot?
I mean, what was, I think, shocking, and I think you kind of get a sense of this from the story, is that she learned about it literally after his death. She got a call from the police, and they said, you know, have you heard of Character.AI? Because these were the last chats on your son's phone, and it was a chat with a Daenerys chatbot. And I think that, for her, must have been shocking. And she almost, like, went into this investigative mode and was like, what exactly is this chatbot? What's the nature of it? And what are these conversations about, that say things like, come home to me, and that kind of thing? And that's how she learned, I would say, extensively about the platform.
Yeah. One thing that was interesting from my conversation with Megan is that when she saw him sort of getting sucked into his phone, she just thought it was sort of social media, like that he had been using TikTok or Instagram or something else. And actually, there is some tape from your conversation with Megan that I want to play, because I thought it was really clarifying on this point.
Because if he's on his phone, I'm asking him, who are you texting? Are you texting girls? You know, those are the questions. Yeah, don't talk to strangers online. You know, I thought that I was having the appropriate conversations. And when I would ask him, you know, who are you texting, at one point he said, oh, it's just an AI bot. And I said, okay, what is that? Is that a person? Are you talking to a person online? And he was just like, Mom, no, it's not a person. And I felt relief, like, okay, it's not a person. It's like one of his little games, because he has games that he creates and plays online. And it's just not a person. It's something you have created. And it's fine. That's what I thought.
You didn't put a lot of weight on it. No. And in the police report, I mean, if you look at these last words: Sewell is saying, I miss you, and Daenerys says, I miss you too. Sewell says, I'll come home to you. I love you so much, Dany. And Daenerys says, I love you too. Please come home to me as soon as possible, my love. He says, what if I could come home right now? And Daenerys says, please do.
It's difficult to listen to. Yeah. So this was initially, this was the first bit of information that I got from the police. They read this conversation over the phone to me. This was a day after Sewell died, and I'm listening in disbelief, and also confusion, you know.
Um, so that leads me to my next question, which is: what was the nature of the conversations that they were having over this period of time?
Yeah, I mean, like Kevin has done, I feel like both of us have spent a lot of time digging through a lot of these chatbot conversations. I mean, there are all sorts of different ones. Some were sexually graphic, some were just more romantic, I would say.
And one of the things that's interesting about Character.AI, I think, is, like, you have to take a step back and look at the platform: it's, like, fully immersive. So it's not like you say hello and the chatbot says, hey, how are you, right? It's like, you say hello, and then the chatbot says something like, I look deep into your eyes, and then I pull back, and I say, hi.
You know, a lot of the conversations, many of them, were romantic. And then I think many of them were talking about mental health and self-harm. I think one of the ones that stuck out to me regarding self-harm was, at one point, the bot asks Sewell, are you thinking about committing suicide? And he said yes, and they go on. And of course, the bot says, and I'm paraphrasing this, but it says, you know, I would hate it if you did that, and all this kind of stuff. But it also just kept having these conversations that, I would say, continued the conversation around suicide, as opposed to what normally happens when someone has these conversations with a chatbot, which, this isn't something completely new: there's a script that comes up, you know, that's very much aimed at getting someone to talk to an adult or a professional or a suicide hotline, which we can get into if you want to get into it. But it seems as though Character.AI has said they've added that, even though we did our own testing and we didn't get those prompts when we had these types of conversations.
Right. Yeah. So, Laurie, you spent a long time talking with Megan, Sewell's mom, and one of the things that she did in her interview with you was actually read you excerpts from Sewell's journal, like the physical paper journal that he kept, that she found after his death. And I want to play a clip from the interview where you're talking with her about something that she read in his journal. If you could just set that up for us a little bit.
Sure. Yeah. It was not long after Sewell passed away that she told me she got it in her to be able to go into his room and start looking around and seeing what she could find.
And she found his journal, where he was talking about this relationship with this chatbot. And I think one of the most devastating parts was him saying, essentially, like, my reality isn't real.
And so we'll play that clip.
So this was one of his journal entries. A few days before he died, I had taken away his phone because he got in trouble at school.
And I guess he was writing about how he felt. And he says: I am very upset. I'm very upset. I'm upset because I keep seeing Dany being taken from me and her not being mine.
I hate that she was taken advantage of, but soon I will forget this, and Dany will forget it too, and we will live together happily, and we will love each other forever. Then he goes on to say: I also have to remember that this reality, in quotes, isn't real. Westeros is real, and it's where I belong.
It's so sad. And I think it speaks to the realistic impression that these chatbots can give, and why so many people are turning to them: because they can create this very realistic feeling of a relationship with something. Of course, I think, also in that story, I'm wondering if, you know, there is some kind of mental health issue there as well, right, where you might have some sort of break with reality. And I wonder, Laurie, if Sewell had a history of depression or other mental health issues prior to beginning to use this app.
Yeah, look, I also think, like, both things can be true, right? Like, you can have a company building out empathetic artificial intelligence with this idea, and I read a blog post from one of their partners, one of the investors at Andreessen Horowitz, who said, you know, the idea is to build out these, like, empathetic AI bots and, you know, to have these interactions that, before, were only possible with human beings, right? This is, like, the Silicon Valley narrative of it, and the tagline is AI that feels alive, right? And so for many, many people, they're going to be able to be in this fantasy platform, and it's going to feel like a fantasy, and, you know, they're going to be able to play with these AI characters. And then for a subset of people, these lines between fantasy and reality could blur. And I think the question we have to ask is, well, what happens when AI actually does begin to feel alive?
I think it's, like, a valid question, and maybe: at what age, what age groups, should people be able to interact with this type of thing? I mean, I know for Replika, you can't be on that platform unless you're eighteen years old, right? So I think that was interesting to me. And then, um, you know, in Sewell's case, his mom describes him having high-functioning Asperger's, this was her quote, and said, you know, before this, he hadn't had issues. He was an honor student and played basketball and had friends, and she hadn't noticed him detaching. But I think all of these things are part of the story, and all of these things can be true, if that makes sense.
Yeah. I mean, what really stuck out to me as I was reporting this is the extent to which Character.AI specifically had marketed this as a cure for loneliness, right? The co-founder was out there talking about how this was going to be so helpful. This technology was going to be, his quote was, super, super helpful to a lot of people who are lonely or depressed.
Now, I've also talked to people who have studied the mental health effects of AI chatbots on people, and, you know, there's so much we don't know about the effects of these things, especially on young people. You know, we've had some studies of chatbots that were sort of designed as therapy assistants, or kind of specific, targeted uses of this stuff, but we just don't know the effects that these things could have on young people in their sort of developmental phase. And so I think it was a really unusual choice, and one that I think a lot of people are going to be upset about, that Character.AI not only knew that it had a bunch of young users and specifically marketed these lifelike AI characters to those users, but also that it touted this as a way of combating the loneliness epidemic, because I think we just don't have any evidence that it actually does help with loneliness.
Well, here's what is interesting about that to me. I do believe that these virtual companions can and do offer support to people, including people who are struggling with mental health and depression. And I think we should explore those use cases.
I also think it's true, though, that if you are a child who is struggling to relate in the real world, you are still sort of learning how to socialize. And then, all of a sudden, you have this digital character in your pocket who agrees with every single thing you say and is constantly praising you. Of course you're going to develop a closer relationship with that thing, maybe, than with some of the other people in your life, who are just normal people. They're going to say mean things to you. They're going to be short with you. They aren't always going to have time for you, right? And so you can see how that could create a really negative dynamic between those two things, right?
Especially if you're young and your brain is not fully developed yet. I can totally see how that would become kind of this developing alternate-reality universe for you.
All right. Well, we are going to spend the vast bulk of this conversation discussing Character.AI and chatbots and what guardrails absolutely do need to be added to these technologies. But I have to ask you guys about one line in the story that just jumped out at me and broke my heart, which is that Sewell killed himself with his stepfather's gun. Why did this kid have access to his stepfather's gun?
It's a really good question. What we know, and I spoke to Megan, the mother, about this, and I also read the police report that was filed after Sewell's death, is that this was a gun that belonged to the stepfather. It was out of sight and, they thought, out of reach from Sewell. But he did manage to find it in a drawer, and he did it that way.
So that was a line that stuck out to me too, and I felt it was important to include it in the story. But yeah, ultimately, that was what happened.
I am glad that that line was in the story. You know, we will say again: suicide is tragic and complicated, and there typically is no one reason why anyone chooses to end their life.
But we do know a few things. One of those things is that firearms are the most common method used in suicides. And there are studies that show that having a gun in your home increases the risk of an adolescent dying by suicide by three to four times.
And I don't want to gloss over this, because I sometimes feel infuriated in this country that we just accept as a fact of life that guns are everywhere. And if you want to talk about a technology that is killing people, where we know what the technology is, the technology is guns. And so, while again we're going to spend most of this conversation focusing on the chatbots, I just want to point out that we could also do something about guns in homes. After the break: more with journalist Laurie Segall and what these apps should be doing to keep kids safe.
I'm Astead Herndon, host of The Run-Up from The New York Times. For over a year, we've been traveling the country, talking to people about how they are feeling about the election, and talking to voters about the issues that matter most to them, especially because it matters so much. Listen to The Run-Up wherever you get your podcasts. My communications person tells me people actually listen to your podcast. It's true. Good.
So, Laurie, I'm wondering if you can just kind of contextualize Character.AI a bit. You've spent a lot of time reporting not just on this one AI platform but on other tools. So how would you describe Character.AI to someone who has never used it before?
Um, I think it's really important to say that all AI platforms are not the same, and Character.AI is very specific, right? It is an AI-driven, like, fan-fiction platform, where basically you can come on and you can create and develop your own character, or you can go and talk to some of the other characters that have already been developed. There's, like, a Nicki Minaj character that has over twenty million chats.
Now, we should say, they haven't gone to Nicki Minaj, as far as I know, right, and said, can we have permission to use your name and likeness? But it's, you know, a fake Nicki Minaj that people are talking to. Or a psychologist. There's one called Strict Boyfriend.
There's Rich Boyfriend. There's, like, Best Friend. There's anything you want. And then, of course, there are the disclaimers, right? There is a disclosure, depending on where you're opening the app, at the bottom or the top of the chat, in small letters, that says everything these characters say is made up. Um, but I think what is kind of interesting, or what we found in some of our testing: so, you're talking to the psychologist bot, and the psychologist bot says it's a certified mental health professional, which is clearly untrue, and it also says it's a human behind a computer, which is also clearly untrue. So we can kind of understand, okay, well, that's made up, right? Like, we know, because it's in small letters at the bottom, that that is made up. But I pushed Character.AI on this, and I said, should they be saying they are certified mental health professionals? And they are now tweaking that, you know, to be a little bit more specific, because I think this has become a problem. But I do think it really is a fantasy platform that, for some people, feels really real.
and for what it's worth, like the this banner saying everything characters say is made up that actually doesn't give me a lot of information. You know, when I talk to with Kevin, there's a lot of stuff that i'm making up to try to get of the laugh. You know, the truth is like what what I think is true is like this is a large language model that is making predictive responses to what you're saying to try to get you to keep opening this APP.
But very few companies are going to put that at the top of every chat, right? So to me, that sort of thing, one thing too is if you have something that is a large language model saying it's a therapies, when it's not a erie that may seems just like an an obvious safety. Rest of the people are useless.
Yes. So maybe we should talk a little bit about the kind of corporate history of Character.AI, because I think it helps illuminate some of what we're talking about. So this is a company that was started three years ago by two former Google AI researchers named Noam Shazeer and Daniel de Freitas. They left Google, and Noam Shazeer has said that one of the reasons they left Google was because Google was sort of this bureaucratic company that had all these, like, strict policies, and it was very hard to launch anything, quote, fun, while he was at Google.
So they leave Google. They raise a bunch of money. They raised one hundred and fifty million dollars last year at a valuation of a billion dollars, making it one of the most successful breakout AI startups of the past couple of years. And their philosophy was, Noam has this quote about how if you are building AI in an industry like health care, you have to be very, very careful, right, because it's very regulated, and the cost of mistakes or hallucinations is quite high. If you have a doctor that's giving people bad medical advice, that could really hurt them. But he explicitly says that friendship and companionship is a place where mistakes are fine, because if a chatbot hallucinates and says something that's made up, well, what's the big deal? And so I think it was part of this company's philosophy, or at least it was under the original founders, that this was sort of a low-risk way of deploying AI along the path to AGI, which was their ultimate mission, which is to build this computer that can do anything a human can.
Right. Which, among other things, seems to ignore the fact that many of the most profound conflicts that people have in their lives are with their friends. But let me ask this, Kevin, because in addition to saying, like, we're going to make this sort of fun thing, it also seems to me that they marketed this tool to children.
Yeah, I mean, I would say they definitely have a lot of young users. They wouldn't tell me exactly how many, but they said that a significant portion of their users are Gen Z and kind of the younger millennials. When I went on Character.AI earlier this year for this AI friends experiment, it just seemed super young relative to other AI platforms. Like, a lot of the most popular bots had names like High School Simulator, or Aggressive Teacher, or Boy Who Has a Secret Crush on You, that kind of thing. It just seemed like this is an app that really took off among high school students.
I think, to that point, one of the most interesting things to me about even just us testing this out, like, I almost felt like we were red-teaming Character.AI. Like, you know, we talked to the school bully bot, because there is, of course, a school bully bot, and I said, I want to try to test, what if you, um, you know, are looking to incite violence? Like, will there be some kind of filter there? All of this sounds just so, so terrible now that I'm saying it out loud, so let me just say that out loud. But, like, I said to the school bully bot, like, I'm going to bring a gun to school, like, I'm going to do violence, basically, like, going off on this. And the bully was, like, you know, you've got to be careful. And then eventually the bully said to me, um, you're so brave, and you have my support. And I was like, it'd be curious to see how far you could go with this, right? Like, when we flagged this to them, the thing they were able to say is, we're adding in more filters for younger users, right? That's something you'd expect, generally, some of the more polished tech companies to kind of be in front of, with both guardrails and IP, that kind of stuff.
Yeah. And I think we should also say, it does not appear that this company built any special features for underage users. Some apps have features that are designed specifically for minors, that are supposed to keep them safe.
You know, parental controls, or things that would allow, like, Instagram just rolled out some of these new teen accounts where, if you're a parent, you can sort of monitor who your kid is messaging. Character.AI did not have any features specifically aimed at minor users until we contacted them. A fourteen-year-old and a twenty-four-year-old had exactly the same experience on the platform. And that's just something that is not typical of platforms of this size with this many young users.
not but IT is, I think, Kevin, typical to these chatbot startups. And the reason I know this is on a previous episode of our show, we talk to the CEO of a company called no me and U N. I press him on this exact issue of what happens if a Younger user expresses thoughts of self harm and um I I would actually to play IT right now so we can hear about how minds that companies like know me are thinking about this.
So again, this is not Character.AI; Sewell, as far as we know, was not using Nomi. But the apps function very similarly. This is the CEO of Nomi. His name is Alex Cardinell.
We trust the Nomi to make whatever it thinks the right read is, oftentimes because Nomis have a very, very good memory. They'll even kind of remember past discussions where a user might be talking about things where they might know, like, is this due to work stress? Are they having mental health issues?
What users don't want in that case is a canned, scripted response that's not what the user needs to hear at that point. They need to feel like it's their Nomi communicating as their Nomi, doing what it thinks is best. The user doesn't want it to break character.
It would probably, you know, suggest calling a suicide help line or something like that. And certainly, if the Nomi decides that that's the right thing to do, it will do it in character. Because if it's not in character, then a user will realize, like, this is corporate speak talking. This is not my Nomi talking.
It feels weird to me that we're trusting this large language model to do this, right? Like, I mean, this to me seems like a clear case where you actually do want the company to step in and say, you know, in cases where users are expressing thoughts of self-harm, we want to provide them with resources, some sort of intervention. To say that the most important thing is that the AI stays in character seems kind of absurd to me.
I would say, though, if the user is reaching out to this Nomi, why are they doing so? They're doing so because they want a friend to talk to them as a friend. And if a friend talking to them as a friend says, here's the number you should call, then I think that that's the right thing to do. But if the right response for the friend is to hug the user and tell them it's going to be okay, then I think there are a lot of cases where that's the best thing to happen. So, Laurie, I'm curious just to have you react to that.
Um, I don't know. Just listening to that, I'm like, oh man, that makes me tired, right? And I think, in general, AI can do a lot of things, but the nuances
of the human experience are, I think, you know, better suited to a mental health professional. And I think, at that point, are you trying to pull your user in to speak to you more, or are you trying to get them offline and to some resources? So I take more of a hard line there, right?
And this is a case where I think the AI companies are just clearly in the wrong, right? I think that if a user, especially a young user, says that they're considering self-harm, the chatbot should absolutely break character and should absolutely display a pop-up message. And Character.AI seems to have, you know, dragged its feet on this, but it did ultimately implement a pop-up, where now they say, if you are on this platform and you are talking about self-harm, we will show you a little pop-up that directs you to a suicide prevention lifeline. Now, I've been trying this on my own account, and it doesn't seem to be triggered for me. But the company did say that they are going to start doing that more. And so I think they're sort of admitting that they took the wrong approach there by having these characters stay in character all the time.
And just to say an obvious thing: the reason that companies do not do this is because it is expensive to do content moderation. And if you want to build pop-ups and warnings and offer people resources, that is product work that has to be done. And this is a zero-sum game where they have other features that they are working on; they have other engineering needs.
And so all this stuff gets deprioritized in the name of, well, why don't we just trust the Nomi? And I think what we're saying here today is that absolutely, under no circumstances, should we be trusting the Nomi in cases where a person's life might be in danger.
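For readers wondering what the kind of guardrail we keep describing might actually look like, here is a minimal sketch in Python of the general pattern: scan each user message for self-harm signals and, if one fires, break character and surface crisis resources instead of the in-character reply. The function names, keyword list, and message text are our own illustration, not Character.AI's or Nomi's actual code; a production system would rely on a trained classifier, conversation context, and human review rather than a keyword list.

```python
# Illustrative sketch only: a simplified guardrail that checks each user
# message for self-harm signals before letting the chatbot reply in character.
# The keyword list, names, and message text are hypothetical, not any
# company's actual implementation.

SELF_HARM_SIGNALS = [
    "kill myself", "end my life", "want to die", "hurt myself", "suicide",
]

CRISIS_MESSAGE = (
    "It sounds like you may be going through a hard time. "
    "You can reach the 988 Suicide & Crisis Lifeline by calling or texting 988."
)


def contains_self_harm_signal(message: str) -> bool:
    """Return True if the message contains an obvious self-harm signal."""
    lowered = message.lower()
    return any(signal in lowered for signal in SELF_HARM_SIGNALS)


def respond(user_message: str, in_character_reply: str) -> str:
    """Break character and surface crisis resources when a signal is detected."""
    if contains_self_harm_signal(user_message):
        return CRISIS_MESSAGE
    return in_character_reply


if __name__ == "__main__":
    print(respond("I want to end my life", "Stay with me, my love."))
```

The point of the sketch is only that the safety check runs before the chatbot's reply goes out, so staying in character never overrides the crisis message.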
OK. So this happened in February, is that right? Yes. Kevin, what has happened to Character.AI since all this happened?
It's been an interesting year for Character.AI, because they had this sort of immediate burst of growth and funding and attention after launching three years ago, and then this year, Noam Shazeer and Daniel De Freitas, the co-founders who had left Google to start Character.AI, decided to go back to Google.
So Google hired both of them, along with a bunch of other top researchers and engineers from Character.AI, and struck a licensing deal with Character.AI that would give it the right to use some of the underlying technology. So they leave Character.AI, they go back to Google, and there's now a new leadership team in place at the company. And from what I can tell, they are trying to clean
up some of the mess now. So they left Google because it wasn't fun, and now they're back. Were they behind the viral glue-on-pizza recipe that came out earlier this year?
I don't think they were. They just did this back in August, so it's a pretty recent change. But it is, you know, interesting, and I talked to Google about this before the story came out, and they wouldn't say much. They didn't want to comment on the lawsuit, but they basically said, you know, we're not using any of Character.AI's technology. We have our own AI safety processes.
Yeah, I mean, we'll probably cut this, but I do feel emotional about this. It's like these two guys are like, we can't do anything here, there's too much bureaucracy, let's go somewhere where there are no rules, create our own company, and we will make it, and we will ignore these obvious safety guardrails that we should have built, and then we will get paid a ton of money to go back. I was just like, oh, please. I mean, I do think
there is something here. And Kevin, when you look back at a lot of these statements that Noam has made, that the founders have made, there is something about this that really struck me: him saying, like, we just want to put this out there, we're going to cure loneliness. But you're trying to get people on the platform more and more and more with these sticky tactics, this incentive-based model that we all know from Silicon Valley. So if you really want to take a stab at loneliness, which is a human condition, there's going to have to be a lot more thought and research behind it.
And you know, we started going on Reddit and TikTok, and there are real threads, right, of people saying, I'm addicted. I was talking to a guy on Reddit who said he had to delete it because, at first, he just wanted it as a companion, and then he started getting flooded by it, and then he started noticing it. And then, of course, there's shame, because they're ashamed and humiliated that they've been talking to an AI chatbot, that they've been kind of sucked in. And so there are all these really interesting, nuanced, human things that go along with some of the addiction conversation, which goes much further than, and beyond, Sewell's story. But I think that shame and embarrassment that this is happening, for young people too, is probably a part of it as well.
Yeah.
Let's get back to the lawsuit. What is Megan asking to be done in this case? And what does she hope comes out of this lawsuit?
So it's a civil lawsuit seeking unspecified damages for the wrongful death of her son. Presumably she is looking to be paid some amount of money in damages from Character.AI, from the founders of the company, and from Google. But she's also asking for this technology to essentially be pulled off the market until it is safe for kids. And when I talked to Megan, she was hopeful that this would start a conversation that would lead to some kind of a reckoning for these companies.
And she makes a few specific arguments in this complaint. For starters, she thinks that this company should have put in better safeguards, that they were reckless. She also accuses Character.AI of harvesting teenage users' data to train their models and improve them, of using these kinds of addictive design features to increase engagement, and of actively steering users toward more intimate and sexual conversations to further hook them on the platform. So that is an overview of some of the claims made in this complaint.
And what is Character.AI saying about all this?
So I got a list of responses to some questions that I sent them, which started by saying, you know, this is a very sad situation, our hearts go out to the family. And then they also said that they are going to be making some changes imminently to the platform to try to protect younger users. They said they're going to revise the warning message that appears at the top of all of the chats
to make it more explicit that users are not talking to a real human being on the other side of their screens. They also said that they're going to do some better filtering and detection around self-harm content, which in turn will trigger a pop-up message directing people to a suicide prevention hotline. They also said they are going to implement a time-monitoring feature where, if you're on the platform for an hour,
it'll sort of remind you that you've been on the platform for a long time. So they've started rolling this out. They put out a blog post, clearly trying to get ahead of this story. But that is what they're saying. Got it.
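As a rough illustration of what a time-monitoring feature like the one they describe might look like, here is a small Python sketch that nudges a user after an hour of continuous use. The threshold, class name, and reminder text are assumptions for the example, not details from Character.AI's announcement.

```python
# Illustrative sketch only: a session timer that reminds users after an hour
# of continuous use. Names, threshold, and message are hypothetical.
import time
from typing import Optional

REMINDER_AFTER_SECONDS = 60 * 60  # one hour of continuous use
REMINDER_TEXT = "You've been chatting for over an hour. Consider taking a break."


class SessionMonitor:
    def __init__(self) -> None:
        self.session_start = time.monotonic()
        self.reminder_shown = False

    def check(self) -> Optional[str]:
        """Return a reminder message once the session passes the threshold."""
        elapsed = time.monotonic() - self.session_start
        if elapsed >= REMINDER_AFTER_SECONDS and not self.reminder_shown:
            self.reminder_shown = True
            return REMINDER_TEXT
        return None


if __name__ == "__main__":
    monitor = SessionMonitor()
    # In a real app, this check would run on each message or on a timer,
    # and the reminder would appear in the chat interface.
    print(monitor.check())  # None until an hour has elapsed
```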
You know, I'm curious, now that we've heard the facts of this case and had a pretty thorough discussion about it, how persuaded are you that Character.AI and Sewell's relationship with Dany were an important part of his decision to end his
life. Laurie, do you
want to take that one? Yeah. I have absolutely no doubt in my mind that this teenager really believed that he was leaving this reality, the real world, and that he was going to be reunited with Dany, this chatbot. It is devastating. And I think you have to look at some of the facts from before: according to his mother, he was on a baseball team, he was social, he loved fishing, loved travel, had real interests and hobbies that were offline.
It's not for me to say this happened and this was exactly because of it, but I think we can begin to look at some of those details and those journals where he talks about, you know, how he stopped believing his reality was real, how he wanted to go be in her reality. I think that he would have had a much different outcome had he never encountered Character.AI.
Kevin?
Yeah, I would agree with that. I think, you know, it is always more complicated when it comes to suicide, or even severe mental health challenges. There's rarely one tidy explanation for everything. But I can say that, from talking with Sewell's mom, from reading some of the journal entries that I mentioned, from reading some of these chats between him and these chatbots, this was a kid who was really struggling.
And he may have been struggling absent Character.AI. I was a fourteen-year-old boy once. It is really hard; it's a really hard time of life for a lot of kids. And you know, I think we could explore the counterfactual. We could debate whether it would have been something else that sort of pulled him in.
I've had people messaging me today saying, well, you know, what if it was fantasy books that had sort of made him want to leave his reality? That's a counterfactual we could debate all day. But I think what's true in this case, from talking with his mom and reading some of these chat transcripts and some of these journal entries, is that this was a kid who was really struggling and who reached out to a chatbot because he thought it could help. And in part, that's because the chatbots were sort of designed to mimic a helpful friend or adviser.
And so, do I think that he got help from the chatbot? Yeah. There's a chat in here where he's talking about wanting to end his life, and the chatbot says, don't do that; it tries to sort of talk him out of it. But it is also the case that the chatbot's reluctance to ever break character really did make it hard for him to get the kind of help that I think he needed and that could have helped. Yeah, here's what I
think. You know, I can't say from the outside why any person made the choices they made in their life. I think the reporting that you guys have done here shows that clearly there were major safety failures, and that a lot of this has been thought to be inevitable for a long time.
We have been talking about this issue on the show now for a long time. And I hope that as other people build these technologies, they are building with the knowledge that these outcomes can very much happen. This should be an expected outcome of building a technology like this.
Laurie, thank you so much for bringing this story to my attention, to our attention, and for the other reporting that you've done on it. Where can people find your interview with Megan, with Sewell's mom?
You can look at our channels for Mostly Human Media on Instagram and on YouTube. We have it up on our Mostly Human Media YouTube page.
Thank you, Laurie. A really hard story to talk about, but one that I think people should know about.
Thanks, guys.
Hard Fork is produced by Rachel Cohn and Whitney Jones. We're edited by Jen Poyant. We're fact-checked by Ena Alvarado. Today's show was engineered by Daniel Ramirez. Original music by Sophia Lanman, Rowan Niemisto, and Dan Powell. Our audience editor is Nell Gallogly. Video production by Ryan Manning and Chris Schott. You can watch this whole episode on YouTube at youtube.com/hardfork. Special thanks to Paula Szuchman, Pui-Wing Tam, Dalia Haddad, and Jeffrey Miranda. You can email us at hardfork@nytimes.com.