Indeed believes that better work begins with better hiring. So working at the forefront of AI technology and machine learning, Indeed continues to innovate with its matching and hiring platform to help employers find the people with the skills they need faster. True to Indeed's mission to make hiring simpler, faster, and more human, these efforts allow hiring managers to spend less time searching and more time doing what they do best, making real human connections with great new potential hires.
Learn more at Indeed.com slash hire. Casey, a little bird told me it's your birthday today. It's my birthday, Kevin. Happy birthday. Thank you so much. And let me just say, 58 looks great on you. Thank you. I've never felt better. Really. You're not 58. Well, in internet years, I think I probably am at least 58, if not older. But no, I feel good. Yeah. I got you a present. What's that? You want to see it? Yeah. Okay.
I wrapped it in everything. Now, I have to warn you, I'm not a good wrapper. Really? I'd like to hear you at least spit a few bars. Hey, I hear what you did there. Thank you. I actually think this is beautifully wrapped. It's a nice sort of, you know, some brown and glitter paper. Oh, wow.
A multi-voice changer? Yeah, so this is because sometimes we get listeners writing in to say that they can't tell our voices apart. So this is to give you some options for how to transform your voice. That's great. Should we listen to a few of them now? Yes, just describe what it is. This is a sort of miniature bullhorn that is purple and plastic, and it is still in its packaging. Yeah, open it up. Let's try it. All right, let me just cut through these zip ties here.
And it has many different modes, and the way you adjust it is by turning those sliders, and then you pull the trigger, and then you talk into it, and it changes your voice. Okay, so let's... Hello? Hello? Hello? Pretty good. It's giving robot. Let's try to do high pitch. So that's just A, it says. Okay.
Yes. I like that voice. Yes. And then they would no longer be confused about our voices. It's very good. I like it. Yeah.
I'm Kevin Roose, a tech columnist at The New York Times. I'm Casey Newton from Platformer. And this is Hard Fork. This week on the show. The Surgeon General wants to issue a warning about social media. Should Congress let him? Then, former Stanford researcher Renee DiResta joins to talk to us about her new book on modern propaganda and whether we're losing the war against disinformation. And finally, The Times' David Yaffe-Bellany stops by to tell us how crypto could reshape the 2024 election.
Well, Kevin, this week we start with a warning. Yes. But not our warning, a Surgeon General's warning, or I guess we should say an attempted Surgeon General's warning. Yeah, let's talk about it. This was really interesting. This was maybe the biggest tech news of the week, and it came out earlier this week from Surgeon General Vivek Murthy. That's right, and it is a story that we have been following here on Hard Fork, I would say essentially since the beginning, right? Because last May,
The Surgeon General issued an advisory about the risks of social media for young people. In that advisory, he wrote both that there were potential risks of harm for excessive use of social media, particularly for some groups of young people, and also that there could be benefits for other groups of people. We talked about that here.
Yeah. Then more recently, we had Jonathan Haidt on the show in March. He wrote this book, The Anxious Generation, which went on to become a bestseller, kind of covers this similar idea that social media can be really risky. We talked to young people. They called into the show. We talked to them about how they felt about it.
And since then, this has just been, frankly, one of the biggest debates of the year, wouldn't you say? Totally. I mean, I have been talking with parents, you know, for months now. I would say that the debate sparked by Jonathan Haidt's book has become a true social phenomenon. I've seen this book, you know, on desks and shelves everywhere. I've heard from just, you know, many, many people about
this. And we got so much feedback on the episodes that we did, not just with Jonathan Haidt, but with the listeners who wrote in and who we talked to about this issue. So I would say this is
one of the biggest debates about technology right now: the effects of social media on teen and adolescent mental health. Absolutely. And, you know, while there are a lot of folks who wrote in who are very sympathetic to the ideas expressed by both the Surgeon General and Jonathan Haidt, there's also been some pushback. Candice Odgers, who's a professor at UC Irvine, wrote in the journal Nature, quote, hundreds of researchers, myself included,
have searched for the kind of large effects suggested by Haidt. Our efforts have produced a mix of no, small, and mixed associations. Most data are correlative. So in other words, efforts to prove once and for all, to find the smoking gun that says, hey, you look at Instagram too long, it's going to make you depressed. She's saying we have not been able to find a very large effect like that. Right, and the tech platforms themselves have been pushing back on this idea for years, right, that they are sort of causing mental health problems
among young people. But I would say this has become like a kind of kitchen table debate in America and around the world. It has also spawned a bunch of legislation and attempts to actually try to reform social media through new regulations and laws.
That's right. So more than half of the states in the U.S. are moving forward with some form of legislation aimed at protecting children who use the Internet. Laws passed in Utah and California have already faced legal challenges because, of course, it's very hard to regulate social media in a way that doesn't infringe on the First Amendment. I believe New York just passed a bill this month that restricts social media companies from using algorithms in kids' social media feeds without parental consent. So we'll see how that one plays out.
My guess is that'll be subject to a big legal challenge as well. So, Kevin, as you say, this is maybe the big kitchen table debate about tech so far this year. But, Kevin, what if I were to tell you that all of that was just a prelude to what happened this very week?
I would believe you. So the Surgeon General wrote an op-ed in your newspaper, so congrats on the big scoop, where he says that cigarette-style warning labels should be added to social media platforms. In the opening paragraphs of his op-ed, he said, we don't have perfect information, but the mental health crisis among young people is an emergency. Kevin,
What did you make of this op-ed? Oh, it was really interesting in part because I think, you know, a lot of people know that the Surgeon General puts warning labels on cigarette packages. And we have seen those for decades now. And there's actually some evidence that those warning labels can increase awareness of the risks of tobacco and that they can change behavior among the people who see them.
And so what the Surgeon General essentially called for in this opinion essay is applying the same kind of approach to social media, where, you know, if you're a teenager and you log onto a social media platform, maybe there's a little banner, maybe there's a little warning label, and it says something like the use of social media by adolescents has been linked to mental health harms.
And this is something that a lot of parents and teachers have been calling for. But it's one thing to have sort of a citizens movement around this stuff. It is another thing to have the Surgeon General of the United States say that social media platforms should carry warning labels. Well, that is certainly what he is counting on, right? That he can draw on the authority that Surgeons General built decades ago by pointing out that smoking caused cancer, and use that credibility to say now, essentially, hey, you look at Instagram or Snapchat too long, you're going to have problems.
But I have to say, Kevin, I was not impressed with this statement. All right, walk me through why you were not impressed. Well, I want to take issue with something you just said, which is that these warnings have been associated with a change in behavior. Well, I think that's true in a broad sense. I think it's important to remember all the other things that were happening that contributed to people smoking less.
Because just a few years after they started putting out those warnings, Congress banned cigarette advertising on TV and radio. And then we began to see the banning of smoking in public places. Right. So the warning, yes, was part of a package of things that appears to have had a very positive effect. But the idea that a warning in and of itself really did much, I'm actually not convinced at all.
Yeah, I mean, I also think it's a more nuanced argument that the Surgeon General is making. He actually writes, to be clear, a warning label would not on its own make social media safe for young people. Like he is not calling for this to be the only thing that the federal government does to deal with the issue of young people's mental health issues.
and social media. He also is supporting still this legislation in Congress. He wants companies to be required to share data about the health effects of their apps with independent researchers and allow independent safety audits. He also calls for these sort of phone-free zones that parents and schools can set up. But I think the sort of narrow question of this warning label is,
I just don't see what it harms. Do you actually see, you know, people being hurt as a result of, you know, if you were a teenager and you had to click past a little warning label when you spent too much time on Instagram, do you actually think that would hurt your life? No, but what if I'm a 14-year-old LGBT kid and I have parents who aren't supportive, and I say, can I create an Instagram account? And my parents say, no, you can't, it's not safe for you. And it's like, OK, well, I'll just go, you know,
feel very alone for the next couple of years. Like that doesn't seem great to me. I just think that this warning is going to be used as a pretext to keep kids off social media who might get some benefit from it. And look, it's not that I'm saying that there's no risk to some groups of teens, but I think everything is just sort of getting flattened into this very kind of
ham-fisted warning when we need more targeted solutions, like the Surgeon General was proposing last year. Yeah, well, we should also just say the Surgeon General cannot unilaterally just start slapping warning labels on products and social media platforms. This actually would require...
an act of Congress to put a warning label on Instagram or TikTok or any of these platforms. I have to say, I was a little surprised by that. Were you? Yeah, I kind of was too, because I kind of thought, like, what's the point of being the Surgeon General if you can't snap your fingers and put warning labels on stuff? Congress has to be like, okay, you can warn people. What do we even
need the Surgeon General for, was my question. But, you know, it is a position that has a lot of symbolic importance. This is sort of the top doctor in the nation. And I think it matters if the Surgeon General says, you know, this thing that your teens are using may be harmful to them. Well, it does matter. But I have to say,
I was really disappointed by this statement because as I'm reading through both the op-ed and an accompanying interview that he did with reporters at the Times, he does not bring any new science to the table, right? So a year ago, he brings forth what I thought was this very measured look at teens and social media. And then a year later, he's in the Times saying that he believes that the risk of using social media is so great to adolescents that...
the benefits of potentially using it do not outweigh the potential harms. That's an incredibly bold statement to be making without having substantial
subsequent evidence to support it, right? When the Surgeon General came out and said smoking causes cancer, there was really, really good science about that. This, I think, is a much less settled question. And so I think to skip all the way to, well, now we need to slap a warning on every website where, like, teens can post, I thought it was actually super dramatic.
Yeah, I mean, it's definitely dramatic, but I have to say this idea of a Surgeon General's warning on social media platforms does not bother me, I think, as much as it bothers you. In fact, I think it could be kind of a good idea for a few reasons, one of which is it's not actually banning
teens from using social media, right? It is not something like we've seen even in some of the legislation that's been proposed that would require platforms to do age gating or verification or anything like that. It's just a warning label.
And I think we know a few things actually pretty well. We know that there are many studies that have found a correlation between heightened social media use and increased anxiety and depression among teens. We also know that the mental health effects of social media tend to be worse for girls because of negative social comparison.
And the third thing we know is that some of the platforms knew that this was true years ago, before the general public did, and did not do much, if anything, to stop it. So I think those facts alone. And, you know, people will argue about whether this study or that study is sound methodologically or not. But I think
We know enough to be able to start making some public health recommendations about social media for adolescents. Do you disagree? Well, sort of, because I don't actually think that this is making a recommendation. It is just trying to spook people and saying, ah, social media is scary. Be careful, teens. It's not giving teens the tools that they need to understand if this is going to be a safe environment. It's not giving teens tools to say, hey, I'm having a miserable experience on Instagram or Snapchat. What do I do?
And that's why when I read this, Kevin, I just felt like this actually feels pathetic to me. Like this is a story about the Surgeon General coming forward a year ago saying we need to take this seriously. We need a package of legislation that attempts to address some of these harms, which I agree with you are real harms. And then nothing happens because we live in a country where we do not pass laws to protect people. And so a year after that,
He's like, put a warning on it. And, you know, maybe that'll sort of galvanize people's attention. And so it's not that I think that there is no substance to what he's saying. What I thought was: this is a broken country, where we cannot take these harms seriously. We cannot sort of work on the more nuanced version of this that would try to bring real help to people. And instead, we slap a label on it and call it a day. Yeah.
I hear what you're saying, but I also think, like, social media has been around for two decades at this point. It is quite established that these products have effects on the mental health of adolescents. And in my opinion, the failure of those platforms to pay attention to their own research teams and their own investigations, and to make changes to their products to protect kids,
has created this feeling that like someone somewhere needs to step in to do something about that. And like, is the Surgeon General the right person to be taking this on? I don't know. I think you could have reasonable disagreements about that. But if Congress isn't going to act, if the platforms aren't going to act, and
And if schools and parents are not going to act in any coordinated way, I think we do have to start thinking about more creative solutions. Sure. I mean, you know, I'm just struck, though, Kevin, that the one thing everyone does agree on is that we are living through this teen mental health crisis. Right. It gets talked about a lot. You know, there is truly no argument that this is real. And yet, like,
what has been the societal response to it? Like, what is our government doing any differently than it was two or three years ago to try to help people with any of these issues? It feels like there is no investment in mental health care or other infrastructure that would just sort of help these folks.
And maybe that's where you hear my frustration coming out here, is that I do want to have a real response to that. Like, I do want to help teens. And I'm just so nervous that if this warning gets slapped on a website that people like the Surgeon General and President Biden just sort of wash their hands of it and say, well, what do you
want us to do? We put some text on a website, right? And I just feel like the moment we've gotten to, where this feels like the next obvious thing to do in the teen mental health crisis, it just feels absurd to me. Yeah. I do think, you know how in Europe, some of the warning labels on cigarettes have images on them, like a photo of a lung after it's been decimated by years of tobacco use?
And I think we should do the same thing with social media. We should just put up an image of someone whose brain has been totally rotted by spending too much time on social media, and the kinds of crazy stuff that they're posting on their feed, and say, this is what will happen to you if you spend six hours a day on Instagram. Yeah. Except, you know, it would just show that person getting gradually hotter over time as they started eating right, started working out, started paying obsessive attention to their body. So that's what the warning would be. Well, I don't think that's necessarily true, but I,
I do. So look, I think we could have a productive conversation about what, if anything, a warning label should say. I also think we should talk about where it would appear because we know that not all social networks are created equal. Adolescents are not having mental health problems from spending too much time on LinkedIn, right? This is a problem that is very specific to a certain subset of social networks. I would say Instagram, maybe Snapchat should be in there. TikTok maybe should be in there too. These are the ones where there really is this kind of
visual comparison going on of what your body looks like, what your face looks like. These are the kinds of places that can be really unhealthy for teens to spend lots of time. You know, another thing I've been thinking about as I've been reading the Surgeon General is, could he offer a more targeted warning about a more obviously real problem?
There's this story that I've been wanting to talk about on our show, and we've just not been able to find a way to do it because it is just so, so upsetting. We try not to bum people out too much. But there's this issue, and there have been a lot of great investigations into it over the past year, about these scammers who target mostly teenage boys, and they pretend to be a pretty girl,
and they get the boy to send them nude images. And the second that the nude image gets sent, the scammers blackmail them and they say, "Send us hundreds of dollars right now or we're gonna tell all of your friends and family what you just did. We're gonna send your nudes to everyone that you follow on Instagram." And these boys, who have never had one thought that anything like this could possibly happen, panic, and we've now seen a really horrifying number of kids who have killed themselves over this issue, right?
This is just a simple scam that is being run that has this outsized and horrible psychological effect. This is what I want the Surgeon General to be warning people about, right? I want the Surgeon General to be out there saying, these are the kinds of scams that could potentially cause an acute mental health crisis in your child. Do something about this. If the Surgeon General wants to...
make Snapchat put a warning on Snapchat about this scam, that I'm totally cool with, right? What I want to get away from is this very mealy-mouthed, some kinds of use can be associated with some kinds of harm, right? Let's tell people about the real risks on these platforms and start doing something meaningful to address them. Yeah. I mean, as you're talking, I'm wondering whether we should have some kind of
like a test that you have to take before you're allowed to sign up for a social media account as a teenager. Like the way that we make kids take driving tests before they get their driver's license. Like maybe you should have to like
you know, sit through a little tutorial or a little, you know, seminar about the various kinds of exploitation and harm that can happen to you on social media. Maybe that's one way to do that. I mean, good luck getting that through Congress, but I think that's the kind of intervention that could actually help teens navigate this, because my problem with a Surgeon General's warning label is actually that it doesn't go far enough, that it doesn't actually tell
teens and their parents what to do. If you do want to be on social media, how do you engage safely and without harming your mental health? All it says is, this thing might be dangerous to you. I mean, on this point, I totally agree with you. If, you know, if I had kids and they were going to school and I found out that their school was offering a social media literacy class and they took it, and it was the same kind of thing as driver's ed where you got, like, half credit or whatever.
It sounds like a non-solution when you say, well, what we really need is education and literacy. When people say that to me, I sort of feel like they're throwing their hands up, and it's like, okay, but what's actually going to solve the problem?
But this whole story is about media literacy. It's about understanding how systems are designed, how they are likely to make you feel, what strategies you can use if you find yourself in a spot of harm. What are some likely scams or dangers that you might find on these systems? Like...
it would be amazing if the platforms actually offered that kind of literacy, right? And maybe that is an area where I'm like, yeah, Congress actually go ahead, mandate that they do something like this for these teens. But if they're not going to do it, school districts could do it. Parents groups could do it. Nonprofit groups could do it. But I agree with you. That is what I would like to see that I think actually starts to make a dent in this problem. Yeah. But in the meantime, I don't think this idea of a Surgeon General's warning is necessarily a bad idea in the same way that I think, you know, putting warnings on cigarettes didn't
immediately curb smoking overnight. It wasn't like people stopped smoking because they knew all of a sudden it was bad for them. But it is kind of a little visual reminder if you are going to the store to pick up a package of cigarettes. It just sort of makes you pause, or at least think for one second, before you, you know, hand over your money and get your Marlboros. Like it does actually have a psychological effect. And I actually don't mind the idea that teens, before they spend, you know, four hours on Instagram, would get a little visual pop-up or something to just say, are you sure you want to do this? This could be bad for you.
Yeah, I mean, when you put it that way, it doesn't sound like that big a deal. Again, I'm just like, what are the odds that we apply this warning and it has any meaningful improvement in the lives of these teens? I just truly struggle to see the causal connection. I mean, I think the effect that it could have is on actually parents. I know so many parents who are struggling with what to do with their kids when they reach adolescence about social media. Do I give them a smartphone? Do I let them have an Instagram account?
And a lot of parents just feel very powerless in those situations because all their kids' friends are on social media. There's this sense that by sheltering them away, you are actually limiting their ability to be social with their friends. A lot of parents don't feel like they have a lot of backup when it comes to limiting or controlling the ways that their teens use social media. And I actually do think that having the Surgeon General of the United States put a warning on these social media sites that say this could be bad for your teen's mental health
I think that could embolden parents to say, look, it's not just me saying that this stuff is bad for you. The Surgeon General says it's bad for you, too. And it could help them feel a little more confident in actually setting some policies for their own kids. I can't believe you disagree with me like this on my birthday, by the way. What did I do to you? Jesus. Jesus.
When we come back, we'll talk to Renee DiResta about her new book on disinformation and how to win the war against it.
Indeed believes that better work begins with better hiring. So working at the forefront of AI technology and machine learning, Indeed continues to innovate with its matching and hiring platform to help employers find the people with the skills they need faster. True to Indeed's mission to make hiring simpler, faster, and more human, these efforts allow hiring managers to spend less time searching and more time doing what they do best, making real human connections with great new potential hires. Learn more at indeed.com slash hire.
I'm Julian Barnes. I'm an intelligence reporter at The New York Times. I try to find out what the U.S. government is keeping secret. Governments keep secrets for all kinds of reasons. They might be embarrassed by the information. They might think the public can't understand it. But we at The New York Times think that democracy works best when the public is informed.
It takes a lot of time to find people willing to talk about those secrets. Many people with information have a certain agenda or have a certain angle, and that's why it requires talking to a lot of people to make sure that we're not misled and that we give a complete story to our readers. If The New York Times was not reporting these stories, some of them might never come to light. If you want to support this kind of work, you can do that by subscribing to The New York Times.
Well, Kevin, I hate to brag, but it is my birthday. Last week at Platformer, we broke some news. Yeah, what was the news? So the Stanford Internet Observatory, which is this small but I think very influential group that studied the way groups use online tools to spread disinformation, is basically being dismantled and will no longer exist as we know it. And why is this a big deal?
So this group was the most savvy and well-connected among the tech platforms, and they had really good relationships with companies like Facebook or Twitter when that existed. And so as elections would take place,
The SIO, as they called it, would be in close communication with the platforms to understand what narratives are going viral, some true, some false, and then be able to report that back so that people like you and me who are trying to understand, hey, what's happening in this election? Is there any foreign interference taking place? Are there some narratives that are gaining a lot of traction? We would be able to understand that in real time, write about it, and sort of help people understand events as they were unfolding.
Yeah, and I would say to take a step back, like the Stanford Internet Observatory was one of these sort of groups that sprung up in the wake of the 2016 election when all this viral, you know, disinformation went around on social platforms. There was this Russian interference campaign.
These social networks all started to take this kind of thing much more seriously, and there sprang up kind of these research communities who were filled with academics who wanted to look at the way that information, both true and false information, travels online, about influence campaigns, things like that. And I would say the Stanford Internet Observatory was the most prominent
of the kind of academic groups that sprung up to study this issue. Yeah, I think it was just a really natural response to the fact that we had Russian interference in the 2016 election, and it drew a lot of folks' attention to the fact that these networks could be exploited, that they could be really useful for spreading propaganda. And then after that, a bunch of academics came forward and said, hey, why don't we take a look at that? So all that seems like it should be kind of relatively uncontroversial, right? Academics studying how information travels on the internet.
But the Stanford Internet Observatory did attract quite a lot of criticism and attention, especially from partisans on the right. So what happened? Well, so in 2022, the Republicans retake the House and the Republican Jim Jordan starts what he calls the Select Subcommittee on the Weaponization of the Federal Government.
which is nominally designed to take a look at improper contacts between the federal government and platforms. And the fear there is that maybe the government was pressuring platforms to take down speech, to censor speech, and to violate First Amendment rights.
And SIO gets dragged into the middle of this because while there's not a lot of evidence of the government directly pressuring platforms to remove speech, there is plenty of evidence out there that SIO was talking to platforms about, hey, look at all of these tweets that might violate your policies. You might want to take a look at that.
And so I want to be clear that there isn't really a super coherent narrative that emerges out of this. It's just sort of this miasma of partial facts, half-truths, innuendos, right? But it all gets spun up into this government committee that has subpoena power. And so Jim Jordan starts going after some of these academics, dragging them before these hearings and saying, hey, what were you doing with respect to the election?
Right. And this is part of this kind of larger conservative backlash to what they see as online censorship by tech platforms, which they believe are trying to suppress conservative speech. And the allegation is that they have hidden this effort behind these kinds of academic
veneers, that you have groups like the Stanford Internet Observatory, which, you know, they believe are functioning as kind of arms of the federal government to work hand in hand with the platforms to sort of remove unpopular or conservative speech. Yes, that is the charge. There is no evidence that any of that is true, you know, and I just want to say, like, why was SIO doing this work? Well, they were worried that some of these narratives might take hold and turn really violent, which is, of course,
exactly what happened after the 2020 election, right? We have January 6th. We have the Capitol being stormed. And instead of focusing on that, some people in the government want to focus on, well, okay, but which accounts were removed? After I published my piece, Stanford responded, disputing the idea that SIO is being dismantled and said they were going to continue a bunch of this work under new leadership. But this is a smaller team and several of the core people who were doing the original work are now gone. So...
That brings us to our guest today, Renee DiResta. She's one of the people who worked at SIO. She was actually the research manager of the Stanford Internet Observatory. And you and I have both known Renee for years. She's a person who lots of journalists and media people have relied on for research and data and insight about what's happening on social networks.
And in a strange way, she has almost become a central character in the story of online disinformation because she has been targeted by Jim Jordan and his committee, but also many of these sort of right wing influencers who have decided that she is a sort of shadowy figure in the effort to suppress conservative speech.
That's right. You know, she did an internship at the CIA, like when she was younger. And so now they call her CIA Renee online and are constantly hinting that she still works there, even though that's not true. I gotta say, that's kind of a cool nickname. It is. I wish I had a nickname that cool. I'll tell you that much. But look...
You know, it's really a sort of crazy twist of fate because as all of this was unfolding, Renee was working on this book called Invisible Rulers where she sought to chart these changes in the information ecosystem. How does propaganda work these days? How do what she called these cinematic universes get spun up out of almost nothing and then take over entire political subcultures? And so we wanted to talk to her about that.
Because this is somebody who not only has an uncommonly good understanding of the media world that we live in today, but has had to live through some of the worst parts of it. Yep. Yeah. All right. Let's bring her in. Renee DiResta, welcome to Hard Fork. Thank you for having me. So I want to get to everything that happened at the Stanford Internet Observatory. But first, I want to ask, what was the original spark for the book that you just published?
The original spark for the book, I just wanted to write about how propaganda and influence had changed. It was really, honestly, I started the project before all the subpoenas and lawsuits and all the BS. And I really just wanted to write a book kind of like Manufacturing Consent did in the 1980s, right? Where he's writing about here's this incentive structure and here's the outputs that come out as a result of it. And I thought we haven't really had an update to that in 40 years or so. So maybe I'll write one. And then, of course,
A couple months later, the conspiracy theories about us started, then the subpoenas came down, and then I was like, well, I didn't want to write a memoir, but I guess I am. But now you are, yeah. I want to talk a bit more about this idea that something has changed about propaganda over the past couple of decades. What had you noticed in your work that made your ears perk up and say, there's something interesting here to dig into? Yeah.
I spent a lot of time looking at the anti-vaccine movement as a new mom in 2013 to 2015 timeframe, right? But I didn't actually think of that as propaganda at the time. That was not, you know, I thought of it as activism, right? We are fighting. We are pro-vaccine people. We are fighting those anti-vaccine people. We have a law we want to pass in California. We have a campaign and we're going to fight it on the internet. And the thing that's interesting about it is whenever you have a political campaign, there's like a start date and an end date, right? Yeah.
But they did not see it as having a start date and an end date. For them, this was their thing. And they were on it 24-7. And they had built an entire community. There were thousands of anti-vaccine accounts. This is years prior to COVID. Just to clarify, we're talking about measles vaccines here. And I thought it was interesting that I thought of this as activism, something we turn on and off.
But they thought of it as something that was persistent. They were going to message forever because they were really true believers in this idea that vaccines caused autism. So that was kind of my first experience. And then I felt like I had sort of seen the future, right? I was like, this is how every campaign is going to be fought. I can run ads on Facebook. I can granularly target down to the zip code level. And nobody knows who the hell I am. And I don't have to tell them. And this is absolutely insane, actually. Like, this is wild. Yeah.
And then around the same time, ISIS was doing its thing. And that I saw as propaganda, right? This is something with, you know, they have an iconography. They have a flag. They are out there. And at the time, they were recruiting heavily on social networks, right? Yes, they were. Basically, they were like ISIS influencers. I didn't have that vocabulary quite then. This was, again, 2015, right?
But it was, you know, women posting selfies with their fighter husbands and their guns or their kittens, you know, and it was this absolutely surreal thing. And I thought, OK, this is propaganda. They're an explicitly political organization. There's nothing remotely surreptitious about this. They're building a brand, right?
So I wound up doing some work on the question of like, how do you respond to that? And it turned out the government had no clue how to respond to that, which was, you know, why I was like, I'm in this room, where are the adults? So that happened.
The realization that anybody could do this, and that everybody was going to, was, I think, what made it a motivating factor for me to write about it. Yeah. You were very early to understanding that social networks provided this really powerful new vector for anyone who wanted to grow a movement, to do activism, to spread propaganda. I first encountered your work during those years. And then you get to 2019, after you've been doing all this work, and the Stanford Internet Observatory comes about. So what did you do there?
So SIO started in 2019. I never actually worked on right-wing content or hate speech or anything. In the entirety of my 10 years doing this, I've never published on that topic. It was just not one that I spent my time on. But we did this project called the Election Integrity Partnership in 2020. And the EIP was an inter-institutional collaboration. There was us, Kate Starbird's team at the University of Washington Center for an Informed Public, Graphika, and the Atlantic Council's Digital Forensic Research Lab. Most of us had pretty deep expertise looking at state actor campaigns. And just so I'm giving the listeners enough background here, the Election Integrity Partnership was sort of an ad hoc group of organizations that were looking specifically at attempts to undermine
the election on social media. Absolutely. So I think sometimes you see right-wing media report that, you know, it was somehow related to Hunter Biden's laptop. That's BS. Or that it had something to do with, you know, political speech in the context of candidate A saying something about candidate B. But you were doing things like, you know, monitoring. There was this thing, Sharpie Gate, which is sort of a conspiracy theory about people, you know,
basically manipulating ballots. Yes. So the scope was exclusively limited to things related to policies and procedures having to do with voting. So it was that kind of stuff, you know, Sharpie Gate making an allegation that a felt-tip pen would invalidate a ballot.
And specifically, the other piece of it was the delegitimization narratives. So in the context of Sharpie Gate again, that those felt-tip pens were only given to Trump supporters to steal the election, right? So we were only looking at narratives related to the election. We did not care about Hunter Biden's laptop. But, you know...
What winds up happening is that that work did occasionally involve speaking with state and local election officials. And then you see Fox News call Arizona for Biden. And all of a sudden, the Sharpie story goes from being just some people concerned about Sharpie markers to that's how they stole Arizona. Right.
And so state and local election officials, meanwhile, throughout the entirety of Election Day are trying to ensure that people have confidence in the election and that if there is something irregular or weird that is happening, that they know about it.
But election officials are not supposed to be sitting on the Internet all day long. That is not their actual job. It is, in fact, our job. And so we were like, well, OK, we can communicate with state and local election officials. And what this meant was that they would occasionally send tips, basically, hey, can you look at this tweet? Can you look at this post? And oftentimes we looked at it, but it was some random guy with two followers who was like wrong on the Internet.
But then sometimes there were these things that got a whole lot of attention and a whole lot of pickup. And in certain cases, we would also tag tech platforms and just say, hey, you should have a look at this. Now, this was reframed as government officials were using us, giving us things that they wanted taken down, and that we were then telling platforms to take them down. That this was like this vast operation. And they turned it from being state and local election officials to being like DHS itself.
Because DHS is responsible kind of at a federal level for elections. And so you have this conspiracy theory that we are somehow being used. And I mean the numbers that these sort of like right-wing blogs start to write are like 22 million tweets, entire narratives nuked from the internet with an AI censorship super weapon. I'm not kidding. That was the actual phrasing, right?
And, you know, this is the sort of thing where in a normal polity, this would be seen as tinfoil hat BS. But in this reality, Jim Jordan is like, yes, we need to investigate this. We need to investigate the AI censorship super weapons run out of Stanford University.
So let's just finish the narrative arc here, because we have this sort of pushback from the right to these groups and platforms that are engaging in what they believe is politically motivated censorship. Jim Jordan starts sending out all these not only letters, but subpoenas. He gets a bunch of emails from platforms communicating with governments and academic institutions. And
Then something happens with Stanford itself, which is spending tons of money. I think you've said, you know, millions of dollars. Yeah. To defend, you know, against these claims and to respond to these subpoenas. And, you know, maybe at first that feels like they're sort of on your side. They're sticking up for the research team. But at some point, that seems like it started to change.
And you recently learned that your position at the Stanford Internet Observatory was being discontinued, that there was not going to be more funding made available for you to continue working there. And I'm just curious what your emotional reaction was when you heard that. So several of us were given that news at the same time that there was no funding. And, you know, I think the reaction was disappointment, obviously. Well, my reaction was disappointment. I can't speak for them.
We have a really close-knit team, and we do amazing work together. And again, I think for us, the immediate reaction was,
Are there ways to get funding for particular project lines? The child safety work, for instance. Just to make it clear, there are certain tools that we have to build to be able to do that work. It's not something you can just do anywhere, because it is illegal to view that kind of content, in addition to it being seriously damaging. And so...
A lot of what we've done is design ways to do certain types of very delicate trust and safety research that, you know, enable the team to put out amazing work while not bumping into some of the terrible sides of it. I also felt like, because we have that breadth, all of us work on all of these different project areas, right? Like, I work on trust and safety. I work on our generative AI stuff. I work on our election integrity, or information integrity, work.
We built an entire center on the idea that all of these challenges are interrelated because they happen on the same system. Like, there's structural things here. How can we have that pipeline from quantitative empirical research to policy recommendations that also takes into account the way that all of these things are related? There are very, very few institutions that have that type of structural and
analytical capacity and that have that vision of the internet as complex system. And so while there are many, many excellent institutions that do deep work on, you know, topic A or topic B or that write exceptional policy briefs, what we really wanted to build at SIO, what we did build at SIO over five years was this ability to study a very complex system at a holistic level and
you know, make material impacts across a fairly broad array of topics. So I want that to exist. And I want to be doing it too. Also, I would say, you know, even if you're somebody who says, wow, you know, there's this censorship industrial complex and, you know, these academics have gotten out of control, I just want to remind us that what we're talking about is, should universities be able to study the way that narratives spread online?
Should they have a sense of which narratives are gaining popularity? Which accounts are responsible for spreading them, right? How do these networks of ideas work? This is a sort of objective study of reality, and it is somehow being painted as this –
malicious effort to censor speech, right? So I just want to say that, like, you can have different opinions about what should we do about tweets we don't like, and should we be able to study the way that information spreads online? What I actually found most disturbing, and the thing that I think I'm going to wind up writing about, is that there's a couple of tweets that go out from, like, House Judiciary GOP and Jim Jordan saying explicitly, victory, right? And that's a thing where I don't know
what it takes to jolt academia out of its complacency, to make them realize that this was the objective, right? That the objective was to silence the work of a First Amendment-protected research project and team. And my frustration there is that when you have a sitting government official with subpoena power gloating about killing students'
First Amendment protected work and then saying freedom of speech wins. I mean, that is, you know, I feel like Orwellian is the most overused word on Twitter, but like, man, is that really... I want to ask about another part of the criticism of some of the work that you do that I actually think is sort of interesting. And it's not about Stanford or academia in particular, but it's about...
Actually, the role that government plays in this whole universe of online platform manipulation and disinformation. There's this word jawboning that gets used a lot in these debates. And for people who aren't familiar, jawboning is basically when the government applies pressure to private companies to get them to do something it wants.
Even if they're not legally required to. Right. So, you know, it could be as simple as someone, you know, from the White House, you know, sending an email to the trust and safety team at Facebook or at another social network and saying, hey, you know, we've got these 50 accounts that we believe are spreading misinformation. Maybe you should take a look at them and maybe you want to sort of like, you know, apply, you know, a label to them or maybe even take them down.
And we know, in part, thanks to some of these subpoenaed emails, that this kind of thing actually did happen. There were people in the Biden White House emailing platforms, talking with them about trying to get certain content taken down, and that sometimes the platforms pushed back and refused to do that, but sometimes they went along with it. So,
Do you think this issue of jawboning is real? And do you think the government has overstepped when it comes to trying to enforce social media policy on these platforms? I think it's, I mean, I think it's a really interesting question, right? Jawboning is bad, right? We should be able to hold that idea in our head and say, it is bad. It is not a thing that we, as a democracy, should want our government to do. There's a couple
of nuances there, meaning that the government also has speech rights. The government also has particular incentives, for example, during a pandemic, to communicate with platforms and say, here is why we are trying to prevail upon you to do this thing. I think that that's best done perhaps a little bit more publicly. Interestingly, though, when you do see Biden say something in a public press conference, like, you know, what did he say? You're killing people. Was that the sentence?
That's also sort of viewed as like, whoa. This was something that he said about Facebook during the pandemic, basically accusing them of killing people by not removing more misinformation about vaccines and things like that. Right. So there's a whole spectrum of government communications, public and private. One of the things that we see is
governments, not the United States, but other governments, making explicit content takedown requests, explicitly to throttle their political opposition. You see the Modi government requesting that Sikh politicians in Canada have their content throttled so that it can't be seen in India. That is, I would argue, rather transparently censorship in the actual sense of the word. So this is a worthwhile thing to be looking at. I think that
Google, in particular, will put up these transparency reports where it says the government requested action on, and then it will sort of like list content that governments request action on. I think that's a very reasonable thing for tech platforms to do, which is to say when these requests or asks come in, we're going to make them public, right? And that provides then, I think, kind of a check on government because if they don't want that request being made public, then maybe they won't make it, right? Or if they feel like it's a...
It's a very, very important thing and a thing that they want to request. They can either do it publicly themselves or make it public after the fact. I think we need government and platforms to have open channels of communication, particularly because there are certain areas where you do see Meta, in some of its adversarial threat reporting about state actors, China in particular, saying the government no longer talks to us because it's afraid of being seen as somehow...
Any communication is jawboning. And that, I think, is also a very, very bad state for us to be in. Your book is sort of about how we ended up in the place that we are now, which is where, you know, you have millions of Americans who are deeply invested in conspiracy theories. It kind of feels like we have what you call bespoke reality, where everyone is just kind of stitching together their own version of events based on the sources that they're following, the influencers they pay attention to and trust.
we don't have a sort of broad consensus reality anymore. You also have some ideas in your book about how we could sort of start to make our way back to something like consensus reality, how we could start to turn the tide of disinformation and extremism and all this stuff. Can you walk us through some of your ideas for that? Yeah, so a big area of focus for me has been design, and that's because I think
people hope for regulation. I'm a little bit more of a skeptic on the regulatory front, and that's mostly because, from a purely pragmatic standpoint, I just don't see how anything gets passed in the United States, right? You know, we've been talking about tech reform and tech accountability and so on and so forth, everything from antitrust, to child safety, to privacy, and then you've got a whole slew of, like, very, very bad bills also. But nothing gets passed anyway. So I think what we look at here is the question of, what did we used to do to arrive at consensus? We've always had heated debates. How did we get to a point where we don't have any kind of ability to bridge?
I think one of the things that happens is when you have heated debates in your local neighborhood, you usually talk to your neighbors, right? You're geographically constrained. You see these people at the bus stop. You see them at the library. You don't spend all of your time screaming obscenities at them, you know? No, you go on to Nextdoor like a reasonable person and you write an all-caps post complaining that your neighbor set off fireworks at 11 p.m. and it woke up the dogs. Or no, that might just be my neighborhood.
I am no longer on Nextdoor. But yes. No, I think there's a – you can have civil disagreements in the real world. I think it's hard to look somebody in the face and accuse them of being a secret government whatever agent to silence them. It would sound a little bit crazy if you did that. So the question is how do you create that –
that kind of environment where you can have heated debates, but in a better way. And so right now there's some work being done on what are called bridging algorithms, right? And so I think it's important to note that everything that you see on a social media platform is ranked or curated in some way. This is why I've always been slightly mystified by the conversation about,
You know, my post is being censored because it's not, you know, it's not being served to as many people as I would like. And I'm like, well, I mean, like, welcome to the Internet, right? Everything is curated. Some, you know, some algorithm somewhere is making a determination about that. And there's a sense that these algorithms are like somehow sacrosanct, right? Like we have to like freeze them at this moment in time because any change to them is somehow censorship or bad.
In reality, you can surface content. You can prioritize, in a feed ranking, content that is just not caustic, right? You can have somebody who is expressing an opinion on abortion or Gaza or whatever else in a way that, you know, articulates the point, gets the thing out there, but is not focused on here is why my enemy is evil, right? Because a lot of the time, here is why my enemy is evil
It's not inviting debate, discussion, or consensus. It's inviting a mob to go target your enemy. And I do think that we can just curate that a little bit differently. I think this is a very reasonable thing to try. So I think there's just interesting areas to interrogate with regard to design. And then the last piece is education, which is really, the book is not a book about social media. It says this is the infrastructure in which this happens. This is like the water in which we swim. But ultimately, it is people making the decision to click the share button, click the like button. And can we teach people to
recognize certain types of rhetoric as, hey, this should actually be a red flag, when you see something framed in this way where somebody's talking about how they hate you, you know, they don't want X, Y, Z? Can you make it so that that kind of rhetoric is seen as low quality, as not a thing you should be proud to boost? How can we teach people to recognize, this is actually propaganda, right? This is pretty much Propaganda 101 over here. Rather than media literacy about sources and facts, can we just get to, here is how these types of claims work on us psychologically, and, you know,
Why maybe in that moment when you're that outraged, you actually shouldn't share. You should go look for more information. So we need media literacy about rhetoric as well as the facts and the sources. This was something that was done in the late 1920s, early 1930s. And I spent some time on it in the book, right? There was this radio priest, Father Coughlin. He was one of the first big, you know, you can call him an influencer, right? He had 30 million listeners. The population of the U.S. was about 120 million people. Massively popular man.
who has this arc going from being, you know, a populist in the sort of original definition of that term, right, a priest advocating for the people and working with FDR and others. He gradually becomes extremely disillusioned with that and descends into fascism. And in a literal sense, right? He is writing letters to Mussolini. He is praising Hitler. And that is where the man's arc goes, with his 30 million listeners following along.
And what's interesting is there's this group of academics, plucky academics, who start this thing called the Institute for Propaganda Analysis. And they begin to just annotate his speeches. They're not fact-checking them. They're not saying this is right, this is wrong. They're saying when he uses this rhetoric, like this, you know, they call it like the glittering generality. You know, these are the red flags. So they literally release annotated copies of his speeches.
I was absolutely blown away the first time I saw this with emoji. The glittering generality has a little diamond emoji, and they literally just put the emoji in there. And I saw it, and I was like, is this really from the 1930s? My God, I use emojis in my talks for that reason because I think, okay, it'll make people immediately click and remember that thing, that sort of visual association. And there they are doing it in the 1930s. And I was like, okay, where did this all go? One more question.
I think a lot of people right now have the sense that disinformation is winning, right? That there's this kind of battle between the forces of truth and the forces of lies, and that we are just surrounded and inundated with lies and conspiracy theories and half-truths and misleading suggestions just everywhere you look. What do you think? Is disinformation winning?
I think that institutions do not know how to operate in this moment. That has been a source of frustration for me personally over the last year, and I'm not going to sugarcoat that. I felt this acutely during COVID. I spent so much time during COVID writing these articles in The Atlantic. Like, wouldn't it be great if, you know,
Public health, rather than being reticent, was out there at the forefront saying, here's what we know right now. Will it change tomorrow? Maybe. But here's what we know right now. So that they're at least in the conversation. With the conspiracy theories about us, you know, the advice was, well, just say nothing, right? It's just some people on the internet who are making stuff up. And I'm like, it has profound impact. You absolutely must say something.
Right.
The institutions that the disinformation is often about just simply haven't adapted to the modern communication era. And they're just not putting anything out that would make it seem like there is a debate or a dispute or the facts are out there. And instead, they just say nothing. And that's something that I experienced over the last year. And it was very, very frustrating.
So if you want to win the information war, you can't just ignore the conspiracy theorists and the cranks anymore. Well, you have to at least be in the game. That's the thing, the idea that they don't understand that the game has changed, and that while they might not want to play it, they are in it. You remember the university presidents' debacles, right? I mean,
This was just yet another one. What do you think you are going into that hearing for? Do you think you're going into the hearing for oversight? You're not. You're going into it for a Twitter moment. What is the Twitter moment going to be here?
And you just watch these things with incredible frustration because they're not thinking in those terms. They're thinking we're going to be bland and boring and we're going to get out of this and no media is going to write an article because it was a very boring hearing. It's not how the world works today. And again, I think they need to be sort of jolted out of complacency. And I wrote the book in large part because I do think that we need to just change our thinking on that. Renee, thanks so much for joining us. Really enjoyed your book. And the book is called Invisible Rulers. And it's out now. Thank you so much. Thanks, Renee.
When we come back, we'll talk to David Yaffe-Bellany of The New York Times about why crypto is poised to reshape the 2024 election. And it's not by running a Bored Ape for Congress.
Indeed believes that better work begins with better hiring. So working at the forefront of AI technology and machine learning, Indeed continues to innovate with its matching and hiring platform to help employers find the people with the skills they need faster. True to Indeed's mission to make hiring simpler, faster, and more human, these efforts allow hiring managers to spend less time searching and more time doing what they do best, making real human connections with great new potential hires. Learn more at indeed.com slash hire.
Well, Casey, I don't know if you've heard or not, but crypto is back. Oh, Kevin, you know, that's the one thing I was hoping wouldn't come back. You're like bell bottoms. I'm good with crypto. Not so much. So obviously we've had a big crypto boom this year as the prices of a lot of cryptocurrency tokens have gone up.
But something else has been happening, which is that politicians have increasingly been engaging with the crypto industry as part of a strategy to win their elections. Well, tell me about this. So last year, RFK Jr., who's running for president on a third party platform, chose a crypto event in Miami as the place to make his big campaign debut. And he declared that he was a big lover of the crypto industry. And he said, well, I don't know.
And then just over the last month, Donald Trump has also been invoking crypto in his campaign speeches and positioning himself as a friend of the crypto industry. And now even apparently President Biden is thinking about meeting with the crypto industry to talk about policy. Well, that is interesting, although, Kevin, when it comes to RFK Jr., we can never forget that a worm did eat part of his brain.
That's very true. So, you know, it's been a little weird, as someone who's been following the crypto industry for a while, to see this sort of turn of events, where politicians who used to dismiss crypto out of hand are now apparently taking it seriously.
And I think it's just a very revealing story about how the crypto industry has been working behind the scenes to kind of drum up support among lawmakers, to try to beat back some of these regulations that it thinks are going to hurt its ability to make money, and also how it's using its money in very conventional ways to try to influence the upcoming election. So this week, three of my colleagues, David Yaffe-Bellany, Erin Griffith, and Teddy Schleifer, published a story titled How Crypto Money Is Poised to Influence the Election.
Basically, it's about this new attempt that the crypto industry is making to raise a bunch of money and to start super PACs and to start distributing it to candidates in races where they think their support could make a big difference. And I'm very excited about this because anytime I hear about a lot of crypto money going somewhere, I think it's a fresh opportunity for people to eventually be incarcerated. Yeah.
Right. So I thought this was a very revealing piece, not just because of what it said about the crypto industry, but because of what it says about politicians and how easily some of them apparently can be bought or at least convinced to take crypto more seriously. So to talk about this piece and what it means, we've invited our old pal, David Yaffe-Bellany, DYB, back on the show. BRB with DYB? Well, BRB. Have we ever used that joke? With DYB. Yeah.
David Yaffe-Bellany, welcome back to Hard Fork. Thanks so much for having me. By my count, this is your seventh appearance on this show. You are the most frequent Hard Fork guest. How does it feel? Well, do I get some sort of medal or like any hardware to signify this achievement? We just put an NFT in your crypto wallet. You'll want to check that out later. That's even better, baby. DYB, where are we catching you right now?
I'm coming to you live from Puerto Rico where I'm on a real grueling hardship reporting assignment for the next few days. You're just like sipping margaritas with Francis Haugen, aren't you? Yeah, basically. Well, I hope you're getting hazard pay for your arduous reporting trip to Puerto Rico.
David, before we dive into the story of crypto money and the 2024 election, I think it would be helpful if you just sort of mapped the terrain of crypto politics for us a little bit. And I want to start by asking you about how crypto is being viewed on the right and specifically by former President Trump.
Because until fairly recently, he was not a fan of the crypto industry. He used to say stuff like calling Bitcoin a scam. But recently, he's totally flip-flopped. And this year, he has declared himself a friend of crypto. He's accepting campaign donations in crypto. He's taking up causes that matter to crypto supporters. He recently met with Bitcoin miners at Mar-a-Lago. And he's been saying stuff in his speeches like, I will end Joe Biden's war on crypto.
He even has his own NFT series. So what happened? So that's a really good question. Like you said, he had this kind of long history of disparaging comments about crypto. And, you know, really up until even kind of earlier this year, in February, he made a comment about how he preferred dollars to Bitcoin. And so what happened?
Well, one thing is that he spent a lot of time hanging out with Vivek. You know, remember Vivek from the Republican— Vivek Ramaswamy. Not Vivek Murthy, the Surgeon General. No, no. And Vivek's a huge crypto proponent and has sort of taken some credit for changing Trump's stance on the issue, at least claimed that in private conversations he sort of nudged him toward thinking about crypto a little bit differently.
But I think really what's happening is sort of a raw political calculus, which is that this is an issue that a very wealthy group of people care a lot about and that they're very angry about as well. And so it's sort of an opportunity to score points on Biden. And whether Trump actually cares about crypto remains to be seen. But he sort of found it to be kind of a useful political wedge. And it's not just about scoring points, David, right? It's also about raising money.
Yes. You know, the crypto industry has spent huge sums of money already in this election cycle. That money has primarily gone toward congressional races, but I think it's reasonable to assume that, like, Trump sees this as a potential fundraising opportunity for his own campaign.
Right. I mean, that feels like a very straightforward story we've heard before with other special interest groups. You know, some, you know, the fracking lobby gets really interested in politics and they have a bunch of money. And so all of a sudden people start changing their views on fracking. Like that's not a new story. But that is interesting what's happening on the right and specifically with Trump, because it kind of has given crypto a political home that it maybe didn't have last year, even a couple of years ago.
I want to ask you also about the left, because it used to be that most Democrats were seen, at least by people that I talked to in the crypto industry, as being hostile to crypto. And some Democrats, including President Biden and Senator Elizabeth Warren, are still very much seen as kind of enemies of the crypto industry by people in the crypto industry.
But you also have some interesting cases of Democrats like Chuck Schumer, who recently broke with President Biden in an effort to roll back some SEC guidelines that the crypto industry didn't like. So, David, what is going on with Democrats and crypto right now?
The left doesn't really know what to do with crypto, I think. I mean, on the one hand, you've got people like Elizabeth Warren, who sort of see some of the abuses in the crypto world as a kind of mirror image of abuses that have happened in the traditional finance system. Obviously, she's a longtime critic of the big banks, and so she's sort of using a kind of similar rhetoric to talk about crypto. But that's created some sort of strange bedfellows situations, where you've got bank lobbying groups sort of siding with Elizabeth Warren, because the banks also want to take out the crypto industry since they see it as a threat to their business. And so you have other Democrats who say, hey, wait, why are we allying ourselves with the kind of entrenched financial interests that we've also criticized in other contexts? And so that's resulted in this kind of scrambled situation where you have different Democrats with sort of different positions on crypto. I will say, though, by and large, you have most Democrats in Congress sort of falling in line with the position of the Biden administration, which is that this stuff is, you know, largely illegal, at least the way it's being used at the moment.
Right. Okay, that's helpful. I want to talk now about money, and in particular, these crypto super PACs that you've been reporting on that are trying to influence the upcoming election. Who's behind this money? And how much money are we talking about?
So we're talking about a huge amount of money, primarily coming from three big crypto companies. Ripple, which has sort of battled with the SEC for years and years. Coinbase, the biggest U.S. exchange. And A16Z, which is a VC firm, but one with huge, huge investments in crypto, obviously. And they've each spent about $50 million to finance a group of PACs.
the largest of which is called Fairshake. And so those groups are sitting on a pool of money, you know, more than $150 million, which in the tech world is not like an astounding amount of money, but in politics, it can really make a huge difference. So lay out the political agenda of these PACs. What do they hope to accomplish?
So it's sort of pretty kind of explicitly transactional, even by crypto standards or even by political standards, really. Like they want to elect pro-crypto candidates. You know, they're talking about sending questionnaires along to candidates to sort of like gauge their views on crypto. And then the idea is to kind of elect people who will kind of back
pro-crypto legislation. And that could be, you know, a bill that strips a lot of power away from the SEC that says that, you know, cryptocurrencies are not actually securities and therefore they're allowed to kind of be offered and traded the way they have been in the U.S. Got it.
And what kinds of races are these crypto super PACs most focused on right now? So Fairshake, the biggest of the PACs, announced a couple of months ago that it was going to focus on four Senate races, including two that are
very competitive that involve Democrats who are looking pretty vulnerable in their re-election efforts. Those are the Senate races in Montana and Ohio. So it's Jon Tester in Montana and Sherrod Brown in Ohio, who are both kind of vocal Democratic critics of the crypto industry facing re-election in those crucial states.
And are these super PACs mostly or exclusively supporting Republicans? Because there are some Democrats who are seen as pro-crypto or at least a little less anti-crypto than maybe Elizabeth Warren and other very anti-crypto Democrats. So are they supporting any Democrats or independents?
Yeah, absolutely. And the PACs and the companies that are backing them are very quick to say that they consider this a bipartisan issue. They see strong supporters of crypto on both sides, et cetera, et cetera. And it's true that one of the first kind of major expenditures by Fairshake was in the California Democratic Senate primary,
where the group spent about $10 million on attack ads against Katie Porter, who was one of the Democratic candidates and was seen as sort of a close ally of Elizabeth Warren. And so she was defeated and Adam Schiff ended up winning that race. And Schiff went on to meet with Coinbase and some other crypto firms at Coinbase's offices a few weeks after that election. So you definitely see these groups kind of rubbing shoulders with Democrats as well as Republicans.
And how much of this activism by the crypto industry do you think has been helped by the fact that crypto prices are quite high right now? I mean, if we were talking in 2022 when the sort of crypto industry had collapsed and all these coins were – their value had fallen precipitously –
there just might not have been as much money to spend on these races. So how much is the fact that like Bitcoin is, you know, close to an all time high now that a lot of crypto prices have recovered and are booming again? How much has that helped these attempts to influence the political process?
Yeah, I mean, it's unquestionably a big part of it. I mean, most of Coinbase's revenue comes from transaction fees on crypto trades. And crypto trading ramps up and the sizes of those trades tend to be bigger when the market is doing well. And so Coinbase does a lot better when the market is doing well. It generates a lot more revenue. And you can see that in its earning reports every quarter. And so Coinbase has more money to spend now than it would have had
you know, two years ago. And, you know, thus it can afford to lay out $50 million on a PAC. You mentioned the Katie Porter race, where the crypto people got what they wanted. Are there other examples of them winning? Like, do they feel like they have some real momentum?
So one sort of cautionary thing I would say is it's always difficult to determine causation here. We know that Katie Porter lost and we know that the crypto industry spent a lot of money in that race, but was one a result of the other? It's not totally clear. They're very quick to claim that scalp, but I think
that we probably need more evidence before we can definitively say that this money is shaping the elections. Another claim that backers of some of these PACs are making behind the scenes is that Sherrod Brown's position on some crypto issues has kind of softened. He's voiced a willingness to vote for some pro-crypto legislation as a result of the threat to spend a huge amount of money in his race.
But if he had simply put his position on the blockchain, it would have been immutable and then it never could have either softened or hardened. So that's something that candidates should be thinking about. Exactly. This is how we stopped the flip-flopping that bedevils our political process. So obviously there are parts of this that just sound very traditional and sort of about a special interest trying to influence the political process, whether through big campaign donations or super PACs.
But there's also this idea among some people I talk to in the crypto industry about the crypto voter, right? There's this idea that a lot of crypto leaders have that there are millions of Americans out there for whom crypto is a very important issue and who will vote for candidates who support crypto and won't vote for candidates who don't support crypto. What do you make of that theory about the crypto voter? Yeah.
I mean, you know, I know Casey's a single issue crypto voter. Correct. Every decision is shaped by these issues. So it seems plausible to me. You know, this is something I've joked about with my colleague Kellen Browning, who used to be on the tech team and covers politics now. And, you know, I said to him a few months ago while he was out on the campaign trail, like, so are you running into a lot of these single issue crypto voters? And he just laughed. Like, of
course nobody's talking about Bitcoin at a Trump rally or whatever. But the industry has these surveys that are exclusively commissioned by the industry, which show that there are a huge number of people in the U.S. who own crypto. The argument is that even if it's just a fraction of that group that votes based on their own kind of financial interests and their crypto holdings, that that could have an impact on the election. I am not super convinced, both because these surveys are
like I said, commissioned by the industry and also because that leap from owning crypto to voting based on your crypto ownership seems like a big leap. But this is a huge industry talking point right now.
I mean, you have me wondering, essentially, like, will crypto come up during the presidential election campaign? Because if for whatever reason, you know, Trump talked about this a lot in debates, I imagine it would become a very polarizing issue. And then people would have to decide, you know, whether they agreed with him or not. So far, though, aside from, you know, maybe some of these campaign events that we mentioned up top, it doesn't seem like crypto is the axis on which the 2024 election is being fought.
Yeah, I mean, I wouldn't be shocked if a crypto question popped up at the debate. It is striking already. I mean, this is a niche, niche issue. This is not something that
random people on the street care about or know anything about. And yet you have like RFK Jr. like holding his campaign debut at Bitcoin Miami. And you've got like Trump suddenly like, you know, posting on Truth Social about how much he loves crypto. And, you know, now our reporting shows the Biden campaign is reaching out to some of these big crypto companies and saying like, let's talk. We're a little bit worried about where this narrative is headed.
And it's just an illustration of how powerful even a small amount of money can be in the political process. It's true. Yeah. It's turned this thing into a big deal when nobody was talking about it before.
And what's the best case scenario for the crypto industry here? Like if they if that, you know, if these super PACs are able to sort of, you know, help their preferred candidates win all these elections and they do end up getting a bunch of pro crypto people elected to Congress, what would actually change for the industry?
So I think there's this sort of like dream scenario, which is probably unlikely, even if all the crypto candidates win, where Congress passes a piece of legislation that basically says the SEC doesn't have authority over crypto anymore. These assets aren't securities and they shouldn't be regulated as securities. And that kind of immediately sort of neuters the SEC's whole kind of enforcement regime against crypto. And then at the same time, if Trump's in the White House –
He selects, you know, I don't know. He makes Vivek Ramaswamy the SEC chair and Vivek drops all the cases and says, you know, the SEC loves crypto now. And then suddenly the industry's problems are behind them. I think he's actually going to make a Bored Ape the Secretary of the Treasury. That's how this is going to resolve. I'm curious, David, like there is this.
idea of decentralization in the crypto industry. And the idea behind Bitcoin and a lot of early crypto experiments was that you didn't need the government's approval to do any of this, right? You didn't need to have sort of a conventional lobbying effort because all this stuff was happening sort of not through the traditional banking system, but
on the blockchain. And it was kind of this global unregulatable industry for that reason. Permissionless innovation. Exactly. So like what happened to that? And does this whole sort of super PAC influence strategy kind of run counter to the core idea of crypto as a decentralized technology? I know, I mean, it's rough for Casey, who I know has a permissionless innovation tattoo across his back. He's going to have to get that removed now. But yeah, this is, I mean, look,
Over the last two years, the kind of founding ideals of crypto have been sort of undermined in all sorts of ways over and over again. And what we've seen is that it's just become a different sort of thing than this kind of radical libertarian vision that it might have been 10 years ago. And this is part of that process. I mean, you've got three giant companies funding a giant pack that's trying to like
put politicians into Congress. And like, that's how every other industry in the world operates. And it reflects a high degree of centralization. And it shows that like, really, the industry can't actually operate sort of on the margins of government. It needs to sort of be engaged in government and, you know, like control the government to some degree in order to meet its objectives. David, how likely do you think it is that President Trump, if he's elected, will pardon Sam Bankman-Fried?
Well, it's interesting. I have actually discussed this question with some people around the case. One argument that he would pardon SBF is that the judge in the case, Lewis Kaplan, is the same one who oversaw the E. Jean Carroll case. And so Trump might want to just stick it to Judge Kaplan.
That's complete speculation, I should be clear. But look, I mean, SBF is a guy who, like, spent heavily to support Democratic candidates and who, you know, according to Michael Lewis's book, talked about offering Trump some vast sum of money not to run. So it's hard to imagine that he's the sort of person that Trump would want to pardon. Trump has said he'll commute the sentence of Ross Ulbricht, the Silk Road guy, who's serving a life sentence and is a huge crypto folk hero.
And has there been any discussion of why he would do that, given that the Silk Road was used for a lot of crimes? So there's a sense in the crypto world that Ulbricht was kind of an early crypto pioneer. Like, he found a use case. He sure did. Drugs and guns and hitmen. That was basically the use case that he discovered. Yeah. And there's also a feeling that, like,
Maybe he was guilty of some crimes, but the two life sentences that he received, it was an over-the-top punishment. So that's kind of the argument. So you see people with free Ross Ulbricht Twitter hashtags and that kind of thing. I actually don't know the answer to this.
Are you allowed to donate cryptocurrency to a campaign? Can you just make an in-kind donation? Can you donate your NFT to a campaign? Can you donate your Solana coins to a campaign? What are the actual rules around crypto and specifically funding crypto?
campaigns. You know, Kevin, I don't think your stash of leftover Dogecoin from 2020 is going to affect the political process much. But no, the Trump campaign is accepting contributions in crypto. That was one of the kind of
pro-crypto moves that Trump has made recently. But there are all sorts of complicated disclosure issues around that that are going to have to get ironed out. Well, it has to be weird because there's like a limit to the amount of money that you can donate, but you're donating using this currency whose price fluctuates constantly. Yeah, that's a good question. Yeah, I don't know the answer to that. That is a good point. Well, why did we bring you on then?
No, I'm just kidding. David Yaffe-Bellany, thanks for coming back. Thanks, David. Thanks for having me. Enjoy Puerto Rico. Wear your sunblock. Bye, my friend.
Before we go, just a note, if you want to hear more about the Surgeon General's call for a warning label on social media platforms, The Daily has an episode out today featuring an interview with the Surgeon General Vivek Murthy himself. So go check that out if you want to hear more.
Hard Fork is produced by Whitney Jones and Rachel Cohn. We're edited by Jen Poyant. We're fact-checked by Caitlin Love. Today's show was engineered by Daniel Ramirez. Original music by Elisheba Ittoop, Marion Lozano, Rowen Niemisto, and Dan Powell. Our audience editor is Nell Gallogly. Video production by Ryan Manning, Sawyer Roque, and Dylan Bergeson. Check us out on YouTube at youtube.com slash hardfork.
Special thanks to Paula Szuchman, Pui-Wing Tam, Kate LoPresti, and Jeffrey Miranda. You can email us at hardfork at nytimes.com. I'll be accepting birthday wishes all week. And also crypto donations, for sure. Imagine earning a degree that prepares you with real skills for the real world. Capella University's programs teach skills relevant to your career, so you can apply what you learn right away. Learn how Capella can make a difference in your life at capella.edu.