
Disinformation Warfare, Part Two - America | Investigation

2024/4/30

True Spies: Espionage | Investigation | Crime | Murder | Detective | Politics

Chapters

The episode explores how Russian disinformation tactics, particularly through the Internet Research Agency, influenced American politics, leading to internal conflicts and protests.

Transcript

This episode is brought to you by Shopify. Forget the frustration of picking commerce platforms when you switch your business to Shopify, the global commerce platform that supercharges your selling wherever you sell. With Shopify, you'll harness the same intuitive features, trusted apps, and powerful analytics used by the world's leading brands. Sign up today for your $1 per month trial period at shopify.com slash tech, all lowercase. That's shopify.com slash tech.

Disclaimer, this episode contains strong language. This is True Spies, the podcast that takes you deep inside the greatest secret missions of all time. Week by week, you'll hear the true stories behind the operations that have shaped the world we live in. You'll meet the people who live life undercover. What do they know? What are their skills? And what would you do in their position?

I'm Rhiannon Neads, and this is True Spies from Spyscape Studios. I think a couple of years ago, I would have said that Russian disinformation is the defining issue of our era, that Russia and the United States are engaged in a virtual Cold War. But now I really think our own information environments in the West have degraded to the point that we are our own worst enemies.

Disinformation Warfare, Part 2, America. July the 4th, 2017, Washington, D.C., Independence Day. It was a stifling summer's afternoon as the White House gleamed in the sun. Donald Trump was president. The atmosphere was celebratory.

Tourists, onlookers, and locals mixed with a huge group of people preparing to sing parodies of the musical Les Miserables. I love musicals, I do, even to this day, although it has caused me a lot of grief in my life. The group broke out into "Do You Hear the People Sing?", a call for revolution. But the lyrics had been altered. Instead of "There is a life about to start when tomorrow comes"...

The protesters sing, "There is a life about to start when impeachment comes." In song, they protested the incumbent president's politics, his rhetoric around women and minority groups, and his controversial tweets. The event had been organized by a man called Ryan Clayton, an activist and leader of the creative protest group Americans Take Action.

He's quite tall. He's kind of classically American, handsome guy, blue eyes, curly brown hair, a little bit hippie-ish at the time that I met him. But I've seen other pictures of him where he's in a suit and tie. So he's a little bit of a chameleon in that way. Ryan spent close to two decades in politics as a campaign manager before transforming into an events organizer, finding unique and fun ways for people to protest and have their voices heard.

The Les Mis flash mob, as it came to be known, was a spur-of-the-moment event. Ryan had jumped on Facebook and in minutes created an event page: White House Independence Day, Come Sing With Us. People were off work for the national holiday and it was a beautiful day. The people picnicking nearby strolled over to see what was happening. Many onlookers thought it was funny.

Ryan had, again, done so many protests over the years, and he really felt that, you know, despite this being an interesting event, something strange was happening there. Ryan didn't notice anybody suspicious, except those attired in American revolutionary outfits performing show tunes. But he was shocked by how many people attended. Then he remembered something.

Three days before the event, he and several other protest groups, Sing Along Solidarity, Americans Against Trump, and ReSisters, had been contacted by a woman named Helen Christopherson. She had written to Ryan saying, "Hey, we've got some money on our Facebook ad account. We'd like to spend it to target this protest to people in the D.C. area." Clicking on her profile, Ryan noted that it looked like she lived in New York,

was originally from Charleston, South Carolina, and had been on Facebook since 2015. She looked legit, but her English was a little off-kilter. And Ryan and his collaborators were like, well, somebody wants to give us free advertising money. But there was a catch: Helen needed to be made an administrator of the event page. Ryan agreed: "As long as these aren't, like, politicians for killing puppies or something like that."

With the social media advertising boost from Helen, more people attended. The filming of the protest also helped inspire countless copycat singing flash mobs all over the country. It showed the power that a hastily created Facebook event page could wield.

More than a year later, I started getting a flurry of messages from Russia watchers about a criminal complaint that had been launched by the Secret Service. And in that complaint, they noted that there had been a musical theater protest outside of the White House. And that sounded very familiar to me. Ryan, the activist leader, was named as the organizer of the protest in the complaint.

He basically was shocked. He had no idea that Russia had somehow infiltrated his group, given them money to buy ads to get more people to their protest, and essentially provided a vehicle for Russia to influence politics in the United States. This was an intelligent, politically-minded man. And even he became a victim of the Internet Research Agency in St. Petersburg, aka the Troll Farm.

Last time on True Spies, we delved into the troll farm and how it operated within Russia itself and its neighboring nations, spreading disinformation, widening societal divisions, smearing anybody it deemed to be the enemy. The branches of the troll farm had now spread very far indeed through social media, targeting both sides of the political spectrum.

The culmination and effects of its actions would contribute to one of the most shocking and unprecedented attacks on America's democratic norms and institutions, not by Russians or outside actors, but by Americans themselves. Earth's most powerful nation was about to navigate some incredibly treacherous waters. But how did this happen?

Guiding us through the journey from Russian troll farm to complete chaos in American democracy is our expert. My name is Nina Jankowicz and I'm an author. I wrote two books, How to Lose the Information War and How to Be a Woman Online. For the better part of the past decade, I have been researching Russian disinformation. I ended up working at the National Democratic Institute, which is an NGO that does democracy support around the world.

This is the second of a two-part True Spies special, focusing on Russian information operations and their influence on our lives, both online and off. Listen back to part one, if you haven't already, to learn all about how the troll farm came into being, the information war it waged, and how it was used as a precursor to physical attacks on countries such as Ukraine. But let's begin where we left off in part one.

In Ukraine, in 2014, Russia's influence on the country had suffered a setback with the Maidan uprising. The year before, Ukrainian President Yanukovych had reneged on a pledge that would have brought the country closer to the EU. The subsequent protests saw him flee to Russia. The uprising that undid his constitutional amendments and swept him from power became known as the "Revolution of Dignity."

But Russia quickly regrouped and annexed Crimea. So after Euromaidan, these big protests in the center of Kyiv and across Ukraine, along with the illegal Russian annexation of Crimea, there cropped up a lot of civil society organizations and operations that were meant to counter disinformation. And some of them were fact-checking operations.

Russia was widening the information battlefield, focusing its attention on the east of Ukraine. Many separatist groups and ethnic Russians were driven onto the streets by what they read on social media: that their Ukrainian government had abandoned them, and only Russia could provide the life they longed for. The Kremlin's goal to undermine Western-style democracy by sowing mistrust and causing its internal decline was working.

Putin needed to show his people and the surrounding former Soviet countries that democracy didn't work. Citizen groups, funded by several Western governments in Ukraine, were being held up as the antidote to "fake news," the creation of which was the bread and butter of the troll farm in St. Petersburg. All of this fact-checking, while admirable, just really seemed to be missing the mark.

Here, Nina is referencing something called the implied truth effect.

According to Yale researchers, warning labels added to some false stories on social media and news websites bizarrely increased the perceived accuracy of stories that were not labeled. In terms of changing opinion, decades of psychological research about fact-checking, going back to the 1970s, shows that it actually tends to make people remember the wrong information more than it helps them correct their perception.

In eastern Ukraine, Kremlin-backed trolls pulled at ethnic tensions and spread disinformation about sensitive topics, such as the removal of a Soviet statue, and encouraged pro-Russian separatists to demand action from Moscow. The Kremlin had played these games before and learned many lessons in perfecting their information warfare.

In Estonia in 2007, the troll farms stoked tensions between Estonians and the ethnic Russian population before launching a cyber attack, crippling the country's technological infrastructure. In 2008, Georgia suffered cyber attacks, economic interference, cultural manipulation, an all-out disinformation war and a physical invasion.

In Poland in 2010, Russia amplified conspiracy theories surrounding the plane crash that killed the country's president. And then on to Ukraine. Just by nature of the way that fact-checking works, you're kind of leaving a vacuum, too. You are allowing the adversary to set the narrative.

The Kremlin's efforts to destabilize these nations, thanks in large part to the troll farm, were quick, effective, temporarily anonymous, and, set against the cost of traditional warfare, extraordinarily cheap. One by one, these countries suffered chaos through foreign interference. And then, of course, Euromaidan and the Revolution of Dignity happened in Ukraine, and that's when I really felt

a calling to go out into the field. And so I did a Fulbright Fellowship in Ukraine, advising the Ministry of Foreign Affairs there. This region was extremely poorly understood; the West always thought of it as this backward, gray, Soviet place, when in reality the entire region has been dealing with some of the really intractable problems that we find ourselves saddled with now for much longer than we have.

The troll farms became adept at really drilling down to see what worked with audiences. It was the fissures, the real grievances in society that were being manipulated by Russia. And I felt that in order to really solve the disinformation crisis, we needed to start looking at that instead of just playing what I called whack-a-troll and trying to either remove fake accounts or correct false narratives online.

Do you ever get to the end of the month and find that you're still paying for that streaming service you don't use anymore?

After downloading Rocket Money, I don't need to worry anymore about managing hidden, lingering subscriptions that are preventing me from saving. It's given me an overview of all my subscriptions in one place. Total visibility and control on everything. There's a built-in budgeting feature where you can track your spending and set allowances month to month. It's even notified me when a subscription is changing in price.

Rocket Money is a personal finance app that finds and cancels your unwanted subscriptions, monitors your spending, and helps lower your bills so that you can grow your savings. Rocket Money has over 5 million users and has saved a total of $500 million in cancelled subscriptions, saving members up to $740 a year when using all of the app's features. Stop wasting money on things you don't use.

Cancel your unwanted subscriptions by going to rocketmoney.com/spyscape. That's rocketmoney.com/spyscape.

The fall of the Soviet Union promised many things to the likes of Georgia, Estonia, and Ukraine, not least Western-style economic prosperity. When that failed to materialize, life became hard for many.

In those former Soviet countries, swathes of people became disillusioned with their new governments, societies and identities. They believed that life had, in fact, been better under communism. It's these thoughts and feelings that Russia is so adept at exploiting, using them as motivators to disrupt, even as precursors to war.

These countries also stood as a warning to the rest of the world: ignore Russian disinformation strategies at your own risk. But the U.S. wasn't listening.

I think we just thought in America that our system was stronger than disinformation, that we were able to tell truth from fiction because we had checks and balances and we had this robust media ecosystem. And it all sounds ridiculous now in hindsight, but I don't know if it was that we thought we knew better or that we thought we were inoculated.

Part of the effectiveness of the troll farm came from the fact that it was hard to prove where the content it produced had come from. A lot of the social media accounts created in St. Petersburg, including the amplification of adverts and events pages, looked like they belonged to the locals of the country they were focusing on. That authenticity, mixed with very simple, emotive messaging, became a winning formula.

They'd figured out what works. And sometimes they failed. Sometimes they fell flat entirely. But they did strike gold a couple of times and understood that if they were going to hasten polarization, if they were going to turn Americans against one another, the best thing to do was identify fissures in society and really poke at both sides of them. Russia's disinformation toolkit was well honed as it traversed the Atlantic.

Ahead of the 2016 election battle between Democrat Hillary Clinton and Republican Donald Trump, the farm immediately saw results by using an unusual tactic: playing both sides of the political spectrum.

I believe at one point there was a protest, a real life protest that Russia had supported in Texas with pro-gun people on one side of the street and anti-gun people on the other side of the street. And Russia had supported both sides and sent them both there.

The event Nina is referencing here was to do with a Russian-controlled Facebook group called Heart of Texas that promoted secession from the United States, often accompanied by content that projected the pride of living in a land that prizes the right to bear arms. The page amassed hundreds of thousands of followers.

And in May 2016, those followers were invited to a Stop Islamification of Texas rally. People wanted to feel that community. They felt, OK, here's a trusted page. They've sent me some good memes in the past, so why shouldn't I put my name on this petition? Why shouldn't I become involved with them offline as well? At exactly the same time,

another Russian-sponsored Facebook page called United Muslims of America advertised a Save Islamic Knowledge rally in the exact same location. This example in Texas cost the Russians only $200, and the potential for conflict and harm was absolutely real.

The troll farm was creating identity groups and community pages at an infernal pace, racking up six-figure follower counts with calls to support certain politicians and to protest against neighbors and others in their own communities. These calls stood in contrast to the funny video posts and harmless-looking memes that filled the same pages.

My favorite one was this golden retriever with an American flag bandana and a little American flag. And it said, "Like if you think it's going to be a great day." And it cultivated this sense of community and then gradually moved the needle, making bigger and bigger asks: not just "Like if you think it's going to be a great day," but "If you support Donald Trump for president, change your profile picture."

The very first Facebook ad that the Internet Research Agency bought asked people to "make a patriotic team of young Trump supporters by uploading photos with the hashtag #KidsForTrump." The combination of seemingly innocent calls to action with emotionally charged content mixed with the technological innovation of social media was a powerful cocktail.

If you support Donald Trump for president, buy a T-shirt from our store. Who knows where that money went? Religion, race, sexual orientation, employment status, political positions, even hobbies and interests were powerful motivating factors in getting people to harden their pre-existing beliefs.

Russia has hastened the polarization and kind of widened the fissures in our society, widened those gaps to the extent possible, but they all existed beforehand. Over 30 million Americans shared content created by the farm. Many, many millions more liked or interacted with the content. They had gay rights pages. They had Latinx pages. It really ran the gamut. People wanted to feel that community.

A group calling itself Blacktivist, run by the farm, had over 500,000 followers, more than the official Black Lives Matter page at the time. It supported rallies, uploaded videos about police brutality, some of it designed to shock, sold merchandise and engaged users on Facebook Messenger. Of the 470 groups the troll farm created, this was one of its most popular.

One group called Black Matters asked its followers to "wear black, fight back." These groups manipulated so-called filter bubbles: the phenomenon of social media platforms feeding their users personalized information that conforms to their existing beliefs or patterns of online behavior, isolating people from differing viewpoints. This is called white jamming.
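To make the filter-bubble mechanism concrete, here is a minimal sketch in Python. Everything in it, the posts, the engagement history, and the scoring rule, is invented for illustration; real platforms' ranking systems are far more complex and are not public.

```python
# Illustrative sketch of engagement-based feed ranking producing a
# "filter bubble". All data and the scoring rule are invented; this is
# not any real platform's algorithm.
from collections import Counter

posts = [
    {"id": 1, "topic": "gun_rights"},
    {"id": 2, "topic": "gun_control"},
    {"id": 3, "topic": "gun_rights"},
    {"id": 4, "topic": "local_news"},
]

def rank_feed(posts, engagement_history):
    """Score each post by how often the user has engaged with its topic."""
    topic_counts = Counter(engagement_history)
    return sorted(posts, key=lambda p: topic_counts[p["topic"]], reverse=True)

# A user who has only ever clicked on one side of an issue...
history = ["gun_rights", "gun_rights", "gun_rights"]
for post in rank_feed(posts, history):
    print(post["id"], post["topic"])
# ...sees that side float to the top, while differing viewpoints sink.
```

Because each session's clicks feed the next session's ranking, the isolation compounds over time, which is exactly the dynamic the troll farm's pages exploited.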

Russia really knows its audience and understands that there are certain things that really piss people off or get them out of their seats and encourage action. And so it's not like it was trying to target staunchly Democratic voters with a Vote Trump meme. Over 67,000 Facebook posts, not adverts, just posts, were created by the troll farm during the run-up to the election.

On Instagram, this figure nearly doubled, to over 116,000 posts. Disinformation changes people's behavior. It's changing how they conceive of certain candidates, events. The 2016 U.S. election between Donald Trump and Hillary Clinton was very close; it came down to just 70,000 votes across a couple of key swing states. But the whole event was mired in controversy.

The pejorative term "fake news" discredited candidates. Claims of voter fraud abounded. Democratic National Committee emails were hacked and allowed to spread on social media. The entire time, rumors of Russian interference bubbled below the surface, including suggestions that Donald Trump's associates had colluded with the Kremlin. But there was an even more serious issue afoot.

The other thing that we saw also is the attempt to suppress certain voters, especially in the Black and Latinx community. There was a lot of messaging to those voters that said, "None of these candidates represent you or your needs. Why should you go vote?" These voters historically tended to vote Democrat. If they didn't turn out, the Republican Party could gain an edge.

Again, it's creating this trusted community and then maybe changing behavior that doesn't even have to do with what you mark at the ballot box, but whether you turn out at all. Trump became president. But instead of pushing ahead with measures that curbed foreign interference, Nina says he decided to move in a different direction.

Trump normalized disinformation and more plainly kind of normalized lying. Trump's behavior created basically open season for politicians across the spectrum to have a loose relationship with the truth at best. For Nina, that began at his inauguration when Trump's team claimed it was the largest audience ever to witness an inauguration, period.

All evidence suggested that it wasn't. This was also a tactic we saw Vladimir Putin use in part one of this series. When Russia illegally annexed Crimea in Ukraine, he first denied his soldiers were anywhere near the region, despite all evidence pointing to the contrary. He later made an about-turn, saying that they were there. Facts, it appeared, simply didn't matter anymore.

According to Nina, this white jamming phenomenon was at play in the U.S., making people feel like there was a lot less incentive for them to participate in the democratic system. And it usually gets back to those basic grievances where they don't feel their opinions are represented and somebody who's sharing the disinformation feels like they're listening to them.

Later, the Senate Intelligence Committee concluded that the Russian Federation had targeted all 50 state voting systems with cyber attacks.

Not only was the Kremlin targeting American democracy, but also the very apparatus people used to exercise those liberties. So the cyber attacks, luckily, in 2016 and then in subsequent elections, were never successful. There were instances in which Russia accessed voter rolls, but they weren't able to make any changes, and they certainly were not able to affect the vote count. In total...

four investigations and committees collectively confirmed Russian interference in the US election. The Republican-led Senate Intelligence Committee confirmed meddling. The House Intelligence Committee released samples of Facebook ads bought by the Internet Research Agency, claiming that Russia had "brazenly interfered" with the US election, without explicitly alleging collusion with Trump or his team.

During these hearings, the FBI, the Federal Bureau of Investigation, confirmed that it, along with the Department of Homeland Security, had been investigating interference since 2016. And then there was Special Counsel Robert Mueller's investigation. More on that in a minute. We were now two years into Trump's presidency.

And before Mr. Mueller's report was released, a criminal complaint in his investigation was unsealed. It showed the Troll Farm's budgets in St. Petersburg, correspondence between employees, and the types of activities it was promoting on social media in America. This included the 2017 anti-Trump Les Mis singing flash mob, with which we started our episode.

The complaint stated that the farm had spent $80 to promote the event, targeting those within a 30-mile radius of Washington, D.C. Adverts for the flash mob were seen by between 29,000 and 58,000 people.
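As a rough sense-check of how cheap that reach was, the implied cost per thousand impressions (CPM) can be computed directly from the complaint's figures. This is back-of-the-envelope arithmetic only; the complaint does not say how Facebook actually priced the ads.

```python
# Implied cost per thousand impressions (CPM) from the complaint's figures:
# $80 of spend, seen by 29,000-58,000 people. Assumes simple CPM pricing,
# which is an assumption, not something the complaint states.
spend_usd = 80.0
for reach in (29_000, 58_000):
    cpm = spend_usd / (reach / 1_000)  # dollars per thousand impressions
    print(f"reach {reach:,}: implied CPM ${cpm:.2f}")
# reach 29,000: implied CPM $2.76
# reach 58,000: implied CPM $1.38
```

A dollar or two per thousand people reached, set against the scale of the response the event drew.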

Ryan Clayton was named as its chief organizer. He led a group called Americans Take Action, and they had done a lot of creative protests throughout the early days of the Trump administration. They stood up at the inauguration with the word "resist" spelled out across their shirts. Not long after the inauguration, Ryan handed out Russian flags with "Trump" written in gold letters at the Conservative Political Action Conference.

He and his fellow Americans Take Action members were kicked out, but not before many attendees started waving those flags in front of the cameras. They also unfurled a "resist" banner from the top of the stadium at the Washington Nationals' home opener. For his Les Mis event, Ryan transferred control of the Facebook event page to what looked like an American named Helen Christopherson,

who apparently lived in New York. Her written English wasn't perfect, sure, but not every native speaker uses standard grammar. Helen had money to advertise and promote the event on social media, but she was not who she said she was. The complaint suggested that she was actually working at the troll farm in St. Petersburg. Not content with simply meddling in elections, the Kremlin appeared to want unmitigated chaos.

I think this is what's important about this particular example. These are people who were anti-Trump, and it was after the 2016 election. It was six months into Trump's term. So it meant that Russia was still at it, and they were trying to really rip that fabric of American political society apart to turn people against each other. Targeting rallies was an effective way of getting Americans out of their houses and onto the streets. A Confederate rally in Texas was provided funding.

Pro-Trump rallies of coal miners in Pennsylvania were promoted and given supplies. Racial and historical divisions were amplified around a neo-Nazi rally in Virginia at the site of the statue of Confederate General Robert E. Lee, a slave owner who fought to preserve bondage and desired secession from the United States. As we saw in episode one, threats against monuments and statues are powerful motivators.

In Eastern Europe, Russia had manipulated local proxies to disseminate divisive content through believable channels. This was now being repeated in America. For just a few dollars, Russia could infiltrate a pro-gun group and put them in the same location at the same time as a group promoting peace.

It took over two years to collate this evidence and confirm that the troll farm had falsified so many social media accounts. Its impact, however, is hard to quantify. I think it's certainly plausible that it could have had an effect on the outcome of the 2016 election.

In 2019, the full Mueller report, officially titled "Report on the Investigation into Russian Interference in the 2016 Presidential Election," confirmed the meddling, stating that there were "multiple, systematic efforts to interfere in our election," an allegation that, in Mueller's words, "deserves the attention of every American."

It concluded that the investigation did not establish that Trump associates colluded with the Russians on interference, but it detailed ten episodes in which the president may have sought to obstruct the investigation. The president later staunchly rejected the report, using the term "fake news."

I think Trump and some other Republicans believe if it benefits them, then they're happy to let Russian disinformation or any other disinformation happen. Putin and his trolls have actually helped me. They've fractured the Democrats. They've made sure that we are able to continue our policies, whether it's in Russia or Ukraine or whatever. It's a good thing for us. And they're not thinking far enough in the future.

We started this episode talking about the lack of a cohesive response to Russian disinformation in Eastern Europe. America was not listening to the experiences of those countries and was caught flat-footed in much the same way.

Two years after the election, the interference, and the chaos on the streets, the first Facebook accounts were deactivated. The Department of Homeland Security and the FBI notified the social network of more than 100 accounts they traced back to the Internet Research Agency in St. Petersburg. By this point, those accounts had already done their job.

But the fruits of Russian disinformation were really ripening around this time.

At the beginning of the war in Ukraine, Russia was using what we call inauthentic amplifiers. So trolls, advertisements bought under the guise of being local when in fact they were bought from a troll farm in St. Petersburg. Stuff that was pretty rudimentary, I would say, perhaps with a deep understanding of whatever culture they were trying to affect, but still pretty rudimentary. Now, in America...

it had advanced to activating authentic amplifiers: those in power, real people, sharing mis- and disinformation in the mainstream, as we'd see in the extreme during the 2020 election and in Trump's final year in office. And in the midst of the coronavirus pandemic, new complexity surrounded the president's relationship with disinformation, even if it wasn't coming from Russia.

On the 23rd of April 2020, President Trump was holding a coronavirus task force briefing at the White House. Going off script, he suggested that scientists take a look at using disinfectant in the body as a treatment for the virus. I said, you know, we can't fact check our way out of the disinformation crisis, this crisis of truth and trust that we face.

Health organizations repeatedly implored people not to drink bleach and other disinfectants if they had COVID or symptoms of COVID.

The infodemic, as it came to be known, started to show that mis- and disinformation could be a threat to health and even life. I was appointed to a role within the Biden administration that was going to be coordinating disinformation policy at the Department of Homeland Security. And people, mostly on the right, but some on the far left as well,

said that I was going to be the minister of truth from George Orwell's 1984 and that I would have the power to censor Americans and to send men with guns to their homes if I disagreed with them. That's what Tucker Carlson said about me. It wasn't true. That led to death threats and violent threats against me and my family when I was a couple weeks away from becoming a mom, of all things. In this world, the jump from online to reality is a very short bridge.

Nina has said as much to those whom she accuses of lying about her in public. And I did not mince words when I told them: I think what you're doing, you're lying for power and for profit, in order to shore up political fundraising, in order to shore up your political future, and putting lives at risk.

But when Nina began her role within the Biden administration, even talking about disinformation was so polarizing that it was nearly impossible for her and her team to make progress on the issue. People like me were caught in the crosshairs because it was just such a politically charged issue, something that the Republicans could really win on: claiming that there was censorship of conservatives and that this whole Russiagate scandal was a hoax to remove Trump from office.

COVID saw all of us spending more time living online. During this time, swift and deadly transmissions of disinformation were prominent:

QAnon conspiracies, including the debunked claims that leading Democrats were part of a satanic child sex trafficking ring, white supremacy and replacement theory, the idea that whites are being replaced through cultural and demographic shifts, the recycling of ancient anti-Semitic tropes,

the notion that climate change is a hoax, and the claim that George Floyd, whose murder in 2020 by a police officer inspired the global Black Lives Matter movement, was an actor. None of the stuff that we've seen in the past eight years is solely a result of Russian disinformation. But I certainly think there are moments where Russia is patting themselves on the back and saying, we don't need to do very much anymore. America's going to do it all to themselves.

Hello, True Spies listener. This episode is made possible with the support of June's Journey, a riveting little caper of a game which you can play right now on your phone. Since you're listening to this show, it's safe to assume you love a good mystery, some compelling detective work,

and a larger-than-life character or two. You can find all of those things in abundance in June's Journey. In the game, you'll play as June Parker, a plucky amateur detective trying to get to the bottom of her sister's murder. It's all set during the roaring 1920s,

And I absolutely love all the little period details packed into this world. I don't want to give too much away, because the real fun of June's Journey is seeing where this adventure will take you. But I've just reached a part of the story that's set in Paris.

And I'm so excited to get back to it. Like I said, if you love a salacious little mystery, then give it a go. Discover your inner detective when you download June's Journey for free today on iOS and Android. Hello, listeners. This is Anne Bogel, author, blogger, and creator of the podcast What Should I Read Next? Since 2016, I've been helping readers bring more joy and delight into their reading lives. Every week, I talk all things books and reading with a guest and guide them in discovering their next read.

They share three books they love, one book they don't, and what they've been reading lately, and I recommend three titles they may enjoy reading next. Guests have said our conversations are like therapy, troubleshooting issues that have plagued their reading lives for years, and possibly the rest of their lives as well. And of course, recommending books that meet the moment, whether they are looking for deep introspection to spur or encourage a life change, or a frothy page-turner to help them escape the stresses of work.

The 2020 election saw a repeat of the themes of 2016.

But along with Russia's amplification of disinformation, adverts containing false information, and the organizing and supporting of rallies, one method in particular was proving to be very successful. What we've seen is Russia using more techniques of information laundering. And this is where they find someone who is able to take a narrative that might be based,

you know, at its core, in fact; to twist it a little bit, perhaps to introduce documents that have been altered, or perhaps to hack some documents that weren't public originally; and who then introduces it into the mainstream. And so figures like Rudy Giuliani, for instance, are especially vulnerable to this. Rudy Giuliani, Donald Trump's attorney, took a particularly active role in amplifying allegations of fraud in the 2020 election,

without providing any evidence. We can't do much about that because Rudy Giuliani can go on Fox News or OANN and he's perfectly allowed to do that. There's nothing illegal about what he's doing. And he's protected by the First Amendment. The First Amendment of the United States Constitution, created not long after the Declaration of Independence, guarantees the right to free speech without government censorship or interference.

One consistent narrative that was brought to public attention by Rudy Giuliani was to do with Hunter Biden, Joe Biden's son. According to Rudy Giuliani, Hunter's business dealings with a Ukrainian energy company involved corrupt activities.

This was followed by a series of allegations of misconduct in Hunter Biden's business affairs, centered around incriminating emails found on a laptop apparently owned by Hunter. And this story came to Trump through Giuliani. Over 50 former intelligence officials suspected that the origins of the story exhibited the characteristics of a Russian information campaign.

The claims about Hunter Biden have been widely debunked, or the reporting around the story deemed unreliable. But Giuliani pushed the idea of corruption among those linked to the Democratic Party.

That's the sort of stuff that we are seeing ahead of 2020. Again, a lot less related to overt bots, trolls, inauthentic amplifiers, ads in Russian rubles, and more of this circuitous attempt to influence the narrative.

And it makes it so much more insidious when that's happening because it's not an account that you can remove. It's not something that you can de-amplify on a platform. And some people are going to buy it. And that's just the way democracy works sometimes. Trump then allegedly used this narrative to call Ukrainian President Volodymyr Zelensky, exerting pressure on him to initiate investigations into the activities of Hunter Biden.

Although Trump was cleared of any wrongdoing in his first impeachment trial around this phone call, on the 6th of January 2021, after losing the election to Joe Biden, Trump gave a speech in Washington in which he told his supporters to "fight like hell" to overturn the election results and told them to march to the U.S. Capitol. I hoped...

With COVID, with January 6th, we had all these off-ramps where we could have come together and said, this is where it stops. You know, people are taking horse medicine. People are dying because they believe in these conspiracies. And people doubled down on all of those narratives. Insurrectionists, after hearing months of Trump's repeated "rigged election" rhetoric,

violently overran the Capitol building, a symbol of democracy in America, where laws are made and politicians represent their people. Thousands attended the rally. Hundreds stormed the building. Many were armed. People went into the seat of government and destroyed property and smeared poop on the walls. And yet that was not a warning somehow.

The riots were the sudden and violent explosion of a pressure cooker blowing its lid. And I think we have seen since January 6th

a penchant on Russian television and other Russian media to say, oh, these poor protesters, they were just peaceful protesters in the Capitol and they're being unfairly treated by the Biden administration. There was a lot of amplification of the Trump narrative that the 2020 election had been stolen. America's institutions were under attack from its own people.

From the chaos of the insurrection, a striking photo emerged: a shirtless, American flag-waving QAnon shaman, with fur hat, horns, and a spear, sitting in the Vice President's Senate chair. The chair from which the rule of law, freedom of speech, and democracy are presided over was now occupied by a supporter of a baseless conspiracy theory.

Five people died during the attack. Trump was charged with incitement of insurrection and was impeached for a second time. He was then acquitted for a second time. But the effects of January 6th will be felt in America for generations. The Russian troll farm that planted the seeds of an alternate reality had seen those seeds sprout into chaos and democratic instability.

The locus of all of this change is social media, the place where people are getting their information, reading news articles, watching videos, talking to friends, organizing singing flash mobs, connecting with communities, meeting to riot. But what exactly is driving these platforms to be so relaxed in curbing foreign interference?

So money, I think, is driving a lot of the disinformation that we see. We have the social networks, which are private companies, and they have shareholders to answer to, and those shareholders want to see these companies make a profit. And so if we know that the most engaging material is the most enraging material online, what incentive do social media companies have to moderate disinformation?

There's a great irony that the world's biggest social media networks are American. Russia can give these companies tiny amounts of money and use them to influence, amplify and coerce Americans. But the laws around social networks spreading disinformation remain rudimentary.

So the first bill that was on the table way back in 2017 was something called the Honest Ads Act. And it was sponsored by a Democrat and a Republican. And to my knowledge, that bill has never made it out of committee. The bill was intended to introduce simple regulation, like there is for print, radio and TV ads, about who could buy political ads online or ads that support a particular candidate.

The bill faced a filibuster, a delaying tactic used to prevent a vote, from Senate Republicans, blocking its advancement. Easy, low-hanging fruit there. And even that was too political to get out of committee. America has struggled to coordinate a response to these intrusions, much like we saw in Ukraine. And the many homegrown actors who have taken it upon themselves to spread disinformation make it especially difficult to handle.

There have been other efforts to regulate social media. Very few of them get anywhere near the floor of the House or the Senate. Still, to this day, even the things that we should be able to agree on politically are not things that are legislatively taken care of. And because Elon Musk bought Twitter and because of this whole narrative that

there is some collaboration between governments, researchers, and the platforms to censor conservative voices, all of the platforms are rolling back the election security measures that they had in place. The information ecosystem has only deteriorated further, says Nina, and lawmakers are effectively unable to impose rules that protect its users.

The major social media companies are mostly based in the United States, so Nina believes that's where legislation needs to change first. We haven't seen the US really stand up in terms of regulating what's going on online. And now we're playing catch-up to the European Union and the UK, with its Online Safety Bill.

In 2018 and 2019, technology companies increased transparency around political advertising and enforced stricter rules about who can buy such ads. Because of this, you could no longer buy adverts on the major American social networks in Russian rubles. Thinking back to basics of democracy, you know, sending...

representatives to listen to their constituents and really make change based on that, rather than governing based on outrage, is, I think, a really important thing to do. And solving those grievances that allow disinformation to be so successful in the first place. That's what I mean by citizen-centric approaches to disinformation.

But can any administration justify censoring the supposed authentic expressions of its own citizens and not impinge on their First Amendment rights? Russia has always relied on advances in technology to help shift the playing field. The next wave of technological innovation will surely see artificial intelligence playing a role.

AI is pretty scary when it comes to large language models and the way that they can put out information that is not true at scale. If you're using something like ChatGPT 3.5 or 4, they've got checks in place where theoretically a Russian troll or an Iranian troll would not be able to say, "Hey, write some falsehoods about Hillary Clinton for me." Throughout the elections of 2016 and 2020,

Social media saw lots of memes and posts and adverts with spelling mistakes, grammatical errors and odd phrasing.

As inconsequential as this may seem, it actually gave investigators easy clues that the content wasn't written by those with strong English language skills. A lot of the linguistic indicators that we relied on earlier to identify foreign disinformation aren't going to be there anymore, because it's in pitch-perfect English, even, you know, with some colloquialisms inserted based on kind of the way that people write and speak online.

AI is also able to generate deepfake content: hyper-realistic, utterly convincing photos, videos, or audio depicting individuals engaging in actions or making statements that aren't true. That's my biggest worry, the way that the inherent biases in our society are reflected back by AI and can be manipulated by those who are in power, whether that is outside of our borders or within.

Numerous challenges we've explored in these two episodes have had implications that have reverberated around the globe. The world's gaze is on America to see what happens at the next election.

I'm certainly worried, with attacks on people who study disinformation up and down the board, with trust and safety staff getting fired from major platforms, and the bar going lower and lower and lower for what it is that politicians and platforms need to do around elections. And I think we've proven that we're not inoculated to disinformation.

We're happy to gobble it up if it confirms our pre-existing biases. Without solving our internal problems, we're not going to solve the external ones either. And I think those have actually gotten far worse, if you look at where the United States and a number of other democracies are ahead of the big 2024 election year. But there is hope.

If I didn't think there was hope, I wouldn't be in the game anymore. And especially after myself having gone through some pretty horrible disinformation campaigns. But I believe there is something worth fighting for. And our systems, to some extent, have...

I wouldn't say they have shined, but they have kept us from complete ruin. And I think now we need to reckon with what democracy looks like in the internet age and hopefully claw our way back to some semblance of cooperation and compromise because that's what democracy is meant to be about.

There are countries out there now that have reliable models of protection against disinformation. Russia's neighbor Finland has education-based counter-disinformation programs; as a societal whole, Finns are more resistant to the tricks employed by their neighbor. There is also another country that we can look to, despite the tragedy it's currently facing, that could be a path to finding solutions.

I'm Rhiannon Neads.

Join me next week on True Spies. Disclaimer. The views expressed in this podcast are those of the subject. These stories are told from their perspective and their authenticity should be assessed on a case-by-case basis.

If you're enjoying this podcast, please click now to give it a five-star rating or leave a review. Ratings and reviews help people discover the podcast and help us bring you more great stories. And if you have some time, why not forward the podcast to a friend?

Aaron learns that two Chinook helicopters carrying units from the U.S. Navy's SEAL Team 6 had approached Takur Ghar, a strategically crucial mountain peak offering near-total control of the valley below. It's snowing, low visibility. Having received the all-clear, the SEALs prepare to touch down and secure the site. As the first of the Chinooks flares to land, though...

They encounter some resistance. An RPG round hits just behind the cockpit, starting a fire in the cabin. Another hits the right-side radar pod, blowing out all electrical power to the helicopter. Taking evasive action, the pilot tilts the Chinook to one side, such that a Navy SEAL by the name of Neil Roberts falls out of the back and lands in the snow. And so they leave him there.

True Spies from Spyscape Studios. Search for True Spies wherever you get your podcasts.