Ryan Reynolds here from Mint Mobile. With the price of just about everything going up during inflation, we thought we'd bring our prices down. So to help us, we brought in a reverse auctioneer, which is apparently a thing. Mint Mobile, unlimited premium wireless. How did you get 30, 30, how did you get 30, how did you get 20, 20, 20, how did you get 20, 20, how did you get 15, 15, 15, 15, just 15 bucks a month? Sold! Give it a try at mintmobile.com slash switch. $45 upfront payment equivalent to $15 per month. New customers on first three-month plan only. Taxes and fees extra. Speeds slower above 40 gigabytes. See details.
An underrated aspect of this year's elections: the British election was on July 4th, and the American election is on November 5th, a.k.a. Guy Fawkes Day. That's true. Oh, yeah. Fireworks for Britain in America, and then fireworks for America in Britain. Okay, well, somebody just described the next few months of polling as the finale to a fireworks show. So I feel like that works out really well. That's good. I love that.
Hello and welcome to the FiveThirtyEight Politics Podcast. I'm Galen Druke, and we are exactly two months away from Election Day. In fact, early voting in Pennsylvania, the likeliest tipping point state, begins in just a week and a half, September 16th. Folks, it is election time.
To help prepare for the final post-Labor Day stretch, we're assessing the presidential election from three perspectives: how we got here, where we are now, and what we should expect over the next two months. We discussed where we are and how we got here in the last podcast episode, so if you missed it, you may want to go back and listen to that first. Today, we're going to talk about what to expect over the next two months: how much polling usually changes between Labor Day and Election Day, sources of polling error, debates, and of course, October surprises. Joining me again are senior elections analyst Nathaniel Rakich and survey editor at The New York Times, Ruth Igielnik.
Let's dive right in. So election watchers, probably us included, treat Labor Day like a bright dividing line in the campaign calendar. Before Labor Day, we caveat most things with it's too early. After Labor Day, we're really into real election season. So how much do polls actually typically change between Labor Day and Election Day?
Well, Galen, our own G. Elliott Morris has done some work on this. And basically, he found that since 1948, at this point in the election cycle, the margin between the candidates in state polling averages historically moves by about seven percentage points. So that is obviously a lot, and you should expect a decent amount of volatility in the polls. That said, 1948 was a long time ago. We are in more polarized times these days.
So I don't expect seven points of movement between now and Election Day, but I do expect that these averages will not look exactly the same on Election Day. And when we're talking about leads of 0.2 points or whatever, obviously even small movement is important.
Yeah, Nathaniel, as you mentioned, the 1940s were a long time ago, and our producer Shane crunched some of the numbers more recently. If you look from the 1980s onward, the volatility between Labor Day and Election Day looks more like three and a half percentage points. And if you just look from the 2000s onward, it's more like two and a half percentage points.
Yes, that final stretch after Labor Day used to be a period during which you could see all kinds of wild swings. It seems like that's lessened, but of course, sometimes history is prologue and sometimes it's not. Ruth, do you think that Labor Day serves as a real marker in terms of when the real campaign happens and when things can really shift? I think why it matters is because historically it was when voters started paying attention, right? And this election has been a little bit different because we saw that shift a little bit earlier. A little bit. A little bit earlier. But that was the historical marker that mattered. Not so much that the polls couldn't shift during that time period. They could and did and will. It was more that that was when people started paying attention. And so actually, you could argue there might be more volatility after Labor Day than before, because people weren't paying attention yet. So I think that's the marker to really note. So would we take that suggestion and say, well, maybe that period has already happened? Because if you look at Gallup polling...
In August, 79% of Americans said they are giving quite a lot of thought to the campaign. That is a record for Gallup asking this question in August. They also ask it right before the election happens, and the all-time high for that number, since they started asking the question, was 84%, in October of 2004, right before that election. We're only five percentage points shy of that number in August. So, I mean, there's not even that much more attention, literally, that can be paid before you reach 100%. Is that reason to believe that this cycle's post-Labor Day stretch will have any less meaning than past ones? I don't think so, because one of the themes we keep seeing come up is this race to define Kamala Harris: Harris is trying to define herself, and Trump is trying to define her. We talked a little bit earlier about the dramatic swing in favorability for her. And one thing that any dramatic swing in the polls really clues me into, in this hyper-polarized era, is that those are fairly soft opinions. If things swing really dramatically, it's because the opinion isn't firmly held. So I think there's reason to expect some volatility, even though people are paying a tremendous amount of attention, because there's still a lot of definition that needs to come in this race.
Yeah, Ruth, that's a really interesting point. And the Wall Street Journal asked this directly. They said: do you have a firm opinion of Kamala Harris, yes or no, or do you need to hear more about her? 84% said they had a firm opinion of her; 15% said they need to know more. And in a relatively polarized world, 15% ain't nothing. So is that evidence to you that there really is more softness and potential volatility? Yes, that is absolutely evidence to me. And that's the exact question that the Wall Street Journal and many others will be asking to try to understand this race. It'll be particularly interesting in swing states because, yeah, to me, 15 percent is a relatively high number. And granted, these past few elections we've been talking about insanely well-known candidates in Trump and Biden and Clinton. But at the same time, it's enough people to really make a difference. A lot of that swing in her favorability is made up of people who are like, well, I don't really know, but she seems good. And maybe that holds as they learn more about her. And maybe it doesn't. That's hard to say.
Yeah. So I think it's instructive here that when we saw the polls basically not move between Biden and Trump in 2024 until the debate, and also throughout a lot of the 2020 cycle, we were dealing with some very well-known candidates. And in 2016, the polls were a little more volatile because people were still feeling out their opinion of Trump in particular, right? You saw him dip down after Access Hollywood, and then after the Comey letter, he went back up and Clinton went down. So there is some recent-ish precedent for the polls to change. And to Ruth's point, another interesting tidbit: if you look at our Harris-Trump polling average now, Trump has basically been consistently at 44%. If you look at our Biden-Trump polling average, Trump was also at 44% for a while. He rose a little bit after the debate.
His numbers have been quite consistent, and I think that tracks with the fact that everybody knows how they feel about Donald Trump at this stage. On the Democratic side, Joe Biden was obviously polling lower, around 40 or 41 percent in our polling average, and Harris is now polling higher. The variability has been on the Democratic side. So I completely agree with Ruth that perceptions of Harris, and Harris's number in particular, are what's softer and what could move here.
So another thing that's going to happen over the next two months is a lot of talk about polling error. Ruth, what sources of error in this election keep you up at night? Ooh, all sources of error keep me up at night. She's not sleeping, folks. No, right, yeah. I have small kids. I'm never sleeping. Okay.
There are three things I'm paying a lot of attention to, and one is maybe resolved. I was very worried earlier in the cycle about the error associated with a more popular third-party candidate, just because it's very challenging for polling to measure third-party candidate support. If you name them, you historically overstate their support; if you don't name them, you obviously understate it. And how polls were handling that was really inconsistent, and I worried a lot about that leading to error. That is arguably less important now with Kennedy dropping out, though I still think some of it is a little bit murky.
The other two sources of error are kind of interrelated. One is that I'm concerned about error in accurately measuring Latino voters. That's been a fairly swingy group this cycle, and I think we and others have been paying a lot of attention to making sure we get the right mix of Latino voters: college-educated and non-college-educated, voters who primarily speak English and voters who primarily speak Spanish. So just making sure that we are accurately measuring and have the right balance of those voters. And, interrelated, there are states that have higher-than-average polling error. Some of those are states with a higher share of Latino voters, but there are others too; Wisconsin is a source of anxiety for all of us. The biggest recent polling error in Wisconsin was nine percentage points, in 2020.
If we're looking at Wisconsin right now, and a lot of Democrats are looking at Wisconsin and saying that looks like an okay state for Harris, a nine percentage point polling error could really change the conversation around that state. So, don't trust a Wisconsin poll, folks. Exactly. So state-level polling error is really the thing that keeps me up at night: these states where we have this higher-than-average polling error.
Do you think that pollsters in general have worked hard on this? I mean, the New York Times has shown its work here: trying different survey methods in Wisconsin, sending actual snail mail, offering incentives to respond to surveys, and comparing high-response surveys with low-response surveys to see what different kinds of voters respond, literally in Wisconsin. So I know you've all been showing your work and being really diligent in this space, trying to fine-tune your methods accordingly. But do you think the broader polling landscape has been working hard on this issue after the past two presidential elections, and that pollsters have tried to make changes to address some of the sources of error? Yes, I would say people are working very hard at it.
But I don't know if we've found the solution yet. It's a problem that we know exists, and a lot of people in the polling community are spending a lot of time and attention on it. At the same time, we walked out of 2020 without firm conclusions on what went wrong to cause that polling error. And without a clear diagnosis of what went wrong, it's hard to feel like we've gotten our hands around a clear solution. Like you said, we've done work in Wisconsin with high-response-rate surveys and incentives, to see whether the types of voters we were getting are different from the types of voters we get with our phone polls, which have one to two percent response rates.
And we found that the voters we got with these higher-response-rate surveys were less politically engaged, which could be valuable to know. The balance of partisanship wasn't necessarily wrong: our phone polls matched what we were getting in the higher-response-rate surveys. But we were getting these less politically engaged people, and that could be a difference-maker. So that's something we think we might have gotten our hands around, but I still worry about it. Obviously, it's still something that concerns me.
Nathaniel, what keeps you up at night? And for you, I'm asking just generally, not specifically about polling.
Obviously, the polls on Election Day can be wrong for many reasons. There's good old-fashioned sampling error, where your sample just wasn't quite right, maybe a couple of points too Republican or too Democratic, and that's the ballgame. And then there are trickier things like non-response error, the idea that maybe there's something systematic about Trump voters, as in 2020 and 2016, that we may or may not have solved in terms of getting more of them into the sample. And the polling error could also go the other way. We've seen Democrats being very engaged in elections like special elections over the last several years, and if Democratic turnout is significantly higher than people were expecting, the polls might lowball Democrats as well. In some ways, it's good that the polls are so close, because it makes our job easier as journalists to say, yeah, that 0.2 percentage point lead, it could go either way. And it's hard, I think, for people to dispute that. That's just the thing we want to get across to people: right now, at least, this is a very close election. And even if the polls are historically accurate, they could still be off by a point. If the average polling error were one point, that would be fantastic; that would be such a great performance for the polls. But in a state with a 0.2 percentage point lead, even that could be the difference. So that's what keeps me up at night.
And it's worth saying, and we've said this many, many, many times on this podcast before: the average national polling error in a presidential election is four percentage points. And I'll just say, since we were comparing the 1980s and the 2000s to now: since the 2000s, the average error comparing polls from around Labor Day to the end result is three and a half percentage points, and if you go back to the 1980s, it's five percentage points. So that's a different thing from how much the polls themselves move. Yeah, and I will just add, and maybe this is sort of a controversial take.
My entire career is focused around polls. I think polls are insanely valuable. They're the voice of the people. I could go on and on. But I think something polls aren't great at is measuring really, really close elections. They are fairly blunt instruments. They do a really good job of showing us where things generally stand, and they're showing us right now that things are generally very close. But polls just aren't great at telling the difference between one percentage point this way and one percentage point that way. They're not an exact enough instrument to tell that difference. And that is a real challenge, because they are the only tool we have to measure these kinds of things,
but it's also an imperfect tool. And so I think that's just a real challenge. Right. In some ways, the thing that we obsess over the most when it comes to polling is the thing polls are least good at doing. If we're talking about an error of four percentage points and we're looking at a question on which the American public divides, say, 75-25, then even if you double that error, you still know where the majority falls. But on a question like this, we ultimately know only that, roughly speaking, half the country feels this way and half the country feels that way. Finding a difference in the range of 3% is just not what polls are, in some ways, even designed to do. Polling error is a known unknown here. There are some other known unknowns, and then there are a lot of full-on unknowns. So let's talk about that a little bit. But first, a break.
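The "blunt instrument" point above is really just sampling arithmetic. As a rough illustration (the 1,000-respondent sample size here is hypothetical, and this captures only sampling error, not the non-response problems discussed earlier), the textbook 95% margin of error for a poll proportion looks like this:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error, in percentage points, for a proportion p
    estimated from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# A hypothetical 1,000-person poll of a 50-50 race:
moe = margin_of_error(0.50, 1000)
print(f"50-50 race: +/- {moe:.1f} points")  # roughly +/- 3.1 points

# A lopsided 75-25 question from the same sample size:
moe_lopsided = margin_of_error(0.75, 1000)
print(f"75-25 split: +/- {moe_lopsided:.1f} points")  # roughly +/- 2.7 points
```

Even a perfectly executed 1,000-person poll carries about a three-point margin of error on an even race, so a 0.2-point lead sits far inside the noise; a 75-25 split, by contrast, leaves the majority side unambiguous even if you double the error.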
Today's podcast is brought to you by GiveWell. You're a details person. You want to understand how things really work. So when you're giving to charity, you should look at GiveWell, an independent resource for rigorous, transparent research about great giving opportunities whose website will leave even the most detail-oriented reader stunned.
GiveWell has now spent over 17 years researching charitable organizations and only directs funding to a few of the highest impact opportunities they've found. Over 100,000 donors have used GiveWell to donate more than $2 billion.
Rigorous evidence suggests that these donations will save over 200,000 lives and improve the lives of millions more. GiveWell wants as many donors as possible to make informed decisions about high-impact giving. You can find all their research and recommendations on their site for free. And you can make tax-deductible donations to their recommended funds or charities. And GiveWell doesn't take a cut.
Go to GiveWell.org to find out more or make a donation. Select podcast and enter 538 politics at checkout to make sure they know you heard about them from us. Again, that's GiveWell.org to donate or find out more.
Today's podcast is brought to you by Oracle Cloud Infrastructure, or OCI. AI might be the most important new computer technology ever. It's storming every industry and literally billions of dollars are being invested. So buckle up. The problem is that AI needs a lot of speed and processing power. So how do you compete with costs spiraling out of control? It's time to upgrade to the next generation of the cloud, Oracle Cloud Infrastructure, or OCI.
OCI is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds. It offers one consistent price instead of variable regional pricing. And of course, nobody does data better than Oracle. So now you can train your AI models at twice the speed and less than half the cost of other clouds.
If you want to do more and spend less, like companies Uber, 8x8, and Databricks Mosaic, take a free test drive of OCI at oracle.com slash 538. That's oracle.com slash 538. The numbers, not the letters. Oracle.com slash 538. Today's podcast is brought to you by Shopify. Ready to make the smartest choice for your business? Say hello to Shopify, the global commerce platform that makes selling a breeze.
Whether you're starting your online shop, opening your first physical store, or hitting a million orders, Shopify is your growth partner. Sell everywhere with Shopify's all-in-one e-commerce platform and in-person POS system. Turn browsers into buyers with Shopify's best converting checkout, 36% better than other platforms.
Effortlessly sell more with Shopify Magic, your AI-powered all-star. Did you know Shopify powers 10% of all e-commerce in the U.S. and supports global brands like Allbirds, Rothy's, and Brooklinen? Join millions of successful entrepreneurs across 175 countries, backed by Shopify's extensive support and help resources.
Because businesses that grow, grow with Shopify. Start your success story today. Sign up for a $1 per month trial period at shopify.com slash 538. The numbers, not the letters. Shopify.com slash 538.
Let's talk about the debates. We've got one in a week, and a VP debate at the beginning of October. Nathaniel, we already had this conversation once this cycle, because the first presidential debate was actually back in June. But how much might we expect these kinds of affairs to shape the election?
Well, we've already had maybe the most consequential presidential debate in history, probably. I guess you'd have to say so, right? So obviously, it'd be pretty silly to say that debates can't matter or don't matter. Our colleague, G. Elliott Morris, crunched the numbers before that first debate and found that, generally speaking, the polls have historically moved by an average of two points after a debate. So it can make a difference, but on the margin, which is kind of the same old song, right?
Kind of to the point of what we were talking about earlier, this is Kamala Harris's first debate. Again, Donald Trump has done lots of these: he did them in 2020, he did them in 2016, and we've already seen him in 2024. Well, she had a VP debate. Oh, that's a good point. Yes, that's fair. But also, people will be viewing her with different eyes in this debate.
The pressure will be on. It is the kind of thing that could change perceptions of her at this malleable time for her. And I think expectations will also be higher for her than they were for Biden, which doesn't help her. Basically, it is exactly the kind of thing that could, quote unquote, stop her momentum, but it's also the kind of thing that she could use as a springboard, or it could make no difference, because sometimes that happens, too.
Yeah. Speaking of which, I was on vacation, so I actually totally missed the CNN interview, and I still haven't watched it. I promise I will once I'm done catching up on all of my email, folks. Don't worry. You really don't have to.
I mean, that's another type of event where you could say it will either stem or continue the momentum. In your estimation, did it do either of those things? No. That kind of thing is different from a debate. A presidential debate is a capital-E Event, right? Those are the types of things that people are used to treating as big things on the calendar. Even people who aren't super obsessive politics nerds like us are going to maybe stop and watch the debate, or at least they will watch clips of it afterward, because they feel like it's their duty as a citizen or whatever.
Those individual network interviews, on the other hand, are just not nearly as big of a deal. For example, that CNN interview last week got 6 million viewers, which, hey, sounds like a decent night for cable news, and sure, it is. The debate in June got 50 million viewers. That gives you a sense of the scope, and obviously people are going to watch clips afterward, and these numbers don't include streaming and so on. But again, I think it shows you the way a debate can penetrate the narrative about a presidential campaign much more than one individual interview, as hyped as CNN wanted to make it.
We are also entering the period of the race where people are absolutely inundated with advertising. So far, since Harris took the helm of the Democratic ticket, Democrats have outspent Republicans. We're going to spend more time looking at ads and ad dollars on this podcast; it's a whole episode unto itself. But is that another potential source of movement, or is it one of those things where, yeah, we make a big deal of it and it's a big part of elections, but it doesn't change much?
I think probably mostly the latter. We've talked on the podcast before about how it's kind of a mutually assured destruction type of thing. Both candidates have a ton of money, they're going to spend a lot of it on ads in swing states, and as long as neither candidate gets their message out dramatically more than the other, it's going to cancel out. I kind of expect that to happen. So far, I will say, the Harris campaign has been out-advertising Trump. Whether it's by enough to make a difference, I don't know; again, there are two months left. But TV advertising is not something that has historically moved campaigns all that much, and I don't really expect that to be different here.
All right. Lastly, we've got some unknown unknowns, like October surprises. There might not even be any. Exactly. That would be the ultimate surprise, right? If there are no surprises. Did that happen in 2020? Do I remember correctly that there wasn't a real October surprise? Are you kidding me? Ruth Bader Ginsburg died?
Trump got COVID. Does RBG dying count as an October surprise? Yes, absolutely. One thousand percent. 2020 was like the most ridiculous campaign from start to finish. The topsy-turviness of that race will never be topped, and I am not issuing that as a challenge. I don't know, RBG also died in September, not October, and it seemed to change literally nothing. We don't know. I mean, the polls ended up being not that accurate in 2020, so we don't know what kinds of things were happening under the surface. Anyway, Trump tested positive for COVID October 2nd, 2020. Boom. Okay, there we go. So RBG was a September surprise, and Trump's COVID was an October surprise. I mean, the Access Hollywood tape was an October surprise, and that ended up not mattering because of the next October surprise, the Comey letter. This is crazy. Yeah, if we don't get anything, I will be a very happy camper. Okay. Needless to say, we don't know, and we won't start speculating now. So with that, thank you. Let's speculate! It's fine. It could be fun. What are some fun October surprises that we could get?
Aliens. We haven't gotten the aliens yet. That one's like, that's the big series finale, right? Nathaniel, what are you doing here? With that, thank you so much, Nathaniel and Ruth, for joining me today. Thanks for having me. Thanks, Galen.
My name is Galen Druke. Our producers are Shane McKeon and Cameron Chertavian, and our intern is Jayla Everett. You can get in touch by emailing us at podcasts at 538.com. You can also, of course, tweet at us with questions or comments. If you're a fan of the show, leave us a rating or review in the Apple Podcasts store, or tell someone about us. Thanks for listening, and we'll see you soon.