Are people more likely to accurately evaluate misinformation when the political stakes are high? Haha, no
Imagine you’re walking down the street and a random guy asks you to solve a math problem. A complicated one, but one you know how to solve — it’d just take several minutes of some fairly serious thinking on your part. Would you do it?
Maybe you love math and want nothing more than random afternoon word puzzles. But I’d wager most people would just mumble “sorry” and keep on walking. You have the capacity to solve this rando’s problem, but you don’t really have any incentive to.
Now imagine the same scenario — but this time, the guy says he’ll pay you $500 if your answer is correct. Suddenly, there’s a potential return on your investment of time, so you’re more likely to offer up the mental energy.
This is a fairly well-established finding in situations where someone is asked to throw some brainpower at a problem. Giving someone a meaningful incentive to solve a mental problem can lead them to work harder and gives them a better chance of getting it right. That’s also true for a very specific kind of mental problem: figuring out whether to believe some random headline you see on social media. For example:
— This 2024 study asked 3,999 people to participate in a mock social network where they were tasked with determining whether or not a post was scientifically sound. The most effective way to increase subjects’ accuracy, and the only way to get them to head over to a search engine to do some research on the topic at hand? Tell them they’ll get more money if they’re correct.
— Or this other 2024 study, in which subjects were asked to evaluate criminal evidence from two witnesses to a crime — one of whom subjects were informed was lying. Despite that information, many believed both witnesses — unless they were promised an extra €5 if they kept things straight. That small amount was enough to “significantly reduce” subjects’ mistakes.
— Or this 2008 paper, which asked 1,200 Americans questions of political knowledge. Telling people they’d get $1 for each correct answer increased their accuracy by 11%.
— Or this 2015 paper, which asked Democrats and Republicans a series of questions about economic performance under past presidential administrations. Unsurprisingly, people painted more positive pictures of their own party’s record. But offering a financial incentive that amounted to just 17 cents per correct answer reduced the partisan gap in answers by more than half.
There’s a consistent thread here: If people don’t see a reason to bring their full mental capacity to bear on a question, they probably won’t. We’re lazy! But when the stakes are a little higher — when there’s a little more reason to bring our A-game — we can do better.
Let’s transfer that idea into politics. After all, there’s usually no direct reward for sussing out a fake headline in your News Feed, or for detecting when a claim about a politician edges from plausible to laughable. In day-to-day life, a single bit of political wrongness is unlikely to impact your life one whit. So why summon up the brain power?
But what if the stakes were suddenly higher — say, just hypothetically, if it were a presidential election season and the country were being presented with two wildly different potential futures? Would people then summon up more of their mental capacity to separate good information from bad? Pundits have long said most voters only “get serious” about an election a few weeks before the big day — maybe that new seriousness means a stricter adherence to the facts?
That’s one of the issues addressed by a new paper by Charles Angelucci, Michel Gutmann, and Andrea Prat — of MIT, Northwestern, and Columbia, respectively. Its title is “Beliefs About Political News in the Run-up to an Election”; here’s the abstract, emphasis mine:
This paper develops a model of news discernment to explore the influence of elections on the formation of partisan-driven parallel information universes. Using survey data from news quizzes administered during and outside the 2020 U.S. presidential election, the model shows that partisan congruence’s impact on news discernment is substantially amplified during election periods. Outside an election, when faced with a true and a fake news story and asked to select the most likely true story, an individual is 4% more likely to choose the true story if it favors their party; in the days prior to the election, this increases to 11%.
Did you catch that? People aren’t more likely to accurately evaluate the news during the fever pitch of an election season — they’re less likely to, and by a meaningful margin.
Angelucci, Gutmann, and Prat base their study on YouGov survey data from a total of 10,094 people. These surveys, from between 2018 and 2022, presented participants with both real and fake news stories and asked them to identify the ones most likely to be true. Importantly, people had an incentive to be accurate: They were promised $1 for each correct answer. The authors separated out the surveys conducted just before the 2020 presidential election from the others and compared how people’s answers differed.
Here are some of the true statements:
- The U.S. Senate acquitted Trump of impeachment charges.
- President Trump nominated Brett Kavanaugh to the U.S. Supreme Court.
- The U.S. Government was partially shut down in a fight over Trump’s border wall with Mexico.
And some of the false statements:
- Attorney General Barr released text message from Special Counsel prosecutor Robert Mueller: “We’re taking down Trump.”
- President Trump disparaged the Puerto Rican governor and statehood movement, tweeting that Puerto Rico was “a small island filled with savages.”
- President Trump said that former President Obama wrote the emoluments clause of the Constitution.
First, let’s look at their findings on partisanship:
This chart shows how the partisan nature of people’s responses differed depending on the survey’s timing. The left column shows data from the non-election-season surveys and finds that — depressingly, if not surprisingly — Republicans are more likely to give Republican-biased answers and Democrats are more likely to give Democratic-biased answers. (That is, they’re more likely to ascribe truth to fake stories when they make their political opponents look bad.)
The right column shows the just-before-the-election data. The bias gets amplified on both sides — though, as is often the case in U.S. politics these days, the effect is more pronounced on the political right.
Strikingly, the gap between Democrats and Republicans in the average partisan reflection of their selected statements more than doubles during the presidential election compared to outside of the election period. This finding helps explain the varied results in the literature and suggests a dynamic approach to studying “parallel information universes.”
To try to eliminate alternate explanations for the data, the authors construct a “model of news discernment” that controls for variability between surveys and the amount of time between when a news story broke and when people were being asked about it. (It’s worth noting that people were not especially good at these little quizzes. On a given survey, the average person got 2.61 correct out of 4.)
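To make that a bit more concrete, here’s a rough, hypothetical sketch of what a model in that spirit could look like: a logistic regression in which an interaction between partisan congruence and the election window captures the “amplified during elections” effect. To be clear, this is not the authors’ actual specification; the variable names and simulated data below are invented purely for illustration.

```python
# Hypothetical sketch of an election-window interaction model.
# This is NOT the paper's actual specification; names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# One simulated row per (respondent, story-pair) judgment.
congruent = rng.integers(0, 2, n)    # 1 = the true story flatters the respondent's party
election = rng.integers(0, 2, n)     # 1 = surveyed in the days before the election
story_age = rng.integers(1, 31, n)   # days since the true story broke

# Simulate choices so congruence helps a little outside elections
# and much more during them (the pattern the paper reports).
logit_p = 0.3 + 0.15 * congruent + 0.35 * congruent * election - 0.01 * story_age
prob_true = 1 / (1 + np.exp(-logit_p))
chose_true = rng.binomial(1, prob_true)

df = pd.DataFrame({
    "chose_true": chose_true,
    "congruent": congruent,
    "election": election,
    "story_age": story_age,
})

# The congruent:election interaction is the quantity of interest;
# story_age stands in for the paper's controls (story recency, quiz difficulty).
fit = smf.logit("chose_true ~ congruent * election + story_age", data=df).fit(disp=False)
print(fit.params)
```

In a setup like this, the size of the congruent:election coefficient is the rough analogue of the gap the paper reports widening from 4% outside elections to 11% just before one.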
Our estimation exercise confirms the result suggested by the raw data: individuals’ beliefs when assessing the truthfulness of political news become significantly more partisan during election periods. To quantify this finding, consider a thought experiment where an average partisan individual is presented with a pair of recent news stories — one true and one false, with the false story being neutral in its partisan orientation. Our model estimates predict that, outside of an election period, the individual is 4% more likely to select the true story as the most likely to be true if it reflects favorably on their preferred party compared to unfavorably. However, during an election, this difference increases to nearly 11%.
When asked about a recent news story on a survey conducted outside election season, people were 7 percentage points more likely to call a story true if it was “very favorable” to their party than if it was “very unfavorable.” But just before an election, that gap more than doubled to 17 percentage points.
All in all, our model estimation exercises corroborate the pattern hinted at by the raw data presented in the introduction: even after accounting for stories’ age, quiz difficulty and differences in partisan dispersion of news stories, we find that partisan congruence shapes individuals’ beliefs about political news far more strongly during election periods than non-election periods. Elections, it seems, amplify the influence of partisanship on the perception of truth.
In a sense, it all comes down to what you mean by “high stakes.” Yes, a presidential election is high stakes for the country at large. But believing something that supports your ideological priors is high stakes for your ego — especially at the height of an all-consuming campaign. Our brains want to believe the best about our side and the worst about the other. And that desire, it seems, overrides any extra incentive for accuracy at the moment our votes matter most.