Since 2010, this blog has described growing political polarization in the United States. Polarization has been less a matter of Americans becoming extremists (most remain centrists or oblivious to politics) than of politically engaged Americans increasingly aligning their views, values, and even their practices, from where they live to what they drive to where they pray, with their politics.
Accordingly, public as well as scholarly interest in the topic has soared in just the last dozen or so years.[1] And so acute has polarization become that it increasingly undermines efforts to accurately measure how acute it has become.
Both the surveys and the administrative data that researchers use to track polarization are increasingly distorted by polarization itself. I discuss this development here. In a later post, I will discuss research on polarization itself published since I last reviewed it (here and here).
Polls: When People Don’t Answer
Looking at polls taken over the years is the major way researchers observe trends in polarization. For example, my early 2023 post included a graph from the General Social Survey showing that in the 1970s and ‘80s Democratic and Republican respondents on average expressed similar views regarding abortion. Then support for abortion access grew among Democrats and shrank among Republicans. Before, knowing whether people were Democrats or Republicans told you nothing about their positions on whether abortion should be available, say, to a married woman who does not want any more children. Now, you can safely bet that Democrats are likelier to say yes and Republicans to say no.[2] Masses of surveys on many topics show such widening differences.
Polling, however, has become extremely difficult. The percentage of Americans who agree to be polled has plummeted even for the best survey organizations and government agencies. The respected AP-NORC collaboration just reported a poll with “a cumulative response rate of 2.5%.” This trend partly reflects polling fatigue and partly political fatigue. In a 2023 Pew survey, 65% of respondents said that “they always or often feel exhausted when thinking about politics.”
If Americans who answer polls and those who do not were similar to one another, researchers could still track change reasonably well. But those who answer and those who refuse are not similar. Polarization itself shapes response rates. People who participate tend to be more partisan, more connected, and more informed and, so, more polarized,[3] giving us overestimates of polarization. At the same time, reports of polarization in the polls seem to feed people’s sense that America is dividing into camps, thereby increasing future partisanship (see here and here).
Also, as the percentage of the population that participates in polls shrinks to low single digits, the results from any given survey are increasingly vulnerable to even modest swings of mood. For example, people whose side seems to be winning at the moment of a survey (say, just after a presidential debate) become likelier to answer the survey than before. That variation in responsiveness, in turn, exaggerates the real, underlying changes of opinion; much of the measured swing is an illusion (see here and here). Dropping response rates combined with growing polarization create distorted views of the nation.
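A toy calculation makes the mechanism concrete. The numbers below are hypothetical, not drawn from any actual poll; the point is only that when one side's willingness to answer shifts, the topline moves even though no one has changed their mind:

```python
# Toy illustration: differential nonresponse can shift poll results
# even when underlying opinion is perfectly stable.
# All figures are hypothetical.

def observed_share(share_a, rate_a, rate_b):
    """Share of poll respondents from group A, given a 2-group
    population split and each group's survey response rate."""
    responders_a = share_a * rate_a
    responders_b = (1 - share_a) * rate_b
    return responders_a / (responders_a + responders_b)

# Population is a stable 50/50 split between two parties;
# at baseline, both sides answer polls at a 2% rate.
baseline = observed_share(0.50, 0.02, 0.02)

# After, say, a good debate night for party A, its supporters
# answer at 3% while party B's rate stays at 2%.
after_event = observed_share(0.50, 0.03, 0.02)

print(f"baseline poll share for A:   {baseline:.0%}")    # 50%
print(f"post-event poll share for A: {after_event:.0%}")  # 60%
```

A one-percentage-point change in one side's response rate produces a ten-point swing in the poll, which is the kind of artifact the linked studies describe.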
Polls: When People Do Answer
Polarization also affects the answers respondents give. On the one hand, partisan tension can lead some respondents to edit themselves if their neighbors are of the other political persuasion. More commonly, on the other hand, partisans succumb to “motivated reasoning,” adopting and reporting their side’s supposed “facts.” For example, partisans believe and report that the economy is doing well if their party holds the White House, but believe and report that it is doing poorly if the other side holds the White House. (See here and here.) People often (non-consciously) believe the world is as they wish it were politically, a pattern that has grown more pronounced over the last couple of decades.
Some survey respondents even lean into polarization by making assertions that they probably do not believe just to make a rhetorical point, declare their loyalty, or purposely shape the polls (for example, Trump supporters claiming that Trump’s inauguration crowd in 2017 was larger than Obama’s in 2009). (See also here and here.)
In a just-published paper, Minjae Kim and colleagues provide considerable evidence for both processes: First, partisans often believe false statements to be factual if those statements come from their leaders; second, and more importantly, partisans often set aside facts if there is a greater “truth” and a greater cause than mundane accuracy to uphold—say, Trump being the legitimate president. (Strikingly, in the case of 2020, Republican endorsers of the “Big Lie” seem to be largely real believers in a stolen election rather than advocates for a larger “truth.”)
While most recent studies such as these focus on Trump supporters, Kim and colleagues show that the partisanship effect operates for Democrats, too. Many believe or look past liberals’ misstatements on behalf of a greater “truth,” for example, AOC’s false claim that “police brutality is now a leading cause of death for young men.” AOC herself responded to correction this way: “I think there [are] a lot of people more concerned about being precisely, factually, semantically correct than being morally right.” Trump loyalists would agree.
These distortions mostly seem to inflate the extent of polarization, but they also fuel polarization by convincing people that Americans on the other side are nutty extremists (see here, here, here, and here).
Polls: Media Effects
Fewer Americans are getting news through traditional journalism, if getting news at all. Perhaps as cause or as effect of this, American news media’s language has become much more negative in tone since 2000, especially since 2010: angrier, sadder, more disgusted. David Rozado and his colleagues report these findings and argue that the biggest acceleration in negativity resulted from the development in 2009 of online tools that measured which social media stories were getting the most attention; those were the emotionally negative ones. News editors adapted accordingly.
Negativity ramped up in both red- and blue-tinged news media, but more so on the right. Left anger, however, may well have been stoked by another media trend Rozado and colleagues reported elsewhere: a large increase since 2010 in the appearance of terms describing prejudice—terms like racism, sexism, and transphobia. Public concern about these topics, as expressed in the polls, rose alongside this trend, more grist for the polarization mill.
Beyond the Polls
These are ways that polarization is distorting our assessments of polarization in surveys. Other ways of assessing polarization are also being distorted by polarization, for example, in voting data and in census data, such as those showing residential segregation by politics.
Voting data are increasingly affected by hyper-partisanship in a variety of ways—Trump-induced Republican shyness about mail-in voting (a large majority of Republicans oppose it, while a large majority of Democrats support it); threats against election officials and challenges to voter lists; finely-targeted, data-driven get-out-the-vote campaigns; misinformation online and in text message campaigns about, say, precinct locations; and so on.
Political partisanship is distorting census data as well. The Trump administration’s effort to add a citizenship question to the 2020 census helped triple the proportion of Hispanics who went uncounted compared to the 2010 census. On the other side, advocacy groups pressed to raise Hispanic response rates and, after the census, Democratic administrations demanded corrections, which sometimes succeeded.
* * *
Other phenomena can distort how we measure them. High inflation leads consumers to change what they buy in the market basket of goods that the BLS uses to track prices. High speed slows down a rocket’s on-board clock used to measure its speed. American political polarization seems to have accelerated in ways making it harder for us to assess its acceleration.
====== NOTES ======
[1] The relative frequency of Google Searches with the phrase “political polarization” rose roughly 150-fold between 2010 and 2023. The frequency with which that phrase appeared in American books between 2010 and 2019 roughly tripled (my calculations using nGram Viewer).
[2] General Social Survey question “abnomore” crosstabulated by “partyid” (coding leaners as partisans): For 1972 through 1989, 43% of Democrats and 44% of Republicans said they approved of allowing such abortions. For 2010 through 2023, the results were 58% and 35% respectively (gamma = .45; this excludes 2021 when Covid yielded a biased sample).
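For a 2×2 table, gamma reduces to Yule’s Q, which can be checked from the rounded percentages alone since Q is unaffected by how many people are in each party. A minimal sketch (the published figure was presumably computed from the full GSS counts, so rounding explains the small difference):

```python
# Goodman-Kruskal gamma for a 2x2 table reduces to Yule's Q:
# Q = (ad - bc) / (ad + bc), where a..d are the cell frequencies.
# Q is invariant to rescaling rows, so within-party percentages suffice.

def yules_q(a, b, c, d):
    """Gamma (Yule's Q) for the 2x2 table [[a, b], [c, d]]."""
    return (a * d - b * c) / (a * d + b * c)

# 2010-2023 figures from this note: 58% of Democrats and 35% of
# Republicans approved of allowing such abortions.
dem_yes, dem_no = 0.58, 0.42
rep_yes, rep_no = 0.35, 0.65

gamma = yules_q(dem_yes, dem_no, rep_yes, rep_no)
print(f"gamma = {gamma:.2f}")  # 0.44, close to the reported .45
```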
[3] It may seem a paradox, but it is the more informed who express the most polarized views. They know how the issues align with their political identities and they know who the “enemy” is; the less informed (and the apathetic) do not. Columnist Thomas Edsall just quoted a team of political scientists: “Those with high levels of partisan animosity are both politically engaged and knowledgeable…. ‘[They] discussed politics much more frequently, posted about politics on social media much more frequently, expressed greater interest in politics … and possessed substantially more political knowledge’.”