Posts Tagged ‘surveys’

A few months ago, I sketched preliminary explanations of last November’s election; those conclusions still hold up well. This post addresses how well – or how poorly – the election polling did, why, and what the implications are for using polls as a voice of popular opinion.

Truman-Dewey

Taken together, the major polls missed last year’s presidential election result by an average of 4 percentage points, mainly because they underestimated the Trump vote; they also underestimated Republican down-ballot votes by about the same margin. (FiveThirtyEight.com’s final average of polls gave Biden an 8.4-point lead; he ended up winning by 4.4 points.) As presidential election forecasts in recent decades go, this error was roughly average.
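The error figures quoted above and in the bullets below reduce to simple margin arithmetic: the gap between a poll average’s predicted margin of victory and the actual margin. A minimal sketch (the numbers are from this post; the function name is mine):

```python
def polling_error(predicted_margin, actual_margin):
    """Absolute polling error: the gap, in percentage points, between
    the predicted margin of victory and the actual margin."""
    return abs(predicted_margin - actual_margin)

# 2020: FiveThirtyEight's final average gave Biden an 8.4-point lead; he won by 4.4.
error_2020 = polling_error(8.4, 4.4)   # roughly 4 points

# 2016: Clinton was predicted to win the popular vote by 3.9 points; she won by 2.1.
error_2016 = polling_error(3.9, 2.1)   # roughly 1.8 points
```

By this measure, 2020’s miss was indeed more than twice the size of 2016’s, even though the 2020 polls named the right winner.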

However, the 2020 polling stirred considerable and appropriate consternation; Politico declared the morning after that “the polling industry is a wreck and should be blown up.” The reasons for consternation include these:

* Although the polls got the Electoral College winner right this time, the 2020 error was actually larger than the 2016 error, which was only 1.8 percentage points (Clinton was predicted to win the popular vote by 3.9 points but won it by 2.1).

* This deterioration in accuracy occurred despite major efforts by polling organizations to fix the apparent 2016 problems and notable improvement in the 2018 off-year elections. The average 2018 error in forecasting party shares of the congressional vote was exactly zero. FiveThirtyEight.com declared that the “Polls are Alright.”

* In particular states (e.g., Wisconsin, Florida) the 2020 presidential polling error was much larger than the national 4 points.

* Many projections for down-ballot races, such as the Senate race in Maine, performed a lot worse than the presidential ones.

* The polls’ errors leaned in the same direction as in 2016, underestimating the Republican vote yet again.

Post-mortems on the election now have some analysts and some political action groups (e.g., Swing Left) looking to rely less on polling going forward and more on “fundamentals,” such as how a district voted in prior elections.

What happened?


Read Full Post »

Here are a few facts about what surveyed Americans claim to be facts: In a late-2017 poll, one in five respondents claimed that Donald Trump received more popular votes than Hillary Clinton; about two in five said that the unemployment rate had risen during the Obama years; and about one in three told another poll that Obama was born in Kenya. All wrong. Of course, there is a huge political split on such topics. For example, about one-half of Republicans versus one-seventh of Democrats said that we’ve had a Kenyan president.

There are partisan splits on a range of facts. In 2016, 79 percent of liberal Democrats versus 15 percent of conservative Republicans agreed that “Earth is warming mostly due to human activities.” Not all the misperceptions are on the Republican side. It is well known that many Americans sharply flip their reports about how the economy is doing, or even about how their own finances are doing, when the White House changes party control (e.g., here and here). Both political sides have tended to report crime as rising when it was actually falling.

How should we understand the detachment from reality that so many Americans seem to display when asked questions about facts? What does it say about polls and their value? One thing it says is that many people use polls to send a message.


Read Full Post »

Survey Says . . .

The Gallup organization recently announced that it will not poll on the presidential primary races – and perhaps not on the 2016 general election either – in order, its editor-in-chief said, to focus on “understanding the issues.” Observers might suspect that Gallup is passing on elections because its forecasts were embarrassingly wrong in 2012: off by about 5 points, Gallup called Romney the winner. It is hard to sell your data to businesses and news organizations when you know it is questionable. Whatever the motive, Gallup’s withdrawal spotlights a deep worry in all survey research: declining accuracy because of plummeting response rates. … For the rest of this post, see the Boston Review here.

Read Full Post »

Surveying Change

Social historians studying the twentieth century have an advantage that specialists in earlier centuries do not. Survey research, which began seriously in the 1930s, allows the former to know what average people reported about their attitudes and actions in ways that no documentary archive can even approximate (here, for example). To track changes in attitudes and actions accurately over the decades, not only should the samples drawn in different eras be comparable, but the questions asked should be the same – whether they are about church attendance, political participation, racial views, or whatever. As the noted sociologist Otis Dudley Duncan reportedly stated, “If you want to measure change, don’t change the measure” – i.e., don’t change the wording of the questions.

Wise advice. But there is a problem: Sometimes the words themselves change meaning.

I was sharply reminded of this issue recently while leading a team that was putting together a survey. I had jotted down a phrase to use in a question: “in order to keep things straight.” Graduate students quickly objected: you can’t use “straight” because of its sexual connotations. I was well aware that the word “gay” had been transformed. Tom W. Smith, a dean of survey research, noted that the Gallup Poll’s 1954 question, “Which American city do you think has the gayest night life?,” did not mean the same thing just 30 years later. Now, neither does “straight.”

Survey designers cannot fully rely on fixed meanings. Paradoxically, the pollsters’ craft requires judgments about social change in order to write the questions to measure social change. (For related discussions of how words’ histories can affect psychological testing, see this earlier post and here.)


Read Full Post »

The Survey Crisis

At this time in the presidential election cycle, we are inundated by surveys, almost moment-by-moment, battleground state by battleground state. But surveys are far more important than just serving to handicap elections. It is through scientific surveys – that is, asking standardized questions of representative samples of the population – that researchers and policymakers in the last 70 or so years have been able to get an accurate sense of Americans’ lives and opinions. It is how we know, for example, who is having trouble making ends meet, or people’s views about big policy issues.

(There are many survey cynics out there, I know. But, well-done surveys are roughly accurate – and roughly accurate is better than other ways of figuring out what is going on, like extrapolating from one’s friends’ experiences and opinions.)

Telephone Interviewers (source)

For the last several years, however, survey researchers have faced escalating challenges, in particular the problem of getting Americans to answer their phone calls. (See this excellent National Journal article.) High-quality, face-to-face surveys are still done, but their costs have shot through the roof; only the government and well-funded academic projects can afford them. The more common way of doing surveys – say, the way Gallup or Pew does them – is by telephone. And that has become difficult. I know of one small survey business that simply closed, its staff dispirited.


Read Full Post »

Social scientists trying to understand what makes Americans tick often turn to cross-national surveys to compare Americans’ opinions to those of people in other countries. Such surveys show us, for example, that Americans are generally more religious, more patriotic, and more suspicious of government than are people in most other countries.

Patrick Vinck, UC Berkeley

A recent conference devoted to designing such international surveys made concrete an important point that I had perhaps appreciated too abstractly: There are deeper differences underneath the different answers Americans give. The very assumptions behind the questions that are asked, whether the questions even mean the same things, differ profoundly from nation to nation.


Read Full Post »

Depressing Comparisons

An August post on a sociology blog began, “For the last several decades, depression rates have been on the rise at a rapid pace.”

Source: Andrew Mason via flickr

That assertion has appeared in many places over recent years. The blogger provided no reference for the assertion. I think I know the initial source of the claim; most writers who declare that depression has been rising probably read it in – of course – The New York Times.

Research indicates, however, that there was no rise in depression rates over the last several decades. The key studies relied on for the claim that depression rose had one or more important flaws. Understanding those flaws helps us understand the difficulties of discovering and making historical claims.


Read Full Post »