At this time in the presidential election cycle, we are inundated by surveys, almost moment-by-moment, battleground state by battleground state. But surveys are far more important than just serving to handicap elections. It is through scientific surveys – that is, asking standardized questions of representative samples of the population – that researchers and policymakers in the last 70 or so years have been able to get an accurate sense of Americans’ lives and opinions. It is how we know, for example, who is having trouble making ends meet, or what people think about big policy issues.
(There are many survey cynics out there, I know. But well-done surveys are roughly accurate – and roughly accurate is better than other ways of figuring out what is going on, like extrapolating from one’s friends’ experiences and opinions.)
For the last several years, however, survey researchers have faced escalating challenges, in particular the problem of getting Americans to answer their phone calls. (See this excellent National Journal article.) High-quality, face-to-face surveys are still done, but their costs have shot through the roof; only the government and well-funded academic projects can afford them. The more common way of doing surveys – say, the way Gallup or Pew does them – is by telephone. And that has become difficult. I know of one small survey business that closed, its owners simply dispirited.
Not Answering the Phone
Survey research response rates have been declining for decades, slowly at first and now faster and faster. Quality survey organizations, notably university-based ones, used to count on getting 70% or more of the people whom they targeted for interviews to participate. It is still possible to reach such percentages with in-person household interviews, if one invests in the training and effort. (That is what the General Social Survey, run by the National Opinion Research Center at the University of Chicago – on whose board I sit – attains.) But getting those interviews has become much more of a struggle. Meanwhile, interviewing people on the telephone has become borderline futile. The Pew Research Center, a responsible outfit, now reports completing only 9% of targeted interviews, down from 36% fifteen years ago.
Several factors have driven this decline. Americans have been increasingly away from home, especially as wives have gone off to work and as childless adults spend evenings out. Another factor is fear of crime: the caller may be trying to trick you or scope out your home. Both the out-of-the-house and the crime trends took off after the 1960s. More recently, many Americans, perhaps a third, have given up landlines for cell phones, and the procedures for reaching people on their cell phones are much more daunting. And, no doubt, Americans have become survey-fatigued, especially because more organizations now do telephone polls and many businesses run fake surveys as a way to market their products.
If a smaller and smaller percentage of Americans answer surveys, the great danger is that those who do answer are less and less representative of the general public. Statistical tools exist to adjust the data for such biases. It is remarkable that commercial survey organizations have stayed more or less accurate in predicting elections (although commercial pollsters usually devote extraordinary effort to being right about elections and much less to being right about other things). At some point, however, the underlying data become questionable.
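One of the simplest of those statistical adjustments is post-stratification weighting: respondents in under-represented groups get weighted up, those in over-represented groups weighted down, so the sample mirrors known population proportions. Here is a minimal sketch of the idea; the age groups, population shares, and responses are all hypothetical numbers, not real survey data.

```python
# A minimal sketch of post-stratification weighting, one of the
# standard adjustments for nonresponse bias. All figures are made up.

# Known population shares by age group (hypothetical).
population_share = {"18-44": 0.48, "45-64": 0.34, "65+": 0.18}

# A toy sample that over-represents older respondents.
# Each entry is (age_group, answered_yes).
sample = [
    ("18-44", 1), ("18-44", 0),
    ("45-64", 1), ("45-64", 1), ("45-64", 0),
    ("65+", 1), ("65+", 1), ("65+", 1), ("65+", 0), ("65+", 0),
]

n = len(sample)
sample_share = {g: sum(1 for grp, _ in sample if grp == g) / n
                for g in population_share}

# Each respondent's weight: population share / sample share of their group.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

raw_mean = sum(y for _, y in sample) / n
weighted_mean = (sum(weights[g] * y for g, y in sample)
                 / sum(weights[g] for g, _ in sample))

print(round(raw_mean, 3), round(weighted_mean, 3))  # raw 0.6 vs. weighted 0.575
```

Because the over-sampled 65+ group said "yes" at a different rate than the others, the weighted estimate shifts away from the raw one. Real survey organizations use far more elaborate versions of this (raking across many variables at once), but the principle is the same, and so is the limitation: weighting only helps if the people who do respond resemble the non-responders within each group.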
Other difficulties in surveys have become apparent to scholars of survey methods. We know, for example, that who the interviewer is can shift some respondents’ answers. So can the way questions are worded and the sequence in which they are asked. We also know that there are topics people exaggerate about – say, whether they voted, or attended church this week – and other topics they play down – say, drug use, or prejudice. (In an earlier post, I discussed some of the complications that arise in particular when comparing survey answers across cultures.) Researchers have ways of dealing with some of these difficulties so as to answer some research questions accurately, but these problems can still introduce distortions.
Absent Surveys, What?
America depends critically on surveys. We get our most important economic information from surveys – the unemployment rate, job creation rate, consumer confidence, and so on. We track critical social problems with surveys – smoking among teens, the “dark figure” of crime (crimes not reported to the police, which includes most rapes), hunger among children, and so on. Businesses depend heavily on polls to forecast demand. We conduct much of our democracy via surveys, as politicians tack with the winds of public opinion.
What would we do without surveys?
We might try to discern public opinion through letters to the editor, Facebook “likes,” calls to congressional offices, tweet vocabulary, street demonstrations – or the most common way we do it, assuming that what we and our friends think is typical. We might track Americans’ health by hospital admissions, death rates, or consumption of Lipitor. We could estimate changes in poverty by counting beggars on the street or malnourished kids at school. We could try to figure out the “dark” crime number by… I don’t know.
Survey professionals are working hard to restructure and save this vital method for measuring the American condition. It is an important task to protect an important tool.
(Cross-posted at The Berkeley Blog on September 13, 2012.)