Here are a few facts about what surveyed Americans claim to be facts: In a late 2017 poll, one in five survey respondents claimed that Donald Trump received more popular votes than did Hillary Clinton; about two in five said that the unemployment rate had risen during the Obama years; and about one in three told another poll that Obama was born in Kenya. All wrong. Of course, there is a huge political split on such topics. For example, about one-half of Republicans versus one-seventh of Democrats said that we’ve had a Kenyan president.
There are partisan splits on a range of facts. In 2016, 79 percent of liberal Democrats versus 15 percent of conservative Republicans agreed that “Earth is warming mostly due to human activities.” Not all the misperceptions are on the Republican side. It is well known that many Americans sharply flip their reports about how the economy is doing, or even about how their own finances are doing, when the White House changes party control (e.g., here and here). Both political sides have tended to report crime as rising when it was actually falling.
How should we understand the detachment from reality that so many Americans seem to display when asked questions about facts? What does it say about polls and their value? One thing it says is that many people use polls to send a message.
Polling Realism
Some wrong answers result from the routine “noise” of survey research: respondents don’t understand the question, or click the wrong button by accident, or rush through the interview, and so on. In a few cases, respondents are just messing with the poll or the pollster. Some interviewees, of course, truly believe the false statements. No doubt, many Americans cross-their-hearts believe that Obama was born in Kenya or that millions of illegal aliens voted in 2016. But an interesting fraction of respondents treat polls not as a quiz on which they will be graded but as an opportunity for what survey scholars have termed “expressiveness” and partisan “cheerleading.”
I would broaden this kind of poll responding to include “self-presentation” or, more simply, “sending a message.” That is, there are respondents who treat some factual questions not as chances to show what they know but as chances to tell the interviewer, or data analyst, or reader, or even themselves, something more important than the facts.
Survey researchers have long described one version of this behavior as “social desirability bias.” Respondents will sometimes answer questions in ways that make themselves look better–saying that they voted when they had not, that they give to charity when they do not, that they are racially tolerant when they are not, that they went to church last Sunday when they did not, and so on. (Some may be consciously lying, others fooling themselves–for example, forgetting that they had missed the last vote–and yet others reinterpreting the question–for example, reporting what they usually do on Sunday.) This is a well-known source of error in surveys, and practitioners have tools to deal with it.
Political Messaging
Another version of sending a message, I am arguing, is using a survey question about facts as an opportunity to declare one’s political identity. You can see evidence of this in the finding that more educated and more knowledgeable respondents are as likely as, or even more likely than, less educated and less knowledgeable respondents to give the wrong answer if the wrong answer suits their politics. For example, conservatives who know much about science are more prone than those who know little about science to deny climate change (e.g., here and here). This resembles the finding that people with higher education are likelier than those without to say they voted when they hadn’t, but the “alternative fact” here is in service of their politics rather than their pride. Sending a message through polls, in turn, helps widen the political polarization that encourages people to send messages in the first place.
Liberals also send messages. A rising proportion of Americans answer survey questions about their religious preferences by choosing “none.” I have been convinced that a major portion of that increase is made up not of people whose beliefs have changed or who are especially irreligious, but of liberals and some centrists who are sending the message that they reject the religion of the religious right. “Religion today seems to mean being anti-gay and anti-abortion, so put me down as having no religion!” they seem to be saying (see here and here). And let’s not forget that many on the left question vaccination and GMOs, against the scientific consensus on each.
Experiments
We have more evidence of message-sending in a few recently published experimental studies. In research conducted in 2008 and 2012, John Bullock and colleagues asked survey respondents a long set of factual questions, for example, about casualties in Iraq and inflation. They observed the familiar political party difference in the answers given by subjects in the “control” condition. (The overall differences were not dramatic; many of the questions were not “hot button” items.) In the experimental conditions, they offered respondents a bit of money for answering correctly–$1 per correct answer and 33 cents for admitting that they did not know the answer. In those circumstances, the party gap was much smaller.
In research conducted in 2004 and 2008, Markus Prior and colleagues asked respondents questions about economic facts such as the unemployment rate and the federal debt. As usual, Democrats and Republicans in the control condition saw things differently. However, respondents offered $2 for a correct answer were only half as likely as those in the control condition to give party-“congenial” answers. When subjects were given no incentive to be correct, the more politically knowledgeable among them showed the greatest tendency to misreport; in the condition offering money for facts, that paradoxical result did not appear.
Finally, and most colorfully, Brian Schaffner and Samantha Luks ran a flash survey on the controversy over the relative sizes of Donald Trump’s and Barack Obama’s inauguration crowds. They showed half the online respondents unlabeled pictures of each crowd–the difference in size was blatantly obvious–and asked them which picture had more people. Fifteen percent of Trump voters picked the smaller, Trump crowd–and about 25 percent of the most educated Trump voters did–compared to virtually none of the Clinton voters and non-voters. In the other condition, they asked respondents to indicate which picture was of the Trump inauguration and which was of the Obama inauguration: 40 percent of Trump voters picked the more crowded (Obama) image and claimed it for Trump. The authors doubt that these results are explained by sincere mass delusion. They conclude that many respondents were declaring their political positions rather than actually answering a pretty obvious question.
Implications
How many survey respondents are sending a message rather than telling what they believe to be true? On most mundane topics, probably only a few are. On sensitive issues–politically sensitive matters like evaluating the economy or personally sensitive matters like revealing their level of generosity–probably a notable percentage are. Nonetheless, even in the era of Big Data, we still need surveys. But we need to take these sorts of distortions into account when we read survey results.
In evaluations of the economy, for example, the percentages tell us a lot about respondents’ partisan feelings. We might generally take such survey results with a grain of salt, but if respondents whose party holds the White House start complaining about the economy, it is worth taking notice.
As polling experts have long noted, surveys are in some ways “conversations at random,” even when the “conversations” are with a computer program. And in conversations, we report facts, we give opinions, and we sometimes send a message.