Those of us who teach undergraduates often encounter a woeful complaint when we’ve presented them with conflicting factual claims, explanations, or theories: “But which one is true?”
In part, this cry may reflect students’ urge to know what answer to give on the exam. But I think it mostly reflects a general need for certainty. The teacher may want them to compare and contrast assertions, to appreciate that the science on many topics is still in flux, to understand that on many questions there may be no “right answer” – but accepting that level of ambiguity is a hard task for many students, and for people in general.
Living with Maybe
The scholar is trained to live with ambiguity, with probabilities rather than certainties. At any given time, various claims may be about as likely as others – even historical claims. We might be pretty sure, for example, that U.S. Grant’s brutally aggressive tactics turned the tide on the Potomac front of the Civil War. But we are not certain; historical discoveries or rethinking yet to come might revise the conventional wisdom. (Indeed, recent research has revised the estimate of Civil War dead upward by about 20% from the number that had been assumed for generations.)
Even more so with sociological sorts of claims. Does having more money make people happier? Probably, although perhaps not for people who start off with lots of money. Does social diversity in a community undermine attaining the “common good”? Probably, but likely only in a complicated way. Or maybe both claims will, when the dust of all the research settles, turn out to be wrong.
While most scholars can live with the notion that one statement is, for the time being, say, 70% likely and that another is a 50:50 proposition, most normal people find that difficult. Either things are or they aren’t. Sometimes, even statements that are not actually contradictions make people antsy for certainty. For example: Racism in America has substantially decreased in the last few generations. Racism persists in America. Both claims are (very probably) true and they are logically consistent, but many listeners feel that both cannot be true at the same time and insist on one or the other.
Ironically, people seem to have an easier time totally reversing their beliefs — going from being certain that X is true to being certain that not-X is true — than accepting ambiguity. This tendency of many people to blithely reverse their views drives opinion researchers batty. Ask survey respondents a question twice about, say, whether the president is doing a good job, or whether non-Christians can go to heaven, and notable proportions of people will give different answers at different times or in a different context. For many people, inconsistency is OK, but ambiguity — carrying around partial, maybe, sort-of, could-be notions of the world — is intolerable.
Much of our civic and social discussion is dominated by the voices of people who are absolutely certain. The speakers brook no thought that their claims are provisional, that future evidence or future reflection might overturn them. Those who accept more ambiguity are at a disadvantage. Once these uncertain folks grant that their opponents just could be — perhaps in certain cases, perhaps partially — right, they have lost the initiative to the certain-truth warriors.
Some disappointed commentators on the Left diagnose Obama as too professorial, too willing to treat political debate like a graduate seminar rather than the trench warfare it is. There is some social psychological evidence suggesting that people on the Right are more fixed, more certain, and more intolerant of ambiguity than are people on the Left. But that is just a tendency. Whether the debate is about fixing the economy, messy wars, or religious faith, there are people on both sides who cannot tolerate ambiguity and cannot admit to uncertainty. And so the volume of the arguments just increases.
As for us professor types: Why can’t the world just be one big graduate seminar?