Many of America’s cultural battles in recent decades seem to be face-offs between science and faith: over the teaching of evolution, the reality of climate change, the value of stem cell research, the personhood status of an embryo, and so on. Many on the liberal side of these issues see the controversies as part of a confrontation between ignorance and knowledge. For the more philosophically inclined, it is about a centuries-old tension between Faith and the Enlightenment’s assertion of reasoned observation. (Scientific American writer Michael Shermer’s “Skeptic” column is largely devoted to this theme.) Recent research suggests, however, a more complex structure to these debates and to Americans’ views: Many of those on the religious side are far from scientific naifs; some are scientifically quite knowledgeable. It’s when science directly touches faith that the conflict flares up.
Builders of the American republic in the decades on either side of 1800 grasped and employed new philosophical and ideological tools for its construction. The revolutionary idea of inherent political equality – “all men are created equal” with “inalienable rights” – however limited its reality seems in retrospect, was the Next Big Thing of the day. Also critical were economic analyses originating with Adam Smith and his British colleagues. “Free market” arguments asserted that self-interested actors uncontrolled by authorities combine to create the greatest good for the greatest number.
John Lauritz Larson, in a recent presidential address to the Society for Historians of the Early American Republic, notes a couple of important similarities between the political theories and the economic theories of America’s revolutionary era. Both sets of ideas demanded that the king’s government get off people’s backs, especially by stopping its interference in commerce. And both sets of ideas asserted, based on 18th-century “scientific” analysis, that state rule distorted the God- or Providence- or Nature-given order of things. Men, advocates argued, were naturally equal and self-governing; similarly, markets were naturally productive and self-governing.
Larson goes on to make the point that there was nothing natural about the ascendancy of the naturalistic argument for the laissez-faire philosophy. But we live with its seeming naturalness to this day.
Two scenes from twenty-first-century America.
Young friends cycle to church for Sunday services. They stash their bicycles by the side of the building, walk in sporting their aerodynamic spandex, and take their places in the pews.
Later that day a waitress at a nearby restaurant approaches a silver-haired couple squinting at their menus. “So, what can I get you guys?” she asks the pair.
Across a range of behaviors, from dress to forms of address, Americans have become strikingly informal: we deviate from convention more than we used to, and the conventions we do observe entail less deference to institutions such as churches and statuses such as advanced age.
Read the rest of this column at the Boston Review here.
Many efforts have been made to explain the persisting black-white gap in economic attainment. It is particularly puzzling because there was considerable progress in closing that gap in the decades after World War II. And then the closing slowed down. The mid-1990s seemed to bring more progress for black employment and wages, but the 21st century – especially the Great Recession – has seen retrograde movement. Moreover, as sociologists Becky Pettit and Bruce Western have shown, the standard economic indicators we use, such as average income, underestimate the width of the racial gap because they typically ignore the disproportionately high percentage of black men in prison or effectively out of the labor force.
When the General Social Survey asks respondents to choose an explanation for this persisting gap, about half – white, black, and Hispanic – choose blacks’ lower “chances for education” and lower “motivation or willpower” as factors (although about half of blacks and Hispanics also choose the discrimination explanation). Social scientists have explored more complex analyses. The accounts can be sorted into ones that stress the lasting effects of slavery and Jim Crow – often emphasized in this blog; ones that stress current circumstances like remaining discrimination or the suburbanization of jobs; and ones that stress combinations of the two, such as how lacking family wealth makes it harder for youths to go to college just when college-going has become more important.
In a new paper [gated], University of Michigan sociologist Deirdre Bloome presents a sophisticated analysis that points to contemporary conditions that have stymied the closing of the black-white gap in family income; it points more to the family part than the income part of family income.
One of the major changes in American life about 100-120 years ago was the domestication of public spaces, particularly in our cities, making them places where “respectable” women went shopping and for entertainment. As described in an earlier post, the streets were historically dangerous places for women, even in daytime. Beginning after the Civil War and accelerating around the end of the century, authorities established dependable public order in many areas of the cities, especially in commercial districts, to the point that going downtown rather than avoiding it was what fashionable middle-class women did. (Americans went through another cycle of shunning public spaces in the 1950s to ‘90s and then flocking back to them more recently; see here.)
Two just-published papers reveal yet more of how women claimed urban spaces in the late 19th and early 20th centuries. One describes how middle-class women came to drink in public places, getting a bit of alcoholic relief downtown. Another describes women reformers’ efforts to provide public bathroom relief downtown. That campaign stalled and the search for public facilities continues, as witnessed by smartphone apps for finding toilets downtown. Both accounts fill in the story of how women tamed the city.
Social historians studying the twentieth century have an advantage that specialists in earlier centuries lack. Survey research, which began seriously in the 1930s, allows the former to know what average people reported about their attitudes and actions in ways that no documentary archive can even approximate (here, for example). To track changes over the decades in attitudes and actions accurately, not only should the samples drawn in different eras be comparable, the questions asked should be the same – whether they are about church attendance, political participation, racial views, whatever. As the noted sociologist Otis Dudley Duncan reportedly stated, “If you want to measure change, don’t change the measure” – i.e., the wording of the question.
Wise advice. But there is a problem: Sometimes the words themselves change meaning.
I was sharply reminded of this issue recently when leading a team that was putting together a survey. I had jotted down a phrase to use in a question: “in order to keep things straight.” Graduate students quickly objected: You can’t use straight because of its sexual connotations. I was well aware that the word gay had been transformed. Tom W. Smith, a dean of survey research, noted that the Gallup Poll’s 1954 question, “Which American city do you think has the gayest night life?,” did not mean the same thing just 30 years later. Now, neither does straight.
Survey designers cannot fully rely on fixed meanings. Paradoxically, the pollsters’ craft requires judgments about social change in order to write the questions to measure social change. (For related discussions of how words’ histories can affect psychological testing, see this earlier post and here.)
One issue sparked by the fiery debate around the police shootings of black men is the extent to which Americans simply react negatively to seeing black – whether it is a police officer making a life-and-death split-second decision about the threat a black man poses, a store clerk tracking a black customer in a store more intently than she would a white one, or an online shopper preferring to buy a device shown in a white hand rather than a black hand.
Explicit racial discrimination, often subconscious, is rarer than it once was. And such discrimination does not explain most of the black-white gaps in life circumstances such as lifespan and wealth; those largely grow from historically deeper and convoluted roots, further fed by institutional inequalities. Still, the effects of plain old racial aversion are real – accounting, according to one recent analysis, for perhaps a third of the difference between black and white wages (pdf). And such racism certainly takes an emotional toll.
Two recent publications present yet more systematic evidence that plain old racial aversion persists and matters — despite the belief among many whites, perhaps most, that reverse discrimination is just as big a problem. (An earlier related post is here.)