Tornadoes and floods hit Oklahoma . . . . 16 banks in Florida close . . . Millions face extended unemployment . . . Search on for food contamination that felled dozens . . . Workers severely injured when roof collapses . . . Venture capital firm files for bankruptcy . . . .
Q: What do all these headlines have in common, besides being tales of woe? A: That the people injured physically or economically by these woes are helped, often made financially whole. Better yet, many more people who could have been in the same scrapes – killed, injured, broke, homeless, or just recently, endangered by salmonella in eggs – are protected from those threats. By whom? By your insurer of last resort: Uncle Sam.
Evolution of Risk Control
In an important 2002 book, When All Else Fails: Government as the Ultimate Risk Manager, Harvard Business School Professor David A. Moss describes the government’s role in “reducing or reallocating risk.” Risk is reallocated when a law shifts risk from one party to another. For example, product liability laws make, say, a manufacturer of ladders responsible for paying damages if its ladders keep collapsing, so that at least the financial risk — if, alas, not the physical one — of falling off ladders shifts from the users to the ladder-makers.
The major risk reallocations, however, entail diffusing individual risks across many people. We don’t know who will, for example, end up injured on the job, but a workers’ disability program spreads out the risk: We all pay a bit into a pooled fund, so that whoever is injured is largely protected financially.
The government reduces risk by, for example, establishing minimum requirements in electrical work, by conducting research on epidemics, and by forecasting the weather.
Moss tells the history of American governments’ increasing role as “risk manager.” In the earliest era, government mainly acted to protect businesses from risk, shifting their hazards and costs to the wider public. The laws that established limited liability corporations protected investors from major losses. All they could lose if the business flopped was the money they had invested, not their personal property; the creditors (including unpaid workers) just had to eat their losses. Nineteenth-century legislation and court decisions – e.g., systems for enforcing contracts, to create a durable credit system – reduced the risks of starting and conducting business.
In the 20th century, Moss explains, legislators and courts moved on to protecting American individuals from dangerously high risks. We are all familiar with unemployment insurance, old-age insurance (Social Security), and Medicare insurance. They take the risks of being unemployed, impoverished in old age, or severely ill in old age and spread their cost out among all the employed via payroll taxes. But the government’s activities in reducing and reallocating risk go much deeper and farther.
One example Moss could have used is milk. Around 1900, the buyers of milk – or, rather, their infants – carried the risk of adulterated milk; caveat emptor. Such adulteration (watering the milk down, usually with contaminated water) killed many infants. Eventually, dairies were forced to submit to inspections and to serious penalties if their milk was bad; the risk had been shifted from the consumer to the producer. Today’s food inspectors and EPA activities fit into that tradition.
Many examples of risk-shifting could be taken from agricultural America – crop subsidies, for example. Farmers always ran huge risks, not only from weather and pests, but also from wild swings in crop prices. Today, Uncle Sam buys up overproduced crops and pays to have some land lie fallow. “Farm income stabilization” alone costs about $20 billion a year. Each of us as taxpayers has taken on a share of the farmers’ risks.
Note also the ways that government backstops the private insurance system – by regulating insurance companies, which reassures and then encourages their customers; by requiring that people buy insurance (for example, automobile insurance and mortgage insurance); and by subsidizing insurance costs, as in the tax deductibility of health plans.
And then there is the entire range of government rescue programs, especially disaster relief. Moss provides a telling comparison. In 1927, Mississippi Valley floods cost several hundred lives and what would today be over $4 billion in losses (see picture at top of this post). President Coolidge had the federal government save stranded people and provide tents; that was just about all. The Feds covered only about 3% of people’s and businesses’ losses. Everything else was left to the Red Cross, to charity, and to the victims themselves to cope with. Similar floods in 1993 cost fewer than 40 lives but about $14 billion financially (see picture to the right). In the end, the Feds, with widespread public support, covered about half of those losses. Floods, tornadoes, hurricanes, earthquakes, whatever: we expect the federal government to anticipate needs, to protect people, to pick up the pieces, and to compensate – to shift the risks.
Some people complain that all this risk abatement by Uncle Sam is a bad idea. It creates a “nanny state”; it tries to remove the normal hazards of life, which enervates citizens – and in any case the effort can never fully succeed. Kentucky senatorial candidate Rand Paul, for example, suggested that mine safety regulation was an unnecessary economic burden; if a mine was dangerous, workers would simply learn not to work there, and such dangerous mines would have to close.
There are several responses to this philosophical critique. One side comment is to note that the critics do not usually include in their condemnations those risk abatements that are targeted at business – limited liability for corporations; a court system supported by taxes to ensure that contracts are fulfilled; taxpayer-supported sheriffs enforcing home foreclosures; and so on. It is usually a critique aimed only at the unfortunate.
More substantively, retaining risk and insecurity may seem a way to spur hard work, investment, innovation, and so on. And in some ways it does. But, in the larger picture, psychology and history show that insecurity undercuts such initiative and enterprise. (Chapter 2 of Made in America addresses this topic.) When people are insecure for themselves and their families, they’ll do the safe and sure more often than they will take a gamble; they’ll hunker down because the sky could fall tomorrow. But when risks are limited for individuals (as well as businesses), they can more confidently strike out in new directions.
Also, the costs of a truly high-risk society are probably higher than most Americans would tolerate: It would mean, as it once did, many people going to debtors’ prison or the poor house; workers having to move their families almost as soon as they lost their jobs or their crops failed; people having to cope with injury, illness, old age, disaster on their own or with only their families to help – and many dying from taking those risks.
One sometimes hears or reads that we could handle risks and misfortune as we “used to”: by neighbor helping neighbor. That is mythology. To be sure, community aid often did and does occur. Historically, however, even where neighbors had the most heartfelt intentions toward the needy (which was often not the case — if the needy were strangers, newcomers, or disrespected), charity often fell far short of individual need.
(To get a flavor of the world as it was, consider the poem that Harper’s Weekly published on its cover in 1871: “For I’m old and I’m helpless and feeble / The days of my youth have gone by / Then over the hill to the poor house / I wander lone there to die.”)
And neighborly charity certainly couldn’t help much with large-scale disaster. For example, it did not help the thousands of miners over several generations who died in mine accidents and of black lung disease before mine safety regulation (despite Mr. Paul’s analysis implying that the free market of those days must have eliminated the risks of mining). And that system of mutual aid totally collapsed in the Great Depression; even the people who used to be the givers of help needed help. And so came the New Deal.
It is bemusing that some of the most vociferous critics of the government’s role as insurer of last resort are often the ones who gain from it the most — Medicare recipients or farmers, for example. A couple of studies (here and here) have pointed out how many rural Americans assert their self-reliance and independence of government even as they benefit from the security system the New Deal set up.
Enforcing Risk Reduction
Another line of complaint against the government as the insurer of last resort is that, whatever its merits may be, individuals should not be forced to participate in this system. For example, motorcycle riders should not be forced to wear helmets just because the government says it will reduce their risks. And the medically uninsured should not be forced to buy into an insurance plan under the new health care system.
It is true that a libertarian take on this would let people be suicidal if they wished to be – but only if their actions did not impinge on others. In most cases they do impinge. The injured motorcyclist still wants to get picked up and taken to the emergency room even if he cannot afford to pay the cost of the medical care. (I have thought that perhaps people should be allowed to engage in reckless behavior, like riding without a helmet – if they posted a bond that would cover all the expenses of cleaning up after them.) Moreover, insurance cannot generally work for the great majority if individuals can opt out; “free riders,” so to speak, undermine the system for everyone else.
In any event, I’ll take complaints against government insurance more seriously when I stop hearing other kinds of complaints – complaints that the government did not do enough to keep contaminated lettuce off the market, get my Social Security check to me quickly, cover another new drug under Medicare, build safer dikes in New Orleans, prevent or stop the oil spill in the Gulf, compensate me for crop blight, send cops into my neighborhood, help our company deal with foreign competition, and so on and so forth.
Admit it or not, we expect Uncle Sam to have our back.
Update (Feb. 8, 2011)
A recent paper by Suzanne Mettler reveals the extent to which Americans seem unaware of how much they rely on government programs — in this case on tax breaks and government expenditures. In a large-scale survey, she first asked Americans a general question: whether they had ever used a “government social program.” Later, she asked them specifically whether they had used any of a long list of such programs. She found that large proportions of those who had in fact used a specific program had earlier said they had not used a “social program.” Some of the specific cases were targeted tax cuts: 60% of those who had used the home mortgage interest deduction, for example, had earlier said “no, I have not used a government program.” But perhaps people find it hard to think of those as “programs.” Still, 44% of respondents who collected Social Security payments, 40% of those on Medicare, 43% of those who had gotten Pell (college) grants, and even 25% of those who had gotten food stamps had denied benefiting from a “government social program.” We are so allergic to the notion of government that we find it hard to see it even when it is right before us — in a check or a medical card or a food stamp. And of course, Americans are even more oblivious when the government program is something like food safety or flood protection.