In 40 years of teaching economics, I have always followed my rule of not commenting on my preferences about politicians, even when asked. Well, almost always. There was one exception.

In a lecture on numeracy, which is basically literacy with numbers, I told my students that my favorite candidate for the Republican nomination for president was Indiana governor Mitch Daniels. The reason: as a former director of the federal government’s Office of Management and Budget, Daniels knew the difference between a million and a billion.

The difference, of course, is huge. A billion is a thousand times a million; put another way, a billion is 999 million more than a million. Daniels could not have been even a semi-effective head of OMB had he not applied that distinction dozens of times a day. Yet we see politicians regularly demonstrate their innumeracy in matters of public policy. And the results are costly. A little attention to numeracy by politicians, the media, and average citizens would elevate public discussion and result in better decisions on policy, whether the issue is government budgets, terrorism, or job safety. As a bonus, applying some basic numeracy to our own private lives would help us make better decisions.

The two most important ways people are innumerate are in confusing one large number with another large number and in confusing one small number with another small number.

Consider, first, large numbers, starting with an example that has no political overtones. In his excellent book Innumeracy: Mathematical Illiteracy and Its Consequences, John Allen Paulos, a mathematics professor at Temple University, tells the following true story. A blurb on the box containing a Rubik’s Cube states that there are more than three billion possible combinations. Paulos notes that the actual number is about 4.3 × 10^19. Is 4.3 × 10^19 greater than 3 billion? Yes; the blurb is correct. But it’s so much greater than 3 billion (that is, 3 × 10^9, with 9 in the exponent rather than 19) that the blurb is uninformative. As Paulos writes, it’s like having a sign at the entrance to the Lincoln Tunnel stating “New York, population more than 6.”
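
To see just how uninformative the blurb is, here is a one-line back-of-the-envelope check in Python, a rough sketch using Paulos’s figure for the true count:

```python
# Compare the blurb's "more than three billion" with Paulos's figure for the
# actual number of Rubik's Cube arrangements (about 4.3 x 10^19).
blurb_claim = 3e9    # "more than three billion"
actual = 4.3e19      # Paulos's figure

ratio = actual / blurb_claim
print(f"The actual count is roughly {ratio:,.0f} times the blurb's figure.")
# Roughly 14 billion times larger, which is why "more than three billion"
# conveys almost nothing.
```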

The Rubik’s Cube blurb is just a “toy” example, right? Well, sure, but think about similar confusions about big numbers in the real world of government budgets. Many congressional Republicans say they’re worried about annual federal budget deficits that range between $500 billion and $1 trillion, and they say they want to cut spending. Yet look at their talking points and it’s rare to find them advocating a cut of even $1 billion in any particular program. Instead, they will castigate this or that federal agency for much smaller expenditures. Former Senator Jeff Flake of Arizona, for example, criticized the National Science Foundation for “spending millions of dollars to determine why yawning is contagious, if drunk birds slur when they sing, and if cocaine makes honey bees dance.” He was probably right to make that criticism. Those grants do seem wasteful. But they are rounding errors on $1 billion, let alone $1 trillion.

By contrast, consider what would happen if instead of increasing Social Security benefits by 1.6 percent for 2020, the federal government had increased them by “only” 1.5 percent. Social Security expenditures for Fiscal Year 2020 are expected to be $1.102 trillion. Cutting that $1.102 trillion by 0.1 percent would yield budget savings of $1.102 billion. That’s still a small number in a $4.747 trillion budget. But it swamps the effect of cutting expenditures on studying yawns, drunk birds, and waltzing bees.   
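
A quick back-of-the-envelope check of that arithmetic, in Python. This is a simplification: it treats the 0.1-percentage-point difference as applying directly to the full $1.102 trillion in outlays, even though the cost-of-living adjustment formally applies to benefits before the increase.

```python
# Rough savings from a 1.5 percent cost-of-living adjustment instead of 1.6 percent.
outlays = 1.102e12        # projected FY2020 Social Security outlays, in dollars
total_budget = 4.747e12   # projected FY2020 federal budget, in dollars

savings = outlays * 0.001  # 0.1 percentage point, applied (approximately) to outlays
print(f"Savings: ${savings / 1e9:.3f} billion")                # about $1.102 billion
print(f"Share of total budget: {savings / total_budget:.4%}")  # about 0.02 percent
# Small relative to a $4.747 trillion budget, but it dwarfs the million-dollar
# grants that draw most of the criticism.
```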

The other main way people are innumerate is in confusing one small number with another small number even though the two small numbers may differ by orders of magnitude. We see this especially in discussions of risk.

In one of the most important articles of the early 2000s, aptly titled “A False Sense of Insecurity,” John Mueller, a political science professor at Ohio State University, pointed out just how low the U.S. fatality rate from terrorism is. He wrote:

 

Until 2001, far fewer Americans were killed in any grouping of years by all forms of international terrorism than were killed by lightning, and almost none of those terrorist deaths occurred within the United States itself. Even with the September 11 attacks included in the count, the number of Americans killed by international terrorism since the late 1960s (which is when the State Department began counting) is about the same as the number of Americans killed over the same period by lightning, accident-causing deer, or severe allergic reaction to peanuts.

I think it would come as a surprise to most Americans that the risk of being killed by terrorism is about the same as the risk of being killed by accident-causing deer.

Now you might say, as even some of my friends have, “Wait a minute; the risk of terrorism since 9/11 would be much higher than before 9/11 if it hadn’t been for the Transportation Security Administration and other anti-terrorism measures.” That may be true, although I’m skeptical. Consider that the only attempted airplane attacks we know of that were stopped were foiled by fellow passengers: the 2001 shoe bombing attempt by Richard Reid and the 2009 Christmas Day attempt by the “underpants bomber.” In both cases, the terrorists managed to slip through European security screeners who used methods similar to those of the TSA.

Is it possible that the TSA has stopped terrorist attacks that we haven’t heard about? It is, although given the fact that it's a government agency trying to preserve its budget, the odds are high that if the TSA had stopped a credible terrorist attack, it would be bragging about it publicly.

Moreover, what if the threat from terrorism, absent government measures to thwart the threat, were ten times its current level? Then the threat would still be very low.

Why, then, do many of us think the threat is so high? Because politicians and the media, among others, have so hyped the threat.

In their entry in The Concise Encyclopedia of Economics, “Risk and Safety,” Aaron Wildavsky, a political scientist at the University of California, Berkeley, and Adam Wildavsky, a senior software engineer with Google, have an interesting table showing fatality risks from various activities. Their measure is the number of fatalities per 100,000 people at risk. Not surprisingly, the probability of dying from any of the activities listed is under 1.0; indeed, in every case it is well under 1.0. But what is immediately striking from their table is the huge difference between numbers that are all low-probability. Consider the top end, Chicago crack dealers, who at the time faced a risk of 7,000 fatalities per 100,000 people at risk. (Obviously, there were never 100,000 crack dealers, just as there have never been 100,000 U.S. presidents; the authors “norm” the data, converting everything to deaths per 100,000 people at risk for purposes of comparison.) That fatality rate translates to a probability of death of 0.07.

Now compare that to the fatality rate from a job in manufacturing, which has 3 fatalities per 100,000 people at risk. That translates to a probability of death of 0.00003, which is well below the 0.07 for crack dealers.

Their table is informative in so many ways. We often hear, for example, that policemen put their lives at risk to protect us. It’s true, but that doesn’t tell us much. In the United States, how many policemen are killed per 100,000 at risk? It turns out to be 20. That’s almost 7 times as risky as working in manufacturing, but note that it’s below the 28 fatalities per 100,000 farmers at risk. Yet how often do we hear that farmers put their lives at risk to feed us?
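
The conversion behind those comparisons is simple: divide each “fatalities per 100,000 people at risk” figure by 100,000 to get a probability. A minimal sketch in Python, using the four rates cited above (the labels are my shorthand for the Wildavskys’ categories):

```python
# Convert fatality rates per 100,000 people at risk into probabilities of death,
# then compare them.
rates_per_100k = {
    "Chicago crack dealer": 7_000,
    "farmer": 28,
    "police officer": 20,
    "manufacturing worker": 3,
}

probabilities = {job: rate / 100_000 for job, rate in rates_per_100k.items()}
for job, p in probabilities.items():
    print(f"{job:22s} {p:.5f}")

# All of these probabilities are "small" (well under 1.0), yet they differ enormously.
print(probabilities["Chicago crack dealer"] / probabilities["manufacturing worker"])  # about 2,333
print(probabilities["police officer"] / probabilities["manufacturing worker"])        # about 6.7
```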

The numerate analysis of risk can also help us think through the “precautionary principle.” The city government of San Francisco implemented a version in 2003 that states:

 

Where threats of serious or irreversible damage to people or nature exist, lack of full scientific certainty about cause and effect shall not be viewed as sufficient reason for the City to postpone measures to prevent the degradation of the environment or protect the health of its citizens.

Imagine if that principle had been applied before cars were introduced. For the first few decades, the fatality rate from being in a car was very high. In 1923, the number of Americans killed per 100 million vehicle miles was 17 times its current level. And there’s nothing less healthy than death. Had the precautionary principle been in place, it’s conceivable that the government would have banned cars.

In case you think that the precautionary principle is the preserve of the left, consider a statement that Dick Cheney made while U.S. vice president:

 

If there's a 1% chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It's not about our analysis ... It's about our response.

Of course, there is a potentially catastrophic effect if al-Qaeda not only obtains, but also uses, a nuclear weapon. But even if al-Qaeda got a weapon, that wouldn’t mean it would use it. And even if an organization that got a weapon were sure to use it, a 1% probability of a catastrophic effect is not the same as a 100% probability. Treating a 1% chance as if it’s a 100% chance is innumerate nonsense.
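
To see the arithmetic behind that last point, here is a deliberately artificial expected-cost sketch in Python. The catastrophe “cost” below is a placeholder chosen purely for illustration, not a figure from the article or any source:

```python
# Treating a 1% chance as a certainty overstates the expected harm 100-fold.
catastrophe_cost = 1_000_000  # hypothetical cost units if the event occurs (placeholder)

p_actual = 0.01    # a 1% chance
p_assumed = 1.00   # treating that chance "as a certainty"

expected_harm_actual = p_actual * catastrophe_cost    # 10,000
expected_harm_assumed = p_assumed * catastrophe_cost  # 1,000,000

print(expected_harm_assumed / expected_harm_actual)   # 100.0
# Acting as if p = 1 when p = 0.01 can justify responses that cost far more
# than the risk itself warrants.
```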

Sometimes, learning about a public policy changes nothing for you, because few of us have the power to change policy in the short run. Learning numeracy is different: it really can make us better off in our own lives.

Here are two examples. The first is from Paulos’s book Innumeracy. In 1985, there were some terrorist attacks in the Mideast in which 17 Americans were killed. As a result, hundreds of thousands of Americans cancelled trips abroad. Let’s consider whether that made sense. In 1985, 28 million Americans traveled abroad. So a rough estimate is that an American traveling abroad that year had a 17 in 28 million probability of being killed by a terrorist. That’s about 1 in 1,650,000. Is that large or small? To see, let’s compare it with something. That same year, Americans had a 1 in 5,300 chance of dying in a car crash. To make the appropriate comparison, we need to know how long the average trip abroad was. Let’s say it’s two weeks. So the chance of dying in a car crash in the same amount of time, two weeks, was 1 in (5,300 × 26), since there are 26 two-week periods in a year. That’s 1 in 137,800. So the probability of dying in a car crash at home was approximately 12 times as high as the probability of being killed by a terrorist abroad.
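
Paulos’s comparison is easy to check with a few lines of arithmetic. A minimal sketch in Python, using the figures in the paragraph above and the same two-week-trip assumption:

```python
# 1985: risk of terrorism on a trip abroad versus risk of a car crash at home.
terror_deaths = 17
travelers_abroad = 28_000_000

p_terror_trip = terror_deaths / travelers_abroad   # about 1 in 1,650,000
p_crash_year = 1 / 5_300                           # annual car-crash fatality risk
p_crash_two_weeks = p_crash_year / 26              # about 1 in 137,800

print(f"Terrorism, one trip abroad:    1 in {1 / p_terror_trip:,.0f}")      # 1 in 1,647,059
print(f"Car crash, two weeks at home:  1 in {1 / p_crash_two_weeks:,.0f}")  # 1 in 137,800
print(f"Ratio: {p_crash_two_weeks / p_terror_trip:.1f}")                    # about 12
```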

That’s probably not totally accurate. Maybe the probability of being killed by a terrorist was elevated and so the 1 in 1,650,000 is understated. But would it be understated by an order of magnitude? That’s hard to believe. Adding to the irony is that probably a number of people who cancelled trips abroad because of fears of terrorism went on driving vacations at home. A little numeracy might have reassured people that they should continue with their travel plans.

The second example of applied numeracy is from a segment on CBS’s 60 Minutes that my wife and I watched recently. The reporter, Bill Whitaker, noted that there were increased shark sightings near the shore of Cape Cod and that this was causing many people to avoid swimming. To Whitaker’s credit, he also noted how rare shark attacks on humans are. In a follow-up article online, CBS’s Brit McCandless Farmer pointed out that in 2018 there were 66 shark attacks on humans worldwide and that shark attacks lead to about 6 fatalities annually. The bottom line: go ahead and swim. Life hasn’t turned into “Jaws.”

And if we became more numerate, who knows, we might be unfazed by tiny-probability events. John Mueller, in the article referenced above, quotes risk analyst David Banks’s relevant statement: “It seems impossible that the United States will ever again experience takeovers of commercial flights that are then turned into weapons — no pilot will relinquish control, and passengers will fight.” Yet the lesson of United Flight #93 has yet to be learned. Instead, governments have cost us billions of tax dollars annually, millions of hours in line, some of our privacy, and some of our freedom.

Although I didn’t agree with many of the positions taken by the late Senator John McCain, I’ll end with one he got right. Shortly after 9/11, McCain stated:

 

Get on the damn elevator! Fly on the damn plane! Calculate the odds of being harmed by a terrorist! It’s still about as likely as being swept out to sea by a tidal wave. Suck it up, for crying out loud. You’re almost certainly going to be okay. And in the unlikely event you’re not, do you really want to spend your last days cowering behind plastic sheets and duct tape? That’s not a life worth living, is it?

 

[Editor's note: This piece has been edited after publication to correct a factual error.]

 

 

 

 
