The typical American worker is doing better economically today than at any other time in history. That's probably not what you've read elsewhere. But it's true.

If you take the government's data at face value, real wages appear to have fallen since 1973. In 1973, the average real wage for production and nonsupervisory workers in the private sector was about $14.30 in 1997 dollars. The average real wage for the same category of workers today, by contrast, is $12.34, a 14 percent decline. But these data have two big shortcomings. The effect of both is to understate current real wages.
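(The arithmetic: $12.34 divided by $14.30 is about 0.86, so the measured real wage appears to have fallen by roughly 14 percent.)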

Statistical Shortcomings

First, over the last twenty-four years, an increasing portion of workers' pay has taken the form of benefits--pensions, health insurance, and so on--none of which is counted in hourly wages. The Department of Labor's Bureau of Labor Statistics (BLS) reports overall compensation only for all employees, not separately for production and nonsupervisory workers, but the data are still illuminating. Since 1980, real benefits, valued at the employer's cost, have risen by 21 percent. Average real employee compensation, including benefits valued at cost, has risen by 5 percent.

The second problem with the standard data on real wages is that the consumer price index (CPI), used to adjust for inflation, overstates inflation. Economists who have studied the issue, including the five economists on the Senate Finance Committee's Advisory Commission to Study the Consumer Price Index, believe that the CPI overstates inflation by more than 1 percentage point a year.

There are three main reasons. First, the CPI does not adjust for the fact that people buy more of goods whose price has fallen and fewer of those whose price has risen. Second, it fails to adjust for quality improvements--that calculator you bought last year is many times better than the best one you could have bought fifteen years ago, but the BLS assumes zero quality increase. Third, the CPI fails to capture the "Wal-Mart phenomenon," that is, that consumers can now purchase goods at large chains for lower prices than they once paid at their local mom-and-pop stores. These three factors alone, according to a recent study by Northwestern University economist Robert J. Gordon, bias the CPI upward by more than a percentage point a year. Assuming this minimum bias for every year since 1973, real hourly wages have actually increased by about 9.5 percent since 1973, and real employee compensation has increased by about 25 percent since 1980.
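A back-of-the-envelope check, assuming the bias is exactly one percentage point in every year: compounded over the twenty-four years from 1973 to 1997, the overstatement amounts to a factor of about 1.27 (1.01 raised to the 24th power). Multiply the measured wage ratio (12.34 divided by 14.30, or about 0.86) by 1.27 and you get just under 1.10, an increase of roughly 9.5 percent; multiply the measured 5 percent compensation gain since 1980 by the corresponding seventeen-year factor of about 1.18 and you get roughly 1.24, an increase of about 25 percent.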

Now it's true that fringe benefits should not be valued at employer cost. They are typically worth less. The employer's portion of Social Security taxes, for example, is mandated by the federal government and is less valuable to employees than the cash that they could have spent or invested in stocks and bonds. And the benefits that are not mandated, such as health insurance, are probably worth less than their cost but are provided because they are a form of tax-free income. Therefore, the picture I painted of rising real compensation is slightly rosier than the reality. But let's put the blame for this where it lies: on federal and state governments that have increased Social Security taxes and other mandates.

Despite meddlesome government, the lot of the vast majority of workers has improved. Not convinced? Then go beyond the consumer price index, whose main purpose, after all, is to measure the change in the cost of a given standard of living, and look at our standard of living directly. First, look at your house. According to Peter Salins, a professor of urban affairs at New York's Hunter College, in 1960 the average U.S. dwelling contained 1,200 square feet and 3.1 residents; by 1987, the average dwelling was 1,600 square feet and had only 2.4 occupants. Next, look inside the house. Almost all houses, at every income level, have color TVs, dishwashers, microwave ovens, and similar conveniences that were once thought of as luxuries available only to a few. VCRs, which have revolutionized home entertainment and given families with young children a chance to watch movies without the expense of a baby-sitter, were almost unheard of in 1973. By 1994, 60 percent of U.S. households officially classified as poor had VCRs. The information technology revolution has also slashed the cost of telephone calls. Now even families of modest means can keep in touch with relatives on the other side of the country and even, sometimes, with relatives in other countries.

The Skills Gap

It is possible that a small number of unskilled workers are worse off now than their counterparts of twenty-five years ago. But the reason is that they have fewer skills than their counterparts back then. A recent study in the Review of Economics and Statistics, by two economists from Harvard and one from MIT, concludes that "a high school senior's mastery of skills taught in American schools no later than the eighth grade is an increasingly important determinant of subsequent wages [italics theirs]." It finds that those who graduated from high school in 1980 were noticeably less skilled than their class-of-1972 counterparts. What are these skills? We're not talking rocket science or even calculus, but simple computation with decimals, fractions, and percents and the ability to recognize geometric figures. So there you have it. The lack of skills that we've heard so much about turns out to be an absence of the basics of arithmetic.

More government spending on schools is not the solution. Government schools are the problem. What are we to think of a president of the United States proudly stating his ambition for every student to know how to read by the end of third grade? "Only about half of the nation's high school seniors have mastered" the eighth-grade skills, the study's authors note. When a firm has only a 50 percent success rate on the basics, shouldn't the customers consider going elsewhere?