In promoting the Kyoto Protocol, which would require a major cut in greenhouse gas emissions, the White House claims that “scientists agree that global warming and resulting climate disruptions could seriously harm human health (projections include 50 million more cases of malaria per year)” (http://www.studyweb.com/). President Clinton has asserted: “Disruptive weather events are increasing. Disease-bearing insects are moving to areas that used to be too cold for them. Average temperatures are rising. Glacial formations are receding” (address at the National Geographic Society, October 22, 1997).

In his 1997 exhortation to the environmental ministers at Kyoto, Vice President Al Gore warned of “disease and pests spreading to new areas.” The White House’s home page continues that theme: Americans had better watch out; global warming will make them sick.

The Sierra Club has also weighed in, asserting that “doctors and scientists around the world are becoming increasingly alarmed over global warming’s impact on human health. Abnormal and extreme weather, which scientists have long predicted would be an early effect of global warming, have claimed hundreds of lives across the US in recent years. Our warming climate is also creating the ideal conditions for the spread of infectious disease, putting millions of people at risk” (http://www.sierraclub.org/global-warming/factsheets/health.html).

The Public Interest Research Group, a left-leaning environmental organization, fears “Health Threats—Climate change is projected to have wide-spread impacts on human health resulting in significant loss of life. The projected impacts range from increased incidence of illness and death due to heat stress and deteriorating air quality, to the rise in transmission rates of deadly infectious diseases such as malaria, dengue fever, and hanta virus” (http://www.pirg.org/environ/). Other environmentalists and health experts have also forecast that global warming would bring death and disease (Danzig 1995; IPCC 1996a; Jackson 1995; Epstein and Gelbspan 1995; Cromie 1995; Stone 1995; Monastersky 1994; Patz et al. 1996; Kalkstein 1991, 1992; Kalkstein and Davis 1989; Epstein et al. 1998).

This analysis will explore whether Americans do indeed confront a health crisis. If global warming were to occur, would the United States face an epidemic of tropical diseases, with malaria the prime suspect? Would Americans suffer more heatstroke and summers that bring a surge of deaths? Would global warming produce more frequent and more violent hurricanes, wreaking havoc on our citizens? Is it true that warmer climates are less healthy than colder ones? Would cutting greenhouse gas emissions, as the Kyoto Protocol requires, improve the health of Americans? This essay will show that the answer to all those questions is a resounding no.

Not only does my own research demonstrate that the claims of imminent doom are unwarranted, but other studies have found little cause for alarm (WHO 1990; Committee on Science, Engineering, and Public Policy 1991; Taubes 1997; White and Hertz-Picciotto 1995; Shindell and Raso 1997; Cross 1995; Singer 1997; Moore 1998a, 1998b; Murray 1996; Michaels and Balling 2000; Reiter 2000). Knowledgeable organizations, such as the World Health Organization (WHO 1997, 1998, 1999) and the American Medical Association (Council on Scientific Affairs 1996), have ignored the subject, suggesting that, in their eyes, it is unimportant.

After examining the potential impact of global warming on poor countries, the American Council on Science and Health (ACSH) took a realistic view and reported that

Nearly all of the potential adverse health effects of projected climate change are significant, real-life problems that have long persisted under stable climatic conditions. Bolstering efforts to eliminate or alleviate such problems would both decrease the current incidence of premature death and facilitate dealing with the health risks of any climate change that might occur.

Policies that weaken economies tend to weaken public health programs. Thus, it is likely that implementation of such policies would (a) increase the risk of premature death and (b) exacerbate any adverse health effects of future climate change. (Shindell and Raso 1997)

As the ACSH concludes:

From the standpoint of public health, stringently limiting such emissions [greenhouse gases] at present would not be prudent. Fossil-fuel combustion, the main source of human induced greenhouse-gas emissions, is vital to high-yield agriculture and other practices that are fundamental to the well-being of the human population. A significant short-term decline in such actions could have adverse health repercussions.

The optimal approach to dealing with [the] prospect of climate change would (a) include improvement of health infrastructures (especially in developing countries) and (b) exclude any measures that would impair economies and limit public health resources.

The World Health Organization’s World Health Report 1998: Life in the 21st Century gave the globe an A for progress. The WHO showed that remarkable advances have been made in increasing life spans, decreasing disease and suffering, and improving health for virtually all age groups and that the future looks even rosier (see chart 1). To quote the Executive Summary: “As the new millennium approaches, the global population has never had a healthier outlook.” How can this be? After all, the White House tells us the next century promises to be one of rising temperatures, spreading disease, and increasing mortality. Somehow, the WHO didn’t get the word. The World Health Report 1999: Making a Difference again fails to address this problem that the White House believes is so worrisome.

Chart 1

According to the WHO, the only significant and growing threat to human health is HIV/AIDS, a disease that has nothing to do with climate. Indeed, we have made substantial progress in controlling many major infectious diseases. By 1980, for example, smallpox had been eradicated; yaws had virtually disappeared (even the name of this tropical skin disease is unfamiliar except to medical students). As a result of antibiotics and insecticides, the threat of plague has declined; improvements in sanitation and hygiene have made outbreaks of relapsing fever rare. Unbelievably, for those who remember summers of fear and polio insurance, poliomyelitis is scheduled for eradication this year.

A LOOK TO THE FUTURE

Looking to the future, the WHO report identifies three global trends affecting health—none is global warming. One is economic: the WHO reports (1998) on the “unparalleled prosperity” between 1950 and 1973, which resulted in marked improvements in health and life expectancies. The organization identifies the years since 1993 as another era of economic “recovery,” which has once again contributed to reduced mortality. The other trends singled out as having significant health effects are population growth and social developments, particularly urbanization.

Over the last forty years, the growth in the world’s economy has brought about a doubling of the world’s food supply, while the number of human mouths has grown much more slowly. This has led to a decline in the proportion of people who are undernourished. Since 1970, literacy rates have increased by more than 50 percent. Physical well-being has also grown apace. More people have access to clean water, sanitation facilities, and minimum health care than ever before. Like the 1999 review, prior World Health Reports largely ignored global warming as a significant threat to the health and well-being of the globe’s population. And rightly so.

Of the 50 million plus deaths in 1997, about one-third stemmed from infectious and parasitic diseases, most of which have nothing to do with climate. The remaining deaths were from such killers as cancer, circulatory diseases, and perinatal conditions, none of which would be aggravated by a warmer world.

The WHO has identified AIDS, one of the most devastating afflictions, as a growing menace in Africa, but it bears no relationship to temperature or rainfall. Only insect-spread diseases, such as malaria and dengue fever, and diseases like cholera and typhoid that are spread through contaminated water, could be worsened by climate change (and then only if swampy polluted areas were allowed to expand without thought to sanitation, window screens, and other precautions that have all but eradicated those diseases in the northern latitudes).

But bear these statistics in mind: In the developed world, as recently as 1985, infectious and parasitic diseases accounted for 5 percent of all deaths; in 1997, they caused only 1 percent of all deaths. In short, even for such insect-borne diseases as malaria, climate is much less important than affluence. Singapore, located two degrees from the equator, is free of that dreadful malady, while the mosquito-carried scourge is endemic in rural areas of Malaysia, only a few hundred miles away. Singapore’s healthy state stems from good sanitary practices that reduce exposure. The wealth of the island-state allows it to maintain an effective public health program.

Nor should we be overly concerned with the diseases spread by mosquitoes in tropical areas. If climate change were to occur, according to the global warming models, the poles would warm more than the equator while temperatures would increase more in the winter and at night than during the day. In consequence, the tropics, including Africa, would warm less than the United States or Europe. Any increased burden on health in Africa or southern Asia would, therefore, be small.

With or without climate change, public sanitation should be emphasized as the most effective means of attacking water- and insect-borne diseases everywhere. A warmer world will not add significantly to morbidity in Third World countries. A poorer world most certainly will.

Both the scientific community and the medical establishment assert that the frightful forecasts of an upsurge in disease and early mortality stemming from climate change are unfounded, exaggerated, or misleading and do not require reducing greenhouse gas emissions. Science magazine reported that “predictions that global warming will spark epidemics have little basis, say infectious-disease specialists, who argue that public health measures will inevitably outweigh effects of climate” (Taubes 1997). The article added: “Many of the researchers behind the dire predictions concede that the scenarios are speculative.”

Global warming as currently forecast by the Intergovernmental Panel on Climate Change (IPCC) would not bring tropical diseases to Americans or shorten their lives or inflict more violent storms bringing death and destruction on the United States. Moreover, the warmer climate predicted for the next century is unlikely to induce a rise in heat-related deaths. As the article in Science magazine points out, “people adapt. . . . One doesn’t see large numbers of cases of heat stroke in New Orleans or Phoenix, even though they are much warmer than Chicago.”

TROPICAL DISEASES

Concern about tropical and insect-spread diseases is overblown. Inhabitants of Singapore, which lies almost on the equator, and of Hong Kong and Hawaii, which are also in the tropics, enjoy life spans as long as or longer than those of people living in Western Europe, Japan, and North America. Both Singapore and Hong Kong are free of malaria, but that mosquito-spread disease ravages nearby regions. Modern sanitation in advanced countries prevents the spread of many scourges found in hot climates. Such low-tech and relatively cheap devices as window screens can slow the spread of insect vectors. The World Health Organization (WHO 1990, 21) notes:

until recent times, endemic malaria was widespread in Europe and parts of North America and . . . yellow fever occasionally caused epidemics in Portugal, Spain and the USA. Stringent control measures . . . and certain changes in life-style following economic progress, have led to the eradication of malaria and yellow fever in these areas.

Under the stimulus of a warmer climate, insect-spread diseases might or might not increase. Many of the hosts or the insects themselves flourish only within a relatively narrow temperature and humidity range. Plague, for example, spreads when the temperature is between 66° and 79° with relatively high humidity but decreases during periods of high rainfall (White and Hertz-Picciotto 1995, 7.7.3). Higher temperatures and more rainfall are conducive to an increase in encephalitis. Malaria-bearing mosquitoes flourish under humid conditions with temperatures above 61° and below 95°; relative humidity below 25 percent causes the mosquitoes either to die or to become dormant.

Infectious diseases, such as AIDS, Lyme disease, yellow fever, malaria, and cholera, can usually be controlled through technology, good sanitary practices, and education of the public. Even without warming, it is certainly possible that dengue fever or malaria could invade North America. Unfortunately, some of the government’s well-meaning environmental policies may make the spread of these vectors more likely. The preservation of wetlands, although useful in conserving species diversity, also provides prime breeding grounds for mosquitoes that can carry these diseases. If the United States does in the future suffer from such insect-borne scourges, the infestation may have less to do with global warming than with the restoration of swampy areas.

Cholera

In 1996, diarrhoeal diseases, such as cholera and dysentery, killed 2.5 million people out of the 52 million who died worldwide (WHO 1997). Through the provision of fresh water and proper sanitation, those diseases are easily preventable. Although a warmer climate might increase the incidence of cholera and similar diseases in unprotected areas, chlorination and filtration could halt their spread.

A manifestation of fear mongering about the health effects of global warming is an article in Science (Colwell 1996) taken from a modified text of Rita Colwell’s 1996 presidential address to the annual meeting of the American Association for the Advancement of Science (AAAS). That address presents a careful analysis of cholera and its recent resurgence in the Americas. What is most singular is not what Colwell says but what she fails to mention.

Despite the title of the address, “Global Climate and Infectious Disease: The Cholera Paradigm,” climate change is scarcely broached; the one reference to it comes in connection with malaria, not cholera. Certainly Colwell makes no effort to tie global warming to the spread of cholera. Furthermore, in a section entitled “Global Climate, Global Change, and Human Health,” the words climate, warmer, temperature, and global do not appear in the text. Also puzzling for such a careful exposition is the absence of any reference to the role that the U.S. Environmental Protection Agency may have played in creating the conditions that led to the explosion of cholera in Peru in 1991. But more on that later.

First a few dry facts about cholera, an infectious disease caused by the bacterium Vibrio cholerae, which can bring on diarrhea, vomiting, and leg cramps. Without treatment, a person can rapidly lose body fluids, become dehydrated, and go into shock. Death can come quickly. Treatment is simple: the replacement of the fluids and salts with an oral rehydration solution of sugar and salts mixed with water. Less than 1 percent of those who contract cholera and are treated for it die.

Cholera cannot be caught from others but comes from ingesting food or water containing the bacterium. Eating tainted shellfish, raw or undercooked fish, raw vegetables, or unpeeled fruits can lead to infection. Drinking unpurified water can be dangerous as well. The bacterium thrives in brackish warm water but can survive, in a dormant state, in both colder and more saline water. V. cholerae is also associated with zooplankton, shellfish, and fish. It often colonizes copepods, minute marine crustaceans. Ocean currents and tidal movements can sweep the bacterium, riding on copepods, along coasts and up estuaries, where it can remain dormant until conditions are ripe for it to multiply.

In 1817, the British first identified this dreaded disease in Calcutta, whence it spread throughout India, Nepal, and Afghanistan. Ships carried it to other parts of Asia, to Arabia, and to the ports of Africa. It reached Moscow, its first stop in Europe, in 1830, creating panic as locals fled the city. From there it traveled to Poland, Germany, and England. In the decade after it first appeared in Europe, it killed tens of thousands in Paris, London, and Stockholm. It reached North America in 1832, appearing first in New York and Philadelphia, then spreading along the coast to New Orleans. In that same year, the disease killed more than 2,200 people in Quebec. Apparently cholera is not a tropical disease; it can sicken and kill in any climate, although in high latitudes it may do so only in the summer.

Prior to the most recent outbreak, the world suffered six cholera pandemics. By the end of the nineteenth century, however, Europe and North America were free of the disease. The solution was simple: filtration and chlorination of the water supply. Filtering alone reduces the spread not only of cholera but also of typhoid. Combining filtration with chlorination eliminates waterborne diseases. A warmer climate, if it were to occur, would not reduce the effectiveness of these water purification measures.

In January 1991, after many disease-free decades, cholera began sickening villagers in Chancay, Peru, a port less than 40 miles north of Lima. It then spread rapidly up and down the coast. From that outbreak to the end of 1995, Latin America reported more than 1 million cases (many more went unreported) and 11,000 deaths. The illness traveled from Peru to Ecuador, Colombia, then to Brazil. Eight months after appearing in Peru, it reached Bolivia. By the end of 1992, virtually all of South and Central America, from Mexico to Argentina, had confirmed cases. In the early 1990s, cholera also entered the United States; however, with the exception of a few cases brought on by eating raw tainted shellfish, virtually all cases were contracted abroad. Seventy-five cases, nearly half of the 160 reported to the CDC between 1992 and 1994, originated on a single flight from Lima in 1992!

What went wrong to bring an end to Latin America’s 100 years of freedom from cholera? Rita Colwell theorizes that an El Niño* led to a plankton bloom that multiplied the hosts of V. cholerae. But El Niños have been occurring with some regularity for many decades without producing a cholera epidemic. The coast of Peru in 1991 was not even particularly warm compared with a number of other years (see chart 2). Even if El Niño were in part the culprit, the basic cause lies elsewhere. Based on U.S. Environmental Protection Agency studies showing that chlorine might create a slight cancer risk, authorities in Peru decided not to chlorinate their country’s drinking water (Anderson 1991). In all probability, they also were saving money. Chlorination, however, is the single most effective preventive of cholera and other waterborne diseases. After the fiasco in Peru, the EPA determined in 1992 that there was no demonstrable link between chlorinated drinking water and cancer. It was too late; the harm had been done. Peru’s misplaced environmentalism led to more than 300,000 cholera victims in that country alone.

*A warming of the ocean surface off the western coast of South America that occurs every four to twelve years when upwelling of cold, nutrient-rich water does not occur. It causes plankton and fish to die and affects weather over much of the world.

Chart 2

Cholera is a disease of poverty, crowding, and unsanitary conditions. A warmer climate will not carry this disease to affluent countries; in the Third World, however, economic growth can bring freedom from this and many other diseases. We should not impose costs on ourselves or on others that would reduce the resources needed to bring clean water and good sanitation to Latin America, Africa, and Asia.

Malaria and Dengue Fever

A growing chorus has been chanting that global climate change will spread insect-borne diseases such as malaria, dengue fever, and yellow fever to temperate latitudes. In the last few years, the health effects of global warming have been the subject of lengthy journal articles in JAMA (Journal of the American Medical Association) (January 17, 1996) and Lancet (June 8, 1996, and August 31, 1996), an international journal of medical science and practice. In 1996, the Australian Medical Association sponsored a major conference on the subject. Professor Paul Epstein of the Harvard School of Public Health has claimed that in the past few years mosquitoes carrying malaria and dengue fever have been found at higher altitudes in Africa, Asia, and Latin America (Epstein et al. 1998). For North America, David Danzig, in a Sierra Club publication (1995), has contended that only the tip of Florida is currently warm enough to support malaria-carrying mosquitoes but that global warming could make most of us vulnerable. He should check his history.

Malaria and cholera were both major health problems in the United States in the nineteenth century. Prior to the 1950s, malaria was endemic in the southern portions of the United States. Malaria was also widespread in southern Europe until shortly after World War II, when insecticides and good health practices eliminated it. As mentioned above, a number of epidemiologists stated in Science magazine (November 7, 1997) that, in the event of climate change, public health measures in the industrialized countries of the world would prevent the spread of such diseases.

Few now realize that, before the Second World War, malaria was common in the United States. The government recorded more than 120,000 cases in 1934; as late as 1940, the number of new sufferers totaled 78,000 (Centers for Disease Control and the Statistical Abstract of the United States). After the war, reported malaria cases in the United States plunged from 63,000 in 1945 to a little over 2,000 in 1950 to only 522 in 1955. By 1960, DDT had almost totally eliminated the disease; only 72 cases were recorded in the whole country. In 1969 and 1970, the Centers for Disease Control reported a resurgence to around 3,000 cases annually, brought in by service personnel returning from Vietnam. Subsequently, immigrants from tropical areas have spawned small upticks in new cases.

In the 1980s and 1990s, as chart 3 shows, the number of reported cases averaged around 1,200 to 1,300 annually. The CDC reports that since 1985 approximately 1,000 of those cases have been imported every year, with visitors and recent immigrants accounting for about half. The rest come from travelers arriving from tropical countries, service personnel returning from infested areas, and a handful of individuals, typically those living near international airports, bitten by a mosquito that hitched a ride from a poor country. The recent outbreak of West Nile fever on Long Island shows how vulnerable communities that host major international airports can be. More stringent efforts to keep out these unwanted “immigrants” may be called for if the problem worsens.

Chart 3

Yellow and dengue fevers were both widespread in the United States from the seventeenth century onward. Epidemics of yellow fever ravaged New York and killed tens of thousands of people; in 1878 alone, 20,000 people died out of 100,000 reported cases. Between 1827 and 1946, eight major epidemics of dengue fever overran the United States. In 1922, the disease spread from Texas, with half a million cases, through Louisiana, Georgia, and Florida. Savannah suffered 30,000 cases, of which nearly 10,000 involved hemorrhagic symptoms, a very serious form of the disease. In contrast, for 1996 the CDC listed 86 imported cases of dengue and dengue hemorrhagic fever and eight local transmissions, all in Texas. There were no reported cases of yellow fever.

As a public health issue, those diseases, which did plague the United States in the reputedly colder nineteenth and early twentieth centuries, have been largely eliminated. There is no evidence that a resurgence is imminent. Certainly the climate is not keeping the spread of these diseases in check. If it was warm enough in the cold nineteenth century for the mosquitoes to thrive, it is warm enough now!

Is there any basis at all for those scare-mongering prophecies? Is malaria rising worldwide? Not according to the World Health Organization. Over the twentieth century, the number of deaths from malaria has fallen sharply for the world as a whole (see chart 4). Even in sub-Saharan Africa malaria mortality declined until 1970, after which, with the deterioration of the economic situation on that continent, deaths from malaria have risen.

What brought down those scourges? The introduction of DDT clearly played a major role. From the end of World War II until it was banned in 1972, this pesticide worked wonders in eliminating harmful insects, especially mosquitoes. But it wasn’t just insecticides that did the trick. Simple steps, such as screens on windows, the elimination of standing water, and the movement to the suburbs, which reduced population density and thus the risk of transmission, have played a critical role in eliminating mosquito-borne diseases.

Chart 4

In 1995, however, a dengue pandemic afflicted the Caribbean, Central America, and Mexico, generating around 74,000 cases. More than 4,000 Mexicans living in the state of Tamaulipas, which borders Texas, came down with the disease. Yet Americans living a short distance away remained unaffected. The contrast between the twin cities of Reynosa, Mexico, which suffered 2,361 cases, and Hidalgo, Texas, just across the border, is striking. Including the border towns, Texas reported only eight nonimported cases for the whole state.

The only reasonable explanation for the difference between the spread of dengue in Tamaulipas and its absence in Texas is living standards. Where people enjoy good sanitation and public education, have the knowledge and willingness to manage standing water around households, implement programs to control mosquitoes, and employ screens and air-conditioning, these mosquito-borne diseases cannot spread. If the climate does warm, those factors will remain. In short, Americans need not fear an epidemic of tropical diseases.

DEATHS IN WINTER VERSUS SUMMER

Deaths from Cold versus Heat

Recent summers have sizzled. Newspapers have reported the tragic deaths of the poor and the aged on days when the mercury reached torrid levels. Prophets of doom forecast that rising temperatures in the next century portend a future of calamitous mortality. Scenes of men, women, and children collapsing on hot streets haunt our imaginations.

Heat stress does increase mortality, but it typically affects only the old and the infirm, whose lives may be shortened by a few days or perhaps a week. There is no evidence, however, that mortality rates rise significantly. The number of heat stress–related deaths in the United States is very small; deaths due to weather-related cold exceed them. During a recent ten-year period, which includes the very hot summer of 1988, the average number of weather-connected heat deaths was 132, compared with 385 deaths from cold (see chart 5). Even during 1988, more than twice as many Americans died from the cold as from the heat of summer. A somewhat warmer climate would clearly prevent more deaths in the winter than it would add in the summer.

Chart 5

Humans also seem to be able to adapt to hot weather. Adjusting for demographic differences and economic factors, people in cities with hot climates enjoy longer life spans than those in cold areas. A warm climate does not increase mortality. Moreover, the spread of air-conditioning reduces the discomfort of extremely high temperatures.

Let us review the documentation supporting the supposition that human mortality will rise with rising temperatures. Death rates during periods of very hot weather have jumped in certain cities, but above-normal mortality has not been recorded during all hot spells or in all cities. Moreover, research concerned with “killer” heat waves has generally ignored or downplayed the reduction in fatalities that warmer winter months would bring.

In a 1991 paper, Laurence Kalkstein, one of the most respected and careful scholars in this field, finds that deaths are related to the length of the hot spell. He suggests that it takes an extended heat wave to raise the death rate. In a later work, he reports that heat spells early in the summer or quick rises in temperature trigger deaths; in other words, unseasonal or rapid warming produces mortality (Kalkstein 1992). But if rapid warming causes deaths, we should find that most of the mortality during heat spells occurs on the first day or so and that fatalities then taper off, rather than increase with the length of the warm spell. As indicated, Kalkstein finds the opposite: deaths go up after a long spell of hot weather.

Kalkstein also finds that a particular weather pattern in St. Louis—characterized by high temperatures, strong southeast winds, moderate humidity, and relatively clear skies with little cloud cover—is correlated with increased mortality. For other cities, either no weather pattern was related to mortality or the patterns that correlated with extra deaths differed. Even in St. Louis, many of the days that exhibited the suspect weather showed no unusual number of fatalities. Moreover, very hot days, those with temperatures over 100°, failed to show death rates higher than the rates on those days when the thermometer made it only to 95°. In fact, the number of recorded deaths in St. Louis during that particular weather pattern varied considerably more than during other periods, which reduces our confidence in the results.

Researchers analyzing hot days and deaths have found no constant relationship; even when extremes in weather and mortality are correlated, the relationship is inconsistent. Cities with the highest average number of summer deaths are found in the Midwest or Northeast; those with the lowest number are in the South (Kalkstein and Davis 1989, 56). Typically analysts have failed to find any relationship between excess mortality and temperature in southern cities, which experience the most heat (Kalkstein 1992, 372). Other studies have found that people who move from a cold to a subtropical climate adjust within a very short period (Rotton 1983). Moreover, Kalkstein and others have reported without explanation that the “threshold” between temperatures that lead to excess deaths and those that have no effect varies significantly among the cities. In some, such as Los Angeles, San Francisco, Boston, and Pittsburgh, the threshold was below 85°, while in Phoenix and Las Vegas, it exceeded 110°.

Scholars have also reported contradictory and implausible results. According to several analyses, air pollution is not correlated with premature deaths (Kalkstein 1991). Some researchers have shown that, during hot spells, mortality goes up sharply in females; other researchers have measured increased deaths among males (Kalkstein 1992, citing Applegate et al. 1981; Bridger et al. 1976; Ellis 1972). Blacks are apparently more susceptible in St. Louis; whites, in New York. The lack of agreement on the effects of weather and on premature deaths again raises suspicions about the robustness of the results.

Measurement error may also foul up daily figures. In 1995, for example, Chicago suffered through an extraordinarily hot July that the press characterized as a harbinger of global warming. The coroner reported a marked increase in deaths. What was very curious was that on Friday, Saturday, and Sunday, July 14, 15, and 16, the reported deaths were well below the normal of seventy-eight per day—only fourteen people were reported to have died on Saturday—but on the two following days, Monday and Tuesday, fatalities were well above normal (Chicago Tribune, July 14–July 22, 1995). The previous record low body count for any day in the last thirty years had been forty-six! Given that on Friday, July 14, a record temperature of 106° was measured at Midway Airport, those numbers are not only remarkable but also suspicious. Could it have been that most people in the coroner’s office took the hot weekend off and counted bodies on Monday and Tuesday?

Researchers have attributed the absence of heat-related deaths in southern cities to acclimatization and the prevalence of housing that shields residents from high temperatures. In the North, the housing of the elderly and the poor is usually old and dilapidated. Over the next hundred years, if not sooner, most of those buildings will be torn down and replaced. Should the climate warm, builders will move toward structures that protect the inhabitants from extreme heat, as housing in the South allegedly does now.

These findings may imply simply that out-of-the-ordinary high temperatures increase the mortality of those in a weakened state. Little attention has focused on the question of whether the excess deaths represent premature mortality of a few days among the old or sick or whether the excess deaths point to a significant shortening of life. Studies examining excess deaths by months fail to find any positive correlation with high temperatures, indicating that any daily excess is offset by a reduction in fatalities over the next few days. In the South, where temperatures are routinely very high during the summer, even the elderly adjust. Consequently, if the climate becomes warmer, no excess deaths can be expected.

Fear of killer heat waves appears exaggerated. If temperatures rise slowly over the next century, possibly by the 2° to 6° Fahrenheit currently predicted, people will become acclimated while housing can and, in the normal cycle, will be replaced. After all, half the housing stock in the United States has been built during the last twenty-five years. Consequently, if warming takes place, people and housing will adapt; even if extended warm spells occur, mortality should not rise sharply. Moreover, the models and the evidence to date suggest that most of the warming will take place in the winter and at night. Consequently spells of extreme heat are unlikely to become much more common.

HURRICANES AND TORNADOES

Typically, global-warming prophets claim that climate change will increase the threat from more frequent or violent storms. Their argument, which has some plausibility, is that a warmer climate means that more heat energy will be trapped in the atmosphere, leading to bigger and stronger weather systems. On the other hand, warming is most likely to be greatest near the poles and least at the equator. The strength of weather systems is largely a function of the temperature differential between the two regions. Since this differential will diminish, so too will the likelihood of more intense cyclones.

Major weather disasters do kill. The evidence, however, simply fails to support the proposition that weather is becoming more violent. In the Atlantic basin, the number of intense hurricanes, those rated category 3 to 5 (category 5 being the most violent), actually declined during the 1970s and 1980s. The four years from 1991 to 1994 saw the fewest hurricanes of any four years over the last half century. Researchers have found that the average number of tropical storms and hurricanes has not changed over the past 52 years, while the number of intense hurricanes has decreased markedly (see chart 6) (Landsea et al. 1996).

Chart 6

For the Pacific around Australia, other researchers have found that the number of tropical cyclones has decreased sharply since 1969/70 (Nicholls et al. 1998). All ten of the deadliest hurricanes to strike the continental United States occurred before 1960, notwithstanding the huge expansion of population in coastal areas vulnerable to such storms.

According to Christopher Landsea (1999b), a National Oceanic and Atmospheric Administration (NOAA) expert on hurricanes: “it is highly unlikely that global warming has (or will) contribute to a drastic change in the number or intensity of hurricanes. We have not observed a long-term increase in the intensity or frequency of Atlantic hurricanes. Actually, 1991–94 marked the four quietest years on record (back to the mid-1940s) with just less than 4 hurricanes per year.” In its 1995 report, the Intergovernmental Panel on Climate Change, the U.N. scientific body studying global warming, noted (IPCC 1996b): “Knowledge is currently insufficient to say whether there will be any changes in the occurrences or geographical distribution of severe storms, e.g. tropical cyclones.” In other words, there is no reason to expect more or fewer hurricanes.

Weatherwise magazine rated the ten worst weather events of the twentieth century. First was the Dust Bowl of the 1930s, which brought heat and drought to the heartland of America, leading to the migration of thousands from the Great Plains to California. Second were the tornadoes that killed more than three hundred people in early April 1974; these storms devastated a dozen states from Alabama to Michigan to North Carolina to Ohio. The third worst disaster occurred on September 8, 1900, when a mammoth hurricane destroyed Galveston, killing perhaps as many as twelve thousand people. The 1990s experienced three storms that made the list: the March 12–15, 1993, winter storm that shut every airport from Washington to Boston (ranked fourth); Hurricane Andrew (1992), which wreaked devastation on Florida and Louisiana (ranked sixth); and the 1997–98 El Niño (ranked ninth). The choice of the last event is strange. A paper in the Bulletin of the American Meteorological Society (September 9, 1999) finds that the benefits from savings on heating, snow removal, avoided spring flood damage, and transportation were almost five times greater than the costs to the economy. Moreover, climatologist Stanley A. Changnon, who authored the study, found that El Niño on net saved more than 650 lives.

Thus, leaving aside the recent El Niño, only two storms in recent years were rated as horrendous. Each of these caused a great deal of property damage but few fatalities. Economic growth explains both the high dollar costs and the low loss of human life. As more structures are erected in areas subject to storm damage, dollar costs rise. But improvements in technology not only bring ample warning of approaching large weather events but also lead to better construction that can more easily withstand nature’s forces.

The two strongest hurricanes to strike the United States occurred in 1935 and 1969. If the warm decade of the 1990s has not brought bigger storms or more of them, and computer models fail to show any relationship between global warming and the ferocity of storms, we should refuse to be frightened by unsubstantiated speculation.

HISTORY OF CLIMATE CHANGES

History demonstrates that warmer is healthier. Since the end of the last Ice Age, the earth has enjoyed two periods that were warmer than the twentieth century. Archaeological evidence shows that people lived longer, enjoyed better nutrition, and multiplied more rapidly in warm periods than during epochs of cold.

That Ice Age ended about 12,000 to 10,000 years ago when the glaciers covering much of North America, Scandinavia, and northern Asia began to retreat to approximately their current positions. In North America the glacial covering lasted longer than in Eurasia because of topographical features that delayed the warming. Throughout history warming and cooling in different regions of the world have not correlated exactly because of the influence of such factors as oceans, mountains, and prevailing winds.

As the earth warmed with the waning of the Ice Age, the sea level rose as much as 300 feet; hunters in Europe roamed through modern Norway; agriculture developed in the Middle East, the Far East, and the Americas. Beginning some seven thousand years ago and lasting for about four millennia, the earth was more clement than today, perhaps by 4° Fahrenheit, somewhat higher than the IPCC’s best guess (3°) for a doubling of CO2. Although the climate cooled a bit after 3000 B.C., it stayed relatively warmer than the modern world until sometime after 1000 B.C., when chilly temperatures became more common. During the four thousand warmest years, Europe enjoyed mild winters and warm summers with a storm belt far to the north. Rainfall may have been 10 to 15 percent greater than now. Not only was the continent less subject to severe storms, but the skies were less cloudy and the days sunnier (Lamb 1988, 22).

From around 800 A.D. to 1200 or 1300, the globe warmed again and civilization prospered. This warm era displays, although less distinctly, many of the same characteristics as the earlier period of clement weather. Virtually all of northern Europe, the British Isles, Scandinavia, Greenland, and Iceland were considerably warmer than at present (Lamb 1968, 64–65). The Mediterranean, the Near East, and North Africa, including the Sahara, received more rainfall than they do today. During this period of the high Middle Ages, most of North America also enjoyed better weather. In the early centuries of the epoch, China experienced higher temperatures and a more clement climate. From Western Europe to China, East Asia, India, and the Americas, mankind flourished as never before.

This prosperous period collapsed at the end of the thirteenth century with the advent of the “Mini Ice Age,” which, at its most frigid, produced temperatures in central England for January about 4.5°F colder than today. Although the climate fluctuated, periods of cold damp weather lasted until the early part of the nineteenth century. During the chilliest decades, 5 to 15 percent less rain fell in Europe than does normally today; but, due to less evaporation because of the low temperatures, swampy conditions were more prevalent. As a result, in the fourteenth century the population explosion came to an abrupt halt; economic activity slowed; lives shortened as disease spread and diets deteriorated.

Although the influence of climate on human activities has declined with the growth in wealth and resources, climate still has a significant effect on disease and health. A cold wet climate can confine people to close quarters, abetting contagion. In the past, a shift toward a poorer climate led to hunger and famine, making disease more virulent. Before the Industrial Revolution and improved technology, a series of bad years could be devastating. If transportation were costly and slow, as was typical until very recently, even a regionalized drought or an excess of rain might lead to disaster, although crops might be plentiful a short distance away.

For people in premodern times, perhaps the single best measure of their health and well-being is the growth rate of the population. Over history the number of humans has been expanding at ever-more rapid rates. Around 25,000 years ago, the world’s population may have numbered only about three million. Fifteen thousand years later, around 8000 B.C., the total had probably grown by one-third to four million. It took 5,000 more years to jump one more million; but, in the thousand years after 5000 B.C., it added another million. Except for a few periods of disaster, the number of men, women, and children has mounted with increasing rapidity. Only in the last few decades of the twentieth century has the escalation slowed. Certainly there have been good times when man did better and poor times when people suffered—although in most cases these were regional problems. However, in propitious periods, that is, when the climate was warm, the population swelled faster than during less clement eras (see chart 7).

Chart 7

Another measure of the well-being of humans is their life span. The existence of the hunter-gatherer was less rosy than some have contended. Life was short: skeletal remains from before 8000 B.C. show that the average age of death for men was about thirty-three and that for women, twenty-eight. Death for men was frequently violent, and many women must have died in childbirth.

The warmest periods—the Neolithic and Bronze Ages and England in the thirteenth century—enjoyed the longest life spans of the entire record (see chart 8). The rise in life expectancies during the latter warm period easily explains the population explosion that took place during the high Middle Ages. In contrast, the shortening of lives from the late thirteenth to the late fourteenth centuries with the advent of much cooler weather is particularly notable.

Chart 8

Good childhood nutrition is reflected in taller adults. Icelanders must have suffered from lack of food during the Mini Ice Age: their average stature fell by two inches (see chart 9). Only in the modern world, with greatly improved food supplies and medicines, has their height risen to levels exceeding those enjoyed in the medieval warm period.

Chart 9

In summary, the evidence overwhelmingly supports the proposition that, during warm periods, humans have prospered. They multiplied more rapidly, they lived longer, and they were healthier. If the IPCC is right and the globe does warm, history suggests that human health is likely to improve.

STATISTICAL STUDIES OF DEATH RATES

A number of researchers have found a negative relationship between temperature and mortality and/or a correlation between season and death rates (Momiyama and Katayama 1966, 1967, 1972; Momiyama and Kito 1963; Bull and Morton 1978; Rosenwaike 1966). For example, Bull and Morton, British researchers, reported that deaths from myocardial infarction, strokes, and pneumonia fell in England and Wales with higher temperatures. In New York, however, they fell only until the temperature reached 68°, then rose with the heat. Momiyama and his colleagues found that deaths followed a seasonal path but that, in the United States, this pattern became less pronounced in the period from the 1920s to the 1960s. Even though a pattern of increased deaths in the winter is apparent for all portions of the United States, England, and Wales, as well as Japan, many subsequent researchers have emphasized summer deaths attributed to high temperatures.

Seasonal Effects

If climate change were to manifest itself as warmer winters without much increase in temperature during the hot months, which some climate models predict, the change in weather could be especially beneficial to human health (Gates et al. 1992). The IPCC reports that, over this century, the weather in much of the world has been consistent with such a pattern: winter and night temperatures have risen while summer temperatures have fallen (Folland et al. 1992).

A warmer globe would likely result in the polar jet stream’s retreating toward higher latitudes; in the Northern Hemisphere, the climate belt would move north (Lamb 1972, 117–18; Giles 1990). Thus an average annual 6.7° Fahrenheit increase in temperature for New York City, for example, would give it the climate of Atlanta. NYC’s summertime temperatures, however, would not go up commensurately: the average high temperature in Atlanta during June, July, and August is only 4° warmer than New York City’s, and New York has recorded a higher summer temperature than has the capital of Georgia. Along roughly the same longitude, summer temperatures generally differ less from city to city than winter temperatures do, and less than average annual temperatures.

A sample of forty-five metropolitan areas in the United States shows that for each increase of a degree in the average annual temperature, July’s average temperatures go up by only 0.5 degrees while January’s average temperatures climb by 1.5°.* Since warming will likely exert the maximum effect during the coldest periods but have much less effect during the hottest months, the climate change should reduce deaths even more than any summer increase might boost them.

*The data were collected from the Department of Commerce, National Climatic Data Center, 1979.
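To make the footnoted calculation concrete, the sketch below shows how such monthly sensitivities could be estimated by regressing each city’s January and July means on its annual mean. The city values used here are hypothetical placeholders, not the forty-five-city data set drawn from the National Climatic Data Center; the sketch only illustrates the form of the calculation.

```python
# Illustrative sketch (not the author's calculation): estimating how much January
# and July mean temperatures change per degree of annual mean temperature by
# regressing city monthly normals on annual normals. The city values below are
# hypothetical placeholders.
import numpy as np

# Hypothetical (annual_mean_F, january_mean_F, july_mean_F) for a few cities
city_normals = np.array([
    [55.0, 32.0, 77.0],
    [62.0, 43.0, 80.0],
    [68.0, 52.0, 83.0],
    [74.0, 61.0, 86.0],
])

annual = city_normals[:, 0]
january = city_normals[:, 1]
july = city_normals[:, 2]

# Least-squares slope of each monthly mean with respect to the annual mean
jan_slope = np.polyfit(annual, january, 1)[0]
jul_slope = np.polyfit(annual, july, 1)[0]

print(f"January mean rises about {jan_slope:.1f} degrees F per degree of annual mean")
print(f"July mean rises about {jul_slope:.1f} degrees F per degree of annual mean")
```

With the placeholder values the slopes come out near 1.5 and 0.5, the same pattern described in the text, but only the original forty-five-city data set can confirm the actual magnitudes.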

Deaths in the United States and most other advanced countries in the middle latitudes are higher in the winter than in the summer. Except for accidents, suicides, and homicides, which are slightly higher in the summer, death rates from virtually all other major causes rise in winter months; overall mortality from 1985 to 1990 was 16 percent greater during the cold season than during the warm season (Moore 1998b). These data suggest that, rather than increasing mortality, warmer weather would reduce it, but that possibility is rarely discussed.

Earlier studies have also reported the relationship between season and death rates. Professor F. P. Ellis of the Yale University School of Medicine noted that deaths in the United States between 1952 and 1967 were 13 percent higher on a daily basis in the winter than in the summer (Ellis 1972, table II, 15). This difference is smaller than that experienced during the 1985–90 years, a period that included some of the hottest summers on record. Ellis’s study covered a time during which recorded average temperatures in the United States were somewhat lower than during the 1985–90 period. If hot weather were detrimental to life, the differential between summer and winter death rates during the latter period should have been smaller, not larger.

The increase in average temperatures during this century has apparently been accompanied by a decline in hot weather deaths relative to winter mortality. Before the early or middle part of this century, deaths during the summer months were much higher relative to winter than is currently the case (Momiyama 1977). Perhaps the decline in physical labor, which carries with it a much higher rate of fatal accidents than office work, helps explain the change. The Japanese scholar Masako Momiyama, however, reports that for most advanced countries, such as the United States, Japan, the United Kingdom, France, and Germany, mortality is now concentrated in the winter.

A number of studies, as indicated above, have examined death rates on a daily basis (Bull and Morton 1978; Kalkstein and Davis 1989; Kalkstein 1991). This allows the authors to compare extreme temperatures with mortality. Although the research has shown that it is typically the elderly or the very sick that are affected by temperature extremes, the analyses ignore the degree to which this shortens life. Is it only a few days or a few weeks? That cities in the South fail to show any relationship between deaths and high temperatures suggests that the correlation in the North may stem from deaths of the most vulnerable when the weather turns warm. One way to parse out whether climate extremes shorten lives by only a few days, or whether they lead to more serious reductions in the life span, is to consider longer periods.

Monthly data on deaths and temperatures, for example, show that deaths peak in the cold period. My own research finds that monthly figures on various measures of warmth are correlated with monthly deaths in Washington, D.C. (Moore 1998b). The results support the proposition that climate influences mortality.

Although deaths peak in the winter, factors other than cold, such as less sunlight, could induce the higher mortality. The peaking itself does not prove that warming would lengthen lives; it could be that the length of the day affects mortality. Day length is closely correlated with temperature, of course, but unlike the amount of sunlight, which follows the same annual cycle every year, temperatures fluctuate from year to year. My research, however, indicates that the length of the day, although correlated with the death rate, is less statistically significant than temperature (Moore 1998b). Moreover, if measures of temperature are combined with the length of the day, the amount of sunlight loses its statistical significance. Temperature remains the most important variable.
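A minimal sketch of the kind of comparison described above appears below. It regresses monthly deaths on temperature alone, on day length alone, and on both together, and then inspects the t-statistics. All of the monthly figures are invented placeholders, not the Washington, D.C., data analyzed in Moore (1998b), so the printed statistics are illustrative rather than evidence.

```python
# Illustrative sketch only: comparing temperature and day length as predictors of
# monthly deaths, in the spirit of the Washington, D.C., analysis cited above.
# All monthly figures are invented placeholders, not the data used in that study.
import numpy as np
import statsmodels.api as sm

mean_temp_f = np.array([35, 38, 46, 56, 66, 75, 80, 78, 71, 59, 49, 39])         # hypothetical monthly normals
day_length_hr = np.array([9.8, 10.8, 12.0, 13.3, 14.4, 15.0, 14.7, 13.7,
                          12.4, 11.2, 10.1, 9.5])                                # approximate mid-latitude values
deaths = np.array([625, 615, 600, 575, 557, 534, 524, 529, 542, 572, 591, 614])  # hypothetical monthly deaths

def fit(y, *predictors):
    """Ordinary least squares with an intercept."""
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, X).fit()

temp_only = fit(deaths, mean_temp_f)
day_only = fit(deaths, day_length_hr)
joint = fit(deaths, mean_temp_f, day_length_hr)

# Compare t-statistics; the text's argument is that, with both variables included,
# temperature stays statistically significant while day length does not.
print("temperature alone, t =", round(float(temp_only.tvalues[1]), 2))
print("day length alone, t =", round(float(day_only.tvalues[1]), 2))
print("joint model t-values (const, temperature, day length):", np.round(joint.tvalues, 2))
```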

The District of Columbia study probably underestimates the relationship of deaths to temperature since some elderly from the capital winter in warm climates and die there. Nevertheless, the results imply that a warming of 4.5° Fahrenheit (the “best estimate” of the IPCC in 1992 under a CO2 doubling) would cut deaths for the country as a whole by about 37,000 annually (IPCC 1992, 16).

Climatic Effects

Comparing death rates in various parts of the United States can provide us with evidence on how humans are affected by different climates. Within the continental United States, people live in locales that are subtropical, such as Miami, and cities that are subject to brutally cold weather, such as Minneapolis. The contrast between American cities makes the climate variables stand out. Within the United States, most people residing in big cities eat a more or less similar diet, live roughly the same way, and employ the same currency. Differences between the population of various parts of the United States are largely confined to the age distribution, ethnic concentrations, income, and, of course, weather.

In a recent study, I expanded the research from a single city to the effect of climate on death rates around the country. Clearly many factors affect mortality. Within any population, the proportion that is old influences death rates. Since African Americans have lower life expectancies than whites, the proportion that is black affects mortality rates. Income and education are also closely related to life expectancy. As is well known, smoking shortens lives. Severe air pollution has pushed up mortality, at least for short periods.

As expected, age had the largest effect on death rates. The proportion of African Americans was also highly significant in explaining death rates across counties. The higher the median income, the lower the death rate. Holding demographic and economic variables constant, I found that death rates were lower in warm climates. Various measures of climate demonstrate that warmer is healthier or at least extends life expectancies; once the age structure is held constant, lower death rates translate directly into longer life expectancies. The analysis implies that if the United States were enjoying temperatures 4.5°F warmer than today, 41,000 fewer people would die each year (Moore 1998b). This saving in lives is quite close to the number I estimated based on monthly Washington, D.C., data for the period 1987 through 1989.
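For readers who want to see how a cross-sectional coefficient of this kind scales up to a national figure, the following back-of-the-envelope sketch works through the arithmetic. The per-capita coefficient is a hypothetical value chosen only to illustrate the calculation, and the population figure is an approximate mid-1990s total; neither number is taken from the study itself.

```python
# Back-of-the-envelope scaling only. The per-capita coefficient is a hypothetical
# value chosen to illustrate the arithmetic, not the estimate from Moore (1998b);
# the population figure is an approximate mid-1990s U.S. total.

deaths_per_100k_per_deg_f = 3.5     # hypothetical: deaths avoided per 100,000 people per 1 F of warming
warming_f = 4.5                     # the IPCC "best estimate" warming used in the text
us_population = 265_000_000         # approximate mid-1990s U.S. population

deaths_avoided = deaths_per_100k_per_deg_f * warming_f * us_population / 100_000
print(f"Implied annual reduction in deaths: about {deaths_avoided:,.0f}")
# With these illustrative inputs the figure comes out near 41,700, consistent in
# magnitude with the 37,000 to 41,000 estimates reported in the text.
```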

In summary, the monthly figures for the city of Washington between 1987 and 1989 indicate that a 4.5°F warmer climate would cut deaths nationwide by about 37,000; the analysis of climate in counties around the United States points toward a saving in lives of about 41,000. These data sets produce roughly the same conclusion: a warmer climate would reduce mortality by about the magnitude of highway deaths, although the latter deaths are more costly in that they involve a much higher proportion of young men and women.

Morbidity

Presumably, if a warmer climate reduced deaths, it would also cut disease. In the early 1970s, the U.S. Department of Transportation (DOT) sponsored a series of conferences on climate change that examined, among other things, workers’ preferences for various climates and the effect of climate on health care expenditures. At that time, the government and most observers were concerned about a possible cooling of the globe. The department organized the meetings because it planned to subsidize the development and construction of a large fleet of supersonic aircraft that environmentalists contended would affect the world’s climate.

The third gathering, held in February 1974, examined the implications of climate change for the economy and people’s well-being and included a study of the costs to human health from cooling, especially any increased expenses for doctors’ services, visits to hospitals, and additional medication (R. Anderson 1974). For that meeting, the DOT asked the researchers to consider a cooling of 2° Celsius (3.6° Fahrenheit) and a warming of 0.5°C (0.9°F). Robert Anderson Jr., the economist who calculated health care outlays, made no estimate of the costs or savings should the climate warm; but his numbers show that for every 5 percent reduction in the annual number of heating-degree days, a measure of winter’s chill, health care costs would fall by $0.6 billion (1971 dollars) (Anderson 1974).* In a paper summarizing the various studies on economic costs and the benefits of climate change, Ralph D’Arge (1974), the principal economist involved in the DOT project, indicated that a 10 percent shift in heating-degree days would be equivalent to a 1°C change in temperature. Thus the gain in reduced health costs from a warming of 4.5° Fahrenheit would be on the order of $3.0 billion in 1971 dollars, or $21.7 billion in 1994 dollars, adjusting for population growth and price changes (using the price index for medical care).
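The chain of adjustments behind the $3.0 billion figure can be restated step by step. The sketch below uses only numbers given in the passage and the degree-day definition from the footnote marked above; the helper function and variable names are mine, so it should be read as a worked check of the arithmetic rather than a reproduction of Anderson’s model.

```python
# A restatement of the health-cost arithmetic above. Every numeric input comes from
# the passage itself (Anderson 1974; D'Arge 1974); the helper function and its name
# are mine and simply encode the degree-day definition given in the footnote.

def heating_degree_days(daily_mean_temps_f, base_f=65.0):
    """Sum of (base - daily mean) over days whose mean falls below the base."""
    return sum(max(base_f - t, 0.0) for t in daily_mean_temps_f)

# Footnote examples: a day with a mean of 60 F contributes 5 degree days; a day
# with a high of 60 F and a low of 40 F (mean 50 F) contributes 15.
assert heating_degree_days([60.0]) == 5.0
assert heating_degree_days([50.0]) == 15.0

# Cost chain for a 4.5 F (2.5 C) warming:
warming_c = 4.5 / 1.8                                 # 4.5 F expressed in Celsius
hdd_shift_pct = 10.0 * warming_c                      # D'Arge: ~10 percent fewer degree days per 1 C
savings_1971_billion = 0.6 * (hdd_shift_pct / 5.0)    # Anderson: $0.6 billion per 5 percent reduction

print(f"Heating-degree days fall by about {hdd_shift_pct:.0f} percent")
print(f"Health care savings: about ${savings_1971_billion:.1f} billion (1971 dollars)")
# The text's adjustment for population growth and medical-care prices restates the
# same saving as roughly $21.7 billion in 1994 dollars.
```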

More recently, I examined the relationship between the number of hospital beds per 100,000 people, the number of physicians per 100,000, and the average annual temperature (Moore 1998b). Although the number of physicians is only weakly related to climate, the number of hospital beds is significantly and inversely related: holding income, race, and age constant, the warmer the climate, the fewer hospital beds (and, more weakly, physicians) a community supports. Assuming that the numbers of hospital beds and physicians correctly reflect the health care needs of their communities and serve as an index of health care costs, the figures suggest that, had the climate been 4.5° Fahrenheit warmer, private expenditures on health care in 1994 would have been lower by $19 to $22 billion. Those numbers are remarkably close to the updated figure derived above from Professor Robert Anderson’s estimates (about $22 billion). Assuming that government health expenditures would be affected comparably, the total national savings in medical costs would be about $36 billion.

*Each degree that the average temperature for a day falls below 65° Fahrenheit produces one heating-degree day. If the mean temperature on a particular day were 60°, for example, the number of degree days would be five. If the high for a day were 60° and the low 40°, the average would be 50° and the number of degree days would be fifteen.
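The footnote’s definition can be restated as a short calculation. The snippet below is a minimal sketch that reproduces the footnote’s two worked examples; the 65°F base is the one given in the footnote.

```python
# The footnote's definition, restated as a small function.
def heating_degree_days(daily_mean_temp_f: float, base_f: float = 65.0) -> float:
    """One heating-degree day for each degree the daily mean falls below the base."""
    return max(0.0, base_f - daily_mean_temp_f)

print(heating_degree_days(60.0))            # 5.0  -- mean of 60 deg F, as in the footnote
print(heating_degree_days((60 + 40) / 2))   # 15.0 -- high of 60, low of 40, mean of 50
```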

That figure understates the benefits of warming, since it omits the gains from reduced suffering and from fewer working days lost to illness. A minimum estimate of those gains would include the wage cost of employed people who, in the absence of warming, would have missed work because of illness. The $36 billion also neglects the benefit to those who, thanks to the better climate, remain healthy but are not in the paid workforce, or who would have come to work despite a cold or the flu. If we assume that a 4.5°F warmer climate would reduce illness by the same proportion it is estimated to reduce deaths (1.8 percent) and apply the average compensation of workers to the workdays saved, the savings come to around three-quarters of a billion dollars (Statistical Abstract of the United States 1994, tables 631 and 660). These figures still leave out any further reduction in government spending related to illness. Conservatively, then, the health-related saving would amount to about $37 billion a year.
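The figures in this paragraph can be tied together arithmetically. The sketch below uses only the numbers quoted above; it does not reproduce the underlying absence or compensation data from the Statistical Abstract.

```python
# Tying together the figures quoted in the paragraph above; the underlying
# absence and compensation data (Statistical Abstract tables 631 and 660)
# are not reproduced here.
illness_reduction = 0.018        # 1.8 percent, the same proportion as the estimated drop in deaths
wage_savings_billion = 0.75      # "around three-quarters of a billion dollars"

# Taken together, the two figures imply a base of roughly $42 billion a year
# in wages lost to illness-related absence.
print(round(wage_savings_billion / illness_reduction, 1))     # ~41.7

medical_savings_billion = 36.0   # medical-cost saving estimated earlier
print(round(medical_savings_billion + wage_savings_billion))  # ~37 billion dollars a year, as stated
```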

Statistical Conclusions

Although it is impossible to measure the gains exactly, a moderately warmer climate would likely benefit Americans in many ways, especially in health. Contrary to many dire forecasts, the temperature increase that the IPCC predicts under a doubling of greenhouse gases, now put at less than 4.5°F, would yield net health benefits for inhabitants of the United States.

In summary, if the IPCC is correct about a warmer climate over the next hundred years, Americans, and probably Europeans, the Japanese, and other people living in high latitudes, should enjoy improved health and extended lives. High death rates in the tropics appear to be more a function of poverty than of climate. Thus global warming is likely to prove positive for human health.

European Evidence

Further confirmation of the beneficial consequences of warmth comes from a German study. That research shows that cold weather, not heat, is the more significant killer. Not only is mortality higher in the winter, but a very cold winter produces a higher number of deaths. During the summer, according to the analysis, heat spells do lead to more deaths, but the increase is small compared with the deaths caused by cold (Lerchl 1998).

Now a researcher in the United Kingdom has confirmed that those findings apply in his country as well. Prepared for the UK’s Department of the Environment, the report finds that a warmer world would bring even greater health benefits for England and Wales than I found for the United States in the two studies outlined above. Ironically, the British research was carried out as part of a study of the impacts of the extraordinarily warm year of 1995.

In his analysis, C.G. Bentham, director, Centre for Environmental Risk, School of Environmental Sciences, University of East Anglia, looked at the relationship between the mean monthly temperatures and monthly deaths from 1976 to 1995 (with the exception of two years for which no figures exist). Although heat waves in Britain kill people, cold weather fells more. A greater number die in the winter months of December, January, and February than leave this world during the hot months of June, July, and August. The highest mortality occurs in January; the lowest, in August.

Bentham’s data (1997) indicate that, for every month except July and August, hotter than normal weather reduces deaths (see chart 10). In July and August, temperature increases of 2° or 3°C, about 3.6 or 5.4 degrees Fahrenheit, boost mortality slightly; but similar increases in other months cut deaths more significantly. In January and December, with a warming of 5.4 degrees Fahrenheit, he estimates deaths would fall by 5 percent. By the same token, an annual increase in temperatures of 3° Celsius would cut mortality by 3 percent. In England and Wales this means a savings of 17,500 lives for the entire year. For a total population of only about fifty million, that constitutes a significant reduction in fatalities.

Chart 10
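For scale, the figures just quoted imply the size of the underlying annual death toll in England and Wales. The check below is arithmetic only, using the numbers reported from Bentham (1997).

```python
# Arithmetic check on the England and Wales figures reported from Bentham (1997).
lives_saved = 17_500          # fewer deaths per year at +3 deg C
mortality_reduction = 0.03    # the 3 percent cut in mortality
implied_annual_deaths = lives_saved / mortality_reduction
print(f"{implied_annual_deaths:,.0f}")  # ~583,000 deaths per year implied for England and Wales
```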

The study examined whether lower than expected deaths might occur following heat spells or periods of extraordinary cold. Such a pattern would have been observed if extreme weather simply culled those who would have died shortly in any case. Bentham, however, failed to find any relationship between temperature extremes and deaths in subsequent periods, suggesting that it was not simply the weak or the sick elderly who expired.

That 1995 was exceptionally warm in the United Kingdom shows up in Bentham’s figures. In particular, the very mild month of February 1995 tallied fewer deaths than usual for that time of year. Deaths were, however, slightly higher than is typical during the unusually hot summer.

As Bentham puts it, temperatures in England and Wales are suboptimal for human health. Since humans evolved in Africa in a much warmer climate, it is unsurprising that the cold weather of the northern portions of the globe should be less than beneficial for most. Undoubtedly a warmer climate would promote health and well-being. People generally prefer a warm to a cold climate, as shown by the tendency to vacation in tropical areas during the winter and to move to the south on retirement.

Although Bentham’s results are similar to those I found for the United States, he actually unearthed a strikingly larger effect. As mentioned, he estimated that an increase of 3 degrees Celsius would reduce mortality in a population of 50 million by 17,500; I calculated that, for the U.S. population, a world 2.5 degrees Celsius hotter would save about 40,000 lives annually. Extrapolating, a 3°C rise in temperature would save roughly 48,000 lives in America out of a population of 275 million. Applied to the United States, Bentham’s results indicate that a 3°C warmer world would prevent about 65,000 deaths, a markedly greater number. The stronger effect of temperature in Great Britain may reflect summers that are cooler than those in the United States, which leaves more room for warming to do good.

In terms of percentages, my Washington, D.C., results imply that a 3°C boost in temperatures would reduce deaths by 2 percent; the nationwide data indicate that the same increase in warmth would cut mortality by 2.2 percent. In England and Wales, 3°C would reduce deaths by 3 percent.
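The comparison in the last two paragraphs reduces to two short calculations, reproduced below from the figures given in the text.

```python
# The cross-country comparison in the two preceding paragraphs, using only
# the figures given in the text.

# Moore's U.S. estimate: about 40,000 lives saved at +2.5 deg C, scaled linearly to +3 deg C.
moore_lives_at_3c = 40_000 * (3.0 / 2.5)
print(round(moore_lives_at_3c))        # 48,000

# Bentham's 3 percent mortality reduction, applied to the United States:
# the 65,000 figure implies an annual U.S. death count of roughly 2.2 million.
implied_us_deaths = 65_000 / 0.03
print(f"{implied_us_deaths:,.0f}")     # ~2,166,667
```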

As the data show, there seems no reason to fear global warming’s effect on mortality and a number of reasons to welcome it. Except for population fanatics who fear a drop in mortality, most people would count longer lives as a gain.

KYOTO’S EFFECT ON THE ECONOMY AND ON HEALTH

Lower Income Means More Deaths

Most of the concern with climate’s effects on health relates to mortality in the poor tropical portions of the globe. Reducing incomes in the industrialized nations, however, is no remedy for sickness and death in Africa and Southeast Asia. Economics is not a zero-sum game in which the poor benefit from making the rich less wealthy, yet making the rich less wealthy is precisely what Kyoto would do. It requires the affluent countries of the world to reduce their emissions of greenhouse gases by 5 percent from 1990 levels during the years 2008 to 2012. For the United States and Canada, this implies a major cutback, of more than 30 percent, from the levels that would prevail under a business-as-usual scenario. On a per capita basis, Canada uses even more energy than the United States and would suffer even more from slashing fossil fuel consumption.

Because the Kyoto Protocol exempts Third World countries from any need to curb emissions, the growth in greenhouse gas emissions from countries such as China and India would soon dwarf any reductions by the industrialized countries (Bolin 1998). Meeting Kyoto would therefore do nothing significant about warming; further and more drastic cuts in greenhouse gas emissions, perhaps as much as 60 to 80 percent, would be necessary to stabilize CO2 in the atmosphere at less than twice preindustrial concentrations. Even that would leave some warming. According to the Climate System Model of the National Center for Atmospheric Research, stabilizing carbon dioxide concentrations at 50 percent above current levels would still lead to a 2.7°F rise in temperatures worldwide. Cutting fossil fuel consumption by enough to stabilize emissions in the next few decades would produce a worldwide depression, with falling incomes, rising unemployment, poorer health, and increased mortality. And if electricity prices are boosted because of Kyoto, poor families will not be able to afford the electricity needed to run their air conditioners!

KYOTO KILLS!

The improvements in health and life expectancy during the twentieth century have brought great benefits to the human race. What produced them? Greater use of ever cheaper energy and, of course, higher incomes. The Kyoto Protocol threatens both sources of those gains. Rising incomes, coupled with falling energy prices, have done more for the well-being of men and women than anything else in history. Where incomes are high, so is life expectancy; where incomes are low, disease and death are all too prevalent. Economists studying the relationship of income and earnings to mortality have found that a loss of $5 to $10 million in U.S. GDP leads to one extra death.

Recently the Energy Information Administration (EIA), part of President Clinton’s Department of Energy, released its estimates of the cost of meeting the Kyoto targets. According to that agency, which was surely under pressure to minimize its estimate of the burden on the American people, the cost would run between $77 billion and $338 billion annually, depending on whether emission reductions could be traded and how many emission credits could be purchased abroad.

Given Europe’s opposition to trading emission credits across national boundaries, the United States is unlikely to be able to purchase much of its required reduction in greenhouse gas emissions from overseas. Assuming, therefore, that trading across national boundaries does not take place, the relevant cost is the high end of the range, roughly $338 billion a year; combined with the finding that each $5 to $10 million of lost GDP produces one extra death, the EIA estimates imply that somewhere between 33,800 and 67,000 more Americans would die annually between 2008 and 2012.
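The range just cited follows from the EIA’s upper cost estimate and the $5 to $10 million of lost GDP per additional death noted earlier; the arithmetic is shown below.

```python
# The arithmetic behind the 33,800-to-67,000 range: divide the EIA's
# no-trading cost estimate by the $5 to $10 million of GDP loss associated
# with one additional death.
eia_cost_no_trading = 338e9            # dollars per year, the upper EIA estimate
gdp_loss_per_death_low, gdp_loss_per_death_high = 5e6, 10e6

extra_deaths_low = eia_cost_no_trading / gdp_loss_per_death_high
extra_deaths_high = eia_cost_no_trading / gdp_loss_per_death_low
print(f"{extra_deaths_low:,.0f} to {extra_deaths_high:,.0f} extra deaths per year")
# Prints "33,800 to 67,600", which the text rounds to 33,800-67,000.
```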

The Kyoto Protocol would devastate Third World countries as well. Even though they are exempt from the limits on CO2 emissions, they would find that the United States buys fewer of their goods and services, and goods imported from the advanced countries would cost more. As a result, the poor countries would become even poorer. We cannot estimate the toll on those countries, since it would vary greatly from country to country, but we know that becoming poorer will push their already high death rates higher.

What these countries need is higher, not lower, incomes. With greater earnings, their people can look forward to longer life expectancies and reductions in disease. Higher incomes may also reduce violence between and within these states. All in all, the Kyoto treaty is a far more violent killer than any climate change could be. Let’s arrest it before it kills someone.

Since climate change will have only a very small effect on the world’s health, why are so many rushing to impose onerous taxes and controls on U.S. industry? The carbon tax that the administration proposed and then withdrew would have cost Americans about $180 billion per year. If preventing a rise in disease in poor countries were the purpose of restricting emissions, it would be far more effective to deal with that problem directly than to constrain our energy use. Spending even one-tenth of that $180 billion on clean water or mosquito netting would do far more for the world’s health than attempting to reduce greenhouse gas emissions.

CONCLUSIONS

Fears of health effects from global warming are overblown and highly speculative. Those who want to reduce greenhouse gases have resorted to scare tactics. In truth, the health and well-being of people in rich countries will be largely unaffected by global warming, should it occur. The effect of climate change even on poor countries will be small: warming will be minor in tropical areas, and most diseases are related more to income than to climate.

However, abiding by the Kyoto Protocol will hurt people’s health, because it will make them poorer. Even though they are exempted from the protocol’s provisions, Third World countries would be harshly affected by a poorer West. Moreover, as is well known, the Kyoto treaty will neither stop the buildup of greenhouse gases nor prevent climate change; to do that, far more drastic cuts in carbon dioxide emissions would be necessary. Some believe that, in order to stabilize the climate, our use of fossil fuels must be cut by more than 60 percent. That would certainly be disastrous for mankind, far worse than any climate change. Global warming would have minimal effects on human health and life expectancy. Kyoto kills; climate change does not.
