Cancer Risk Analysis: New Science and Old Politics

Friday, July 30, 1999

The U.S. government’s environmental regulatory apparatus is about to be challenged in a fundamental way by a novel technology that can—for the first time—measure whether there is cancer risk from minute quantities of chemicals and nuclear radiation. A newly developed supersensitive method—up to 100,000 times more sensitive than any now available—can establish whether genetic material in human cells has been damaged, an initial step that can lead to cancer.

Scientific advances inevitably call for a reexamination and possible overhaul of policies. Regulatory agencies, invoking the “precautionary principle,” have fought long and hard against accepting the concept of a risk “threshold,” below which radiation or chemicals do not induce cancer. In doing so, regulators ignore the fact that we are constantly exposed to small amounts of naturally occurring radiation and to natural cancer-causing chemicals in every bite of food we eat.


The ability to monitor the DNA damage caused by environmental factors—and to do so with great precision—will have enormous policy implications.


In dealing with cancer risk, we want to know two things: What is a safe level of exposure to cancer-causing agents? And why do individuals show such different sensitivities to carcinogens? Why, for example, do some heavy smokers never get lung cancer?

Cancer research has established general answers to these two questions, but putting those answers into practice has not been possible until recently. Last year, however, in the May 15, 1998, issue of Science, a team of Canadian scientists published details and first results of a technique that promises to revolutionize risk assessment for cancer, as well as treatment of the disease itself.

Medical and biochemical research identifies the direct cause of cancer as damage to the DNA in the human cell. The kind of damage depends on the agent, which could be solar ultraviolet radiation, exposure to X-rays, chemicals in wood smoke, or even a charbroiled steak. By far the most important agents, though, are the oxidants/free radicals that arise naturally from the metabolism of the food we eat; in other words, just plain living induces DNA damage—as University of California biochemist Bruce Ames keeps reminding us.

Since we metabolize food constantly, why don’t we get cancer more often? What saves the situation, of course, is the fact that this damage to the DNA molecule can be repaired—as long as the damage rate is not so fast that the correction mechanism can’t keep up. But the effectiveness of correction is based on genes—and here we find individual differences, depending on heredity and perhaps also on certain environmental factors, that can damage the repair machinery. The longer we live, the greater the chance for uncorrected DNA damage that can lead to a cell becoming cancerous—which is why the incidence of cancer increases with age.

THE BREAKTHROUGH

The breakthrough for detecting minute amounts of DNA damage combines three separate techniques: recognition of the damaged DNA by immunochemistry, separation by advanced methods, and supersensitive detection by laser-induced fluorescence. This ultrasensitive method can observe the effects of even minor amounts of radiation and chemicals and determine whether the correction mechanism is functioning properly. As proof, the investigators tested their new method with cellular DNA and naked DNA and found that cells provide about one hundred–fold protection to their DNA. Until now it has been necessary to extrapolate from the effects of massive doses of nuclear radiation experienced by atomic bomb victims—or from massive (and near-lethal) doses of chemicals fed to rats. The common assumption—that the incidence of cancer declines in proportion to exposure but never reaches zero (the “linear hypothesis”)—is just that: an assumption that there is no threshold. But although DNA repair may not be possible at high levels of exposure, it can and does work at low levels of damage; clearly, repairs do take place when DNA is damaged at a low rate by the normal metabolism of food or by the nuclear radiation from the ubiquitous cosmic rays raining down on us from the galaxy. That is why there are no cancer epidemics in high-altitude cities like Denver, where the cosmic ray intensity is so much higher.
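The difference between the linear no-threshold assumption and a threshold model can be sketched numerically. The sketch below uses purely illustrative parameters (the slope and threshold values are hypothetical, not measured quantities); it shows only the qualitative point at issue: the two models agree that risk rises with high doses, but disagree about whether any risk remains at low doses.

```python
# Two idealized dose-response models for cancer risk.
# All numbers are illustrative assumptions, not measured values.

def lnt_risk(dose, slope=1e-3):
    """Linear no-threshold (LNT) model: risk is strictly
    proportional to dose, so it is nonzero for any exposure."""
    return slope * dose

def threshold_risk(dose, slope=1e-3, threshold=10.0):
    """Threshold model: below some repair capacity (the threshold),
    DNA damage is fully corrected and the net risk is zero."""
    return slope * max(0.0, dose - threshold)

# Below the assumed threshold, the models give different answers:
low_dose = 5.0
print(lnt_risk(low_dose))        # small but nonzero under LNT
print(threshold_risk(low_dose))  # zero: repair keeps up

# At high doses, both models predict substantial risk:
high_dose = 1000.0
print(lnt_risk(high_dose), threshold_risk(high_dose))
```

The policy dispute the article describes amounts to a choice between these two curves at the low-dose end, where direct measurement has until now been impossible.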

The Canadian investigators also demonstrated that cells turn on inducible DNA repair in response to low-level irradiation, a result that bears much further study. Small doses of chemicals or radiation seem to stimulate the system and confer protection against larger doses. For example, small doses of vitamins A and D are essential to good health but are toxic in larger doses.

One can only begin to imagine some of the future possibilities. For instance, researchers may be able to detect more easily the type and frequency of DNA damage in living tissue after exposure to environmental radiation or chemicals, which may help settle the contentious issue of the possible cancer effects from second-hand smoke. One can envision that in the foreseeable future specialists will use this sensitive technique in various fields of life sciences—from toxicology to molecular biology, from assaying the risk of cancer to the treatment of the disease.

In the policy field also, there could be major breakthroughs. Instead of defining permissible exposure limits for pesticides or herbicides on the basis of testing laboratory animals, it will become possible to measure the effects directly on human DNA at the low levels at which these chemicals exist in the environment. Predictably, there will be legislative obstacles to overcome, and, of course, there may be bureaucratic resistance from regulatory agencies. After all, it took decades for Congress to do away with the notorious Delaney amendment, which banned food additives containing any amount of a known or suspected carcinogen; in effect, it tried to legislate away the existence of a safe threshold, overriding the science.

It will be interesting to monitor how the Environmental Protection Agency and other regulatory agencies adapt to this new technology and to observe the response of environmental activists, who have long argued for zero tolerance of manmade chemicals and radiation. Superfund legislation and procedures will have to be amended, and a new attitude may develop toward nuclear power and the disposal of spent reactor fuel, long burdened by unrealistic regulatory restrictions.

A case in point: On the basis of well-accepted scientific studies, the EPA has proposed changing the maximum contaminant level goal for chloroform, one of the byproducts of water treatment, from 0 to 300 parts per billion. In the face of political pressure and threatened lawsuits, however, the EPA may now back away from this proposal. Science will surely win out in the end, but it will likely be a protracted and difficult struggle.
