British science writer Matt Ridley is usually astute and thoughtful, so I was surprised to read his myopic views about the antecedents of technological innovation. Instead of owing a debt to basic science—which seeks to discover new knowledge without a specific practical goal in mind—"most technological breakthroughs come from technologists tinkering, not from researchers chasing hypotheses," he wrote. "Heretical as it may sound," he concluded, "'basic science' isn't nearly as productive of new inventions as we tend to think." In support of his thesis, Ridley offered several examples of "parallel instances" of invention: six different, independent inventors of the thermometer, three of the hypodermic needle, four of vaccination, five of the electric telegraph, and so on.

This argument is misguided. Even though several people may have invented various gadgets more or less simultaneously, those inventions were grounded in earlier scientific research that had no particular intended practical application and whose significance was completely unsuspected at the time. This is how science works.

For example, after he received the 1969 Nobel Prize in Physiology or Medicine, my M.I.T. microbiology professor, Salvador Luria, joked about the difficulty of perceiving the significance of one's research findings when they are first obtained. He won the prize for research on the structure and replication mechanisms of bacteriophages, the viruses that infect bacteria. That work helped lay the foundation on which modern molecular biology rests by shedding light on how viruses work and how more complex organisms reproduce and pass on hereditary characteristics.

To all who had congratulated him on the award, Luria sent a cartoon that showed an elderly couple at the breakfast table. The husband, reading the morning newspaper, exclaims, “Great Scott! I’ve been awarded the Nobel Prize for something I seem to have said, or done, or thought, in 1934!”    

That basic science can lead to major innovative breakthroughs was expressed eloquently in a 2011 Science editorial by French biologist François Jacob, in which he described the research that led to his 1965 Nobel Prize in Physiology or Medicine. His lab was working on the mechanism that, under certain circumstances, causes the bacterium E. coli to suddenly produce bacterial viruses that had been dormant. At the same time, another research group was analyzing, also in E. coli, how the synthesis of a certain enzyme is induced in the presence of a specific sugar. As Jacob wrote, "The two systems appeared mechanistically miles apart. But their juxtaposition would produce a critical breakthrough for our understanding of life"—namely, the concept of an "operon," a cluster of genes whose expression is regulated by an adjacent regulatory gene.

Perhaps the quintessential example of the synergy and serendipity of basic research was the origin in the early 1970s of recombinant DNA technology (also known as “genetic modification,” or GM), the prototypic technique of modern genetic engineering. It resulted from the confluence of several esoteric, largely unrelated areas of basic research: enzymology and nucleic acid chemistry led to techniques for cutting and rejoining segments of DNA; advances in fractionation procedures permitted the rapid detection, identification and separation of DNA and proteins; and the accumulated knowledge of microbial physiology and genetics enabled “foreign” DNA to be introduced into a cell’s DNA and made to function there. The result was the ability to move functional genes from one organism to another virtually at will—the birth of modern biotechnology.

Over the past 40 years, recombinant DNA technology has revolutionized numerous industrial sectors, including plant breeding and the production of pharmaceuticals and diagnostic tests. Its breakthroughs include vaccines that prevent infectious diseases and drugs to treat diabetes, cancer, cystic fibrosis, psoriasis, rheumatoid arthritis, and some genetic diseases. A more advanced variant called gene editing was recently used to successfully treat an infant with acute lymphoblastic leukemia who was at death’s door.

An analogous story is that of a revolutionary gene-editing technique called CRISPR-Cas9, which is expected to garner a Nobel Prize within a few years. CRISPR stands for "clustered regularly interspaced short palindromic repeats," clusters of short DNA sequences that read similarly forward and backward and that are found in many types of bacteria. Basic research on bacterial genetics revealed the presence of these DNA segments in the 1980s, but for almost two decades scientists didn't understand that they are part of a bacterial defense system. When a virus infects a bacterium, the cell can incorporate sequences of viral DNA into its own genetic material, inserting them between the repetitive segments. When the bacterium encounters that virus again, it uses the DNA in these clusters to make RNAs that recognize the matching viral sequences, and a protein attached to one of these RNAs then degrades the DNA of the infecting virus. It was subsequently shown that a single RNA can be used in conjunction with the cutting protein, an enzyme called Cas9, to cut any desired sequence of DNA.

The utility of the CRISPR-Cas9 combination lies in its programmability: an investigator supplies a short RNA sequence that matches the stretch of DNA to be modified, and that RNA guides the Cas9 enzyme to its target. Because the same cutting protein is used regardless of the target, researchers can change multiple genes in an organism simultaneously simply by pairing Cas9 with multiple RNA guides. The technique is being used for a wide spectrum of applications, from improving microorganisms and plants to modifying human stem cells, none of which could have come to pass if it were not for the basic research of 30 years ago.
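Purely as an illustration of that programmability (not of the underlying biochemistry), here is a minimal sketch in Python; every name and sequence in it is invented for the example, and simple string matching stands in for the guide RNA's recognition of its target:

```python
# Toy sketch only: real Cas9 pairs a guide RNA with a PAM motif and cuts
# double-stranded DNA. Here, plain string matching stands in for that
# recognition step, to illustrate "same scissors, different guide."

def cas9_cut(dna, guide):
    """Return the two fragments produced by 'cutting' dna at the guide's match."""
    site = dna.find(guide)           # locate the sequence the guide matches
    if site == -1:
        return None                  # no matching site, no cut
    cut_at = site + len(guide)       # cut just past the matched site (simplified)
    return dna[:cut_at], dna[cut_at:]

genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"

# The cutting function never changes; only the short guide sequence does,
# so many different targets can be addressed with the same machinery.
for guide in ("ATTGTAATG", "GGGTGCCCG"):
    print(guide, "->", cas9_cut(genome, guide))
```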

A further commercially and medically important example was the creation of “hybridomas,” hybrid cells made in the laboratory by fusing a normal white blood cell that produces antibodies with a cancer cell. This was originally achieved in order to combine desired features from each—namely, the immortality and rapid growth of the cancer cell and the ability of the normal cell to dictate the production of a single, specific, “monoclonal” antibody.

The inventors wanted primarily to study the protein products of these fused cells in order to learn more about the rates of cellular mutation and the generation of antibody diversity. It turned out, however, that these immortal, antibody-producing cells were useful not only for scientific inquiry but also as a novel and valuable technological instrument for a variety of industrial and medical applications. Indeed, the technology has given rise to a variety of highly specific diagnostic tests; to the blockbuster anti-cancer drugs Rituxan (rituximab), Erbitux (cetuximab), and Herceptin (trastuzumab); and to Avastin (bevacizumab), which is widely used to treat cancer and diseases of the retina that commonly cause blindness.

The technological revolutions of recombinant DNA, gene editing, and hybridoma technology could not have been accomplished in the absence of publicly funded basic research. The progress was hardly "inexorable," nor were technological tinkerers primarily responsible for it, as Ridley envisions.

Most of the published responses to Ridley's essay were critical. A longtime California-based investor wrote about why academic research powerhouses attract industry:

Venture capital, biopharmaceutical and other high-tech industries cluster about major research centers because basic science drives innovation. Venture capitalists literally “walk the halls” of major research institutes in search of breakthroughs, embodied in patents and published papers, around which to build companies. Government financing supports those centers.

Two European academics responded to Ridley that fundamental concepts arise from scientists trying to understand the basic laws of nature, but that once those concepts are discovered, technological applications don't automatically follow, because, as in my examples above, "the most significant applications are often the least predictable." Emphasizing that point, Nobel laureate in physics Leon N. Cooper commented that "it would have been difficult to predict that the investigations of Maxwell, Lorentz and Einstein in electromagnetic theory would lead to improvements in communications," and "few would have expected that Schrödinger and Heisenberg's quantum mechanics would lead to the transistor and computers, that Townes's work on millimeter radiation would give us laser surgery."

Basic science often does provide the fertile substrate from which technological breakthroughs sprout, and seemingly unrelated and obscure research areas may intersect and synergize unexpectedly. That is why we should continue to support well-designed basic research even in the absence of obvious benefits to society. 
