Every culture develops its own creation myth. The myth for Silicon Valley starts with the discovery of the transistor. It unfolds according to the irresistible logic of Moore's Law: The size and cost of a transistor are cut in half every eighteen to twenty-four months. Like all myths, this one tells a simple story: "Before transistors, there were few computers. After the transistor was invented, computer power grew rapidly, transforming our lives."
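A quick way to feel the force of this compounding, taking only the stated halving period $T$ of eighteen to twenty-four months as given:

$$ c(t) = c(0)\,2^{-t/T}, \qquad \frac{c(0)}{c(120\ \text{months})} = 2^{120/T} \approx 32 \ \text{to}\ 101. $$

At that rate, the cost of a transistor falls by a factor of roughly thirty to one hundred every decade.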

Myths capture elements of truth, but they make sense of events only by hiding many important details. There is more to the computer revolution than the development of smaller and cheaper transistors. For example, why has hard-drive storage capacity increased so rapidly when no transistors appear on the surface of the disk? Forecasts about the long-run effects of the digital revolution hinge on the answer to this question.

The myth correctly identifies the discovery of transistors as the seminal event in the digital revolution. Charles Babbage worked out the basic principles for a general-purpose computer in the 1840s. In the century following Babbage's discovery, people made almost no progress in building a working model. Since the discovery of the transistor, however, computer power has grown exponentially. The contrast could hardly be sharper.

So it's no exaggeration to say that the discovery of the transistor and its subsequent development set the pace for the digital revolution. The myth, however, suggests that the transistor's progress is sufficient to explain growth in computing power. This is misleading. Many supporting technologies had to be developed to make a working computer. Some of these, such as magnetic disk drives, fiber-optic data networks, and graphical user interfaces, exploited technological principles very different from the ones behind the transistor.

Nevertheless, most commentators focus their story on the transistor, allowing them to create a myth of technological determinism. A new production process is serendipitously discovered and then evolves along a predetermined technological trajectory. The emergence of the technology is then used to explain coinciding changes in social, political, and economic life. Thus one reads that the steam engine caused the Industrial Revolution and society's division into workers and capitalists. Now we hear that the transistor is causing the digital revolution, globalization, and the rise of the knowledge-based economy.

In these accounts, however, the causal arrows run only one way. Once an important new technology is invented, it develops according to its own logic, independent of surrounding events.

As individuals, we often feel powerless to affect the course of technological change. Technological determinism therefore appeals to our intuition. When studying markets, in contrast, intuition offers little guidance. Each of us tends to believe that an increase in the price of a basic commodity, such as milk, has little effect on how much we personally buy. Nevertheless, the evidence is clear: in the aggregate, when price increases, quantity purchased decreases. Incentives have subtle but pervasive effects on human behavior, effects that we may fail to see or understand.

Similarly, when the profit from developing a new type of technology increases, people respond by developing the technology more rapidly. For example, scientists understood the principles behind magnetic data storage long before the transistor, but in the early 1900s there was little demand for it. With the advent of the transistor and the subsequent development of the central processing unit (CPU), demand for permanent data storage surged, leading to innovations in this area. Instead of using plastic tape, engineers put the magnetic medium on a rigid moving surface, first on the outside of a cylinder, then on the surface of a disk. Before the 1950s, innovations such as these were technologically feasible, but the lack of incentives limited their development.

As the example of hard-disk storage already suggests, cheaper transistors will continue to encourage innovation in complementary technologies. Improvements in one area will raise our impatience with bottlenecks that prevent us from enjoying technological advances in another. Fortunes will be made by people who remove these bottlenecks. Currently, cheaper transistors are the most important force inducing technological change in related fields. As breakthroughs occur, a different engine of growth could emerge. For instance, cheaper transistors have encouraged graphics-rich applications, which in turn have created users impatient with the slow speed of data transmission. As a result, communications technologies are now poised for a big increase in performance.

Forecasts about the digital revolution's future depend on the potential for induced innovation. Sometime in the next two decades, physical barriers will limit our ability to make transistors any smaller. If the determinists are correct, nothing can be done except to wait for a new source of technological change. If the economists are right, that is, if incentives do matter, the digital revolution will continue to spur innovation long after the death of Moore's Law.

Researchers respond not just to rewards but also to costs. This creates another channel through which digital information technology will encourage technological change. Our physical world presents us with a relatively small number of building blocks, the elements of the periodic table, which can be arranged in an inconceivably large number of ways. Search costs limit our discovery of valuable new arrangements. Over time, we have found some useful combinations. Mix iron and carbon together with small amounts of manganese, chromium, nickel, molybdenum, copper, tungsten, cobalt, or silicon, and you can produce a range of different steel alloys. Arrange silicon, some impurities, and some metals in just the right way, and you get a microprocessor.
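A rough sketch, with made-up parameters, of why search costs bite: even a crude model of material "recipes" over the periodic table yields an astronomically large search space. The element count and proportion levels below are illustrative assumptions, not figures from the text.

```python
# Toy model: a "recipe" picks k of ~90 naturally occurring elements,
# each at one of 10 coarse proportion levels. Count the possibilities.
from math import comb

ELEMENTS = 90   # rough count of naturally occurring elements (assumption)
LEVELS = 10     # coarse proportion levels per chosen element (assumption)

for k in (2, 5, 10):
    recipes = comb(ELEMENTS, k) * LEVELS ** k
    print(f"{k:2d}-element recipes: ~{recipes:.1e}")
```

Even at ten elements the count passes 10^22; exhaustive trial and error is hopeless, which is why cheaper search matters so much.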

But nature shows us that we have a long way to go. By pure trial and error, evolution found a way to mix carbon, oxygen, hydrogen, and a few other elements into the seed for a tree. Think of this seed as the software and hardware for building a solar-powered factory. It sucks raw materials out of the ground and air. Depending on the software coded into its DNA, the tree will convert these raw materials into construction material or fruit. In comparison with the tree, existing software-controlled manufacturing systems don't seem very sophisticated.

Of course, the vast majority of possible arrangements lead to muck. The key to technological change and economic growth is to sort quickly and inexpensively through the possibilities to find valuable formulas.

Fundamentally, this search is an information-processing activity. We can start with established bodies of scientific knowledge and predetermine an arrangement that will do something valuable. Alternatively, we can create many different arrangements and try them out, looking for the ones that are valuable. In the pharmaceutical industry, these techniques are labeled rational design and mass screening. Most search processes involve some combination of the two. Chip design, for example, involves extensive design work in the initial stages and many rounds of trial-and-error screening as the chip moves into production.
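To make the contrast concrete, here is a minimal sketch, entirely my own construction on a made-up one-dimensional "fitness landscape": mass screening synthesizes random candidates and keeps the best, while rational design starts from a theory-informed guess and refines it locally. All function names and parameters are illustrative assumptions.

```python
# Toy contrast between mass screening (random trials) and rational
# design (knowledge-guided local refinement) on a fake landscape.
import random

random.seed(0)

def value(x: float) -> float:
    """Stand-in for the worth of a candidate arrangement (peak at 0.73)."""
    return -(x - 0.73) ** 2

def mass_screening(trials: int) -> float:
    """Make many random candidates; keep whichever scores best."""
    return max((random.random() for _ in range(trials)), key=value)

def rational_design(guess: float, steps: int, step: float = 0.05) -> float:
    """Start from a theory-informed guess; accept local improvements."""
    best = guess
    for _ in range(steps):
        cand = min(1.0, max(0.0, best + random.uniform(-step, step)))
        if value(cand) > value(best):
            best = cand
    return best

print("screening best:", round(mass_screening(trials=1000), 3))
print("design best   :", round(rational_design(guess=0.6, steps=50), 3))
```

Real searches, like the chip-design example above, interleave the two: design narrows the region worth exploring, and screening does the trial and error within it.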

From the beginning, everyone expected that computers would aid the process of rational design. For products ranging from airliners to power tools, computer-aided design and simulation have already shortened product cycles and yielded better products. Surprisingly, computer-automated systems are also making screening more efficient. Using methods such as combinatorial chemistry and automated machinery, we can synthesize and screen large numbers of different compounds quickly and inexpensively.

So even if the rate of technological progress in the computer industry itself comes to a stop, the digital revolution will leave us with valuable information-processing tools for the future. Because searching will be cheaper, market incentives will cause us to search more. Technological change and economic growth will therefore be more rapid.

Myths may be comforting, but science is exciting. According to the determinists' myth, the digital revolution has been one long ride down a technological trajectory leading to ever-smaller transistors, a trajectory we were lucky to stumble onto. The long sweep of human history tells a more accurate story: the more we learned and discovered, the better we got at learning and discovering.

We have had more change and growth in the last one hundred years than in the previous nine hundred. New methods of information processing (spoken language, writing, and printing with movable type) were pivotal in accelerating the growth of knowledge and standards of living. There is every reason to hope that the digital revolution will have the same effect, making growth and technological change in the next century even more impressive than they have been in this one.

Of course, humans could make a mess of things. The Chinese had cast iron fifteen centuries before Westerners did and movable type four hundred years before Gutenberg. Nevertheless, the political and social system in China eventually stifled the incentives for further discovery, and progress there virtually came to a halt. It is always possible that the same thing could happen to the human race as a whole. Technological determinists would then tell us that the technology did it to us. In truth, for progress to come to an end, we will have to do it to ourselves.
