Silicon Valley has taken on a mythos of its own, but its origins lie in the government research and innovation that followed World War II. FS Insider spoke to Margaret O’Mara about her newly released book, The Code: Silicon Valley and the Remaking of America. She explains what fueled such rapid growth in the now-famous region.
World War II, the Space Race and the Cold War
During the first half of the 20th century, the Santa Clara Valley—what is now called Silicon Valley—had an agricultural economy. There was some entrepreneurial activity, thanks to the proximity of Stanford University, but it remained very small in scale.
Prior to World War II, the valley was far removed from the centers of finance and business in San Francisco. After the war, the Santa Clara Valley was transformed into the hub of the electronics industry as the need to invest in national defense grew rapidly.
Before WWII, the U.S. government invested little in research and development. During the war, research-driven weapons projects such as the Manhattan Project brought the U.S. into the nuclear age. After the war, the U.S. continued its battle with the Soviet Union for scientific and technological supremacy. There was also a recognition that science and technology were essential to national progress: innovation would lead to improved healthcare and general societal welfare.
“There was a lot going on within government labs, but there was also a lot of money flowing to public and private universities and to private-sector defense contractors who essentially became agents of this broader development,” O’Mara said. “So the U.S. government was putting lots of money into basic research—seed money, if you will, for the technology itself—and it was also a really critical customer, a very early customer, for things like advanced electronics and transistorized technology for which there really wasn't a market yet.”
Perfect Storm for Growth
The U.S. government was surprisingly hands-off with this research, O’Mara said. Essentially, the federal government built the sandbox for the private sector to play in and got out of the way. After President Kennedy set the goal of beating the Soviets to the moon, the U.S. needed fast, powerful and light electronic technology. The only place that technology was being developed was in Northern California.
By the 1950s, the growing California silicon semiconductor industry was made up largely of homegrown startup firms, but the valley also attracted big defense contractors headquartered elsewhere, such as Lockheed, which located its missiles and space division in Sunnyvale in the 1950s. This was a huge catalyst for the valley's later development.
By the 1960s, these semiconductor firms were well established. Among them was the iconic grandfather of all venture-backed startups, Fairchild Semiconductor, the seed for hundreds of Silicon Valley companies to come. Fairchild's cofounders included the future founders of Intel and of Kleiner Perkins, a leading venture capital firm then and now.
In the first few years of Fairchild Semiconductor's existence, the federal government, and particularly NASA through the Apollo program, was a critical customer. With NASA buying in bulk, Fairchild was able to scale up production, which in turn drove down prices while competition was still limited.
“This brand new, super competitive, super entrepreneurial set of companies grew on a foundation of big government spending,” O’Mara said. “They had a lot of flexibility in doing what they wished and they were able to build commercial businesses on the backbone of that federal investment.”
The Silicon Explosion
The rapid growth of silicon-based semiconductor technology and its industry defied all expectations, O’Mara explained. “The silicon semiconductor is to the Digital Revolution what the steam pump was to the Industrial Revolution,” she said. “It created this new power source that gets exponentially faster and cheaper.”
Within two decades, it was possible to replace mechanical equipment with a digital equivalent, and the possibilities exploded from there. During the 1970s, the industry transitioned from its reliance on government contracts to becoming a major enterprise provider to all sorts of different companies. This allowed the semiconductor companies to not only reinvent existing markets, but to create entirely new markets. In the process, a lot of existing industry was rendered obsolete.
Silicon Valley beat out other centers of innovation because it specialized in building small things. The first silicon semiconductors, transistorized electronics, kept getting smaller and more powerful. By the early 1970s, the advent of the microprocessor made the personal computer possible.
“The microprocessor possessed all the functions of a computer, now shrunk down to chip size,” O’Mara said. “Once you have a computer on a chip, you can essentially put a box around it and call it a desktop computer. This was the beginning of the next great wave of Silicon Valley industry, which was the personal computer industry.”
For the full audio interview, log in and go to The Code: Silicon Valley and the Remaking of America (part 1).