Despite the persistent myth that Silicon Valley was built by rogue engineers in Palo Alto garages, federal funding — especially from the military — has long been the real developmental engine of the American technology sector. It was robust government spending in science, technology, computation, and higher education that fueled the explosion of American technology after World War II. And these same federal powers eventually rescued the sector when, after Nixon cut the federal defense budget amid the wider economic strife of the early 1970s, it experienced its first major bust.
In response, electronics manufacturers formed the nationwide Semiconductor Industry Association a few years later, lobbying Washington for tech-friendly policies like lower taxes, less regulation, and protection from the thriving Japanese chip sector. Their influence grew when Reagan reversed Nixon’s military cuts in the ’80s. After Reagan announced a new effort to build a missile shield in space, the Pentagon allocated additional funding to encourage research in technologies like microelectronics, self-driving automobiles, fighter jets, and artificial intelligence. Soon, tech was booming again, with investor buzz around personal computing steadily infusing capital into the Silicon Valley ecosystem. The following decade saw exuberant expansion of commercial and residential internet usage, along with new startups, V.C. investment, and press lauding the explosion of the information superhighway. Between 1990 and 1997, the share of American households with personal computers increased from 15 percent to 35 percent, and in the late ’90s, Jeff Bezos and Intel CEO Andy Grove were both named Time magazine’s “Person of the Year.” Big Tech had successfully leveraged its growing power in Washington to enter the American home.
When the dot-com bubble burst in 2001, casting tech into another major downturn, the industry was resuscitated largely by defense funding. George W. Bush expanded foreign and domestic surveillance after the 9/11 attacks, and since much of the work was contracted out, massive federal contracts soon revived the internet-technology sector. By the end of the 2000s, Amazon Web Services and similar cloud-computation platforms from Google and Microsoft had boosted online retail and software usage. The release of the first iPhone, in 2007, and the advent of its App Store the next year helped buoy the market through the Great Recession. As on-demand services like rideshares and food delivery proliferated, a desperate workforce — with a painfully low federal minimum wage, pathetic labor protections, and no national health care — picked up gigs while app valuations climbed.
High tech’s value has long been in producing war-making technology for the federal government; in turn, Washington has consistently given back to the Valley’s defense contractors, universities, and blue-chip engineering firms. We’re seeing this pattern repeat itself in the present downturn. With onshoring policies and supply chain resilience taking on new public urgency, this summer the House and Senate passed the CHIPS and Science Act, which will inject about $78 billion of federal funding into semiconductor research and manufacturing. But little about the bill addresses the factors that have made the industry unsustainable — it does not promise to secure decent jobs or incomes for semiconductor workers, nor does it include provisions for environmental protection or occupational health. In December, the Department of Defense signed a $9 billion contract with Amazon, Google, Microsoft, and Oracle to build a “tactical cloud” meant to give military personnel faster access to data.
The technology sector attributes its economic power to its own creativity and rogue innovation, but throughout its history it has been dependent on the federal budget. What is at stake today is not whether the public is falling out of love with Big Tech — that has arguably already happened — but whether Washington could be persuaded to do the same.
Jeannette Estruth is an assistant professor of history at Bard College and a faculty associate at the Harvard Berkman Klein Center. She is currently writing a book on the history of Silicon Valley.