The rise of the American software industry in the 20th century was made possible by a partnership between emerging technology companies and the U.S. government. Silicon Valley’s earliest innovations were driven not by technical minds chasing trivial consumer products but by scientists and engineers who aspired to address challenges of industrial and national significance using the most powerful technology of the age. Their pursuit of breakthroughs was intended not to satisfy the passing needs of the moment but rather to drive forward a much grander project, channeling the collective purpose and ambition of a nation.
This early dependence of Silicon Valley on the nation-state and indeed the U.S. military has, for the most part, been forgotten, written out of the region’s history as an inconvenient and dissonant fact—one that clashes with the Valley’s conception of itself as indebted only to its capacity to innovate. The United States since its founding has always been a technological republic, one whose place in the world has been made possible and advanced by its capacity for innovation.
But there is also another essential element of American success. It was a culture, one that cohered around a shared objective, that won the last world war. And it will be a culture that wins, or prevents, the next one.
At present, however, the principal shared features of American society are not civic or political but rather cohere around entertainment, sports, celebrity, and fashion. This is not the result of some unbridgeable political division. The interpersonal tether that makes possible a form of imagined intimacy among strangers in groups of significant size has been severed, banished from the public sphere. The old means of manufacturing a nation—the civic rituals of an educational system, mandatory service in national defense, religion, a common language, and a free and thriving press—have been all but dismantled or have withered from neglect and abuse. This distaste for collective experience and endeavor made America, and American culture, vulnerable.
The establishment left has failed its cause and thoroughly eroded its potential. The frenetic pursuit of a shallow egalitarianism in the end hollowed out its broader and more compelling political project. What we need is more cultural specificity in education, technology, and politics—not less. The vacant neutrality of the current moment risks allowing our instinct for discernment to atrophy. Only the resurrection of a shared culture, not its abandonment, will make possible our continued survival and cohesion. And only by combining the pursuit of innovation with the shared objectives of the nation can we both advance our welfare and safeguard the legitimacy of the democratic project itself.
Silicon Valley once stood at the center of American military production and national security. Fairchild Camera and Instrument Corporation, whose semiconductor division was founded in Mountain View, California, and made possible the first primitive personal computers, built reconnaissance equipment for spy satellites used by the CIA beginning in the late 1950s. For a time after World War II, all of the U.S. Navy’s ballistic missiles were produced in Santa Clara County, California. Companies such as Lockheed Missiles and Space, Westinghouse, Ford Aerospace, and United Technologies had thousands of employees working in Silicon Valley on weapons production through the 1980s and into the 1990s.
This union of science and the state in the middle part of the 20th century began in earnest during World War II. In November 1944, as Soviet forces closed in on Germany from the east, President Franklin D. Roosevelt was in Washington, D.C., already contemplating an American victory and the end of the conflict that had remade the world. Roosevelt sent a letter to Vannevar Bush, a pastor’s son who had become the head of the U.S. Office of Scientific Research and Development, where he helped lead the Manhattan Project.
In the letter, Roosevelt described “the unique experiment” that the United States had undertaken during the war to leverage science in service of military ends. Roosevelt anticipated the next era—and partnership between national government and private industry—with precision. He wrote that there was “no reason why the lessons to be found in this experiment”—that is, directing the resources of an emerging scientific establishment to help wage the most significant and violent war that the world had ever known—“cannot be profitably employed in times of peace.”
Roosevelt’s ambition was clear. He intended to see the machinery of the state—its power and prestige, as well as the financial resources of the newly victorious nation and emerging hegemon—spur the scientific community forward in service of, among other things, the advancement of public health and national welfare. The challenge was to ensure that the engineers and researchers who had directed their attention to the industry of war—and particularly the physicists, who, as Bush noted, had “been thrown most violently off stride”—could shift their efforts back to civilian advances in an era of relative peace.
The entanglement of the state and scientific research both before and after the war was itself built on an even longer history of connection between innovation and politics. Many of the earliest leaders of the American republic were inventors, including Thomas Jefferson, who designed sundials and studied writing machines, and Benjamin Franklin, who experimented with and constructed objects as varied as lightning rods and eyeglasses.
Unlike the legions of lawyers who have come to dominate American politics in the modern era, many early American leaders, even if not practitioners of science themselves, were nonetheless remarkably fluent in matters of engineering and technology. John Adams, the second president of the United States, was, by one historian’s account, focused on steering the early republic away from “unprofitable science, identifiable in its focus on objects of vain curiosity,” and toward more practical forms of inquiry, including “applying science to the promotion of agriculture.”
Many of the innovators of the 18th and 19th centuries were polymaths whose interests diverged wildly from the contemporary expectation that depth, as opposed to breadth, is the most effective means of contributing to a field. The frontiers and edges of science were still in that earliest stage of expansion that made possible and encouraged an interdisciplinary approach, one that would be almost certain to stall an academic career today. That cross-pollination, as well as the absence of a rigid adherence to the boundaries between disciplines, was vital to a willingness to experiment, and to the confidence of political leaders to opine on engineering and technical questions that implicated matters of government.
The rise of J. Robert Oppenheimer and dozens of his colleagues in the late 1930s further situated scientists and engineers at the heart of American life and the defense of the democratic experiment. Joseph Licklider, a psychologist whose work at the Massachusetts Institute of Technology anticipated the rise of early forms of AI, was hired in 1962 by the organization that would become the U.S. Defense Advanced Research Projects Agency—an institution whose innovations would include the precursors to the modern internet as well as the Global Positioning System. His research for his now classic paper “Man-Computer Symbiosis,” which was published in March 1960 and sketched a vision of the interplay between computing intelligence and our own, was supported by the U.S. Air Force.
There was a closeness, and significant degree of trust, in the relationships between political leaders and the scientists on whom they relied for guidance and direction. Shortly after the launch by the Soviet Union of the satellite Sputnik in October 1957, Hans Bethe, the German-born theoretical physicist and adviser to President Dwight Eisenhower, was called to the White House. Within an hour, there was agreement on a path forward to reinvigorate the American space program. “You see that this is done,” Eisenhower told an aide. The pace of change and action in that era was swift. NASA was founded the following year.
By the end of World War II, the blending of science and public life—of technical innovation and affairs of state—was essentially complete and unremarkable. Many of these engineers and innovators would labor in obscurity. Others, however, were celebrities in a way that might be difficult to imagine today. In 1942, as war spread across Europe and the Pacific, an article in Collier’s introduced Vannevar Bush, who was at the time a little-known engineer and government bureaucrat, to the magazine’s readership of nearly 3 million, describing Bush as “the man who may win the war.” (Three years later, Bush published “As We May Think” in The Atlantic, praising scientists for working together in a “common cause,” and anticipating many aspects of the information age that lay ahead.) Albert Einstein was not only one of the 20th century’s greatest scientific minds but also one of its most prominent celebrities—a popular figure whose image and breakthrough discoveries, which so thoroughly defied our intuitive understanding of the nature of space and time, routinely made front-page news. And it was often the science itself that was the focus of coverage.
This was the American century, and engineers were at the heart of the era’s ascendant mythology. The pursuit of public interest through science and engineering was considered a natural extension of the national project, which entailed both protecting U.S. interests and moving society—indeed, civilization—up the hill. And while the scientific community required funding and extensive support from the government, the modern state was equally reliant on the advances that those investments in science and engineering produced. The technical outperformance of the United States in the 20th century—that is, the country’s ability to reliably deliver economic and scientific advances for the public, whether medical breakthroughs or military capabilities—was essential to its credibility.
As the philosopher Jürgen Habermas has suggested, a failure by leaders to deliver on implied or explicit promises to the public has the potential to provoke a crisis of legitimacy for a government. When emerging technologies that give rise to wealth do not advance the broader public interest, trouble often follows. Put differently, the decadence of a culture or civilization, and indeed its ruling class, will be forgiven only if that culture is capable of delivering economic growth and security for the public. In this way, the willingness of the engineering and scientific communities to come to the aid of the nation has been vital not only to the legitimacy of the private sector but to the durability of political institutions across the West.
The modern incarnation of Silicon Valley has strayed significantly from this tradition of collaboration with the U.S. government, focusing instead on the consumer market, including the online advertising and social-media platforms that have come to dominate—and limit—our sense of the potential of technology. A generation of founders cloaked themselves in the rhetoric of lofty and ambitious purpose—their rallying cry that they intend “to change the world” has grown lifeless from overuse—but many of them raised enormous amounts of capital and hired legions of talented engineers merely to build photo-sharing apps and chat interfaces for the modern consumer.
A skepticism of government work and national ambition took hold in the Valley. The grand, collectivist experiments of the middle of the 20th century were discarded in favor of a narrow attentiveness to the desires and needs of the individual. The market rewarded shallow engagement with the potential of technology, as start-up after start-up catered to the whims of late-capitalist culture without any interest in constructing the technical infrastructure that would address our most significant challenges as a nation. The age of social-media platforms and food-delivery apps had arrived. Medical breakthroughs, education reform, and military advances would have to wait.
For decades, the U.S. government was viewed in Silicon Valley as an impediment to innovation and a magnet for controversy—more an obstacle to progress than its logical partner. The technology giants of the current era long avoided government work. The level of internal dysfunction within many state and federal agencies created seemingly insurmountable barriers to entry for outsiders, including the insurgent start-ups of the new economy. In time, the tech industry lost interest in politics and broader collaborations. It viewed the American national project, if it could even be called that, with a mix of skepticism and indifference. As a result, many of the Valley’s best minds, and their flocks of engineering disciples, turned to the consumer for sustenance.
The interests and political instincts of the American elite diverged from those of the rest of the country following the end of World War II. The economic struggles and geopolitical threats of the 20th century feel distant to most software engineers today. The most capable generation of coders has never experienced a war or genuine social upheaval. Why court controversy with your friends or risk their disapproval by working for the U.S. military when you can retreat into the perceived safety of building another app?
As Silicon Valley turned inward and toward the consumer, the U.S. government and the governments of many of its allies scaled back involvement and innovation across numerous domains, including space travel, military software, and medical research. The state’s retreat left a widening innovation gap. Many cheered this divergence: Skeptics of the private sector argued that it could not be trusted to operate in public domains, while those in the Valley remained wary of government control and the misuse or abuse of their inventions. For the United States and its allies in Europe and around the world to remain as dominant in this century as they were in the previous one, however, they will require a union of the state and the software industry—not their separation and disentanglement.
Indeed, the legitimacy of the American government and democratic regimes around the world will require an increase in economic and technical output that can be achieved only through the more efficient adoption of technology and software. The public will forgive many failures and sins of the political class. But the electorate will not overlook a systemic inability to harness technology for the purpose of effectively delivering the goods and services that are essential to our lives.
In late 1906, Francis Galton, a British anthropologist, traveled to Plymouth, England, in the country’s southwest, where he attended a livestock fair. His interest was not in purchasing the poultry or cattle that were available for sale at the market but in studying the ability of large groups of individuals to make accurate estimates. Nearly 800 visitors at the market had written down estimates of the weight of a particular ox that was for sale. Each person had to pay sixpence for a chance to submit a guess and win a prize, which deterred, in Galton’s words, “practical joking” that might muddy the results of the experiment. The median estimate of the 787 guesses that Galton received was 1,207 pounds, which turned out to be within 0.8 percent of the correct answer of 1,198 pounds. It was a striking result that would prompt more than a century of research and debate about the wisdom of crowds and their ability to make estimates, and predictions, more accurately than a chosen few. For Galton, the experiment pointed to “the trustworthiness of a democratic judgment.”
But why must we always defer to the wisdom of the crowd when it comes to allocating scarce capital in a market economy? We seem to have unintentionally deprived ourselves of the opportunity to engage in a critical discussion about the businesses and endeavors that ought to exist, not merely the ventures that could. The wisdom of the crowd at the height of the rise of Zynga and Groupon in 2011 made its verdict clear: These were winners that merited further investment. Tens of billions of dollars were wagered on their continued ascent. But there was no forum or platform or meaningful opportunity for anyone to question whether our society’s scarce resources ought to be diverted to the construction of online games or a more effective aggregator of coupons and discounts. The market had spoken, so it must be so.
Americans have, as Michael Sandel of Harvard has argued, been so eager “to banish notions of the good life from public discourse,” to require that “citizens leave their moral and spiritual convictions behind when they enter the public square,” that the resulting void has been filled in large part by the logic of the market—what Sandel has described as “market triumphalism.” And the leaders of Silicon Valley have for the most part been content to submit to this wisdom of the market, allowing its logic and values to supplant their own. It is our own timidity and unwillingness to risk the scorn of the crowd that have deprived us of the opportunity to discuss in any meaningful way what the world we inhabit should be and what companies should exist. The prevailing agnosticism of the modern era, the reluctance to advance a substantive view about cultural value, or lack thereof, for fear of alienating anyone, has paved the way for the market to fill the gap.
The drift of the technological world to the concerns of the consumer both reflected and helped reinforce a certain technological escapism—the instinct of Silicon Valley to steer away from the most important problems we face as a society and toward what are essentially minor and trivial yet solvable inconveniences of everyday consumer life, such as online shopping and food delivery. An entire swath of arenas, including national defense, violent crime, education reform, and medical research, appeared too intractable, too thorny, and too politically fraught to address in any real way. (This was the challenge we have aimed to address at Palantir—to build technology that serves our most significant and vital needs, including those of U.S. defense and intelligence agencies, instead of merely catering to the consumer.)
Most were content to set the hard problems aside. Consumer apps and trinkets did not talk back, hold press conferences, or fund pressure groups. The tragedy is that serving the consumer rather than the public has often been far easier and more lucrative for Silicon Valley, and certainly less risky.
The path forward will involve a reconciliation of a commitment to the free market, and its atomization and isolation of individual wants and needs, with the insatiable human desire for some form of collective experience and endeavor. Silicon Valley offered a version of this combination. The Sunnyvales, Palo Altos, and Mountain Views of the world were company towns and city-states, walled off from society and offering something that the national project could no longer provide. Technology companies formed internally coherent communities whose corporate campuses attempted to provide for all of the wants and needs of daily life. They were at their core collectivist endeavors, populated by intensely individualistic and freethinking minds, and built around a set of ideals that many young people craved: freedom to build, ownership of their success, and a commitment above all to results.
Other nations, including many of our geopolitical adversaries, understand the power of affirming shared cultural traditions, mythologies, and values in organizing the efforts of a people. They are far less shy than we are about acknowledging the human need for communal experience. The cultivation of an overly muscular and unthoughtful nationalism has risks. But the rejection of any form of life in common does as well. The reconstruction of a technological republic, in the United States and elsewhere, will require a re-embrace of collective experience, of shared purpose and identity, of civic rituals that are capable of binding us together. The technologies we are building, including the novel forms of AI that may challenge our present monopoly on creative control in this world, are themselves the product of a culture whose maintenance and development we now, more than ever, cannot afford to abandon. It might have been just and necessary to dismantle the old order. We should now build something together in its place.
This essay has been excerpted from Alexander C. Karp and Nicholas W. Zamiska’s new book, The Technological Republic.
The post Why Silicon Valley Lost Its Patriotism appeared first on The Atlantic.