Building and sustaining U.S. preeminence in science and technology

By
J.H. Cullum Clark
Director, Bush Institute-SMU Economic Growth Initiative
George W. Bush Institute
Analog computing machine in the fuel systems building – an early version of the modern computer on Sept. 28, 1949. (NASA)

Part II: Building and sustaining U.S. preeminence in science and technology 

“Science is decisive, both in war and peace.”
U.S. Senator Harley Kilgore, 1946 

Between 1944 and 1950, America’s leaders decided the United States should become the world’s dominant superpower in science and technology, and they built the best model ever devised to make it happen. The nation’s success in realizing this goal stands as one of its most towering accomplishments – and as a core pillar of American prosperity today.  

This essay is the second in a six-part George W. Bush Institute-SMU Economic Growth Initiative series on how America’s distinctive model for innovation and economic growth came to be. U.S. preeminence in science and technology, like other components of the American model, was not inevitable. It resulted from visionary leadership and an audacious bet on the nation’s future.  

America’s science dominance over the past eight decades has translated into steep advantages over foreign peers in business innovation, productivity, and opportunity for ordinary Americans. This is because of a simple truth of economic geography: Ideas travel best at a short distance. Even in today’s networked world, commercialization of new ideas disproportionately occurs close to where they originate. 

American science before 1939 

The United States was a scientific backwater before 1939, compared with leading European nations. While America matched its rivals in practical engineering, Europe led the world in science and medicine. European scientists dominated cutting-edge work on infectious diseases (Louis Pasteur of France), antibiotics (Alexander Fleming, Scotland), radioactivity and radiation treatment for cancer (Marie Curie, France), radio technology (Guglielmo Marconi, Italy), computing (Alan Turing, England), and radar (Robert Watson-Watt, Scotland, building on the radio-wave discoveries of Germany's Heinrich Hertz). 

Europe’s lead was especially pronounced in the most revolutionary field of its time: particle physics. Germany dominated physics research until the Hitler regime fired all Jewish scholars from German universities – including 11 future Nobel Prize winners – in 1933. Even then, top scientists mostly moved to the United Kingdom, Switzerland, and Denmark, home of Niels Bohr, godfather of atomic physics in the 1930s. Top U.S. colleges tried to recruit European physicists but had little success until the outbreak of war in 1939, according to Richard Rhodes’ The Making of the Atomic Bomb and David C. Cassidy’s Uncertainty: The Life and Science of Werner Heisenberg. Albert Einstein, who came to Princeton University in 1933, was a notable outlier. 

One measure of the gap between Europe and the United States: As of 1939, European scientists had earned more than five times as many Nobel Prizes in physics, chemistry, and medicine as American scientists, as a U.S. Senate report highlighted in 1946.  

The initial push in 1939 to launch a U.S. atomic bomb program came not from American physicists but from three Jewish Hungarian scientists – Leo Szilard, Edward Teller, and Eugene Wigner – who had just arrived in the United States and understood the danger that Hitler’s Germany might develop nuclear weapons first. They asked Einstein to send a now-famous letter of warning to President Franklin D. Roosevelt because they knew no one in FDR’s White House would have heard of them. European émigré scientists played leading roles in the Manhattan Project, as the 2023 film Oppenheimer depicts. 

Germany, which created the modern research university, had an outsized share of premier institutions before Hitler’s rise to power, including the University of Göttingen, worldwide capital of atomic physics. Copenhagen, Denmark; Leiden, the Netherlands; and Zurich, Switzerland, also had top-ranking universities. By contrast, Ivy League schools in the United States like Harvard and Yale remained devoted throughout the 19th century to the ideal of classical education. This excluded “practical” subjects like science and engineering, which were only gradually incorporated in the early 20th century. Innovative institutions inspired by the German model – above all, Cornell, Johns Hopkins, and the Massachusetts Institute of Technology (MIT) – helped lead modernization of U.S. research universities in the early 20th century. But as of 1939, the United States was clearly behind. 

As for science policy, Congress supported only a tiny budget for research and devoted most of it to agricultural R&D, a notable U.S. strength. Military leaders were notoriously stodgy toward new technologies. Nor did the Roosevelt Administration show much interest in science. FDR at least partially subscribed to the popular theory that technological advances enabling industrial automation were to blame for high unemployment throughout the 1930s, according to Daniel J. Kevles in The Physicists: The History of a Scientific Community in Modern America. 

World War II changes everything 

America’s entry into World War II after Pearl Harbor led to the greatest mobilization of science for military purposes in history. The federal government increased its R&D spending by 20 times in inflation-adjusted terms and devoted virtually all of it to winning the war. Congress established the Office of Scientific Research and Development (OSRD), which helped fund and coordinate successful projects on radar, the Norden bombsight, synthetic rubber, penicillin, antimalarials, and the atomic bomb. 

When the war came to an end in 1945, people understood that science had played a large role in the Allied victory. In contrast to the ambivalent feelings many Americans have today about the atomic bombing of Hiroshima and Nagasaki, the overwhelming feeling at the time was gratitude that U.S. soldiers would not die in an invasion of Japan. Scientists reported that, for the first time, they could walk into cocktail parties, and everyone wanted to talk with them, as Daniel S. Greenberg wrote in The Politics of Pure Science. 

At the same time, wartime experience convinced many Americans that science doesn’t just happen – it requires leadership and money. Americans emerged from the war far more optimistic that concerted efforts could improve cancer survival rates and other medical outcomes. Military leaders now embraced science and insisted on technological superiority over the Soviet Union. Federal officials came to understand that private firms would not invest in basic research at a large scale because they couldn’t patent scientific findings and capture the economic benefits as they could with finished products. In sum, America was primed for a call to pursue U.S. science preeminence in peacetime as well as in war. 

Two heroes of our story: Dr. Vannevar Bush and Sen. Harley Kilgore 

Historians of science generally think of Vannevar Bush as the chief architect of America’s postwar science establishment. I suggest that U.S. Sen. Harley Kilgore, Democrat of West Virginia – virtually unknown today – deserves credit as a co-equal author of the policies that took America to the pinnacle of global science. 

Vannevar Bush, son of a Massachusetts pastor and a mathematical prodigy, rose to prominence as a leading engineering professor at MIT. While teaching and doing research, he built one of the world’s first computing machines, did work that later inspired the inventor of the World Wide Web, and – on the side – co-founded the defense technology firm Raytheon. In 1939, Bush moved to Washington to become president of the Carnegie Institution, one of the nation’s top funders of university research. He soon became the chief voice calling for mobilization of American science for what he viewed as an inevitable war. In 1940, Bush convinced FDR to establish what became the Office of Scientific Research and Development and make him its chief. From this perch, Bush pushed into high gear the struggling Manhattan Project, which built the atomic bomb.  

In late 1944, FDR asked Bush to chair an expert commission to advise on postwar science policy. Bush published his legendary report, Science, the Endless Frontier, in July 1945, presenting a broad vision for why and how America should pursue worldwide preeminence in science and technology. The report drew a clear distinction between basic and applied research – studying the fundamental nature of matter versus turning this knowledge into practical products that improve human life – and argued that U.S. leadership in the former was essential for success in the latter. It also proposed that Congress should generously fund basic science but leave applied research to the private sector. 

Kilgore, meanwhile, grew up in a low-income family in rural West Virginia. He served in World War I, taught school, became a lawyer and county judge, and, in 1940, won a longshot race to become a U.S. senator. Kilgore arrived in Washington with what he later characterized as “utter, absolute ignorance” of science. But when he got assigned to a committee on wartime industrial mobilization, he quickly grew fascinated by the role of science and technology in the war and their potential peacetime benefits. Intensely curious, Kilgore brought in exceptionally knowledgeable staffers, including Dr. Herbert Schimmel, an Orthodox Jew with a physics Ph.D. from the University of Pennsylvania who couldn’t get an academic job because of his religion.  

Kilgore’s views on science policy started from the liberal principles of FDR’s New Deal social reform program. They also grew from a well-founded conviction that patent monopolies held by large corporations had slowed America’s war mobilization. He came to believe the United States should aim not just to become preeminent in science in the postwar world but to ensure broad and democratic distribution of its benefits. In 1944, he started working on a bill that would create a National Science Foundation (NSF) to promote peacetime research and scientific training. 

The debate over postwar science policy and its compromise resolution 

The six years from 1944 to 1950 saw an intense debate over the direction of postwar science policy. While accounts sometimes treat it as a two-sided struggle between Bush and Kilgore, the reality was more complex. Five questions confronted policymakers as they considered Bush’s Endless Frontier report and Kilgore’s legislative proposal: 

  • Should Congress fund R&D at a much larger scale than it did before 1939?
  • Should Congress go beyond defense R&D to fund basic physics, chemistry, biology, and medical science? Should it fund applied product development as well as basic research? 
  • Who should decide what specific projects to fund – private sector scientists or federal officials?
  • Who should do the research – federal employees, industry staff, or academic scientists?
  • Should researchers be allowed to patent the fruits of federally funded research? 

The debate of 1944-1950 was a five-sided contest in which everyone achieved some of their goals, but no one got everything they wanted. 

  • Moderate Republicans led by Sen. Alexander Smith, a Republican of New Jersey and former general counsel of Princeton University, shared the views of Bush, top leaders of elite universities like Harvard and MIT, and major corporate CEOs. They wanted to direct most funding to basic science at premier research universities, to allow patenting of federally funded inventions, and, above all, to empower private sector scientists to choose projects and limit the power of government officials. 
  • Kilgore, liberal New Dealers in Congress, and smaller universities – noting that some 15 top institutions dominated OSRD research contracts – wanted federal grantmakers to provide for geographically diverse research funding, to broaden support for scientific training, to restrict patenting rights, and to be democratically accountable to the president and Congress. 
  • Conservative Democrats and Republicans in Congress – together with private sector leaders worried about federal control of science – generally opposed a large federal role in research. 
  • The military wanted large-scale funding of defense R&D, with Pentagon leaders choosing which projects to fund – and universities, industry, and federal labs doing the work. 
  • A group of mostly female activists headed by Mary Lasker, a New York philanthropist, pressed for dramatic increases in federal funding for medical research. The group’s congressional allies were ideologically diverse but often motivated by personal experience with cancer or other diseases, according to records from the 80th Congress. 

Kilgore was unable to push his plans through in 1945 and 1946 because of opposition from Bush’s camp as well as from Congressional conservatives and the military. Smith got a bill representing his camp’s views through in 1947 after Republicans won control of Congress, but President Harry Truman vetoed it at the urging of his trusted friend Kilgore. Kilgore and Smith ultimately worked out a compromise NSF bill that won the support of all camps and was signed into law in 1950. 

Meanwhile, the proponents of defense and medical research made dramatic headway. Congress, responding to the intensifying Cold War rivalry between the United States and the Soviet Union, increased defense R&D tremendously. Congress also increased medical research funding through a formerly obscure agency, the National Institutes of Health (NIH), by more than 50 times in inflation-adjusted terms from 1946 to 1951.  

The compromise resolution reached by 1950 gave rise to all the core components of America’s postwar model for science and technology development. This model has continued with only modest shifts ever since:  

  • Congress funds basic science at a larger scale than any nation in history but mostly leaves product development to the private sector. 
  • Democratically accountable agencies – NSF, NIH, the Department of Defense, and NASA – choose projects, often based on merit-based peer review. 
  • Scientists at autonomous universities and the national labs do most of the research work. 
  • Researchers as well as private firms may patent federally funded inventions. 

Bush was disappointed by this compromise setup, because he thought it gave too much power to military and medical decision-makers and too little to private sector scientists. He also warned it would dilute science funding by spreading it too far beyond the handful of Northeastern universities that had dominated American science before 1939. Kilgore, on the other hand, succeeded in establishing postwar science on more democratically accountable and geographically diverse foundations than would have happened without his leadership. He led the way in building a bipartisan consensus for U.S. preeminence in civilian as well as defense science. 

The growth and success of American science and technology, 1950-2010 

The model they created proved durable and successful. In the late 1950s, President Dwight Eisenhower and Congress boosted science funding dramatically in response to the Soviet Union’s Sputnik satellite launch. In the 1960s, President Lyndon B. Johnson advanced Kilgore’s vision by spreading science funding more fully across the nation and expanding support for social science research. President Richard Nixon declared war on cancer in the early 1970s. Congress allowed universities to hold patents on federally funded inventions in the Bayh-Dole Act of 1980. Presidents Ronald Reagan, George H.W. Bush, Bill Clinton, and George W. Bush all presided over large funding increases, especially for medical research. 

Federal science funding more than sextupled in inflation-adjusted terms from 1953 to 2010. University research spending rose by roughly 30 times over the same period. While critics argued in the 1960s that university research was too oriented to military applications, life science and medical research came to make up almost three quarters of university investment by the 2000s. University science research also became less concentrated in a handful of institutions. By 2010, the top 50 universities for research spending accounted for half of all spending by the higher education sector – a vast transformation from the late 1940s, when the top 15 easily accounted for more than half.  

Booming federal research investment and university science decisively contributed to business sector innovation. U.S. businesses – which relied on university science for more than 73% of the papers cited in their patents, according to one study – increased R&D spending by 16 times from 1950 to 2010 in inflation-adjusted terms.  

Data on local-level innovation shows the close connections between academic science and business innovation: Metro areas with high levels of university research see more business R&D, more patenting, more venture capital investment, and faster economic growth, based on Bush Institute-SMU analysis of data from the NSF and the U.S. Census. 

America has realized stunning benefits from products developed by U.S. industry from university science. These include novel antibiotics, chemotherapy agents and cell therapies for cancer, COVID-19 vaccines, GLP-1 agonist drugs for diabetes and obesity, fluoride toothpaste, pacemakers, modern water purification, the spreadsheet, the Google search engine, lithium-ion batteries, ChatGPT, and much more. Defense R&D has produced countless spinoffs for the civilian economy – like the internet, global positioning systems, and advanced prosthetics – in addition to contributing to America’s national security. 

Pulling away from peer countries 

America’s funding model differed from that of peer countries throughout the period from 1950 to 2010. The United States invested considerably more than peer nations in R&D. In 1990, federal funding exceeded that of Germany, France, the U.K., and Japan on a per capita basis by 35% to 84%, though each of these countries increased R&D rapidly from the 1950s to the 1990s. Though the United States still allocated more than half of government R&D funding to defense work as of 1990, federal nondefense research funding per capita still outpaced nondefense spending in Germany, France, and the U.K. by 10% to 100%, based on NSF data. 

In addition, the three-sided relationship between U.S. government, academia, and industry has long functioned differently than in peer nations. U.S. universities – including state institutions – have enjoyed far greater autonomy than most foreign peers. The German, French, and Japanese governments directly controlled most universities and classified researchers as civil servants during much of the 20th century. Even in the U.K., which has one of the world’s oldest systems of self-governing universities, government has exercised much tighter control since the 1980s. Greater autonomy plus intense competition among institutions has played a large role in fueling America’s edge in research excellence, economist Miguel Urquiola shows in his book Markets, Minds, and Money: Why the U.S. Leads the World in University Research. 

America’s federal grant system has emphasized merit-based grants to individual projects to a much greater degree than foreign peers, which rely more heavily on block grants to whole institutions. Block grants coupled with government control mean that European and Asian peers mostly allocate a larger share of funding on the basis of seniority, personal connections, and other nonmerit considerations. Virtually all peer nations have implicitly acknowledged the American system’s superiority by moving toward merit-based project grants, but large differences remain, as a landmark 2024 report on European innovation by former European Central Bank President Mario Draghi shows. 

America has benefited, moreover, from a freer, more decentralized market for intellectual property transfers from universities to private-sector firms than foreign peers. By the 2000s, most large U.S. research universities operated sophisticated technology commercialization offices. 

These differences together with much larger funding have put American science well ahead of foreign competitors. Before the 2010s, U.S.-based researchers far outperformed European and Japanese peers on measures of scientific excellence like citations in top professional journals and patents, my calculations from NSF data show. Between 1950 and 2010, U.S.-based scientists won more than twice as many science Nobel Prizes on a population-adjusted basis as European-based scientists and more than five times as many as scientists in Japan. 

America likewise performed far ahead of peer nations in creating innovative companies. As of 2010, the United States dominated the software, internet, and information technology services sectors. In biopharmaceuticals – an area of relative strength for Europe – the United States created more than twice as many new drugs as Europe between 2002 and 2012 and employed 60% more workers than Europe, adjusted for population, based on my calculations using European pharmaceutical figures. America’s innovation performance has translated into a stark advantage in value creation: The United States gave rise to some 250 new firms from 1975 to 2024 that came to be valued above $10 billion, compared with about 14 in Europe and 12 in Japan. (Estimates for Japan come from my calculations based on a ranking of firms by market capitalization.) 

America’s lead erodes, 2010-2025

Since 2010, America’s commanding science position has slipped. Federal R&D investment fell almost 10% in inflation-adjusted terms from 2010 to 2022. Federal R&D per capita is still above peer nations, but America’s lead has narrowed as other countries have stepped up investment. Federal nondefense R&D per capita is now below that of Germany and France, while university spending per capita has fallen below that of Germany and the U.K. As for economywide research including business R&D, the United States now invests less as a share of GDP than Germany, Israel, Japan, South Korea, Sweden, and Taiwan.  

The United States still outpaces all peers in business R&D on a per capita basis. But basic science research constitutes a smaller share of total R&D than it did throughout America’s years of greatest dominance, between 1950 and 2010. If Vannevar Bush is still right that applied R&D depends on basic science, slow growth in basic research today may lead to slower business innovation in future decades. Also, the large tech firms Wall Street knows as the “Magnificent Seven” – Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla – now account for fully one in every three business sector R&D dollars, even though they employ fewer than 1% of American workers. Stripping out these firms, America’s edge over Germany, Japan, and Korea in business R&D looks much more modest. 

The most potent challenge to U.S. preeminence in science and technology today comes from China. China has almost caught up with the United States in research investment – and has moved ahead if one adjusts for the lower salaries for Chinese scientists. China is ahead in patents and gaining fast on measures of basic science like paper citations in top professional journals. China now leads in 57 of 64 national security-relevant critical technologies, including quantum cryptography, drones, and hypersonic detection, according to an Australian study. China’s government has also launched new initiatives to attract top Chinese-born researchers now working in America back to China – with growing success.  

An American success story 

America’s decision to pursue global science preeminence – embodied in the policies developed by Vannevar Bush and Harley Kilgore and reinforced for more than 60 years by Congress and nine presidents – has succeeded by any measure. Since innovation takes place with long lags from the emergence of breakthrough ideas to their maximum commercial impact, the effects of today’s policies will play out over the next 20 to 40 years. If America chooses to build on the innovation model that made today’s prosperity possible, America will be well positioned to sustain its preeminence into the late 21st century and beyond. 
