Over the past two years—and building on the work and findings of the past 15 years—the TLSI convened five Dialogues under the leadership of its co-chairs Dr. Patricia Falcone, Deputy Director, Science and Technology at Lawrence Livermore National Laboratory; Dr. Sally Morton, Executive Vice President, ASU Knowledge Enterprise, Arizona State University; and Dr. Steven H. Walker, Vice President and Chief Technology Officer, Lockheed Martin. Participants—including CTOs and technology experts from technology-intensive industry sectors, universities, national laboratories, and the federal government—discussed the unprecedented pace and scale of today’s technological advancement and the effects on U.S. competitiveness.
In the past decade, the technology landscape has shifted radically along numerous
dimensions. Change is accelerating to unprecedented speed, and the United States
faces its strongest challenger ever in the technology arena.
The TLSI documented this shift in a significant 2015 study conducted with Deloitte—the Advanced Technologies Initiative—which provided insights on U.S. and global innovation trends and highlighted the challenges U.S. innovation stakeholders face in maintaining or improving their tech-based innovation competitiveness. The study revealed how nations beyond the United States and the European Union—namely, China—were dramatically transforming their innovation investments and national growth strategies. The research and survey work also marked a dramatic change over a very short period: the TLSI leaders’ perception that the European Union would be the most innovation-competitive rival flipped, with China now at the top of the list, and with data demonstrating China’s ambitions for the first time.
The effort reflects how the United States’ top tech and innovation leaders were charting unknown territory brought about by a major technological discontinuity, creating great uncertainty about the future. These disruptive changes carry a powerful duality: they hold the promise of heretofore unimaginable opportunity, but they have also shaken and threatened the world order.
These revolutionary technologies include advanced digital and telecommunications technologies, biotechnology, hypersonics, autonomous systems, quantum, and the apex technology of artificial intelligence. These technologies are reshaping and driving the global economy, military capabilities, and the global competitive battleground. They are the platform technologies from which the industries that will underpin the future are now arising.
Consider, as an example, how AI is fundamentally transforming the healthcare industry by significantly accelerating drug development and enhancing diagnostic accuracy. The Food and Drug Administration has already approved more than 800 AI/ML devices, and the number of AI-discovered drugs in clinical trials has surged from 17 in 2020 to 67 in 2023. In just one example, a firm using AI took just 18 months and $3 million to identify a drug candidate for treating pulmonary fibrosis, a process that normally would have cost $430 million in out-of-pocket expenses and taken three to six years.
Artificial Intelligence is also similarly revolutionizing materials development. By rapidly identifying and simulating millions of materials for specific applications, AI drastically reduces the time required for discovery, as was the case with a collaboration between the Pacific Northwest National Laboratory and a leading software company, which analyzed over 32 million inorganic materials and identified 18 promising candidates for batteries in under 80 hours. This task traditionally would have taken years.
The private sector’s investment in R&D is a critical advantage for U.S. innovation and economic competitiveness—as it supports the nation’s ability to develop and deploy applied work to the marketplace.
Federally funded R&D investments serve a different but equally important role in the nation’s tech and innovation ecosystem. Typically, this federally funded work is long-term in orientation, fundamental in its approach, and non-appropriable by any one company or sector; it serves as the basis for the applied and development work that leads to innovative outcomes over many years. Federally funded basic research—like that which led to the internet and countless other innovations—is the feedstock for future industries and businesses.
The concern over a structural decline in the intensity of this particular class of investment, even as the overall economy has been historically strong, is therefore real. Without a strong national commitment to support basic research, the country is underfunding and putting at risk its long-term innovation capacity and capability.
A significant step toward enhancing the U.S. position on the global stage for the middle and second half of the 21st century would be to reverse the trend of structural decline in federally funded research intensity—moving the nation back up to historically high levels of near two percent of GDP. This would benefit both defense and non-defense research efforts.
While the federal government cannot singlehandedly drive innovation in the United States, it can co-create with the private sector a strategic vision and prioritize key initiatives for investment and action. In doing so, the United States can achieve global leadership in the dual-use technologies that will platform the “next economy”—from transformational computing (e.g., AI and quantum), to advanced energy solutions (e.g., small modular reactors (SMRs)), to advanced biology (e.g., bioscience, biotechnology, and biomanufacturing).
The CHIPS and Science Act is a good example of how federal investment in R&D can spur significant private investment and economic activity. The CHIPS Program Office (CPO) has announced $32.54 billion in grant awards and up to $5.5 billion in loans, distributed among 32 companies involved in 48 projects across 23 states. These projects have lowered the financial risks associated with large-scale private investments, catalyzing a projected total investment of over $380 billion over the next two decades, with a significant portion expected by 2030.
The takeaway is that as technological advancements accelerate annually and as global competitors scale tech-based innovations at blistering speeds, the United States cannot solely rely on private enterprises—often dominated by a small group of technology firms—for innovation. Instead, the country must deploy cutting-edge technologies across all sectors of the economy and expedite innovation. Business, government, academia, and national laboratories must all be empowered to move more quickly to test, validate, and scale innovations, ensuring every sector of the U.S. economy and defense benefits from the most advanced products, services, and technical solutions.
China seeks to supplant the United States as the world’s economic, technological,
military, and geopolitical leader. It has put technology and innovation at the center of
its economic, military, and geopolitical strategies. Chinese President Xi said, “Scientific and
technological innovation has become the main battlefield of the international strategic
game…”
China seeks to become a world S&T superpower and to use this technological superiority for economic, political, and military gain. Beijing is implementing a whole-of-government effort to boost indigenous innovation and promote self-reliance, and is prioritizing advanced power and energy, AI, biotechnology, quantum information science, and semiconductors. Beijing is trying to fast-track its S&T development through investments, intellectual property acquisition and theft, cyber operations, talent recruitment, scientific and academic collaboration, and illicit procurements.
Annual Threat Assessment of the U.S. Intelligence Community
Office of the Director of National Intelligence
February 2024
The PRC is the only competitor with both the intent to reshape the international order and, increasingly, the economic, diplomatic, military, and technological power to do it. Beijing has ambitions to create an enhanced sphere of influence in the Indo-Pacific and to become the world’s leading power. It is using its technological capacity and increasing influence over international institutions to create more permissive conditions for its own authoritarian model, and to mold global technology use and norms to privilege its interests and values.
National Security Strategy
The White House
October 2022
In 2009, China invested $183 billion in R&D. By 2022, its investment had increased to $686 billion (constant dollars)—an increase of roughly 275 percent. It is using every tool in its arsenal to build a science and technology capability rivaling that of the United States—pursuing aggressive plans for every strategic critical technology, backed by hundreds of billions of dollars in investment. This includes a multi-pronged strategy to acquire technologies from other countries—especially the United States. As one illustrative example of China’s strategic focus on global tech leadership, China is rapidly expanding its seabed mining capability—the seabed is a critical source of the rare metals necessary for producing electronics, clean energy products, and microchips—while setting up institutes on deep-sea research and dozens of colleges on marine sciences. President Xi has directed that China “master key technologies for entering the deep sea.”
China is also spreading its global influence in the technology landscape, aiming to shape large swaths of the global economic and trading system and write the rules of the 21st century economy in its state-directed model. It is using its growing role in multilateral institutions—such as the UN’s scientific agencies, WIPO, and international standards-setting bodies—to help achieve its geopolitical goals. It seeks to bring other nations into its sphere through efforts such as the Belt and Road Initiative, the Digital Silk Road, and the Maritime Silk Road—long-term strategies to forge lasting global partnerships rooted in technology entanglement.
While the scientific community generally views the free and open exchange of information as vital to scientific research, China has employed a variety of mechanisms to influence and exploit the openness of the U.S. research enterprise. These include foreign talent recruitment programs, forming partnerships with U.S. research universities, setting up research centers in the United States, financing joint research programs, and sending students to the United States for science and engineering graduate studies. Instances uncovered include U.S. researchers failing to disclose foreign funding and associations; theft of intellectual property; and violations of the peer review process by sharing confidential grant applications. Congress and the Executive Branch instituted new disclosure requirements on applicants for federal R&D funding, especially regarding foreign support, and specific policies governing federal employee and grantee participation in foreign talent recruitment programs. Increased scrutiny of international collaboration is placing significant pressures on universities. There is a growing challenge in finding a balance between fostering international R&D partnerships and safeguarding U.S. technology. Additionally, there is a need to address the expectations of U.S. taxpayers, who expect to see tangible benefits from public R&D investments made in U.S. universities and national laboratories.
The CHIPS and Science Act, passed in 2022, appropriated $50 billion for: financial assistance to establish semiconductor fabrication, assembly, testing, advanced packaging, or R&D in the United States; a new National Semiconductor Technology Center; a National Advanced Packaging Program; microelectronics metrology research; and Manufacturing USA institutes on semiconductor manufacturing. Two hundred million dollars was provided for workforce education and training, $2 billion for a Department of Defense National Network for Microelectronics Research and Development, and $500 million for international technology and supply chain security and innovation activities.
When the pandemic prompted nationwide lockdowns, millions of white-collar workers transitioned to telework almost overnight, compelling companies to rapidly reengineer work processes, communications, and management structures. Digital strategies initially planned for months or years were implemented in days, and the home delivery sector expanded its workforce by hundreds of thousands. Organizations across various industries adapted by implementing new safety protocols and retooling production: distilleries pivoted to produce hand sanitizer, sports equipment manufacturers crafted face shields, and fashion houses sewed masks. Companies also modified operations to meet consumer needs, repurposing hotel spaces to accommodate medical workers and enhancing telehealth services. The research community and regulators also mobilized swiftly, developing tests and vaccines in 100 days—a previously unprecedented pace.
These include: artificial intelligence, quantum information science, advanced communications, microelectronics, nanotechnology, high-performance computing, biotechnology and biomanufacturing, robotics, advanced manufacturing, financial technologies, undersea technologies, and industrial space.
U.S. defense capabilities are being reshaped by dual-use emerging technologies and game-changing technology-enabled concepts such as artificial intelligence, machine learning, autonomy, next-generation communications, spectrum technologies, space, biotech, and digital technologies that weave defense platforms together for different mission applications and changing battlefield conditions. Leadership in many of these dual-use technologies resides in commercial firms, high-tech start-ups, universities, and national laboratories. The U.S. Department of Defense—for example, through President Trump’s April 9, 2025, Executive Order on modernizing defense acquisitions—and the defense primes are making plans to tap highly innovative commercial firms, small businesses, and start-ups to bring advanced technologies to military systems. But the commercial sector is moving so fast, and the investments are so big, that the defense industry cannot keep up.
While amazing technology is being developed across the whole U.S. ecosystem, it can take years for it to have its intended impact on national security. Recently, the U.S. Government Accountability Office found that the Department of Defense continues to struggle with delivering innovative technologies quickly. Recent reforms were intended to lead to faster results, but slow, linear development approaches persist. GAO found that leading commercial companies deliver complex, innovative products with speed through iterative cycles of design, development, and production. By contrast, the average major defense acquisition program (MDAP) that has yet to deliver initial capability plans to take over 10 years to do so. Cycle time is increasing: GAO found that, for MDAPs that have delivered capability, the average time to do so increased from 8 years to 11 years—an average slip of 3 years from their original planned dates.
The “valley of death,” a term universally disliked, remains a persistent bottleneck in the U.S. innovation system, preventing many potentially valuable innovations from reaching the marketplace, slowing their progress toward commercialization, and keeping many start-ups from a pathway to growth. In the valley of death, companies cannot obtain the capital needed to prototype, demonstrate, test, and validate their innovations—steps that lower risk and generate the performance and cost data needed to attract commercial financing. This occurs both when technologies arise in the start-up sector and when they are transferred or “spun out” from universities into the private sector for application and commercialization. The federal government has made efforts to bridge the valley of death, for example, by funding an extension of Phase II Small Business Innovation Research program grants and by providing funding for prototype development and pilot demonstrations.
The CHIPS and Science Act established a new NSF Directorate on Technology, Innovation,
and Partnerships. This historic—and, as of this writing, still fragile—initiative
expands the Foundation’s mission, with NSF now tasked with fostering technology
development, innovation, and the growth of regional innovation ecosystems.
Going forward, cities, states, and regions should double down and build on efforts to attract private sector engagement and to coordinate more local, “place-making” ecosystem building efforts to leverage these past investments.
Moving at a blistering pace, SpaceX is disrupting a market dominated by governments for a half-century—transforming satellite launch, space exploration, and the industry’s ecosystem. It launched its workhorse Falcon 9 rocket 134 times in 2024. That year, the Falcon 9 completed 52 percent of all global orbital rocket launches and delivered 84 percent of total mass to orbit. At an event with the Center for Strategic and International Studies (CSIS), SpaceX President and COO Gwynne Shotwell stated that the company was aiming for 175 to 180 launches in 2025. With its heavy-lift rocket Starship undergoing a test flight campaign, SpaceX aims to send 100-ton payloads to the moon and Mars for $10 million a trip.
After 60 years of work, on December 5, 2022, researchers at Lawrence Livermore National Laboratory’s National Ignition Facility achieved—for the first time anywhere, by any approach—a fusion experiment that produced more fusion energy than the laser energy required to trigger the reaction, a huge advance for the field, since repeated in no fewer than four subsequent experiments. Nuclear fusion has the potential to deliver an inexhaustible supply of cheap, clean energy. The United States has at least 25 companies working on different fusion concepts, and attracts most of the investment. In 2023, the Department of Energy awarded $43 million to eight of these companies to fund R&D and deliver, within 18 months, an early fusion pilot plant design. Many commercial companies are targeting the early 2030s for putting fusion energy on the grid, and a few start-ups have even more aggressive timelines.
In addition, other efforts to sustainably use and expand the country’s energy sources—including advanced nuclear R&D for modular and Generation IV reactors, and enhancements to U.S. energy infrastructure—are underway. The Council has called for the nation to launch a “Nuclear Energy Moonshot” to accelerate next-generation nuclear technologies and turbocharge the production of clean, baseload energy.
In late-2022, a generative AI model—ChatGPT—was released to the public, reaching
1 million users in five days, and 100 million users in two months. According to
BOND’s June 2024 report, this was the fastest user ramp ever for a standalone product;
and generated the fastest software ramp ever (OpenAI hit a $2 billion revenue
run rate in the first full year post-launch of ChatGPT).
The advance of AI will drive the biggest and fastest technology disruptions in history. Businesses, researchers, educators, government officials, and others are beginning to experience disruption as AI begins to transform the relationships between humans and machines, shatter the time and cost calculus for a widening array of human endeavors, rewrite the process of scientific discovery, and drastically alter military capabilities and the very character of war. It could drive a collapse in some product life cycles, supercharge the forces of creative destruction, and propel a leap in productivity. In a recent study, top economists estimated that we could see a near doubling of output after 20 years from an AI-enabled productivity growth rate 44 percent higher than the baseline projections of the U.S. Congressional Budget Office.
On January 20, 2025, Chinese AI development firm DeepSeek disrupted the broad belief that the United States was the undisputed global leader in AI. DeepSeek released its R1 LLM at a tiny fraction of the development cost, and with a far smaller workforce, than OpenAI and other competitors, while providing its R1 models under an open source license, enabling free use. DeepSeek promises to improve the efficiency and speed of search.
Many of the world’s largest companies by market cap—Amazon, Alphabet, Meta, Apple, Microsoft, Nvidia, etc.—are competing fiercely for leadership in AI. And we are seeing some of the biggest injections of capital into a specific technology in the history of Silicon Valley. In the first quarter of 2024 alone, Microsoft spent $14 billion, Google spent $12 billion, and Meta spent more than $6 billion. They all increased their spending projections for the year ahead. These three companies, along with Apple, are the top R&D spenders in the world.
The AI boom is rapidly increasing demand for compute power, placing pressure on American data centers and the supply of electricity that powers them. The IEA reported that data centers’ total electricity consumption could double to more than 1,000 terawatt-hours by 2026. That is roughly equivalent to the electricity consumption of Japan. IEA forecasts that electricity consumption from data centers in the European Union in 2026 will be 30 percent higher than 2023 levels. By 2033, power demand from Europe’s data centers could be equivalent to the total power consumption of Portugal, Greece, and the Netherlands. Ireland’s AI-related consumption could reach 32 percent of the country’s total electricity demand in 2026. In the United States, a new study reports that data centers could consume up to nine percent of electricity generation by 2030, more than double the consumption today.
Due to surging AI-driven demand for electricity, utilities predict the United States will need the capacity of 34 new nuclear power plants in the next five years. To meet their needs, major technology companies like Microsoft and Amazon are reviving old nuclear facilities like Three Mile Island and signing long-term, exclusive power purchasing agreements with utilities, as well as investing in next-generation nuclear reactors. Recently, Google announced a deal to source energy from small modular reactors (SMRs) being developed by Kairos Power, while Amazon revealed investments in four SMRs operated by Energy Northwest to support data centers in Oregon. Oracle is also designing an AI data center to be powered by three SMRs. The first next-generation reactors are anticipated to be operational in the early 2030s.
For example, the United States leads in the development of nuclear energy technologies, but it has fallen behind China and Russia in deployment. As of April 2024, China had 23 commercial reactors under construction (another estimate puts the number at 30), while the United States had none, though it brought a new reactor online at Plant Vogtle in Georgia in 2024. The United States has the largest nuclear fleet, with 94 reactors, but it took nearly 40 years to add the same nuclear power capacity China added in 10 years. China is also rapidly building the world’s first onshore small modular nuclear reactor, scheduled for operation in 2026.
Rules for the 21st century technology-driven global economy, technology standards, and regulations on powerful emerging technologies are being set in international institutions, with competing visions and values on what these models should be. The United States is deploying a new technology statecraft and working with allies and like-minded nations to ensure these new rules of the road adhere to free market principles and democratic values.
For example, the U.S. Department of State established a Special Envoy for Critical and Emerging Technology to cooperate with allies and partners on critical and emerging technologies; lead planning for international technology diplomacy to support national security priorities; and coordinate policy around new global technology developments including in AI, quantum, and biotechnology. The U.S.-EU Trade and Technology Council, formed in 2021, is focused on transatlantic cooperation on the development and deployment of new technologies such as AI, 6G, quantum, and biotech based on shared democratic values, including encouraging compatible standards and regulations. Pillar II of AUKUS—a trilateral security partnership for the Indo-Pacific Region between Australia, the United Kingdom, and the United States—aims to improve joint capabilities and interoperability in cyber, AI, quantum, and undersea capabilities. The recently launched NATO Defense Innovation Accelerator for the North Atlantic is backed by joint funds supporting competitively awarded grants and accelerators to develop technologies that, if successful, can move to the warfighter, NATO nations, or the industrial base.