Over the next few months, and culminating in the December 2025 issue, Communications will be publishing a series of articles as part of its special issue on Federal Funding of Academic Research.
Keeping the Dream Alive
By Eric Horvitz, Margaret Martonosi, Moshe Y. Vardi, and James Larus
The USDA-NIFA AI Institutes for Agriculture
By Vikram Adve
Since 2020, the U.S. Department of Agriculture's National Institute of Food and Agriculture (NIFA) has funded five National Artificial Intelligence (AI) Research Institutes. These AI Institutes share a vision: to initiate and propel the research, development, and deployment of AI tools and technologies across sectors of agriculture, and to prepare the next generation of an AI-ready agricultural workforce. In this article, we report on the use-inspired and foundational AI problems that these NIFA AI Institutes (or "Ag-AI" institutes, for short) are tackling, providing a sampler of technical outcomes as well as highlights of their broader impact, including education, extension, and workforce-development initiatives and community building. Collectively, these AI Institutes are laying a strong foundation to usher in a new AI era for agriculture. The advances reported here represent a clear example of using AI for societal good and for tackling a global grand-challenge problem: securing the future of our food production. They also highlight the importance of sustaining and bolstering the strategic U.S. federal investment in the AI Institutes program, which represents one of the strongest strategic partnerships to date between USDA NIFA and the U.S. National Science Foundation (NSF).
The LLVM Compiler Infrastructure
By Vikram Adve
This article will offer a brief overview of the LLVM Compiler Infrastructure project, the worldwide impact LLVM has had on research and practice across most major computing industries, and the importance of federal funding to the existence of this effort. It may not be an exaggeration to say that few software systems today are as widely used as LLVM. LLVM is used by nearly all software on iOS devices and by important software on Android (the two dominant mobile phone platforms), as well as by major software for cloud computing, desktop and server computing, gaming consoles, supercomputing, and, increasingly, artificial intelligence. LLVM has deeply influenced computing practice through numerous innovative programming languages (Rust, Go, Halide, CUDA, OpenCL, Julia, and others), shader compilers for OpenGL, query optimizers in database systems, WebAssembly front ends, hardware design languages, and even quantum computing. And LLVM has democratized compiler research and education, enabling production-quality compiler projects with relatively little effort. The research on LLVM and the development of the open source infrastructure were made possible by federal funding, especially from the National Science Foundation, with additional funding from several other federal agencies.
Only Durable Engines of Discovery: The Impacts of U.S. Federal Research Funding in Computing and Computational Science
By Jack Dongarra
This article will explore the profound impacts of U.S. federal research funding on the fields of computing and computational science over the past century. It highlights how sustained governmental investment has driven critical innovations, establishing computing as an indispensable third pillar of scientific discovery alongside theory and experimentation. The article details foundational contributions, including the development of numerical software libraries such as EISPACK, LINPACK, and LAPACK, and of the MPI message-passing standard, which together have enabled high-performance computing (HPC) to advance significantly. It underscores how federally supported initiatives not only spurred technological breakthroughs and economic growth but also cultivated a robust, interdisciplinary workforce essential for future innovations. Additionally, the article emphasizes the strategic importance of HPC for national security, pointing out current challenges posed by global competition and by underinvestment in software sustainability. It concludes by recommending a strategic, coordinated approach to future federal funding, stressing the importance of continued investment in infrastructure, software development, workforce training, and international competitiveness to sustain U.S. leadership in computational science.
Federal Funding of Public Key Cryptography
By Martin Hellman
U.S. federal support for academic institutions is currently being cut in ways that will adversely affect both the U.S. economy and national security. While changes are probably needed, they should be made carefully, taking into account the positive economic impact that federal support has already had on the development of the Internet and other major technologies. This paper covers the key role that federal support played in my work on public key cryptography, which won the 2015 ACM Turing Award. It also explains the technology itself.
A Retrospective View of the UT-Austin TRIPS Project
By Stephen Keckler
The UT-Austin TRIPS research began in 1999, stemming from observations about the expected effects of semiconductor scaling trends on computer architectures. Initially supported by modest grants from the U.S. National Science Foundation, the TRIPS project grew over the subsequent five years into a substantial research program funded by the U.S. Department of Defense, the University of Texas at Austin, multiple computer companies, and a private foundation. This partnership enabled our team to develop novel computer architectures and compiler technologies that demonstrated the viability of highly parallel chip designs composed of distributed processing and memory components. This article traces the history and ultimate impact of the project.
Natural Language Processing: How NSF Supports the Cyclic Development of Ideas
By Kathleen McKeown
Tremendous advances in the capabilities of large language models (LLMs) in recent years have made them a component of many of today's most exciting technologies, from AI content authoring to deep-research information-synthesis tools. Although the evolution of today's LLMs began with a shift toward data-driven, empirical methods more than 30 years ago, their astonishing capacities are the result of multiple paradigm shifts over that period, many of which were made possible by the unique nature of National Science Foundation (NSF) funding. The visionary Natural Language Processing (NLP) research funded through NSF has led to technology with major technical and social impact in areas including mental health, global information access, education, and the physical sciences. Through individual grants, interdisciplinary convenings, and larger special initiatives like the AI Institutes, NSF funding has supported transformative research as well as multiple generations of students who go on to lead new research at universities or enter industry and drive further innovation. These students, whose education combines decades of distilled NLP expertise with their unique talents and perspectives, are perhaps NSF's most important legacy. The loss of NSF support in any of these areas threatens the United States' global leadership in NLP research and technology for many years to come.
How the U.S. National Science Foundation Enabled Software-Defined Networking
By Nick McKeown
It became clear in the early 2000s that the Internet faced challenges with ossification, slow innovation, and complex network management due to vendor-controlled hardware and software. Software-Defined Networking (SDN) emerged as a transformative solution, introducing an open interface for packet forwarding and logically centralized control. Its success stemmed from a virtuous cycle between early U.S. National Science Foundation (NSF)-funded academic research (e.g., 100×100, GENI, FIND) and the pressing needs of cloud hyperscalers for flexible, scalable networks. SDN revolutionized network design and operation across public and private sectors, driving significant commercial adoption, fostering a broad research community, and enabling new capabilities like multi-tenant virtualization and efficient wide-area traffic engineering.
Working Together and Working at a Distance
By Judith Olson
Thirty years of research on how teams of office workers do their work, both when they are collocated and when they are remote, produced findings that made a difference in the world. Through laboratory experiments, the development of new technologies, the design and construction of a computer-supported meeting room, and extensive observations of work in the field, we developed an understanding of how to help teams work together effectively. This knowledge has been disseminated through tools and advice to both collaborating scientists and everyday office workers. The findings about how to work at a distance were especially valuable when COVID-19 hit and people were working from home, as many still are.
The Innovation Engine: Government-Funded Academic Research
By David Patterson
Five government-funded academic research projects, spanning one career at one university, delivered economic benefits across nearly 90% of U.S. states, generated new product sales roughly 10,000 times the U.S. government's funding (returning more than 1,000 times the investment in taxes), and trained generations of innovators.
From Research Labs to Global Impact
By Magda Balazinska, Urs Holzle, Jeff Dean, and Parthasarathy Ranganathan
This article summarizes a conversation between Magda Balazinska and three senior Google engineers (Urs Holzle, Jeff Dean, and Parthasarathy Ranganathan) on the impact of academic research on companies like Google. The wide-ranging discussion highlights academia's foundational role in Google's inception, the origins of key innovations like PageRank, and future opportunities for academic research, and it underscores the two-way relationship between academia and industry in driving technological advancement.