SC15 is taking place at a time when high-performance computing (HPC) finds itself at an inflection point. We have come to understand the limitations of current technology and the necessity of exploring new directions more fully. We've been here before. In fact, the SC conference was founded in 1988, when HPC was in transition from machines using a small number of large processors to massively parallel systems using thousands of processors, a leap enabled by microchip technology. "The attack of the killer micros," they called it in the late 1980s.
Today, there's agreement that we're reaching the upper limits of the complementary metal-oxide-semiconductor (CMOS) technology underlying the integrated circuits in current supercomputers. Where we go from here promises to spark some lively discussion as the HPC community gathers in Austin Nov. 14–20. Such times serve as a reminder that HPC is a global enterprise and that progress can only come from collaboration among industry, government, and academia, a founding idea of the SC conference. The path forward is not crystal clear, yet it's full of intriguing possibilities, including technologies such as quantum and neuromorphic computing that only a few years ago seemed like science fiction.
The development of actual quantum computers is in its infancy, and the practice of using quantum computers for production work remains largely theoretical. Nonetheless, the potential usefulness of such systems seems evident: modeling quantum systems, for example, is hard to do on current digital computers. Quantum computers would also be useful for "needle in a haystack" searches through large quantities of data, or for optimization problems that seek the best combination of options under numerous constraints. Where quantum computing will take us is unclear. What is clear is that this is a field we need to explore for the long-term future of HPC.
Another emerging technology area we need to explore is neuromorphic computing: systems that mimic the human brain and represent a fundamental departure from traditional computer design. Neuromorphic computing has the potential to enable machine intelligence and new ways of doing science. Like the human brain itself, neurosynaptic systems require far less electrical power than their digital computing counterparts. The brain-like neural network design has the potential to be more efficient than conventional chips and shows promise for cognitive tasks such as pattern recognition. This technology could be a powerful complement to the effort to develop next-generation supercomputers.
Continued innovation in digital computing technologies also offers possibilities for the future. While there's a good deal of debate about the future of computing, a consensus is forming that future HPC will be heterogeneous: a mix of quantum, neuromorphic, and both current and future digital computing technologies. We don't know what that mix will look like, and each area brings with it many unanswered questions. That's why it's critical for these discussions to be taken up by the broad SC audience. Now is the time to think about the future of HPC, not just over the next few years, but beyond 2023–25, when we expect to introduce the first exascale supercomputers. It's difficult to prepare for the future when we don't know what it will look like, but it's important to begin building a framework for what we want to do in a decade and beyond.
As we have done at such crossroads before, we need to tap the HPC community's pool of talent and think boldly about the next generation of technology. We know there will be many different ideas about how we're going to get there. But let the debate begin. From the friction of competing ideas will come the spark that lights the way.
Dona Crawford is the Associate Director for the Computation Directorate at Lawrence Livermore National Laboratory and a past chair of SC.