
Building a Practical Quantum Computer

Quantum computation has a long road ahead.

Researchers have speculated about quantum computation for decades, but recent years have seen steady experimental advances, as well as theoretical proofs that it can efficiently do things that classical computing devices cannot. The field is attracting billions of dollars from governmental research agencies and technology giants, as well as startups. Conventional companies also are exploring the potential impact of quantum computing.

Despite this excitement, and in contrast to quantum sensing devices that have already found success, quantum computing has yet to make practical contributions. Moreover, there is still no winner among the very different schemes for physically implementing quantum bits, or qubits. None of them is yet ‘good enough’ to achieve supercomputer-scale calculations, and all face major barriers to achieving low error rates and large device counts.

Even optimistically, it could take many years to realize large-scale, error-corrected quantum computing. In the interim, researchers and companies are seeking uses that can exploit the small, less-reliable systems that already are available.


Quantum Promises

The number of possible states that a computational system can represent doubles with each additional bit, but a “classical” system can only be in one state at a time. In principle, a set of qubits can explore many more combinations simultaneously, since each can be a mixture of 0 and 1.
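To make that bookkeeping concrete, the short Python sketch below (an illustration added here, not drawn from the article; the variable names and the choice of n = 3 are arbitrary) contrasts the exponentially many classical bit patterns, only one of which is ever occupied, with the 2^n complex amplitudes needed to describe an n-qubit register.

```python
# Illustrative sketch: n classical bits have 2**n possible values but occupy
# exactly one of them; an n-qubit register is described by 2**n amplitudes.

import itertools

n = 3
classical_states = list(itertools.product([0, 1], repeat=n))
print(f"{n} classical bits: {len(classical_states)} possible states, one at a time")

# A single qubit in an equal superposition of 0 and 1 (amplitudes, not probabilities):
alpha, beta = 2 ** -0.5, 2 ** -0.5      # |alpha|^2 + |beta|^2 must equal 1
print(f"P(0) = {abs(alpha)**2:.2f}, P(1) = {abs(beta)**2:.2f}")

# The full description of an n-qubit register grows exponentially:
print(f"{n} qubits: described by {2**n} complex amplitudes simultaneously")
```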

Figure. IBM’s Q System One quantum computer.

Exploiting this “superposition,” however, requires qubit manipulations that end up, with very high probability, in a state that solves a target problem. Two such algorithms were devised in the 1990s by researchers working independently at Bell Labs. There have been few new proposals since, but one of those early algorithms, Peter Shor’s scheme for factoring large numbers, has driven sustained concern in cryptography.
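To see why Shor’s scheme worries cryptographers, the sketch below (my own illustration, not from the article) shows the classical reduction at its core: factoring N comes down to finding the period of a^x mod N. A brute-force loop stands in for the quantum period-finding step, which is exactly the part that is exponentially costly on classical hardware; the function names and the example N = 15, a = 7 are arbitrary.

```python
from math import gcd

def find_period_classically(a, N):
    """Smallest r > 0 with a**r % N == 1 (stand-in for the quantum subroutine)."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    """The classical wrapper of Shor's algorithm around period finding."""
    if gcd(a, N) != 1:
        return gcd(a, N)                 # lucky guess: a already shares a factor
    r = find_period_classically(a, N)
    if r % 2 == 1:
        return None                      # odd period: retry with another a
    candidate = gcd(pow(a, r // 2) - 1, N)
    return candidate if candidate not in (1, N) else None

print(shor_classical_part(15, 7))        # 3, a nontrivial factor of 15
```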

Such delicate manipulations are extremely difficult, however, even in the laboratory. The process fails unless the coherence between all the qubits is established and maintained precisely through the entire sequence of operations.

Candidate qubits—including atoms, ions, crystal defects, photons, and superconducting circuits—are sensitive to fluctuations in their environment, which force them prematurely into one or another classical state. Specialists in each field have been striving to reduce the error rates, which would allow more operations to be completed on more qubits before an error occurs.

A leading candidate is superconducting circuitry, which encodes the qubit in the collective motion of the electrons in a superconductor. These planar circuits can exploit scalable integrated-circuit manufacturing to make and interconnect two-dimensional arrays of qubits; devices with scores of qubits have been demonstrated by giants Google and IBM, as well as by the dedicated quantum-computing company Rigetti.


Excessive Errors

However, maintaining superconducting qubits requires cooling them to millikelvin temperatures—thousandths of a degree above absolute zero. Removing heat generated at these temperatures is very inefficient, requiring on the order of a billion watts to remove a single watt of heat, said D. Scott Holmes of Booz Allen Hamilton, who chairs the chapter on cryogenic electronics and quantum information processing of the International Roadmap for Devices and Systems (IRDS). Cooling technology is improving for large, centralized facilities. “For refrigeration there are economies of scale, so for cryogenic electronics it pushes you away from smartphones and toward large-scale applications,” Holmes said.
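A rough consistency check on that figure (my own back-of-the-envelope, not Holmes’): the ideal Carnot limit already demands roughly 15,000 watts of work per watt of heat lifted from 20 millikelvin to room temperature, and practical dilution refrigerators run orders of magnitude below that ideal. The 0.001% of Carnot assumed below is purely illustrative.

```python
# Carnot says work-per-watt = (T_hot - T_cold) / T_cold; real cryocoolers at
# millikelvin temperatures achieve only a small fraction of that ideal.

t_cold = 0.020    # kelvin, typical dilution-refrigerator base temperature
t_hot = 300.0     # kelvin, room temperature

carnot_work_per_watt = (t_hot - t_cold) / t_cold
print(f"Carnot limit: ~{carnot_work_per_watt:,.0f} W per W of heat removed")

assumed_efficiency = 1e-5   # illustrative assumption: 0.001% of Carnot
print(f"at that efficiency: ~{carnot_work_per_watt / assumed_efficiency:,.0f} W per W")
```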

Other qubit implementations could be better suited to distributed or portable applications. For example, trapped ions, being pursued by the likes of IonQ and Quantinuum (a recent merger of Honeywell Quantum Solutions and Cambridge Quantum), do not need ultra-low temperatures, although they do require a high vacuum to reduce collisions and keep ions from escaping. Semiconductor-style fabrication makes controlled traps for short chains of ions, said Kenneth Brown, professor of electrical and computer engineering at Duke University, who also advises IonQ. There are several schemes that use lasers to manipulate, cool, and probe the quantum states of the ions, even coupling widely separated ions, but implementing the optics at large scale will be challenging.


Error Correction

All the qubit candidates face major engineering and technical challenges to scaling up, which is one reason none is yet the clear winner. Still, “Over the last decade, all quantum-computing technologies have reduced the error by an order of magnitude,” Brown said. “I don’t see any hard limits yet.”

Nonetheless, error rates remain high. “If you look at the machines out there right now, you have about a 1% error rate,” said John Martinis, physics professor at the University of California, Santa Barbara. Lower error rates, 10⁻³ or even better, have been claimed, but Martinis worries they may not be representative. In any case, he said, “That’s just not good enough if you’re going to build a big computer.”
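To see why a 1% error rate is “just not good enough,” a simple model helps: if each gate fails independently with probability p, a circuit of g gates finishes without error with probability roughly (1 - p)^g. The calculation below is my own illustration, with arbitrarily chosen gate counts.

```python
# Success probability of an uncorrected circuit under independent gate errors.
for p in (1e-2, 1e-3):
    for g in (100, 1_000, 10_000):
        print(f"error rate {p:g}, {g:>6} gates: success ~ {(1 - p) ** g:.3g}")
```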

Microsoft has been sponsoring the search for quasiparticles that could serve as “topological qubits,” which should be much less prone to noise-induced errors. Unfortunately, coaxing electrons in a material to join together to form such quasiparticles has yet to be clearly demonstrated.

Large systems will need to implement quantum error correction. Unlike the situation for classical bits, simply measuring redundant qubits in the middle of a computation in order to correct them would erase the quantum information. However, also in the 1990s, Shor showed that adding extra qubits and encoding information in their relationships can correct errors without prematurely revealing the data.
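The flavor of that insight shows up already in the three-qubit bit-flip code, the simplest ancestor of Shor’s nine-qubit code. The toy simulation below is my own illustration, not the article’s (and a real code must also handle phase errors, which this one ignores): it encodes alpha|0> + beta|1> as alpha|000> + beta|111> and shows that measuring only the parities of qubit pairs locates a flipped qubit without ever reading out, and thereby destroying, the amplitudes themselves.

```python
def encode(alpha, beta):
    """Logical alpha|0> + beta|1> becomes alpha|000> + beta|111>."""
    return {'000': alpha, '111': beta}

def bit_flip(state, k):
    """Apply an X (bit-flip) error, or correction, to qubit k."""
    def flip(basis):
        return basis[:k] + ('1' if basis[k] == '0' else '0') + basis[k + 1:]
    return {flip(basis): amp for basis, amp in state.items()}

def syndrome(state):
    """Parities of qubit pairs (0,1) and (1,2); same for every basis state here."""
    basis = next(iter(state))
    return (int(basis[0]) ^ int(basis[1]), int(basis[1]) ^ int(basis[2]))

def correct(state):
    """Look up which qubit (if any) the syndrome points to and flip it back."""
    which = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(state)]
    return state if which is None else bit_flip(state, which)

logical = encode(0.6, 0.8)          # arbitrary normalized amplitudes
damaged = bit_flip(logical, 1)      # a bit-flip error hits the middle qubit
print(correct(damaged) == logical)  # True: the amplitudes survive untouched
```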


Current error rates, unfortunately, require a huge overhead of extra qubits for correction, in some cases 1,000 physical qubits for every protected “logical” qubit. A serious quantum computer would then need something like a million physical qubits, far more than the hundreds of qubits expected soon.

Even if superconducting circuits outpace Moore’s Law by doubling every year, in line with the broadly similar roadmaps publicized by IBM and Google, reaching the needed scale would take more than a decade. For other technologies, projections are even more speculative. “It’s really hard to create a roadmap,” Holmes noted, and there is limited consensus on the best metrics to assess progress.
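Plugging in the figures quoted above makes the timescale plain; the arithmetic below is my own, with the roughly one million physical qubits taken from the article and the starting point of about 100 qubits chosen as a rough stand-in for current devices.

```python
from math import ceil, log2

physical_needed = 1_000_000   # ~1,000 physical qubits per logical qubit, ~1,000 logical
today = 100                   # rough scale of today's superconducting chips

doublings = ceil(log2(physical_needed / today))
print(f"~{doublings} doublings needed, i.e. more than a decade at one per year")
```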


A NISQ Era?

Concerns about a decade-scale payoff have driven interest in nearer-term applications that can use noisy, intermediate-scale quantum (NISQ) circuits. Indeed, such devices enabled experimental demonstrations of “quantum advantage” over classical computers on a carefully selected computation (also called “quantum supremacy”). Google (in a project led by then-team-member Martinis) did the first such experiment with a superconducting circuit, in 2019, and a Chinese group claimed a quantum advantage for a photonic device in 2021.

Mathematician and computer scientist Gil Kalai of Israel’s Hebrew University of Jerusalem disputes these claims. Moreover, NISQ machines can only solve problems in “a tiny sub-class which is very, very primitive,” Kalai said. “To build quantum error correcting code is even more difficult.” Some physicists have also questioned whether quantum systems will be too chaotic to allow complex computations.

Although such deep skepticism is not widespread, quantum experiments do not yet demonstrate useful calculations with a performance or cost advantage over current supercomputers. “What size of device allows us to beat classical computers for problems of practical interest or scientific interest?” asked Brown. “We just don’t know.”

“No one has an entire business model around solving intractable applications using these computers right now,” said Google researcher Julian Kelly, who stressed that “our bet is on the long term and on error correction.” Much of Google’s research and development aims to work systematically toward that distant goal.

Nonetheless, many researchers and businesses also are exploring potential NISQ applications, as are government agencies. Several companies offer the public remote access to conventional electronics interfaced with quantum computers.

A prominent candidate application is calculating the electronic properties of molecules or solids. Indeed, quantum simulations of materials inspired physicist Richard Feynman’s early proposal for quantum computing, and small molecules and exotic quantum phases have been successfully modeled.

NISQ computers might also be used to speed optimization, for example in scheduling of resources. Scott Buchholz of Deloitte notes that, despite the uncertainties, many forward-looking companies are hedging their risks by exploring the technology’s capacity. “People are looking at how do we actually use these things to improve technical and business outcomes.”

For more than a decade, optimization has been the goal of the unusual quantum-computer company D-Wave. Unlike most systems, which process data with a series of “gate” operations, its machine performs simultaneous “annealing” of thousands of superconducting qubits to find the global optimum. Many experts still question whether its operation is truly quantum-mechanical or a real advance, however. “I am unable to find an application where D-Wave is cost-effective compared to traditional computers,” said Simson Garfinkel, co-author of the book Law and Policy for the Quantum Age.
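For context on what an annealer actually takes as input, such problems are typically cast in QUBO form, that is, minimizing a quadratic function of binary variables. The tiny example below is my own sketch (solved by exhaustive classical search, not by any D-Wave interface): it casts a number-partitioning problem in that form, and an annealer’s pitch is to search such cost landscapes at scales where brute force is hopeless.

```python
from itertools import product

weights = [4, 7, 1, 5, 3]    # goal: split into two groups with equal sums

def cost(assignment):
    """Squared imbalance between the two groups; zero means a perfect split."""
    diff = sum(w * (1 if x else -1) for w, x in zip(weights, assignment))
    return diff * diff

best = min(product([0, 1], repeat=len(weights)), key=cost)
print(best, cost(best))      # (0, 1, 0, 0, 1) 0 -- {7, 3} balances {4, 1, 5}
```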


Going the Distance

Continued large national investments in quantum computing are motivated in part by concerns about falling behind other countries. For example, government security organizations worry that factorization capability could break secure communications. In their recent book, however, Garfinkel and his co-author, legal expert Chris Hoofnagle of the University of California, Berkeley, note that symmetric AES encryption with 256-bit keys should withstand any plausible quantum machine.
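The reasoning behind that claim is standard cryptographic background rather than a quote from the book: Grover’s algorithm gives at most a quadratic speedup on brute-force key search, so a 256-bit key retains roughly 128 bits of effective security, still far beyond any foreseeable machine. The small calculation below simply spells out the numbers.

```python
key_bits = 256
classical_trials = 2 ** key_bits        # worst-case brute-force key search
grover_trials = 2 ** (key_bits // 2)    # quadratic (square-root) speedup at best

print(f"classical search: 2^{key_bits} ~ {classical_trials:.3e} trials")
print(f"Grover search:    2^{key_bits // 2} ~ {grover_trials:.3e} trials")
```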


They also note that if cost-effective applications are slow to arrive, funding for quantum computing could erode. One possible scenario is a “winter” for the field, like those that dampened enthusiasm for artificial intelligence in past decades.

“It is always possible that things won’t work out,” Buchholz acknowledged, adding, “There’s enough smart people and there’s enough money being invested in this that I’m more optimistic than not.” With so many distinct approaches to qubits, he said, “all it takes is one of them to work.”

Further Reading

International Roadmap for Devices and Systems, IRDS 2021: Cryogenic Electronics and Quantum Information Processing, https://bit.ly/3HaUWBx

Quantum Hardware Outlook 2020, Fact Based Insight, https://www.factbasedinsight.com/quantum-hardware-outlook-2020/

Hoofnagle, C.J., and Garfinkel, S.L.
Law and Policy for the Quantum Age, Cambridge University Press, November 2021, https://bit.ly/33MFYEg

Buchholz, S., Golden, D., and Brown, C.
A business leader’s guide to quantum technology, Deloitte Insights, 15 April 2021, https://bit.ly/3BJnilb
