Communications of the ACM

ACM News

The Quantum Computer Era Arrives

D-Wave's latest quantum processor contains 2,000 qubits.

Quantum computers are leading the pack of alternatives to pick up the baton when complementary metal oxide semiconductors reach their atomic limits.

Credit: Kim Stallknecht/The New York Times

The successor to complementary metal oxide semiconductors (CMOS) is one of the most researched areas in semiconductor technology, since CMOS is running out of gas at the atomic level.

Quantum computers are leading the pack of alternatives to pick up the CMOS baton. D-Wave and IBM, for instance, both offer cloud services that let you try out their quantum computer prototypes for free. However, IBM's recent 20-quantum-bit (qubit) digital, general-purpose (often called "universal") Q Network quantum computer, in contrast to D-Wave's 2,048-qubit optimization-only quantum computer, has sealed the deal: the next generation of computing will be the quantum-computer era.

"The drive to create the first quantum computer has been viewed as the new arms race. The milestone to reach is that of quantum supremacy, essentially the performance of computation that goes beyond the capability of the latest and best supercomputers in existence today," said Michela Menting, digital security research director at Oyster Bay, NY-based market research firm ABI Research. "The D-Wave uses a form of computation called quantum annealing, which is a type of analog quantum computer. These types of analog machines are not the same as general-purpose digital quantum computers, which is what IBM, Microsoft, and others are trying to build."

General-purpose digital or universal quantum computers can address a wide variety of problems too tough for any supercomputer. IBM's Q has attracted JPMorgan Chase, Daimler AG, Samsung, JSR Corp., Barclays, Hitachi Metals, Honda, Nagase, Keio University, Oak Ridge National Laboratory, Oxford University, and the University of Melbourne as charter members of the world's first family of universal quantum computers: IBM's Q Network (accessed from IBM's Cloud using an extensive library of application programming interfaces).

"We have successfully written code for IBM Q using the most popular quantum algorithms from the National Institute of Standards and Technology [NIST] library," said Stephan Eidenbenz, director of the Information Science and Technology Institute (ISTI) at Los Alamos National Laboratories (LANL). ISTI owns a D-Wave quantum computer, which only runs specialized annealing algorithms for optimization problems. "But building a universal quantum computer has many challenges, and can make use of many techniques, so it might be a decade before they settle on a standardized strategy. Full error correction is perhaps the biggest hurdle."

Anthony Annunziata, associate director of the new IBM Q Network, agrees that full error correction will be required to enable quantum computers to realize standardized circuitry. "IBM has the immediate goal of attaining the Holy Grail of quantum computing—universal fault tolerance for the broadest set of quantum-compatible applications."

D-Wave chief executive officer Vern Brownell, on the other hand, says his company's growing customer list, already exceeding 100, is proving that its annealing quantum computer can solve many types of problems too difficult for conventional digital computers today—by writing the program as an optimization problem. What's more, the annealing algorithm does not have the same need for error correction as universal "gate-level" quantum computers like IBM's Q.

"Our users have run hundreds of applications on D-Wave quantum computers, including sampling, simulation, and machine learning, all of which can be built on optimization algorithms," said Brownell. "Gate-level quantum computers [like IBM's Q] will need up to 1,000 qubits just to error-correct a single data qubit, because in gate models both 1 and 0 are energetic, whereas quantum annealing is naturally error-correcting since it minimizes to its lowest energy state of every qubit simultaneously, using quantum tunneling to speed up the process."

Brownell praised IBM for using "superconducting qubits with zero leakage resulting in picowatt power consumption," as does D-Wave. He also praised Microsoft's pioneering work in error correction, but lamented that it "depends on a particle that is predicted by physicists, but which has yet to be observed."

Researchers at Microsoft declined to be interviewed for this article, but according to Wikipedia, "while the elements of a topological quantum computer originate in a purely mathematical realm, experiments in fractional quantum Hall systems indicate these elements may be created in the real world using semiconductors made of gallium arsenide at a temperature of near absolute zero and subjected to strong magnetic fields." Wikipedia also explains that the Holy Grail of full error correction for topological quantum computers amounts to error prevention, which entails keeping its qubits far enough apart that they do not cause "random stray pairs of anyons" during operation.

Microsoft recently announced its Q# programming language (similar to its C# standard language). Other competitors on the hunt for a quantum computer include Accenture, Airbus Group, Alibaba Group, Atos Quantum, Booz Allen Hamilton, Google Quantum AI Lab, HP, Intel, Lockheed Martin, Mitsubishi Electric, Nokia Bell Labs, NTT Basic Research Laboratories, Raytheon BBN and Toshiba (with Cambridge Research Laboratory).

After Bell Labs developed a working transistor, many different implementation techniques were proposed, such as bipolar transistors, before CMOS became the standard. CMOS takes twice as many transistors as bipolar to realize a logic gate, the backbone of every electronic circuit. However, its impeccable noise immunity, due to its complementary push-pull signaling, realized the Holy Grail of silicon, and it endures to this day. The Holy Grail of quantum computers is likewise error-free operation.

Quantum computers may also solve the parallel-processing problem of automatically dividing programming problems among multiple processors. Asai Asaithambi, a professor in the School of Computing at the University of North Florida, suggests quantum computing will solve the automatic parallelization problem. "The big problem in multi-processor systems is deciding what to do when, but quantum computers provide a more compact model for supercomputing than manually parallelized programs. Quantum computers instead use the single-processor paradigm, which is well understood; the necessary parallelization then happens automatically, compared to multi-processing digital computers."

Asaithambi has also investigated the precise way in which small quantum computers, with properties such as entanglement, can act as coprocessors to today's conventional digital computers, together solving problems impossible for either alone. For instance, conventional computers are good at making educated guesses at the ground state of complex molecules, but when the starting state is far from the ground state, conventional computers can only come up with approximations. Small quantum computers, however, can quickly determine whether a proposed state is the ground state. With conventional computers making increasingly close guesses and small quantum computers quickly verifying or disproving them, the pair could bring the problem to convergence in record time. This approach is called variational quantum eigensolving.
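The guess-and-verify loop described above can be sketched classically. In the toy example below (an illustration, not code from Asaithambi's work), the Hamiltonian is a hypothetical 2x2 matrix, the trial state is parametrized by a single angle, and a coarse classical scan stands in for the optimizer; on real hardware, the quantum processor would estimate each trial energy.

```python
import math

# Toy 2x2 Hamiltonian H = [[a, c], [c, b]] standing in for a molecule
# with two basis states (values chosen arbitrarily for illustration).
a, b, c = 1.0, -1.0, 0.5

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for the trial state
    (cos theta, sin theta).  On real hardware the quantum coprocessor
    estimates this number; here we compute it directly."""
    ct, st = math.cos(theta), math.sin(theta)
    return a * ct * ct + b * st * st + 2 * c * st * ct

# Classical outer loop: scan the parameter and keep the lowest energy,
# standing in for an optimizer that proposes ever-better guesses.
best = min(energy(k * math.pi / 1000) for k in range(1000))

# Exact ground-state energy of the 2x2 matrix, for comparison.
exact = (a + b) / 2 - math.sqrt(((a - b) / 2) ** 2 + c ** 2)
```

By the variational principle, every trial energy is an upper bound on the true ground-state energy, so the loop converges from above; the payoff of the quantum coprocessor comes when the Hamiltonian is far too large for a classical machine to evaluate `energy` directly.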

There is no question the era of the quantum computer has begun, although there is wide variation in opinions as to how quickly it can be commercialized. "Most experts agree today that the creation of a quantum computer is simply a matter of engineering, and that its theoretical applications are possible, with optimistic estimates by the private sector varying between five and 15 years for commercialization, while more conservative estimates by academics put successful commercialization 15 to 25 years in the future," said Menting.

R. Colin Johnson is a Kyoto Prize Fellow who has worked as a technology journalist for two decades.
