ACM News
## Quantum Computing with Exponentially Fewer Errors

In an ideal world, a quantum computer with as few as 100 qubits would suffice to outperform all classical computers on certain classes of computational tasks. Google's Sycamore quantum processor, which has 53 qubits, is nowhere near accomplishing this.

The reason is that existing qubits, the computational building blocks of a quantum computer, are extremely sensitive to outside interference. Any magnetic or thermal disturbance will cause the qubit to change its internal state randomly; the classical analogy would be that all the bit values in the working memory of a microprocessor keep flipping from 0 to 1 and back, unpredictably.

As a result, quantum computing is still limited to computations taking no more than a few microseconds.

However, a team of Google researchers reported in *Nature* that they have successfully implemented quantum error correction on 21 qubits on the Sycamore chip, with huge potential for extending computation time.

Quantum error correction has been done before on smaller arrays of qubits, and on other types of qubits than the superconducting transmon qubits that Sycamore uses.

"This is a strong answer to those skeptics who say a quantum computer will never work in real life," says computer scientist Koen Groenland of QuSoft, the Dutch research center for quantum software in Amsterdam, who was not involved in this research. In Groenland's view, the key achievement of the Google team was showing that the probability of errors decreased exponentially when using more qubits. That means even large qubit circuits can be made resilient to error by adding a modest number of extra qubits. Said Groenland, "That is a great perspective, because to keep a quantum computer working for hours, the error frequency must decrease from once per microsecond to a few per day, or even per week."
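The power of exponential suppression can be seen in a toy calculation (the suppression factor and starting error rate below are illustrative assumptions, not figures from the Google paper): if each step of enlarging the error-correcting code cuts the logical error rate by a constant factor, the rate plummets with only a modest number of extra qubits.

```python
# Toy illustration of exponential error suppression. LAMBDA and
# p_initial are assumed illustrative values, not data from the paper.
LAMBDA = 3.0       # assumed suppression factor per code-enlargement step
p_initial = 1e-2   # assumed logical error rate of the smallest code

for steps in range(5):
    p_logical = p_initial / LAMBDA ** steps
    print(f"enlargement steps: {steps}, logical error rate ~ {p_logical:.1e}")
```

After only four enlargement steps, the assumed error rate has already dropped by nearly two orders of magnitude, which is why a modest number of extra qubits goes such a long way.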

Error correction is used in classical computing as well, for instance through redundant bit coding or by performing parity checks at some stages of the computation. Error correction on qubits needs to be more subtle, however. A qubit can be in a 'superposition' of 0 and 1 at the same time, and multiple qubits can be 'entangled', which means they become a single, complex entity. This is the quantum magic that enables N qubits, in principle, to represent up to 2^N classical bits.
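The classical techniques mentioned above can be sketched in a few lines (a minimal example of redundancy and parity checking, not Google's quantum scheme):

```python
# Minimal sketch of classical error correction (not the quantum scheme):
# triple redundancy corrects a single flipped bit by majority vote.

def encode(bit):
    """Store one logical bit redundantly as three physical bits."""
    return [bit, bit, bit]

def correct(bits):
    """Majority vote recovers the logical bit despite one flip."""
    return 1 if sum(bits) >= 2 else 0

word = encode(1)
word[0] ^= 1                # a disturbance flips one of the three copies
print(correct(word))        # the logical value survives: prints 1

# A parity check detects (but cannot locate) a single flip in a word:
data = [1, 0, 1, 1]
parity = sum(data) % 2      # parity bit stored alongside the data
data[2] ^= 1                # one bit flips
print("error detected:", sum(data) % 2 != parity)   # prints True
```

For qubits, neither trick works directly: copying a quantum state is forbidden by the no-cloning theorem, and reading out a parity naively would collapse the superposition, which is why the quantum version described below is more subtle.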

Yet as long as a quantum computation is going on, one cannot actually check the values of the qubits, as that would destroy their complex quantum state, leaving only noise. Quantum error correction is done by adding 'measure' qubits to the original data qubits. In one error-correction cycle, the measure qubits compare various combinations of data qubits; the values of these measure qubits can then be read out, after which they are reset. This allows the detection of error events in the data qubits without actually detecting their individual values. In the experiments described in *Nature*, one cycle lasted about one microsecond, but error correction was repeated for up to 50 cycles.
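The measure-qubit idea can be simulated classically for the simplest case, a bit-flip repetition code (a sketch only: this toy model captures bit flips on classical values, not the superpositions a real device protects). Each 'measure' position records the parity of two neighboring data positions; the pattern of parities that change pinpoints an error without ever reading the data values themselves.

```python
# Sketch of the measure-qubit idea for a bit-flip repetition code.
# Classical simulation only: real qubits are also protected while in
# superposition, which this toy model cannot show.

def syndrome(data):
    """Each 'measure' position holds the parity of two neighboring
    data positions -- analogous to reading out the measure qubits
    without learning the data values themselves."""
    return [data[i] ^ data[i + 1] for i in range(len(data) - 1)]

data = [0, 0, 0, 0, 0]        # five data positions, logical 0
reference = syndrome(data)     # all parity checks agree initially

data[2] ^= 1                   # an error flips the middle data position
changed = [i for i, (a, b) in enumerate(zip(syndrome(data), reference))
           if a != b]
print("parity checks that changed:", changed)   # prints [1, 2]

# The two flipped checks are exactly the ones touching data position 2,
# so the error can be located and corrected without reading the data.
```

Repeating this read-out-and-reset cycle over and over, as the Google team did for up to 50 cycles, keeps catching errors as they occur.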

An error-correcting unit with measure and data qubits constitutes one 'logical qubit', the building block for implementing and executing quantum algorithms. Yet significant hurdles remain to achieving practically useful logical qubits. Francesco Battistel, a specialist in quantum error correction at QuTech in Delft, the Netherlands, points out that in these experiments, error suppression was only demonstrated in a one-dimensional chain of entangled qubits. In that configuration, each qubit has only two nearest neighbors, but the chain can correct only one type of error at a time, while a practically useful logical qubit needs to be protected from all types of errors. The most promising approach is to build a logical qubit from a two-dimensional 'surface' array of qubits. The price is that each qubit then has four nearest neighbors, with greater potential for error propagation.

The Google team implemented error correction on a 7-qubit surface array in Sycamore, but could not extend those results to larger arrays. For error correction to be successful, the data qubits must have a minimum level of stability, which also depends on the configuration. This minimum has been reached for a long chain of qubits on the Sycamore processor, but not yet for a large surface configuration. Even error correction on the 21-qubit chain was hampered somewhat by crosstalk: the long chain lies folded on the square processor, so some qubits that are far apart along the chain are physically nearest neighbors.

Google has "a great chip, but they still have to solve a few problems," says Battistel, "and crosstalk is certainly one of them."

The Google team also reports a problem that other researchers have warned against recently: cosmic rays can 'poison' superconducting transmon qubits. Cosmic radiation consists of high-energy gamma photons that can penetrate several meters of shielding. If such a photon hits a transmon chip, it does not just flip the quantum states of some qubits; it takes the chip completely out of action for a while. The team actually registered such events, which cannot be corrected and which resulted in data loss.

Possibly the only practical way of protecting qubits against cosmic rays is using a different type of qubit that is much less sensitive to them, such as the nitrogen-vacancy center qubit now under development at QuTech and elsewhere.

*Arnout Jaspers is a freelance science writer based in Leiden, the Netherlands.*
