Quantum computers have typically taken hours to boot up because researchers must make dozens of adjustments and calibrations to set up a chip with just five quantum bits. Quantum processors react to small changes in their local environment, and an error rate of less than 0.1 percent is all that is permissible when measuring ambient conditions.
However, Saarland University researchers have developed Adaptive Hybrid Optimal Control (Ad-HOC), a new method built around an algorithm that reduces the calibration error rate to below the required 0.1 percent threshold while also accelerating the calibration process from six hours to five minutes.
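The article does not describe Ad-HOC's internals, but the general idea of automated closed-loop calibration can be sketched as follows: repeatedly measure a device's error rate for a candidate parameter setting and let a derivative-free search nudge the parameters until the error falls below a target threshold. The "device" below is a toy model; `gate_error`, `AMP_OPT`, and `PHASE_OPT` are invented for illustration and are not part of the actual Ad-HOC method.

```python
# Illustrative sketch only: a closed-loop calibration loop in the spirit of
# automated tuning. A stochastic hill-climber accepts a random nudge to the
# control parameters only if it lowers the measured error, shrinking its
# step size as improvements dry up.

import random

AMP_OPT, PHASE_OPT = 0.8, 0.3   # hidden "true" optimum of the toy device

def gate_error(amp: float, phase: float) -> float:
    """Toy stand-in for a measured gate-error rate (0 = perfect)."""
    return (amp - AMP_OPT) ** 2 + (phase - PHASE_OPT) ** 2

def calibrate(target: float = 1e-3, seed: int = 0) -> tuple[float, float, float]:
    """Tune (amp, phase) until the measured error drops below `target`."""
    rng = random.Random(seed)
    amp, phase = 0.0, 0.0        # uncalibrated starting point
    err = gate_error(amp, phase)
    step = 0.5
    while err > target and step > 1e-9:
        cand_amp = amp + rng.uniform(-step, step)
        cand_phase = phase + rng.uniform(-step, step)
        cand_err = gate_error(cand_amp, cand_phase)
        if cand_err < err:       # keep the nudge only if it helped
            amp, phase, err = cand_amp, cand_phase, cand_err
        else:
            step *= 0.99         # tighten the search after a failed nudge
    return amp, phase, err

amp, phase, err = calibrate()
print(f"amp={amp:.3f} phase={phase:.3f} error={err:.2e}")
```

The appeal of such closed-loop schemes is that they need no detailed model of the hardware: the optimizer only ever sees the measured error, which is why the approach can be fully automated and scaled across many qubits.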
"As many of the parameters, such as temperature, light, and air pressure do not remain stable during the long calibration phase, this can further shorten the time window in which the chip is running error-free and in which it can therefore be used for experiments," says Saarland professor Frank Wilhelm-Mauch. He notes Ad-HOC has been subjected to intense testing by University of California, Santa Barbara researchers.
The new method also is fully automated, according to Wilhelm-Mauch, and can be applied to quantum processors of almost any size.
From Saarland University
Abstracts Copyright © 2014 Information Inc., Bethesda, Maryland, USA