
Communications of the ACM

ACM TechNews

Quantum Computer Slips Onto Chips


University of Bristol Professor Jeremy O'Brien

"A full-scale useful factoring machine is still at least two decades away, but this [chip] is one important step in that direction," says University of Bristol Professor Jeremy O'Brien.

Credit: University of Bristol

A silicon chip about the size of a penny that uses photons to run Shor's algorithm has been developed by a team of researchers at the University of Bristol in the United Kingdom. Until now, laboratory-sized optical setups have been required to run the algorithm, which finds the factors of a given number — the numbers that multiply together to produce it. Factoring large numbers is impractical for traditional, or classical, computers, but quantum computers can execute this task efficiently, at least in principle.
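The task the chip tackles can be illustrated classically. Shor's algorithm reduces factoring to finding the multiplicative order of a random base; the quantum speedup lies entirely in that order-finding step, which is simulated here by slow brute force. This is a minimal sketch of the reduction, not the chip's implementation:

```python
from math import gcd
from random import randrange

def find_order(a, n):
    # Smallest r > 0 with a^r ≡ 1 (mod n).  This brute-force loop is the
    # exponentially slow step that the quantum computer replaces.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    # Classical sketch of Shor's reduction from factoring to order-finding.
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d            # lucky guess already shares a factor with n
        r = find_order(a, n)
        if r % 2:
            continue            # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue            # trivial square root of 1; try another base
        return gcd(y - 1, n)    # a nontrivial factor of n

print(shor_factor(15))  # a nontrivial factor of 15, i.e. 3 or 5
```

The Bristol demonstration factored a small number of exactly this scale; the point of the chip is that the order-finding runs on photons rather than in a loop like the one above.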

Optical computing, in which photons rather than electrons carry information, has been promoted as a potential future for data processing. Photons also exhibit quantum indeterminacy, meaning they can represent multiple states concurrently. The Bristol team uses waveguides — channels etched into the chip that guide the photons along it — to perform quantum factoring on a much smaller scale than has been required until now.

"To get a useful computer it needs to be probably a million times more complex, so a full-scale useful factoring machine is still at least two decades away," says Bristol professor and team leader Jeremy O'Brien. "But this [chip] is one important step in that direction."

From BBC News

 

Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA


 

