History has demonstrated that where there are people, there are secrets. From elaborately coded messages on paper to today's sophisticated cryptographic algorithms, the desire to maintain privacy has persisted. Of course, as technology has advanced, so has the ability both to encipher messages and to crack the codes.
"Today's encryption methods are excellent, but we are reaching an inflection point," says Chris Peikert, an associate professor in the Department of Science and Engineering at the University of Michigan Ann Arbor. "The introduction of quantum computing changes the equation completely. In principle, these devices could break any reasonably-sized public key."
Such an event would wreak havoc. "It would affect nearly everything we do with computers," says Dustin Moody, a mathematician whose focus at the U.S. National Institute of Standards and Technology (NIST) includes computer security. In this scenario, he says, computing systems, virtual private networks (VPNs), and digital signatures would no longer be secure. As a result, personal data, corporate records, intellectual property, and online transactions would all be at risk.
Consequently, cryptographers are developing new encryption standards designed to withstand attack by quantum computers. At the center of this effort is an initiative at NIST to identify algorithms, including lattice-based and code-based schemes, that could not only protect classical computing systems but also introduce new and more advanced capabilities.
Code Read
Modern encryption relies heavily on a framework known as public-key cryptography, also referred to as asymmetric cryptography. It includes widely used methods like RSA (Rivest-Shamir-Adleman) and Diffie-Hellman. It is not the entire picture; symmetric-key cryptography, which relies on a single shared secret key, is also in wide use, but it is far less susceptible to being cracked by quantum computing. Moreover, "[public-key] handles the bulk of the encryption in the world today," Moody points out.
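To make the distinction concrete, the sketch below pairs the two frameworks using the open-source Python cryptography package. It is a simplified illustration of standard RSA and symmetric (Fernet) encryption, not of any NIST candidate.

```python
# A minimal sketch contrasting the two frameworks, using the widely
# available Python "cryptography" package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.fernet import Fernet

# Asymmetric (public-key): anyone may encrypt with the public key, but only
# the private-key holder can decrypt. A large quantum computer running
# Shor's algorithm could recover the private key from the public one.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"meet at noon", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"

# Symmetric: one shared secret key both encrypts and decrypts. Known quantum
# attacks (Grover's search) only halve the effective key length, so doubling
# the key size restores the security margin.
secret_key = Fernet.generate_key()
f = Fernet(secret_key)
assert f.decrypt(f.encrypt(b"meet at noon")) == b"meet at noon"
```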
Encryption standards have advanced over the decades, and there are no known cases of anyone cracking the underlying math of any modern encryption algorithm, says Vadim Lyubashevsky, principal research scientist at IBM Research in Zurich, Switzerland. However, in 2016 the U.S. National Security Agency (NSA) sounded an alarm: it released a memo advising those who had not yet switched to its latest public-key cryptography standard to refrain from doing so, because another transition was coming. "They indicated that we will soon need quantum-safe algorithms," he adds.
A simple but disturbing truth is at the center of this concern. "A quantum computer could conceivably crack today's public-key encryption algorithms," Lyubashevsky explains. Although current quantum machines do not yet have the power or programming to achieve this feat, they are advancing rapidly. It is estimated that once these machines reach the threshold of approximately 10 million physical qubits (a few thousand logical qubits), they will be capable of breaking current public-key encryption methods.
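The specific threat is Shor's algorithm, which factors the modulus behind an RSA public key by finding the period of modular exponentiation. The toy sketch below simulates that number-theoretic core classically on a tiny modulus; the period-finding loop is the step a quantum computer would perform exponentially faster at real key sizes.

```python
from math import gcd

def shor_classical_core(N, a):
    """Try to factor N using the period of f(x) = a^x mod N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # the chosen base already shares a factor with N
    # Find the period r with a^r = 1 (mod N). This brute-force loop is
    # exponentially slow classically; Shor's algorithm replaces it with
    # a fast quantum Fourier transform.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2 == 1:
        return None  # odd period: retry with another base
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with another base
    return gcd(y - 1, N), gcd(y + 1, N)

# Toy "RSA modulus" N = 15 with base a = 7 (period r = 4).
print(shor_classical_core(15, 7))  # -> (3, 5)
```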
"This represents a big risk, considering the time it takes to deploy entirely new cryptography and the possibility of attackers saving data now and attacking it later with a quantum computer," Peikert says.
Secure Horizons
Developing new and more advanced cryptographic algorithms has been on NIST's radar for the last decade. In 2017, the agency launched a competition that attracted 69 submissions from some of the leading cryptography experts in the world. The contest is now in its third round, with 15 algorithms remaining (seven finalists and eight alternates). Participants study and attempt to crack each other's submissions.
The project is pushing the envelope on mathematics and coding. Since quantum computers powerful enough to attack the algorithms do not yet exist, researchers use classical computers and make careful assumptions about the methods a quantum machine might use to crack the cryptography. "We can't know exactly how quantum computers will perform, but we can look at the characteristics of the algorithm, including the gates and resources it would need, and compare that to a classical computer," Moody says.
Some participants are using lattice-based schemes, while others have turned to code-based algorithms, which are typically less compact. "Different approaches have trade-offs based on speed and security," observes Peikert, who, like Lyubashevsky, has an algorithm under consideration.
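Many lattice-based candidates rest on the learning with errors (LWE) problem: recovering a secret vector from linear equations deliberately perturbed by small noise. Below is a stripped-down, insecure toy of Regev-style single-bit encryption, with parameters chosen only for readability; the actual candidates use far larger, carefully tuned parameters and structured variants of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 1021, 8, 64          # toy parameters; real schemes use far larger ones

# Key generation: secret vector s; the public key is (A, b = A*s + e mod q),
# where the small error e makes recovering s from (A, b) hard.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-2, 3, m)
b = (A @ s + e) % q

def encrypt(bit):
    # Sum a random subset of the public samples; hide the bit in the high part.
    subset = rng.integers(0, 2, m)
    u = (subset @ A) % q
    v = (subset @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # v - u.s equals bit * q/2 plus a small accumulated error.
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)

u, v = encrypt(1)
print(decrypt(u, v))  # -> 1
```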
Ultimately, Peikert says, this could lead to different use cases and even multiple standards. NIST hopes to make its selections within the next year or two, post them for public comment and evaluation, and then make a final decision about adoption, Moody notes.
Beyond making cryptography more resilient, these encryption algorithms could introduce new features and capabilities. For instance, homomorphic encryption would allow data analysts to use encrypted data sets to conduct research. This could have profound ramifications for healthcare and many other fields, where privacy is critical. "The ability to work on encrypted data—essentially blindfolded—could be a game-changer," Peikert says.
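The property is easiest to glimpse in textbook (unpadded, and therefore insecure) RSA, which is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. Fully homomorphic schemes, built on lattices, extend this idea to arbitrary computation. A minimal sketch:

```python
# Textbook RSA with tiny primes, for illustration only; real RSA uses padding,
# which deliberately destroys this homomorphic property.
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

c1, c2 = enc(6), enc(7)
# Multiply the ciphertexts without ever decrypting them...
c_product = (c1 * c2) % n
# ...and the decrypted result is the product of the plaintexts.
print(dec(c_product))  # -> 42
```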
Although the transition to any new standard will take years and bring its share of headaches, cryptography experts argue it is an essential step forward. Concludes Peikert: "We're entering the golden age of cryptography. There are some amazing capabilities on the horizon."
Samuel Greengard is an author and journalist based in West Linn, OR, USA.