A Close Call: Lattice Cryptography Safe for Now

Business
May 2, 2024

By David Joseph, Ph.D.

Cryptography evolves

Organizations today depend on cryptography as the bedrock of their secure communications, providing integrity, authenticity, and confidentiality. Cryptography is often hard-wired into each individual software application -- a big problem if you want to be nimble and update encryption to more secure standards across the many thousands of apps that make up modern IT infrastructure. This nimbleness is called cryptoagility.
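To make the idea concrete, here is a minimal sketch of what cryptoagility can look like in code, assuming a hypothetical CRYPTO_POLICY configuration and using only Python's standard hmac module: the hash algorithm is named once, centrally, rather than being hard-wired into every application that needs a message authentication code.

```python
import hmac

# Hypothetical, centrally managed policy: the hash is named in configuration,
# not hard-wired into each application that needs a MAC.
CRYPTO_POLICY = {"mac_hash": "sha256"}  # swap to "sha3_256" in one place


def sign_message(key: bytes, message: bytes) -> bytes:
    """Compute a MAC using whichever hash the current policy names."""
    return hmac.new(key, message, CRYPTO_POLICY["mac_hash"]).digest()


def verify_message(key: bytes, message: bytes, tag: bytes) -> bool:
    """Verify against the same centrally chosen hash, in constant time."""
    return hmac.compare_digest(sign_message(key, message), tag)


if __name__ == "__main__":
    tag = sign_message(b"demo-key", b"hello")
    print(verify_message(b"demo-key", b"hello", tag))  # True
```

Real cryptoagility frameworks go much further -- inventorying where cryptography is used, rotating keys, and negotiating protocols -- but the design principle is the same: no application should need a code change to adopt a stronger algorithm.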

Cryptography is evolving rapidly to counter hackers who are developing ever more sophisticated attacks. As security protocols are broken, companies need to replace and upgrade them quickly. For example, the SHA-1 hash function – once a standard choice for signing web certificates and protecting password files – was shown to be vulnerable in 2005, yet by the time it was practically broken in 2017 (12 years later), one in five websites still relied on it due to the difficulty of updating web certificates.
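The algorithm swap itself is rarely the hard part. As a rough illustration (Python's standard hashlib, illustrative data only), moving a digest from SHA-1 to SHA-256 is a one-line change in code; the twelve-year lag came from every certificate, protocol, and integration that had SHA-1 baked in.

```python
import hashlib

data = b"example document to be signed"

# Deprecated: SHA-1 collisions were demonstrated in practice in 2017.
print("sha1  :", hashlib.sha1(data).hexdigest())

# Current practice: SHA-256 or stronger for signatures and integrity checks.
print("sha256:", hashlib.sha256(data).hexdigest())
```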

The recently claimed quantum attack against lattices

A paper released this month cast doubt on a new family of cryptography that many are counting on for the next generation of secure communications, which must be resistant to the threat posed by quantum computers.

The paper (a pre-print, i.e. not yet peer reviewed) reinvigorated discussions of cryptoagility by claiming that an efficient quantum algorithm solves a presumed-hard computational problem closely related to three upcoming NIST Post-Quantum Standards covering both key exchange and signatures. For over a week, the foremost experts in quantum computing and lattice cryptography opined on the implications and pored over the proofs, even teaming up on Discord to exchange ideas quickly. Finally, a mistake was identified in the proposed algorithm.
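For context, the presumed-hard problem underlying lattice schemes of this kind is a variant of the Learning With Errors (LWE) problem: recovering a secret vector from noisy linear equations modulo q. The toy Regev-style sketch below (deliberately tiny parameters, NumPy assumed, not any NIST algorithm) shows the basic structure that an efficient quantum algorithm would have had to break at real-world sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Regev-style LWE encryption of a single bit. Real schemes use structured
# lattices and far larger, carefully chosen parameters; these are illustrative.
n, m, q = 16, 64, 3329  # secret dimension, number of samples, modulus


def keygen():
    s = rng.integers(0, q, n)           # secret vector
    A = rng.integers(0, q, (m, n))      # public random matrix
    e = rng.integers(-2, 3, m)          # small error terms
    b = (A @ s + e) % q                 # noisy linear equations
    return (A, b), s                    # recovering s from (A, b) is the LWE problem


def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, m)           # random 0/1 combination of the samples
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q    # encode the bit in the "high half" of Z_q
    return u, v


def decrypt(s, ct):
    u, v = ct
    d = (v - u @ s) % q                 # leftover noise, plus bit * q/2
    return int(min(d, q - d) > q // 4)  # closer to q/2 means the bit was 1


pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
print("toy LWE round-trip OK")
```

The entire security story rests on the assumption that no efficient algorithm, classical or quantum, can recover the secret once the dimensions and modulus are scaled up to real parameters -- exactly the assumption the pre-print briefly appeared to undermine.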

The bug identified in the paper does not appear to be easily fixed, so lattice cryptography remains safe. The algorithms that could have been affected have not yet been standardized, and as such are not yet widely implemented. However, the prospect of relying on an alternative to lattice-based cryptography would have meant a significant shift in trajectory for standards bodies, academics, and security regulators.

Plan for cryptographic obsolescence

While the claimed result did not hold up, it is a good reminder that almost all cryptography is based on computational problems that are only presumed to be hard. As these standards filter into production, it is critical to adopt modern cryptography management practices that ensure cryptographic agility and resilience.

Reflecting on the mortality of cryptography before it has even been deployed should urge us to put modern cryptographic best practices at the forefront of system design. The best way to minimize the future costs of updating cryptography is to plan for obsolescence even before algorithms are rolled out. In the history of cryptography, few algorithms – and fewer implementations – have stood the test of time.

Adopting practices to monitor the deployment and usage of cryptography, and moving to a cryptoagile framework, will raise the level of security of all companies and governments.

David Joseph is a product manager in the Quantum Security Group, where he was the first cybersecurity employee. Formerly a researcher, he has a background in theoretical cryptography and in quantum computing as applied to attacking information security, which he studied during his Ph.D. at Imperial College London. He is an author of "Syndrome Decoding in the Head", a digital signature scheme submitted by SandboxAQ and CryptoExperts to the NIST call for signatures, and a lead author of the 2022 Nature paper "Transitioning Organizations to Post-Quantum Cryptography."
