The development of quantum computers has been likened to a “race” time and time again. The analogy is perpetuated by a widely held belief that a large-scale quantum computer will be able to easily break today’s cryptographic protections, putting both national and individual digital security at risk. While this ominous assumption helps to keep interest in quantum computing simmering steadily, it is only partially true. A large-scale quantum computer will, theoretically, be able to break most modern public-key encryption schemes, but alternative, quantum-secure cryptography systems already exist, and different formulations of these are being actively investigated for large-scale viability in preparation for this eventuality. Understandably, even before the quantum era has arrived, security researchers have their sights set firmly on quantum-proof defences.
Most systems today rely on what is known as asymmetric, or public-key, cryptography to send secure messages, encrypt files on a computer, or transfer cryptocurrencies from digital wallets. In the simplest terms, encryption is the process of scrambling a message so that only someone with access to the right key can unlock and decipher it. However, as a recent slew of cyberattacks targeting Australia demonstrates, encrypting data does not make it impossible to hack. It simply makes doing so very computationally intensive.
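To make that concrete, here is a minimal sketch of a public-key round trip in Python, using the open-source cryptography package: a key pair is generated, the public key scrambles a message, and only the matching private key can unscramble it. The library, key size and message are illustrative choices rather than a prescription.

```python
# Illustrative only: a minimal public-key round trip using the Python
# "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a 2048-bit RSA key pair (the private key stays secret,
# the public key can be shared freely).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"meet me at noon"

# Anyone holding the public key can scramble (encrypt) the message...
ciphertext = public_key.encrypt(
    message,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)

# ...but only the private-key holder can unscramble (decrypt) it.
plaintext = private_key.decrypt(
    ciphertext,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)
assert plaintext == message
```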
A number of encryption algorithms, including the most common, RSA, generate their keys by multiplying together very large prime numbers; their security rests on how hard it is to factor the resulting product back into those primes. While it is a fairly straightforward and economical task for a computer to generate a key pair in this way, reverse engineering that calculation is prohibitively difficult and requires massive amounts of computational power. This asymmetry is known as “computational hardness”: it allows classical computers to create encryption keys cheaply while preventing even the most advanced machines from breaking that same encryption.
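The asymmetry is easy to feel even at toy scale. In the pure-Python sketch below (with deliberately tiny, made-up primes), producing the key material takes a single multiplication, while recovering the primes from the product by trial division already takes around a million division attempts; at real key sizes the gap becomes astronomically large.

```python
# Illustrative only: multiplying primes is cheap, factoring their product is not.
# The primes here are toy values; real RSA primes are hundreds of digits long.
p, q = 1_000_003, 1_000_033

# "Key generation" direction: one multiplication.
n = p * q  # 1000036000099

# "Attack" direction: recover p and q from n by trial division.
def factor(n: int) -> tuple[int, int]:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(factor(n))  # (1000003, 1000033), after roughly a million divisions
```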
However, it is exactly this type of intractable mathematical problem, which gives classical computers such a hard time, that quantum computers will be particularly adept at solving. Quantum computers will not be better at everything a classical computer can do, but they do have a considerable advantage in this realm, thanks to the unique quantum quality of superposition. Superposition allows a qubit to exist in a combination of one and zero at the same time, so a register of qubits can encode a vast number of candidate answers simultaneously; quantum algorithms then use interference to single out the right one. For certain problems, such as factoring, this reduces the time complexity (i.e. how long an algorithm takes to produce a result, relative to the size of its input) exponentially.
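One way to get a feel for the scale involved is simply to count states: describing an n-qubit register classically requires 2^n complex amplitudes, so the bookkeeping doubles with every qubit added. The short calculation below is purely illustrative and says nothing about any particular machine.

```python
# Illustrative only: a classical description of an n-qubit state needs 2**n
# complex amplitudes, so its size doubles with every qubit added.
# At 300 qubits the count already exceeds the ~10**80 atoms in the
# observable universe.
for n in (10, 50, 100, 300):
    print(f"{n:>3} qubits -> 2**{n} = {float(2 ** n):.3e} amplitudes")
```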
In 1994, the Bell Laboratories mathematician Peter Shor developed the eponymous Shor’s algorithm, which leverages the principle of superposition to show how a quantum computer could efficiently find the prime factors of a composite integer. This is precisely the type of calculation required to break most public-key encryption standards, including the widely used RSA. Progress on hardware has been modest since then: the largest integer factored on a quantum computer, 56,153, was achieved in 2014 with just four qubits, and even that demonstration relied on a minimisation-based shortcut rather than a full run of Shor’s algorithm. To put this into perspective, the recommended RSA key length is 2,048 bits, or roughly 617 decimal digits: an integer vastly larger than the current quantum record, and one that is correspondingly far harder to factor.
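For the curious, the quantum heart of Shor’s algorithm is finding the period r of the function a^x mod N; once r is known, the factors of N drop out of a couple of greatest-common-divisor computations. The sketch below brute-forces the period classically (which is exactly the step a quantum computer would accelerate) purely to illustrate how the rest of the algorithm recovers the factors; N = 15 and a = 7 are conventional toy values.

```python
# Illustrative only: the classical post-processing of Shor's algorithm.
# A quantum computer would find the period r efficiently; here we brute-force it.
import math

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (the step Shor's algorithm speeds up)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int) -> tuple[int, int]:
    r = find_period(a, n)
    if r % 2 != 0:
        raise ValueError("odd period, pick a different a")
    x = pow(a, r // 2, n)
    if x == n - 1:
        raise ValueError("trivial root, pick a different a")
    return math.gcd(x - 1, n), math.gcd(x + 1, n)

print(shor_classical_part(15, 7))  # (3, 5): 7 has period 4 mod 15
```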
While it is nearly certain that the continued advancement of quantum computing will eventually threaten even the high level of security afforded by RSA-2048, estimates of when that reality will actually arrive vary wildly, from five years to multiple decades. Given the unpredictability of this time frame, preparing now for the day when quantum computers can break even the most complex encryption standards is an important process to begin. Recognizing the significance of this situation, the U.S. National Institute of Standards and Technology (NIST) is leading the transition to post-quantum, or quantum-resistant, cryptography across national systems in the U.S.
According to NIST, it took nearly two decades to fully deploy today’s modern public-key cryptographic systems. Critical national infrastructure, such as the U.S. Department of Defense’s IT systems, will need to be quantum-proofed by the time quantum computers reach maturity: a deadline that is difficult to pin down but that we are nonetheless shuffling toward. The unpredictability of this timeline has resulted in a significant institutional push to standardise quantum-proof encryption. In a 2016 report, NIST investigated the potential applicability of various quantum-resistant families of existing public-key cryptographic approaches, including lattice-based, code-based, hash-based, isogeny-based, and multivariate systems. Despite these efforts, draft standards may not begin to emerge until 2022.
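To give a flavour of what one of those families looks like, the sketch below implements a Lamport one-time signature, the simplest hash-based construction: its security rests on a hash function rather than on factoring, which is why hash-based schemes are expected to resist quantum attack. It is a teaching toy (one 256-bit message digest, a key pair that can sign exactly once), not one of the candidates NIST is actually standardising.

```python
# Illustrative only: a Lamport one-time signature, the simplest hash-based scheme.
# Security rests on the hash function, not on factoring, so Shor's algorithm
# does not apply. Each key pair must be used to sign exactly one message.
import hashlib, os

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def sign(message: bytes, sk) -> list[bytes]:
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    # Reveal one secret from each pair, chosen by the corresponding digest bit.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(message: bytes, signature, pk) -> bool:
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(hashlib.sha256(signature[i]).digest() == pk[i][bit]
               for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))   # True
print(verify(b"tampered message", sig, pk))     # False
```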
The crypto-agility of any security infrastructure is paramount to its ability to keep individual, organisational, national or international sensitive data secure as both classical and quantum capabilities advance. Even when they are technically straightforward to execute, large-scale systems changes are inherently difficult to incentivize. In this case, however, the cost of failing to act preventatively could be immense. Not only could a sudden quantum breakthrough result in a surprise breach of our digital security, but such a malicious event could critically endanger the continued development of many other quantum applications across unrelated sectors, like healthcare, finance and energy. Immediate action will help to ensure that security and strategic-advantage issues do not become major barriers to fully realizing the potentially transformative social and environmental value of quantum technologies.