Qubits and Quibbles


On the opening day of the 8th Heidelberg Laureate Forum, Scott Aaronson, winner of the 2020 ACM Prize in Computing, discussed recent advances in quantum computing and the impact that “quantum supremacy” could have on the future of computing. Aaronson described quantum mechanics as the operating system of the universe, through which everything in nature runs as an application program.

As Aaronson explained, the state of any isolated physical system can be represented as a unit vector of complex numbers called “amplitudes,” from which quantum mechanics derives probabilities. The ability to accurately simulate isolated physical systems would be very useful to many scientific fields; however, this is exponentially hard to do on a classical computer because the dimension of a quantum state grows exponentially with the number of particles in the system. This is because a quantum bit, or qubit, can exist in a state of superposition where it is both 0 and 1 simultaneously. This is written in Dirac notation as α|0⟩ + β|1⟩, where α and β are amplitudes: complex numbers whose squared magnitudes give the probabilities of observing 0 or 1. Once you measure the qubit, the superposition “collapses” and you see either a 0 or a 1.
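
To make the notation concrete, here is a minimal Python sketch (my own illustration, not from the talk) of a single qubit stored as a vector of two complex amplitudes; the measurement probabilities are simply the squared magnitudes of α and β.

```python
import numpy as np

# A qubit in the superposition α|0> + β|1>, stored as two complex amplitudes.
# Here α = β = 1/√2, an equal superposition of 0 and 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
qubit = np.array([alpha, beta], dtype=complex)

# A valid state is a unit vector: |α|^2 + |β|^2 = 1.
assert np.isclose(np.linalg.norm(qubit), 1.0)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(qubit) ** 2
print(p0, p1)  # 0.5 0.5

# Measuring "collapses" the superposition to a definite 0 or 1.
print(np.random.choice([0, 1], p=[p0, p1]))
```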

Since a qubit can exist in this superposition, representing the state of a system of n qubits requires a vector of 2^n amplitudes. For example, a 1,000-qubit system, which might correspond to a small molecule, requires 2^1000 amplitudes. 2^1000 is greater than the number of particles in the observable universe, so simulating the behavior of such a system is infeasible for classical computers.
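
As a rough back-of-the-envelope sketch (my own, not from the talk), the snippet below shows how quickly the state vector outgrows classical memory, assuming 16 bytes of storage per complex amplitude:

```python
# Memory needed to store the full state vector of an n-qubit system,
# assuming one 16-byte complex number per amplitude.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits: 2^{n} = {amplitudes:.2e} amplitudes "
          f"(~{amplitudes * 16 / 2**30:.1f} GiB)")
# 10 qubits fit in kilobytes; ~50 qubits already demand petabytes,
# and 1,000 qubits would need more amplitudes than there are particles
# in the observable universe.
```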

Aaronson went on to discuss the recent hunt for “quantum supremacy,” or “quantum advantage.” The term quantum supremacy was coined in 2012 by John Preskill, a physicist and professor at Caltech, and it refers to the point at which a quantum computer can perform some well-defined computational task faster than any current classical computer (regardless of the usefulness of that task). In 2019, a team from Google published a claim of quantum supremacy in Nature.

The quantum processor designed by the Google team is called “Sycamore” and was able to create quantum states on 53 qubits. According to the team, the Sycamore processor “takes about 200 seconds to sample one instance of a quantum circuit a million times—our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years.” This is a truly game-changing speed-up; however, it is still limited to one very specific, and not particularly useful, task. A response from IBM at the time claimed “that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity.” IBM never actually ran the task on a classical system, but, either way, it is clear that much research remains to be done in order to create more useful quantum computers.

While more advanced quantum computers would be useful to fields such as biochemistry and nuclear physics, they are also predicted to threaten encryption systems, like RSA, that we rely on to transmit data over the Internet. Oversimplifying somewhat, RSA is a public-key cryptography algorithm that generates public and private keys from large prime numbers. RSA is currently considered very secure because it takes a classical computer an impractically long time to factor the large semiprimes (products of two primes) on which it is based. In 1994, mathematician Peter Shor devised Shor’s algorithm, which can find the prime factors of any integer N efficiently on a quantum computer. Thus, if (or when) a quantum computer is built that is capable of running Shor’s algorithm at scale, it could be used to break RSA and other public-key cryptography whose security rests on the difficulty of factoring large numbers.
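
To see why factoring is the crux, here is a small, purely classical Python sketch (a toy illustration of my own, not Aaronson’s or Shor’s code): factoring N reduces to finding the period r of a^x mod N, and Shor’s quantum algorithm earns its speed-up solely by finding that period efficiently. The brute-force period search below stands in for the quantum step.

```python
from math import gcd

def find_period_classically(a, N):
    # Brute-force the multiplicative order r of a mod N: the smallest r > 0
    # with a^r ≡ 1 (mod N). This is the step a quantum computer performs
    # exponentially faster using the quantum Fourier transform.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_style_factor(N, a):
    # Classical post-processing of Shor's reduction: given the period r of
    # a^x mod N, derive a nontrivial factor of N. Works when r is even and
    # a^(r/2) is not congruent to -1 mod N; otherwise try another a.
    if gcd(a, N) != 1:
        return gcd(a, N)              # lucky: a already shares a factor with N
    r = find_period_classically(a, N)
    if r % 2 == 1:
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None
    return gcd(y - 1, N)

# Toy semiprime N = 15 = 3 * 5 with the textbook choice a = 7.
print(shor_style_factor(15, 7))       # -> 3, a prime factor of 15
```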

Not only would this impact, say, your online banking transactions, but it could also jeopardize the security of military or intelligence information. Encrypted data could be collected by adversaries today and, once a sufficiently advanced quantum computer exists, that data could be decrypted and the intelligence exploited. The fear of such a future has prompted organizations like the U.S. National Institute of Standards and Technology (NIST) to create programs on post-quantum cryptography. In 2016, NIST announced a program to “develop and standardize one or more additional public-key cryptographic algorithms to augment FIPS 186-4, Digital Signature Standard (DSS).” This program is currently in its third round of candidate algorithm submissions; more information can be found on NIST’s Post-Quantum Cryptography project page.

A few years ago, the Computing Community Consortium, a visioning body for computer science research (and my employer), ran a visioning workshop on post-quantum cryptography migration and released a workshop report that highlights outstanding research questions in this area. The most important takeaway from the report is: “While NIST’s much-needed standardization initiative addresses the selection of new cryptographic algorithms, there is an urgent need to consider the complex problem of migration from today’s widely deployed algorithms to PQC alternatives” (p. 18). This is not merely a technical problem but also a social and policy issue, since every developer working in this space must be able to migrate to post-quantum alternatives if it becomes necessary.

While I can’t say what the future will hold, starting next week you can watch the entirety of Scott Aaronson’s lecture on the HLF YouTube channel.


Posted by

Khari Douglas is the Senior Program Associate for Engagement for the Computing Community Consortium (CCC), a standing committee within the Computing Research Association (CRA). In this role, Khari interacts with members of the computing research community and policy makers to organize visioning workshops and coordinate outreach activities. He is also the host and producer of the Catalyzing Computing podcast.

1 comment

  1. Quantum Computing could transform Chemistry from an Art to a Science
    Citation from above: For example, a 1000 qubit system, which would represent a small molecule, requires 2^1000 amplitudes.

    Today’s chemists rely on trial and error and on rules generalized from observation and knowledge of quantum systems. But these rules are only rules of thumb: they do not always work, because each molecule is a world of its own, a world where unique quantum states are at home.
    Quantum computing could transform chemistry into an engineering discipline where one could design a molecule based on desired properties. Indeed, this is exactly what Jeremy O‘Brien, the CEO of PsiQuantum, says in the YouTube video /FUTURE COMPUTE/ MIT Technology Review: The Race to a Million Qubit Quantum Computer. Jeremy O‘Brien is convinced that:
    – A practical quantum computer needs error correction, because errors are inevitable in a hybrid analog-digital system like a quantum computer (the digital part is the read-out, the analog part is what the quantum gates do with the continuum a quantum system represents)
    – each logical qubit in a quantum computer is formed by 1,000 to 10,000 physical qubits in order to suppress errors
    – a million-qubit quantum computer requires manufacturing processes as used in the semiconductor industry
    – a million-qubit quantum computer has the size of a board room and consists of racks connected by fiber-optic cables
    – PsiQuantum’s fusion-based quantum computing can realize a million-qubit quantum computer in this decade
