Qubit Contenders
BLOG: Heidelberg Laureate Forum
Quantum computers have been ‘about a decade away’ from solving real-world practical problems better than modern supercomputers for, well, a lot longer than a decade now. Their promise famously lies in the qubit, the quantum version of the traditional bit. Where a common bit can represent either a 0 or a 1, a qubit takes advantage of the quantum phenomenon of superposition to be able to represent a 0, a 1 or a state where it is any proportion of both 0 and 1 simultaneously – opening the door to a whole new world of possibilities.
When you have more than one qubit, other quantum mechanical phenomena, particularly interference and entanglement, come into play. A single qubit \(\psi\) can be described by \(\alpha|0\rangle + \beta|1\rangle\), where \(\alpha\) and \(\beta\) are probability amplitudes whose squared magnitudes, \(|\alpha|^2\) and \(|\beta|^2\), give the probability of measuring a 0 or a 1. Constructive and destructive interference can be used to amplify or cancel out probability amplitudes. And entanglement can correlate qubits with each other to form a single system, so that measuring the state of one of the qubits lets you know the state of the other without measuring it.
Some quantum algorithms have already been designed to take advantage of these quantum phenomena, where entanglement and interference are utilized on different sets of qubits to perform a computation, after which a measurement is made that collapses each superposition down to a definite 1 or 0, giving an answer.
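To make those ideas slightly more concrete, here is a minimal sketch in Python/NumPy – not how a real quantum computer operates internally, and with purely illustrative amplitude values – of what amplitudes, measurement and entangled correlations mean numerically:

```python
import numpy as np

rng = np.random.default_rng()

# A single qubit |psi> = alpha|0> + beta|1>, with purely illustrative amplitudes.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # |alpha|^2 + |beta|^2 must equal 1
psi = np.array([alpha, beta])

# Measuring in the computational basis collapses the state to 0 or 1
# with probabilities |alpha|^2 and |beta|^2.
probs = np.abs(psi) ** 2
outcome = rng.choice([0, 1], p=probs)
print(f"single-qubit outcome: {outcome} (P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f})")

# A two-qubit Bell state (|00> + |11>)/sqrt(2): the qubits are entangled, so
# repeated joint measurements only ever yield the matching outcomes 00 or 11.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)
joint_probs = np.abs(bell) ** 2
samples = rng.choice(4, size=10, p=joint_probs)
print("Bell-state samples:", [f"{s:02b}" for s in samples])
```

The Bell-state samples only ever contain ‘00’ or ‘11’: measuring one qubit immediately tells you the other, which is exactly the kind of correlation entanglement provides.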
Finicky Qubits
But there is just one small problem – by its very nature, a qubit is extremely finicky. You need to build it from a physical system with two distinguishable configurations that correspond to the computational basis states, \(|0\rangle\) and \(|1\rangle\), and the system must exhibit quantum properties. Once you have managed to fashion one qubit with these qualities, you then need to build many, many more (ideally identical) qubits to provide the kind of processing capacity that can be useful.
On top of that, they must easily interact with one another to allow complex computations, while at the same time remaining robust. Frustratingly though, almost all proposed qubits so far are extremely fragile. When they interact with their environment or strongly interact with one another, they decohere, collapsing into a single reality and transforming into mundane classical bits.
To get anywhere close to reaching the full potential of quantum computers, ways must be found to engineer a large number of high-quality, well-connected qubits into a system that can perform accurate computations according to a quantum algorithm, and then measure the final state of the qubits to read out the desired result.
A quantum supercomputer consisting of a million such qubits that can perform one quintillion operations should do it, according to Microsoft. With such a machine, scientific and commercial applications are limited only by the imagination. For example, drug discovery would take days, not years, with the quantum computer being able to quickly simulate complex molecules to identify promising candidates. A quantum supercomputer would also be able to design materials from the atomic level up, imbuing them with extraordinary properties for various purposes, from energy to construction and transportation. It would even be capable of simulating the physics of strongly interacting quantum systems towards building a deeper understanding of how our universe fundamentally works.
The Frontrunner
So is there a qubit out there that Goldilocks would judge to be ‘just right’ for a future quantum supercomputer? One candidate tech behemoths such as IBM and Google are putting their considerable weight behind is the superconducting qubit. Superconducting qubits are very much in IBM and Google’s wheelhouse. Essentially, they are tiny, simple circuits that can be fabricated using processes similar to ones technology companies already use. But these circuits are a little different: they are quantized, meaning that they follow the rules of quantum mechanics and only take on discrete states, and they operate at near absolute zero (i.e. close to -273.15 degrees Celsius) so that their constituent metals become superconducting. Conducting current without resistance eliminates resistive losses, and the extreme cold helps stave off quantum decoherence.
Quantum computers built from superconducting qubits have been at the forefront of the quantum computing revolution for some time. For example, released in 2019, IBM Quantum System One was the first circuit-based commercial quantum computer. Also in 2019, Google’s 53-qubit Sycamore quantum processor grabbed headlines for claiming the experimental realization of ‘quantum supremacy’ on a specific task. In their own words: “Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times – our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years.”
IBM and others have since gone on to increase their qubit numbers, with 2022’s IBM Osprey quantum processor featuring 433 superconducting qubits, and more recently their Condor chip housing 1121 qubits. However, further development of quantum computers based on superconducting qubits faces a number of challenges.
Already, innovative cryogenic methods have been required to overcome mechanical issues associated with cooling the Condor and other chips based on superconducting qubits, and further innovations will be needed in next-generation systems. Moreover, superconducting qubits are sensitive to environmental noise and exhibit high error rates, problems that will only become worse as qubit numbers scale.
Bright minds are now focusing on quality rather than quantity of qubits, and are making progress in advancing quantum error correction and error mitigation, improving processor architectures that significantly reduce errors and developing new cooling methods. But others are looking elsewhere, instead researching whether different types of qubits could outshine superconducting qubits in the long term.
Leading the Charge
Trapped-ion qubits are a leading contender. An ion is an atom or molecule that has lost or gained one or more electrons, giving it an electric charge. It can then be suspended in free space, or ‘trapped’, using electromagnetic fields. Several ions can be trapped, laser-cooled and placed close together in a one-dimensional array, called an ion chain. This configuration allows the internal excitations of the ions and the motion of the ion chain to be carefully manipulated with lasers. In particular, laser light can be used to drive an electron in an ion between its ground state and an excited state, an ideal pair of qubit basis states. Quantum operations can then be performed using laser or microwave pulses, which can manipulate the internal states of the ions and the interactions between them, including nudging the ions into a state of entanglement.
As trapped-ion qubits are based on ionized atoms, each qubit is identical. And, once prepared in a particular stable quantum state, they remain in that state for very long periods of time (coherence times measured in seconds to minutes as opposed to microseconds for superconducting qubits). Another advantage is that any qubit in the system can be directly entangled with any other qubit. What is more, they exhibit comparatively low error rates and are highly controllable.
For these reasons, research groups across the world take trapped-ion systems as their platform of choice for experimenting with and developing quantum computing technology and conducting quantum simulations. And they are not alone. Companies like IonQ and Quantinuum have already made commercial quantum computers using trapped ions.
Yet these systems seem to have hit a barrier in terms of the sheer number of qubits they can trap and utilize. IonQ’s largest system, Forte, boasts 36 physical qubits and, just this year, Quantinuum announced the launch of its 56-qubit System Model H2. Scaling to 100 or more qubits may prove to be a bridge too far, because both coherence time and readout accuracy suffer when the ion chain contains many ions. Chains of 100 or more ions experience significant decoherence, and it becomes harder to read out individual ions with high accuracy as chains grow longer, since doing so demands laser-pulse precision beyond current technological capabilities.
Light Alternative
If these hurdles cannot be overcome, photonic qubits are also a serious option. Photonic qubits use individual photons to carry quantum information, and their primary advantage is that they are naturally resilient to the kinds of noise that cause errors and decoherence in other platforms, because photons interact much less with their environment. This means that, where superconducting qubits and trapped-ion qubits require complicated cooling setups, quantum computers that depend on photonic qubits can, in principle, operate at room temperature.
They are also very flexible in terms of how they encode quantum information. For example, quantum information can be encoded in a photon’s polarization or in its path of travel, and the photons themselves can be generated using quantum dots in nanophotonic waveguide circuits. Moreover, advances in the fabrication of integrated optical components mean such quantum computers could fit on a single chip, and also readily integrate into existing fibre-optic telecommunications systems, paving the way for secure quantum communication.
There is just one major problem: Photons not only interact very little with their environment, they also do not interact much with each other. This means they are easily lost and difficult to control, making it challenging to perform complex quantum operations.
Among others, tackling this challenge head-on are the University of Science and Technology of China (USTC) and the Canadian company Xanadu. USTC’s Jiuzhang 3.0 and Xanadu’s Borealis can both solve Gaussian boson sampling problems – a sampling task naturally suited to photonic hardware – in microseconds, as opposed to the many thousands of years a traditional supercomputer would require to accomplish the same task. However, this quantum supremacy is restricted to Gaussian boson sampling. Encoding interesting problems reflective of real-world applications in a photonic quantum computer will require significant further development.
Exotic Options and Improvements
These qubit contenders are just the tip of the iceberg. More than an honourable mention should also go to nitrogen-vacancy centres in diamond (imperfections in the diamond crystal lattice that can host qubits) and neutral atoms, both of which show exciting promise as qubits.
Furthermore, topological qubits continue to be regarded by some as the panacea of quantum computing. Microsoft, for example, has long seen topological qubits as the path to follow. By encoding quantum information in a topological phase of matter rather than in the properties of individual particles or atoms, this approach delivers a layer of abstraction that protects topological qubits from noise. This provides the potential to reduce errors drastically while allowing a given quantum system to scale.
Recently, both Google and Quantinuum made a breakthrough in this direction, announcing the creation of a new breed of topological quasiparticle called the non-Abelian anyon, discovered using their superconducting qubit and trapped-ion quantum computers, respectively. The hope is that this new quasiparticle may be the missing piece needed for error-free computation from scaled quantum computers based on these approaches.
Another ingenious approach to protecting qubits from errors is to spread and encode information over a collection of physical qubits that form a single ‘logical qubit’. This provides a way to perform reliable quantum computations even when noise and errors affect individual physical qubits, as the information is abstracted to the collective logical qubit. Error-corrected logical qubits have been demonstrated with superconducting, trapped-ion, and neutral atom qubits.
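As a rough illustration of why grouping physical qubits into a logical qubit helps, here is a minimal classical sketch of the three-qubit repetition-code idea in Python/NumPy. Real quantum error correction protects against more than bit flips and uses syndrome measurements rather than directly reading the encoded state, but the scaling intuition (a logical error requires several physical errors to occur at once) carries over:

```python
import numpy as np

rng = np.random.default_rng()

def encode(bit):
    """Encode one logical bit into three physical copies (bit-flip repetition code)."""
    return np.array([bit, bit, bit])

def apply_noise(physical, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    flips = rng.random(physical.shape) < p_flip
    return physical ^ flips

def decode(physical):
    """Majority vote recovers the logical bit as long as at most one copy flipped."""
    return int(physical.sum() >= 2)

trials, p_flip = 100_000, 0.05
logical_errors = sum(decode(apply_noise(encode(1), p_flip)) != 1 for _ in range(trials))
print(f"physical error rate: {p_flip}")
print(f"logical error rate : {logical_errors / trials:.4f}")  # roughly 3 * p_flip**2
```

With a 5% physical flip rate, the logical error rate comes out at around 0.7%, showing how redundancy suppresses errors even though every individual component remains noisy.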
Do these breakthroughs mean quantum computers are now on the verge of solving real-world practical problems better than modern supercomputers? Well, the answer remains no. Each approach to quantum computing still has significant hurdles to overcome. But with the prize ultimately being extraordinary insights into the universe and discoveries beyond human capabilities, progress on so many different fronts continues at a terrifying pace – quantum computing has never been closer to finally benefiting the world.
Finally, a concrete hint as to how such a qubit actually works?
Do I understand correctly that an ion works as a qubit in that several electron orbitals can be excited simultaneously?
Thanks for your article,…
but in the end there is just one simple message to take away, and you put it in a nutshell yourself:
“Do these breakthroughs mean quantum computers are now on the verge of solving real-world practical problems better than modern supercomputers? Well, the answer remains no.”
Facing reality means: That’s actually all you need to know about quantum computers.
So why bother with quantum computers?
“But with the prize ultimately being extraordinary insights into the universe and discoveries beyond human capabilities, progress on so many different fronts continues at a terrifying pace – quantum computing has never been closer to finally benefiting the world.”
Really? So far, this has never happened.
Think about it…
The myth of theories which are creative in the real world
Long before the introduction of integral and differential calculus, and long before theoretical models of the load-bearing capacity and bending behaviour of beams and columns, practically oriented Egyptians, Romans and Greeks created complex structures, fragments of which can still be seen today. It was not the theory of the semiconductor that created the semiconductor; the semiconductor, as an electrotechnical object for tinkering and inventing, left room for theoretical considerations. Functioning technology, as an innovation of applied physics, requires and has always required experimental “doers”; the concept of trial and error showed the way.
…since you mentioned it, speaking of the universe…
It should amaze not only interested laypeople but especially well-trained scientists, such as astrophysicists, that they think, or more precisely are sure, that they can make statements about cosmic processes outside our solar system. On closer inspection, this “astrophysicist thinking” is not only improbable but absurd.
All cosmological “observational studies” are not controllable laboratory experiments. The causes and phenomena of all possible object observations outside our solar system cannot (simply) be assumed to be known. Furthermore, the human observation time span is extremely small compared to the time spans in which cosmic movements took and take place. To base assumptions on the data from the human observation period is “far-fetched”, to put it casually. All current supposedly empirical measurements are heavily theory-laden. Postulated time spans, distances and energy densities are subjective and also extremely theory-dependent.
PsiQuantum, a photonic quantum computing company that implements fault-tolerant “fusion-based quantum computation”, will build a US-based utility-scale quantum computer in Chicago, Illinois. Utility-scale means one million physical qubits. A further utility-scale quantum computer is planned for Australia.
Benjamin Skuse wrote (07. Aug 2024):
> […] entanglement can correlate qubits with each other to form a single system, so that measuring the state of one of the qubits lets you know the state of the other without measuring it.
Presumptive “knowledge” which is held without dedicated measurement is readily disregarded when knowledge is obtained by actual measurement.
And having measured (and selected) the state of qubit \(P\) as \(| 1 \rangle\), for instance, does not tell you at all whether the correlated state of some other qubit \(Q\) had been, say, \(| \Leftrightarrow \rangle\), or \(| \Updownarrow \rangle\), or some particular other state which might be expressed as complex-linear combination of those (two orthogonal) states.
Instead, by (repeatedly) having measured both qubits of a given system, \((P \otimes Q) \), and correlating the pairwise findings (trial by trial), it may be subsequently inferred whether the system had been
– entangled as \( \alpha ~ | 0 \rangle ~ | \Leftrightarrow \rangle + \beta ~ | 1 \rangle ~ | \Updownarrow \rangle \), or
– entangled as \( \alpha ~ | 0 \rangle ~ | \Updownarrow \rangle + \beta ~ | 1 \rangle ~ | \Leftrightarrow \rangle \), or (generally)
– entangled as \( \alpha ~ | 0 \rangle ~ \left( r ~ | \Leftrightarrow \rangle + s ~ | \Updownarrow \rangle \right) + \beta ~ | 1 \rangle ~ \left( {\bar s} ~ | \Leftrightarrow \rangle - {\bar r} ~ | \Updownarrow \rangle \right) \),
(where the complex conjugation of probability amplitude values \(r\) and \(s\), resp., is denoted by the bar),
– or not entangled at all, in the trials under consideration.
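For what it may be worth, here is a small NumPy sketch of that point, under the purely illustrative assumption that \(| \Leftrightarrow \rangle\) and \(| \Updownarrow \rangle\) are identified with the computational basis states of qubit \(Q\), and with made-up amplitude values: the first two entangled forms above give identical single-qubit statistics for \(P\), yet clearly different trial-by-trial joint correlations.

```python
import numpy as np

# Purely illustrative identification: |<->> = |0>_Q and |updown> = |1>_Q.
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)   # made-up amplitudes, |alpha|^2 + |beta|^2 = 1

ket = lambda i: np.eye(2)[i]               # computational basis vectors |0>, |1>

# Case (a): alpha |0>|<->> + beta |1>|updown>
state_a = alpha * np.kron(ket(0), ket(0)) + beta * np.kron(ket(1), ket(1))
# Case (b): alpha |0>|updown> + beta |1>|<->>
state_b = alpha * np.kron(ket(0), ket(1)) + beta * np.kron(ket(1), ket(0))

for name, state in [("a", state_a), ("b", state_b)]:
    joint = np.abs(state) ** 2             # joint outcome probabilities for (P, Q)
    marginal_P = joint.reshape(2, 2).sum(axis=1)
    print(name,
          "joint:", {f"{i:02b}": round(p, 2) for i, p in enumerate(joint) if p > 0},
          "marginal of P:", np.round(marginal_P, 2))
# Both cases give the same marginal statistics for P alone (0.3 / 0.7), but the
# trial-by-trial joint outcomes differ: case (a) is correlated (00, 11), while
# case (b) is anti-correlated (01, 10).
```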
p.s.
> […] progress in advancing quantum error correction and error mitigation […]
Notable: https://arxiv.org/abs/2210.11505
p.s. — Surely also relevant: https://arxiv.org/abs/2402.05673