
Meeting the challenge of quantum error correction


Physics 17, 176

Google Quantum AI researchers have demonstrated subthreshold error correction, a necessary prerequisite for building noise-resistant quantum computers large enough to perform useful calculations.

Photo of the Willow chip. (Credit: Google Quantum AI)

Errors are the bête noire of quantum computing. They can arise from material defects, thermal fluctuations, cosmic radiation, and other sources, and they only become more disruptive as a quantum processor grows larger. But a demonstration of an unprecedented ability to correct quantum errors could spell the end of that trend. A team of Google Quantum AI researchers in California has demonstrated “subthreshold” error correction with its latest quantum processor, called Willow – a method that actually works better as the number of quantum bits, or qubits, increases (1). The team also showed that the new chip can solve in 5 minutes a benchmark test that would require 10 septillion (10²⁵) years on today’s most powerful supercomputers.

“I find it amazing that such an excellent level of control is now actually possible and that quantum error correction actually appears to behave as we predicted,” says quantum information researcher Lorenza Viola of Dartmouth College in New Hampshire. Proving that error correction extends the length of time a qubit can store information “is a remarkable milestone,” says Caltech theoretical physicist John Preskill. (Neither Viola nor Preskill was involved in the work.)

The basic idea of error correction is that many “physical” qubits work together to encode a single “logical” qubit. Similar to error correction in classical devices, its quantum counterpart uses redundancy: a logical qubit of information is not stored in a single physical qubit but is distributed across an entangled state of many physical qubits. The challenge is to identify errors in the fragile quantum states without introducing additional errors. Researchers have developed sophisticated methods, called surface codes, that can correct errors in a two-dimensional planar array of qubits (see Viewpoint: Error-correcting surface codes are experimentally verified).
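The redundancy idea can be illustrated with a classical analogue – the three-bit repetition code – in a short sketch (a simplification of my own; real surface codes diagnose errors without directly measuring the data qubits):

```python
import random

def encode(bit):
    # Store one logical bit redundantly in three physical bits.
    return [bit] * 3

def noisy_channel(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote: the logical bit survives as long as at most one bit flipped.
    return int(sum(bits) >= 2)

random.seed(0)
p = 0.05            # physical bit-flip probability
trials = 100_000
failures = sum(decode(noisy_channel(encode(0), p)) for _ in range(trials))
print(failures / trials)   # ~3p^2 = 0.0075, well below the physical rate p = 0.05
```

The logical error rate (~3p², since two simultaneous flips are needed to fool the majority vote) beats the physical rate p only when p is small enough – a classical hint of the threshold idea discussed below.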

The surface code approach requires significant hardware overhead – additional physical qubits and quantum gates that perform error correction operations – which in turn introduces more opportunities for error. Since the 1990s, researchers have predicted that error correction can only lead to a net improvement if the error rate of physical qubits is below a certain threshold. “It makes no sense to do quantum error correction if it doesn’t go below the threshold,” says Julian Kelly, director of quantum hardware at Google.
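The threshold’s role can be sketched with the standard scaling ansatz for surface codes; the numbers below (threshold p_th = 1%, prefactor A) are illustrative choices, not Willow’s fitted values:

```python
def logical_error(p, d, p_th=0.01, A=0.1):
    # Surface-code scaling ansatz: the logical error rate goes roughly as
    # (p / p_th)^((d + 1) / 2), so it shrinks with code distance d when
    # the physical rate p is below threshold and grows with d above it.
    return A * (p / p_th) ** ((d + 1) // 2)

for p in (0.005, 0.02):  # one physical rate below threshold, one above
    print(p, [logical_error(p, d) for d in (3, 5, 7)])
```

Below threshold the rates fall with each larger grid; above threshold the extra hardware only multiplies the opportunities for error, which is why subthreshold physical qubits are a prerequisite.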

Previous error-correction efforts typically saw the error rate worsen as more qubits were added, but the Google Quantum AI researchers have reversed that trend. “We are finally below the threshold,” says Michael Newman, a researcher at Google Quantum AI. The milestone was achieved in an experiment that used the Willow chip, a 2D array of 105 superconducting qubits, to store a single qubit of information in a square grid of physical “data” qubits. To verify that the error rate scales as desired, the team varied the size of this grid from 3×3 to 5×5 to 7×7, corresponding to 9, 25, and 49 data qubits, respectively (along with additional qubits that perform error-correcting operations).

Google Quantum AI and collaborators (1)
Schematic of a logical qubit for a 7×7 surface code, comprising a 7×7 grid of physical (data) qubits along with additional qubits that perform error-correcting operations.

They found that with each increase in grid size, the error rate decreased by a factor of 2 (an exponential suppression), reaching a rate of 1.4 × 10⁻³ errors per cycle of error correction for the 7×7 grid. For comparison, a single physical qubit experiences approximately 3 × 10⁻³ errors over a comparable period, meaning the 49-qubit ensemble makes fewer errors than a single physical qubit. “This shows that error correction is truly greater than the sum of its parts,” says Newman. Furthermore, the observed exponential suppression implies that further increasing the grid size should lead to ever-lower error rates.
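The factor-of-2 suppression per grid-size step can be turned into a naive extrapolation (my own illustration, not the paper’s fit), starting from the measured 1.4 × 10⁻³ rate at the 7×7 grid:

```python
RATE_7 = 1.4e-3   # measured logical errors per cycle, 7x7 grid
LAMBDA = 2.0      # observed suppression factor per grid-size increase

def extrapolated_rate(d, d0=7, r0=RATE_7, lam=LAMBDA):
    # Each step in code distance (d -> d + 2) divides the rate by lam.
    return r0 / lam ** ((d - d0) // 2)

for d in (7, 9, 11, 13):
    print(f"{d}x{d} grid: {extrapolated_rate(d):.2e} errors per cycle")
```

Under this simple model a 9×9 grid would reach 7 × 10⁻⁴ errors per cycle and an 11×11 grid 3.5 × 10⁻⁴ – the “increasingly lower error rates” the team anticipates.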

The key factor in the result was an improvement in the performance of the physical qubits. Compared to the qubits in Google’s previous quantum processor Sycamore, Willow’s qubits feature an up to fivefold increase in qubit coherence time and a twofold reduction in error rate. Kevin Satzinger, also a research scientist at Google Quantum AI, says that the increase in physical qubit quality is due to a new, dedicated manufacturing facility as well as improved processor architecture design through so-called gap engineering.

To assess whether their Willow processor had “beyond classical” capabilities, the team used it to perform a task called random circuit sampling (RCS), which generates samples from the output distribution of a random quantum circuit. Although RCS has no practical use, it is a leading benchmark for evaluating the performance of a quantum computer because it involves a computational task considered intractable for classical supercomputers. In the RCS test, Willow achieved a clear quantum advantage by quickly performing a calculation that a classical supercomputer could complete only on time scales far exceeding the age of the universe.

Google has outlined a roadmap for a large-scale, error-corrected quantum computer that includes scaling its processor to up to a million physical qubits and reducing logical error rates to below 10⁻¹² errors per cycle. Such a computer could solve a variety of classically intractable problems in drug design, fusion energy, quantum-assisted machine learning, and other fields, the researchers say. To enable this scaling, an important research direction is to further improve the underlying physical qubits, says Satzinger. “Thanks to the proven effect of quantum error correction, even a small improvement in physical qubits can make a difference of orders of magnitude.”
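A rough back-of-the-envelope check (my own arithmetic, not a figure from the roadmap) shows how far the 10⁻¹² target is from Willow’s measured rate, assuming the factor-of-2 suppression per grid-size step simply continues:

```python
import math

current = 1.4e-3   # measured logical error rate per cycle (7x7 grid)
target = 1e-12     # roadmap goal for the logical error rate per cycle
lam = 2.0          # suppression factor per grid-size increase

# Number of further factor-of-lam suppression steps needed to hit the target.
steps = math.ceil(math.log(current / target, lam))
print(steps)  # 31: about thirty-one further halvings under this naive model
```

That gap of roughly nine orders of magnitude is why the roadmap pairs larger codes with better physical qubits, rather than relying on grid size alone.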

Viola says the result reinforces hope that fault-tolerant calculations may be within reach, but further progress requires studying the possible physical mechanisms that lead to logical errors. “As error rates become ever smaller, new or previously unconsidered noise effects and error propagation mechanisms may become relevant,” she says. “We still have a long way to go before quantum computers can run a wide range of useful applications, but this demonstration is a significant step in that direction,” says Preskill.

–Matteo Rini

Matteo Rini is the editor of Physics magazine.

–Michael Schirber

Michael Schirber is corresponding editor for Physics magazine based in Lyon, France.

References

  1. Google Quantum AI et al., “Quantum error correction below the surface code threshold,” Nature.

