Just as error correction is essential to classical computers, quantum error correction (QEC) is the key to realizing useful quantum computing. According to IEEE Spectrum’s article, titled “Quantum Error Correction: Time to Make it Work”, today’s best quantum processors suffer errors a billion trillion times more frequently than classical central processing units (CPUs). Therefore, beyond the implementation of fault-tolerant quantum error correction codes (QECC), error-free quantum computing is actually going to require involving the full stack of the quantum computer. Some of the preventative measures that will be needed to reduce error rates below the minimum acceptable thresholds include:
- At the hardware level, protecting quantum information involves shielding qubits from the environment, selecting modalities with long coherence times, maximizing qubit connectivity, arranging physical qubits to allow for the implementation of QECC, and, of course, error-free fabrication of all the various components.
- At the control systems level, pulses of lasers or microwaves must be timed with exquisite precision and measurements must not be disturbed.
- At the software level, transpilation and compilation must send the correct pulse schedules to the control systems.
- At the quantum algorithm level, initial quantum states need to be prepared accurately, circuit depth must be minimized, and multi-qubit operations must be used as sparingly as possible.
The selection of a QEC code is going to be vital, of course. One attribute they all have in common is the use of multiple physical qubits to encode single logical qubits. Some of the most popular QECC examples, and how they encode logical qubits, include:
- Shor Code - The first QECC ever devised; it uses nine physical qubits to encode one logical qubit and can correct both bit-flip and phase-flip errors.
- Steane Code - Encodes one logical qubit in seven physical qubits and also corrects both bit-flip and phase-flip errors; it can be implemented so that the correction procedure does not introduce new errors, which makes it fault-tolerant.
- Surface Code - Encodes logical qubits in two-dimensional lattices of physical qubits; information is then processed by physically rearranging the logical qubit patches.
- Quantum Low-Density Parity Check (qLDPC) Code - Keeps stabilizer usage sparse and uniform: each physical qubit is protected by only a small, fixed number of stabilizer qubits, and each stabilizer qubit protects only a small, fixed number of physical qubits.
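As a rough comparison, codes like those above are often described by their [[n, k, d]] parameters: n physical qubits encoding k logical qubits with code distance d. A minimal Python sketch using commonly cited parameters (the surface-code entry assumes a single distance-3 patch, and qubit-count conventions vary by layout):

```python
# Commonly cited [[n, k, d]] parameters for a few QECC examples:
# n = physical qubits, k = logical qubits, d = code distance.
codes = {
    "Shor":          (9, 1, 3),
    "Steane":        (7, 1, 3),
    "Surface (d=3)": (17, 1, 3),  # 9 data + 8 measurement qubits; conventions vary
}

for name, (n, k, d) in codes.items():
    # A distance-d code can correct up to (d - 1) // 2 arbitrary single-qubit errors.
    correctable = (d - 1) // 2
    print(f"{name}: {n} physical -> {k} logical, corrects {correctable} error(s)")
```

All three examples are distance-3 codes, so each can correct any single-qubit error; they differ mainly in overhead and in how naturally they map onto hardware.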
For further reading, our “Introduction to Quantum Error Correction” page summarizes a The Quantum Insider article written by our CMO in which he discusses the need for QEC, the types of errors that can occur, a comparison of error mitigation and error correction, the concept of logical qubits, prototypical QECC, the current state of the art in QEC, and concerns about the scalability of quantum computers when factoring in the large physical qubit counts required by QEC.
The Challenge of Quantum Errors
The first and foremost challenge of QEC is containing errors. If one physical qubit contains the information and it experiences an error, the information is incorrect. But if three physical qubits contain the same information and one experiences an error, the two qubits with the correct information outvote, in some sense, the incorrect qubit.
The problem, if this remains unchecked, is that a second qubit may eventually experience an error. Now the two incorrect qubits outvote the one correct qubit, and that incorrect information can cascade throughout the entire system as further operations are applied.
The mission of a QECC is to prevent this from happening: to stop localized errors from accumulating and spreading. Auxiliary qubits, called stabilizer qubits, can be used to detect that the data qubits are in disagreement, and an operation can then be applied that brings the incorrect qubit back into agreement with the two correct qubits. This way, when a second qubit experiences an error, the other two qubits are still correct, and the new disagreement is detected and corrected. The logical qubit continues to provide correct information to the remainder of the system.
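The majority-vote idea above can be sketched with a classical simulation of the three-qubit bit-flip repetition code. This is a toy model only: real QEC measures parities via stabilizer qubits rather than reading data qubits directly, and the error probability below is an arbitrary illustration:

```python
import random

def encode(bit):
    """Spread one logical bit across three physical bits (classical analogue)."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each bit independently with probability p (bit-flip channel)."""
    return [b ^ (random.random() < p) for b in bits]

def correct(bits):
    """Majority vote: the two agreeing bits outvote the flipped one."""
    return [1, 1, 1] if sum(bits) >= 2 else [0, 0, 0]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.05          # arbitrary single-bit error probability, for illustration
trials = 100_000

# Unprotected: a single bit fails whenever it flips.
unprotected_errors = sum(random.random() < p for _ in range(trials))

# Protected: the vote fails only when two or more bits flip in the same round.
protected_errors = 0
for _ in range(trials):
    noisy = apply_noise(encode(0), p)
    if decode(correct(noisy)) != 0:
        protected_errors += 1

print(f"unprotected error rate ~ {unprotected_errors / trials:.4f}")
print(f"protected error rate   ~ {protected_errors / trials:.4f}")
```

With one bit, the failure rate is p; with three bits and a vote, it drops to roughly 3p² (two simultaneous flips), which is why correcting errors round by round keeps them from accumulating.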
QEC encounters some interesting challenges along the way, including:
- Not all time-tested classical error correction techniques are applicable to quantum computing due to the no-cloning theorem, which prohibits the precise duplication of unknown quantum information.
- An error correction threshold of 99% fidelity for two-qubit entangling gates must be surpassed in order for fault tolerance to be realized, although a team from Harvard, MIT, and QuEra has already demonstrated that this is possible at scale.
- Quantum technologies are imperfect, resulting in errors from the application of gates, including crosstalk as gates unintentionally affect neighboring qubits.
- Quantum metrology and quantum error correction need to be developed together, as precise measurements may help reduce unwanted effects from environmental noise.
The first point is significant, because QEC would be much easier if classical error correction could be employed. Unfortunately, the probabilistic nature of qubits, despite all of the benefits it provides, becomes an obstacle in this instance. Imagine a qubit in an equal superposition, with a 50% probability of measuring 0 and a 50% probability of measuring 1. If the qubit is prepared and measured 1000 times, its probabilistic nature may result in, say, 502 measurements of 0 and 498 measurements of 1. Reconstructing the state from those statistics would result in a slightly different preparation, which means the new qubit would not be a precise clone of the original. The unknown state can be copied quite closely, but it can never be an exact clone.
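The 1000-shot scenario above is easy to simulate. A minimal sketch, modeling the measurement classically as a coin flip (the exact counts depend on the random seed, which is an arbitrary choice):

```python
import random

random.seed(42)
shots = 1000

# Prepare an equal superposition and measure: each shot yields 0 or 1
# with 50% probability (modeled here as a fair coin flip).
ones = sum(random.random() < 0.5 for _ in range(shots))
zeros = shots - ones

estimated_p1 = ones / shots
print(f"measured 0: {zeros} times, 1: {ones} times")
print(f"estimated probability of measuring 1: {estimated_p1}")
# Reconstructing the state from this estimate would prepare a qubit with
# P(1) = estimated_p1, not exactly 0.5 -- a close copy, not an exact clone.
```

Finite statistics only ever pin down the state approximately, which is the practical face of the no-cloning theorem: measurement destroys the superposition, and the record it leaves behind is not enough to rebuild the state exactly.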
One of the paradoxes of QEC is that attempting to detect and correct errors can inadvertently introduce new errors. A measurement error might result in a gate being applied unnecessarily or a needed correction being skipped. Even when the correct action is executed, applying the gate can itself introduce new errors.
Another paradox of QEC is its resource requirements. Huge numbers of physical qubits will be required to encode enough logical qubits to perform useful computation. But using more physical qubits per logical qubit, which improves accuracy, reduces the number of logical qubits available to do useful computation.
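This trade-off can be made concrete with a back-of-the-envelope calculation. The sketch below assumes a surface code, where a distance-d patch uses roughly 2d² physical qubits per logical qubit; the qubit budget and distances are illustrative assumptions, not hardware figures:

```python
def logical_qubits_available(physical_qubits, distance):
    """Rough surface-code estimate: ~2*d^2 physical qubits per logical qubit."""
    per_logical = 2 * distance ** 2
    return physical_qubits // per_logical

budget = 100_000  # hypothetical physical-qubit budget
for d in (3, 11, 25):
    # Larger distance d suppresses logical errors further but costs more qubits.
    print(f"d={d:2d}: {2 * d**2:5d} physical per logical "
          f"-> {logical_qubits_available(budget, d):5d} logical qubits")
```

The same hardware budget yields thousands of noisy low-distance logical qubits or only a handful of well-protected high-distance ones, which is exactly the tension the paragraph above describes.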
The Quantum Error Correction Code
Although there is more than one QEC code, there is some basic commonality among them. The following can be thought of as the high-level definition of a QECC:
- The goal is to protect quantum information from as many sources of noise and errors as possible.
- The quantum information is spread across multiple, perhaps many, physical qubits.
- Errors on individual qubits are detected and corrected, preventing their spread across the system.
There is some variety among implementations. Some of the lesser-known approaches to QEC include:
- Single shot quantum error correction, which uses only one round of detection and correction and then presumes that residual errors will be too limited to have negative consequences.
- Analog quantum error correction, which accounts for continuous quantum variables used in analog mode quantum computing.
- Entanglement purification and quantum error correction, which applies to quantum networks and communication applications.
It is important to note that technological advancements may continue to evolve the implementation of QECC. For example, auxiliary qubits are usually required to perform logical operations on logical qubits. However, the “qubit shuttling” ability of neutral atoms allows the physical qubits of the logical qubits to interact without these auxiliary qubits. This reduces the demand on the total number of physical qubits required. Qubit shuttling may also reduce the number of physical qubits that are required per logical qubit, further easing the requirements for fault-tolerant quantum computing.
Quantum Error Correction in Action
The IEEE Spectrum article notes that QEC has a strong theoretical foundation at this point. Beyond the accumulation of theoretical work, experimental proofs-of-concept have been demonstrated, as well. More research is needed on both sides, but progress has been made and continues to be made.
It is worth noting that many errors result from the application of gates and errors in the fabrication of qubits. The former can be avoided by simply switching to analog mode quantum computing, which uses continuous pulses instead of discrete pulses. The latter can be avoided by using natural qubits such as neutral atoms, which cannot be defective. So, while analog mode neutral atom quantum computing still requires QEC, it is closer to practical universal quantum computing than other modalities.