From Quantum Mechanical Experiments to Information Processing Systems
Image: A physicist imagines a quantum computer (generated by DALL·E)


DiVincenzo's 5 Criteria for the Physical Implementation of a Quantum Computer

In 2000, just 18 years after Feynman's vision of a quantum computer, the race to build one was only beginning. Researchers were exploring many different physical embodiments of quantum bits, such as photons, atoms, and superconducting circuits.

DiVincenzo published five basic criteria as a checklist for quantum mechanical information processing. Here is a simplified summary:


1. A scalable physical system with well characterized qubits

A quantum bit, or qubit, is a two-level system that represents a logic 1 and a logic 0. For example, it can be physically encoded in the discrete energy levels of an atom. Discrete means that when the atom is observed, it is only found in one of these energy levels and never in between. Only when not observed can the qubit be in a superposition of these states.
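To make this concrete, here is a minimal sketch in Python/NumPy (a simulation, not a description of any physical device): a qubit state is a two-component complex vector, and the squared amplitudes give the probabilities of finding the qubit in each level when observed.

```python
import numpy as np

# Computational basis states of a single qubit
ket0 = np.array([1, 0], dtype=complex)  # |0>, e.g. the atom's lower energy level
ket1 = np.array([0, 1], dtype=complex)  # |1>, e.g. the atom's upper energy level

# An equal superposition: (|0> + |1>) / sqrt(2)
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- when observed, the qubit is found in |0> or |1>, never in between
```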

Well characterized means that these two selected energy levels can be precisely controlled by an external apparatus. If the quantum particle has additional energy levels, the qubit should have a very low probability of escaping into them.

Scalable means that it is possible to build a large apparatus that can contain many of these qubits side by side and control them with high fidelity, which requires them to be isolated from unwanted interactions with the environment.


2. The ability to initialize the state of the qubits to a simple fiducial state

This is due to the simple computational requirement that registers should be initialized to a known value before the computation starts. Each computation usually starts with the qubits in a classical state, which means not entangled and not in superposition. Initialization can be achieved, for example, by observing the qubit so that it is forced to choose a state. The qubit can then be rotated into the desired state.
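A toy simulation of this measure-then-rotate initialization might look as follows (plain NumPy; the function name and the example state are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng()

X = np.array([[0, 1], [1, 0]], dtype=complex)  # NOT (bit-flip) gate

def initialize_to_zero(psi):
    """Collapse an arbitrary qubit state by measurement, then rotate it to |0>."""
    p0 = np.abs(psi[0]) ** 2                    # probability of observing |0>
    outcome = 0 if rng.random() < p0 else 1     # measurement forces a choice
    psi = np.array([1, 0] if outcome == 0 else [0, 1], dtype=complex)
    if outcome == 1:                            # rotate the collapsed state into |0>
        psi = X @ psi
    return psi

psi = np.array([0.6, 0.8j])                     # some unknown (normalized) state
print(initialize_to_zero(psi))                  # always [1.+0.j 0.+0.j]
```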

Since qubits have a high error rate, elaborate error correction is needed to detect and correct errors during the quantum computation. Error correction requires a continuous supply of freshly initialized qubits as temporary information storage. And since qubits have a limited lifetime before interaction with the environment randomizes the information stored in them, the speed at which a qubit can be initialized is critical.


3. Long relevant decoherence times, much longer than the gate operation time

The decoherence time describes how long the information in qubits remains significantly detectable before it fades into random noise. Importantly, the decoherence time doesn't need to match the total computation time because quantum error correction can extend the information's lifetime. As DiVincenzo noted:

"if the decoherence time is 10^4 ? 10^5 times the “clock time” of the quantum computer, that is, the time for the execution of an individual quantum gate (see criteria 4), then error correction can be successful."

4. A “universal” set of quantum gates

Quantum gates perform logical operations on qubits. Single-qubit gates control the superposition of individual qubits, whose states can be visualized as vectors described by two angles; controlling both angles requires at least two different gates, such as rotations around the x and y axes. To implement these gates, an external apparatus must switch specific transformations (Hamiltonians) on and off, for example through controlled lasers, applied voltages, or currents interacting with the quantum particle.
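A minimal sketch of such a pair of rotation gates, written as plain NumPy matrices acting on the two-component state vector (a simulation, not a control sequence for real hardware):

```python
import numpy as np

def rx(theta):
    """Rotation of the qubit state vector around the x axis by angle theta."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def ry(theta):
    """Rotation of the qubit state vector around the y axis by angle theta."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2), np.cos(theta / 2)]])

ket0 = np.array([1, 0], dtype=complex)
print(ry(np.pi / 2) @ ket0)  # equal superposition: [0.707..., 0.707...]
print(rx(np.pi) @ ket0)      # |1> up to a global phase: [0, -1j]
```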

A multi-qubit gate is needed to entangle the qubits, which creates a causal relationship between them. To do this, an external apparatus must allow a controlled interaction between the qubits, causing them to evolve as a function of each other's states. One realization of an entanglement gate is the CNOT gate: depending on the state of a control qubit, a target qubit is flipped or left unchanged.
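Here is a small NumPy sketch showing how a CNOT acting on a superposed control qubit produces an entangled state (again a simulation; the resulting Bell state is the standard textbook example, not something specific to DiVincenzo's paper):

```python
import numpy as np

# CNOT in the two-qubit basis |00>, |01>, |10>, |11> (first qubit = control)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # control in superposition
zero = np.array([1, 0], dtype=complex)               # target in |0>
state = np.kron(plus, zero)                          # product state |+>|0>

bell = CNOT @ state
print(bell)  # [0.707, 0, 0, 0.707] -- the entangled Bell state (|00>+|11>)/sqrt(2)
```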

Since quantum gates are imperfect, error correction is necessary. Errors are either random or systematic; systematic errors accumulate and can severely compromise computations, so they must be minimized through proper calibration of the control apparatus. Random errors degrade the coherence time. DiVincenzo quantified the tolerable random error rate at around 10^-4 to 10^-5 per gate.
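A quick numerical illustration of why systematic errors are the more dangerous kind (the per-gate error magnitude below is an assumed value chosen for illustration): a fixed miscalibration adds up linearly over the gates, while zero-mean random noise only grows as the square root of the gate count.

```python
import numpy as np

rng = np.random.default_rng(0)
n_gates = 10_000
eps = 1e-3  # assumed small per-gate angle error

# Systematic error: the same miscalibrated offset every gate -> grows linearly
systematic = n_gates * eps

# Random error: zero-mean noise each gate -> grows only as sqrt(n_gates)
random_walk = rng.normal(0, eps, n_gates).sum()

print(systematic, abs(random_walk))  # ~10.0 vs ~0.1
```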


5. A qubit-specific measurement capability

Finally, the information from the qubits has to be extracted by making an observation. To do this, the external apparatus forces individual qubits to reveal their state. This collapses the quantum state and translates the qubits into classical bits. Each qubit readout has a certain randomness, which can be accounted for by repeating the measurement many times.
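A small NumPy sketch of this repeated-readout idea (a simulation with an arbitrarily chosen state; real readout statistics would also include apparatus noise):

```python
import numpy as np

rng = np.random.default_rng()

psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])      # amplitudes for |0> and |1>
probs = np.abs(psi) ** 2

shots = rng.choice([0, 1], size=10_000, p=probs)  # repeated single-shot readouts
print(shots.mean())  # ~0.7: repetition reveals the underlying probability
```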

Error correction involves performing many readouts on intermediate qubits to detect errors within the computation. Therefore, the readout must be qubit-specific and much faster than the decoherence time.


Prof. DiVincenzo is now director of the Institute for Theoretical Nanoelectronics at Forschungszentrum Jülich. In his presentations he still shows the same great humor as in his 2000 paper, which you can find here.


Commenting on the quantum race, DiVincenzo said:

"So, what is the “winning” technology going to be? I don’t think that any living mortal has an answer to this question, and at this point it may be counterproductive even to ask it."

That statement still applies today, as the race remains undecided.
