Distributed Quantum Computing: A path to large-scale quantum computing - Part 1
Shashank V Raghavan
Artificial Intelligence | Autonomous Systems | Resident Robot Geek | Quantum Computing | Product and Program Management
A major obstacle quantum computers must overcome to be useful for industrial-scale problems is the need to be scaled up. Scaling quantum computers beyond the NISQ era (i.e. tens to hundreds of noisy qubits) will require scientific breakthroughs and overcoming many current technological hurdles. One proposed solution to the scaling problem is to connect many smaller-scale quantum processors together to form a distributed quantum computer. In this article I’ll summarize what a “distributed quantum computer” means and discuss the challenges introduced when networking quantum computers.
To start off, it’s important to know what is meant by “scaling” when it comes to quantum computing. Scaling generally refers to a few key things. The first is the number of logical qubits the quantum computer contains, where “logical” means a qubit on which one can perform logical operations. The number of logical qubits is important because as the size of the inputs to a quantum algorithm grows (longer lists, bigger numbers, etc.), the number of qubits required to perform the algorithm generally increases, and usually quite rapidly.
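To make this concrete, here is a rough back-of-the-envelope sketch (in Python) of how qubit counts grow with input size for two well-known algorithms. The numbers are simplified estimates only: the Grover count ignores any ancilla qubits the oracle needs, and the Shor count uses the roughly 2n+3 qubits quoted for one optimized construction; real implementations vary.

```python
import math

def grover_qubits(list_length: int) -> int:
    """Qubits needed just to index a search space of the given size
    (ignores any ancilla qubits the oracle itself may require)."""
    return math.ceil(math.log2(list_length))

def shor_qubits(n_bits: int) -> int:
    """Rough estimate for factoring an n-bit integer, based on the
    ~2n+3 qubits of one commonly cited optimized construction."""
    return 2 * n_bits + 3

for size in (1_000, 1_000_000, 1_000_000_000):
    print(f"Grover search over {size:>13,} items: ~{grover_qubits(size)} qubits")

for bits in (1024, 2048):
    print(f"Shor factoring a {bits}-bit number: ~{shor_qubits(bits)} qubits")
```

Even these crude estimates show the trend: modest increases in input size quickly push the logical qubit requirement well beyond what NISQ-era devices offer.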
Another component of scaling up is the number of sequential operations (or logical gates) that can be applied to the qubits before too much noise is introduced to get any usable information out. This is generally referred to as the maximum circuit depth of the quantum computer. Circuit depth depends on a number of things, but the most important are 1) the quality of the technology performing the operations on the qubits, for example microwave or laser pulses, and 2) the quality of the qubits themselves, generally measured by how long they can remain in their quantum states before decohering to a classical state. These two properties are the main factors governing how deep a circuit can be. Circuit depth is important because non-trivial quantum algorithms not only require many qubits but also generally require a high circuit depth, so being able to perform many sequential logical operations on the qubits is of top importance for scaling.
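As a rough illustration of how gate quality and coherence time bound usable depth, here is a minimal sketch using Qiskit. The coherence time and gate-layer duration below are assumed placeholder numbers, not figures for any particular device.

```python
# Build a small circuit and inspect its depth, then compare that depth
# against a crude coherence budget. All timing numbers are illustrative.
from qiskit import QuantumCircuit

qc = QuantumCircuit(3)
for _ in range(10):          # ten repeated layers of gates
    qc.h(0)
    qc.cx(0, 1)
    qc.cx(1, 2)

print("circuit depth:", qc.depth())   # number of sequential gate layers

# Back-of-the-envelope check: does this depth fit in the coherence window?
t_coherence_us = 100.0   # assumed qubit coherence time (microseconds)
t_gate_us = 0.5          # assumed average duration of one gate layer
max_layers = int(t_coherence_us / t_gate_us)
print("rough maximum depth before decoherence dominates:", max_layers)
```

The point of the toy budget at the end is simply that longer coherence times and faster, higher-fidelity gates both translate directly into more usable circuit depth.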
Lastly, the next most common feature referred to when discussing scaling is how “connected” the quantum computer is, or its “qubit connectivity”. Connectivity effectively measures the number of qubits each qubit can interact with directly. For example, if a quantum computer has n qubits and each qubit can interact directly with every other qubit, then its connectivity is n-1. If, on the other hand, a qubit can only interact directly with one other qubit, then the connectivity is 1. This is important because when performing controlled operations (e.g. a CNOT gate), the qubits generally need to be physically near each other. If the qubits are neither connected nor adjacent, they must be moved to a place where such a controlled operation can be performed, usually via what is known as a “swap chain” or via a teleportation protocol. The more connected a quantum computer is, the easier it is to perform controlled operations between arbitrary qubits, which has many implications when running quantum algorithms in practice, a major one being simpler circuit compilation. Low connectivity implies high circuit-compilation complexity, which generally means longer runtimes and less time available for performing logical operations. Increasing connectivity is a complex hardware problem, and much work is also being done on improving algorithms for qubit routing within quantum computers.
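A small sketch of this effect, assuming Qiskit is available: the same circuit is compiled once for an all-to-all coupling map and once for a line-shaped one, and the extra SWAPs forced by the line topology show up as a deeper compiled circuit. The choice of topology and basis gates here is purely illustrative.

```python
# Compare compiled depth under full connectivity vs. a linear chain.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

n = 5
qc = QuantumCircuit(n)
qc.h(0)
for target in range(1, n):
    qc.cx(0, target)         # qubit 0 must interact with every other qubit

full = transpile(qc, coupling_map=CouplingMap.from_full(n),
                 basis_gates=["cx", "u"], optimization_level=0)
line = transpile(qc, coupling_map=CouplingMap.from_line(n),
                 basis_gates=["cx", "u"], optimization_level=0)

print("depth, all-to-all connectivity:", full.depth())
print("depth, linear connectivity:    ", line.depth())
```

On the line topology the router has to insert SWAP chains to bring distant qubits next to each other, which is exactly the compilation overhead described above.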
The scaling problem is one of fine balance: improving certain aspects without degrading the others. For example, simply increasing the number of qubits in a quantum computer can make connectivity more challenging, and increasing connectivity can make increasing circuit depth harder. To get a quick overall estimate of the quality of a quantum computer, the number of qubits, the maximum circuit depth, and the level of connectivity have been combined into a single measure called quantum volume. Quantum volume is not a completely fair measure, given the vast differences in technology between the various quantum computer implementations (e.g. superconducting vs. trapped ion).
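For a flavor of how such a combined measure can work, here is a toy sketch loosely modeled on IBM’s quantum volume, which reports 2^m for the largest “square” circuit (width equal to depth, m) that a device can run successfully. The achievable_depth function below is a hypothetical stand-in for the real heavy-output benchmarking procedure.

```python
# Toy version of the quantum-volume idea: find the largest m such that
# the device can run a width-m circuit to depth m, then report 2**m.
def quantum_volume(max_qubits: int, achievable_depth) -> int:
    best_m = 0
    for m in range(1, max_qubits + 1):
        # A width-m model circuit must also reach depth m to count.
        if achievable_depth(m) >= m:
            best_m = m
    return 2 ** best_m

# Hypothetical device whose usable depth shrinks as more qubits are used
# (for example, because of crosstalk and routing overhead).
print(quantum_volume(20, achievable_depth=lambda m: 64 // m))  # 2**8 = 256
```

The shrinking achievable depth in the toy example is what captures the trade-offs above: adding qubits or connectivity is only worth it if the circuits they run can still go deep enough.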
One of the approaches to scaling that I find very interesting is distributed quantum computing. Just as classical computing introduced the concept of networking smaller processors together to distribute a computational load when scaling became an issue, the same idea applies to quantum computing: scaling quantum computers up is an issue, so one can consider networking smaller quantum computers together. Naturally, since quantum computing differs drastically from classical computing, designing networked quantum computers introduces challenges that do not exist in purely classical networks. To understand these challenges more clearly, it helps to first understand what exactly a distributed quantum computing architecture could be. This will be discussed in my next article; stay tuned.