Quantum computing and communications: Dead and very much alive.
Not ironically, I remain in a state of superposition with regard to quantum computing and communications. That is, when going about my day, I imagine myself simultaneously optimistic about its prospects and skeptical of them. Much like Schrödinger’s vivid, feline-featured visualization of the equation governing the phenomenon, however, I tend to collapse into a more definitive opinion when expressly asked.
For the uninitiated, the previous (incoherent) ramblings refer to one of the fundamental characteristics of quantum mechanics. That is, a particle such as a photon or an electron (to name two familiar elementary types) is assumed to exist in multiple states simultaneously (a superposition) until such time as it is measured. Only then are its actual position, energy level, and spin known with certainty. It is this detail that underpins the concept of wave-particle duality. This is famously demonstrated through the double-slit experiment, where particles fired from one side of the slits form an interference pattern on the other, indicative of their superposed states. The introduction of measurement devices, however, explicitly discerns the path each particle takes. At that point the wave function collapses, and evidence of localized impacts reveals the presence of particles.
The application of this phenomenon to computation has obvious advantages. While a classical bit can hold just one state at a time (1 or 0), with bits operating independently, a quantum bit (qubit) can maintain both states simultaneously. OK – but if the measured result is still a one or a zero, so what? That’s where quantum entanglement comes in. Two or more particles can be correlated in such a way that the state of one is tied to the state of the others. Entangle two, for example, and they exhibit the same state when measured. The more qubits we can entangle, the larger the number of states they can occupy at any one time: two qubits can be in 4 states, three in 8, and so on. With a quick callback to our wave-particle duality discussion, the state of a quantum computer is determined by the sum of all the individual quantum wavefunctions – the overall wavefunction.
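For a sense of scale, here is a minimal sketch of that bookkeeping (in Python with NumPy, my choice purely for illustration, not anyone's production code): describing n entangled qubits classically takes 2^n complex amplitudes, which is exactly why the state space grows so quickly.

```python
import numpy as np

# Each additional qubit doubles the number of basis states the register can
# occupy in superposition: n qubits span 2**n states.
for n in (1, 2, 3, 10, 50):
    print(f"{n} qubit(s): {2 ** n:,} simultaneous basis states")

# A two-qubit register prepared in an equal superposition of all four basis
# states |00>, |01>, |10>, |11>: each amplitude is 1/2, probabilities sum to 1.
state = np.full(4, 0.5, dtype=complex)
print(np.abs(state) ** 2)   # [0.25 0.25 0.25 0.25]
```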
Because adding waveforms can produce destructive as well as constructive interference, the qubit states must be continually tuned so that, when the computer is finally measured, the probability of reading out a correct answer is as high as possible. If all this feels particularly organic, that’s sort of the point. Quantum computers excel at solving problems rooted in nature, which is why they are viewed as particularly important in areas such as the life sciences.
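To make the interference idea concrete, here is a toy statevector sketch (again Python/NumPy, assumed for illustration): a Hadamard gate puts a qubit into an equal superposition, and applying it a second time makes the |1> contributions cancel destructively, returning the qubit to |0> with certainty.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

ket0 = np.array([1, 0], dtype=complex)                # the |0> state

plus = H @ ket0      # equal superposition: amplitudes ~[0.707, 0.707]
back = H @ plus      # amplitudes interfere: |1> cancels, |0> reinforces

print(np.round(plus, 3))   # ~[0.707, 0.707]
print(np.round(back, 3))   # ~[1, 0] -- destructive interference removed |1>
```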
There are many companies building quantum computers, and they are approaching the problem in different ways. Ultimately, the primary measurement (pun intended) of the effectiveness of a quantum computer is the number of qubits it supports and how immune those qubits are to internal and external interference. Interactions with the outside world, in the form of electromagnetic or microwave radiation, temperature variations, or even mechanical vibrations, can dramatically limit the scale and effectiveness of quantum computers. But even the most heavily insulated implementations can suffer from unintended qubit couplings and crosstalk that lead to computational errors. These issues are, of course, intensified as the number of qubits increases, a problem that has significantly slowed the development of larger systems.
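As a rough back-of-the-envelope illustration of why this compounds (the error rate below is an assumed figure, not any vendor's spec): if every qubit survives each operation with probability (1 - p), the chance of an entire circuit running error-free shrinks exponentially with both qubit count and circuit depth.

```python
# Toy model: probability that no error occurs anywhere in a circuit of
# n_qubits qubits and `depth` layers, given a per-qubit, per-step error rate.
def survival(n_qubits, depth, p_error=0.001):
    return (1 - p_error) ** (n_qubits * depth)

for n in (10, 100, 1000):
    print(f"{n} qubits, depth 100: ~{survival(n, 100):.3%} chance of no error")
# 10 qubits -> ~37%, 100 qubits -> ~0.005%, 1000 qubits -> effectively zero
```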
Topological qubits, amongst others, hold promise for reducing certain types of interference that affect traditional qubits, thereby enabling quantum computers to scale more readily. But because they require the realization of concepts only theorized to date, such as the Majorana fermion quasiparticle, it remains to be seen how quickly these problems can be overcome. While some are trying to find a particle that is more immune to noise, others are looking to solve the issue through quantum error correction (QEC). Anyone familiar with today's data-storage and transmission error correction techniques won't be surprised to hear that the basic principle of QEC involves encoding information into entangled states across multiple qubits. The issue here is how many error-prone physical qubits it takes to make one error-resistant logical qubit. The answer likely depends on the flavor of quantum computer being used, with some putting the number in the hundreds while others claim single digits.
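To illustrate the principle (not any particular vendor's scheme), here is a classical analogue of the simplest QEC idea, the three-qubit bit-flip repetition code, sketched in Python with an assumed toy error rate: spread one logical bit across three physical bits and correct with a majority vote.

```python
import random

def encode(bit):
    return [bit, bit, bit]                    # logical 0 -> 000, logical 1 -> 111

def noisy_channel(bits, p_flip=0.05):
    # Each physical bit independently flips with probability p_flip.
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)                # majority vote

random.seed(1)
trials = 100_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f} (physical rate was 0.05)")
# A single unprotected bit fails 5% of the time; the encoded version fails only
# when 2 or 3 bits flip (~0.7%). Real QEC must also handle phase errors and do
# all of this without directly measuring the data qubits.
```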
At the same time as these quantum innovations work toward matching classical supercomputing power, however, generative AI is exploding. That may be why ever more powerful (frontier) large language models (LLMs) could stall the development of quantum computers in the short term: they only need GPUs to function, which (as evidenced by the meteoric rise of Nvidia’s share price) are more than just theoretical. Microsoft’s CEO attested to this when he said, “I was thinking, quantum is the next big accelerator. It turned out the GPU was the next big accelerator.” The small point eco-friendly Microsoft tries to brush aside, however, is the incredible amount of energy required to drive these GPUs. And in countries like the US, still wary of nuclear power since Three Mile Island back in '79, the veiled suggestion that nuclear is the answer may not fly with everyone.
The simple fact is that quantum computers still promise significantly greater computational capabilities while consuming a fraction of the energy.
Aside from compute, though, quantum holds exciting potential in the field of communications – the quantum internet. Think back to my original description of entanglement, where I noted that the states of particles can be correlated regardless of distance. Send one of a pair of entangled photons, for example, 100 kilometers away. Collapse the superposition of the local one by measuring its state, and the remote one instantly assumes the matching state. Through a connection we do not yet fully understand, this quantum state transfer is referred to as teleportation which, as a Trekkie, sends tingles where it probably shouldn’t. TMI? Sorry. Anyway, photons transmitted in this manner are referred to as flying qubits, which also just sounds cool, doesn’t it?
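Here is a toy simulation of that correlation (a NumPy statevector sketch, my assumption for illustration, with no real physics of distance involved): sample joint measurements of a Bell pair and the two halves always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>) / sqrt(2): amplitudes over the basis |00>,|01>,|10>,|11>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2                      # [0.5, 0, 0, 0.5]

for _ in range(5):
    outcome = rng.choice(4, p=probs)           # sample a joint measurement outcome
    local, remote = divmod(outcome, 2)         # split into the two qubits' results
    print(f"local qubit: {local}  remote qubit: {remote}")
# Every shot prints matching values -- 0/0 or 1/1 -- never 0/1 or 1/0.
```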
While these techniques are already employed for quantum cryptography, in the form of quantum key distribution (QKD), there are limiting factors that restrict their application to general information transfer. The most obvious is the distance a photon can travel before requiring regeneration, an action that involves measurement and therefore inherently breaks the superposition. Yet it is precisely that native resilience to man-in-the-middle attacks that makes quantum communication so attractive. Answering the question of reach is the quantum repeater: although once again still in the research phase, these devices would perform entanglement swapping in order to scale out quantum transmission systems. Only then can we really consider the quantum internet viable.
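For a flavor of how QKD uses this, here is a heavily stripped-down sketch of the BB84 idea (Python with random toy data; real systems involve actual photons, error estimation, and privacy amplification, none of which appear here): only the positions where sender and receiver happened to pick the same measurement basis become key material.

```python
import random

random.seed(42)
n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# When Bob's basis matches Alice's, he reads her bit; otherwise his result is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# They publicly compare bases (never bits) and keep only the matching positions.
sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [b for b, ab, bb in zip(bob_bits,   alice_bases, bob_bases) if ab == bb]
print("sifted key:", sifted_alice, "| match:", sifted_alice == sifted_bob)
# An eavesdropper measuring in the wrong basis disturbs the photons, which shows
# up as disagreements when Alice and Bob compare a sample of the sifted bits.
```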
So, what of all the talk about post-quantum cryptography (PQC), you ask? Is that not an application of quantum? Great question. The answer is no. But the origin of PQC is rooted in the idea that wide-scale quantum computing is imminent. It has long been presumed, by some, that nation states have been furiously capturing and storing sensitive, classically encrypted data on the assumption that quantum computers could one day soon break the current public-key algorithms and release the secrets held within. This possibility dates to 1994 and the development of Shor’s algorithm, which showed how a quantum computer could efficiently solve the factoring and discrete-logarithm problems that underpin today’s public-key cryptographic schemes.
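For the curious, here is a sketch of the classical half of Shor's algorithm (Python, using a toy number of my choosing): the period-finding step below is brute-forced, which is precisely the part a quantum computer would accelerate; everything else is ordinary number theory.

```python
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1 -- the step a quantum computer speeds up."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    r = find_period(a, N)
    if r % 2:                       # need an even period; otherwise retry with new a
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:                  # trivial case; retry with a different a
        return None
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))        # (3, 5): the factors of 15
```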
With the progress of quantum computers catching up with that long-held paranoia, in 2016 the US National Institute of Standards and Technology (NIST) initiated a formal process of evaluating cryptographic algorithms that would withstand all-out assault by quantum computers. This work has also been picked up internationally by the Internet Engineering Task Force (IETF), among others. There are a number of PQC candidates being shortlisted by these standards bodies: lattice-based, code-based, hash-based, multivariate, and supersingular isogeny-based schemes, to name a few. All aim to resist quantum attacks in the future.
But how far in the future are we really talking about here? It is this promise and these advancements, coupled with seemingly insurmountable obstacles, that lead me to think quantum is both dead and alive. Let me know your thoughts in the comments section! Yes – I know – all I’m going to do with that request is prove that no one has read this. But that’s OK. If I never look myself, I can remain in that blissful state of superposition, one that includes the possibility that lots of people have.