Quantum Computation and the Birth of Quantum Computers: Its Strengths, Potentials, Weaknesses and Limitations. Part II
Felix Wejeyan
Lead Quantum Computation Researcher | Theoretical & Mathematical Physics PhD Researcher.
What if I told you that there are functors (functions, if you like) whose mathematical structure and design we can map out succinctly from primitive notions, but whose axiomatic formations and concatenations make them beyond computable – at least by our contemporary design and definition of what we mean when we say “computable”. The measure of our combined (theoretical and applied) understanding of what it means to compute, and of the difference between what is computable and what is not, forms the epistemic and ontic foundation upon which we can hope to develop a working theory of quantum states and measurements, and to build useful applications for quantum technology.
On one hand, we have come a very long way in honing our computational capabilities to where they stand at the moment; on the other hand, one of our greatest weaknesses is that we have come to believe, without any reasonable doubt, that every phenomenon is computable. Quantum phenomena, computation and computers are here to put that belief to the test and, most importantly, to ascertain how adaptable and skillful we can become in our understanding and application of computation and computational tendencies.
If you missed the first part of this newsletter, please read for continuity: https://www.dhirubhai.net/pulse/quantum-computation-birth-computers-its-strength-weakness-wejeyan-emjdc
Generically, to compute means to use mathematical and/or logical operations to process an input through a series of steps (an algorithm) in order to produce an output. This process had always been deterministic until the arrival of quantum mechanical computation, when a new set of “probabilities” was discovered/invented. Note that computing most often encompasses arithmetic calculation, data processing, algorithm execution, machine learning and simulation; but it was the thought of simulating physics (particularly quantum mechanics) with computers that gave birth to the idea of quantum computation.
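To make that distinction concrete, here is a minimal Python sketch, purely illustrative, contrasting a deterministic computation (fixed rules, repeatable output) with a probabilistic one, where the “output” is a sample drawn from a distribution:

```python
import random

# A classical computation: fixed rules, so the same input always yields
# the same output.
def classical_add(a, b):
    return a + b  # deterministic, step-by-step rule

# A probabilistic computation: the "output" is a sample from a distribution.
# The 0.5 here stands in for a Born-rule probability a quantum measurement
# might produce; the function is purely illustrative.
def probabilistic_bit(p_one=0.5):
    return 1 if random.random() < p_one else 0

print(classical_add(2, 3))                       # always 5
print([probabilistic_bit() for _ in range(10)])  # differs run to run
```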
The important question would be, if this computer were laid out, is there in fact an organized algorithm by which a solution could be laid out, that is, computed? – Richard Feynman
When simulating physical phenomena with classical computers, we imitate time as continuous, or as uniformly divided from event to event and state to state if discretized. But how do we actually simulate time itself, and not just imitate it? The proposition that the positron behaves as an electron going backwards in time carries the implication that, in order to simulate time rather than imitate it, the future and past states of an isolated physical system must be computably interconnected. The double-slit experiment likewise demonstrated that the future of a photon just emanating from its source depends on whether its path holds a single or a double slit to pass through.
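A toy calculation makes the double-slit point concrete: when both paths are open, it is the complex amplitudes that add, not the probabilities, so the photon’s “future” at the screen depends on the slit configuration. The numbers below are illustrative, not taken from any experiment:

```python
import numpy as np

# Toy double-slit arithmetic: amplitudes add, probabilities do not.
# psi1 and psi2 are hypothetical complex amplitudes for reaching the same
# point on the screen via slit 1 and slit 2 (values chosen for illustration).
psi1 = 0.6 * np.exp(1j * 0.0)    # path through slit 1
psi2 = 0.6 * np.exp(1j * np.pi)  # path through slit 2, opposite phase

p_one_slit = abs(psi1) ** 2                    # only slit 1 open
p_classical = abs(psi1) ** 2 + abs(psi2) ** 2  # what adding probabilities predicts
p_quantum = abs(psi1 + psi2) ** 2              # both slits open: amplitudes interfere

print(p_one_slit, p_classical, p_quantum)
# Here the opposite phases cancel, so p_quantum is (numerically) zero:
# destructive interference, which no sum of positive probabilities can give.
```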
Classical computers are adaptable to and can simulate classical physics because it is local, causal and reversible. But is there a computer that can simulate a physically isolated state function which has both past and future state elements within its present configuration? How would such a computer describe its computational results in our already familiar simulation of time by imitation? Can such a computer, if properly localized, simulate its own localized future?
…always have had a great deal of difficulty in understanding the world view that quantum mechanics represents. At least I do, because I'm an old enough man that I haven't got to the point that this stuff is obvious to me. Okay, I still get nervous with it. …every new idea takes a generation or two until it becomes obvious that there's no real problem. It has not yet become obvious to me that there's no real problem. I cannot define the real problem; therefore, I suspect there's no real problem, but I'm not sure there's no real problem. So that's why I like to investigate things. – Richard Feynman
To bring the answers to these questions seemingly within our computational capabilities, quantum mechanical equations describe functions which house past, present and future variables within a probabilistic description, at the expense of classical precision and predictability. The big questions have now been reduced to one: how do we simulate these quantum probabilities?
The major problems with simulating these probabilities are discretizing the probabilities, dealing with negative probabilities, and the explosive growth in the number of configurations for any sizeable isolated system. To address these problems, we had to sacrifice our predictive precision and opt for a probabilistic computer. Such a probabilistic computer is an imitator of nature: it would not simulate nature exactly, but would simulate it with similar probability. Still, a nagging question remains at the back of our minds: how can we design a computer that can simulate quantum mechanics?
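A minimal brute-force sketch in Python shows these problems at once: the amplitudes carry signs that ordinary probabilities cannot (the “negative probabilities” above), and an n-qubit system already needs 2^n complex numbers, which is exactly the explosive growth in configurations:

```python
import numpy as np

# Why brute-force simulation explodes: an isolated n-qubit system needs
# 2**n complex amplitudes, and amplitudes (unlike probabilities) carry
# signs/phases that can cancel.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                # start in |00...0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # note the negative entry
U = H
for _ in range(n - 1):                        # Hadamard on every qubit
    U = np.kron(U, H)
state = U @ state

probs = np.abs(state) ** 2                    # Born rule: |amplitude|^2
samples = np.random.choice(2 ** n, size=5, p=probs)
print(len(state), probs.round(3), samples)    # 8 amplitudes for just 3 qubits;
                                              # at n = 50 the vector no longer fits in memory
```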
These efforts come in two flavors, depending on the desired level at which the theory is described: the so-called Generalized Probabilistic Theories approach and the black-box approach.
We could either design a computer built from quantum mechanical elements which obey quantum mechanical laws, or continue with our classical computational capabilities (and computers) but adapt them to imitate quantum computational tendencies. Do we build the computer with quantum mechanical elements (a direct translation of quantum mechanical language in and out of the computer) that obey quantum mechanical laws? Or do we build a logical universal automaton, similar to our already familiar classical computer, to imitate quantum mechanical processes?
At this point, a clear “chasm” (designed and defined by a host of no-go theorems) is drawn between two designs/implementations of quantum computation, and we have to choose which is more feasible. Well, we chose the familiar classical computational algorithms (by panel-beating classical logic into quantum logic), mixed with a blend of quantum mechanical elements (our hypothetical quantum processors), thus giving birth to the quantum computer. So quantum computers were not designed for speed or supremacy over classical computers; rather, they were designed to solve a different type of computational problem that is apparently beyond the scope of classical computation, or to solve difficult classical computational problems quite easily.
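In miniature, the “panel-beating” looks like the sketch below: a classical logic gate such as NOT survives as a matrix acting on amplitudes, but the same linear formalism admits gates with no classical truth table at all. The matrices and names here are standard textbook objects, used purely for illustration:

```python
import numpy as np

# "Panel-beating" classical logic into quantum logic, in miniature:
# the classical NOT survives as a permutation matrix acting on amplitudes,
# but the same formalism admits gates, like the Hadamard below, that have
# no classical truth table.
X = np.array([[0, 1], [1, 0]])                # NOT, classical and quantum
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # no classical analogue

zero = np.array([1, 0], dtype=complex)        # the bit/qubit |0>
print(X @ zero)  # [0, 1]: behaves exactly like classical NOT on a bit
print(H @ zero)  # [0.707, 0.707]: a superposition, not a definite bit value
```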
Alan Turing described the “human computer” as someone who is “supposed to be following fixed rules; he has no authority to deviate from them in any detail.”
The respect for this chasm is deeply rooted in the fact that quantum logic/computation seems strictly ad hoc in design, lacks any sort of intuitive visualization, and, according to the mathematicians, has not been founded on a rigorous/axiomatic mathematical foundation. The possible routes to a solution were discussed in the previous part of this newsletter, but a lingering question remains: does quantum computation necessarily require a consistent and rigorous axiomatic foundation? Are there other ways of achieving a relatively/probabilistically consistent structure which are not axiomatic in any sense yet defined? This would suggest that we need to readdress, redefine and probably extend what we ultimately mean by computation. We suspect there is a real problem with quantum computation; yet since we cannot define the real problem, we suspect there is no real problem; but we are not sure there is no real problem, so we continue to investigate.
Another interesting question to answer is: what do we hope to build/design/implement if we manage to breach this chasm between the two possible designs for quantum computers? In other words, what possibilities await us when we are able to mix quantum processors and classical processors in a complete hybrid circuit? One could suggest interactive and non-interactive Zero Knowledge Proofs (ZKP) as a theory for breaching the chasm while preserving the probabilistic and unpredictable nature of quantum mechanics; but that opens the door to many more problems, with scalability standing as a colossus among them: we can be fairly certain that such monstrosities as the contemporary designs of quantum computers will become the ENIAC/UNIVAC-1 of the quantum computers of the future.
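As a sketch of what such a hybrid circuit might look like computationally, here is the generic variational pattern often proposed for near-term hybrid machines: a classical optimizer steering a parameterized “quantum” subroutine, simulated here with plain linear algebra. All names are illustrative, not any particular machine’s API or the design discussed above:

```python
import numpy as np

# Hybrid pattern: a classical optimizer (outer loop) tunes the parameter
# of a "quantum" subroutine (here simulated with plain linear algebra).
def ry(theta):                               # single-qubit rotation gate
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def energy(theta):                           # the quantum processor's job
    state = ry(theta) @ np.array([1.0, 0.0])          # parameterized state
    return abs(state[0]) ** 2 - abs(state[1]) ** 2    # measured <Z>

theta, lr = 0.1, 0.4
for _ in range(60):                          # the classical processor's job
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= lr * grad                       # gradient descent on <Z>
print(theta, energy(theta))                  # theta -> pi, energy -> -1
```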
This article is for instructional and didactic purposes. If you found it informative and inspiring, please like, comment, share and re-post.
Next week’s publication: Quantum Mathematical Objects: Are the mathematical objects used in describing Quantum Mechanics discovered or invented?