THE RADICAL DISCOVERY OF THE NOBEL PRIZE IN PHYSICS 2022.
INTRODUCTION
The Royal Swedish Academy of Sciences announced its decision to award the Nobel Prize in Physics 2022 to three researchers: Alain Aspect, John F. Clauser, and Anton Zeilinger.
Brief profile of these Nobel Laureates
Alain Aspect, born 1947 in Agen, France. Ph.D. 1983 from Paris-Sud University, Orsay, France. Professor at Institut d'Optique Graduate School – Université Paris-Saclay and École Polytechnique, Palaiseau, France.
John F. Clauser, born 1942 in Pasadena, CA, USA. Ph.D. 1969 from Columbia University, New York, USA. Research Physicist, J.F. Clauser & Assoc., Walnut Creek, CA, USA.
Anton Zeilinger, born 1945 in Ried im Innkreis, Austria. Ph.D. 1971 from the University of Vienna, Austria. Professor at the University of Vienna, Austria.
ENTANGLED STATES – FROM THEORY TO TECHNOLOGY
Alain Aspect, John Clauser and Anton Zeilinger have each conducted groundbreaking experiments using entangled quantum states, where two particles behave like a single unit even when they are separated. Their results have cleared the way for new technology based on quantum information. The indescribable effects of quantum mechanics are starting to find applications. There is now a large field of research that includes quantum computers, quantum networks, and secure quantum-encrypted communication. One key factor in this development is how quantum mechanics allows two or more particles to exist in what is called an entangled state. What happens to one of the particles in an entangled pair determines what happens to the other particle, even if they are far apart. For a long time, the question was whether the correlation arose because the particles in an entangled pair contained hidden variables, instructions that tell them which result they should give in an experiment. In the 1960s, John Stewart Bell developed the mathematical inequality that is named after him. It states that if there are hidden variables, the correlation between the results of a large number of measurements will never exceed a certain value. However, quantum mechanics predicts that a certain type of experiment will violate Bell's inequality, thus resulting in a stronger correlation than would otherwise be possible.
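As a concrete illustration of such an inequality (the CHSH form actually tested in the laureates' experiments; the explicit formula is not given in the announcement above): for two measurement settings a, a' on one side and b, b' on the other, every local hidden-variable theory requires the combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b') of correlation values E to satisfy |S| <= 2, whereas quantum mechanics predicts values up to 2*sqrt(2), about 2.83, for suitable settings on entangled pairs.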
John Clauser developed John Bell's ideas, leading to a practical experiment. When he took the measurements, they supported quantum mechanics by clearly violating a Bell inequality. This means that quantum mechanics cannot be replaced by a theory that uses hidden variables. Some ambiguities remained after his experiment. Alain Aspect developed the setup, using it in a way that closed an important loophole. He was able to switch the measurement settings after an entangled pair had left its source, so the setting that existed when they were emitted could not affect the result. Using refined tools and long series of experiments, Anton Zeilinger started to use entangled quantum states. Among other things, his research group has demonstrated a phenomenon called quantum teleportation, which makes it possible to move a quantum state from one particle to one at a distance. "It has become increasingly clear that a new kind of quantum technology is emerging. We can see that the laureates' work with entangled states is of great importance, even beyond the fundamental questions about the interpretation of quantum mechanics," says Anders Irbäck, Chair of the Nobel Committee for Physics.
The background: years of work by several scientists preceded the prize-winning discoveries
Thomas Görnitz, in a research paper of 26 October 2022, recalled the earlier progress so that we can understand what the Nobel Prize in Physics 2022 means for the future. It is the long-pending public recognition of experiments that confirm a fundamental theoretical property of quantum physics. The 2022 Nobel Prize in Physics has been awarded to three experimental physicists whose fundamental experiments have dispelled widespread and popular doubts about the structure and interpretation of quantum theory. These experiments have paved the way for technical applications that were previously unimaginable as reality. Beaming, and message transmission that is tap-proof as a matter of natural law, previously existed only in science fiction films. Above all, however, the experiments were important milestones toward a better understanding of reality. They help to free quantum theory from the misconceptions that it is beyond our understanding, that it is limited to the field of "microphysics", and that it is relevant only to specialists.
Einstein and the non-locality of quantum theory
Einstein supported his doubts about the fundamental property of the so-called "non-locality" of quantum theory by proposing an experiment. The experiments of John Clauser, Alain Aspect, and Anton Zeilinger were essential in dispelling Einstein's objections. Together with Boris Podolsky and Nathan Rosen, Albert Einstein had theoretically shown the following: if quantum theory describes reality accurately, i.e., if there are non-local properties within its framework, and thus properties that are widely distributed in space, then two particles that are created together would remain mysteriously connected even after they have flown far apart. Since then, experiments that investigate such relationships have been called EPR paradox experiments. Weizsäcker once told Thomas Görnitz about a conference on EPR at which Rosen was also present. There it was stated, approximately: Einstein had invented EPR to show the "absurdity of quantum theory"; Einstein had the idea, Rosen did the calculations, and Podolsky did the publicity. Rosen did not disagree with this. For the first time, a physics paper was reported in the New York Times before it appeared in Physical Review. Einstein called the mentioned persistent connection in the EPR experiment a "spooky action at a distance". According to the theory, a change at one particle results in an instantaneous change at the other particle. Thus the changes would not, as the theory of relativity requires, travel from one place to another at most at the speed of light. Such phenomena seemed unacceptable to Einstein. Erwin Schrödinger, who like Einstein felt a great aversion to the then prevailing interpretation of quantum theory, called such connected particles "entangled". The term "coherent states", which is often used today, perhaps describes the phenomenon better, for two entangled ropes are still two ropes; the point is the emergence of a "partless wholeness".

Bell's inequality

John Bell wanted to show that Einstein was right in his distrust of this "non-local" aspect of quantum theory; "non-local" because these relations exist independently of the distance between the entangled particles. Bell had developed an inequality for this in the 1960s. For non-experts it can be explained as follows: in an equation, the same thing is always written on both sides, though usually expressed differently. Everybody knows Pythagoras: a^2 + b^2 = c^2. The Thales circle above the hypotenuse c is the geometric locus of all points at which the legs a and b enclose a right angle; the corners of all such right triangles lie on the Thales circle. For points outside the Thales circle the angle becomes acute, inside it becomes obtuse. If the angle between the sides a and b becomes smaller than a right angle, because the corner between a and b moves further away from the fixed side c, then we have an inequality: a^2 + b^2 > c^2 is the condition for not having a right angle but an acute one, and a^2 + b^2 < c^2 is the condition for an obtuse angle. Thus, inequalities can be used to express differences or conditions. Bell's inequality is a condition for the difference between "local" and "non-local". According to a local theory like relativity, a "remote effect", as Einstein had called it, cannot exist: nothing propagates faster than the speed of light. In a non-local theory like quantum theory, such apparent long-distance effects can exist. However, the phrase "action at a distance" creates a wrong picture here.
We are dealing with quantum physical correlations. These can change immediately, instantaneously. The change of a correlation is not an effect traveling from one place to another. For a long time, it was considered a foregone conclusion among experts that the EPR thought experiment would never become a real one. John Bell's inequality, however, opened a door for verification: local theories satisfy it, but quantum theory does not; quantum theory violates Bell's inequality.
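The contrast can be made tangible with a small simulation. The following sketch (an illustration of the general idea, not the analysis code of any of the experiments; the simple hidden-variable rule is my own choice) compares a deterministic local hidden-variable model with the quantum mechanical prediction for the CHSH combination S introduced above:

    # Python sketch: local hidden-variable (LHV) model vs. quantum prediction for CHSH.
    import numpy as np

    rng = np.random.default_rng(1)
    a, a2, b, b2 = 0.0, np.pi/4, np.pi/8, 3*np.pi/8   # standard CHSH polarizer angles

    def E_lhv(x, y, n=1_000_000):
        # Hidden variable: each photon pair carries a shared random polarization angle lam.
        # Deterministic rule: result +1 if the analyzer setting lies within 45 degrees of lam.
        lam = rng.uniform(0.0, np.pi, n)
        A = np.where(np.cos(2*(x - lam)) > 0, 1, -1)
        B = np.where(np.cos(2*(y - lam)) > 0, 1, -1)
        return np.mean(A * B)

    def E_qm(x, y):
        # Quantum correlation for polarization-entangled photon pairs
        return np.cos(2*(x - y))

    def S(E):
        return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

    print("LHV model: S =", round(S(E_lhv), 2))   # reaches at most the classical bound 2
    print("Quantum:   S =", round(S(E_qm), 2))    # 2*sqrt(2), about 2.83, violating the bound

Any rule of this local type, however contrived, stays within |S| <= 2; the measured violations reported by Clauser, Aspect and their successors are therefore incompatible with the whole class of such models.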
CLAUSER'S EXPERIMENT IN 1972
John Clauser, like Einstein, wanted to take action against an understanding of quantum theory, and the resulting presentations of it, that contradicted the usual ideas about Einstein's special theory of relativity. Clauser was the first who had the courage, and also the experimental means, to test Bell's inequality. The experiment, published by him in 1972 with Stuart J. Freedman (1944-2012), turned out as neither Einstein nor Bell nor Clauser had hoped. The article ended with the statement: "we observe no evidence for a deviation from the predictions of quantum mechanics, ... We consider these results to be strong evidence against local hidden-variable theories." If such local hidden variables existed, they would already determine, at the beginning of the experiment and within each of the produced particles, how the later measurement would turn out.
Each particle could then carry this determination with it, and a hidden connection of some kind to the other particle would be superfluous. However, at that time there was still much criticism of Clauser's result. Among many physicists, the consequences of quantum theory were unpopular. This dislike of the randomness that quantum theory reveals in nature still exists half a century later. Since quantum theory makes statistical statements, some believed, and some still believe, that a hidden deterministic theory could underlie it. In these considerations, classical statistical thermodynamics served as a model: there one can start from deterministic mechanics with small particles, and averaging over locations and velocities then results in a statistical theory. In Clauser's experiment, the checks of the particle behavior, i.e., the measurement settings for the photons to be emitted, were set up ahead of time. Then two entangled photons were emitted from a calcium atom and their respective properties were measured. The objection to Clauser's result was based on the fact that the particle-behavior check was in each case already set up before the photons flew off. It was argued that the photons could mysteriously adjust to the respective test situation before flying off. That the experiment turned out as quantum theory predicts would then not be actual proof that a local interpretation is impossible.
Aspect's experiment 1982
Alain Aspect published a paper in 1982 with Jean Dalibard and Gérard Roger on a new experiment against which this argument could no longer be made. In the experiment, the photons were sent through an optical fiber and were therefore en route longer than in Clauser's setup. This made it possible to set up the check only after the photons had already flown off. A joint secret "tuning" of the photons to the test situation before departure was no longer possible. At the end of Aspect's paper there is the remark: "A more ideal experiment with random and complete switching would be necessary for a fully conclusive argument against the whole class of supplementary-parameter theories obeying Einstein's causality. However, our observed violation of Bell's inequalities indicates that the experimental accuracy was good enough for pointing out a hypothetical discrepancy with the predictions of quantum mechanics. No such effect was observed." In a footnote follows another important statement: "Let us emphasize that such results cannot be taken as providing the possibility of faster-than-light communication." This fact is occasionally ignored in popular accounts of non-local phenomena. So there is no contradiction between EPR behavior and special relativity; this will be explained in more detail below.
John Bell's reaction

Some time after Aspect's paper appeared, John Bell gave a lecture at the Technical University of Munich in Garching on the experiments on his inequality, especially Aspect's new experiment. Carl Friedrich von Weizsäcker and Thomas Görnitz both went to listen to Bell. Even then there was still criticism of the interpretation of the results; some hoped that there could still be gaps in the evidence. It made a great impression on Görnitz that John Bell, who of course knew all these criticisms, merely said that he was "cold-blooded enough" to accept that reality was not as he had wished it to be, namely that a deterministic theory in Einstein's sense was hidden behind quantum theory.
In the book, "Der kreative Kosmos" (2002), which G?rnitz published together with his wife, reported about this encounter with Bell and pointed out the great importance of the connections experimentally substantiated by Aspect. Zeilinger's experiments on non-locality in quantum theory Anton Zeilinger received his doctorate in 1971 from Helmut Rauch (1939-2019) with a thesis on the behavior of neutrons. Rauch caused a sensation among physicists when he showed in 1974 that not only photons, i. e. light quanta, but even elementary particles as heavy as neutrons have wave character and can produce interference patterns. Zeilinger then conducted such interference experiments with increasingly heavy quantum particles in the course of his research. In 1999, Zeilinger's research group had begun spectacular experiments with fullerenes, molecules made of 60 carbon atoms. Interference experiments with such large masses seemed unimaginable until then. This work, which named Markus Arndt as the first author, had shown that such a huge molecule can react like a wave and generate interference lines. For a wave to generate such lines, "slits" are necessary. One speaks also of "diffraction gratings". At the slits "elementary waves" are generated, whose wave trains propagate spherically. They intensify where the mountain meets mountain or valley meets valley and cancel where mountains meet valleys. In this case, the wavelength and the gaps with their distances must be of the same order of magnitude. Further experiments initiated by John Clauser and Anton Zeilinger and carried out with their collaborators have led to the fact that today no serious physicist doubts these phenomena of non-locality. Unfortunately, we never met John Clauser personally, but Anton Zeilinger accepted G?rnitz invitation for a lecture in Frankfurt in 2000. In 2000, the hundredth birthday of quantum theory was celebrated. Together with G?rnitz postdoctoral student at the time, now Professor Dr. Gesche Pospiech, we had organized a lecture series open to the university, sponsored by Deutsche Bank: "Weltbilder im Lichte der Quantentheorie" ("Worldviews in the Light of Quantum Theory").
In 2000, Anton Zeilinger (Nobel Prize 2022) and Sir Roger Penrose (Nobel Prize 2020) met at a conference. Zeilinger spoke on 29 November 2000 about "Reality and Knowledge in the World View of Quantum Physics". Görnitz was inspired by Zeilinger's statement that he hoped such interference experiments could in the foreseeable future also be carried out with viruses. That was important to him so that biologists would finally realize that life cannot be explained without quantum theory. In 2009 a popular magazine published a special issue (Telepolis Special 01/2009, "Future. The world in 1000 years"). Görnitz contributed an article under the title "Quanta will remain different / A review of the macroscopic quantum world of the future", in which he attributed to the "Nobel Prize winner Zeilinger" a successful interference experiment with viruses already for the year 2012. The isolation of a virus, however, is much more complicated than that of a very large biomolecule. For some time now, Zeilinger has handed over the research on large molecules to Markus Arndt. Markus Arndt and Görnitz met in August 2020 at an event at the Physics Center Bad Honnef of the German Physical Society; the "physics-philosophy quartet" there dealt with the question "Quantum Mechanics and Reality - What is Matter?"

Zeilinger and "beaming"

While the interference experiments with large organic molecules excited mainly the experts, the experiments of Zeilinger and co-workers on the "beaming" of quantum properties, and especially on the tap-proof transmission of messages by means of entangled photons, also attracted the attention of a larger public. In these experiments, information about quantum states is transmitted from one place to another by means of entangled photons. How quantum theory ensures, in the application of these methods, that "eavesdropping" is impossible for reasons of natural law is explained below. It is also known of Anton Zeilinger that his interests are wide-ranging; his openness to philosophical and theological questions is evident in his publications. Since information is essential in his experiments, he had also taken note of von Weizsäcker's idea of primordial alternatives. Thus Zeilinger writes in 2003, in his book "Einstein's Veil: The New World of Quantum Physics": "Information is the primordial matter of the universe". On the next page, however, one reads: "It is certainly reasonable to assume that the amount of information we need to characterize a system becomes smaller, the smaller the system is." This assessment leads to the necessary discussion of the concept of information according to Weizsäcker. A reflection on "information" is needed to make the primary non-locality and the role of quantum information in physics a little easier to understand.
Having made these preparations, and before proceeding to the explanation of the experiments, it is important to ask, and also to answer, the following question: how should we talk about quantum phenomena? Görnitz's book "Quanta are Different" (1999) explained that the alleged "incomprehensibility of quantum theory" feeds on ideas that come from classical physics and that point in the wrong direction. In everyday life we are familiar with two principles. First, not only the facts we know from the past and the present influence our present actions; even possibilities that have not yet become factual are capable of significantly influencing what we currently do. Beyond the facts known to us, we also act according to what we expect, fear or hope. The process of our thinking and behavior is subject not only to influences that are factual but also to those that are probable yet by no means already established as facts. This is a central everyday experience. For decades it has been important to show that quantum theory not only captures this central aspect of human experience but actually grounds it. That possibilities become effective holds not only for us humans; examined closely, it holds for nature in general. It should be noted that what is meant here is not the possibilities treated as "unknown facts" in classical probability theory. Already in inanimate nature, possibilities that are not yet factual influence the actual behavior of quantum particles. This can be seen, for example, in interference experiments, because quantum objects behave completely differently depending on which possibilities are open to them or closed to them. It matters, for example, whether they have the possibility to fly through several adjacent slits at the same time, or whether they are forced, by monitoring or by other restrictions of their possibilities, to fly in fact through only one hole. This becoming effective of possibilities is captured in physics only by quantum theory. Second, it is very often true, and especially true of living beings, that a whole is more than the sum of the parts into which it can be divided or from which it was built. In former times, students of medicine or biology in particular had to dissect frogs. When the parts lie neatly separated on the tray, the impression cannot be avoided that in this dissection the essence of a living frog is no longer present. Quantum theory, too, knows wholeness with properties that none of its possible parts can show in any way. With the decomposition of a wholeness, essential properties are lost, because they can no longer be present even in the sum of all parts. To illustrate the "instantaneousness" in the change of a wholeness, the holistic shape of a large vase may serve as an analogy. For a vase, too, it is true that it cannot be understood as the sum of its possible broken pieces. If a small corner at the top of a large vase is knocked out, then the entire vase is instantly damaged, not just the top rim. The disintegration of a whole is an instantaneous process. The change in shape is instantaneous, regardless of its extent or size, and has nothing to do with a violation of relativity. (The sound waves from the blow, of course, travel slowly through the porcelain.) Quantum theory is thus not only a physics of possibilities but also a physics of relations, because relations produce wholes. These characterizations follow the mathematical structure of quantum theory.
This clarifies the fundamental point about quantum theory in understandable everyday language, as Görnitz has been writing for decades. Nevertheless, quantum theory is still mostly referred to by the usually too narrow term "quantum mechanics" and presented to the general public with the restriction to "smallest particles". This is then often garnished with the thesis that quantum theory is incomprehensible or even crazy. The basic structures of quantum theory shown here, and the behavior of quantum objects resulting from them, correspond closely to our everyday experiences. The characterization of quantum theory as the physics of possibilities and relations applies to all quantum structures, including quantum particles. In addition to photons, this includes elementary particles such as neutrinos, electrons, and protons, as well as more complex ones such as atoms and molecules. "Elementary particles" are still represented today as the simplest structures, in a tradition that goes back thousands of years. But since in quantum theory the exact specification of the state of a particle requires, in principle, the specification of infinitely many values, it becomes clear that a quantum particle is not a simple structure, i.e., it embodies "nothing simple". A quantum bit, on the other hand, is the simplest of the possible quantum structures. It is a whole that cannot be divided into any parts. Only two values determine its state, and any measurement on a quantum bit can only produce one of two mutually exclusive results. Note that we speak of simple structures and not of spatially small structures! But for more complex quantum structures, too, it is true that they are partless wholes before they have been decomposed, factually or in the theoretical description, into possible parts. Therefore, for understanding, it is important that an entangled structure always acts as a wholeness. However, it does not follow from this that everything that is perceived as a wholeness must also be completely entangled. In a molecule, for example, the behavior and properties can only be explained by the wholeness of the electron shell. This is constituted by the entangled electrons; they act like a partless wholeness. It contains far more in the way of properties than can be imagined from the picture of "many electrons" alone. The tensor product structure, which is fundamental for quantum theory, makes completely new things possible. Since in the context of chemistry no particles with rest mass are created or annihilated, it is certainly very convenient in a description of a molecule to treat the atomic nuclei like factual structures, as is usual in the context of classical physics. Describing entanglement also for the atomic nuclei is unlikely to be of any use. So we can conclude that in some cases, even in the atomic realm, the full accuracy of quantum theory may not be necessary.
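To make the quantum bit described above a little more tangible, here is a minimal sketch (my own illustration, not taken from the article): the state of a qubit is a normalized two-component complex vector, and any single measurement produces exactly one of two mutually exclusive results.

    # A qubit state |psi> = alpha|0> + beta|1>; measuring it yields one of two facts.
    import numpy as np

    rng = np.random.default_rng(0)

    alpha = 1 / np.sqrt(3)                       # illustrative amplitudes
    beta = np.sqrt(2 / 3) * np.exp(1j * 0.4)
    psi = np.array([alpha, beta])
    assert np.isclose(np.linalg.norm(psi), 1.0)  # states are normalized

    p0, p1 = abs(alpha)**2, abs(beta)**2         # Born rule: probabilities of the two results
    outcome = rng.choice([0, 1], p=[p0, p1])     # one measurement produces one fact
    print(f"P(0) = {p0:.3f}, P(1) = {p1:.3f}, this run gave: {outcome}")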
These two principles, "a whole is more than the sum of its parts" and "possibilities produce real effects", which are familiar to us from everyday life, characterize quantum theory, as said before, as a physics of possibilities and a physics of relations. It is also part of the concept of possibility that possibilities must always be thought of as multiplicities; if they were interpreted as facts, they would have to contradict each other. It is possible now to plan to go to the cinema or to the theater tomorrow evening, but it is impossible to do both as facts at the same time. The "measuring process" is a special case of the transition from possibilities to facts. Since quantum theory, as a theory of possibilities, does not describe facts, classical physics remains indispensable as a theory of facts. Historically, the appearance of a fact in quantum theory was taken as given when an observer established a measurement result. That a measurement result is understood as a fact should be evident. From this point of view, however, the further discussion went off course: in part, the consciousness of the observer was understood not only as that with which he takes note of the fact, but consciousness was even supposed to cause the fact. Clarity was brought by the theory of "decoherence" founded by Zeh, Joos, and Kiefer. As long as a quantum system can be described as "isolated", its states remain possibilities of an "extended presence", and no fact yet arises. The more massive a quantum system is, the more difficult it becomes to keep it isolated. Through interactions with its environment, the system becomes entangled with it; the nearer environment becomes entangled with the wider environment, and so on.
Now, in the theoretical description of the quantum system, one can computationally cut it out of its entanglement with the environment. Then one no longer obtains a single quantum state together with the non-orthogonal states compatible with it; instead, the result is a "density matrix". In the course of time this goes over more and more into a classical probability distribution over different quantum states. The density matrix is not a quantum theoretical superposition. It expresses that this "cutting out" does not result in a unique quantum system with a defined state but in several possible ones. The theory of decoherence then shows that this description with a density matrix approaches, in the course of the temporal development, an "ignorance of facts", of which one has occurred. This finally corresponds to the situation: the die has been cast, but the dice cup still hides the result. In the mathematical description, decoherence leads to an unknown fact only after an infinite time. Here again the pragmatism of physics comes into play. If an experimenter observes the quantum system, he decides when enough time has passed and the situation therefore appears to him as factual; then the fact can also come to his attention. As already explained in "Quanta are Different", quite generally, and also without an observer present at a quantum system, a fact arises when information about quantum possibilities has irreversibly escaped from the system into the expanding cosmos. Here, too, the close connection between quantum theory and cosmology becomes visible. However, before showing how quantum theoretical principles can be used to facilitate insight into how the experiments honored with the 2022 Nobel Prize can be understood and interpreted, a clarification of the ideas about "quantum information" is necessary.
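The step from an entangled whole to a density matrix that merely expresses ignorance can be shown in a few lines (a minimal numerical sketch of the standard partial-trace construction, not code from the cited literature):

    # "Cutting" one qubit out of an entangled pair yields a density matrix that is
    # indistinguishable from classical ignorance of a fact.
    import numpy as np

    # Entangled two-qubit state (a Bell state): (|00> + |11>)/sqrt(2)
    phi = np.zeros(4, dtype=complex)
    phi[0] = phi[3] = 1 / np.sqrt(2)
    rho = np.outer(phi, phi.conj())          # density matrix of the whole

    # Partial trace over the second qubit -> reduced state of the first qubit
    rho = rho.reshape(2, 2, 2, 2)            # axes: (i, j, i', j')
    rho_A = np.trace(rho, axis1=1, axis2=3)

    print(np.round(rho_A.real, 3))
    # [[0.5 0. ]
    #  [0.  0.5]]   -> a 50/50 mixture: "the die has been cast, but the cup still hides it"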
C. F. von Weizsäcker's early philosophical insight "matter is information" has lasting value; that is why Görnitz repeatedly debated the role of information with him. Weizsäcker had coined the term "Ur-Alternativen" ("primordial alternatives") for quantum structures that have the simplest space of states that is mathematically possible. Thus an "Ur" has a two-dimensional complex-valued Hilbert space. With the multiplicative structure of quantum theory, all arbitrarily complicated structures can be generated from such abstract two-dimensional complex structures. If information is to be equivalent to matter and energy, then it must be defined absolutely. Weizsäcker, however, was of a different opinion. He spoke of the "relativity of the urs" and therefore formulated in 1985 in "Aufbau der Physik": "An 'absolute' concept of information has no sense; information always exists only 'under one concept', more precisely 'relatively on two semantic levels'." Information is usually determined only up to a value that can be freely chosen. Thus, the information in the genome can be determined by interpreting the four nucleobases as letters and then calculating the information from their combination. However, one could also start from the atoms and first calculate the information leading to the nucleobases; this value would then have to be added to the previous one. As for the arbitrarily chosen starting point, one could finally also begin with protons, neutrons, and electrons. This is what von Weizsäcker means by different semantic levels. However, as long as a physical quantity, e.g., temperature or information, is still referred to an arbitrarily chosen zero or starting point, we have not yet understood this quantity in a fundamental sense. As an example, think of temperature before the discovery of absolute zero. In daily life we relate temperature to the freezing point of water, but one could, as is still common in the USA with the Fahrenheit scale, also define another temperature as 0 degrees. Only with the absolute zero point of -273 °C = 0 K could temperature be understood as the internal motion of substances; 0 K is the state in which no internal motion exists anymore, and less than "no motion" cannot be imagined. Energy provides another example: potential energy is defined only up to an arbitrary constant. Only in the theory of relativity does energy become an absolute quantity, through the discovery of the equivalence of mass and energy, so that finally in quantum field theory the positivity of energy becomes a central postulate. The connection between information and entropy requires further explanation. For the concept of entropy there are, similarly as for energy, many different definitions, and its mathematical form appears in different shapes. In our context the most appropriate, shortest, and nevertheless still applicable definition might be the following: entropy is information that is not available. Earlier, information entered physics only in the form of the term "entropy", and entropy too has so far been determined only up to a freely selectable constant. The changes in entropy can be measured, like the changes in potential energy. Entropy is information to which no meaning can be assigned. At the interface of quantum theory and gravitation theory, i.e., in connection with black holes, entropy now likewise becomes a fundamental physical quantity that can receive an absolute value.
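A tiny numerical illustration of this "relativity on semantic levels" (my own example, not from the article): counting the genome at the level of letters gives one value; shifting the starting point to a lower level adds further bits.

    # Information content of a genome, counted at the "letter" level.
    import math

    bases = 3_000_000_000                 # rough length of the human genome
    bits_per_base = math.log2(4)          # four equally likely nucleobases -> 2 bits each
    print(bits_per_base * bases / 8e9, "gigabytes as a letter sequence")
    # Describing each base instead by the arrangement of its few dozen atoms would add
    # many more bits per base - the count depends on the chosen semantic level.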
A black hole is a mass so large and dense that its gravitational effect, as it were, cuts the object out of space. One can make this plausible even in the framework of Newton's theory of gravity: even light is too slow to escape the gravitational field. In the framework of general relativity it is formulated by saying that a "horizon" is formed. Everything can go inside, and nothing can come out. With this, no information about what happens inside can reach the outside; all the information inside is inaccessible, i.e., it is entropy. Jacob Bekenstein (1947-2015) and Stephen Hawking (1942-2018) made it clear in the 1970s that in the case of black holes, phenomena of both quantum theory and general relativity are intertwined. They showed that the number of unknown qubits inside a black hole is related, by a simple proportionality factor, to the surface area of its horizon. This surface area, in turn, is related in a simple way to the square of the black hole's mass, and the mass can be measured by its gravitational effect. How can one arrive at "absolute information"? One must let all information become inaccessible. For black holes there is a theoretical upper limit of the mass: none could contain more than the entire matter of the universe. Now, in a thought experiment, one constructs a black hole with the entire contents of the universe. Then two things become apparent.
First, this black hole would be as big as the space of the cosmos. If one were to construct such a hypothetical black hole with the presently available observational data and then mentally drop a single additional proton into it, the total entropy would thereby increase by about 10^41 qubits. That would be the maximum amount of information that could be assigned to a single proton. Everything that might have been concrete knowledge before would now be inaccessible behind the horizon of the huge black hole. So one proton is able to add this gigantic amount of entropy. Thus von Weizsäcker's earlier estimate from the 1970s, "one proton is 10^40 bits", was connected to established physics. In further work with Eva Ruhnau it was then shown that another important result for the interior of a black hole follows from the quantum-theoretical considerations about the qubits: the unphysical infinite energy density at the center is eliminated. According to the calculations, the inside of a black hole looks like our cosmos did when it was as small as the black hole is now. How can information actually become the "primordial substance of the universe"? In biology there is the famous statement by Theodosius Dobzhansky: "Nothing in biology makes sense except in the light of evolution." This could be modified for quantum theory to: "Nothing in quantum theory makes sense except in the light of cosmic evolution." As said, the absolute zero of temperature provides an example of how important an absolute value is. Zero kelvin means there is no more internal motion, and less than no motion is impossible. Since "no energy" and "no mass" are meaningful physical statements about a theoretical limit, an absolute value must also be required for "information" if it is to become equivalent to these two quantities.
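For reference, the relation used in this argument can be written out explicitly (the standard Bekenstein-Hawking formula, which the text refers to but does not quote): the entropy of a black hole is S = k * c^3 * A / (4 * G * h-bar), where A is the area of the horizon; for a Schwarzschild black hole A = 16 * pi * G^2 * M^2 / c^4, so the entropy is indeed proportional to the square of the mass M, and counting that entropy in bits gives the number of "unknown qubits" hidden behind the horizon.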
A connection to the cosmological considerations is therefore unavoidable for information. The many calculations carried out and published in this connection are summarized and explained in the book "Understanding Quantum Theory" (2022). If information is the primordial substance of the universe, then zero quantum bits would be a limiting value that could designate the beginning of the big bang. The existence of time, space, and everything in them then begins, theoretically, with one quantum bit. The quantum bits are thereby related to the whole; they are absolute in this sense. The constant increase in the number of quantum bits is another name for the expansion of cosmic space. If, therefore, "information is the basic building block of our world", then information need not be identical with matter or with energy; rather, there must exist an equivalence of matter and energy with quantum information that is to be understood as absolute and therefore "objective". Equivalence is not equality, but it must be connected with the possibility of a transformation process. Let us take a simple example from everyday life: you cannot eat a €0.50 coin, but under certain conditions €0.50 is equivalent to a pretzel, so in a certain context one can be converted into the other.

Absolute Bits of Quantum Information (AQIs)

The demanded equivalence between matter, energy, and information is only possible if the term "quantum information" is used in an absolute sense. "Absolute" means that the quantum bits must be related to the cosmos as a whole. This reference of the very simplest of the mathematically possible quantum structures to the cosmos requires, on the one hand, that they are not associated with any concrete object or structure within the cosmos. They are therefore highly abstract and not yet connected with any concrete or special meaning. They are still meaning-free but open to possible meanings.
They are called "Absolute Bits of Quantum Information" (AQIs). In the course of cosmic development, many of them form in each case also to material and energetic quantum particles. These in turn can join together to larger formations, whose properties then - again as special formations of AQIs - are open for possible further formations. In this way, the elementary particles develop, and from these then the atoms and molecules. But by no means do all AQIs in the cosmos form themselves into particles. Concentrations of AQIs which are no particles cause those effects for which nowadays the term "dark matter" had been introduced. The AQIs are to be understood therefore as still free from any concrete design and meaning. To avoid the above-mentioned misunderstanding "information = meaning", in this context "information" had to be replaced by a neutral term: "polyposis, the pre-coined". We can understand the AQIs of polyposis as the elementary pre-structures for the immeasurably many formations and shapes which arise from them in the cosmic evolution. Later in cosmic evolution living beings arise. These can carry out their respective meaning assignments and evaluations of the forms and shapes which have developed from the AQIs. Thereby some properties of such objects become "meaningful information". An AQI alone is so simple that there can be nothing simpler. Even a distinction between "form" and "content" is not yet possible with it. Since there is an equivalence to energy, an AQI must have the smallest energy which is possible in the cosmos. The quantum theory began with Max Planck's formula: E=hf=hc/l. The energy E of a quantum is equal to the quantum of action h times the frequency f or quantum of action h times the speed of light c, divided by the wavelength l. The smallest energy corresponds to the largest wavelength. The largest wavelength in the cosmos will correspond to the respective radius of curvature of cosmic space. As the cosmos expands, the largest possible wavelength will also become larger and larger and therefore the energy equivalent of an AQI will become smaller and smaller in parallel with this expansion. The following relation results: E = mc^2 = nAQI h / tCosmos. Here the energy E of an object is equivalent to its rest mass m, multiplied by the square of the speed of light c. The equality continues with the number of AQIs forming the object nAQI, multiplied by the quantum of action h and divided by the age of the cosmos. An AQI is as said a cosmically extended structure. Only with very many of them something "small", something localized, can be produced. Thus "non-locality", and extendedness, become a foundation of the quantum theory. Locality requires a lot of information or a lot of energy or mass. The AQIs as the foundation of the reality describable by natural science Since thousands of years, one followed the idea of antique Greek philosophers, the simplest would be spatially tiny objects. They were so small that they could not be decomposed into something even smaller. They were indivisible, " atomos". Today we know that atoms can be decomposed. The only thing that is in principle indivisible is quantum bits. They are the simplest of the possible structures and are therefore equivalent to the smallest amount of energy possible in the cosmos. Their quantum physical description with a wave function assigns to it a wavelength of the order of the extension of the cosmic space.
The simplest and therefore fundamental has a cosmic character. It must be admitted that this realization demands a great change in our ideas. The dogma that things become simpler in the small does, after all, apply with great success in the realm of chemistry, i.e., down to the atoms. This can be justified by the fact that in the realm of chemistry and biology the events are determined exclusively by the electromagnetic interaction. This interaction is long-range, so it can work from the "microscopic" into the mesoscopic and macroscopic. On the other hand, it is so weak that in the field of chemistry no objects with rest mass can be produced; only photons, which are massless, appear in chemistry as real and virtual quanta in varying numbers. How does one come to the energetic and material quantum particles? In the search for rules and laws of nature, things that appear insignificant are always neglected. Only by neglecting and ignoring the unimportant do the necessary similarities and equalities emerge. Thus, in classical physics, the becoming effective of possibilities and the relational structure of reality are ignored. In many cases this is sufficient and successful, in others not, as quantum theory shows. Physics differs from mathematics in that mathematical structures are used pragmatically only where they provide a good fit to nature. Thus, in particle physics, instead of curved cosmic space, a flat, i.e., uncurved, spacetime model is used: Minkowski space. In this model there are clear mathematical definitions for what can be used as the description of a particle and its states.
With Dirk Graudenz and C. F. von Weizsäcker (1992), it was calculated how, in the Minkowski space of particle physics, the quantum particles without mass can be formed from the qubits. For particles with rest mass, the calculations were carried out with Uwe Schomäcker (2012). So if particles with rest mass are also built up from AQIs, it is obvious that some qubits can then be exchanged between these particles as meaningful quantum information. An illustrative example is the exchange of the property "kinetic energy", i.e., motion with a certain velocity, in the impact of two billiard balls. The mathematical structure of the Protyposis has also provided mathematical arguments for why there are exactly the three quantum interactions in nature known as the weak, electromagnetic, and strong interactions (Görnitz & Schomäcker, 2016). The structure of general relativity can be derived as a classical approximation for entities in the Protyposis cosmos (Görnitz 2011). The precondition was the cosmological model founded on the AQIs (Görnitz 1987, 1988; Görnitz & Schomäcker 2021). With these extensive calculations, the AQIs and their consequences turn from philosophical theses into a physical theory.

Qubits as properties of objects and AQIs as basic structures of quantum theory

As mentioned, Zeilinger writes in "Einstein's Veil" about information: "... that the smaller the system, the smaller the amount of information we need to characterize it." Here the idea that less information is associated with a smaller spatial structure can easily become entrenched. While this is consistent with the old ideas about smallest particles, it is intrinsically incompatible with quantum theory. In this thesis, nothing can be felt of the postulated connection between the universe and information as its "basic substance". The connection to a theory about the universe, that is, a cosmology, is necessary if one wants to get from the philosophical thesis to a scientific theory. As Planck's formula shows, more and more information is necessary to localize a smaller and smaller object in space. It is true, however, that in Zeilinger's experiments very often only a single bit of a complicated structure is needed and used as "meaningful information". This bit is then a property of a quantum object, e.g., a photon. The mass or energy of this object, which is localized in the cosmos or on the earth, then corresponds to a huge number of further AQIs. It is important to distinguish between a qubit as an interesting property of a quantum object on the one hand, and an AQI as a cosmically extended quantum pre-structure on the other. The quantum object is formed from AQIs. By the interaction of many AQIs, a localized structure is formed, on which in turn properties are definable as partial structures of it. Some properties are classical: they can be copied. Thus a text can be read several times, i.e., be transported by photons from the paper into the eye. Some properties are quantum, e.g., the current energy and spin of a quantum object such as an ion or molecule. They cannot be copied; rather, the information about such a property is transported as qubits from photons to other quantum objects.
The "non-copying" also applies in the quantum computer. But since there the carrier and the "transport path" are known, in quantum computing, only the two real parameters of the Bloch sphere are used out of the four real parameters that define the state of a quantum bit as two complex numbers. This difference between AQIs and quantum computing is not always apparent in the various representations. The property is localized at the object, and an AQI alone is cosmically extended. The image of "smallest particles" or of something similarly tiny provides a successful because useful, idea only in classical physics. As said, the simplest of the mathematically possible quantum structures are the AQIs. The absolute quantum information embodies with its non-locality a fundamental basic structure of the quantum theory. The AQIs represent at the same time the fundamental information in the cosmic space from which all other and more localized structures are formed. What does this imply for understanding the experiments of Nobel laureates? The experiments on EPR As described, two entangled particles are created in EPR. In most cases, this is done by splitting one photon into two with half the energy. Here, lasers are used to be able to generate precisely defined photons. The problem of understanding these experiments arises from the awkward language regime. Since Einstein, it is spoken of as "two particles". A "partless wholeness", as propagated for decades, does not appear. Schr?dinger, who did not like the consequences of the quantum theory and who had worked only on general relativity from about 1936, had strengthened this idea with the term "entanglement". As said, two entangled ropes remain two ropes. The whole is more than the sum of its parts Even if one creates an entangled whole from two particles, the separate existence of these particles within this whole is no longer given. Nevertheless, one usually speaks of the particles flying apart. However, in the translation of the mathematical description of this process into normal language, one should speak instead of a "growing spatial extension" of this wholeness. If now this wholeness is decomposed by a measuring intervention at a place, then the whole wholeness is changed instantaneously.
In this decomposition the initial parts can be restored, but other decompositions of the wholeness are also possible, and this is important. It also points to the fact that the idea that the initial structures are still present within the wholeness cannot be proven. If one realizes that all objects involved in the EPR experiments are different forms and manifestations of the AQIs, then some things become easier to understand. During entanglement, a wholeness is created in which properties also appear, i.e., special structures of quantum information, that cannot be formulated for the initial parts. Mathematically, this results from the fact that the states of the wholeness belong to a state space whose dimension is the product of the dimensions of the parts, whereas in classical physics the dimension of a composite system is merely the sum of the dimensions of the parts. This tremendous increase in possible meaning, which is opened up by the quantum structure, means that completely new properties become possible for a wholeness. An analogy is provided by a set of letters: their interplay in words and sentences makes meanings possible that cannot be created with many single letters alone. Contingent properties, which do not fundamentally concern an object but are variable and merely possible for it, can be transferred from it to other objects and thus become significant for living beings as information. Since the initial particles are themselves only forms of AQIs, it is easier to understand that from such a composite whole, forms of AQIs can also emerge which had not been present before. These can be, for example, entangled photons that carry special information about the state of this entity, information that was not present, and not even possible, in the initial parts.

A diphoton with spin zero, i.e., without a polarization direction, is decomposed by a measurement. With this measuring intervention the wholeness is decomposed, and only through this decomposition are two "halves" created. The particle that becomes real at the measuring intervention is put into a factual state. During the measurement on the left side, a photon is created; at it, it is measured whether the spin points obliquely upward or obliquely downward, and one of these two possibilities becomes factual during the measurement. On the right side, a photon is also created by the decomposition, and it appears in a defined quantum state that is correlated with the left measurement result. In other words: one extended quantum object spreads out in space as a wholeness; a measurement then decomposes it; on the left side one factual direction results, depending on the chosen measurement orientation, and the right part goes into the correspondingly correlated quantum state. If a different measurement orientation is chosen on the left side, then the spin will follow this choice. Thus, if it is measured whether the spin points to the right or to the left, then one of these two spin directions becomes factual at the photon, and the photon that originates on the right side appears in the defined quantum state correlated with this left measurement result. The other "half" of the wholeness, whose properties are determined by the separation process, remains in a quantum state until its own measurement.
This quantum state is completely determined by the result of the measurement intervention on the left side. Thus a completely determined quantum state arises instantaneously, but not yet a measurement fact. If the same measurement alignment is chosen on the right side as on the left side, then the measurement result on the right side can be predicted with certainty from the left side. Whether this choice was actually made, however, can be known on the left side only when a message arrives from the right side. If a measurement is now also made on the right side with the same measurement orientation, the result on the right can be predicted immediately and with certainty on the left. If, however, a different measurement orientation is selected on the right side, the result on the right can only be predicted from the left with 50 percent probability, i.e., one can only guess. This "instantaneousness" had disturbed Einstein. But, as Aspect already stated, it does not at all mean a contradiction with special relativity. During this process of decomposition of the wholeness, no message can be transmitted, let alone energy or matter, and only such transmission or transport is forbidden by the theory of relativity. If a measurement is performed on the quantum state of the right side that is not adapted to the quantum state present there, but which tests the quantum state "spin to the left" with the measurement "spin up or down?", then with equal probability one of the two possible answers will appear as the factual measurement result. Very often the EPR process is falsely described as if a factual state were also already present at the other, not yet measured end. This is wrong. The created other part is put into a completely determined quantum state; but the quantum state by no means fixes into which factual state it will change at its own measurement. In addition, it should be recalled once again that for a quantum state, very many different possibilities compatible with it exist; therefore, many different measurement results can follow. Only a state that is orthogonal in the state space to the existing one cannot possibly be found in a measurement. If the photon's polarization is vertical, then horizontal cannot be found, but oblique can. "Oblique" is compatible with vertical and of course also with horizontal, but horizontal is not compatible with vertical. Only if identical measurement settings are applied on both sides are the two factual results completely determined by each other. This brings us to the next point.

Attempts to beam

For the transmission of factual information, one can encode it and send it to the other place. Factual information can also be copied; everybody has known this since copiers appeared in offices. Quantum information cannot be copied. For this one would have to know the exact state, and the obvious approach would be to measure it; but then the state would be factual, and it would differ from the quantum state before the measurement. Moreover, the indeterminacy relation forbids such exact copies.
According to this, there is no quantum state in which two conjugate quantities would both have an exact value, e.g., position and momentum, or all components of angular momentum. With an exact copy, one could measure one quantity exactly on one system and the conjugate quantity on the other, and then both would be known exactly, which the indeterminacy relation excludes.
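Before turning to beaming, the correlation statistics described above can be summarized in a small simulation (illustrative only; the cos^2 rule is the standard Malus-type law for polarization-entangled photons, not a quote from the article):

    # Measurement statistics on the two "halves" of a decomposed diphoton.
    import numpy as np

    rng = np.random.default_rng(2)

    def agreement(theta_left, theta_right, n=100_000):
        # Left side: the measurement creates a fact, +1 or -1, with equal probability.
        left = rng.choice([1, -1], size=n)
        # Right side: thrown into the quantum state fixed by the left result; a measurement
        # rotated by (theta_right - theta_left) agrees with probability cos^2 of that angle.
        p_same = np.cos(theta_right - theta_left) ** 2
        right = np.where(rng.random(n) < p_same, left, -left)
        return np.mean(left == right)

    print("identical settings: P(agree) =", round(agreement(0.0, 0.0), 3))       # 1.0
    print("rotated 45 degrees: P(agree) =", round(agreement(0.0, np.pi/4), 3))   # ~0.5, pure guessing
    print("rotated 90 degrees: P(agree) =", round(agreement(0.0, np.pi/2), 3))   # ~0.0, always opposite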
Sometimes, however, one would like to have an exact quantum state available at another location, but a physical transport is not possible because of disturbances along the way. With quantum teleportation it becomes possible to transfer the complete quantum information about the exact quantum state to a distant particle. Thus, in beaming, the quantum information about the exact current state of a quantum particle is transferred to another quantum particle of the same kind. Quantum particles of the same kind are completely indistinguishable. Thus, if a quantum particle B is in exactly the state that quantum particle A was in, this amounts to the same thing as if A had been moved to B's place. The first attempts at beaming that were noticed by the public took place in Vienna, with a bank transfer through a sewer under the Danube, which was very well protected from external disturbances due to its location; after all, the system must not be decohered by influences from the environment. Later, the experiments between two Canary Islands, i.e., over a distance of about 143 km through the atmosphere, made a great impression.
After China launched a satellite capable of sending entangled photons to ground stations in China and to Vienna, the experiments were extended to a distance of 1200 km. The explanation of teleportation is not quite straightforward. Three participants are involved. A special entangled state is created between Alice and Bob. Chris has a quantum object whose state g is to be teleported to Bob. Alice now performs a so-called "Bell transformation" on her share of the entangled state together with g. Thereby the entanglement between Alice's and Bob's particles is dissolved, and the state g ends up on Bob's particle, after a simple correction that depends on Alice's measurement result and is communicated classically.

Tap-proof communication based on quantum information

Setting up a communications link that cannot be intercepted, because the laws of quantum theory make interception impossible, is a huge challenge to experimental sophistication. The explanation, however, is also based on the exchange of entangled photons, as with beaming, and is easier to understand. In the experiments on quantum communication, a photon with higher energy is split into two photons in a special crystal; their energies sum to the energy of the initial photon. Since these photons are entangled, it makes more sense to speak of a diphoton. This diphoton has spin zero, and when it is decomposed into two photons, this property leads to a strict correlation of their polarizations. Each of the two photons carries a quantum bit that is correlated with that of the other photon. Thus, quantum information is not transferred from one place to another, as in beaming; rather, correlated qubits are created at two different places by the suitable decomposition of one quantum system. The diphoton propagates in space until it arrives at one of the two measuring devices, where it is measured. This measurement splits the diphoton into two separate and no longer entangled photons. The polarization, i.e., the alignment of its photon spin, is measured on the photon that is created there during the measurement decomposition of the diphoton. Since the spin of the first photon is factually fixed during the measurement, the second photon immediately goes into the corresponding quantum state.
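To make the three-party protocol sketched above concrete, here is a minimal numerical simulation of standard quantum teleportation (my own illustration with generic qubits and the usual correction step; the actual experiments realize this with photons and a Bell-state analyzer):

    # Quantum teleportation of an unknown qubit state g from Chris/Alice to Bob.
    import numpy as np

    rng = np.random.default_rng(3)

    # Elementary gates
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    def apply_1q(psi, gate, q):
        # Apply a one-qubit gate to qubit q of a 3-qubit state vector.
        t = np.moveaxis(psi.reshape(2, 2, 2), q, 0)
        t = np.tensordot(gate, t, axes=1)
        return np.moveaxis(t, 0, q).reshape(8)

    def apply_cnot(psi, control, target):
        # Apply CNOT (control, target) to a 3-qubit state vector.
        t = psi.reshape(2, 2, 2).copy()
        sub = [slice(None)] * 3
        sub[control] = 1                                    # act only where control = 1
        axis = target if target < control else target - 1   # target axis inside the slice
        t[tuple(sub)] = np.flip(t[tuple(sub)], axis=axis)
        return t.reshape(8)

    # 1. The unknown state g that is to be teleported (qubit 0)
    g = rng.normal(size=2) + 1j * rng.normal(size=2)
    g = g / np.linalg.norm(g)

    # 2. Entangled pair shared by Alice (qubit 1) and Bob (qubit 2): (|00> + |11>)/sqrt(2)
    bell = np.zeros(4, dtype=complex)
    bell[0] = bell[3] = 1 / np.sqrt(2)
    psi = np.kron(g, bell)

    # 3. Alice's Bell-type measurement on qubits 0 and 1
    psi = apply_cnot(psi, 0, 1)
    psi = apply_1q(psi, H, 0)
    probs = (np.abs(psi.reshape(2, 2, 2)) ** 2).sum(axis=2).reshape(4)
    m = rng.choice(4, p=probs)                 # two classical bits (m0, m1)
    m0, m1 = m >> 1, m & 1
    bob = psi.reshape(2, 2, 2)[m0, m1, :]      # Bob's qubit in the measured branch
    bob = bob / np.linalg.norm(bob)

    # 4. Bob's correction after receiving the two classical bits
    if m1: bob = X @ bob
    if m0: bob = Z @ bob

    print("fidelity |<g|bob>| =", round(abs(np.vdot(g, bob)), 6))   # -> 1.0

The printout confirms that Bob's particle ends up carrying the state g exactly, although only two classical bits were sent and neither Alice nor Bob ever learns what g is.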
CONCLUSION
Through the prize-winning experiments it has become even more apparent than before that quantum theory is fundamentally a theory of information. It thereby also becomes visible that, as a holistic theory, it has a primarily non-local character. For millennia, people have been looking for the fundamental and simple in smaller and smaller particles. Quantum theory shows that these do not become simpler; rather, they are connected with an ever more enormous energy density, and "more and more energy" does not sound like simplicity. On the other hand, it should be evident that more and more information makes more and more precise localization possible. Because Absolute Bits of Quantum Information, the AQIs, form the "basic substance of the universe", or better "the fundamental structure of the cosmos", a new understanding follows both for the nature of matter and for the role of the mental and of the spiritual ("des Geistigen"). This has implications not only for the general understanding of reality and the relational structure of nature; it also has an influence on the understanding of the essence of man, with consequences reaching as far as psychosomatics. The AQIs also made it possible to scientifically understand the effect of psychological information on the brain and body.
This will revolutionize the field and pave the way for quantum computing and quantum computers. It promises a sea-change breakthrough across the entire spectrum of software technologies and computer science. The digital and cyber world can expect exponential progress, for the benefit of the entire global civilization, in ways that cannot yet be fully described or comprehended.