What is quantum computing? And how does it differ from conventional computing? (2/4)
Mohammad Yosry
Int'l Assoc. AIA - LEED AP BD+C and EDGE Expert. Architect and Award-Winning Author
This is to answer our first question from the previous article:
What is quantum computing? And how does it differ from conventional computing?
The previous article introduced Quantum Computing and its impact on the Architecture, Engineering, and Construction (AEC) industry. If you haven't read it, please visit the link below.
Disclaimer:
The following article is of a specialized scientific nature. It may seem complex to many readers, but it is crucial to understand the basics of Quantum before discussing its applications and relevance to our subject, i.e., the AEC industry. So, please take the time to absorb the following information in order to fully understand the purpose of this research.
Before diving into Quantum Computing (QC), I would like to ask for a little patience while I first address some terms and briefly explain what Quantum is and where the term originated.
The Oxford Learner's Dictionaries (2020) defines the term "Quantum" as "a very small quantity of electromagnetic energy"; the word comes from the Latin root "quantus," meaning "how much" or "how great." The Cambridge Dictionary (2020) offers an additional perspective, defining Quantum as "the smallest amount or unit of something, especially energy." Over the years, the term "Quantum" has been applied to nouns such as mechanics, leap, and now computers. This section's interest is in its application first to mechanics and then to computers.
Quantum computers (QCs) use the laws of quantum mechanics to perform calculations. The following is a brief background on quantum mechanics (on which quantum computing is based) before describing what quantum computing entails.
Quantum mechanics is an area of physics that deals with the strange behavior of the universe's photons, electrons, and other particles. Unlike the largely deterministic and intuitive theories of classical physics, quantum mechanics is built on probabilities and inherent uncertainties.
The relevance of quantum mechanics can be established as follows: many equations of classical mechanics cease to be useful at the very small scale of atoms, electrons, and other particles.
Quantum mechanics describes things at this small scale in counterintuitive, exotic ways, yet it yields accurate predictions for a wide range of observable phenomena that classical physics cannot account for. For example, while in classical mechanics objects exist in specific places at specific times, in quantum mechanics objects exist in a fog of probabilities: they have particular chances of being found at different locations. Hence, quantum mechanics can deliver strange but relevant conclusions about the world in regimes where it is no longer feasible to apply classical mechanics concepts.
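As an illustrative aside (my own, not part of the original research), the "fog of probabilities" can be pictured with a toy, discretized wavefunction: the chance of finding a particle at each location is the squared magnitude of a complex amplitude, and a measurement picks out a single location. The specific amplitude values, the five-position grid, and the use of Python with NumPy are assumptions made purely for illustration.

```python
import numpy as np

# Toy, discretized "wavefunction": complex amplitudes over five possible positions.
# (Illustrative values only; any normalized set of amplitudes would do.)
amplitudes = np.array([0.1 + 0.2j, 0.5 + 0.0j, 0.6 + 0.3j, 0.3 - 0.2j, 0.1 + 0.1j])
amplitudes /= np.linalg.norm(amplitudes)      # normalize so the probabilities sum to 1

probabilities = np.abs(amplitudes) ** 2       # Born rule: P(x) = |psi(x)|^2
print("P(position):", np.round(probabilities, 3), "sum:", round(probabilities.sum(), 3))

# A measurement "collapses" the fog: the particle is found at exactly one position,
# sampled according to these probabilities.
rng = np.random.default_rng(0)
print("measured position index:", rng.choice(len(probabilities), p=probabilities))
```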
Pioneered around the turn of the 20th century, about the same period Albert Einstein published the theory of relativity, quantum mechanics started as a controversial set of mathematical explanations for experiments that classical mechanics could not explain. Over the decades, however, it has gained acceptance through experimental verification by many scientists.
To reinforce the significant role quantum physics already plays in daily activities, the following bullets summarize a few notable examples of practical uses of this knowledge.
- Atomic clocks: Global Positioning System (GPS) satellite constellations are equipped with atomic clocks that improve the accuracy of satellite signal timing, enabling precise navigation.
- Lasers: lasers are quantum devices used as light sources to carry messages through the fiber-optic cables that support telecommunications. LASER stands for "Light Amplification by Stimulated Emission of Radiation," a principle grounded in Albert Einstein's 1917 paper. Stimulated emission occurs when atoms in high-energy states encounter photons of specific, suitable wavelengths, which induce them to emit additional photons identical to the incoming ones.
- Modern semiconductors: these depend on the band structure of solids, and several critical computer components rest on quantum phenomena.
Together, these examples reinforce the role quantum physics already plays in the Information and Communication Technology industry.
That being said, what is quantum computing?
Quantum computing methods apply the unique properties of matter at the nanoscale and can employ aspects of the near-mystical phenomena of quantum mechanics to provide massive processing power.
Comparatively, while conventional computers process streams of optical or electrical pulses encoded as binary 1s and 0s, called bits, the basic unit of a quantum computer is the qubit (quantum bit). Quantum computers are often measured by the number of qubits they contain, analogous to the number of transistors in a conventional computer. The power of a quantum computer is underpinned by its capacity to generate and manipulate qubits.
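To make the bit/qubit contrast concrete, here is a minimal sketch, assuming Python with NumPy (the article itself does not prescribe any particular formalism, and the equal-superposition state below is just an illustrative choice): a classical bit holds exactly 0 or 1, whereas a qubit is described by two complex amplitudes whose squared magnitudes give the probabilities of reading 0 or 1.

```python
import numpy as np

# Classical bit: exactly one of two values at any time.
bit = 1

# Qubit: a normalized pair of complex amplitudes (alpha, beta).
# |alpha|^2 is the probability of reading 0; |beta|^2 the probability of reading 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)          # an equal superposition of 0 and 1
qubit = np.array([alpha, beta], dtype=complex)
assert np.isclose(np.linalg.norm(qubit), 1.0)         # valid quantum states are normalized

p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")            # 0.50 / 0.50 for this state

# Describing n qubits generally takes 2**n amplitudes, one (simplified) way to see
# why a group of connected qubits packs so much computational state.
n = 10
print(f"amplitudes needed to describe {n} qubits: {2 ** n}")
```

The last line hints at why qubit count matters so much: the amount of state needed to describe a register grows exponentially with the number of connected qubits.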
Qubits are typically realized with subatomic particles (e.g., photons, electrons), and generating and managing them is challenging from both a scientific and an engineering perspective. Hence, quantum computers were once seen as an impossible feat, given their intricate requirements and the unconventional environments needed to house them. Qubits possess peculiar features that enable a group of connected qubits to provide far more processing power than the same number of conventional bits. These crucial features include wave-particle duality, superposition, entanglement, coherence, decoherence, measurement, and teleportation.
We won't discuss these features and their characteristics here, given the complexity of the subject. If you're interested in learning more about them, please do not hesitate to contact me directly, and I will be more than happy to share my fair knowledge of the matter.
Though quantum computers can dramatically accelerate computation via carefully orchestrated quantum algorithms, it is worth mentioning that they are more prone to errors than conventional computers!
From a comparative view, classical computers are highly reliable at the bit level, with fewer than one error in 10^24 operations. In specific quantum computing applications, by contrast, it has been estimated that it will take about 1,000 physical qubits to create a single logical qubit (i.e., an error-corrected, fault-tolerant qubit that yields reliable results). However, as research progresses, encouraging results are emerging.
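As a quick back-of-the-envelope illustration of that overhead (the roughly 1,000:1 ratio comes from the estimate above; the 100-logical-qubit target, the function name, and the use of Python are my own hypothetical choices):

```python
# Rough error-correction overhead, using the ~1,000-to-1 estimate quoted above.
PHYSICAL_PER_LOGICAL = 1_000   # estimated physical qubits per fault-tolerant logical qubit

def physical_qubits_needed(logical_qubits: int, ratio: int = PHYSICAL_PER_LOGICAL) -> int:
    """Estimate the physical qubits required to realize a given number of logical qubits."""
    return logical_qubits * ratio

# Hypothetical example: an algorithm that needs 100 error-corrected logical qubits.
print(physical_qubits_needed(100))    # -> 100000 physical qubits
```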
One encouraging example: the quantum computing builder IonQ has recently demonstrated that 13 physical qubits can be applied to deliver one near-perfect logical qubit, using a technology based on trapped-ion quantum computing (IonQ, 2020). Its new fifth-generation quantum computer features 32 ion qubits in its latest release, almost tripling the 11 qubits of its previous machine. Had all 32 qubits been used in IonQ's calculation, the expected quantum volume would have exploded from 4 million to over 4 billion. That would be 31 million times greater than any quantum volume ever published.
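Those quantum-volume figures can be roughly sanity-checked with a simplified convention under which quantum volume equals 2 raised to the number of qubits usable at full circuit depth. The sketch below is my own arithmetic in Python, not IonQ's published benchmarking methodology, and the assumption that the 4-million figure corresponds to about 22 effective qubits is mine as well.

```python
# Rough sanity check of the quantum-volume figures quoted above, assuming the common
# simplified convention QV = 2**n, where n is the number of qubits usable at full
# circuit depth. (An illustration only, not IonQ's published benchmarking procedure.)

def quantum_volume(effective_qubits: int) -> int:
    return 2 ** effective_qubits

print(f"{quantum_volume(22):,}")    # 4,194,304      -> roughly the "4 million" figure
print(f"{quantum_volume(32):,}")    # 4,294,967,296  -> roughly the "over 4 billion" figure
```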
To conclude,
A quantum advantage is attained when quantum computers can do useful work that no conventional computer can complete within a reasonable time frame, in a practical and cost-effective manner.
A problem that would take a classical computer 10 seconds, 2 minutes, 330 years, 3,300 years, or the age of the universe to solve could be completed in, respectively, 1 minute, 2 minutes, 10 minutes, 11 minutes, and 24 minutes on a quantum computer. Based on these figures, quantum advantage is demonstrated, for example, from the point where the quantum computer can do work that would take a conventional computer about 330 years to complete.
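For a rough sense of the implied speedups, those figures can be compared directly. The pairings below simply restate the numbers above (the "age of the universe" case is left out because it is not a precise value); the Python sketch and its formatting are my own illustration.

```python
# Speedup factors implied by the classical-vs-quantum solve times quoted above
# (the "age of the universe" case is omitted because it is not a precise figure).
MIN_PER_YEAR = 365 * 24 * 60

classical_minutes = [10 / 60, 2, 330 * MIN_PER_YEAR, 3_300 * MIN_PER_YEAR]
quantum_minutes = [1, 2, 10, 11]
labels = ["10 seconds", "2 minutes", "330 years", "3,300 years"]

for label, c, q in zip(labels, classical_minutes, quantum_minutes):
    print(f"classical {label:>11} vs quantum {q:>2} min -> ratio ~ {c / q:,.1f}x")
```

Note that for the smallest problems the quantum machine in this comparison is no faster at all, which is consistent with the point above that the advantage only appears from roughly the 330-year case onward.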
Next, we'll talk about the potential implications of quantum computing on the AEC industry.
Stay tuned...