Quantum Computing: The New Giant Leap for Mankind
Mohsien Hassim
Seasoned Business Transformation Executive with a solid Foundation in Finance/Technology/Risk (GRC/ESG)/Security (Cyber)/Strategy and Digital Transformation. AI Researcher & Enthusiast.
We all recall the fateful day, 20 July 1969, when Neil Armstrong set foot on the moon and uttered the words "That's one small step for a man, one giant leap for mankind." This event changed the history of the human race forever. It has been reported that the Apollo Guidance Computer had 4KB of RAM and 32KB of fixed storage (that is, kilobytes, not mega or giga). It had to be compact to fit into Apollo 11: it measured approximately 60cm x 30cm x 15cm but weighed around 30kg. Its processor (an electronic circuit that performs operations on external data sources) ran at a mere 0.043 MHz. In comparison, today's smartphones run at speeds of about 2490 MHz, which is nearly 60,000 times the clock speed of the computer that landed a man on the moon more than 50 years ago.
Computers have become an integral part of modern society. They have evolved over many decades: from the vacuum tube-based machines of the 1940s/'50s, to the transistor-based machines of the 1950s/'60s, to the integrated circuit-based machines of the 1960s/'70s, and to the microprocessor-based machines that are with us today. Processing power, memory and onboard storage have grown as the needs of users and industry have increased. With the advent of cloud computing and the growth of data centres, we have seen exponential growth in the storage requirements of the major players in this space.
Those out there who are old enough will recall the challenges of yesterday's vacuum tube technology: the tubes would overheat, consume large amounts of power and did not last very long, needing to be replaced often. ENIAC (Electronic Numerical Integrator And Computer) was the world's first general-purpose electronic computer. It became operational in 1946, contained approximately 18,000 vacuum tubes and could perform 300 multiplications per second. ENIAC can be seen as the 'father' of the modern classical/traditional computer.
The 4th Industrial Revolution (4IR) has certainly added to the growth in computing power, in storage needs and in data, the last commonly referred to as 'Big Data'. The resurgence of Artificial Intelligence (AI), through reinvestment from many quarters drawn by the benefits AI brings, has demanded increased computing power and quality data, which in turn must be supported by adequate data storage. In the world of computing, Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. However, as the need for processing power increases so rapidly, Moore's law will soon run into major physical constraints!
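Moore's law is easy to express as a formula. The sketch below, a rough illustration rather than an engineering model, projects a transistor count forward under a two-year doubling period; the 1971 starting figure (2,300 transistors, the Intel 4004) is an assumption added here, not taken from the article.

```python
def transistors(start_count: int, start_year: int, year: int,
                doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under Moore's law:
    the count doubles once every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# 50 years after 1971 is 25 doublings: roughly 77 billion transistors,
# in the same ballpark as the largest chips actually shipping by then.
print(f"{transistors(2_300, 1971, 2021):,.0f}")
```

The exponential in that one line is the whole story: each doubling period multiplies the count by two, which is precisely why the physical limits the article mentions must eventually bite.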
With the limitations of the conventional computer which uses voltages flowing through circuits and gates that are controlled and manipulated entirely by classical mechanics, industry has been in search of faster computing power.
Through the joint workings of industry, research houses and universities over decades, the birth of quantum computing has opened a new era of computing power. Quantum computing uses the laws of quantum mechanics to perform massively parallel computation through superposition, entanglement and interference (decoherence, by contrast, is the loss of these quantum properties, and one of the technology's biggest obstacles). That is a mouthful.
Let me throw some light on Quantum Mechanics…
The origins of quantum mechanics date back to the early 1900s, when the German scientist Max Planck published a paper on the radiation emitted by a black body. During his investigations, Planck assumed that energy, like matter, is made up of separate units rather than a continuous electromagnetic wave. Niels Bohr and Max Planck, two of the founding fathers of quantum theory, each received a Nobel Prize in Physics for their work on quanta. Einstein is considered the third founder of quantum theory because he described light as quanta in his theory of the photoelectric effect, for which he won the 1921 Nobel Prize.
Quantum mechanics is the branch of physics that describes the behaviour of microscopic systems, such as photons, electrons, atoms and molecules. It explains how extremely small objects simultaneously have the characteristics of both particles (tiny pieces of matter) and waves (a disturbance or variation that transfers energy). Physicists refer to this relationship as "wave-particle duality." At subatomic scales, the equations that describe how particles behave are different from those that describe the macroscopic world around us.
The particle portion of wave-particle duality involves how objects can be described as "quanta." A quantum is the smallest discrete unit (such as a particle) of a natural phenomenon in a system where the units are in a bound state. For example, a quantum of electromagnetic radiation, or light, is a photon. A bound state is one in which the particles are trapped; one example is the electrons, neutrons and protons in an atom.
Quantum computers take advantage of these subatomic-level behaviours to perform computations in a completely new way. Using specialised technology that incorporates computer technology (hardware) and specifically developed algorithms, the benefits of quantum mechanics can be used to solve complex problems that classical computers or supercomputers are not able to solve or solve quickly enough.
Quantum computing has gained significant public interest in recent times due to the need to solve various problems so complex that traditional (classical) computers cannot solve them.
Classical computers are efficient and fast for smaller calculations but much slower when dealing with complex exponential-type problems, for example, large number factorization. Quantum computers can explore many possibilities in parallel, allowing for faster number crunching (i.e., calculations) than classical or traditional computers. Conventional computer technology, such as transistors, logic gates (NOT, AND, OR) and integrated circuits, is not compatible hardware for building a quantum computer. Quantum computers instead make use of qubits, which are realised using subatomic particles; information is stored and processed using superposition and entanglement, making it faster to solve certain complex problems. Like bits in classical computers, qubits can take values of zero or one. Qubits, however, can also exist in mixtures of zero and one, like 30 per cent zero and 70 per cent one, for example. This ability makes them quite powerful.
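The "30 per cent zero and 70 per cent one" example above can be made concrete. In quantum mechanics a qubit state is a pair of complex amplitudes, and the probability of measuring each outcome is the squared magnitude of its amplitude (the Born rule). The following is a plain NumPy sketch of that arithmetic, not code for real quantum hardware or any quantum SDK:

```python
import numpy as np

# Amplitudes for |0> and |1>: squaring them gives 30% and 70%.
state = np.array([np.sqrt(0.3), np.sqrt(0.7)])
probs = np.abs(state) ** 2  # Born rule: probability = |amplitude|^2

print(f"P(0) = {probs[0]:.1f}, P(1) = {probs[1]:.1f}")  # P(0) = 0.3, P(1) = 0.7

# Each measurement collapses the qubit to 0 or 1 at random;
# repeating many measurements reveals the underlying probabilities.
rng = np.random.default_rng(seed=42)
samples = rng.choice([0, 1], size=10_000, p=probs)
print(f"Fraction measured as 1: {samples.mean():.2f}")  # close to 0.70
```

The key point the sketch illustrates: the mixture exists in the amplitudes, but any single measurement still returns a plain 0 or 1.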
Just as the binary bit is the basic unit of information in classical (or traditional) computing, the qubit (or quantum bit) is the basic unit of information in quantum computing.
It is important to understand the architecture behind a quantum computer. It is not like the classical or traditional computer we have all become familiar with. Since a quantum computer must eventually interface with users, data and networks (tasks at which conventional computing excels), it can leverage a conventional computer for these tasks whenever that is most efficient. Furthermore, qubit systems require carefully orchestrated control to function in a useful way; this control can be managed using conventional computers.
The main advantage of quantum computers is exponential speed-up: they can process multiple possibilities in parallel at the same time. Some quantum machines also exploit a phenomenon known as quantum tunnelling, which can reduce power consumption by a factor of 100 to 1,000 compared with classical computers. It is the difference between classical bits and quantum bits that enables a quantum computer to solve certain problems so much faster.
To assist in conceptualising the necessary hardware components for an analogue or gate-based quantum computer, the hardware can be modelled in four abstract layers: the “quantum data plane,” where the qubits reside; the “control and measurement plane,” responsible for carrying out operations and measurements on the qubits as required; the “control processor plane,” which determines the sequence of operations and measurements that the algorithm requires, potentially using measurement outcomes to inform subsequent quantum operations; and the “host processor,” a classical computer that handles access to networks, large storage arrays, and user interfaces. This host processor runs a conventional operating system/user interface, which facilitates user interactions, and has a high bandwidth connection to the control processor.
The global quantum computing market size is estimated to grow from USD 0.36 billion in 2023 to USD 1.63 billion by 2035, representing a CAGR of 13% during the forecast period 2023-2035.
The current field of quantum computers is not quite ready for prime time: McKinsey has estimated that 5,000 quantum computers will be operational by 2030, but that the hardware and software necessary for handling the most complex problems won't be available until 2035 or later.
Quantum computers use the counterintuitive rules that govern matter at the atomic and subatomic level to process information in ways that are impossible with conventional, or “classical,” computers. Experts suspect that this technology will be able to make an impact in fields as disparate as drug discovery, cryptography, finance, and supply-chain logistics.
The promise is certainly there, but so is the hype.
In 2022, for instance, Haim Israel, managing director of research at Bank of America, declared that quantum computing will be "bigger than fire and bigger than all the revolutions that humanity has seen." Even among scientists, a slew of claims and vicious counterclaims has made it a hard field to assess.
Understanding how quantum computing operates is not straightforward. Assessing our progress in building useful quantum computers comes down to one central factor: whether we can handle the noise. Noise, you say? Yes, classical or traditional computers are also 'noisy', but not to the level of quantum computers.
The delicate nature of quantum systems makes them extremely vulnerable to the slightest disturbance, such as a stray photon created by heat, a random signal from the surrounding electronics, or a physical vibration. This noise wreaks havoc, generating errors or even stopping a quantum computation in its tracks. It does not matter how big your processor is, or what the killer applications might turn out to be: unless noise can be tamed, a quantum computer will never surpass what a classical computer can do. Extensive research is currently underway to solve the 'noise' challenge. Only time will tell whether this debilitating issue can be resolved adequately.
Quantum machines are quite error-prone. These computational errors are inherent to quantum systems and need to be corrected. As a result, capital and talent are being deployed on quantum error correction: developing ways to build machines that detect their own mistakes early and fix them. Although advances have been made in error detection for quantum computing, the current view is that these quantum errors are unlikely to disappear fully. Classical computers will remain relevant for verifying results even as quantum computers progress.
Quantum computers need to be kept at temperatures close to absolute zero, which is on the order of -270 degrees Celsius. With such extreme operating temperature requirements, Quantum computers are certainly not aimed at the domestic market or the conventional business.
Quantum computers will not replace classical computers; for the foreseeable future they will augment them for the majority of work. Since quantum computers cannot yet scale to process our enormous real-world datasets, we still need classical systems for large-scale processing. Nor will the quantum computer replace supercomputers. Instead, it will supplement them and be used for highly specific calculations. Quantum computers do not have their own user interface and must be accessed via classical computers.
Classical computers have unique qualities that will be hard for quantum computers to attain. The ability to store data, for example, is unique to classical computers, since the memory of a quantum computer lasts only a few hundred microseconds at most.
A classical computation is straightforward: you provide an input, an algorithm processes it, and you end up with an output.
Quantum computations, on the other hand, take a range of different inputs and return a range of possibilities. Instead of getting a straightforward answer, you get an estimate of how probable different answers are.
Quantum computing can be very useful when dealing with complex problems that involve many different input variables and complex algorithms. On a classical computer, such a process would usually take a very long time; the speed with which quantum computers can do the same is the key differentiator.
Quantum computers could narrow down the range of possible input variables and solutions to a problem. A classical or traditional computer can then be used to obtain a straightforward answer by testing the range of inputs that the quantum computer has narrowed down. This demonstrates a fundamental difference between quantum and classical computers: quantum computers cannot give straightforward answers the way classical computers do.
Classical computers will therefore remain useful for decades to come. Their continued relevance is not just a question of how long quantum computers will take to reach mainstream adoption. Exactly where quantum computers will fit into society and business in providing solutions is still not clear, and this creates a serious challenge in assessing the potential ROI of quantum computing.
According to?McKinsey, there are four areas where quantum computing could yield immense long-term gains. Classical computing will remain relevant in these areas and complement the benefits of quantum technology.
Quantum computing has come a long way, although there is much work to be done to overcome the challenges that remain.
The promises of quantum computing are plentiful. It could help develop lifesaving drugs with unprecedented speed, build better investment portfolios for finance and usher in a new era of cryptography.
Quantum computing will complement classical, traditional (and super) computers. The world of AI, which is growing at an unprecedented rate, will benefit from quantum computing. We look forward to the day when affordable and reliable quantum computing is a reality for all to benefit from.