Demystifying Quantum Computing: From Classical Bits to Quantum Qubits
Mario Rozario
Manager, Technical Delivery & Enablement | Thought Leader | Author of Tech-Tonic Tremors Newsletter | Writer for Generative AI, DataDrivenInvestor & ILLUMINATION (Medium Publications) | Distinguished Toastmaster
At CES 2025 in early January, NVIDIA CEO and keynote speaker Jensen Huang threw cold water on an emerging industry that is going through its early growing pains, much as AI did a few decades ago.
Within 24 hours of that statement, the stocks of major quantum computing companies like IonQ, Rigetti Computing, and D-Wave dropped by nearly 40%.
To paraphrase Jensen: very useful quantum computers are still decades away, and estimates suggest it could take 15 to 30 years before quantum computing becomes commercially viable.
Why would he say that? Is Quantum Computing still a distant dream?
This is the first in a series of articles that will attempt to shed some light on quantum computing.
Let's start by looking at a trending industry like Generative AI (Gen AI).
For most individuals, ChatGPT, Llama, or even Perplexity are their touchpoints with Gen AI. But if you dig deeper into the technology powering Gen AI, you will probably zone out listening to terms such as transformers, decoders, auto-encoders, etc.
Quantum computing takes this even further: it rests on some of the most mathematically demanding concepts there are, assuming one even attempts to understand them. And now an entire industry is being built on this foundation.
Before we delve into quantum computing, let's discuss how it all began.
Revisiting Classical Computing
To understand quantum computing, you first need to understand Classical Computing.
We are all still living in the age of classical computing. Even NVIDIA's most advanced GPU is still considered classical computing.
So, what is classical computing?
Let me touch on a few key attributes. All digital systems store data in bits, each of which can be either 1 or 0. The binary number system, taught in most computer science courses, familiarizes us with the fact that at the lowest level the computer thinks in 0s and 1s, no matter what language we feed it.
As far as the hardware for these systems goes, classical computers, laptops, and servers use transistors and logic gates to run the show. Here, too, bits of 0s and 1s stream across the motherboard from one component to another, carrying information.
In a classical system, all information is encoded in terms of 0s and 1s at the lowest level. This is the foundation of the current classical computing system.
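To make this concrete, here is a minimal Python sketch (purely illustrative, not tied to any particular hardware) showing that ordinary data is just a pattern of 0s and 1s underneath:

```python
# Everything a classical computer stores is ultimately a pattern of bits.
message = "Hi"

for ch in message:
    code = ord(ch)                # the character's integer code point
    bits = format(code, "08b")    # the same value written as 8 bits
    print(f"{ch!r} -> {code} -> {bits}")

# 'H' -> 72 -> 01001000
# 'i' -> 105 -> 01101001
```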
The underlying principles of classical computing are Deterministic.
Now, you may laugh at me and wonder why I said this!
Isn't everything deterministic?
In classical computing systems, all states are well-defined.
If you were to fire a query at a system that isn't changing its state at a given point in time, you would get a definite result. Repeat the query under the same conditions and the result would be the same. This quality of a system is called determinism.
Determinism is an integral part of our systems today. Imagine if we built e-commerce stores, trading systems, and money transfer systems that gave us different answers if we asked the same question repeatedly and nothing had changed. Such systems would be of no use to us.
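As a trivial sketch of what "deterministic" means in practice (the function and values below are made up purely for illustration):

```python
# A deterministic function: the same input always produces the same output.
def balance_after(deposits):
    return sum(deposits)

print(balance_after([100, 250, 50]))  # 400
print(balance_after([100, 250, 50]))  # 400 again, every single time
```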
Hey, wait a minute! But AI isn't deterministic either, right?
You're partly right. Some Artificial Intelligence algorithms, especially deep learning, are non-deterministic in nature. They are built on probability, but at the core hardware level, their operations are still deterministic.
So, we live in a world built on classical computing.
Now, let's understand what Quantum computing is and how it differs at a very fundamental level.
What is Quantum Computing?
The fundamental difference between quantum and classical computing lies at the hardware level.
While classical computing stores data in bits, each of which holds either a 1 or a 0, quantum computing systems store data in qubits, which can assume multiple states simultaneously.
What? Simultaneously?
Yes. Until it is measured, a single qubit can exist in a blend of 0 and 1 at the same time, with a probability attached to each outcome, and you would never know which one you'll get until you look.
This ability to hold more than one value at once gives a qubit capabilities far beyond those of a bit in classical computing.
If you were to increase the number of bits in both systems, you would find that the information needed to describe a classical register grows linearly, while the state space a quantum register can occupy grows exponentially.
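Here is a small Python sketch of that growth; the numbers simply count how much information is needed to describe each kind of register under the usual textbook model:

```python
# A register of N classical bits is always in exactly one of its 2**N states,
# so describing it takes only N bits. Describing a general N-qubit state takes
# 2**N complex amplitudes, which is the exponential growth discussed above.
for n in (1, 2, 4, 8, 16, 32):
    print(f"N = {n:2d}: classical description = {n} bits, "
          f"quantum state vector = {2**n:,} complex amplitudes")
```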
You didn't tell me how a qubit can store these values simultaneously.
Well, compare this to a coin toss. If you toss a coin and watch it spinning in the air, at any point it could be heads or tails, but you can't call it because it hasn't landed. Once it lands on the ground or in your hand, you can look at it and call heads or tails. You cannot make the call while it's spinning. While it's in the air, it is simultaneously in two states, heads and tails, or in-between states that could be written as a complicated probability function involving complex numbers.
It's strange how we can complicate a simple coin toss.
This is precisely how you should view qubits. To generalize, at any point in time a system of N qubits can hold up to 2^N values simultaneously, which, loosely speaking, lets it work through 2^N possibilities in parallel and gives it far more capability than classical computing. This difference sits at the hardware level, and that is why it is fundamentally different from classical computing.
This ability of a qubit to hold more than one value simultaneously is called superposition, since, scientifically speaking, it takes a "super" position compared to standard bits.
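A minimal sketch of a single qubit in superposition, using plain Python complex numbers (the specific amplitudes are arbitrary, chosen only to illustrate the idea):

```python
import math

# One qubit is described by two complex amplitudes: alpha for outcome 0,
# beta for outcome 1. The probability of each outcome is |amplitude|**2,
# and the two probabilities must always sum to 1.
alpha = complex(1 / math.sqrt(2), 0)   # amplitude for measuring 0
beta = complex(0, 1 / math.sqrt(2))    # amplitude for measuring 1

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2

print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}, total = {p0 + p1:.2f}")
# P(0) = 0.50, P(1) = 0.50, total = 1.00
```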
The underlying principles of quantum computing are Probabilistic.
Here lies another big difference. Just like a coin spinning in the air, we can't call the result until it lands, and there are many factors that determine the outcome.
Even if we had all of these factors figured out (which is tough, since beyond air humidity and pressure some are practically impossible to estimate), the best we could make is a calculated guess.
A calculated guess is, again, driven by probability.
Herein lies the catch.
Quantum computing gives us a huge speedup in computing power, but it comes at the cost of determinism, which is where classical computing scores. The challenge now is to remove or, at best, limit the impacts of this uncertain outcome.
This is where error rates come in. One of the key challenges in quantum computing is to reduce the error rate in its operations.
So, what are Error Rates?
Just as with a coin spinning in the air, what we care about is the outcome of the toss. The outcome is obtained when the coin lands in our hand or on the ground and we observe the result.
In quantum computing, measuring the outcome is called "collapsing the wave function."
Okay, so why is it even called this?
Well, until the coin lands, its state can be described by a probability function over different states. It's only once you measure it that the state collapses into heads or tails.
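Sticking with the coin analogy, here is a small Python sketch of what measurement looks like statistically; it simulates nothing genuinely quantum, only the idea that each measurement yields one definite outcome drawn from a probability distribution:

```python
import random

def measure(p_one: float) -> int:
    """One simulated measurement: return 1 with probability p_one, else 0."""
    return 1 if random.random() < p_one else 0

p_one = 0.5   # a balanced superposition, like a fair coin in the air
outcomes = [measure(p_one) for _ in range(1000)]
print("fraction of 1s over 1000 measurements:", sum(outcomes) / 1000)

# Each individual measurement is a definite 0 or 1; only the statistics
# over many repetitions reveal the underlying probabilities.
```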
The challenge in quantum computing is to design computations so that, when we finally measure, the desired outcome appears with high probability despite the randomness of the underlying processes.
Qubits are easily disturbed by their external environment, and these disturbances can corrupt the outcome.
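As a toy illustration of why error rates matter (the error probabilities below are made up, not real hardware figures): suppose the ideal answer of a computation is 1, but noise flips the measured result some fraction of the time.

```python
import random

def noisy_run(ideal: int, error_rate: float) -> int:
    """Return the ideal result, flipped with probability error_rate."""
    return ideal ^ 1 if random.random() < error_rate else ideal

ideal = 1
for error_rate in (0.10, 0.01, 0.001):
    runs = [noisy_run(ideal, error_rate) for _ in range(10_000)]
    wrong = sum(1 for r in runs if r != ideal)
    print(f"error rate {error_rate:>5}: {wrong} wrong answers out of 10,000")

# The lower the error rate, the more often the measurement matches the
# intended answer, which is exactly what hardware advances aim to improve.
```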
However, recent advancements in the field have reduced these errors considerably. For instance, Google's recent quantum chip, Willow, has made notable headway in error reduction.
Let's summarize
Today, we covered the basics of quantum computing and some of the core ways it differs from classical computing.
In the next installment, we will focus on quantum algorithms and how companies use them.
If you enjoyed today's Tech-Tonic Tremors, please subscribe and share. Your comments and opinions about these developments happening in our lifetimes will propel me to write even more. Join the conversation in the comments below!