Introduction to Quantum Computing
We will be covering a lot of topics in quantum computing. I will give you a sense of what quantum computers are and their level of progress. We will also look at the power of quantum computing and the “dark” side of this technology. If you do not have much familiarity with this topic, this will be a lot of information, but hold tight; I will take you through this step by step. In the end, you should have enough familiarity to be able to dig deeper and even have conversations on this topic if it comes up at work.
Let’s go back 73 years to 1947 New Jersey. The first transistor has just been made at Bell Labs by William Shockley, Walter Brattain and John Bardeen after many painstaking experiments involving physicists, chemists and materials scientists. Not much attention is given to the Bell announcement beyond a small article in the New York Times and in Time magazine. However, certain labs and researchers, even as far away as Japan, are paying attention and trying to replicate the results. Only a handful of people see the value of this new technology, which would go on to replace the older vacuum tubes then used to amplify sound signals on the Bell telephone lines and in large, room-sized adding machines.
The diagram on the left shows the first transistor: a few wires sticking out of a germanium crystal. Yet this invention would revolutionize the computing industry over the next several decades.
Two engineers from Japan licensed the new technology from Bell Labs and started working on their own smaller transistor radios; their company became Sony, and it could not keep up with the demand. Texas Instruments also started selling transistor radios in the mid-50s but was not profitable due to the high transistor failure rate.
There were still a lot of technical challenges in making an actual circuit, and Shockley moved back to Palo Alto, the suburb where he had grown up, near Stanford University, where he created the Shockley Semiconductor Laboratory. This was the beginning of what is now known as Silicon Valley.
So what is a transistor? Basically, a transistor, shown on the bottom left (#1), is made from three layers of material; these can be silicon, germanium, indium and various oxides. The goal is to produce an amplified current in the two outer wires, C and E, when a small voltage is applied to the middle one, B. Thus one can use a transistor as a “switch” that you can turn “on” and “off” as needed. When there is no output current, the circuit is off and we can represent that as a 0; when there is an output current, the circuit is on and we can represent that as a 1. Thus a single transistor serves as a unit of binary information, a 1 or a 0, called the BIT.
As the technology improved, transistors became more stable, didn’t break, and were embedded in casings to create larger circuits. This led to the ability to embed logic, for example AND, OR and NOT. A NOT operation, as shown at (#2), turns an on, or 1, into an off, or 0, and turns a 0 into a 1. A core set of logic gates, especially the NAND, that can be used to build any arithmetic or logical operation we might need is called a universal gate set. Larger circuits like the "Adder" shown in (#3) were created out of these gates. This led to the integrated circuit chips shown at (#4), where many transistors carry out binary logic and arithmetic on many bits of data stored in registers of memory. The main component is called the Central Processing Unit, or CPU. An image of the Intel 4040 CPU, produced in 1974, is shown above. Later, as computing needs continued to increase, chip manufacturers like Intel started adding multiple processor cores to one CPU, as in (#5). Now we have multi-core CPUs with billions of transistors, along with Graphical Processing Units, or GPUs. (#6) shows the motherboard where the various processors are attached.
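To make the idea of a universal gate set concrete, here is a small illustrative sketch in Python (not how hardware is actually wired, of course): NOT, AND, OR, XOR and a one-bit half adder, all built from NAND alone.

```python
# Minimal sketch: building other logic from NAND, the classic "universal" gate.
# Everything here operates on Python ints restricted to 0 and 1.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):        # NOT from a single NAND
    return NAND(a, a)

def AND(a, b):     # AND = NOT(NAND)
    return NOT(NAND(a, b))

def OR(a, b):      # OR via De Morgan: a OR b = NAND(NOT a, NOT b)
    return NAND(NOT(a), NOT(b))

def XOR(a, b):     # XOR built entirely from NANDs
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

def half_adder(a, b):
    """Adds two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
```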
Early in the evolution of computers, Gordon Moore, a co-founder of Intel, noticed that the number of transistors on a chip could be doubled roughly every two years. This became known as Moore’s law and has held for the last 50+ years. By the mid-70s Intel had already created chips like the Intel 8080 with about 5000 transistors; within the next two years they were at 10,000 transistors. The graph shows how Intel and other chip manufacturers continued to improve fabrication technology to build transistors at smaller and smaller scales. New technology breakthroughs were needed at every step, but those were overcome, and more and more transistors were squeezed into a square inch of silicon wafer.
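As a rough sketch of what “doubling every two years” means, here is a back-of-the-envelope calculation; the starting figures are illustrative, not exact product data.

```python
# Illustrative only: Moore's law as "transistor count doubles every 2 years".
# The starting year and count are rough, just to show the shape of the curve.
start_year, start_count = 1974, 5_000
for year in (1974, 1984, 1994, 2004, 2014, 2020):
    doublings = (year - start_year) // 2
    count = start_count * 2 ** doublings
    print(year, f"{count:,} transistors")
```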
Some projections say traditional semiconductor scaling will end around 2024. Currently transistors are around 10-20 nanometers in scale, and IBM created a 5 nanometer chip this year, although those transistors are now arranged in a vertical fin structure. At this point, transistors are so small that quantum effects become relevant, forcing re-engineering. Manufacturers have already started creating 3D processing architectures, stacking circuits vertically to get around this problem, but at the cost of increased delivery time, cost and complexity. Some are researching carbon nanotube and molybdenum disulfide transistors to go smaller still. It is clear that for computing to continue making progress according to Moore’s law, there will need to be continued innovation and jumps to newer, more exotic mechanisms. And this is where quantum computing could play a role.
Now imagine, being at the cusp of another technological revolution…
Richard Feynman made the statement above at the end of one of his lectures in 1981. He was alluding to the fact that classical computers cannot efficiently model nature, because at a fundamental level nature is quantum mechanical and highly complex. Only a computer built from quantum mechanical components would be able to simulate nature.
In many cases we reduce the complexity in the mathematics or in the processing requirements when we are simulating things like molecules or fluids to derive solutions that are good enough approximations of reality. However, what we are finding now is that by using the behavior of quantum mechanics it is possible to build computers that inherit some of these properties, allowing us to reduce the scale and power needs and also give us the exponential computation power needed to solve our most complex problems.
There are some findings that even biological processes leverage quantum mechanics. Some papers claim that photosynthesis in certain bacteria uses quantum superposition to transfer the light’s energy to the reaction center efficiently.
Our data and processing requirements continue to increase with needs for new drug discovery, machine learning and AI, optimizing satellites, reducing traffic congestion, making airplanes more fuel efficient, continuing to keep our data and communications secure and meeting the energy needs of the planet.
So how do quantum computers work?
Before we really get into that, it is important to wrap our heads around the concept of a qubit. Just like a bit was able to be a 0 OR a 1, a qubit has the property of being in a state which is a combination of 0 and 1, or in a superposition of 0 and 1.
One way to visualize this is to imagine a qubit as a pointer (or arrow or technically vector) pointing from the center of a sphere (technically the Bloch sphere) in various directions to the surface of the sphere. (it can point inside, but that is a more complicated state). When the qubit is pointing to the North pole and you measure to see what value it has, it always displays a 0, and if pointing to the south pole, you always measure a 1.
Let's contrast this with a bit. Remember, we used a transistor to create a bit and it was either a 0 or a 1. You can check that by checking if it is off or on, or if there is a voltage or not. Similarly, in the qubit, you would always measure a 0 if it is pointing up, and a 1 if it is pointing down (this is just convention and a way to visualize).
Now, imagine your vector is anywhere at the equator. It is the same distance from the North and South pole. Now if you measure it, it will sometimes show a 1 and sometimes show a 0. In fact it will act like a fair coin. If you measure enough times, you will conclude that you got 50% 0 and 50% 1 values. We say the qubit is in an equal superposition of states 0 and 1.
Now imagine your qubit starts at the North Pole in the 0 state. You apply some energy to it so that it moves only partway down the sphere. Now if you measure the qubit 100 times, roughly 25 measurements will show a 1 and 75 will show a 0. So you conclude that the probability of the 0 state is 75% and the probability of the 1 state is 25%.
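Here is a toy sketch of that idea in Python, representing the qubit as two amplitudes and sampling measurements; the 75/25 split is just the example above.

```python
import numpy as np

# A single qubit as a 2-component vector of amplitudes [a0, a1].
# The measurement probabilities are |a0|^2 and |a1|^2 (they must sum to 1).
state = np.array([np.sqrt(0.75), np.sqrt(0.25)])   # 75% chance of 0, 25% chance of 1

probs = np.abs(state) ** 2
probs = probs / probs.sum()                         # guard against rounding error

rng = np.random.default_rng(seed=1)
samples = rng.choice([0, 1], size=100, p=probs)
print("measured 0:", np.sum(samples == 0), "times")
print("measured 1:", np.sum(samples == 1), "times")
```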
Let’s just think about this. If we create a qubit through some physical mechanism, we expect these two states to interfere with each other, much like ordinary waves, where positive and negative waves interact and can eventually cancel each other out.
Now, in case someone is wondering: if you rotate the qubit around the equator, does the probability change? It does not, as long as you are measuring along the same North-South direction. So to some extent, we don’t know exactly where this vector is unless we prepare the same state repeatedly, tilt the globe in various directions, measure, and repeat the process. However, mathematically we can calculate exactly where the qubit should be based on what operations we run on it. Amazingly, quantum mechanics can be represented by linear algebra, and as long as you do the math correctly, you know what should happen to each qubit… unless of course you have too many qubits, in which case you need a big computer, and eventually you can’t keep track even with a supercomputer. But we will get there.
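And here is a small sketch of the “rotation around the equator” point: a phase rotation about the vertical axis changes the state’s mathematical description but not the 0/1 measurement probabilities.

```python
import numpy as np

# Rotating a qubit "around the equator" (a phase rotation about the Z axis)
# changes the state vector but not the 0/1 measurement probabilities.
state = np.array([1, 1]) / np.sqrt(2)        # equal superposition

def rz(theta):
    # Standard Z-rotation matrix
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

for theta in (0.0, np.pi / 3, np.pi):
    rotated = rz(theta) @ state
    print(f"theta={theta:.2f}  P(0)={abs(rotated[0])**2:.2f}  P(1)={abs(rotated[1])**2:.2f}")
```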
This all sounds great; however, you do have to appreciate that we are trying to control and manipulate the spins of subatomic particles. These particles are affected by the very atoms that surround and hold them, and by radiation and magnetic fields in the surroundings.
Classical computers started with lots of flaws and instabilities. Vacuum tube adding machines failed when a vacuum tube burnt out, and even early transistors had problems with wires detaching from the semiconducting material. Chips had low manufacturing yields, and thousands would get discarded and stored so the competition could not get their hands on them. But as time went on, the quality of computer chips and parts increased. Now we expect a computer to work without failure, and it is quite amazing that even after millions of calculations per second, no step fails, and we get almost perfect computational reliability from our computers.
Quantum technology is still early in this evolution, and qubits currently are not very stable. The state of a qubit is constantly being disturbed by its environment, which can cause energy exchange and thus decoherence. Gate fidelity describes how accurately each gate operation is performed; together with the coherence time, it determines how many gate operations can be done before the information in the qubits is lost.
The diagrams show a visualization of how noise from the environment, which can come from magnetic fields, temperature and other effects, can cause the qubit state to shrink through dephasing, depolarization or amplitude damping. Energy exchange from the qubit to its surroundings causes it to fall from its 1 state to its 0 state, and random noise from the environment corrupts the pure state we wanted. Even the control systems have noise and are not precise enough, so the qubits don’t always end up where we want them; they can overshoot or undershoot.
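As a rough illustration of the energy-exchange effect, here is a toy calculation of how quickly a qubit prepared in the 1 state decays; the 100 microsecond T1 time is an assumed, illustrative figure, not a quoted device specification.

```python
import numpy as np

# Toy model of energy relaxation (amplitude damping): a qubit prepared in the
# 1 state survives to time t with probability exp(-t / T1).
T1 = 100e-6                                   # assumed coherence time: 100 microseconds
for t in np.array([0, 25, 50, 100, 200]) * 1e-6:
    p_still_1 = np.exp(-t / T1)
    print(f"t = {t * 1e6:5.0f} us   P(still 1) = {p_still_1:.2f}")
```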
Some of these systematic errors can be fixed by calibration; there are also special sequences of rotations that ensure (for these types of errors only) that the qubit arrives at the correct location. Another approach is to group qubits together so that if one fails, it is possible to detect and correct the failure without directly measuring the encoded information. We will talk about these "logical qubits" later. Basically, the key ways to improve quantum computers are making qubits out of stable materials, removing noise from the environment, improving our control mechanisms, creating correcting pulses, and grouping qubits into one high-quality logical qubit.
So how do we create a qubit?
Well, I mentioned you can create a bit using a transistor, but you can also create a bit of information using a light bulb (on or off), a car door (open or closed), raising your hand (1) or lowering your hand (0), or sending signals through wires, like a beep or no beep, to represent a 1 or a 0; or you can use a long beep as a 1 and a short beep as a 0, as in Morse code.
Just something to think about: all information, including sound, music, conversations, videos, and even our genetic code, can be converted and represented as 0s and 1s, and of course that is how all digital information is stored and transmitted.
In similar ways, physicists and other researchers have found different ways to create qubits. Let’s go through a few general examples to give you a taste of this.
Physicists have known for a long time that subatomic particles (electrons, protons and neutrons) have a property called spin. It’s not exactly like a spinning top, but you can think of it that way. When researchers measured certain atoms, they discovered a small magnetic field generated by the electrons and even by the nucleus. You can use this property of the protons and neutrons inside the nucleus: align all the spins in one direction and detect the energy they emit as they rearrange themselves. This is used in magnetic resonance spectroscopy, and some of the early experiments with quantum algorithms were done with these devices.
Also, the spin of a single electron of the outermost shell of an atom can be used as a qubit.
Ions (charged atoms) can be trapped in electromagnetic fields, and their electrons can be moved up and down between two energy levels to represent a 0 or 1 state. The two energy levels behave in the same way as described on the Bloch sphere. This is used in the ion trap quantum computers at IonQ and Honeywell.
You can also use the property of superconductivity. In this case, you take a special material and cool it down to very low temperatures, and the material loses all its resistance to current; in other words, current flows freely in that material. This property is used to make a small circular or square circuit, along with a small junction (called a Josephson junction). Using the direction of the free current, clockwise or anti-clockwise, one can define qubits with state 0 or 1. Keep in mind that this also behaves in the same way described on the Bloch sphere.
Many of the leading companies like IBM, D-Wave, Rigetti and Google are using this type of superconducting circuit, which can be made with semiconductor chip fabrication technology that has been refined over the last 70 years.
But there are other ways to make qubits, including photons manipulated with lasers, and defects in a crystal, such as a diamond with a nitrogen atom in place of a carbon atom. Here there are unique energy levels due to the nitrogen defect that can be exploited to make qubits; however, these are hard to manufacture.
My favourite is a strange type of particle called a Majorana fermion, which is being developed into qubits by Microsoft and Purdue University.
However, you can also just use linear algebra on your best classical computer to create a quantum simulator, based on the linear algebra that represents quantum mechanics, and a lot of companies have these available.
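To show how little machinery such a simulator needs at its heart, here is a minimal sketch: a two-qubit state vector, a Hadamard and a CNOT, producing an entangled (Bell) state. Real simulators add many optimizations, but the core idea is just matrix-vector multiplication.

```python
import numpy as np

# A tiny state-vector "quantum simulator": just linear algebra on a 2^n vector.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)          # two qubits, both in the 0 state
state = np.kron(H, I) @ state                          # Hadamard on the first qubit
state = CNOT @ state                                   # entangle the two qubits

probs = np.abs(state) ** 2
for basis, p in zip(["00", "01", "10", "11"], probs):
    print(basis, f"{p:.2f}")                           # prints 0.50 for 00 and 11
```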
Let’s go deeper into the two kinds of superconducting quantum computers, which are the most often used today. We will also talk about the ion trap, a newer type that is becoming available.
For superconducting qubits, we use the same kind of circuit fabrication technology as semiconductor chips. However, the quantum circuits themselves are different.
On the top left you can see a representation of a Josephson junction. Two of these are used between two superconducting loops shaped in a circle or square, called a SQUID. This is kept at 15 mK for two reasons: one is to bring out the superconducting properties, where current flows freely in both directions, and the other is to remove the noise due to heat and vibrational energy from the surrounding atoms. Note that this temperature is just 0.015 K above absolute zero and thus colder than the temperature of space.
This circuit is then used along with other control circuits and connections between the qubits so that the qubits can be put in superposition, entangled with each other with varying degrees of coupling, or manipulated in specific ways to represent gate operations.
In this diagram I am showing two different kinds of quantum computers. D-Wave is a Canadian company that used research by a Japanese professor to create an annealing quantum computer. IBM, Google and Rigetti, on the other hand, are using similar superconducting technology along with other control systems to create universal quantum computers.
What is annealing? In metallurgy and materials science, annealing is a heat treatment that alters the physical and sometimes chemical properties of a material to reduce its hardness, making it more workable. It also allows certain additives to settle into the metal to create high-quality alloys. In an annealing quantum computer, the qubits are prepared in a low-energy state, and the problem’s constraints are added as coupling strengths between the qubits. The system of qubits is then slowly evolved toward a new energy configuration that represents the problem. In effect, if this process is done carefully, the qubits start from their original configuration and end up at the lowest energy configuration of the problem; in other words, they end up at the minimum (typically close to the minimum) of the problem we are solving.
How does this help? For many optimization or chemistry problems, if you can express the problem as an energy function, then you can have an annealing quantum computer find its minimum, which is very useful, and for large optimization problems or large molecules, very hard to do classically.
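To build some intuition, here is a sketch of classical simulated annealing on a small, made-up energy function. A quantum annealer explores a similar energy landscape, but it uses quantum fluctuations rather than this thermal, step-by-step search; the couplings below are random placeholders, not a real problem encoding.

```python
import numpy as np

# Classical simulated annealing on a tiny Ising-style energy function, just to
# illustrate "settling into the lowest-energy configuration".
rng = np.random.default_rng(seed=0)
n = 6
J = rng.normal(size=(n, n))
J = (J + J.T) / 2                      # symmetric, made-up couplings
np.fill_diagonal(J, 0)

def energy(s):
    return s @ J @ s                   # s is a vector of +/-1 "spins"

s = rng.choice([-1, 1], size=n)        # random starting configuration
temperature = 2.0
for step in range(2000):
    i = rng.integers(n)
    flipped = s.copy()
    flipped[i] *= -1
    delta = energy(flipped) - energy(s)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        s = flipped                    # accept moves that lower (or occasionally raise) energy
    temperature *= 0.997               # slowly "cool" the system

print("final spins :", s)
print("final energy:", round(float(energy(s)), 3))
```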
The gate quantum computer shown in the middle represents the IBM 5-qubit chip. These are the universal quantum computers, designed to flip and entangle qubits in such a way that any logical operation can be performed. They are technically more challenging and currently have fewer qubits than the annealing quantum computers. The image on the right is an IBM quantum computer with its casing removed. The quantum chip is housed at the bottom. The various tubes bring control signals to the chip and take results from it. Each level from top to bottom gets colder and colder, so that the chip at the bottom is at 15 mK.
The bottom left diagram shows an ion trap, with the electrodes that hold the ions suspended. A combination of the ions’ mutual repulsion and the oscillating trapping fields keeps the ions evenly spaced in a straight line, as shown in the middle, where 1, then 2, and on up to 12 ions are brought into the trap. When the ions are lit up, due to the emission of a photon of light as the electron falls back to the lower state, they are read as 1, and when they are not lit up they are read as 0.
On this chart I am comparing the progress of classical computers with quantum computers. In 1906 the vacuum tube was invented to boost signals; it also acted as a switch. It was used in radios and in vacuum-tube based adding machines that were as big as a house. It was around that time that IBM’s president Thomas Watson reportedly claimed, "I think there is a world market for maybe five computers." It is fair to say not many people would have wanted those. Then in 1947 the transistor is created, which leads to the first transistor-based computer at MIT Lincoln Labs in the mid-50s. The first integrated circuit (chip) is created in 1958, leading to the first commercially available microprocessor, the Intel 4004, with about 2,300 transistors. By 1989 we have the Intel 486 chip with a million transistors, and so on, to now, where we have the AMD Threadripper and others with around 20 billion transistors.
Right now we have about 5,000-qubit annealing quantum computers and around 50-to-70-qubit universal quantum computers. Even though this is not an apples-to-apples comparison, let’s put our current quantum computers at around 1972 on the classical timeline. That is very early in the evolution of computers: before many of the programming languages we now use, before the internet, and definitely before any smartphones, VR or AI.
Before reaching the timeline shown for quantum computers, it is important to note that quantum mechanics evolved from various observations and theories from the early 1800s to the early 1900s. It started with the famous double-slit experiment, black-body radiation and cathode ray tubes, leading to Max Planck describing energy packets or “quanta” and to the explanation of the photoelectric effect.
In 1927, in Brussels, at the famous fifth Solvay Conference on Electrons and Photons, very prominent physicists were present, including Niels Bohr, Paul Dirac and Albert Einstein, among others like Marie Curie, Wolfgang Pauli, Louis de Broglie, Werner Heisenberg and Erwin Schrödinger.
There was a famous debate between Niels Bohr and Einstein on the interpretation of quantum mechanics; Einstein did not like the non-local interaction called entanglement and famously called it “spooky action at a distance”. In any case, Bohr kept coming back and answering each of Einstein’s challenges, and though it wasn’t official, Bohr had effectively won; the version of quantum mechanics we use now follows the mathematical formulation laid out by Dirac and others. No one can fully explain it, but it works. In fact Richard Feynman is quoted as saying, “I think I can safely say that nobody understands quantum mechanics”, and there is an attitude in quantum mechanics that when students ask their professors to explain it, they are told to “shut up and calculate”.
Coming back to our timeline: in 1962 Brian Josephson works out the mathematics of the Josephson effect, and in 1981 Richard Feynman proposes simulating physics with a quantum computer in his seminal lecture. In 1989 Wolfgang Paul receives the Nobel Prize for the quadrupole ion trap he had developed decades earlier, which is used to trap ions and in atomic clocks. In 1994 Peter Shor shows that qubits can be used to solve the factorization problem efficiently (i.e. faster than any known classical method), and Lov Grover creates a search algorithm two years later which also shows a quantum speedup. In 1999 the D-Wave company is created, based on the quantum annealing theory developed by Tadashi Kadowaki and Hidetoshi Nishimori at the Tokyo Institute of Technology. I have seen Nishimori’s lecture where he said, “When I proposed the concept of quantum annealing, I never expected that someone would manufacture such a machine. It’s a crazy idea, but they did it and they sold it.” D-Wave sold their first system for $10M to Lockheed Martin in 2010. Would you like to get your hands on this machine?
In 2009, Chad Rigetti completed his PhD in physics, and he founded Rigetti Computing in 2013. In 2015 IonQ was founded by Chris Monroe of the University of Maryland and Jungsang Kim, a professor at Duke University. Their device uses an ion trap, developed from the technology used in atomic clocks. Later, IBM and others who had also been working on quantum computing internally started announcing their quantum initiatives. IBM and Google are at 53 and 72 qubits respectively, and Intel is at 49 qubits. I have seen articles about Rigetti working on 128 qubits. Honeywell also has an ion trap quantum computer, while D-Wave has announced and is releasing a 5,000-qubit Pegasus annealing quantum computer.
As you can see, there has been great progress in classical computers, but an equivalent race is now taking place in quantum computers. Some of the advancements in our understanding of quantum mechanics and quantum algorithms have in turn helped classical computers. In one case, a quantum algorithm beat the best classical algorithm, but then mathematicians changed their classical algorithms, taking inspiration from the new quantum techniques, and turned around and beat the quantum algorithm. Toshiba has a quantum-inspired simulator that uses a classical algorithm, inspired by quantum mechanics, running on classical hardware. It will be very exciting to see what happens in 5, 10 and even 20 years and how quantum computing changes our world.
Why is the number of qubits important?
Let's look at how a bit in a classical computer stores information. Each bit can have a value of 0 or 1; you can only store one value in it. Now if you have a chip with multiple bits, a register of n bits can represent 2^n binary values, but only one at a time. So a register with 3 bits can hold 000, 001, 010, and so on up to 111, one at a time; that is, it can store one of the 2^3 = 8 possible values. Similarly, if you had a register of 100 bits, you could still only store one value, even though it could be a very large binary value.
Now let's look at qubits. Each qubit, as we said, can be in a superposition of the 0 and 1 states; it stores the amplitudes that say how much 0 and how much 1 you have, so in a sense it holds information about both states at once. Now let’s look at 2 qubits. Here you can have a superposition over the states 00, 01, 10 and 11, so you can hold amplitudes for 2^2 = 4 states. With 3 qubits you can hold 8 possible values within the superposition of the qubits.
For this you do need qubits that are connected to each other; if not, you don’t get this effect perfectly. There are ways to get around it, but I won’t get into that. So here is the amazing thing: a quantum computer with 30 connected qubits can hold about 4 billion different states, which is roughly the number of base pairs in the human genome. So you could keep track of a value for each one of the base pairs and somehow use that to find what you are looking for. That is also about 4 times the number of webpages on the internet.
When you get to about 64 qubits, you are at 1.8x10^19, that is, about 2 followed by 19 zeros, roughly the number of grains of sand on Earth. We are already at 53-72 qubits. So these computers, with their qubits in superposition, have the potential to hold a value for each grain of sand on Earth.
We still lack high-fidelity qubits, efficient ways of loading real-world data into the qubits, the ability to run long sequences of gate operations, and full interconnectivity between the qubits.
By the time you get to 167 qubits, you have the potential to have as many different values stored in superposition as the number of atoms on Earth.
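To see why nobody can simply simulate large numbers of qubits on classical hardware, here is a back-of-the-envelope calculation of the memory a straightforward state-vector simulator would need, assuming one complex number (16 bytes) per amplitude; the constants are only illustrative.

```python
# Rough arithmetic: a state-vector simulator stores 2^n complex amplitudes.
for n in (10, 30, 50, 64, 167):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16          # assume 16 bytes per complex amplitude
    print(f"{n:3d} qubits -> {amplitudes:.3e} amplitudes, ~{bytes_needed / 1e9:.3e} GB")
```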
By encoding the problem onto the qubits, either through coupling terms in an annealing computer or through gate operations on a universal quantum computer, it is possible to leverage this huge number of states to produce an answer. Since all the states are held in these qubits together, they can be manipulated together in parallel to speed up finding solutions. Thus a quantum computer has some unique features that promise to give it exponential power over a classical computer for certain problems.
I want to give you a flavor of how one can manipulate these qubits inside a real quantum computer. This field is still in development, and the example is greatly simplified just to touch upon the concepts. When qubits are entangled with each other, the Bloch sphere representation is not very useful, as it only describes single qubits.
Imagine a simplified financial portfolio optimization problem where we want the lowest-risk portfolio. A selected asset will have a value of 1 and an unselected one will be 0. We are ignoring returns and borrowing. Each qubit will represent an asset.
For each asset we know the covariance terms with the other assets. For example, suppose assets 1 and 2 are negatively correlated: when one asset’s value goes up, the other’s goes down. To reduce our risk, we want assets like that in our portfolio. Now take two assets that are positively correlated: if one goes up in value, the other does as well. In this case you can pick either; you don’t really need both, but you must decide which one to pick for the lowest risk.
By summing up the risk of the individual assets and the contributions from their covariance terms, one can pick the specific set of assets that creates the minimum risk.
We can use an annealing computer where each qubit represents an asset. We turn on these covariance terms as coupling values between the qubits and then let the machine run through the annealing process multiple times. In the end we get energy values along with the assets selected; the lowest energy value gives us the ideal portfolio mix we are looking for, where an asset we want has a value of 1 and an asset we don’t want has a value of 0.
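For a problem this small, we can also check the answer by brute force. The sketch below enumerates every possible asset selection x and picks the one with the lowest risk x^T C x; the covariance matrix C is invented for illustration, and an annealer would search this same landscape physically rather than by enumeration.

```python
import numpy as np
from itertools import product

# Brute-force version of the toy portfolio problem: pick the selection x
# (each entry 0 or 1) that minimizes the risk x^T C x.
# The covariance numbers below are made up for illustration.
C = np.array([[ 0.10, -0.05,  0.02,  0.01],
              [-0.05,  0.12,  0.04,  0.03],
              [ 0.02,  0.04,  0.09, -0.06],
              [ 0.01,  0.03, -0.06,  0.11]])

best = None
for bits in product([0, 1], repeat=4):
    x = np.array(bits)
    if x.sum() == 0:
        continue                      # skip the empty portfolio
    risk = x @ C @ x
    if best is None or risk < best[0]:
        best = (risk, bits)

print("lowest risk:", round(float(best[0]), 4), "with selection (asset 1..4):", best[1])
```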
How is this done with a universal quantum computer?
Starting at the very left, the diagram shows a register of four qubits Q[1] to Q[4] representing 4 assets.
At the first step the qubits are initialized to the zero state. Notice all the qubit vectors are pointing up to the North Pole, which is the 0 state. If we measured all the qubits now, we would have a 100% chance of getting 0 on each qubit. Zeros on all four qubits represent the binary number 0000.
Now we apply a gate called the Hadamard gate on each qubit, which rotates each qubit to the equator of the Bloch sphere, where it has a 50% chance of measuring 0 and a 50% chance of measuring 1. Thus if we were to measure the register, we would get 0 or 1 on each qubit, which gives us all 2^4 = 16 binary values from 0000 to 1111 with equal probability. We say the qubits are in equal superposition: we could have all the assets or we could have no asset at all.
Next we apply individual qubit rotations around the equator and we couple the qubits together, 1 to 2, 2 to 3, 3 to 1 and so on, so that there is some mathematical correspondence between the covariance terms and the entanglement between the qubits. This process is still in development and different gate sequences are being figured out. But in any case, the goal is to get the qubits to represent the problem, then rotate them back toward the North and South poles with another row of Hadamard gates and measure the answer, as in the sketch below.
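Here is a toy numpy version of those steps, along the lines of the mini-simulator sketched earlier: a layer of Hadamards, some single-qubit rotations and pairwise couplings with made-up angles (placeholders, not a real encoding of the covariance terms), a final layer of Hadamards, and then the measurement probabilities. A real run would tune those angles with a classical optimizer.

```python
import numpy as np
from functools import reduce

# Toy 4-qubit circuit: Hadamards, made-up rotation/coupling angles, Hadamards, measure.
n = 4
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def on_qubit(gate, k):
    """Expand a single-qubit gate to act on qubit k of the n-qubit register."""
    ops = [gate if i == k else I for i in range(n)]
    return reduce(np.kron, ops)

def rz(theta):
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def zz(theta, j, k):
    """Two-qubit coupling: a phase on each basis state depending on bits j and k."""
    phases = np.ones(2 ** n, dtype=complex)
    for idx in range(2 ** n):
        bj = (idx >> (n - 1 - j)) & 1
        bk = (idx >> (n - 1 - k)) & 1
        phases[idx] = np.exp(-1j * theta * (1 if bj == bk else -1))
    return np.diag(phases)

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                                   # start in 0000
for k in range(n):
    state = on_qubit(H, k) @ state                               # equal superposition
for k, angle in enumerate([0.3, 0.7, 0.1, 0.5]):                 # made-up single-qubit angles
    state = on_qubit(rz(angle), k) @ state
for (j, k, angle) in [(0, 1, 0.4), (1, 2, 0.2), (2, 3, 0.6)]:    # made-up couplings
    state = zz(angle, j, k) @ state
for k in range(n):
    state = on_qubit(H, k) @ state                               # rotate back and measure

probs = np.abs(state) ** 2
for idx in np.argsort(probs)[::-1][:3]:                          # three most likely outcomes
    print(f"{int(idx):04b}: {probs[idx]:.2f}")
```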
In the bar chart on the right, I am showing 100 measurements of the four qubits. We get 0101 75% of the time, which means in this result we want assets 1 and 3 (the rightmost value is qubit 1), while we get 1101 25% of the time, indicating a different asset selection in those runs.
Now we try out different rotation angles to probe our multi-dimensional state. We have to repeat this for various angles and then find the minimum value we were looking for. Note that since gate computers are not that robust right now, we leverage classical computers to help optimize the rotation angles so we can find the answer quickly.
This is just one simplified example of the QAOA quantum algorithm, and there are many more complicated examples.
Ok, well that was the toughest slide. We are going to start coming back to more business realities.
Let’s summarize a few of the more important quantum algorithms that are out there. Grover’s algorithm showed that quantum computers could do a search more efficiently than classical computers, while Shor’s algorithm showed that one could find factors (using period, or order, finding) more efficiently, and thus break RSA encryption. However, current quantum computers are not powerful enough to factor a large number or even do a practical search.
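To give a feel for Grover’s idea, here is a toy state-vector sketch (not how a real device runs it): the amplitude of one “marked” item out of eight is amplified over roughly sqrt(N) iterations. The marked index is an arbitrary choice for illustration.

```python
import numpy as np

# Toy sketch of Grover's search over N = 8 items, looking for one "marked" index.
N = 8
marked = 5                                  # the item we are searching for (made up)

state = np.ones(N) / np.sqrt(N)             # start in equal superposition
oracle = np.eye(N)
oracle[marked, marked] = -1                 # flip the sign of the marked item
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # "inversion about the mean"

for _ in range(2):                          # about (pi/4) * sqrt(8), i.e. ~2 iterations
    state = diffusion @ (oracle @ state)

probs = state ** 2
print("probability of measuring the marked item:", round(float(probs[marked]), 3))
```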
Quantum computers do not yet have any quantum RAM or quantum database to run searches on. There are ideas about this, such as using the properties of a hologram: just as a hologram stores 3D information on a 2D surface, and by looking at the image from different angles you can see the parts of the image as they would appear from each angle, one day we could have a quantum database that would give us all the possible answers to a query in superposition.
Quantum annealing, as I have mentioned, is used to find the minimum energy configuration for a problem, and thus it is used for optimization problems.
The HHL algorithm is being used to solve linear systems of equations.
The financial portfolio example using the gate model was a simplified version of the Quantum Approximate Optimization Algorithm (QAOA), which is still being developed. I have seen some promising papers on this just this past November, but it is an active area of research and I am not sure we have the best solution yet. It is being used, similarly to quantum annealing, to solve combinatorial optimization problems. These are problems like the travelling salesman problem, where the solution becomes exponentially harder as the number of elements increases.
The Variational Quantum Eigensolver (VQE) is another type of quantum algorithm, which is being used with quite a lot of success in simulating small molecules like lithium hydride. Quantum annealing has also been shown to find the lowest energy state of even larger molecules.
If you think about the rechargeable batteries in electric cars like a Tesla, cobalt, not lithium, is the most expensive metal; it is used to stabilize the battery and prevent overheating. However, there are limited known supplies, much of it from countries with no protections against child labor. We need computers that can simulate the chemistry of cobalt and find replacements.
The Haber-Bosch process requires temperatures of several hundred degrees Celsius and around 200 atmospheres of pressure, along with an iron catalyst, to convert atmospheric nitrogen and hydrogen into ammonia. This is used to create fertilizers, and there are estimates that 1-2% of the energy we produce on Earth goes into this process to grow our plants and thus make our food. However, nitrogen-fixing bacteria do the same thing with hardly any energy, and no extreme pressure or temperature, using the enzyme nitrogenase. We need computational power that can model such large molecules and allow us to design our own catalysts that can reduce the cost of creating fertilizer. Those are just two chemistry problems that we hope will be solved by quantum computers.
In 2016 NIST published a report that basically said that none of the major public-key cryptographic algorithms would remain secure against a large quantum computer; these include RSA and DSA, as shown in the table on the left. I won't get into the details of how RSA works, but based on Shor’s algorithm it was essentially evaluated that eventually quantum computers would become strong enough to break RSA. NIST also solicited ideas for secure communication in the post-quantum cryptography era. These would be algorithms that run on classical systems but cannot be broken by a quantum computer, or by any quantum-inspired algorithm.
The table on the right highlights the families of post-quantum cryptography schemes under consideration by NIST. I don’t think they have released a standard yet.
But why is it that, when we don’t yet have a fully functioning error-corrected quantum computer with millions of qubits, NIST is already saying that RSA is compromised?
Let's look at the figure at the bottom left. Assume that at some time in the future, in year C, RSA is compromised; let’s say C is 2030. Now consider the requirement for securing historic data in your company or government organization, and call that P years: your security team expects to keep all records from the past P years secure. Let’s say that is 10 years. Now assume that hackers who have been stealing data all along have been storing it, waiting for some method to decrypt it in the future. So when, for example, a well-funded rogue nation builds a quantum computer that can break encryption in the year 2030, they will have all that data sitting with them, and they can, at that time, begin to decrypt it going back in time; in particular, they can decrypt your data from the last 10 years. In this example S = C - P, so S = 2030 - 10 = 2020; in other words, data that is in the hands of hackers will be compromised before you want it to be public. For drug manufacturers, who spend many years in research before reaping benefits, this would be a considerable loss. Healthcare companies, who are expected to secure their customers’ data, would be at risk if customer data older than 10 years is decrypted. And governments that do not want classified information released for more than 10 years will face the situation that that information could be exposed before they are ready to release it to the public.
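The arithmetic is simple enough to write down; the years below are just the example values from above.

```python
# If RSA is broken in year C and your policy requires keeping the last P years
# of records secure, data harvested from year S = C - P onward is at risk.
def exposure_start_year(compromise_year, retention_years):
    return compromise_year - retention_years

print(exposure_start_year(compromise_year=2030, retention_years=10))   # -> 2020
```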
Thus your company’s policy on safeguarding past data, together with when you think RSA will actually be compromised by a quantum computer, tells you how many years you have to start understanding and converting all your data to post-quantum cryptography, using the quantum-safe methods available at the time, even if they have not yet been standardized by NIST.
But realistically, when could RSA and DSA be compromised by a quantum computer?
Ok, this is a busy chart, but let’s first orient ourselves to the axes. The bottom x-axis shows the years, starting with 2016. The left vertical axis is the number of physical qubits. The right vertical axis refers to logical qubits. For simplicity we will assume that current qubits are not very stable and lose their information, so they need helper qubits for error correction; let’s say it will take 1,000 noisy qubits to make 1 high-quality logical qubit.
Now let’s look at the set of dots on the blue diagonal band. That is the current progress of annealing quantum computers. Even though they have many qubits, they are not universal and thus cannot break RSA encryption.
There is a smaller, darker yellow diagonal band which shows the progress of universal quantum computers. As you can see, we are at the very early stages of quantum computers; we have a long way to go. I have also shown a dash-dot blue straight line representing what happens if quantum computers follow Moore’s law. However, superconducting quantum computers have been growing at a faster rate than Moore’s law right now, and that is indicated by the finer red dotted line with the higher slope. Finally, since gate computers will likely face a scaling issue as the number of qubits increases and they must stack them, the rate might begin to slow down; this is shown with the curved blue dotted line.
In other words, the growth in qubits could range from one end of the green region to the other, depending on how quickly the technological barriers are overcome. But for now, since these machines have noisy qubits (we are currently in the Noisy Intermediate Scale Quantum, or NISQ, era, represented by the pink-gold area), error correction will be needed, and so we need to use the right-side axis with the logical qubits.
There are also newer, more stable and higher-quality qubit technologies being developed, like ion traps, nitrogen-vacancy centers and Majorana fermions; these technologies might scale better with fewer errors. In other words, there might not be a need for 1,000 physical qubits per high-quality qubit. These are shown as the dotted curve, and one would use the qubit scale on the left for these.
It is typically claimed that we would need about 1 million high-quality qubits to break RSA. The earliest this could happen is 2028, though I think that is unlikely. The earliest we would realistically reach 1 million logical qubits is 2034; I think this is more likely. On the other hand, there will be technological challenges as we scale up, and the scaling of superconducting quantum computers might slow down, while other technologies, like ion traps, might pick up in the next few years and scale much better. So I think the most conservative timeframe by which everyone reaches this point is 2048. One could therefore see RSA encryption officially broken in any of the years in the orange range, from 2034 to 2048.
The actual video presentation to Duke University can be found below:
Presentation graphics can be downloaded from below:
Feel free to connect with me and send me any questions.