Why is the leap to quantum not happening?
Image: Jasper AI/Newsweek


Quantum computing, with Schrödinger's cat at its very core, is still both very much alive and dead.

In 2019, Google’s most advanced computer at the time achieved “quantum supremacy” (more on that later), performing a task in 200 seconds that, it said, would take the world’s best non-quantum supercomputer 10,000 years to complete.

It was a “key milestone,” a “landmark” achievement, and a “hello world” moment, making headlines the world over. For the first time, a quantum computer was doing something no other computer could do. The task was *incredibly* specific, but it was the first clear proof of concept that this was the future, as most technologists predict.

And it was clear proof… until it wasn’t. Last August, researchers in China provided evidence that this “10,000 years” task could be performed in a matter of seconds without quantum.

Does this mean that the hunt for quantum was a wild goose chase all along? Given the billions of dollars already being spent, does it have to “succeed”? And what does “success” look like in a new computing world that, by its very nature, cannot be defined?

With all the mainstream heads turned to AI, quantum is likely the technology required to make AI more intelligent, just without the headlines. In the shortest definition possible (a simple quantum explanation is here if needed), the “superposition” of quantum bits (qubits) means processing power increases exponentially as more qubits are added. In simple terms, think of normal computing as the conscious mind and quantum as the subconscious one. In normal computing, everything has a definition. In quantum, objects are only defined once they are observed. Quantum supremacy is the moment when a quantum computer can do something that is impossible for a classical one to achieve.

“The Google experiment did what it was meant to do: start this race,” University of Maryland quantum scientist Dominik Hangleiter told Science.

The debate is whether nature can truly be represented by 0s and 1s, the binary code all classical computers have used to this point, or whether it requires something more complicated, where a qubit has many states between simply “on” or “off.”
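That exponential gap can be seen in a few lines of code. This is only an illustrative sketch (it simulates amplitudes on a classical machine, which is exactly what becomes infeasible at scale): a classical n-bit register sits in exactly one of its 2^n states, while merely *describing* an n-qubit register classically takes an amplitude for every one of those 2^n states.

```python
import math

n = 3
dim = 2 ** n  # a classical 3-bit register is in exactly ONE of these 8 states

# Describing 3 qubits classically takes an amplitude for EVERY basis state.
# Here: a uniform superposition, equal amplitude on all 8 states.
amp = 1 / math.sqrt(dim)
state = [amp] * dim

print(len(state))                     # 8 amplitudes for just 3 qubits
print(sum(a * a for a in state))      # measurement probabilities sum to ~1.0
```

At n = 3 this is trivial; at n = 433 (IBM's Osprey, discussed below in the source) the list would need more entries than there are atoms in the known universe, which is why qubits cannot simply be simulated away.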

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical,” theoretical physics pioneer Richard Feynman said in the early 1980s to kick this whole thing off.

And this thing has really kicked off. Even if we haven’t really had the “hello world” moment, the market for cloud quantum computing is expected to grow from around $800 million to $4.1 billion by 2028, according to Research and Markets; the U.K. published its National Quantum Strategy in March, setting out a ten-year vision and £2.5 billion ($3.1 billion) of funding; and the U.S. is spending around $850 million a year on the National Quantum Initiative.


IBM announced this week it was opening a new quantum computing data center in Germany, its first foray into Europe. “Our goal is to bring useful quantum computing to the world,” IBM VP of Quantum Jay Gambetta said in a press briefing. “To me, that means we have to bring access.” This is a hint that we’re at the point where quantum is actually becoming useful. And U.S. officials agree:

“The development of quantum information science and technology [QIST] is critical to U.S. economic and national security,” the National Quantum Initiative Advisory Committee (NQIAC) said. “Key scientific, engineering, and systems integration challenges remain and must be solved for the United States to realize the full economic impacts and benefits of QIST.”

It’s a world of infinite possibilities and near-infinite acronyms.

There are clear parallels with how the AI race is shaping up. China, after being slow on the uptake with quantum, is now third in the number of quantum start-ups (after the U.S. and the U.K.), and the sector is increasingly seen as a clear battle for dominance.

The rhetoric is grand and confusing, to say the least.

“The number of classical bits that would be necessary to represent a state on the IBM [quantum] Osprey processor far exceeds the total number of atoms in the known universe,” IBM said in its launch of a 433-qubit processor. IBM wants its quantum system to have “4,000+ qubits by 2025.”
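IBM's "atoms in the universe" line sounds like marketing, but the arithmetic checks out. A back-of-envelope sketch, using the common order-of-magnitude estimate of ~10^80 atoms in the observable universe (that figure is an assumption, not from the article):

```python
# A 433-qubit state is described classically by 2**433 complex amplitudes.
states = 2 ** 433
atoms_in_universe = 10 ** 80  # common order-of-magnitude estimate (assumed)

print(len(str(states)))             # 2**433 has 131 decimal digits (~10**130)
print(states > atoms_in_universe)   # True, by roughly 50 orders of magnitude
```

So even assigning every atom in the universe to store one amplitude would fall absurdly short, which is the substance behind the quote.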

The numbers are all huge and lacking context, but a 4,000-qubit machine is, it is said, the point at which current banking encryption could be cracked in seconds. Such a computer would be at least 300 trillion times as powerful as a current classical machine. But, as above, these numbers are all theoretical because they haven’t happened yet. And so much of what has been promised for decades with quantum hasn’t arrived.

“You’ll probably never have a quantum chip in your laptop or smartphone,” Amit Katwala wrote in Wired. “There’s not going to be an iPhone Q. Quantum computers have been theorized about for decades, but the reason it’s taken so long for them to arrive is that they’re incredibly sensitive to interference.”

This interference, as with most things in quantum, is complicated.

“Error correction is especially important in quantum computers because efficient quantum algorithms make use of large-scale quantum interference, which is fragile, i.e. sensitive to imprecision in the computer and to unwanted coupling between the computer and the rest of the world,” Andrew M. Steane, of the Centre for Quantum Computation at the University of Oxford, said.

“This makes large-scale quantum computation so difficult as to be practically impossible unless error correction methods are used.”

Some experts contend that it will take a machine of a million qubits or more to deliver “useful” computation with the necessary error correction without disrupting quantum states.
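The quantum codes themselves are far beyond a short sketch (you cannot simply copy a qubit), but the cost structure Steane describes has a classical analogue: spend several physical bits to protect one logical bit, then recover it by majority vote. A minimal repetition-code sketch, purely as an analogy:

```python
import random

# Classical analogy for error-correction overhead: encode one logical bit
# as three physical bits, pass them through a noisy channel that flips
# each bit with 10% probability, then decode by majority vote.
def encode(bit, copies=3):
    return [bit] * copies

def noisy_channel(bits, flip_prob=0.1):
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    return 1 if sum(bits) > len(bits) // 2 else 0

random.seed(0)
trials = 10_000
raw_errors = sum(noisy_channel([1])[0] != 1 for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))

print(raw_errors / trials)    # ~0.10 error rate, unprotected
print(coded_errors / trials)  # ~0.03: better, at 3x the physical cost
```

Three physical bits per logical bit cuts the error rate from ~10% to ~3% (the theory value is 3p²(1−p) + p³ ≈ 0.028). Scale that overhead up to the far harsher demands of fault-tolerant quantum computing and the "million qubits for a useful machine" estimates stop sounding outlandish.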

“The holy grail of quantum computing will continue to be building a machine capable of fault tolerance,” Richard Murray, co-founder and CEO of ORCA Computing and chair and director of UKQuantum, said. “This will continue to be something that is achieved far beyond 2023… the future corporate users of quantum computing will largely see it as too far off the time horizon to care much. The exception will be government and anyone with a significant, long-term interest in cryptography.

“There is also an outside chance that next year will be the year when quantum rules out the possibility of short-term applications for good and doubles down on the seven to 10-year journey towards large-scale fault-tolerant systems.”

It’s unlikely that the venture capitalists who have poured money into quantum technology are willing to wait 10 years (or longer) for returns on their investment. But classical computing’s timeline offers useful perspective: electronic computing was demonstrated in the 1940s and in widespread use by the 1960s, but only grew into mainstream practical applications in the 1980s and ’90s.

The first experimental quantum algorithm was demonstrated in 1998. That means, relatively speaking, quantum is at about the stage of the electronic calculator.

It’s just that this time we have all the “classical” computers hooked up to the internet, letting us hear from all the experts “calculating” exactly when the quantum moment might arrive. But if quantum has taught us anything, it’s that the second a prediction is observed, it becomes invalid. So we definitely won’t be making one.

The questions we still don’t have answers to after researching this article (all musings/questions/opinions welcome to [email protected] if you think you know the answers):

  1. What uses will quantum entanglement have? This is the idea that two entangled particles could act identically at the same time, millions of miles apart. It hurts the mind even more than “simple” quantum theory.
  2. What will the first mainstream applications be? Nobody I’ve spoken to can distill clearly how a “normal” person might use quantum or what it would be used for. It’s likely to click into place at some point, but not yet.
  3. Is graphene a missing link here? I first wrote about the “miracle material” 12 years ago, and it’s struggling with the same “early buzz, limited applications” problem as quantum. Could it be used here to make quantum more practical?

