Quantum Computing breakthroughs? Er... let's not get too ahead of ourselves just yet...
Inside an IBM Quantum System One (Picture: IBM)

Quantum computing is back in the hype cycle with Microsoft, Amazon, Google, and even Honeywell all recently announcing new experimental chips that promise the next big step towards quantum tech.

Take Microsoft’s Majorana chip, which the company says will bring faster, more stable quantum systems that could crack industrial-scale problems in years, not decades.

The tech press has already called it a game-changer, but it’s really just another chapter in the long, messy road to making quantum computing actually useful. And here’s the reality: despite the grand promises, quantum computing still hasn’t overcome four fundamental problems.


1 - First things first: Quantum’s VHS vs. Betamax moment

Right now, quantum computing is in its format-war phase – the big tech companies are all betting on different approaches, each hoping theirs will be the one that sticks.

It’s the early semiconductor industry all over again. Once upon a time, people debated whether carbon or silicon would be the better material for making chips (indeed, carbon nanotubes might yet reopen that debate). Now, we’re watching the same battle unfold with “cat qubits” and Majorana-based “topological qubits”.

When I was previously involved with the University of Bristol, helping academics commercialise their quantum research, the favoured approach was photonics. Down at the University of Sussex, the flavour of the day is trapped ions. Each of these processes is grounded in different materials and very different techniques for achieving the quantum effect. It's as if you need a plumber and an electrician to build a house – but neither is much use if you've got a hole in the roof.

In other words, we’re facing a Betamax vs VHS moment. Each tech giant and academic centre thinks its way is best, but we might need to answer the other three problems that follow before we can decide which approach wins:


2 - The qubit problem: Are we chasing a mirage?

For years, quantum computing has been locked in a race for vanity metrics. Much as OpenAI threw as much training data at ChatGPT as possible and boasted about its 'parameter count' in a bid to reach artificial general intelligence, tech firms are hooked on raising the qubit count of their quantum systems.

But here’s the issue: we don’t even know how many qubits are actually needed to solve useful problems.

Classical computing had a clear metric for performance. Since all 'Turing-complete' computers (essentially, computers that can calculate the answer to any solvable problem given sufficient time) are equivalent, the quest for faster processors or better architectures had clear ROI: either the chip got to the answer faster, or it used less electricity in doing so, or both.

With quantum, we’re guessing. No one can say for sure whether 1,000 or 1,000,000 qubits will be the tipping point. It’s like buying furniture for a house without knowing the size of the rooms – you only figure out if it fits once you’ve tried to force it through the door.

The industry is chasing raw qubit numbers blindly, with no real understanding of when we’ll reach something commercially viable. We all believe we'll get there (unlike with AGI, where opinion is far more split on whether it's achievable); we just have no way of knowing what the destination is until we've reached it.


3 - The error correction nightmare

And that leads to our next problem. Even if we hit the magic qubit number, quantum computing still doesn’t work at scale, because error correction remains a disaster.

In classical computing, error correction was solved decades ago. An extra 'bit' was often added to data being transmitted or computed in order to verify that no error had been introduced.
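To make the classical parity-bit idea concrete, here is a minimal sketch (illustrative only – real systems layer richer schemes such as CRCs and Hamming codes on top):

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if no single-bit error is detected."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
assert check_parity(word)         # arrives intact: check passes
word[2] ^= 1                      # a single bit flips in transit
assert not check_parity(word)     # the flipped bit is detected
```

A single parity bit catches any odd number of flipped bits but can't correct them – and two flips cancel out, which is why practical systems use stronger codes.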

In quantum? The more qubits you add, the less reliable the system gets.
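A back-of-the-envelope illustration of why: if each qubit independently comes through a computation error-free with probability 1 − p, the chance that all n qubits do so decays exponentially with n. The error rate used here (p = 0.001) is purely illustrative, not a figure for any real device:

```python
p = 0.001  # illustrative per-qubit error rate, not a real device spec

def survival(n, p=p):
    """Probability that all n qubits come through error-free,
    assuming independent errors."""
    return (1 - p) ** n

print(survival(10))    # ~0.99: a small machine mostly works
print(survival(1000))  # ~0.37: a large machine fails most runs
```

This simple independent-error model is exactly what error correction is meant to beat – and why raw qubit counts mean little without it.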

Google and Microsoft have both claimed breakthroughs, but these are just lab experiments. Until we see a real, scalable solution, none of this actually matters.

A 1,000-qubit machine with no error correction is about as useful as a 10-qubit machine with no error correction – which is to say, not useful at all. My favourite analogy here is a very fast airplane engine with very poor fuel economy: if the weight of all that fuel means the plane can't fly, well, then the plane can't fly!


4 - The big question: What’s it actually for?

Let’s assume we agree on what we’re going to make it out of and how. Let’s assume we solve the qubit problem. Let’s assume we solve error correction.

Then what?

For now, quantum computing doesn’t have a killer application. The go-to example is always cracking cryptography, but beyond that, the commercial applications are still unclear.

Part of the problem is that the math simply isn’t ready yet. In quantum, we’re building hardware without fully understanding what it’s meant to solve. That creates a paradox:

  • No hardware = no progress in quantum-specific math.
  • No math = no clear use cases to justify better hardware.

Right now, we sit at the bottom of an innovation pyramid. We have quantum-inspired technology – from MRI scanners to atomic clocks – and increasingly quantum metrology, with use cases like infection detection in healthcare. But it’s the top of this pyramid that remains a mystery. True quantum computing, which has the potential to be game-changing, is still mostly theoretical.

When I was at Deutsche Bank, I authored a strategy for the bank on getting ahead of this paradox, and I know that many other banks have since done likewise. Similarly, people in oil and gas as well as pharmaceuticals have embarked on their own quests to figure out the utility of quantum computers. But it's all speculative: without the applications for the mathematics, we can't figure out the potential return on investment.

All this takes a leap of faith. A quantum leap.

Microsoft’s Majorana, Amazon’s Ocelot, and others might help push us a little higher in this pyramid, but they don’t change the fact that we don’t yet know what quantum computing is meant to revolutionise.

My father was once reluctant to spend £300 on buying me my first computer (an Amstrad CPC-464), as he didn't really know what I'd do with it (other than play games). While I didn't understand him then, I can totally understand why a CEO might be reluctant to spend $15m on a D-Wave 2000Q, as they don't really know what their team would do with it (other than play 'innovation' games).

And until we figure out error correction, practical applications, and whether we’re even measuring return on investment properly, quantum computing is still just turning money into more science, and not turning science back into money.

But if you think this is the moment everything changes, I’ve got a V2000 to sell you (anyone remember those?).


