The Solace of Quantum

It’s impossible to explain how quantum computing works in a short article, or even a long one, and this one isn’t even going to try.

The problem is twofold. Quantum mechanics is a famously intractable topic: almost all popular accounts resort to misleading metaphors that reduce rather than increase understanding, or at best give a comforting illusion of it. And don’t get me started on half-dead cats. But just as big a problem, perhaps surprisingly given the ubiquity of computers, is that few people understand what ‘computing’ is either.

That’s a double whammy, but it also points to a reasonable way forward. Because, self-evidently, it isn’t necessary for a business professional to understand what logic gates or finite state machines are in order to grasp most of the importance and utility of enterprise computing. So, to a point, we can take a look at quantum computing from a similar, ‘cut the crap, I’m a busy Executive’ perspective.

The qualifier - ‘to a point’ - is necessary for two reasons. Firstly, because I’m going to have to indulge in some historical context. It may not be useful to you, but I can’t help myself. And secondly, because the two technologies, classical and quantum computing, are at very different life-cycle stages. The former is very mature, with a standardised design (based on the ‘Von Neumann architecture’, dating back to 1945), standardised components (transistors, in practical use since the 1950s) and even standard, or at least familiar, applications (spreadsheets, browsers, management information systems, WIMPs, and so on). Most business professionals alive today have been rubbing up alongside this technology ecosystem for most of their careers.

Quantum computing on the other hand is at the ‘pre-transistor’ stage. 

States of the Art

The basic quantum of Quantum is dubbed the qubit. Which is more or less where consensus ends. Whereas computer bits are invariably represented by transistors - electronic switches - there’s no real agreement yet on the best way to represent qubits, and lots of different approaches are being tried: photons, electrons, atoms, quasi-particles; charge, spin, location, time. Take your pick. 
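
Whichever physical candidate you pick, the mathematical abstraction underneath is the same: where a bit is a 0 or a 1, a qubit is a pair of complex ‘amplitudes’ that only collapses to a 0 or a 1 when measured. A minimal sketch in Python (illustrative only, not any vendor’s toolkit):

```python
# A qubit as mathematics: two complex amplitudes, not a single 0 or 1.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the |0> basis state
ket1 = np.array([0, 1], dtype=complex)  # the |1> basis state

# An equal superposition: measuring yields 0 or 1 with 50% probability each.
psi = (ket0 + ket1) / np.sqrt(2)

probabilities = np.abs(psi) ** 2             # Born rule: |amplitude|^2
print(probabilities)                         # [0.5 0.5]
assert np.isclose(probabilities.sum(), 1.0)  # amplitudes stay normalised
```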

What they do have in common is that they don’t work very well. In the vernacular, they are ‘NISQ’ - noisy, intermediate-scale, quantum devices. ‘Noisy’ doesn’t mean they’re antisocial, it means they’re error prone. ‘Intermediate scale’ is a euphemism meaning they’re too small. There’s an expectation that they’ll be scaled up significantly over time, but Moore’s Law doesn’t really apply to qubits[1]. Notwithstanding the vast sums of government money, corporate investment and venture capital flooding into quantum computing research and development globally, it’s generally unclear whether and over what timescale the technology will really have a practical impact. 

That said, there’s also a kind of ideological split over the basic architecture of a quantum box which may have a practical bearing. The model of quantum computing using ‘circuits’ of quantum ‘gates’ being pursued by IBM, Google and others is sort of analogous to the logic design of a classical computer, and potentially allows for completely general applications to be developed (although, as we discuss below, in most cases this wouldn’t make sense). Others however, notably quantum pioneer D-Wave, have focused on a narrower technology dubbed ‘quantum annealing’. This doesn’t allow for general computation but is aimed at a specific type of ‘optimisation’ challenge based on the ‘travelling salesman problem’ - minimising the distance or time taken to visit a bunch of destinations. Although there are esoteric arguments about whether this is quantum computing at all, and less esoteric arguments about whether it’s cheating, it is one of the very few areas of research which may have yielded real-world, commercial application. Or at least come close.
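
To get a feel for why the travelling salesman problem is worth all this exotic hardware, here’s the brute-force version in plain Python (a sketch of the problem itself, not of how an annealer attacks it):

```python
# Brute-force travelling salesman: fine for 8 cities, hopeless for 80.
# Annealing, classical or quantum, is about escaping this factorial blow-up.
from itertools import permutations
import math
import random

random.seed(42)
n = 8
cities = [(random.random(), random.random()) for _ in range(n)]

def tour_length(order):
    # Total distance of a closed tour visiting every city exactly once.
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % n]])
               for i in range(n))

# Fix city 0 as the start to avoid counting rotations; still (n-1)! tours.
best = min(permutations(range(1, n)), key=lambda rest: tour_length((0,) + rest))
print("best tour:", (0,) + best)
print("tours checked:", math.factorial(n - 1))  # 5,040 here; 63! (~2x10^87) for n=64
```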

(Very) Early Adopters

Although details are sketchy, Canadian retail chain Save-on-Foods announced last autumn that it had successfully applied hybrid quantum algorithms (running on D-Wave technology) to complex “grocery optimisation solutions”, claiming to have reduced the computation time for a specific task from 25 hours to 2 minutes on a repeatable basis. They’re intending to go from an in-store pilot to a fully-fledged business application. Elsewhere, VW have been experimenting with traffic flow modelling, and BMW are looking at potential supply chain optimisation. Although the latter will involve trading off delivery times and procurement cost savings in complex ways, these all look like variations of the travelling salesman problem if you squint a bit.

For the time being however, sensitivity to noise means that even the annealing approach needs a lot of physical qubits to model one logical qubit, and current machines just aren’t big enough to tackle really hard problems without trading accuracy and reliability for scale.

Generally specific

Even for the more general, ‘circuits and gates’ based technology, ‘quantum supremacy’ (which sounds impressive but simply refers to a procedure that a quantum computer can perform significantly faster than a classical piece of kit) applies, even in theory, only to a very limited range of functions. At least for the foreseeable future, there is only a narrow, specific range of algorithms which would be cost-effective to run on quantum architecture.
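
The poster child is Grover’s algorithm for searching an unstructured data set: roughly N/2 classical checks on average versus roughly √N quantum queries. The arithmetic is simple enough to sketch (idealised, ignoring error correction and all the overheads of actually coaxing a fridge into doing it):

```python
# Back-of-envelope comparison: classical search of an unstructured list
# needs ~N/2 checks on average; Grover's algorithm needs ~(pi/4) * sqrt(N)
# quantum queries. A quadratic speed-up, but only worthwhile at huge N.
import math

for exponent in (6, 9, 12):
    N = 10 ** exponent
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"N = 10^{exponent}: classical ~{classical:.1e} checks, "
          f"quantum ~{grover:.1e} queries, speed-up ~{classical / grover:,.0f}x")
```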

For most purposes, a quantum computer isn’t just massively expensive and a pain to host, it’s not as effective as your laptop.

The most likely model for quantum computing is (I think) a very traditional program running on a very traditional machine. Occasionally it would subcontract a specific function, where quantum supremacy is actually important (such as searching an unstructured data set or breaking into the Pentagon), to an expensive fridge. 
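
What might that look like in practice? Something like the following, purely hypothetical, sketch - where ‘submit_to_quantum_backend’ is an invented name standing in for whichever cloud service or SDK actually brokers access to the fridge:

```python
# A hypothetical sketch of the hybrid pattern: classical code does almost
# everything, and only a genuinely hard kernel is farmed out to quantum kit.

def submit_to_quantum_backend(task, payload):
    # Invented placeholder: a real version would serialise a circuit or an
    # annealing problem, queue it remotely, and wait for the results.
    raise NotImplementedError("stand-in for a real quantum cloud call")

def plan_deliveries(orders):
    # Ordinary classical code handles the everyday cases...
    urgent = [o for o in orders if o["priority"] > 0]
    if len(urgent) <= 20:
        return sorted(urgent, key=lambda o: o["deadline"])  # laptop-sized
    # ...and only the combinatorially nasty ones justify the expensive call.
    return submit_to_quantum_backend("route_optimisation", urgent)
```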

Back to the Future

Some time in the second half of the 1980s, mini-computer and server-based local area networks, and then personal computing, began to take over from the previously dominant paradigm. Up until that point, most serious computing was done on centralised mainframe computers on a ‘time-sharing’ basis. They were eye-wateringly expensive, but dozens of users could run programs on them at (virtually) the same time, so the economics could make sense for larger organisations. IBM was the superpower in this world.

As a student in the mid-1980s I had cause to use one of these things frequently. You’d prepare a program carefully, submit it to a ‘job queue’ and get the output a while later. Rinse and repeat. It was an unwieldy procedure by current standards, but it instilled a kind of diligence[2], and it was fit for purpose.

Nowadays, anyone can get an account with IBM and run programs on the collection of quantum computers at its Thomas J. Watson Research Center. The whole process is very reminiscent of those mainframe days (albeit without the big, green-and-white, folded printouts). For the moment this model is really a research tool, and it’s fun, but it seems at least possible that there’ll come a time when it becomes a viable means of making serious quantum computing capability available, in principle, to a range of organisations that couldn’t afford, or justify, hosting their own machines - Quantum-as-a-Service.
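
For a flavour of what actually gets submitted to that queue, here’s a two-qubit ‘Bell state’ circuit in Qiskit, IBM’s open-source SDK. Building and drawing the circuit needs nothing but a laptop; running it on real hardware needs an account, a provider package, and a spell in the job queue - and the exact submission API has shifted between versions, so treat this as a sketch:

```python
# A two-qubit Bell-state circuit in Qiskit (pip install qiskit).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)        # two qubits, two classical bits
qc.h(0)                          # Hadamard: put qubit 0 into superposition
qc.cx(0, 1)                      # CNOT: entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])       # read both qubits into the classical bits

print(qc.draw())                 # ASCII circuit diagram, no hardware needed
```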

At present the characters taking a frantic interest in quantum computing are either national security organisations, worried sick about the implications of machines that can easily crack previously impenetrable encryption, or massively wealthy investment banks and hedge funds looking for a competitive advantage in program trading. Less frenetically, some major manufacturers are exploring quantum computing as a means of supercharging their supply chain management, as we’ve noted above. For most organisations however, hosting a few thousand qubits in a phenomenally expensive fridge is not going to be cost-effective. It’s possible of course that practical desktop quantum computers may be a thing one day[3], but early attempts fall a long way short of anything useful.

On the other hand, the remote, time-sharing approach may well make quantum computing available and relevant to much more modest enterprises, very few of which are likely to have this technology on their radar at the moment.

Of Carts and Horses

In the first half of the 19th century, art and science came together in the form of Ada King, Countess of Lovelace. The daughter of Lord Byron, she was steered towards mathematics by her mother in a bid to cure her of her father’s poetic insanity. In the end she would synthesise both, and write the world’s first real computer program[4]. She died tragically young, at 36, of uterine cancer.

Three things were truly extraordinary about Ada’s algorithm[5]. Firstly, no real computer yet existed on which to run the program. Secondly, the theoretical machine that had inspired Ada’s work, Babbage’s Analytical Engine, fell well short of Ada’s conception of computing potential. And thirdly, despite all of those constraints, she managed to conceive programming structures, such as conditional loops, that would become a mainstay of computing well over a hundred years later. 
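
Her program, famously, computed Bernoulli numbers[5]. A modern rendering takes a dozen lines of Python - this follows the standard recurrence rather than her operation-by-operation table, but the conditional looping is exactly the structure she anticipated:

```python
# Bernoulli numbers via the standard recurrence:
# B_n = -1/(n+1) * sum_{k=0}^{n-1} C(n+1, k) * B_k, with B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli(m):
    B = [Fraction(1)]  # B_0 = 1
    for n in range(1, m + 1):
        acc = sum(comb(n + 1, k) * B[k] for k in range(n))
        B.append(-acc / (n + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```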

The birth of the modern computer is often traced back to Alan Turing’s seminal paper On Computable Numbers, with an Application to the Entscheidungsproblem. It was 1936, and physical computers were still in the future. Turing nonetheless invented a theoretical physical machine (if you see what I mean), involving a moving tape and a head that could read and write, as the basis for his cogitations. I’d argue that what Ada Lovelace did, almost in passing, was in some ways just as impressive. Turing had to invent a fantasy machine in order to construct his theory; Lovelace had to free herself from the bounds of Babbage’s early mechanical designs in order to see the bigger picture[6]. Maybe there’s something to be said for a lyrical heritage.

I digress, but my point is this: some of the more practically significant developments in quantum computing over the last year or so may arguably have been in higher level ‘software’ - the sort of things that Ada would recognise - in advance of them having genuinely useful machines to run on[7]. These innovations may help facilitate a uniform approach across different quantum architectures, while at the same time enabling programmers to think at a more agreeable level than transmon qubits and quantum Fourier transforms[8]. Almost no Python programmers know how to write machine code, and the world would be much less efficient if they took the trouble to try. 

Conclusion/Concussion

The current quantum computing gold rush is of course seriously in danger of leading to overhyping and disappointment, at least in the short to medium term. The technical problems are immense, and the skills needed are very different to those of conventional computing, at least in a corporate environment. Success, whatever that looks like, is by no means guaranteed. Much of the current effort is ultimately driven by fear of missing out.

At the risk of oversimplifying though, for those classes of problems that involve exhausting myriad complex possible solutions, there is at least the potential for a (*cough*) quantum leap. And QaaS may well make such applications available to organisations that would otherwise find them inaccessible. Hedge funds and investment banks may be hunting intensively for decisive competitive advantage; for the rest of us, keeping an eye out for the possibility of a step change in operations or logistics may make the scanning effort worthwhile. Watch this space, but don’t be driven to distraction. That’s my job.

*****

[1] In the sense that adding extra qubits to a machine is very hard, and currently gets harder with scale. Although I guess you could argue that, because each new qubit doubles the theoretical volume of information, Moore’s Law sneaks in through the back door.

[2] The reason being that it was extremely inefficient to just run a job to see what the errors would be, correct it on the fly and re-run it, as can be done in real time nowadays. Not least because the consequence of an error in the code would usually be the dreaded ‘core dump’ print-out: a massive and fiendishly indecipherable pile of hieroglyphics which, apart from anything else, would advertise your humiliating failure to anyone watching you cart it back to your desk, and which would make your heart sink whenever you went to pick up your output and saw something inches thick waiting for you instead of a single sliver of paper.

[3] Given that unwieldy quantum mainframes still don’t work very well, this may be a long way off, but SpinQ, a Chinese start-up, is claiming to have a 2-qubit desktop quantum computer for sale at $5,000. It’s bullshit, but there’s some very clever engineering involved in any case.

[4] There are those who would dispute this claim. But they’re wrong.

[5] The algorithm calculated the Bernoulli sequence.

[6] In fairness, Alonzo Church, my real unsung hero of computer science, developed a universal model of computation with no such quasi-physical foundation. Turing himself showed the two formulations to be equivalent and acknowledged that Church had beaten him to it.

[7] Having once owned a Sinclair ZX81, I’m very familiar with this concept.

[8] Examples include the t|ket> compiler from Cambridge Quantum Computing, Riverlane’s Deltaflow operating system and Origin’s Pilot operating system.

Comments

As someone who was not familiar with quantum computing, I found this an enlightening article and appreciated learning about these developments. It was informative but also entertaining, which is no small feat when talking quantum! Thanks for sharing!

Adrian Nixon: Chris and Ravi - so, at the risk of being thick:
- Much of the effort in quantum computing is creating rather expensive fridges
- Meanwhile the real leaps forward are being made in the higher level software
- And the two have yet to be brought together effectively, because both are a work in progress?

Ravi Sundaram: Chris Bentley, as a fellow quantum enthusiast (both the physics and its exploitation), I thoroughly enjoyed reading your points of view. At the risk of splitting hairs :-) I would contend that we are past the transistor stage, but definitely not past the large-scale integration stage. Individual qubits are sorted, and integration of circuits with single-digit qubit counts is being demonstrated well. The challenge for the next decade starts here: intermediate-scale integration needs to be sorted while living with the errors (NISQ), or ways found to do it without errors (the race for the perfect qubit platform, where contenders range from ions to electron spins to photons). Topologically protected Majorana states showed some real promise but seem to have hit a wall: https://www.google.co.uk/amp/s/www.wired.com/story/microsoft-win-quantum-computing-error/amp Really enjoyed learning more about the impact of Ada Lovelace and Turing and their different styles of thought. There is something to be said for unencumbered thought in solving problems. Thanks for this, and keep ’em coming!
