Quo Quantum?

By James Kobielus

Quantum computing could become the singularity that accelerates all analytics at all scales faster than the speed of light.

As I stated a few years ago, quantum approaches promise seemingly magical, astronomically parallel, unbreakably encrypted, and superluminal computations. This isn’t science fiction. It’s science, as developed theoretically by visionaries such as Richard Feynman and, in terms of working prototypes, by researchers at a wide range of academic and commercial institutions, including IBM.

Deep Learning As Quantum Computing’s Killer App

And now quantum computing is starting to look practical and feasible as a potential convergence platform for the furthest frontiers of big-data analytics. Deep learning is chief among those frontiers, and its higher-dimensionality analyses are where current pre-quantum computing approaches are starting to show their limitations.

Crunching through high-dimensional data is an exceptionally resource-intensive task. It often consumes every last scrap of available processor, memory, disk, and I/O capacity that we can throw at it. As I discussed in this recent post, extremely high-dimensional data is the bane of deep learning. Examples of the sorts of high-dimensional objects against which deep learning algorithms are usually applied include streaming media, photographic images, aggregated environmental feeds, rich behavioral data, and geospatial intelligence. Some industry observers are touting graphics processing units (GPUs) as the ideal chipsets for deep learning. However, for all their advantages over CPUs, GPUs pale in performance next to the staggering computational acceleration promised by quantum approaches.
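To make that resource burden concrete, here is a minimal back-of-the-envelope sketch (my own illustrative numbers, not figures from the article) of how the cost of a basic analytic primitive, computing all pairwise distances in a dataset, explodes with dimensionality:

```python
# Back-of-the-envelope cost of classical distance computation on
# high-dimensional data. Illustrative assumption: ~3 floating-point ops
# (subtract, square, accumulate) per dimension per pair of vectors.
def pairwise_distance_ops(n_samples: int, n_dims: int) -> int:
    """Approximate op count to compute all pairwise Euclidean distances."""
    n_pairs = n_samples * (n_samples - 1) // 2
    return 3 * n_dims * n_pairs

# A 1-megapixel RGB image flattened into a feature vector has ~3 million
# dimensions; comparing just 10,000 such images pairwise:
ops = pairwise_distance_ops(10_000, 3_000_000)
print(f"{ops:.2e} ops")  # ~4.5e14 ops: on the order of days for a GHz-scale core
```

Even this toy workload, tiny by streaming-media or geospatial standards, lands in the hundreds of trillions of operations, which is the pattern motivating the search for fundamentally faster hardware.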

In exciting news out of China, researchers have recently demonstrated the ability to execute high-dimensionality machine-learning algorithms on quantum computers. Specifically, physicists at the University of Science and Technology of China in Hefei have used quantum entanglement to radically accelerate the classification of high-dimensional data by machine-learning algorithms. The algorithms they demonstrated are fundamental to the more sophisticated applications of deep learning and cognitive computing, which detect patterns in images, videos, Internet of Things streams, and other complex objects.

The researchers accelerated these algorithms by encoding high-dimensional data as quantum bits, or "qubits," within their prototype quantum computer. The dimensions (i.e., the properties of the objects described in the data) were encoded as vectors represented in the entangled quantum states (e.g., spin, color, angular momentum) of the photons processed in their quantum-computing fabric. Quantum computers excel at manipulating data vectors, and the researchers entangled the quantum states of the machine-learning data vectors before comparing the statistical/mathematical distance among them. This approach applies both to the vector analysis done in supervised learning (i.e., detecting patterns in fresh data against an a priori example, also known as a "reference vector") and to unsupervised learning, which lacks any such reference vector.
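The idea behind this kind of quantum distance estimation can be sketched classically. The following is a minimal simulation, not the researchers' actual procedure: each data vector is normalized so it can serve as a quantum amplitude vector, and a "swap test" circuit would then estimate the overlap between two such states via the measurement statistics of an ancillary qubit (here we compute the overlap directly rather than sampling a circuit):

```python
import numpy as np

# Classical sketch of amplitude encoding plus swap-test overlap estimation,
# the standard textbook route to quantum distance comparison. This is an
# illustration of the principle, not the Hefei team's implementation.
def amplitude_encode(v):
    """Normalize a real vector so it is a valid quantum amplitude vector."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def swap_test_overlap(u, v):
    """Squared overlap |<u|v>|^2 between the encoded states of u and v.
    A real swap test estimates this from repeated ancilla measurements."""
    return float(np.dot(amplitude_encode(u), amplitude_encode(v)) ** 2)

u, v = [1.0, 0.0, 0.0, 0.0], [1.0, 1.0, 0.0, 0.0]
p0 = 0.5 + swap_test_overlap(u, v) / 2  # swap test: P(ancilla reads 0)
print(round(p0, 3))  # 0.75, i.e., overlap |<u|v>|^2 = 0.5 for these vectors
```

The overlap in turn determines the distance between the normalized vectors, which is exactly the quantity the supervised and unsupervised algorithms above need.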

Practical Advances in Quantum Computing Fabrics

The researchers developed this approach on a small-scale optical quantum computer. Here, in their own words, is a description of how this approach to high-dimensionality machine learning might be scaled up to astonishing proportions: "To calculate the distance between two large vectors with a dimension of 10^21 (or, in the language of Big Data, we can call it 1 Zettabyte (ZB)), a GHz clock-rate classical computer will take about hundreds of thousands of years. A GHz clock-rate quantum computer, if we can build it in the future, with the exponential speed-up, will take only about a second to estimate the distance between these two vectors after they are entangled with the ancillary qubit. … We are working on controlling an increasingly large number of quantum bits for more powerful quantum machine learning. By controlling multiple degrees of freedom of a single photon, we aim to generate 6-photon, 18-qubit entanglement in the near future. Using semiconductor quantum dots, we are trying to build a solid-state platform for approximately 20-photon entanglement in about five years. With the enhanced ability in quantum control, we will perform more complicated quantum artificial intelligence tasks."
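The exponential compression the researchers invoke follows directly from amplitude encoding: a classical vector of dimension d fits into the amplitudes of only ceil(log2(d)) qubits. A two-line sketch of that arithmetic:

```python
import math

# Amplitude encoding packs a d-dimensional classical vector into the
# 2^n amplitudes of an n-qubit state, so n = ceil(log2(d)) qubits suffice.
def qubits_for_dimension(d: int) -> int:
    return math.ceil(math.log2(d))

# The 10^21-dimensional ("1 ZB") vectors from the quote above:
print(qubits_for_dimension(10**21))  # 70 qubits can hold such a state
```

That a state spanning a zettabyte-scale vector fits in roughly 70 qubits is the source of the speed-up the researchers describe, though loading classical data into such a state efficiently remains its own hard problem.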

If you think this is an isolated advance in quantum computing without any near-term practical use, consider also this recent news out of IBM. In a nutshell, researchers have developed an improved approach to detecting errors in quantum processors and have also created a prototype circuit design that can more easily be embedded in microchips and scaled up to massively parallel proportions.
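The principle behind such error detection can be illustrated with the simplest classical analogue, the three-bit repetition code: encode redundantly, then use parity ("syndrome") checks that locate an error without inspecting the protected value directly. This is a toy sketch of the general idea, not IBM's design, and real quantum codes must additionally measure parities without collapsing the quantum state and must catch phase errors as well as bit flips:

```python
# Toy classical analogue of quantum error detection: a 3-bit repetition
# code whose parity checks pinpoint a single flipped bit.
def encode(bit: int) -> list:
    """Repetition code: one logical bit -> three physical bits."""
    return [bit, bit, bit]

def syndrome(bits: list) -> tuple:
    """Two parity checks; their pattern identifies which bit flipped."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits: list) -> list:
    """Map the syndrome to the flipped position and undo the error."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    fixed = bits[:]
    if flip is not None:
        fixed[flip] ^= 1
    return fixed

codeword = encode(1)
codeword[2] ^= 1          # a single bit-flip error strikes
print(correct(codeword))  # [1, 1, 1]: the error is located and reversed
```

Scaling this style of syndrome measurement to lattices of physical qubits is precisely what makes a chip design that is "more easily embedded in microchips" significant.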

Given these and similar advances in research labs around the planet, does anybody still doubt the disruptive potential of quantum machine learning in our lifetime?
