GPUs: Re-inventing data visualization
Gordon Moore predicted, over fifty years ago, that the number of transistors that fit on a silicon chip would double roughly every two years as technology advances. Amazingly, he was largely right; in the same 1965 paper he even anticipated home computers, portable communications devices, and automatic controls for automobiles. In today’s world of ever-increasing computing power, his law is still the curve that processing unit designers measure success against. The amount of data worldwide is also growing quickly, roughly doubling every three years, and as it grows at this accelerated pace, the limits of CPU-based systems are being reached.
This is where the power of graphics processing units comes into play. Joe Eaton, tech lead at NVIDIA, led a presentation I attended on how GPUs are re-inventing data visualization. The event also had speakers from MapD, a company that builds database and visualization applications that take advantage of the parallel processing power of GPUs. Their message was clear: GPUs dramatically outperform CPUs on compute-intensive data workloads.
GPU acceleration delivers a huge speed-up in computing power, and it is used to drive deep learning, analytics, and engineering applications around the world. GPU-accelerated computing makes applications run faster by offloading compute-intensive operations to the GPU while letting the CPU handle the rest. GPUs have thousands of cores in a massively parallel architecture designed to handle many simultaneous tasks efficiently. It’s no wonder cryptocurrency mining rigs are decked out with GPUs!
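A minimal CUDA sketch of that offload pattern: the CPU sets up the data and launches a kernel, and thousands of GPU threads each process one element in parallel. The SAXPY computation and all names here are illustrative, not anything shown at the event.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                 // one million elements
    size_t bytes = n * sizeof(float);

    // Host (CPU) buffers
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device (GPU) buffers: the compute-intensive work is offloaded here
    float *dx, *dy;
    cudaMalloc(&dx, bytes); cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements;
    // the CPU is free to do other work while the kernel runs.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // Copy the result back and clean up
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);          // 2*1 + 2 = 4
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}
```

The one kernel launch replaces a million-iteration CPU loop: each of the million threads does a trivial amount of work, but they run simultaneously across the GPU's cores.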
Companies like MapD take advantage of GPUs by building products with powerful back-end, in-memory rendering capabilities. They also develop front-end tools that make building large-scale data visualizations easier. A MapD demo showed off map visualizations that updated almost immediately after each click. PNG images rendered on the back end made the various map plots on screen appear to change quickly and fluidly. But the impressive part was that the visualizations showed data pulled from billions of database rows, in fractions of a second! (Yes, I did say billions.) They made it look easy.
Real-time calculations of this scale, common in data-heavy UI applications, used to take hours; on today's GPUs they take milliseconds. Large-scale graph analytics are processed far faster with the help of back-end GPU rendering, and highly complex graphs with massive numbers of nodes and edges have become practical to render in real time. This year, all major cloud vendors launched support for GPU instances, which means your ultra-fast, powerful data visualization application can now become a reality.