TPU - Has Google opened Pandora's Box in the computing world?

Last year during Google's I/O conference, Sundar Pichai revealed that Google is using specially customised compute hardware for its machine learning & AI workloads. Google calls it the "Tensor Processing Unit", or "TPU".

With the amount of data these machine learning algorithms need to process, deploying GPUs or FPGAs would have forced Google to build large datacenters with huge compute power to handle the task. Instead, Google's engineers & scientists came up with the idea of building a new compute chip that could improve cost-performance by 10X over GPUs and pack more power into the existing hardware rather than adding new datacenters. They kept the TPU's design & architecture a tightly guarded secret.

Finally, Google has released a whitepaper on the TPU which gives insights into its architecture and performance benchmarks. I'm sharing a few important points from it.

Rather than being tightly integrated with a CPU, the TPU was designed as a coprocessor on the PCIe I/O bus to reduce the chances of delaying deployment, allowing it to plug into existing servers just as a GPU does. Each TPU chip can be installed in a data center rack on a board that fits into a hard disk drive slot.
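The coprocessor model above means the host CPU stays in charge: it pushes work to the TPU over the I/O bus and the device simply executes what it is sent, much like issuing commands to a GPU. Here's a minimal toy sketch of that host-driven pattern; every name below is hypothetical and for illustration only, not Google's actual interface.

```python
from collections import deque

class PCIeCoprocessor:
    """Toy stand-in for an accelerator that only runs what the host submits."""
    def __init__(self):
        self.queue = deque()  # command queue filled by the host

    def submit(self, op, *args):
        # Host -> device: enqueue an operation over the (simulated) I/O bus.
        self.queue.append((op, args))

    def run(self):
        # Device drains its command queue and returns the results to the host.
        results = []
        while self.queue:
            op, args = self.queue.popleft()
            results.append(op(*args))
        return results

tpu = PCIeCoprocessor()
tpu.submit(lambda a, b: a * b, 6, 7)   # e.g. a multiply offloaded to the device
print(tpu.run())  # [42]
```

The point of the design is deployment speed: no host-side CPU changes are needed, so the board drops into an existing server exactly where a disk or GPU would go.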

The TPU was designed, verified, built, and deployed in datacenters in just 15 months. Google has put TPUs to work in products like Google Image Search, Google Photos and the Google Cloud Vision API, and they were one of the key factors behind Google DeepMind's victory over Lee Sedol - the first time a computer defeated a world champion in the ancient game of Go.

Google designed the chip specifically for neural networks, and it can run them 15 to 30 times faster than general-purpose chips built with similar manufacturing techniques. In tests, the TPU server delivered 17 to 34 times better total-performance/Watt than the Haswell CPU server, and 14 to 16 times the performance/Watt of the K80 GPU server.
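Those two ranges are consistent with each other: dividing the TPU-vs-Haswell ratio by the TPU-vs-K80 ratio gives the implied K80-vs-Haswell perf/Watt advantage. The figures below come from the text above; the division itself is my own back-of-the-envelope check, not a number from the whitepaper.

```python
tpu_vs_haswell = (17, 34)   # TPU total-performance/Watt range vs. Haswell CPU
tpu_vs_k80 = (14, 16)       # TPU performance/Watt range vs. K80 GPU

# Implied K80 vs. Haswell = (TPU vs. Haswell) / (TPU vs. K80)
k80_vs_haswell_low = tpu_vs_haswell[0] / tpu_vs_k80[1]    # 17/16
k80_vs_haswell_high = tpu_vs_haswell[1] / tpu_vs_k80[0]   # 34/14

print(round(k80_vs_haswell_low, 1), round(k80_vs_haswell_high, 1))  # 1.1 2.4
```

In other words, the K80 GPU itself is only roughly 1-2.5 times better than the Haswell CPU per Watt on these workloads, while the TPU leaps well past both.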

The TPU leverages its advantage in MACs (multiply-accumulate units) and on-chip memory to run short programs written using the domain-specific TensorFlow framework 15 times as fast as the K80 GPU, resulting in a performance/Watt advantage of 29 times, which correlates with performance/total cost of ownership.
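The MAC is the core operation here: acc += weight x activation. A dense neural-network layer y = Wx + b is nothing but rows of such accumulations, which is why a chip packed with MAC units wins on this workload. A minimal pure-Python sketch, with toy sizes chosen for illustration:

```python
def dense_layer(W, x, b):
    """Compute y = Wx + b one output element at a time, as a chain of MACs."""
    out = []
    for row, bias in zip(W, b):
        acc = bias
        for w, a in zip(row, x):
            acc += w * a  # one multiply-accumulate (MAC)
        out.append(acc)
    return out

W = [[1, 2], [3, 4]]   # 2x2 weight matrix
x = [5, 6]             # input activations
b = [0.5, -0.5]        # biases
print(dense_layer(W, x, b))  # [17.5, 38.5]
```

A TPU's matrix unit performs tens of thousands of these MACs in parallel every cycle, whereas a general-purpose CPU core issues only a handful - that structural difference is where the 15X speedup comes from.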

Google is not alone in this space: Nvidia & Intel are playing their cards by developing powerful processors to meet the growing demands of compute-hungry fields like machine learning, AI & deep learning.

All the major cloud service providers now offer high-performance GPU servers for heavy workloads, starting with AWS cg1.4xlarge instances, Azure N-Series servers and Google Cloud GPU instances.

But the TPU has surely opened a Pandora's Box that will wake up hardware vendors to rethink the design of compute chips for datacenter & cloud environments, to meet the growing demand for huge compute power from modern applications like machine learning, neural networks & AI.

