Thu May 17 2018
Google’s Next-Generation TPU
The Tensor Processing Unit (TPU) is Google’s custom accelerator ASIC (application-specific integrated circuit) for machine learning (ML). It is designed to deliver high performance and power efficiency when running TensorFlow. The TPU was first announced at Google I/O 2016, when the company revealed it had already been in use inside its data centers for over a year. The second-generation TPU was announced in May 2017, and most recently, at Google I/O 2018, Google announced its third generation of silicon, the Tensor Processing Unit 3.0.
Let's look at its features:
- TPUs are the hardware components behind most of Google’s AI and machine learning capabilities, including AlphaGo.
- They can be leased to developers through Google Cloud.
- The TPU 3.0 has eight times the performance of the previous generation, with speeds reaching 100 petaflops.
- It allows developers to build larger, more accurate models and to tackle even bigger problems.
- The chips are so powerful that Google had to add liquid cooling to its data centers to cope with the additional heat. Images show a copper-plated cooling system, with water from the same pipe running through all four cooling plates.
- The system is fully operational and is being deployed across Google Compute Engine, a platform that other companies and researchers can tap for computing resources, similar to Amazon’s AWS and Microsoft Azure.
- It allows Google’s servers to perform both inference and training simultaneously.
- It lets Google make its tools and services faster, and could eventually reach a point where its AI tools are so far ahead that they lock developers and users into its ecosystem.
- Google is gradually expanding into new business segments, all of which require robust data sets and operations to learn human behavior.
- Google’s CEO highlighted work on machine learning techniques for diagnosing diabetic retinopathy, including a large, persuasive study in India. Recent work has shown that AI review of eye scans can be predictive of heart attack and stroke risk.
- Pichai said that healthcare will be dramatically transformed by AI.
- It is probably not too strong to say that Google and other hyperscalers have become prime movers in AI technology development, from algorithms and full applications to chips, hardware architecture, and data management.
What is it used for?
Google used TPUs in the AlphaGo vs. Lee Sedol series of man-machine Go games. Google has also used TPUs for Google Street View text processing, finding all the text in the Street View database in less than five days. In Google Photos, an individual TPU can process over 100 million photos a day. TPUs also power RankBrain, which Google uses to provide search results. You can even use them for your own business through Google Compute Engine.
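To give a feel for how a developer might tap these chips from Google Cloud, here is a minimal sketch of connecting TensorFlow to a Cloud TPU and training a small Keras model on it. The TPU name "my-tpu" and the toy model are assumptions for illustration only; the exact calls shown follow the TensorFlow 2 distribution API and may differ in other TensorFlow versions.

```python
import tensorflow as tf

# Resolve and initialize a Cloud TPU; "my-tpu" is a hypothetical TPU name.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model and training step across all TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # A toy classifier; real workloads would use larger models and tf.data pipelines.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# model.fit(train_dataset, epochs=5)  # train_dataset: a tf.data.Dataset of (image, label) batches
```

The point of the sketch is that, from the developer's side, the TPU behaves like any other TensorFlow device: the cluster resolver locates the leased hardware, and the distribution strategy handles replicating the work across its cores.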