TensorFlow accelerators are used in AlphaGo.

The TPU fits on a module that plugs into a hard disk slot in a server rack.
Source: Google via EE Times
Google has developed its own accelerator chips for artificial intelligence, which it calls tensor processing units (TPUs) after the open-source TensorFlow algorithms it released last year. The news was the big surprise saved for the end of a two-hour keynote at the search giant's recent annual Google I/O event in the heart of Silicon Valley.
"We have started building tensor processing units ... TPUs are an order of magnitude higher performance per watt than commercial FPGAs and GPUs, they powered the AlphaGo system," said Sundar Pichai, Google's chief executive, citing the Google computer that beat a human Go champion.
The accelerators have been running in Google's data centers for more than a year, according to a blog post by Norm Jouppi, a distinguished hardware engineer at Google. "TPUs already power many applications at Google, including RankBrain, used to improve the relevancy of search results and Street View, to improve the accuracy and quality of our maps and navigation," he said.
The chips ride on a module that plugs into a hard drive slot on server racks. Engineers had them running just 22 days after they tested first silicon, said Jouppi, who previously helped design servers and processors at Hewlett-Packard and Digital Equipment.
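To see the kind of workload at stake, here is a minimal sketch in plain Python (not TPU or TensorFlow code, and the function name is illustrative): the core operation a TPU accelerates is dense tensor math such as matrix multiplication, the building block of the neural-network layers TensorFlow expresses.

```python
# Illustrative only: a naive matrix multiply, the operation that
# dominates neural-network inference workloads like those TPUs serve.

def matmul(a, b):
    """Multiply an m x k matrix by a k x n matrix (nested lists)."""
    k = len(b)        # inner dimension shared by a's rows and b's columns
    n = len(b[0])     # number of output columns
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(len(a))]

# A single dense layer: a 1x3 activation vector times a 3x2 weight matrix.
activations = [[1, 2, 3]]
weights = [[1, 2],
           [3, 4],
           [5, 6]]
print(matmul(activations, weights))  # → [[22, 28]]
```

A production accelerator performs this multiply-accumulate pattern in hardware across much larger matrices, which is why a chip specialized for it can beat general-purpose FPGAs and GPUs on performance per watt.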