Google designing AI processors

Global Sources | Updated on 2023/12/01


TensorFlow accelerators are used in AlphaGo.

The TPU fits on a module that plugs into a hard disk slot in a server rack.
Source: Google via EE Times

Google has developed its own accelerator chips for artificial intelligence, which it calls tensor processing units, or TPUs, after the open-source TensorFlow algorithms it released last year. The news was the big surprise saved for the end of a two-hour keynote at the search giant's recent annual Google I/O event in the heart of Silicon Valley.
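TensorFlow expresses machine-learning workloads as operations on multidimensional arrays (tensors), and the bulk of that work in neural networks is large matrix multiplication, which is the kind of computation a TPU is built to accelerate. A minimal sketch of such an operation, written with NumPy rather than TensorFlow itself (the `dense_layer` name and the array shapes here are illustrative, not from the article):

```python
import numpy as np

def dense_layer(x, weights, bias):
    """One dense neural-network layer: a matrix multiply plus a
    ReLU nonlinearity -- the workload class TPUs accelerate."""
    # x: (batch, in_features), weights: (in_features, out_features)
    return np.maximum(0.0, x @ weights + bias)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 256))   # a batch of 8 input vectors
w = rng.standard_normal((256, 128)) # layer weights
b = np.zeros(128)                   # layer bias

out = dense_layer(x, w, b)
print(out.shape)  # (8, 128)
```

In production such layers run thousands at a time over much larger matrices, which is why per-watt throughput on matrix math matters so much in a data center.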

"We have started building tensor processing units ... TPUs are an order of magnitude higher performance per watt than commercial FPGAs and GPUs, they powered the AlphaGo system," said Sundar Pichai, Google's chief executive, citing the Google computer that beat a human Go champion.

The accelerators have been running in Google's data centers for more than a year, according to a blog post by Norm Jouppi, a distinguished hardware engineer at Google. "TPUs already power many applications at Google, including RankBrain, used to improve the relevancy of search results, and Street View, to improve the accuracy and quality of our maps and navigation," he said.

The chips ride on a module that plugs into a hard drive slot on server racks. Engineers had them running just 22 days after they tested first silicon, said Jouppi, who previously helped design servers and processors at Hewlett-Packard and Digital Equipment.

To read the full article, go to EETimes.

