Google is making a fast specialized TPU chip for edge devices and a suite of services to support it

Jul 25, 2018, 4:39pm UTC

https://techcrunch.com/2018/07/25/google-is-making-a-fast-specialized-tpu-chip-for-edge-devices-and-a-suite-of-services-to-support-it/

In a substantial move toward owning the entire AI stack, Google today announced that it will be rolling out a version of its Tensor Processing Unit — a custom chip optimized for its machine learning framework, TensorFlow — tailored for inference on edge devices.

That’s a bit of a word salad to unpack, but here’s the end result: Google wants a complete suite of customized hardware for developers building machine learning products, such as image or speech recognition, that it owns from the device all the way through to the server. The Cloud TPU (the third version of which will soon roll out) handles training models for various machine learning tasks; a specialized edge chip, running a lighter version of TensorFlow that consumes less power, then runs inference against the trained model.

By splitting training and inference onto two different sets of hardware, Google can dramatically reduce the footprint required in the device that actually captures the data. The result is faster processing, lower power consumption and, perhaps most importantly, a dramatically smaller surface area for the actual chip.
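The article doesn't include code, but the cloud-training/edge-inference split it describes maps onto TensorFlow's Lite workflow. Below is a minimal sketch — the model path, input shape, and calibration data are hypothetical placeholders, not anything named in the article — of converting a cloud-trained model into a quantized, lighter-weight model and running it with the on-device interpreter:

```python
import numpy as np
import tensorflow as tf

# Convert a cloud-trained SavedModel into a quantized TensorFlow Lite
# model suitable for low-power edge inference.
# "exported_model/" is a hypothetical path to a trained SavedModel.
converter = tf.lite.TFLiteConverter.from_saved_model("exported_model/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset():
    # Feed a handful of sample inputs so the converter can calibrate
    # integer quantization ranges (input shape assumed: 224x224 RGB).
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter.representative_dataset = representative_dataset
# Restrict the model to 8-bit integer ops, the kind of reduced-precision
# arithmetic that makes inference cheap on a small chip.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
tflite_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)

# On the device, inference uses only the lightweight interpreter,
# not the full TensorFlow runtime.
interpreter = tf.lite.Interpreter(model_path="model_quant.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Inputs/outputs stay float32 by default; set
# converter.inference_input_type = tf.int8 for fully integer I/O.
sample = np.random.rand(1, 224, 224, 3).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```

The design choice the article highlights falls out of this split: the heavy converter and training machinery live in the cloud, while the device ships only the small quantized model and interpreter.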
