The future of AI needs hardware accelerators based on analog memory devices


https://phys.org/news/2018-06-future-ai-hardware-based-analog.html

DNNs must get larger and faster, both in the cloud and at the edge, and this means energy efficiency must improve dramatically. While better GPUs or other digital accelerators can help to some extent, such systems unavoidably spend a lot of time and energy moving data between memory and processing units. We can improve both speed and energy efficiency by performing AI calculations in the analog domain, right at the location of the data, but this only makes sense if the resulting neural networks are just as accurate as those implemented with conventional digital hardware.
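To make the "compute at the location of the data" idea concrete, the sketch below simulates the core operation of an analog crossbar array: weights stored as device conductances, inputs applied as voltages, and each output formed by currents summing along a column, so the multiply-accumulate happens in place. The noise model and sizes are illustrative assumptions, not taken from the article.

```python
# Minimal sketch (not from the article): a matrix-vector product performed
# "in memory", as in an analog crossbar array. Weights sit in the array as
# conductances G; the input vector is applied as voltages V; each output is
# the column-wise sum of G*V (Ohm's law plus Kirchhoff's current law), so no
# weight data is moved to a separate digital ALU.
import numpy as np

rng = np.random.default_rng(0)

def analog_mvm(weights, x, noise_std=0.02):
    """Simulate an analog in-memory matrix-vector product with device noise.

    The additive Gaussian noise on each effective conductance is an assumed,
    simplified stand-in for real device non-idealities.
    """
    noisy_weights = weights + rng.normal(0.0, noise_std, size=weights.shape)
    return noisy_weights @ x

weights = rng.standard_normal((4, 8)) * 0.1   # programmed once into the array
x = rng.standard_normal(8)                    # applied as input voltages

print("exact :", np.round(weights @ x, 4))
print("analog:", np.round(analog_mvm(weights, x), 4))
```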

Analog techniques, which use continuously variable signals rather than binary 0s and 1s, have inherent limits on their precision, which is why modern computers are generally digital. However, AI researchers have begun to realize that their DNN models still work well even when digital precision is reduced to levels that would be far too low for almost any other computing application. Thus, for DNNs, analog computation could plausibly work as well.
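The claim that DNNs tolerate reduced precision can be seen in a small experiment. The sketch below, with an illustrative random layer rather than a trained model, uniformly quantizes a weight matrix to a given bit width and compares the layer output against the full-precision result; the quantization scheme and sizes are assumptions for illustration.

```python
# Minimal sketch (not from the article): how a layer's output degrades as
# weight precision drops. Weights are uniformly quantized to `bits` bits and
# the quantized layer output is compared with the full-precision output.
import numpy as np

rng = np.random.default_rng(1)

def quantize(w, bits):
    """Uniform symmetric quantization of a weight tensor to `bits` bits."""
    levels = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(w)) / levels
    return np.round(w / scale) * scale

w = rng.standard_normal((256, 256)) * 0.05
x = rng.standard_normal(256)
reference = np.maximum(w @ x, 0.0)            # full-precision ReLU layer

for bits in (8, 4, 2):
    out = np.maximum(quantize(w, bits) @ x, 0.0)
    rel_err = np.linalg.norm(out - reference) / np.linalg.norm(reference)
    print(f"{bits}-bit weights: relative output error {rel_err:.3f}")
```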
