Researchers at the Jacobs School of Engineering at the University of California, San Diego (UCSD) have developed a neuroinspired hardware-software co-design approach that could make neural network training faster and more energy-efficient.
For example, the research could make it possible to train neural networks on low-power hardware such as smartphones, laptops, and embedded devices.
The researchers developed hardware and algorithms that allow neural network computations to be performed directly in the memory unit, eliminating the need to repeatedly shuttle data between memory and a separate processor.
The hardware component is a highly energy-efficient type of non-volatile memory that consumes 10 to 100 times less energy than conventional memory technologies.
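The in-memory computing idea can be illustrated with a short sketch. The following is a hypothetical model, not the UCSD team's actual hardware or code: it assumes a resistive non-volatile memory crossbar in which each cell stores a neural network weight as a conductance, so that applying input voltages to the rows yields column currents equal to a matrix-vector product, performing the multiply-accumulate where the weights already reside.

```python
import numpy as np

# Hypothetical illustration (not the authors' design): in a resistive
# crossbar, cell (i, j) stores weight w[i][j] as a conductance G[i][j].
# Driving row i with voltage V[i] produces, by Ohm's law and Kirchhoff's
# current law, a total current on column j of I[j] = sum_i V[i] * G[i][j].
# The matrix-vector multiply thus happens inside the memory array, with
# no weight data moved to a separate processing unit.

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # stored conductances (weights)
V = rng.uniform(0.0, 1.0, size=4)        # applied row voltages (inputs)

I = V @ G                                # analog multiply-accumulate in memory
print(I.shape)                           # one current per output column
```

In a digital system the same product would require reading every weight out of memory for each training step; computing it in place is what removes the data-shuttling cost the article describes.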
Said UCSD's Duygu Kuzum, "Overall, we can expect a gain of a hundred- to a thousand-fold in terms of energy consumption following our approach."
From Jacobs School of Engineering, University of California, San Diego
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA