Engineers at Stanford have created a tiny AI chip for edge devices

Despite being the big bad of sci-fi for the longest time, artificial intelligence is now a part of our everyday lives. Whether you’re using a navigation app to find the fastest way to get to work or taking a photo, AI is pervasive.

However, AI deployment does have its limits, particularly when it comes to energy efficiency. This restricts where AI can run, especially at the edge, where power budgets need to be taken into account.

Engineers at Stanford University have developed a novel AI chip that improves the energy efficiency of the technology.

The chip started with the idea of eliminating the energy consumed when data moves between storage and the compute unit. To do this, the engineers created a resistive random-access memory (RRAM) chip that does the AI processing within its memory. This approach is known as compute-in-memory, and the engineers have dubbed their chip NeuRRAM.
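To get a rough sense of what compute-in-memory means, the sketch below is an illustrative Python model, not a description of NeuRRAM's actual circuitry. It contrasts the conventional approach, where weights are fetched from memory and shipped to a processor, with the RRAM crossbar idea, where each weight is stored as a conductance and the multiply-accumulate happens in place.

```python
# Illustrative sketch only: a simplified numerical model of compute-in-memory,
# not NeuRRAM's actual design.
import numpy as np

rng = np.random.default_rng(0)

# A small neural-network layer: 64 inputs, 16 outputs.
weights = rng.normal(size=(16, 64))   # trained weights, held in memory
inputs = rng.normal(size=64)          # one input vector (e.g. sensor readings)

# Conventional approach: weights are read out of a separate memory and moved
# to a compute unit that performs the multiply-accumulate. The data movement
# itself is where much of the energy goes.
def conventional_layer(w, x):
    fetched = np.copy(w)              # stands in for shuttling weights over a bus
    return fetched @ x

# Compute-in-memory (RRAM crossbar) idea: each weight is stored as a
# conductance G at a row/column crosspoint. Applying the input as row
# voltages V makes each cell pass a current I = G * V (Ohm's law), and the
# currents on each column add up automatically (Kirchhoff's current law),
# so the matrix-vector product appears as column currents with no weight
# movement at all.
def in_memory_layer(conductances, voltages):
    cell_currents = conductances * voltages   # Ohm's law at every crosspoint
    return cell_currents.sum(axis=1)          # currents summing on each column

print(np.allclose(conventional_layer(weights, inputs),
                  in_memory_layer(weights, inputs)))  # True: same maths, different physics
```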

“Having those calculations done on the chip instead of sending information to and from the cloud could enable faster, more secure, cheaper, and more scalable AI going into the future, and give more people access to AI power,” said H.-S. Philip Wong, the Willard R. and Inez Kerr Bell Professor in the School of Engineering.

The engineers store the chip's AI models on non-volatile RRAM. This type of memory is said to be well suited to AI applications, although further testing and real-world implementation still need to be explored.

Importantly, this memory uses very little power, thanks to the close proximity between the data and the circuitry that computes on it. The engineers at Stanford say that their NeuRRAM chip does more work with limited battery power than current AI chips can.

“To show the accuracy of NeuRRAM’s AI abilities, the team tested how it functioned on different tasks. They found that it’s 99% accurate in letter recognition from the MNIST dataset, 85.7% accurate on image classification from the CIFAR-10 dataset, 84.7% accurate on Google speech command recognition and showed a 70% reduction in image-reconstruction error on a Bayesian image recovery task,” Stanford University wrote in a blog post.

While NeuRRAM is still only a physical proof-of-concept, its potential is exciting. Small, low-power AI chips could, for example, be deployed in crop fields to monitor soil conditions in real-time and make adjustments to irrigation.

“By having these kinds of smart electronics that can be placed almost anywhere, you can monitor the changing world and be part of the solution. These chips could be used to solve all kinds of problems from climate change to food security,” said Wong.

You can read more about NeuRRAM in the article, A compute-in-memory chip based on resistive random-access memory, which was published in Nature last week.

[Image – CC 0 Pixabay]
