Michael Grothaus reports in Fast Company:
In computing, a neural network is a collection of processors that mimic the way the human brain works. Simply put, a neural network analyzes data, learns about it, and then decides how to act upon that data. It is because of neural networks, which don't rely on standard rule-based programming, that we have software that can recognize and distinguish between different human voices or that can recognize individual objects. Both Google Now and Apple's Siri are consumer services built on neural networks.
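To make that idea concrete, here is a minimal, purely illustrative sketch in Python, nothing like the systems behind Siri or Google Now: a tiny network learns to separate two classes of toy input patterns from examples alone, with no hand-written rules. The data, layer sizes, and learning rate are all arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points from two clusters, standing in for two kinds of input
# the network must learn to tell apart.
class_a = rng.normal(loc=[-1.0, -1.0], scale=0.3, size=(50, 2))
class_b = rng.normal(loc=[+1.0, +1.0], scale=0.3, size=(50, 2))
x = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

# One hidden layer of 8 "neurons", trained by plain gradient descent.
w1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros(8)
w2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    h = np.tanh(x @ w1 + b1)          # hidden activations
    p = sigmoid(h @ w2 + b2).ravel()  # predicted probability of class 1
    grad_out = (p - y)[:, None] / len(x)
    w2 -= 0.5 * h.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0)
    grad_h = (grad_out @ w2.T) * (1 - h ** 2)
    w1 -= 0.5 * x.T @ grad_h
    b1 -= 0.5 * grad_h.sum(axis=0)

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")  # close to 1.00 on this toy problem
```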
The problem with neural networks is that they take a huge amount of processing power, typically in the form of power-hungry GPUs. Because of this, neural network tasks have been relegated to distant servers plugged into a wall somewhere rather than handled on our mobile devices. The only way our mobile devices can currently access the power of these neural networks is by sending data from our smartphones over the Internet to a company's neural network servers, which crunch the data and send an answer back over the Internet to our phones. This is the precise reason why Google Now or Siri won't work without an Internet connection, and why they sometimes have to "think" a long time to get you an answer.
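The sketch below is a hedged simulation of that round trip, not real code for either assistant: the latencies, the fake "server", and the on-device timing are all invented numbers, chosen only to show that every cloud query pays network delay twice and fails outright when there is no connection, while a local accelerator pays neither cost.

```python
import time

NETWORK_AVAILABLE = True          # flip to False to see the offline failure mode
UPLINK_LATENCY_S = 0.08           # assumed one-way mobile network latency
SERVER_COMPUTE_S = 0.05           # assumed time for the data-center GPUs to run the model
ON_DEVICE_COMPUTE_S = 0.03        # assumed time for a local accelerator to do the same work

def cloud_query(audio_bytes: bytes) -> str:
    if not NETWORK_AVAILABLE:
        raise ConnectionError("no Internet connection: assistant cannot answer")
    time.sleep(UPLINK_LATENCY_S)      # send the recording to the server
    time.sleep(SERVER_COMPUTE_S)      # server-side neural network does the work
    time.sleep(UPLINK_LATENCY_S)      # answer travels back to the phone
    return "cloud answer"

def on_device_query(audio_bytes: bytes) -> str:
    time.sleep(ON_DEVICE_COMPUTE_S)   # everything runs on the phone itself
    return "local answer"

for fn in (cloud_query, on_device_query):
    start = time.perf_counter()
    fn(b"...recorded speech...")
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
```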
But all that could soon be an inconvenience of the past. MIT has announced that it has designed a new chip that can perform powerful artificial intelligence tasks right on your phone, reports MIT News. Called Eyeriss, the chip is 10 times as efficient as current mobile GPUs, meaning it could process highly complex neural network algorithms without causing your smartphone to take a battery hit.
The Eyeriss chip has 168 cores, each capable of operating as a "neuron." Each core has its own memory where it can store and analyze data, and if it needs the help of another core, it compresses the data before sharing it. Eyeriss allows cores to talk to each other directly, so they don't need to use up a phone's main memory.
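As a rough mental model only (the real Eyeriss dataflow is far more involved than this), the following Python sketch captures the two ideas in the paragraph above: each core keeps its working data in its own local memory, and when it must share a result it compresses it and hands it directly to another core instead of going through shared main memory. The class, methods, and toy values are all hypothetical.

```python
import pickle
import zlib

class Core:
    def __init__(self, core_id: int):
        self.core_id = core_id
        self.local_memory: dict = {}          # per-core scratchpad, not shared RAM

    def compute(self, key: str, values: list[float]) -> None:
        # Stand-in for a "neuron": store a partial result in local memory.
        self.local_memory[key] = sum(values) / len(values)

    def send(self, key: str, other: "Core") -> int:
        # Compress before handing the result directly to another core.
        payload = zlib.compress(pickle.dumps(self.local_memory[key]))
        other.receive(key, payload)
        return len(payload)                   # bytes actually moved between cores

    def receive(self, key: str, payload: bytes) -> None:
        self.local_memory[key] = pickle.loads(zlib.decompress(payload))

# 168 cores, as in the article; here they just pass one toy value around.
cores = [Core(i) for i in range(168)]
cores[0].compute("partial_sum", [0.1, 0.2, 0.3])
moved = cores[0].send("partial_sum", cores[1])
print(f"core 1 now holds {cores[1].local_memory['partial_sum']:.2f} "
      f"after receiving {moved} compressed bytes from core 0")
```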
But perhaps the biggest advantage of Eyeriss is that smartphones would no longer need to talk to a distant server over an Internet connection. All algorithms could be run locally, meaning personal digital assistant services like Google Now and Siri would be much speedier and would work even when you don't have an Internet connection. It also means your data and queries would stay private, since they wouldn't be sent over a network connection to be processed.
How do you like that, Siri?