Intel showcases neuromorphic computing innovations at Labs Day

Today during its Labs Day expo, Intel shared an update on progress within the Intel Neuromorphic Research Community (INRC), the ecosystem of over 100 academic groups, government labs, research institutions, and companies founded in 2018 to further neuromorphic computing. Intel and the INRC claim to have achieved breakthroughs in applying neuromorphic hardware to a range of applications, from gesture and voice recognition to autonomous drone navigation.

Along with Intel, researchers at IBM, HP, MIT, Purdue, and Stanford hope to leverage neuromorphic computing — circuits that mimic the biology of the human nervous system — to develop supercomputers 1,000 times more powerful than any available today. Custom-designed neuromorphic chips excel at constraint satisfaction problems, which require evaluating a large number of potential solutions to identify the one or few that satisfy specific constraints. They’ve also been shown to rapidly identify the shortest paths in graphs and perform approximate image searches, as well as mathematically optimize specific objectives over time in real-world optimization problems.
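
To make the constraint satisfaction framing concrete, the sketch below shows the conventional approach such chips aim to beat: a small backtracking graph-coloring solver in Python. The graph, colors, and function names are invented for illustration; this is a textbook CPU algorithm, not Intel's neuromorphic method.

```python
# Toy constraint satisfaction problem on conventional hardware: color a
# small graph so that no two adjacent nodes share a color. A CPU explores
# candidate assignments one at a time via backtracking search; a neuromorphic
# chip instead explores many candidates in parallel through spiking dynamics.

EDGES = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
NODES = ["A", "B", "C", "D"]
COLORS = ["red", "green", "blue"]

def neighbors(node):
    """All nodes sharing an edge with the given node."""
    return [b if a == node else a for a, b in EDGES if node in (a, b)]

def consistent(assignment, node, color):
    """A color is allowed if no already-colored neighbor uses it."""
    return all(assignment.get(n) != color for n in neighbors(node))

def solve(assignment=None):
    """Classic backtracking search over the space of color assignments."""
    assignment = assignment or {}
    if len(assignment) == len(NODES):
        return assignment
    node = next(n for n in NODES if n not in assignment)
    for color in COLORS:
        if consistent(assignment, node, color):
            result = solve({**assignment, node: color})
            if result:
                return result
    return None  # no assignment satisfies all constraints

print(solve())  # e.g. {'A': 'red', 'B': 'green', 'C': 'blue', 'D': 'red'}
```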

Intel’s 14-nanometer Loihi chip — its flagship neuromorphic computing hardware — contains over 2 billion transistors and 130,000 artificial neurons with 130 million synapses. Uniquely, the chip features a programmable microcode engine for on-die training of asynchronous spiking neural networks (SNNs), or AI models that incorporate time into their operating model so that the components of the model don’t all process input data simultaneously. Loihi processes information up to 1,000 times faster and 10,000 times more efficiently than traditional processors, and it can solve certain types of optimization problems with gains in speed and energy efficiency greater than three orders of magnitude, according to Intel. Moreover, Loihi maintains real-time performance and uses only 30% more power when scaled up 50 times, whereas traditional hardware uses 500% more power to do the same.
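
For readers unfamiliar with SNNs, the following minimal sketch shows their basic unit, a leaky integrate-and-fire neuron, which integrates input over discrete time steps and emits a binary spike only when its membrane potential crosses a threshold. All constants here are illustrative textbook values, not Loihi's actual parameters or microcode.

```python
import numpy as np

def lif_neuron(input_current, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron over a sequence of time steps.

    Unlike a conventional artificial neuron, which computes one output per
    forward pass, this unit carries state (the membrane potential) across
    time and communicates only through sparse binary spikes.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # leaky integration of input
        if potential >= threshold:
            spikes.append(1)                    # fire a spike...
            potential = 0.0                     # ...and reset the membrane
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(0)
print(lif_neuron(rng.uniform(0, 0.5, size=20)))  # sparse train of 0s and 1s
```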

Some members of the INRC see business use cases for chips like Loihi. Lenovo, Logitech, Mercedes-Benz, and Prophesee hope to use it to enable things like more efficient and adaptive robotics, rapid search of databases for similar content, and edge devices that make planning and optimization decisions in real time.

For instance, Intel this morning revealed that Accenture tested the ability to recognize voice commands on Loihi versus a standard graphics card and found the chip was up to 1,000 times more energy efficient and responded up to 200 milliseconds faster with comparable accuracy. Accenture also found that Loihi is highly adept at learning and recognizing individualized gestures, processing input from a camera in just a few exposures.

Intel says that through the INRC, Mercedes-Benz is exploring how Accenture’s results could apply to real-world scenarios, such as adding new voice commands to in-vehicle infotainment systems. Other Intel partners are investigating how Loihi could be used in products like interactive smart homes and touchless displays.

Beyond gesture and voice recognition, Intel reports that Loihi performs well with datacenter tasks such as retrieving images from databases. The company’s research partners demonstrated that the chip could generate image feature vectors over 3 times more energy-efficiently than a processor or graphics card while maintaining the same level of accuracy. (Features are the individual measurable variables that serve as inputs to AI systems.) In addition, Intel discovered that Loihi can solve optimization and search problems over 1,000 times more efficiently and 100 times faster than traditional processors, lending weight to work published earlier this year claiming to show Loihi can search feature vectors in million-image databases 24 times faster and with 30 times lower energy than a processor.
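
The workload being compared is essentially nearest-neighbor search over feature vectors. As a rough illustration of the conventional-hardware baseline, here is a brute-force cosine-similarity search in Python; the database size, dimensionality, and names are invented, and real systems at this scale typically use approximate indexes.

```python
import numpy as np

# Each image in the database is represented by a fixed-length feature
# vector; a query image is matched by finding the stored vectors closest
# to its own vector. Sizes below are made up for demonstration.
rng = np.random.default_rng(1)
database = rng.normal(size=(100_000, 128))  # 100k images, 128-dim features
database /= np.linalg.norm(database, axis=1, keepdims=True)  # unit-normalize

def search(query, k=5):
    """Return indices of the k most similar feature vectors (cosine)."""
    query = query / np.linalg.norm(query)
    scores = database @ query            # cosine similarity with every image
    return np.argsort(scores)[-k:][::-1]  # top-k, best match first

print(search(rng.normal(size=128)))
```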

On the robotics front, Intel reports that researchers from Rutgers and TU Delft completed new demonstrations of robotic navigation and micro-drone control applications running on Loihi. TU Delft’s drone performed landings with a spiking neural network. Meanwhile, Rutgers found its Loihi solutions required 75 times less power than conventional mobile graphics cards without perceivable losses in performance. In fact, in a study accepted to the 2020 Conference on Robot Learning, the Rutgers team concluded that Loihi could learn tasks with 140 times lower energy consumption compared with a mobile graphics chip.

Intel and partners also conducted two state-of-the-art neuromorphic robotics demonstrations during Labs Day. In a project with researchers from ETH Zurich, Intel showed Loihi controlling a horizon-tracking drone with just 200 microseconds of visual processing latency, representing what the company claims is a 1,000 times gain in combined efficiency and speed compared to previous solutions. Separately, Intel and researchers from the Italian Institute of Technology showed that multiple functions could run on a Loihi chip built into the latter’s iCub robotics platform. Among the functions were object recognition with fast, few-shot learning (i.e., learning that requires only a few examples to reinforce concepts), spatial awareness of each learned object, and real-time decision-making in response to human interactions.
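
As a loose illustration of what few-shot object learning involves, the sketch below uses a simple prototype (nearest-class-mean) classifier: it stores the average feature vector of the handful of examples seen per object and labels new inputs by the closest prototype. This is one common few-shot scheme, shown only to convey the idea, not the learning rule actually used on Loihi or the iCub platform.

```python
import numpy as np

class PrototypeClassifier:
    """Few-shot classifier: one averaged 'prototype' vector per object."""

    def __init__(self):
        self.prototypes = {}

    def learn(self, label, examples):
        """Learn a new object from just a few example feature vectors."""
        self.prototypes[label] = np.mean(examples, axis=0)

    def predict(self, features):
        """Label an input by its nearest stored prototype."""
        return min(self.prototypes,
                   key=lambda lab: np.linalg.norm(features - self.prototypes[lab]))

rng = np.random.default_rng(2)
clf = PrototypeClassifier()
clf.learn("mug", rng.normal(0.0, 1.0, size=(3, 16)))   # three examples each
clf.learn("ball", rng.normal(5.0, 1.0, size=(3, 16)))
print(clf.predict(rng.normal(5.0, 1.0, size=16)))      # likely "ball"
```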

Mike Davies, the director of Intel’s neuromorphic computing lab, told VentureBeat in a phone interview that he believes a major challenge standing in the way of neuromorphic chip commercialization is the lack of a programming model for neuromorphic architectures. With neuromorphic hardware, programmers must anticipate how algorithms will behave within the chip’s unique environment and come up with schemes to represent legacy data.

“If you’ve grown up knowing nothing but the computer architecture model, it’s sort of embedded — it’s coded line by line as opposed to this more biologically-inspired model of computing … involving hundreds of thousands if not millions of interacting processing units,” Davies said. “That’s why I think the robotics domain in general is really exciting, but maybe not the nearest-term application for neuromorphic computing. When I think in terms of near-term, a feasible concrete goal is to enable better audio. There’s a number of applications there that we’re looking at I think will be exciting, things like adapting in real time to a specific speaker.”

Intel says that as INRC grows, it will continue investing in the collaborative effort and working with members to provide support and explore where neuromorphic computing can add real-world value. Moreover, the company says it will continue to draw on learnings from the INRC and incorporate them into the development of Intel’s next-generation neuromorphic research chip, which it wasn’t ready to discuss today.

Earlier this year, Intel announced the general readiness of Pohoiki Springs, a powerful self-contained neuromorphic system that’s about the size of five standard servers. The company gave access to members of the Intel Neuromorphic Research Community via the cloud using Intel’s Nx SDK and community-contributed software components, providing a tool to scale up research and explore ways to accelerate workloads that run slowly on today’s conventional architectures.

Intel claims Pohoiki Springs, which was originally announced in July 2019, is similar in neural capacity to the brain of a small mammal, with 768 Loihi chips and 100 million neurons spread across 24 Arria 10 FPGA-based Nahuku expansion boards (containing 32 chips each) that operate at under 500 watts. This is ostensibly a step on the path to supporting larger and more sophisticated neuromorphic workloads. Intel recently demonstrated that the chips can be used to “teach” an AI model to distinguish between 10 different scents, control a robotic assistive arm for wheelchairs, and power touch-sensing robotic “skin.”

In October, Intel inked a three-year agreement with Sandia National Laboratories to explore the value of neuromorphic computing for scaled-up AI problems as a part of the U.S. Department of Energy’s (DOE) Advanced Scientific Computing Research program. In somewhat related news, the company recently entered into an agreement with Argonne National Laboratory to develop and design microelectronics technologies such as exascale, neuromorphic, and quantum computing.
