
Spiking tool improves artificially intelligent devices

Technique to benefit smart phones, self-driving cars, automated image interpretation

Date:
February 27, 2019
Source:
DOE/Sandia National Laboratories
Summary:
The aptly named software package Whetstone enables neural computer networks to process information up to 100 times more efficiently than current standards, making possible an increased use of artificial intelligence in mobile phones, self-driving cars, and image interpretation.

FULL STORY

Whetstone, a software tool that sharpens the output of artificial neurons, has enabled neural computer networks to process information up to a hundred times more efficiently than the current industry standard, say the Sandia National Laboratories researchers who developed it.

The aptly named software, which greatly reduces the amount of circuitry needed to perform autonomous tasks, is expected to increase the penetration of artificial intelligence into markets for mobile phones, self-driving cars and automated interpretation of images.

"Instead of sending out endless energy dribbles of information," Sandia neuroscientist Brad Aimone said, "artificial neurons trained by Whetstone release energy in spikes, much like human neurons do."

The largest artificial intelligence companies have produced spiking tools for their own products, but none are as fast or efficient as Whetstone, says Sandia mathematician William Severa. "Large companies are aware of this process and have built similar systems, but often theirs work only for their own designs. Whetstone will work on many neural platforms."

The open-source code was recently featured in a technical article in Nature Machine Intelligence and has been proposed by Sandia for a patent.

How to sharpen neurons

Artificial neurons are basically capacitors that absorb and sum electrical charges, which they then release in tiny bursts of electricity. Computer chips, termed "neuromorphic systems," assemble neural networks into large groupings that mimic the human brain by sending electrical stimuli to neurons firing in no predictable order. This contrasts with the more lock-step procedure of desktop computers, with their preset electronic processes.
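As a rough illustration of this accumulate-and-fire behavior -- a sketch invented for this article, not taken from the Whetstone code -- a spiking neuron can be written in a few lines of Python. The class name, threshold, and input values are all assumptions chosen for the example:

```python
# Minimal sketch of an integrate-and-fire artificial neuron: it absorbs
# and sums incoming charge like a capacitor, then releases a spike once
# a threshold is crossed. All names and values here are illustrative.

class SpikingNeuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold  # charge needed before the neuron fires
        self.charge = 0.0           # accumulated input, like a capacitor

    def receive(self, current):
        """Absorb and sum an incoming electrical charge."""
        self.charge += current

    def step(self):
        """Emit a spike (1) if stored charge crosses the threshold,
        then reset; otherwise stay silent (0)."""
        if self.charge >= self.threshold:
            self.charge = 0.0
            return 1
        return 0

# Dribbles of input produce no output until enough has accumulated.
neuron = SpikingNeuron(threshold=1.0)
for current in [0.3, 0.4, 0.5]:
    neuron.receive(current)
    print(neuron.step())  # prints 0, 0, 1
```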


Because of their haphazard firing, neuromorphic systems often are slower than conventional computers but also require far less energy to operate. They also require a different approach to programming because otherwise their artificial neurons fire too often or not often enough, which has been a problem in bringing them online commercially.

Whetstone, which functions as supplemental computer code tacked onto more conventional software training programs, trains artificial neurons to spike only when a sufficient amount of energy -- read, information -- has been collected. The training has proved effective in improving standard neural networks and is in the process of being evaluated for the emerging technology of neuromorphic systems.
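One way to picture this sharpening -- a sketch under the assumption that a smooth activation is gradually steepened during training until it behaves like an all-or-nothing spike, which is how the published method is generally described -- is the following. The sigmoid form and the schedule are illustrative, not Sandia's implementation:

```python
import numpy as np

# Sketch of the sharpening idea: a graded activation is steepened until
# it is effectively binary, so a neuron reports only "fired" or "did not."

def sharpened_sigmoid(x, sharpness):
    """A sigmoid that approaches a binary step as sharpness grows."""
    return 1.0 / (1.0 + np.exp(-sharpness * x))

x = np.array([-0.5, -0.1, 0.1, 0.5])
for sharpness in [1, 10, 100]:  # raised gradually over training epochs
    print(sharpness, np.round(sharpened_sigmoid(x, sharpness), 3))
# At sharpness=1 the outputs are graded; by sharpness=100 they are
# effectively 0 or 1 -- the neuron "spikes" only when its summed input
# (its collected information) is large enough.
```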

Catherine Schuman, a neural network researcher at Oak Ridge National Laboratory, said, "Whetstone is an important tool for the neuromorphic community. It provides a standardized way to train traditional neural networks that are amenable to deployment on neuromorphic systems, which had previously been done in an ad hoc manner."

The strict teacher

The Whetstone process, Aimone said, can be visualized as controlling a class of talkative elementary school students who are tasked with identifying an object on their teacher's desk. Prior to Whetstone, the students sent a continuous stream of sensor input to their formerly overwhelmed teacher, who had to listen to all of it -- every bump and giggle, so to speak -- before passing a decision into the neural system. This huge amount of information often requires cloud-based computation to process, or the addition of more local computing equipment combined with a sharp increase in electrical power. Both options increase the time and cost of commercial artificial intelligence products, lessen their security and privacy and make their acceptance less likely.

Under Whetstone, the newly strict teacher pays attention only to a simple "yes" or "no" from each student -- whether they raise their hands with a solution -- rather than to everything they are saying. Suppose, for example, the intent is to identify whether a piece of green fruit on the teacher's desk is an apple. Each student is a sensor that may respond to a different quality of what may be an apple: Does it have the correct smell, taste, texture and so on? While the student who looks for red may vote "no," the student who looks for green would vote "yes." When the number of answers, yea or nay, is electrically high enough to trigger the neuron's capacity to fire, that simple result, instead of endless waffling, enters the overall neural system.
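The classroom analogy reduces to a simple threshold vote. In this toy Python version, the features, the votes, and the threshold itself are all invented for illustration:

```python
# Toy version of the analogy: each "student" sensor casts a binary
# yes/no vote about one quality of the fruit, and the neuron fires
# only when enough hands go up.

votes = {
    "smells_like_apple": 1,
    "tastes_like_apple": 1,
    "waxy_texture": 1,
    "is_red": 0,      # a green apple: the red-detector votes no
    "is_green": 1,
}

threshold = 3  # how many raised hands trigger the spike
fires = sum(votes.values()) >= threshold
print("apple" if fires else "not apple")  # -> "apple"
```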


While Whetstone's simplifications could potentially increase errors, the overwhelming number of participating neurons -- often more than a million -- provides information that statistically makes up for the inaccuracies introduced by the data simplification, said Severa, who is responsible for the mathematics of the program.

"Combining overly detailed internal information with the huge number of neurons reporting in is a kind of double booking," he says. "It's unnecessary. Our results tell us the classical way -- calculating everything without simplifying -- is wasteful. That is why we can save energy and do it well."

Patched programs work best

The software program works best when patched into programs meant to train new artificial-intelligence equipment, so that Whetstone doesn't have to overcome learned patterns with already-established energy minimums.
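As a hedged sketch of what "patching in" might look like in a training script -- the layer and callback names below are invented stand-ins, not the real Whetstone API; only the standard Keras calls are real -- a sharpening hook can steepen activations a little after each epoch, so a network trained from scratch ends up communicating in binary spikes:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class AnnealedSigmoid(keras.layers.Layer):
    """Invented name: a sigmoid whose steepness is raised during training."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Non-trainable knob that the callback turns up over time.
        self.sharpness = tf.Variable(1.0, trainable=False)

    def call(self, inputs):
        return tf.sigmoid(self.sharpness * inputs)

class Sharpener(keras.callbacks.Callback):
    """Invented stand-in for a Whetstone-style training hook: after each
    epoch, steepen every AnnealedSigmoid so outputs drift toward 0/1."""
    def __init__(self, factor=2.0):
        super().__init__()
        self.factor = factor

    def on_epoch_end(self, epoch, logs=None):
        for layer in self.model.layers:
            if isinstance(layer, AnnealedSigmoid):
                layer.sharpness.assign(layer.sharpness * self.factor)

# Patched into training from scratch, so no pre-learned energy minimums
# have to be overcome. The data here is a random placeholder.
model = keras.Sequential([
    keras.layers.Dense(64, input_shape=(32,)),
    AnnealedSigmoid(),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

x = np.random.rand(256, 32).astype("float32")
y = keras.utils.to_categorical(np.random.randint(10, size=256), 10)
model.fit(x, y, epochs=8, callbacks=[Sharpener()], verbose=0)
```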

The work is a continuation of a Sandia project called Hardware Acceleration of Adaptive Neural Algorithms, which explored neural platforms in work supported by Sandia's Laboratory Directed Research and Development office. The current work is supported by the Department of Energy's Advanced Simulation and Computing Program.


Story Source:

Materials provided by DOE/Sandia National Laboratories. Note: Content may be edited for style and length.


Journal Reference:

  1. William Severa, Craig M. Vineyard, Ryan Dellana, Stephen J. Verzi, James B. Aimone. Training deep neural networks for binary communication with the Whetstone method. Nature Machine Intelligence, 2019; 1 (2): 86. DOI: 10.1038/s42256-018-0015-y

