
Showing a neuromorphic advantage, both the IBM TrueNorth and Intel Loihi neuromorphic chips observed by Sandia National Laboratories researchers were significantly more energy efficient than conventional computing hardware. The graph shows that Loihi can perform roughly 10 times more calculations per unit of energy than a conventional processor. Energy is the limiting factor: more chips can be added to run a problem in parallel and therefore faster, but the electric bill is the same whether one computer does all the work or 10,000 computers share it.
Image courtesy of Sandia National Laboratories.
With the insertion of a little math, Sandia National Laboratories researchers have shown that neuromorphic computers, which synthetically replicate the brain’s logic, can solve more complex problems than those posed by artificial intelligence and may even earn a place in high-performance computing.
A random walk diffusion model based on data from Sandia National Laboratories algorithms running on an Intel Loihi neuromorphic platform. Video courtesy of Sandia National Laboratories.
The findings, detailed in a recent article in the journal Nature Electronics, show that neuromorphic simulations using the statistical method called random walks can track X-rays passing through bone and soft tissue, disease passing through a population, information flowing through social networks and the movements of financial markets, among other uses, said Sandia theoretical neuroscientist and lead researcher James Bradley Aimone.
“Basically, we have shown that neuromorphic hardware can yield computational advantages relevant to many applications, not just artificial intelligence to which it’s obviously kin,” said Aimone. “Newly discovered applications range from radiation transport and molecular simulations to computational finance, biology modeling and particle physics.”
In optimal cases, neuromorphic computers will solve problems faster and use less energy than conventional computing, he said.
The bold assertions should be of interest to the high-performance computing community because finding capabilities to solve statistical problems is of increasing concern, Aimone said.
“These problems aren’t really well-suited for GPUs [graphics processing units], which is what future exascale systems are likely going to rely on,” Aimone said. “What’s exciting is that no one really has looked at neuromorphic computing for these types of applications before.”
Sandia engineer and paper author Brian Franke said, “The natural randomness of the processes you list will make them inefficient when directly mapped onto vector processors like GPUs on next-generation computational efforts. Meanwhile, neuromorphic architectures are an intriguing and radically different alternative for particle simulation that may lead to a scalable and energy-efficient approach for solving problems of interest to us.”
Franke models photon and electron radiation to understand their effects on components.
The team successfully applied neuromorphic-computing algorithms to model random walks of gaseous molecules diffusing through a barrier, a basic chemistry problem, using the 50-million-neuron Loihi platform Sandia received approximately a year and a half ago from Intel Corp., said Aimone. “Then we showed that our algorithm can be extended to more sophisticated diffusion processes useful in a range of applications.”
The claims are not meant to challenge the primacy of standard computing methods used to run utilities, desktops and phones. “There are, however, areas in which the combination of computing speed and lower energy costs may make neuromorphic computing the ultimately desirable choice,” he said.
Unlike the difficulties posed by adding qubits to quantum computers — another interesting method of moving beyond the limitations of conventional computing — chips containing artificial neurons are cheap and easy to install, Aimone said.
There can still be a high cost for moving data on or off the neurochip processor. “As you collect more, it slows down the system, and eventually it won’t run at all,” said Sandia mathematician and paper author William Severa. “But we overcame this by configuring a small group of neurons that effectively computed summary statistics, and we output those summaries instead of the raw data.”
Severa wrote several of the experiment’s algorithms.
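To make the idea of on-device summaries concrete, here is a minimal sketch in ordinary Python, not Sandia's actual neuron configuration: Welford's online algorithm keeps only a count, a running mean, and a running sum of squared deviations, so any number of samples can be summarized without ever storing or exporting the raw data.

```python
# A minimal sketch (ordinary Python, not Sandia's on-chip neuron configuration)
# of summarizing a data stream instead of exporting raw samples.
# Welford's online algorithm keeps a count, a running mean, and a running
# sum of squared deviations, using constant memory for any number of samples.
import random


class RunningSummary:
    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the current mean

    def update(self, x):
        """Fold one sample into the summary; the raw sample is then discarded."""
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / (self.count - 1) if self.count > 1 else 0.0


if __name__ == "__main__":
    summary = RunningSummary()
    for _ in range(100_000):                     # many samples, constant memory
        summary.update(random.gauss(0.0, 1.0))
    print(summary.count, round(summary.mean, 4), round(summary.variance(), 4))
```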
Like the brain, neuromorphic computing works by electrifying small pin-like structures, adding tiny charges emitted from surrounding sensors until a certain electrical level is reached. Then the pin, like a biological neuron, flashes a tiny electrical burst, an action known as spiking.
Unlike the metronomic regularity with which information is passed along in conventional computers, said Aimone, the artificial neurons of neuromorphic computing flash irregularly, as biological ones do in the brain, and so may take longer to transmit information. But because the process only draws energy from sensors and neurons when they contribute data, it requires less energy than conventional computing, which must poll every processor whether it is contributing or not.
The conceptually bio-based process has another advantage: its computing and memory components exist in the same structure, while conventional computing uses up energy shuttling data between these two physically separated functions. The slow reaction time of the artificial neurons may initially slow down its solutions, but this factor disappears as the number of neurons is increased, so that more information is available in the same time period to be totaled, said Aimone.
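The accumulate-then-spike behavior Aimone describes can be illustrated with a textbook leaky integrate-and-fire model. The sketch below is a generic illustration with made-up threshold, leak and input values, not Loihi's actual neuron model.

```python
# A minimal, discrete-time leaky integrate-and-fire neuron, as a textbook
# illustration of the accumulate-then-spike behavior described above.
# The threshold, leak, and input values are arbitrary, not Loihi's parameters.

def integrate_and_fire(inputs, threshold=1.0, leak=0.9):
    """Accumulate incoming charge; emit a spike (1) when the membrane
    potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for charge in inputs:
        potential = potential * leak + charge   # leak a little, add new charge
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes


if __name__ == "__main__":
    incoming = [0.3, 0.4, 0.5, 0.0, 0.2, 0.9, 0.1, 0.1]
    print(integrate_and_fire(incoming))  # -> [0, 0, 1, 0, 0, 1, 0, 0]
```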
The process begins by using a Markov chain — a mathematical construct where, like a Monopoly gameboard, the next outcome depends only on the current state and not the history of all previous states. That randomness contrasts, said Sandia mathematician and paper author Darby Smith, with most linked events. For example, he said, the number of days a patient must remain in the hospital is at least partially determined by the preceding length of stay.
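A minimal example of that memoryless property: in the toy chain below, the next state is sampled using only the current state's row of transition probabilities, never the path that led there. The three states and their probabilities are invented purely for illustration.

```python
# A toy Markov chain: the next state depends only on the current state,
# never on the history of the walk. States and probabilities are invented.
import random

TRANSITIONS = {
    "left":   {"left": 0.50, "middle": 0.50, "right": 0.00},
    "middle": {"left": 0.25, "middle": 0.50, "right": 0.25},
    "right":  {"left": 0.00, "middle": 0.50, "right": 0.50},
}


def step(state):
    """Sample the next state from the current state's transition row alone."""
    states = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return random.choices(states, weights=weights)[0]


if __name__ == "__main__":
    state, path = "middle", ["middle"]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(" -> ".join(path))
```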
Beginning with the Markov random basis, the researchers used Monte Carlo simulations, a fundamental computational tool, to run a series of random walks that attempt to cover as many routes as possible.
“Monte Carlo algorithms are a natural solution method for radiation transport problems,” said Franke. “Particles are simulated in a process that mirrors the physical process.”
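The sketch below shows, in ordinary Python rather than the paper's neuromorphic algorithm, the kind of Monte Carlo calculation being described: many independent random walkers on a one-dimensional lattice, with a barrier that turns back an attempted crossing with some probability, and the fraction of walkers ending up beyond the barrier taken as the estimate. All parameter values are placeholders.

```python
# A sketch of the Monte Carlo calculation described above, written as ordinary
# Python rather than the paper's neuromorphic algorithm. Each walker takes
# unbiased +/-1 steps on a line; a barrier at position BARRIER turns back an
# attempted crossing with probability BLOCK_PROB. The fraction of walkers that
# finish beyond the barrier estimates the crossing probability.
# All parameter values are placeholders.
import random

BARRIER = 10        # lattice position of the barrier
BLOCK_PROB = 0.7    # chance that the barrier turns a walker back
N_STEPS = 500       # steps per walk
N_WALKS = 20_000    # independent walks in the Monte Carlo ensemble


def one_walk():
    """Run a single random walk and report whether it ends beyond the barrier."""
    x = 0
    for _ in range(N_STEPS):
        move = random.choice((-1, 1))
        if x + move == BARRIER and random.random() < BLOCK_PROB:
            continue                      # the crossing attempt was blocked
        x += move
    return x > BARRIER


if __name__ == "__main__":
    crossed = sum(one_walk() for _ in range(N_WALKS))
    print(f"estimated crossing probability: {crossed / N_WALKS:.4f}")
```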
The energy of each walk was recorded as a single energy spike by an artificial neuron reading the result of each walk in turn. “This neural net is more energy efficient in sum than recording each moment of each walk, as ordinary computing must do. This partially accounts for the speed and efficiency of the neuromorphic process,” said Aimone. More chips will help the process move faster using the same amount of energy, he said.
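A rough software analogy of that bookkeeping, with an invented binning scheme and walk model: each finished walk contributes a single increment, a stand-in for one spike, to a tally for its outcome bin, so only a histogram of results is kept rather than every step of every trajectory.

```python
# A rough software analogy of tallying each walk's result as a single spike.
# Only a histogram of outcomes is kept, never the step-by-step trajectories.
# The walk model and bin width are invented for illustration.
import random
from collections import Counter

BIN_WIDTH = 5


def walk_endpoint(n_steps=100):
    """Final position of a simple unbiased random walk (one walk's result)."""
    return sum(random.choice((-1, 1)) for _ in range(n_steps))


def tally_walks(n_walks=10_000):
    spikes = Counter()                      # one counter ("neuron") per outcome bin
    for _ in range(n_walks):
        spikes[walk_endpoint() // BIN_WIDTH] += 1   # one increment per walk
    return spikes


if __name__ == "__main__":
    histogram = tally_walks()
    for b in sorted(histogram):
        print(f"bin [{b * BIN_WIDTH}, {(b + 1) * BIN_WIDTH}): {histogram[b]}")
```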
The next version of Loihi, said Sandia researcher Craig Vineyard, will increase its current scale from 128,000 neurons per chip to up to one million. Larger-scale systems then combine multiple chips on a board.
“Perhaps it makes sense that a technology like Loihi may find its way into a future high-performance computing platform,” said Aimone. “This could help make HPC much more energy efficient, climate-friendly and just all around more affordable.”
Original Article: Neuromorphic computing widely applicable, Sandia researchers show