Illustration of the on-chip classification process with the Yin-Yang dataset. Each symbol represents the spike time delay for various classifying neurons.
Credit: Göltz and Kriener et al. (Heidelberg / Bern)
RESEARCHERS FROM HEIDELBERG AND BERN DEVELOP A NEW TRAINING APPROACH FOR SPIKING NEURAL NETWORKS
Developing a machine that processes information as efficiently as the human brain has been a long-standing research goal towards true artificial intelligence. An interdisciplinary research team at Heidelberg University and the University of Bern (Switzerland) led by Dr Mihai Petrovici is tackling this problem with the help of biologically-inspired artificial neural networks.
Spiking neural networks, which mimic the structure and function of a natural nervous system, represent promising candidates because they are powerful, fast, and energy-efficient. One key challenge is how to train such complex systems. The German-Swiss research team has now developed and successfully implemented an algorithm that achieves such training.
The nerve cells (or neurons) in the brain transmit information using short electrical pulses known as spikes. These spikes are triggered when a certain stimulus threshold is exceeded. Both the frequency with which a single neuron produces such spikes and the temporal sequence of the individual spikes are critical for the exchange of information. “The main difference between biological spiking networks and artificial neural networks is that, because they use spike-based information processing, they can solve complex tasks such as image recognition and classification with extreme energy efficiency,” states Julian Göltz, a doctoral candidate in Dr Petrovici’s research group.
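The threshold mechanism described above is commonly modelled with a leaky integrate-and-fire (LIF) neuron. The following sketch is illustrative only (the parameter values are arbitrary assumptions, not taken from the research): the membrane potential integrates incoming current with a leak, and a spike is emitted whenever the potential crosses the threshold.

```python
def simulate_lif(input_current, threshold=1.0, tau=20.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential integrates the input current with a leak
    term; when it exceeds the threshold, the neuron emits a spike and
    the potential resets to zero.
    """
    v = 0.0
    spike_times = []
    for step, current in enumerate(input_current):
        # Leaky integration: the potential decays toward zero
        # while the input current drives it upward.
        v += dt * (-v / tau + current)
        if v >= threshold:                 # stimulus threshold exceeded
            spike_times.append(step * dt)  # record the spike time
            v = 0.0                        # reset after the spike
    return spike_times

# A constant supra-threshold current makes the neuron spike periodically.
spikes = simulate_lif([0.1] * 100)
print(spikes)
```

With a constant input, the neuron charges up, fires, resets, and repeats, producing a regular spike train; both the spike rate and the individual spike times carry information about the input strength.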
Both the human brain and the architecturally similar artificial spiking neural networks can only perform at their full potential if the individual neurons are properly connected to one another. But how can brain-inspired – that is, neuromorphic – systems be adjusted to process spiking input correctly? “This question is fundamental for the development of powerful artificial networks based on biological models,” stresses Laura Kriener, also a member of Dr Petrovici’s research team. Special algorithms are required to guarantee that the neurons in a spiking neural network fire at the correct time. These algorithms adjust the connections between the neurons so that the network can perform the required task, such as classifying images with high precision.
The team under the direction of Dr Petrovici developed just such an algorithm. “Using this approach, we can train spiking neural networks to code and transmit information exclusively in single spikes. They thereby produce the desired results especially quickly and efficiently,” explains Julian Göltz. Moreover, the researchers succeeded in implementing a neural network trained with this algorithm on a physical platform – the BrainScaleS-2 neuromorphic hardware platform developed at Heidelberg University.
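The article does not spell out the algorithm's details, but the single-spike coding it describes implies a time-to-first-spike readout: each output neuron corresponds to a class, and the neuron that fires earliest determines the prediction. The sketch below is a hypothetical illustration of that readout, not the team's actual implementation; the label names and spike times are invented.

```python
def classify_by_first_spike(first_spike_times):
    """Time-to-first-spike readout.

    Each output neuron corresponds to one class; the prediction is the
    class whose neuron fires first. A single spike per neuron is enough
    to encode the answer, which is what makes the scheme fast and
    energy-efficient.
    """
    return min(first_spike_times, key=first_spike_times.get)

# Hypothetical first-spike times (in ms) for three label neurons:
times = {"cat": 4.2, "dog": 1.7, "bird": 3.1}
print(classify_by_first_spike(times))  # prints "dog"
```

Because the answer is available as soon as the first output spike arrives, the network can stop computing early, which is one reason single-spike coding is especially quick on neuromorphic hardware.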
According to the researchers, the BrainScaleS system processes information up to a thousand times faster than the human brain and needs far less energy than conventional computer systems. It is part of the European Human Brain Project, which integrates technologies like neuromorphic computing into an open platform called EBRAINS. “However, our work is not only interesting for neuromorphic computing and biologically inspired hardware. It also acknowledges the demand from the scientific community to transfer so-called Deep Learning approaches to neuroscience and thereby further unveil the secrets of the human brain,” emphasises Dr Petrovici.
Original Article: SOLVING COMPLEX LEARNING TASKS IN BRAIN-INSPIRED COMPUTERS
The Latest on: Spiking neural networks
- Significant energy savings using neuromorphic hardware on May 24, 2022 at 11:48 am
New research illustrates neuromorphic technology is up to sixteen times more energy-efficient for large deep learning networks than other AI systems.
- TU Graz and Intel demonstrate significant energy savings using neuromorphic hardware on May 24, 2022 at 9:44 am
Research published in Nature Machine Intelligence illustrates neuromorphic technology is up to sixteen times more energy-efficient for large deep learning networks than other AI systems.
- KAUST Researchers Create Chip Mimicking Human Brain on May 24, 2022 at 2:34 am
Researchers at the King Abdullah University of Science and Technology (KAUST) are presenting new solutions to achieve the best AI (artificial intelligence) speed and performance by building a neural ...
- BrainChip partners with Edge Impulse for a platform that mimics the brain on May 19, 2022 at 8:11 am
The companies are aiming to make BrainChip’s neuromorphic technology based on spiking neural networks mainstream. Both companies will combine their technologies to realize faster development ...
- Spiking Neural Networks: Research Projects or Commercial Products? on May 17, 2022 at 5:00 pm
Spiking neural networks (SNNs) often are touted as a way to get close to the power efficiency of the brain, but there is widespread confusion about what exactly that means. In fact, there is ...
- Energy-efficient AI hardware technology via a brain-inspired stashing system on May 16, 2022 at 5:00 pm
Researchers demonstrate a neuromodulation-inspired stashing system for the energy-efficient learning of a spiking neural network using a self-rectifying memristor array. Researchers have proposed a ...
- Neuromorphic Computing Will Need Partners To Break Into The Datacenter on May 10, 2022 at 5:00 pm
The initial plan was to get the commercial SoC into the market by 2019, but BrainChip extended the deadline to add the capability to run convolutional neural networks (CNNs) along with spiking neural ...
- Tenstorrent Is Changing the Way We Think About AI Chips on May 2, 2022 at 5:00 pm
On the flip side you've got the spiking artificial neural network, which is a lot less popular and has had a lot less success in broad applications.” Spiking neural networks (SNNs) more closely ...
- Complete Neural Processor for Edge AI on August 11, 2020 at 6:43 am
Inspired by the biological function of neurons but engineered on a digital logic process, this event-based spiking neural network (SNN) IP is inherently lower power than traditional convolutional ...
via Bing News