
Facial recognition can arise spontaneously in completely untrained deep neural networks

via KAIST

A KAIST team shows that primitive visual selectivity of faces can arise spontaneously in completely untrained deep neural networks

Researchers have found that higher visual cognitive functions can arise spontaneously in untrained neural networks. A KAIST research team led by Professor Se-Bum Paik from the Department of Bio and Brain Engineering has shown that visual selectivity of facial images can arise even in completely untrained deep neural networks.

This finding provides new insight into the mechanisms underlying the development of cognitive functions in both biological and artificial neural networks, and it bears on our understanding of how early brain functions can originate before any sensory experience.

The study, published in Nature Communications on December 16, demonstrates that face-selective neuronal activity is observed in randomly initialized deep neural networks in the complete absence of learning, and that this activity shares the characteristics of face-selective responses observed in biological brains.

The ability to identify and recognize faces is crucial for social behavior, and it is thought to originate from neuronal tuning at the single- or multi-neuron level. Face-selective neurons are observed in young animals of various species, which has raised intense debate about whether such neurons arise innately in the brain or require visual experience.

Using a model neural network that captures properties of the ventral stream of the visual cortex, the research team found that face-selectivity can emerge spontaneously from random feedforward wiring in untrained deep neural networks. The team showed that this innate face-selectivity is comparable in character to that of face-selective neurons observed in the brain, and that this spontaneous neuronal tuning for faces enables the network to perform face-detection tasks.
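The basic setup can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes a randomly initialized AlexNet-style convolutional network as a stand-in for the ventral-stream-like hierarchy, uses random tensors as placeholders for the face and non-face image sets, and scores each unit with a simple face-selectivity index.

```python
# A minimal sketch, not the authors' code: probe face-selectivity in a
# completely untrained (randomly initialized) AlexNet-style network.
# The image tensors below are random placeholders standing in for real
# face and non-face (object) image batches.
import torch
import torchvision.models as models

torch.manual_seed(0)

# No pretrained weights are requested, so the network keeps its random initialization.
net = models.alexnet()
net.eval()
conv_stack = net.features  # the feedforward convolutional hierarchy

# Placeholder stimuli (replace with preprocessed face / non-face images).
faces = torch.rand(32, 3, 224, 224)
nonfaces = torch.rand(32, 3, 224, 224)

with torch.no_grad():
    r_face = conv_stack(faces).flatten(1)        # per-unit responses to "faces"
    r_nonface = conv_stack(nonfaces).flatten(1)  # per-unit responses to "non-faces"

mu_f, mu_n = r_face.mean(dim=0), r_nonface.mean(dim=0)

# Face-selectivity index per unit: positive values indicate stronger mean
# responses to the face set; the 0.33 threshold below is an arbitrary example.
fsi = (mu_f - mu_n) / (mu_f + mu_n + 1e-8)
frac = (fsi > 0.33).float().mean().item()
print(f"{frac:.1%} of units exceed the example FSI threshold")
```

With real stimuli, the question of interest is whether such face-selective units appear above chance in the untrained network and whether their responses are sufficient to support a face-detection readout, as the study reports.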

These results imply a possible scenario in which the random feedforward connections that develop in early, untrained networks may be sufficient for initializing primitive visual cognitive functions.

Professor Paik said, “Our findings suggest that innate cognitive functions can emerge spontaneously from the statistical complexity embedded in the hierarchical feedforward projection circuitry, even in the complete absence of learning.”

He continued, “Our results provide a broad conceptual advance as well as advanced insight into the mechanisms underlying the development of innate functions in both biological and artificial neural networks, which may unravel the mystery of the generation and evolution of intelligence.” This work was supported by the National Research Foundation of Korea (NRF) and by the KAIST singularity research project.

 

Original Article: Face Detection in Untrained Deep Neural Networks?

More from: Korea Advanced Institute of Science and Technology 

 

The Latest on: Untrained neural networks

  • Performance of lateral flow tests for detecting SARS-CoV-2 in children falls short of minimum criteria
    on January 19, 2022 at 5:44 am

    None of the included studies assessed sample collection by untrained people or self-testing, which likely worsens performance. And the findings might not be applicable to future SARS-CoV-2 ...

  • The free-energy principle explains the brain
    on January 14, 2022 at 6:57 am

    The RIKEN Center for Brain Science (CBS) in Japan, along with colleagues, has shown that the free-energy principle can explain how neural networks are optimized for efficiency. Published in the ...

  • Researchers Reveal "Surprisingly Simple" Smell Arithmetic
    on January 11, 2022 at 2:12 am

    Smell it inside or outside; summer or winter; in a coffee shop with a scone; in a pizza parlor with pepperoni — even at a pizza parlor with a scone! — coffee smells like coffee. Why don’t other smells ...

  • Face detection in untrained deep neural networks
    on December 21, 2021 at 9:11 am

    Researchers have found that higher visual cognitive functions can arise spontaneously in untrained neural networks. A KAIST research team led by Professor Se-Bum Paik from the Department of Bio ...

  • MLCD: A Unified Software Package for Cancer Diagnosis
    on December 9, 2021 at 7:58 am

    A software package has been developed that uses classifiers to predict regions of interest to semantically segment them using a deep neural net and to classify ... We will, in future work, also ...

  • Deepfakes: What are they and why would I make one?
    on August 1, 2019 at 9:13 am

    In a nutshell, neural networks are a type of machine learning ... or that might seem real to the untrained eye. The vast majority of deepfakes circulating the Internet are featuring celebrities ...

  • Topic: neural nets
    on November 18, 2014 at 12:32 pm

    The software company Wolfram Research is launching a public repository for trained and untrained neural network models. The Wolfram Neural Net Repository builds on the company’s Wolfram Language ...

via Bing News

