
Communications of the ACM

ACM TechNews

Neural Network Model Unravels People with Autism's Difficulty with Facial Expressions


[Image: four faces with different expressions. Credit: Anxiety]

Researchers at Japan's Tohoku University developed an artificial neural network model that can help explain the difficulty people with autism spectrum disorder have in interpreting facial expressions.

The model is built on predictive processing theory, which holds that the brain continually predicts the next sensory stimulus and adapts when sensory input, such as a facial expression, deviates from that prediction, thereby reducing future prediction errors. Trained on videos of facial expressions, the model learned to predict the movement of parts of the face and was able to generalize to facial expressions not seen during training.
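The core idea of predictive processing, learning by minimizing next-frame prediction error, can be illustrated with a toy sketch. This is not the authors' network: the "videos" here are hypothetical synthetic landmark trajectories, and the predictor is a simple linear next-step model fitted by least squares, standing in for the recurrent network described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_expression(freq, n_landmarks=8, T=200):
    """Synthetic 'video': facial-landmark positions oscillating at one frequency."""
    t = np.arange(T)[:, None]
    phases = rng.uniform(0, 2 * np.pi, n_landmarks)
    return np.sin(freq * t + phases)  # shape (T, n_landmarks)

# Two training 'expressions' with different motion dynamics.
videos = [make_expression(0.10), make_expression(0.20)]

# Build (current frame, next frame) pairs from the first 150 frames of each.
X = np.vstack([v[:149] for v in videos])
Y = np.vstack([v[1:150] for v in videos])

# Fit a linear next-frame predictor x_{t+1} ~ x_t @ W by least squares,
# i.e. choose W to minimize the prediction error on the training frames.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Evaluate on held-out later frames against a 'no change' baseline.
err_model = np.mean([np.mean((v[150:-1] @ W - v[151:]) ** 2) for v in videos])
err_naive = np.mean([np.mean((v[150:-1] - v[151:]) ** 2) for v in videos])
print(err_model < err_naive)  # the learned predictor beats the baseline
```

The sketch captures only the prediction-learning step; the study's model additionally forms higher-level representations whose clustering by emotion supports generalization.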

However, the model's ability to generalize diminished as activity in its neural population became more heterogeneous, which restrained the formation of emotional clusters in higher-level neurons, a pattern resembling what occurs in autism spectrum disorder.

The researchers describe their work in "Neural Network Modeling of Altered Facial Expression Recognition in Autism Spectrum Disorders based on Predictive Processing Framework," published in Scientific Reports.

"The study will help advance developing appropriate intervention methods for people who find it difficult to identify emotions," says Yuta Takahashi of the Department of Psychiatry at Tohoku University Hospital.

From News-Medical.net

 

Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA
