Artificial neural networks model facial processing in autism
Many of us easily recognize emotions expressed in others' faces. A smile may mean happiness, while a frown may indicate anger. Autistic people often have a more difficult time with this task. It's unclear why. But new research, published June 15 in The Journal of Neuroscience, sheds light on the inner workings of the brain to suggest an answer. And it does so using a tool that opens new pathways for modeling the computations in our heads: artificial intelligence.
Researchers have primarily pointed to two brain areas where the differences might lie. A region on the side of the primate (including human) brain called the inferior temporal (IT) cortex contributes to facial recognition. Meanwhile, a deeper region called the amygdala receives input from the IT cortex and other sources and helps process emotions.
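To make the modeling idea concrete, the sketch below shows, in broad strokes, how studies in this vein often pair those two stages computationally: a pretrained artificial neural network stands in for the ventral visual stream ending in IT cortex, and a simple linear readout of its features stands in for a downstream emotion judgment. This is a minimal, hypothetical illustration, not the study's code; the choice of ResNet-50, the function name, and the happy-versus-fearful task are all assumptions made for the example.

```python
# A minimal, hypothetical sketch of the general modeling approach described
# above (not the study's actual code): an ImageNet-pretrained CNN stands in
# for IT cortex, and a linear readout of its features stands in for a
# downstream emotion judgment.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pretrained network and strip its classification head, so the
# penultimate-layer activations can serve as "IT-like" image features.
# (Assumes torchvision >= 0.13 for the weights API.)
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # network now outputs 2048-d feature vectors
backbone.eval()

# Standard ImageNet preprocessing for input face images.
preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def it_like_features(image_path: str) -> torch.Tensor:
    """Return the network's penultimate-layer response to one face image."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0)

# The linear readout plays the role of a downstream region (e.g., the
# amygdala) mapping IT-like features onto an emotion judgment, here a
# hypothetical happy-vs-fearful decision. In practice its weights would be
# fit to labeled face images, for example with logistic regression.
readout = torch.nn.Linear(2048, 1)
```

One appeal of this two-stage framing is that the sensory stage and the readout stage can be perturbed independently, letting modelers ask whether a behavioral difference is better explained by the visual features themselves or by how they are read out.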
Kohitij Kar, a research scientist in the lab of MIT Professor James DiCarlo, hoped to zero in on the answer. (DiCarlo, the Peter de Florez Professor in the Department of Brain and Cognitive Sciences, is also a member of the McGovern Institute for Brain Research and director of MIT's Quest for Intelligence.)