Artificial neural networks model face processing in autism
A new computational model could explain differences in recognizing facial emotions.
Postbac Jessica Chomik-Morales hopes to inspire the next generation of Spanish-speaking scientists with her podcast, “Mi Última Neurona.”
Martin Luther King Jr. Scholar bridges disciplines to translate vision into elegant math and neuroscience theory.
Researchers find similarities between how some computer-vision systems process images and how humans see out of the corners of our eyes.
MIT neuroscientists have identified a population of neurons in the human brain that respond to singing but not other types of music.
Professor and cognitive neuroscientist recognized for groundbreaking work on the functional organization of the human brain.
MIT neuroscientists have developed a computer model that can determine where a sound came from as well as the human brain can.
Computational modeling shows that both our ears and our environment influence how we hear.
Study suggests this area of the visual cortex emerges much earlier in development than previously thought.
A new machine-learning system helps robots understand and perform certain social interactions.
Neuroscientists find the internal workings of next-word prediction models resemble those of language-processing centers in the brain.
When asked to classify odors, artificial neural networks adopt a structure that closely resembles that of the brain’s olfactory circuitry.
We seem to be wired to calculate not the shortest path but the “pointiest” one, facing us toward our destination as much as possible.
Brain and cognitive sciences professor will lead the Institute’s interdisciplinary initiative to advance research in natural and artificial intelligence.
EECS faculty head of artificial intelligence and decision making honored for significant and extended contributions to the field of AI.