Lawrence Udeigwe uses elegant math to understand complex systems of the brain
Martin Luther King Jr. Scholar bridges disciplines to translate vision into elegant math and neuroscience theory.
Researchers find similarities between how some computer-vision systems process images and how humans see out of the corners of their eyes.
MIT neuroscientists have identified a population of neurons in the human brain that respond to singing but not other types of music.
Professor and cognitive neuroscientist recognized for groundbreaking work on the functional organization of the human brain.
MIT neuroscientists have developed a computer model that can answer that question as well as the human brain.
Computational modeling shows that both our ears and our environment influence how we hear.
Study suggests this area of the visual cortex emerges much earlier in development than previously thought.
A new machine-learning system helps robots understand and perform certain social interactions.
Neuroscientists find the internal workings of next-word prediction models resemble those of language-processing centers in the brain.
When asked to classify odors, artificial neural networks adopt a structure that closely resembles that of the brain’s olfactory circuitry.
We seem to be wired to calculate not the shortest path but the “pointiest” one, facing us toward our destination as much as possible.
Brain and cognitive sciences professor will lead the Institute’s interdisciplinary initiative to advance research in natural and artificial intelligence.
EECS faculty head of artificial intelligence and decision making honored for significant and extended contributions to the field of AI.
Adding a module that mimics part of the brain can prevent common errors made by computer vision models.
What's SSUP? The Sample, Simulate, Update cognitive model developed by MIT researchers learns to use tools like humans do.