Face-specific brain area responds to faces even in people born blind
Study finds that the fusiform face area is active when blind people touch 3D models of faces.
An artificial intelligence tool lets users edit generative adversarial network models with simple copy-and-paste commands.
Through innovation in software and hardware, researchers move to reduce the financial and environmental costs of modern artificial intelligence.
Recent advances give theoretical insight into why deep learning networks are successful.
A global team of researchers searches for insights during a weeklong virtual “datathon.”
Music gesture artificial intelligence tool developed at the MIT-IBM Watson AI Lab uses body movements to isolate the sounds of individual instruments.
Researchers capture our shifting gaze in a model that suggests how to prioritize visual information based on viewing duration.
UROP students explore applications in robotics, health care, language understanding, and nuclear engineering.
The MIT-IBM Watson AI Lab is funding 10 research projects aimed at addressing the health and economic consequences of the pandemic.
Researchers test how far artificial intelligence models can go in dreaming up varied poses and colors of objects and animals in photos.
Researchers unveil a pruning algorithm to make artificial intelligence applications run faster.
Researchers show that computers can “write” algorithms that adapt to radically different environments better than algorithms designed by humans.
Translated into sound, SARS-CoV-2 tricks our ear in the same way the virus tricks our cells.
With help from artificial intelligence, researchers identify hidden power of vitamin A and ordinary chewing gum glaze.
Ten staff members recognized for dedication to School of Science and to MIT.