Toward a machine learning model that can reason about everyday actions
Researchers train a model to reach human-level performance at recognizing abstract concepts in video.
Study finds that the fusiform face area is active when blind people touch 3D models of faces.
An artificial intelligence tool lets users edit generative adversarial network models with simple copy-and-paste commands.
Through innovation in software and hardware, researchers move to reduce the financial and environmental costs of modern artificial intelligence.
Recent advances give theoretical insight into why deep learning networks are successful.
A global team of researchers searches for insights during a weeklong virtual “datathon.”
A music-gesture artificial intelligence tool developed at the MIT-IBM Watson AI Lab uses body movements to isolate the sounds of individual instruments.
Ion-based technology may enable energy-efficient simulations of the brain’s learning process for neural network AI systems.
Researchers capture our shifting gaze in a model that suggests how to prioritize visual information based on viewing duration.
UROP students explore applications in robotics, health care, language understanding, and nuclear engineering.
The MIT-IBM Watson AI Lab is funding 10 research projects aimed at addressing the health and economic consequences of the pandemic.
Researchers test how far artificial intelligence models can go in dreaming up varied poses and colors of objects and animals in photos.
Researchers unveil a pruning algorithm to make artificial intelligence applications run faster.
Researchers show that computers can “write” algorithms that adapt to radically different environments better than algorithms designed by humans.
Translated into sound, SARS-CoV-2 tricks our ear in the same way the virus tricks our cells.