Machines that learn language more like kids do
Computer model could improve human-machine interaction, provide insight into how children learn language.
Technique from MIT could lead to tiny, self-powered devices for environmental, industrial, or medical monitoring.
In simulations, robots move through new environments by exploring, observing, and drawing from learned experiences.
Program users can tinker with landing and path planning scenarios to identify optimal landing sites for Mars rovers.
Machine learning system efficiently recognizes activities by observing how objects change in only a few key frames.
Breakthrough CSAIL system suggests robots could one day be able to see well enough to be useful in people’s homes and offices.
Alex Hattori, a senior in MechE and six-time national yo-yo champion, explores yo-yos and robotics in and out of the classroom.
AeroAstro grad students win multi-university challenge by demonstrating the utility of machine vision in a complex system.
Visiting students learn what it takes to be an engineer — and a bit more about themselves — at the Edgerton Center’s annual Engineering Design Workshop.
Personalized machine-learning models capture subtle variations in facial expressions to better gauge how we feel.
Made of electronic circuits coupled to minute particles, the devices could flow through intestines or pipelines to detect problems.
Improved design may be used for exploring disaster zones and other dangerous or inaccessible environments.
Spyce, a robot-assisted restaurant in Boston, was founded to answer a common MIT student desire: good, low-cost food.
Machine learning network offers personalized estimates of children’s behavior.
Low-power design will allow devices as small as a honeybee to determine their location while flying.