
Topic

Human-computer interaction

Popular Science

Popular Science reporter Andrew Paul writes that a study co-authored by Institute Prof. Daron Acemoglu examines the impact of automation on the workforce over the past four decades and finds that “‘so-so automation’ exacerbates wage gaps between white and blue collar workers more than almost any other factor.”

Politico

Prof. Cynthia Breazeal discusses her work exploring how artificial intelligence can help students impacted by Covid, including refugees or children with disabilities, reports Ryan Heath for Politico. “We want to be super clear on what the role is of the robot versus the community, of which this robot is a part of. That's part of the ethical design thinking,” says Breazeal. “We don't want to have the robot overstep its responsibilities. All of our data that we collect is protected and encrypted.”

TechCrunch

TechCrunch reporter Brian Heater spotlights new MIT robotics research, including a team of CSAIL researchers “working on a system that utilizes a robotic arm to help people get dressed.” Heater notes that the “issue is one of robotic vision — specifically finding a method to give the system a better view of the human arm it’s working to dress.”

The Economist

Prof. Julie Shah speaks with The Economist about her work developing systems to help robots operate safely and efficiently with humans. “Robots need to see us as more than just an obstacle to maneuver around,” says Shah. “They need to work with us and anticipate what we need.”

The Boston Globe

Assaf Biderman '05, associate director of the MIT SENSEable City Lab, discusses his startup Superpedestrian, a transportation robotics company that has developed electric scooters available in over 60 cities across the world. “I think we hit the holy grail of micromobility, which is detecting when you’re on the sidewalk every time and stopping or slowing the vehicle,” said Biderman.

TechCrunch

A new study by MIT researchers finds people are more likely to interact with a smart device if it demonstrates more humanlike attributes, reports Brian Heater for TechCrunch. The researchers found “users are more likely to engage with both the device — and each other — more when it exhibits some form of social cues,” writes Heater. “That can mean something as simple as the face/screen of the device rotating to meet the speaker’s gaze.”

Forbes

Forbes contributor Stephanie MacConnell spotlights the work of research affiliate Shriya Srinivasan PhD '20 in a roundup of women under the age of 30 who are transforming U.S. healthcare. Srinivasan is “working on technology that will enable patients to control and even ‘feel’ sensation through their prosthetic limb,” notes MacConnell.

TechCrunch

Ikigai, an MIT startup, is building automated workflows in which human decision making remains part of the process, reports Ron Miller for TechCrunch. “What we saw is that there are use cases… [that involve] manual processes in the organizations that were extremely difficult to automate because a fundamental step involved humans making judgements or decisions with data, and where both the data and rules they’re operating on would change very often,” co-founder and CEO Vinayak Ramesh M.Eng '18, '12 tells Miller.

Popular Science

Popular Science reporter Charlotte Hu writes that MIT researchers have simulated an environment in which socially aware robots can choose whether to help or hinder one another, as part of an effort to improve human-robot interactions. “If you look at the vast majority of what someone says during their day, it has to do with what other [people] want, what they think, getting what that person wants out of another [person],” explains research scientist Andrei Barbu. “And if you want to get to the point where you have a robot inside someone’s home, understanding social interactions is incredibly important.”

TechCrunch

MIT researchers have developed a new machine learning system that can help robots learn to perform certain social interactions, reports Brian Heater for TechCrunch. “Researchers conducted tests in a simulated environment, to develop what they deemed ‘realistic and predictable’ interactions between robots,” writes Heater. “In the simulation, one robot watches another perform a task, attempts to determine the goal and then either attempts to help or hamper it in that task.”

Mashable

MIT researchers are using magnets to help improve control of prosthetic limbs, reports Emmett Smith for Mashable. “The researchers inserted magnetic beads into muscle tissue to track the specific movements of each muscle,” reports Smith. “That information is then transferred to the bionic limb, giving the users direct control over it.”

The Washington Post

Prof. Daron Acemoglu makes the case in a piece for The Washington Post that there should be oversight of how AI is applied, arguing that current AI technologies are already having tangible impacts on the labor market, the criminal justice system and on democratic discourse and politics. “Will AI be allowed to work increasingly to displace and monitor humans, or steered toward complementing and augmenting human capabilities,” Acemoglu writes, “creating new opportunities for workers?”

Mashable

In this video, Mashable spotlights how MIT researchers have developed a new system that can 3-D print objects without human intervention. “The system works thanks to a software toolkit that lets you design custom blueprints,” Mashable explains.

TechCrunch

CSAIL researchers have developed a new system, dubbed LaserFactory, that can print custom devices and robots without human intervention, reports Brian Heater for TechCrunch. “The system is comprised of a software kit and hardware platform designed to create structures and assemble circuitry and sensors for the machine,” Heater writes.

The Boston Globe

Boston Globe reporter Hiawatha Bray spotlights Pison Technology, an MIT startup that has developed a new gesture control system that can be used to manipulate “digital devices by intercepting the electronic traffic between our hands and our brains, and translating them into commands the machines can understand.”