Topic

Assistive technology


WCVB

WCVB-TV’s Mike Wankum visits the Media Lab to learn more about a new wearable device that allows users to communicate with a computer without speaking by measuring tiny electrical impulses sent by the brain to the jaw and face. Graduate student Arnav Kapur explains that the device is aimed at exploring “how do we marry AI and human intelligence in a way that’s symbiotic.”

Fast Company

Fast Company reporter Eillie Anzilotti highlights how MIT researchers have developed an AI-enabled headset device that can translate silent thoughts into speech. Anzilotti explains that one of the factors that is motivating graduate student Arnav Kapur to develop the device is “to return control and ease of verbal communication to people who struggle with it.”

Quartz

Quartz reporter Anne Quito spotlights how graduate student Arnav Kapur has developed a wearable device that allows users to access the internet without speech or text and could help people who have lost the ability to speak vocalize their thoughts. Kapur explains that the device is aimed at augmenting ability.

Axios

Axios reporter Ina Fried spotlights how graduate student Arnav Kapur has developed a system that can detect speech signals. “The technology could allow those who have lost the ability to speak to regain a voice while also opening up possibilities of new interfaces for general purpose computing,” Fried explains.

Fast Company

Fast Company reporter Adele Peters writes about Tarjimly, a non-profit MIT startup that connects refugees with a large network of volunteer language translators. The platform “has more than 8,000 translators who speak more than 90 languages, and can be used in nearly any situation where someone trying to help can’t communicate with someone in need,” Peters explains.

CBS Boston

CBS Boston reporter Dr. Mallika Marshall spotlights work by researchers at MIT and Brigham and Women’s Hospital to develop robotic prosthetic limbs controlled by the brain. “It’s a wonderful experience as a researcher,” explains Prof. Hugh Herr of the work’s impact. “They walk away and start crying or laughing and giggling and say, ‘my gosh, I have my body back, I have my leg back, I have my life back.’”

The Wall Street Journal

In an article for The Wall Street Journal, Joseph Coughlin, director of the AgeLab, and research associate Lucas Yoquinto write that companies are increasingly designing aesthetically pleasing and user-friendly technology for the elderly. “As the focus on older consumers’ preferences goes beyond the development of better products to the creation of new product categories, the experience of later life may improve substantially,” they explain.

CNN

MIT Media Lab fellow and “bionic artist” Viktoria Modesta writes for a special CNN feature about how technology is changing what it means to be human. Modesta, who chose to have her leg amputated due to constant pain, writes that “fusing my body with technology feels like a philosophical exploration of humanity. It is art.”

Scientific American

MIT researchers have developed a new prosthetic device that allows amputees to feel where their limbs are located, reports Simon Makin for Scientific American. “What’s new here is the ability to provide feedback the brain knows how to interpret as sensations of position, speed and force,” explains postdoctoral associate Tyler Clites.

United Press International (UPI)

MIT researchers have developed a surgical technique that allows the central nervous system to send movement commands to a robotic prosthesis, writes Allen Cone for United Press International. Cone explains that the new technique allows for “more stable and efficient” control over the movement of the prosthetic device.

STAT

STAT reporters Gideon Gil and Matthew Orr describe a “pioneering” surgical technique from researchers at MIT and Brigham and Women’s Hospital that allows prosthetics to operate like human limbs. Prof. Hugh Herr, “himself a rock climber who lost both his legs to frostbite as a teen, describes his goal as nothing short of eliminating disability.”

The Daily Beast

In an essay for The Daily Beast, researchers at the MIT AgeLab explore the extent to which driving is a “secondary” activity when piloting a vehicle, and caution that automation on its own cannot protect drivers from distractions. “While these technologies can nudge us in a safer direction, the decision to practice safer phone habits ultimately lies in the hands of drivers,” they write.

Co.Design

After several years of experimentation, graduate student Arnav Kapur developed AlterEgo, a device to interpret subvocalization that can be used to control digital applications. Describing the implications as “exciting,” Katharine Schwab at Co.Design writes, “The technology would enable a new way of thinking about how we interact with computers, one that doesn’t require a screen but that still preserves the privacy of our thoughts.”

The Guardian

AlterEgo, a device developed by Media Lab graduate student Arnav Kapur, “can transcribe words that wearers verbalise internally but do not say out loud, using electrodes attached to the skin,” writes Samuel Gibbs of The Guardian. “Kapur and team are currently working on collecting data to improve recognition and widen the number of words AlterEgo can detect.”

Popular Science

Researchers at the Media Lab have developed a device, known as “AlterEgo,” which allows an individual to discreetly query the internet and control devices by using a headset “where a handful of electrodes pick up the minuscule electrical signals generated by the subtle internal muscle motions that occur when you silently talk to yourself,” writes Rob Verger for Popular Science.