
Topic: Brain and cognitive sciences


Scientific American

Scientific American reporter Dana G. Smith spotlights how Prof. Rebecca Saxe and her colleagues have found evidence that regions of the infant visual cortex show preferences for faces, bodies and scenes. “The big surprise of these results is that specialized area for seeing faces that some people speculated took years to develop: we see it in these babies who are, on average, five or six months old,” Saxe tells Smith.

Naked Scientists

Naked Scientists podcaster Verner Viisainen spotlights how MIT researchers studied vector-based navigation in humans. “What we discovered is actually that we don’t follow the shortest path but actually follow a different kind of optimization criteria which is based on angular deviation,” says Prof. Carlo Ratti.

Popular Science

Popular Science reporter Charlotte Hu writes that MIT researchers have simulated an environment in which socially aware robots are able to choose whether they want to help or hinder one another, as part of an effort to improve human-robot interactions. “If you look at the vast majority of what someone says during their day, it has to do with what other [people] want, what they think, getting what that person wants out of another [person],” explains research scientist Andrei Barbu. “And if you want to get to the point where you have a robot inside someone’s home, understanding social interactions is incredibly important.”

TechCrunch

MIT researchers have developed a new machine learning system that can help robots learn to perform certain social interactions, reports Brian Heater for TechCrunch. “Researchers conducted tests in a simulated environment, to develop what they deemed ‘realistic and predictable’ interactions between robots,” writes Heater. “In the simulation, one robot watches another perform a task, attempts to determine the goal and then either attempts to help or hamper it in that task.”

TechCrunch

TechCrunch writer Devin Coldewey reports on ReSkin, an AI project focused on developing a new electronic skin and fingertip meant to expand the sense of touch in robots. ReSkin is rooted in GelSight, a technology developed by MIT researchers that allows robots to gauge an object’s hardness.

Axios

Axios reporter Alison Snyder writes that a new study by MIT researchers demonstrates how AI algorithms could provide insight into the human brain’s processing abilities. “Predicting the next word someone might say — like AI algorithms now do when you search the internet or text a friend — may be a key part of the human brain's ability to process language,” writes Snyder.

Scientific American

Using an integrative modeling technique, MIT researchers compared dozens of machine learning algorithms to brain scans as part of an effort to better understand how the brain processes language. The findings suggest that “neural networks and computational science might, in fact, be critical tools in providing insight into the great mystery of how the brain processes information of all kinds,” writes Anna Blaustein for Scientific American.

National Public Radio (NPR)

Prof. Mark Bear speaks with NPR’s Jon Hamilton about how injecting tetrodotoxin, a paralyzing nerve toxin found in puffer fish, could allow the brain to rewire in a way that restores vision, helping adults with amblyopia, or “lazy eye.” Bear explains: “Unexpectedly, in many cases vision recovered in the amblyopic eye, showing that that plasticity could be restored even in the adult.”

NPR

NPR’s Jon Hamilton spotlights Prof. Li-Huei Tsai’s work developing a noninvasive technique that uses light and sound to boost gamma waves, potentially slowing the progression of Alzheimer’s disease. “This is completely noninvasive and could really change the way Alzheimer's disease is treated,” Tsai says.

Scientific American

Writing for Scientific American, Pamela Feliciano spotlights a study in which Prof. Pawan Sinha examined the predictive responses of people with autism. Sinha found that people with ASD responded very differently to a highly regular sequence of tones played on a metronome than people without ASD did: while people without ASD “habituate” to the sequence of regular tones, people with ASD do not acclimate to the sounds over time.

The Wall Street Journal

Wall Street Journal reporters Angus Loten and Kevin Hand spotlight how MIT researchers are developing robots with humanlike senses that will be able to assist with a range of tasks. GelSight, a technology developed by CSAIL researchers, outfits robot arms with a small gel pad that can be pressed into objects to sense their size and texture, while another team of researchers is “working to bridge the gap between touch and sight by training an AI system to predict what a seen object feels like and what a felt object looks like.”

New Scientist

In an interview with Clare Wilson of New Scientist, Prof. Ed Boyden, one of the pioneers of optogenetics, discusses how the technique was used to help partially restore vision for a blind patient. “It’s exciting to see the first publication on human optogenetics,” says Boyden.

New York Times

Prof. Ed Boyden speaks with New York Times reporter Carl Zimmer about how scientists were able to partially restore a patient’s vision using optogenetics. “So far, I’ve thought of optogenetics as a tool for scientists primarily, since it’s being used by thousands of people to study the brain,” says Boyden, who helped pioneer the field of optogenetics. “But if optogenetics proves itself in the clinic, that would be extremely exciting.”

Wired

Wired reporter Max Levy spotlights Profs. Emery Brown and Earl Miller’s research examining how neurons in the brain operate as “consciousness emerges and recedes—and how doctors could better control it.” Levy writes that “Miller and Brown's work could make anesthesia safer, by allowing anesthesiologists who use the EEG to more precisely control drug dosages for people who are unconscious.”

Inside Higher Ed

In an article for Inside Higher Ed, Joshua Kim writes that “Grasp: The Science Transforming How We Learn,” a book by Sanjay Sarma, MIT’s vice president for open learning, and research associate Luke Yoquinto is “an important contribution to the literature on learning science and higher education change.” Kim adds that “Grasp can provide the foundations of what learning science-informed teaching might look like, with some fantastic real-world examples of constructivist theory in pedagogical action.”