A new computational model developed by MIT neuroscientists explains how the brain maintains the balance between plasticity and stability, and how it can learn very similar tasks without interference between them.
The key, the researchers say, is that neurons are constantly changing their connections with other neurons. However, not all of the changes are functionally relevant — they simply allow the brain to explore many possible ways to execute a certain skill, such as a new tennis stroke.
“Your brain is always trying to find the configurations that balance everything so you can do two tasks, or three tasks, or however many you’re learning,” says Robert Ajemian, a research scientist in MIT’s McGovern Institute for Brain Research and lead author of a paper describing the findings in the Proceedings of the National Academy of Sciences the week of Dec. 9. “There are many ways to solve a task, and you’re exploring all the different ways.”
As the brain explores different solutions, neurons can become specialized for specific tasks, according to this theory.
Noisy circuits
As the brain learns a new motor skill, neurons form circuits that can produce the desired output — a command that will activate the body’s muscles to perform a task such as swinging a tennis racket. Perfection is usually not achieved on the first try, so feedback from each effort helps the brain to find better solutions.
This works well for learning one skill, but complications arise when the brain is trying to learn many different skills at once. Because the same distributed network controls related motor tasks, new modifications to existing patterns can interfere with previously learned skills.
“This is particularly tricky when you’re learning very similar things,” such as two different tennis strokes, says Institute Professor Emilio Bizzi, the paper’s senior author and a member of the McGovern Institute.
In a serial network such as a computer chip, this would be no problem — instructions for each task would be stored in a different location on the chip. However, the brain is not organized like a computer chip. Instead, it is massively parallel and highly connected — each neuron connects to, on average, about 10,000 other neurons.
That connectivity offers an advantage, however, because it allows the brain to test out many possible solutions for achieving combinations of tasks. The constant changes in these connections, which the researchers call hyperplasticity, are balanced by another inherent trait of neurons: they have a very low signal-to-noise ratio, meaning that they receive about as much useless information as useful input from their neighbors.
Most models of neural activity don’t include noise, but the MIT team says noise is a critical element of the brain’s learning ability. “Most people don’t want to deal with noise because it’s a nuisance,” Ajemian says. “We set out to try to determine if noise can be used in a beneficial way, and we found that it allows the brain to explore many solutions, but it can only be utilized if the network is hyperplastic.”
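A rough sense of that idea can be sketched in a few lines of Python. The toy update rule below is a deliberately simplified illustration, not the model described in the paper: a single linear unit learns to produce a target output while every synapse also changes randomly on every trial. The error-driven part of each change is small compared with the noise, but it is the only part that points in a consistent direction, so performance improves even as the weights keep wandering. All sizes, rates, and variable names here are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "motor network": a single output driven by many inputs.
# Everything here (sizes, rates, the update rule itself) is an illustrative
# assumption, not the model published in the paper.
n_synapses = 100
w = np.zeros(n_synapses)                 # synaptic weights
x = rng.normal(size=n_synapses)          # fixed input pattern for one skill
x /= np.linalg.norm(x)
target = 1.0                             # desired motor command

lr = 0.1       # small, systematic, error-driven component of each weight change
noise = 0.02   # per-synapse random change, much larger than the signal per trial

w_start = w.copy()
errors = []
for trial in range(5000):
    error = target - w @ x
    errors.append(abs(error))
    # Every synapse changes on every trial ("hyperplasticity"), but on any one
    # trial the change is dominated by noise; only the error-driven part is
    # consistent from trial to trial, so it is what accumulates.
    w += lr * error * x + noise * rng.normal(size=n_synapses)

print(f"mean |error|, first 50 trials: {np.mean(errors[:50]):.2f}")
print(f"mean |error|, last 50 trials:  {np.mean(errors[-50:]):.2f}")
print(f"how far the weights wandered:  {np.linalg.norm(w - w_start):.1f}")
```

In this linear toy, most of that wandering lies along directions that leave the output for this input unchanged, which is one way to picture a network exploring the many different weight configurations that solve the same task.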
This model helps to explain how the brain can learn new things without unlearning previously acquired skills, says Ferdinando Mussa-Ivaldi, a professor of physiology at Northwestern University.
“What the paper shows is that, counterintuitively, if you have neural networks and they have a high level of random noise, that actually helps instead of hindering the stability problem,” says Mussa-Ivaldi, who was not part of the research team.
Without noise, the brain’s hyperplasticity would overwrite existing memories too easily. Conversely, low plasticity would not allow any new skills to be learned, because the tiny changes in connectivity would be drowned out by all of the inherent noise.
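A back-of-the-envelope calculation (an illustration of the tradeoff, not the paper’s analysis) shows why the ratio of the error-driven signal to the noise is what matters. A consistent change, however small, accumulates linearly with the number of practice trials, while unbiased random changes accumulate only as the square root of that number; with enough plasticity and enough practice the signal eventually rises above the noise, but if the plasticity is too low it never does within a realistic amount of practice. The numbers below are made up for illustration.

```python
import math

# Hypothetical per-trial changes at a single synapse.
d = 0.001      # consistent, error-driven change per practice trial
sigma = 0.02   # random change per trial (20x larger than the signal)

for trials in (1, 100, 10_000):
    signal = d * trials                  # consistent changes add up linearly
    drift = sigma * math.sqrt(trials)    # random changes grow only like sqrt(T)
    print(f"{trials:>6} trials: signal = {signal:.3f}, "
          f"random drift ~ {drift:.2f}, ratio = {signal / drift:.2f}")
```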
The model is supported by anatomical evidence showing that neurons exhibit a great deal of plasticity even when learning is not taking place, as measured by the growth of new connections on dendrites, the tiny extensions that neurons use to communicate with each other.
Like riding a bike
The constantly changing connections explain why skills can be forgotten unless they are practiced often, especially if they overlap with other routinely performed tasks.
“That’s why an expert tennis player has to warm up for an hour before a match,” Ajemian says. The warm-up is not for the muscles; rather, the player needs to recalibrate the neural networks that control the different tennis strokes stored in the brain’s motor cortex.
However, skills such as riding a bicycle, which is not very similar to other common skills, are retained more easily. “Once you’ve learned something, if it doesn’t overlap or intersect with other skills, you will forget it but so slowly that it’s essentially permanent,” Ajemian says.
The researchers are now investigating whether this type of model could also explain how the brain forms memories of events, as well as motor skills.
The research was funded by the National Science Foundation.