Think of it as tactile Morse code: vibrations from a wearable, GPS-linked device that tell you to turn right or left, or stop, depending on the pattern of pulses you feel. Such a device could free drivers from having to look at maps, and could also serve as a tactile guide for the visually and hearing impaired.
Lynette Jones, a senior research scientist in MIT’s Department of Mechanical Engineering, designs wearable tactile displays. Through her work, she’s observed that the skin is a sensitive — though largely untapped — medium for communication.
“If you compare the skin to the retina, you have about the same number of sensory receptors, you just have them over almost two square meters of space, unlike the eye where it’s all concentrated in an extremely small area,” Jones says. “The skin is generally as useful as a very acute area. It’s just that you need to disperse the information that you’re presenting.”
Knowing just how to disperse tactile information across the skin is tricky. For instance, people may be much more sensitive to stimuli on the hand than on the forearm, and may respond best to certain patterns of vibrations. Such information on skin responsiveness could help designers determine the best configuration of motors in a display, given where on the skin a device would be worn.
Now Jones has built an array that precisely tracks a motor’s vibrations through skin in three dimensions. The array consists of eight miniature accelerometers and a single pancake motor — a type of vibrating motor used in cellphones. She used the array to measure motor vibrations in three locations: the palm of the hand, the forearm and the thigh. From her studies with eight healthy participants, Jones found that a motor’s mechanical vibrations through skin drop off quickly in all three locations, within 8 millimeters of where the vibrations originated.
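To make that measurement concrete, here is a minimal sketch, not drawn from Jones’s actual analysis, of how a decay distance might be estimated from a ring of three-axis accelerometer readings; the offsets, synthetic signals and 10 percent cutoff are all illustrative assumptions.

```python
# Hypothetical sketch: estimate how quickly a motor's vibration decays across skin
# from readings taken by a small ring of 3-axis accelerometers at known offsets.
import numpy as np

def rms_magnitude(samples):
    """Root-mean-square magnitude of an (N, 3) array of x/y/z acceleration samples."""
    return np.sqrt(np.mean(np.sum(samples ** 2, axis=1)))

def decay_distance(readings_by_distance, cutoff=0.10):
    """First distance (mm) at which vibration amplitude falls below `cutoff` times
    the amplitude measured right next to the motor, or None if it never does."""
    distances = sorted(readings_by_distance)
    reference = rms_magnitude(readings_by_distance[distances[0]])
    for d in distances[1:]:
        if rms_magnitude(readings_by_distance[d]) < cutoff * reference:
            return d
    return None

# Synthetic example: acceleration amplitude shrinking with distance from the motor.
rng = np.random.default_rng(0)
readings = {d: rng.normal(scale=np.exp(-d / 3.0), size=(1000, 3))
            for d in (0, 4, 8, 12, 16)}  # accelerometer offsets in millimeters
print(decay_distance(readings))  # prints 8 for this assumed decay rate
```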
Jones also gauged participants’ perception of vibrations, fitting them with a 3-by-3 array of pancake motors at the same three locations on the body. While skin generally stopped vibrating 8 millimeters from the source, most people continued to perceive the vibrations as far away as 24 millimeters.
When participants were asked to identify specific locations of motors within the array, they were much more sensitive on the palm than on the forearm or thigh. But in all three locations, people were better at picking out vibrations in the four corners of the array than from the inner motors, leading Jones to posit that people may use the edges of their limbs to localize vibrations and other stimuli.
“For a lot of sensory modalities, you have to work out what it is people can process, as one of the dictates for how you design,” says Jones, whose results will appear in the journal IEEE Transactions on Haptics. “There’s no point in making things much more compact, which may be a desirable feature from an engineering point of view, but from a human-use point of view, doesn’t make a difference.”
Mapping good vibrations
In addition to measuring skin’s sensitivity to vibrations, Jones and co-author Katherine Sofia ’12 found that skin has a strong effect on motor vibrations. The researchers compared a pancake motor’s vibration frequency when mounted on a rigid structure with its frequency when mounted on more compliant skin. They found that, in general, skin reduced a motor’s vibrations by 28 percent, with the forearm and thigh having a slightly stronger damping effect than the palm of the hand.
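For illustration only, that comparison might look something like the sketch below, which estimates a motor’s dominant vibration frequency on a rigid mount and on skin and reports the percentage drop; the signals and sampling rate are synthetic assumptions, with the two tones chosen so the output mirrors the 28 percent figure reported above.

```python
# Hypothetical comparison of a motor's dominant vibration frequency on a rigid
# mount versus on skin, using synthetic accelerometer traces.
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Frequency (Hz) of the largest non-DC peak in the signal's spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin

sample_rate = 2000  # Hz, an assumed accelerometer sampling rate
t = np.arange(0, 1.0, 1.0 / sample_rate)
on_rigid_mount = np.sin(2 * np.pi * 250 * t)  # e.g. motor runs at 250 Hz when rigidly mounted
on_skin = np.sin(2 * np.pi * 180 * t)         # compliant skin slows the same motor down

f_rigid = dominant_frequency(on_rigid_mount, sample_rate)
f_skin = dominant_frequency(on_skin, sample_rate)
print(f"reduction: {100 * (1 - f_skin / f_rigid):.0f}%")  # prints 28% for these assumed tones
```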
The skin’s damping of motor vibrations is significant, Jones says, if engineers plan to build tactile displays that incorporate different frequencies of vibrations. For instance, the difference between two motors — one slightly faster than the other — may be indistinguishable in certain parts of the skin. Likewise, two motors spaced a certain distance apart may be differentiable in one area but not another.
“Should I have eight motors, or is four enough that 90 percent of the time, I’ll know that when this one’s on, it’s this one and not that one?” Jones says. “We’re answering those sorts of questions in the context of what information you want to present using a device.”
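One rough way to frame that trade-off, assuming neighboring motors need some minimum center-to-center spacing before wearers can reliably tell them apart, is a simple capacity calculation like the sketch below; the 24-millimeter spacing echoes the perception result above, and the body-site lengths are made-up examples.

```python
# Back-of-the-envelope sketch of the design question above: how many motors can
# usefully fit along a strip of skin if neighbors must sit at least `min_spacing_mm`
# apart to be distinguished? Spacing and site lengths are illustrative assumptions.

def max_distinguishable_motors(site_length_mm, min_spacing_mm=24):
    """Largest number of evenly spaced motors whose neighbors stay at least
    min_spacing_mm apart along a strip site_length_mm long."""
    return int(site_length_mm // min_spacing_mm) + 1

for site, length_mm in [("forearm strip", 200), ("wristband", 150), ("palm", 80)]:
    print(site, max_distinguishable_motors(length_mm))
# forearm strip 9, wristband 7, palm 4
```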
Roberta Klatzky, a professor of psychology at Carnegie Mellon University, says that measurements taken by Jones’ arrays can be used to set up displays in which the location of a stimulus — for example, a pattern to convey a letter — is important.
“A major challenge is to enable people to tell the difference between patterns applied to the skin as, for example, blind people do when reading Braille,” says Klatzky, who specializes in the study of spatial cognition. “Lynette’s work sets up a methodology and potential guidelines for effective pattern displays.”
Creating a buzz
Jones sees promising applications for wearable tactile displays. In addition to helping drivers navigate, she says tactile stimuli may direct firefighters through burning buildings, or emergency workers through disaster sites. In more mundane scenarios, she says tactile displays may help joggers traverse an unfamiliar city, taking directions from a buzzing wristband, instead of having to look at a smartphone.
Using data from their mechanical and perceptual experiments, Jones’ group is designing arrays that can be worn across the back and around the wrist, and is investigating various ways to present vibrations. For example, a row of vibrations activated sequentially from left to right may tell a driver to turn right; a single motor that buzzes with increasing frequency may be a warning to slow down.
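As a purely illustrative sketch (not the group’s actual design), those two cues could be encoded as short schedules of motor index, start time and vibration frequency for a single row of motors worn on the wrist; all of the numbers below are assumptions.

```python
# Hypothetical encoding of the navigation cues described above for a one-row wrist
# array: each cue is a list of (motor index, start time in seconds, frequency in Hz).
# Motor counts, timings, and frequencies are all illustrative assumptions.

def turn_cue(direction, n_motors=4, step_s=0.15, freq_hz=200):
    """Sweep across the row: left to right for a right turn, right to left for a left turn."""
    order = range(n_motors) if direction == "right" else reversed(range(n_motors))
    return [(motor, round(i * step_s, 2), freq_hz) for i, motor in enumerate(order)]

def slow_down_cue(motor=0, pulses=4, step_s=0.3, start_hz=100, ramp_hz=50):
    """One motor pulsing with steadily increasing frequency as a slow-down warning."""
    return [(motor, round(i * step_s, 2), start_hz + i * ramp_hz) for i in range(pulses)]

print(turn_cue("right"))  # [(0, 0.0, 200), (1, 0.15, 200), (2, 0.3, 200), (3, 0.45, 200)]
print(slow_down_cue())    # [(0, 0.0, 100), (0, 0.3, 150), (0, 0.6, 200), (0, 0.9, 250)]
```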
“There’s a lot of things you can do with these displays that are fairly intuitive in terms of how people respond,” Jones says, “which is important because no one’s going to spend hours and hours in any application, learning what a signal means.”