Bats and marine mammals have acoustic navigation down, but humans are learning quickly too.
Bats and dolphins are famous for their ability to orient themselves by bouncing sound waves off objects in their environment, an adaptation called echolocation. But some visually impaired people have also learned to navigate the world this way. One notable example is Daniel Kish, who lost his eyes to cancer as an infant and has since become the world’s most visible advocate of human echolocation.
Kish, who has embraced his reputation as “a real-life Batman,” has trained hundreds of students to use clicks or other noises to develop a sonar-like acoustic sight.
Meanwhile, neuroscientists have amassed a fascinating body of work indicating that these sound-based snapshots are processed within the visual cortex. When one sensory organ closes, it seems, other pathways open to compensate.
“[Echolocation] really is a natural experiment that allows you to understand plasticity firsthand,” Melvyn Goodale, director of the Brain and Mind Institute at the University of Western Ontario, told me over the phone. Though he said this field is in its “early days,” it does seem that the earlier in life a person loses their sight, the more likely they are to recruit the visual regions of the brain to interpret auditory feedback instead.
On Sunday, further research into this phenomenon was presented by Bo Schenkman, an associate professor at the KTH Royal Institute of Technology in Sweden, at Acoustics ’17 Boston, a meeting of sound experts from around the world. In a talk called “Human echolocation in different situations and rooms,” Schenkman outlined experiments in which both sighted and blind people attempted echolocation in various environments, and with different sound cues.
Just as vision synthesizes properties such as brightness and color into an integrated scene, audio markers like pitch or loudness can be used to judge an object’s distance or material density. Schenkman and his colleagues have found that another, more nebulous factor—timbre, the perceived sound quality of a noise—also plays a role in orienting echolocators in space.
“By also taking account of timbre as a source of information, I believe that a more accurate description of causes for how blind people echolocate may be found,” Schenkman told me over email. “However, timbre is also more difficult to understand than pitch or loudness.”
Understanding the intricacies of this special sense could have numerous applications, such as the creation of devices and environments designed for echolocators. Sighted people could make use of these technologies and skill sets, too: Goodale suggested a scenario in which emergency workers could employ echolocation on the job to identify escape routes or trapped victims.
Moreover, human echolocators may be crucial to understanding the worldview of bats, dolphins, and other acoustically sensitive animals.
“We can report what [echolocating animals] do, but we can’t interrogate them about what their experience is like,” Goodale said. “[Human] echolocation offers up an opportunity to look at experiences of objects beyond the body that are being delivered up by a different system.”