
Some people who are blind can echolocate like bats, making clicks with their mouths that help them understand the environment around them. Now researchers are beginning to understand how this works, which could one day help other people with vision loss learn the technique.
While many people who are blind get information from ambient echoes, only a few make noises themselves to echolocate. Some, such as Daniel Kish (pictured), are so proficient they can draw a sketch of a room after clicking their way around it, or even go mountain biking along unfamiliar routes.

Previous research revealed that this human echolocation involves some brain areas that are used for vision in sighted people. Kish, who was blind almost from birth, thinks he experiences the sensations as something akin to images. “It’s not computational. There’s a real palpable experience of the image as a spatial representation – here are walls, here are the corners, here is the presence of objects.”
In the latest study, Lore Thaler of Durham University, UK, and her team carried out the first in-depth acoustic analysis of the mouth clicks. They worked with Kish and two other blind echolocators from the Netherlands and Austria.
Focused cone of sound
The clicks took the form of highly focused sound waves emitted in a 60-degree cone, compared with 120 to 180 degrees for typical speech. The echolocators had unknowingly worked out how to “point” their clicks towards the space they were sensing, says Thaler.
Ranging in frequency from 2 to 4 kilohertz, the clicks are higher in pitch than speech, perhaps because higher frequencies help keep the cone of sound tightly focused. The clicks were also very brief, lasting only 3 milliseconds, which might help avoid the emitted sound overlapping with its echoes, says Thaler.
Using these findings, the team have generated synthetic clicks that they will deploy in future research. “We can use the computer to click at an object thousands of times and work out how to determine the shape,” says Thaler. “You can’t ask a person to click at something thousands of times.”
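The team's synthetic clicks are derived from their acoustic analysis, but the reported parameters give a feel for what such a signal looks like. As a rough illustration only, a toy click at roughly 3 kilohertz lasting about 3 milliseconds could be generated like this in Python; the damped-sinusoid shape and decay constant are assumptions made for the sketch, not the model used in the study.

```python
# Toy synthetic "mouth click": a damped sinusoid with the rough
# parameters reported in the study (~3 kHz centre frequency, ~3 ms long).
# The envelope shape and decay time are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 48_000        # samples per second
DURATION_S = 0.003          # ~3 ms, as reported for the mouth clicks
CENTRE_FREQ_HZ = 3_000      # within the reported 2-4 kHz range

t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE
envelope = np.exp(-t / 0.0005)                       # fast decay (assumed)
click = envelope * np.sin(2 * np.pi * CENTRE_FREQ_HZ * t)
click /= np.abs(click).max()                         # normalise to [-1, 1]
```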
“This provides a good starting point for telling us more about what’s happening with these clicks,” says Andrew Kolarik of the School of Advanced Study, University of London. “There’s a fair bit of debate about what makes for the best echolocation signal. Now we need to find out exactly what kind of click is best in different situations.”
A better understanding of how Kish and his peers echolocate may help with teaching the technique to other people with vision loss. The work may also lead to better forms of artificial sonar that could be used for self-driving cars, says Thaler. “Current sonar systems can tell how far away something is and how big it is, but they may not know what it is. Human echolocators can sometimes do better.”
Journal reference: PLoS Computational Biology, DOI: 10.1371/journal.pcbi.1005670