Human echolocation

People trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size.

[2] In earlier times, human echolocation was sometimes described as "facial vision" or "obstacle sense", as it was believed that the proximity of objects caused pressure changes on the skin.

[3][4][5] Only in the 1940s did a series of experiments performed in the Cornell Psychological Laboratory show that sound and hearing, rather than pressure changes on the skin, were the mechanisms driving this ability.

However, with training, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.

[9] John Levack Drever refers to echolocation in humans as an example of panacusi loci,[10] spatial hearing that exceeds the prescribed normative mode.

Both the visual and auditory systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that their sense organs receive.

[12] A blind traveler using echoes can perceive very complex, detailed, and specific features of the world from distances far beyond the reach of the longest cane or arm.

Echoes can make information available about the nature and arrangement of objects and environmental features such as overhangs, walls, doorways and recesses, poles, ascending curbs and steps, planter boxes, pedestrians, fire hydrants, parked or moving vehicles, trees and other foliage, and much more.

Echoes can give detailed information about location (where objects are), dimension (how big they are and their general shape), and density (how solid they are).
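The distance component of this information comes largely from the delay between the outgoing sound and the returning echo. The following sketch is illustrative only and is not drawn from any cited study; the function name and the figure of roughly 343 m/s for the speed of sound in air are assumptions made for the example:

```python
# Illustrative sketch: estimating how far away a reflecting object is from the
# delay between an emitted click and its returning echo.
# Assumes sound travels about 343 m/s in dry air at 20 degrees Celsius; the echo
# travels to the object and back, hence the division by two.

SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo_delay(delay_seconds: float) -> float:
    """Estimated distance in metres to the object that produced the echo."""
    return SPEED_OF_SOUND_M_PER_S * delay_seconds / 2.0

# An echo arriving ~6 ms after the click corresponds to an object roughly
# one metre away; ~60 ms corresponds to roughly ten metres.
print(distance_from_echo_delay(0.006))  # ~1.03 m
print(distance_from_echo_delay(0.060))  # ~10.3 m
```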

[13] Some blind people are skilled at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes.

In a 2014 study by Thaler and colleagues,[16] the researchers first made recordings of the clicks and their very faint echoes using tiny microphones placed in the ears of the blind echolocators as they stood outside and tried to identify different objects such as a car, a flag pole, and a tree.

The researchers then played the recorded sounds back to the echolocators while their brain activity was being measured using functional magnetic resonance imaging.

Importantly, when the same experiment was carried out with sighted people who did not echolocate, these individuals could not perceive the objects and there was no echo-related activity anywhere in the brain.

[17] Daniel Kish leads blind teenagers hiking and mountain-biking through the wilderness, and teaches them how to navigate new locations safely with a technique that he calls "FlashSonar".

He learned to make palatal clicks with his tongue when he was still a child, and now trains other blind people in the use of echolocation and in what he calls "Perceptual Mobility".

[7] The researchers were aware of the Wiederorientierung ("reorientation") phenomenon described by Griffin,[6] in which bats, despite continuing to emit echolocation calls, use path integration in familiar acoustic space.

By age seven, Lucas was proficient enough to accurately tell not only the distance of objects but also their material, and he could join other children in activities such as rock climbing and basketball.

[33] The scientist Kevin Warwick experimented with feeding ultrasonic pulses into the brain (via electrical stimulation from a neural implant) as an additional sensory input.

[citation needed] The 2017 video game Perception places the player in the role of a blind woman who must use echolocation to navigate the environment.

Echo-related activity in the brain of an early-blind, trained echolocator is shown on the left; no activity is evident in the brain of a sighted person without such training (shown on the right) listening to the same echoes.