[3] The first sensory substitution system was developed by Bach-y-Rita et al. as a demonstration of brain plasticity in congenitally blind individuals.
[8] In a normal visual system, the data collected by the retina are converted into electrical signals in the optic nerve and relayed to the brain, which reconstructs and perceives the image.
Touch-to-visual sensory substitution transfers information from touch receptors to the visual cortex for interpretation and perception.
In blind persons, although only tactile information is being received, the visual cortex is also activated as they perceive objects.
For example, in one experiment by Bach-y-Rita, touch perception was restored in a patient who had lost peripheral sensation due to leprosy.
Generally, a camera or a microphone is used to collect visual or auditory stimuli that are used to replace lost sight and hearing, respectively.
[6] This may suggest that blind people can use their occipital lobe, generally used for vision, to perceive objects through the use of other sensory modalities.
This cross-modal plasticity may explain the often-described tendency of blind people to show enhanced ability in their other senses.
Some of the most popular are probably Paul Bach-y-Rita's Tactile Vision Sensory Substitution (TVSS), developed with Carter Collins at the Smith-Kettlewell Institute, and Peter Meijer's Seeing with Sound approach (The vOICe).
For example, a leprosy patient, whose perception of peripheral touch was restored, was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated).
[29] Among these mechanoreceptors, the Pacinian corpuscle offers the highest sensitivity to high-frequency vibration, from a few tens of Hz to a few kHz, owing to its specialized mechanotransduction mechanism.
[37][38] Vibrotactile systems exploit the properties of mechanoreceptors in the skin, so they have fewer parameters to monitor than electrotactile stimulation.
Sensory substitution has also benefited from the emergence of wearable haptic actuators such as vibrotactile motors, solenoids, and Peltier diodes.
Alternatively, it has been shown that even very simple cues indicating the presence or absence of obstacles (delivered through small vibration modules located at strategic places on the body) can be useful for navigation, gait stabilization, and reduced anxiety when moving through an unknown space.
[50] Neuroscientist David Eagleman presented a new device for sound-to-touch hearing at TED in 2015;[51] his laboratory research then expanded into a company based in Palo Alto, California, called Neosensory.
They found that tactile stimulation of the fingers led to activation of the auditory belt area, which suggests a relationship between audition and tactition.
One promising[citation needed] invention is the 'Sense organs synthesizer',[56] which aims to deliver the normal nine-octave hearing range via 216 electrodes to sequential touch nerve zones next to the spine.
Some people with balance disorders or adverse reactions to antibiotics develop bilateral vestibular damage (BVD).
[10] After two days of training one of the leprosy subjects reported "the wonderful sensation of touching his wife, which he had been unable to experience for 20 years.
"[58] The development of new technologies has now made it plausible to provide patients with prosthetic arms with tactile and kinesthetic sensibilities.
[59] Other applications of sensory substitution systems can be seen in functional robotic prostheses for patients with high-level quadriplegia.
With auditory sensory substitution, visual or tactile sensors detect and store information about the external environment.
[61][62][63] This project, presented in 2015,[64] proposes a new versatile mobile device and a sonification method specifically designed for the pedestrian locomotion of the visually impaired.
The device consists of a miniature camera integrated into a glasses frame, connected to a battery-powered minicomputer worn around the neck on a strap.
As the patient moves around, the device captures visual frames at a high frequency and generates the corresponding complex sounds that allow recognition.
The frequency and the inter-aural disparity are determined by the center of gravity of the coordinates of the receptive field's pixels in the image (see Auvray M., Hanneton S., Lenay C., O'Regan K., "There is something out there: distal attribution in sensory substitution, twenty years later", Journal of Integrative Neuroscience 4 (2005) 505–521).
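The mapping described above can be sketched in a few lines. This is an illustrative assumption-laden sketch, not the actual device's algorithm: the function name, the frequency range, and the pitch convention (higher in the image means higher pitch) are all hypothetical; only the centroid-to-frequency-and-disparity idea comes from the text.

```python
def field_to_sound(pixels, width, height, f_min=200.0, f_max=2000.0):
    """Map a receptive field's pixel coordinates to (frequency_hz, pan).

    pixels: list of (x, y) image coordinates belonging to the field.
    pan: inter-aural disparity, -1.0 = fully left, +1.0 = fully right.
    f_min/f_max are assumed bounds, not taken from the source.
    """
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n  # centroid column
    cy = sum(y for _, y in pixels) / n  # centroid row
    # Assumed convention: smaller y (higher in the image) -> higher pitch.
    frequency_hz = f_min + (f_max - f_min) * (1.0 - cy / height)
    # Horizontal centroid position -> stereo pan (inter-aural disparity).
    pan = 2.0 * (cx / width) - 1.0
    return frequency_hz, pan

# Example: a field in the upper-left region of a 100x100 image maps to a
# fairly high pitch panned toward the left ear.
freq, pan = field_to_sound([(10, 10), (30, 10), (20, 30)], 100, 100)
```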
Zach Capalbo's Kromophone uses a basic color spectrum correlating to different sounds and timbres to give users perceptual information beyond the vOICe's capabilities.
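A color-to-sound mapping of this general kind can be sketched as follows. This is not the Kromophone's actual algorithm, which is not specified here; the per-channel frequencies and the loudness rule are invented for illustration only.

```python
# Assumed frequencies for three basic colour channels (hypothetical values).
CHANNEL_FREQS_HZ = {"red": 880.0, "green": 440.0, "blue": 220.0}

def colour_to_tones(r, g, b):
    """Return (frequency_hz, amplitude) pairs for an RGB pixel (0-255).

    Each colour channel contributes a tone whose loudness scales with
    that channel's intensity, so a colour becomes a weighted chord.
    """
    levels = {"red": r, "green": g, "blue": b}
    return [(CHANNEL_FREQS_HZ[c], levels[c] / 255.0)
            for c in ("red", "green", "blue")]

# Example: a pure orange pixel yields a loud red tone, a half-volume
# green tone, and a silent blue tone.
tones = colour_to_tones(255, 128, 0)
```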
[67] By means of stimulating electrodes implanted into the human nervous system, it is possible to apply current pulses that the recipient can learn to recognize reliably.
Kevin Warwick has shown experimentally that signals from force/touch sensors on a robot hand can be successfully employed as a means of communication.