Sensory substitution

From Wikipedia, the free encyclopedia
'''Sensory substitution''' is a change of the characteristics of one [[sensory modality]] into stimuli of another sensory modality.


A sensory substitution system consists of three parts: a sensor, a coupling system, and a stimulator. The sensor records stimuli and passes them to the coupling system, which interprets these signals and transmits them to the stimulator. If the sensor obtains signals of a kind not originally available to the bearer, this is a case of [[#Sensory augmentation|sensory augmentation]]. Sensory substitution concerns human [[perception]] and the [[neuroplasticity|plasticity]] of the human brain, and therefore allows these aspects of neuroscience to be studied through [[neuroimaging]].


Sensory substitution systems may help people by restoring their ability to perceive a defective sensory modality by using sensory information from a functioning sensory modality.


==History==
The idea of sensory substitution was introduced in the 1960s by [[Paul Bach-y-Rita]] as a means of using one sensory modality, mainly [[Touch|tactition]], to gain environmental information to be used by another sensory modality, mainly [[visual perception|vision]].<ref name = "TVSS">{{cite journal|vauthors=Bach-y-Rita P, Collins CC, Saunders F, White B, Scadden L |year=1969|title=Vision substitution by tactile image projection|journal=Nature|volume=221|issue=5184|pages=963–964|pmid=5818337|doi=10.1038/221963a0|bibcode=1969Natur.221..963B|s2cid=4179427}}</ref><ref>{{cite book | title = A History of the Mind: Evolution and the Birth of Consciousness | author = Nicholas Humphrey | publisher = Springer | year = 1999 | isbn = 978-0-387-98719-4 | url = https://books.google.com/books?id=W8G8Oji53XsC&q=%22Paul+Bach-y-Rita%22&pg=PA79 | author-link = Nicholas Humphrey }}</ref> Thereafter, the entire field was discussed by Chaim-Meyer Scheff in "Experimental model for the study of changes in the organization of human sensory information processing through the design and testing of non-invasive prosthetic devices for sensory impaired people".<ref>{{cite journal|title=Experimental Model for the Study of Changes in the Organization of Human Sensory Information Processing Through the Design and Testing of Non-invasive Prosthetic Devices for Sensory Impaired People|first=Chaim-Meyer|last=Scheff|date=1 January 1986|journal=ACM SIGCAPH Computers and the Physically Handicapped|issue=36|pages=3–10|doi=10.1145/15711.15713|s2cid=11924232}}</ref> The first sensory substitution system was developed by Bach-y-Rita et al.
as a means of demonstrating brain plasticity in congenitally blind individuals.<ref>{{cite journal|author=Bach-y-Rita P|year=2004|title=Tactile sensory substitution studies|journal=Annals of the New York Academy of Sciences|volume=1013|issue=1|pages=83–91|pmid=15194608|doi=10.1196/annals.1305.006|bibcode=2004NYASA1013...83B|s2cid=44624575}}</ref> Since this historic invention, sensory substitution has been the basis of many studies in perceptual and [[cognitive neuroscience]]. Sensory substitution is often employed to investigate predictions of the [[embodied cognition]] framework; within this framework, the concept of sensorimotor contingencies<ref>{{cite journal |last1=O'Regan |first1=John Kevin |last2=Noë |first2=Alva |title=A sensorimotor account of vision and visual consciousness |journal=Behavioral and Brain Sciences |date=2001 |volume=24 |issue=5 |pages=939–1031|doi=10.1017/S0140525X01000115 |pmid=12239892 }}</ref> in particular is investigated using sensory substitution. Furthermore, sensory substitution has contributed to the study of brain function, human [[cognition]] and rehabilitation.<ref name = "Three">{{cite journal|vauthors=Renier L, De Volder AG|year=2005|title=Cognitive and brain mechanisms in sensory substitution of vision: a contribution to the study of human perception|journal=Journal of Integrative Neuroscience|volume=4|issue=4|pages=489–503|pmid=16385643|doi=10.1142/S0219635205000999}}</ref>


==Physiology==
When a person becomes blind or deaf, they generally do not lose the ability to see or hear; rather, they lose the ability to transmit the sensory signals from the periphery ([[retina]] for vision and [[cochlea]] for hearing) to the brain.<ref name = "bach">{{cite journal|vauthors=Bach-y-Rita P, Kercel SW|year=2003|title=Sensory substitution and the human–machine interface|journal=Trends in Cognitive Sciences|volume=7|issue=12|pages=541–546|pmid=14643370|doi=10.1016/j.tics.2003.10.013|url=http://hci.ucsd.edu/102a/readings/SensorySubstitution.pdf|citeseerx=10.1.1.159.9777|s2cid=2466391}}</ref> Since the vision processing pathways are still intact, a person who has lost the ability to retrieve data from the retina can still see subjective images by using data gathered from other sensory modalities such as touch or audition.<ref name="Regan, JK 2001">{{cite journal|author1=O'Regan, JK|author2=Noe, A.|year=2001|title=A sensorimotor account of vision and visual consciousness|journal=Behavioral and Brain Sciences|volume=24|issue=5|pages=939–973|pmid=12239892|doi=10.1017/s0140525x01000115|s2cid=22606536}}</ref>


In a regular visual system, the data collected by the retina is converted into an electrical stimulus in the [[optic nerve]] and relayed to the brain, which re-creates the image and perceives it. Because it is the brain that is responsible for the final perception, sensory substitution is possible. During sensory substitution, an intact sensory modality relays information to the visual perception areas of the brain so that the person can perceive sight. With sensory substitution, information gained from one sensory modality can reach brain structures physiologically related to other sensory modalities. Touch-to-visual sensory substitution transfers information from touch receptors to the visual cortex for interpretation and perception. For example, through [[fMRI]], one can determine which parts of the brain are activated during sensory perception. In blind persons, while they are receiving only tactile information, the visual cortex is also activated as they perceive the objects ''visually''.<ref>Bach-y-Rita P. ''Brain Mechanisms in Sensory Substitution'', Academic Press New York:1972.</ref> Touch-to-touch sensory substitution is also possible, wherein information from touch receptors of one region of the body can be used to perceive touch in another region. For example, in one experiment by Bach-y-Rita, touch perception was restored in a patient who had lost peripheral sensation due to leprosy.<ref name = "six">Bach-y-Rita P. ''Nonsynaptic Diffusion Neurotransmission and Late Brain Reorganization'', Demos-Vermande, New York :1995.</ref>


===Technological support===
In order to achieve sensory substitution and stimulate the brain without intact sensory organs to relay the information, machines can be used to do the signal transduction, rather than the sensory organs. This [[brain–computer interface|brain–machine interface]] collects external signals and transforms them into electrical signals for the brain to interpret. Generally, a camera or a microphone is used to collect visual or auditory stimuli that are used to replace lost sight and hearing, respectively. The visual or auditory data collected from the sensors is transformed into tactile stimuli that are then relayed to the brain for visual and auditory perception. Crucially, this transformation sustains the sensorimotor contingency inherent to the respective sensory modality. This and all types of sensory substitution are only possible due to [[neuroplasticity]].<ref name ="six"/>


===Brain plasticity===
''[[Neuroplasticity|Brain plasticity]]'' refers to the brain's ability to adapt to a changing environment, for instance to the absence or deterioration of a sense. It is conceivable that [[cortical remapping]] or reorganization in response to the loss of one sense may be an evolutionary mechanism that allows people to adapt and compensate by using other senses better. Brain imaging studies have shown that upon visual impairments and blindness (especially in the first 12–16 years of life) the visual cortices undergo a huge functional reorganization such that they are activated by other sensory modalities.<ref>{{cite journal |last1=Collignon |first1=O. |last2=Lassonde |first2=M. |last3=Lepore |first3=F. |last4=Bastien |first4=D. |last5=Veraart |first5=C. |title=Functional cerebral reorganization for auditory spatial processing and auditory substitution of vision in early blind subjects |journal=Cerebral Cortex |date=2007 |volume=17 |issue=2 |pages=457–465|doi=10.1093/cercor/bhj162 |pmid=16581983 }}</ref><ref>{{cite journal |last1=Sadato |first1=N. |last2=Okada |first2=T. |last3=Honda |first3=M. |last4=Yonekura |first4=Y. |title=Critical period for cross-modal plasticity in blind humans: a functional MRI study |journal=NeuroImage |date=2002 |volume=16 |issue=2 |pages=389–400|doi=10.1006/nimg.2002.1111 |pmid=12030824 |s2cid=1927987 }}</ref><ref>{{cite journal |last1=Cohen |first1=L.G. |last2=Sadato |first2=N. |last3=Celnik |first3=P. |last4=Ishii |first4=K. |last5=Hallett |first5=M. |title=Period of susceptibility for cross-modal plasticity in the blind |journal=Annals of Neurology|date=1999 |volume=45 |issue=4 |pages=451–460|doi=10.1002/1531-8249(199904)45:4<451::AID-ANA6>3.0.CO;2-B |pmid=10211469 |s2cid=22588834 }}</ref>
Such cross-modal plasticity was also found through functional imaging of congenitally blind patients which showed a cross-modal recruitment of the [[occipital cortex]] during perceptual tasks such as Braille reading, tactile perception, tactual object recognition, [[sound localization]], and sound discrimination.<ref name = "Three"/> This may suggest that blind people can use their occipital lobe, generally used for vision, to perceive objects through the use of other sensory modalities. This [[cross modal plasticity]] may explain the often described tendency of blind people to show enhanced ability in the other senses.<ref>{{Cite journal|last1=Van Boven|first1=R. W.|last2=Hamilton|first2=R. H.|last3=Kauffman|first3=T.|last4=Keenan|first4=J. P.|last5=Pascual-Leone|first5=A.|date=2000-06-27|title=Tactile spatial resolution in blind braille readers|journal=Neurology|volume=54|issue=12|pages=2230–2236|issn=0028-3878|pmid=10881245|doi=10.1212/wnl.54.12.2230|s2cid=12053536|url=https://semanticscholar.org/paper/2f2fe771aa8d995d89816eaee6d5180a8032bed8}}</ref><ref>{{Cite journal|last1=Goldreich|first1=Daniel|last2=Kanics|first2=Ingrid M.|date=2003-04-15|title=Tactile acuity is enhanced in blindness|journal=The Journal of Neuroscience|volume=23|issue=8|pages=3439–3445|issn=1529-2401|pmid=12716952|pmc=6742312|doi=10.1523/JNEUROSCI.23-08-03439.2003}}</ref><ref>{{Cite journal|last1=Goldreich|first1=Daniel|last2=Kanics|first2=Ingrid M.|date=November 2006|title=Performance of blind and sighted humans on a tactile grating detection task|journal=Perception & Psychophysics|volume=68|issue=8|pages=1363–1371|issn=0031-5117|pmid=17378422|doi=10.3758/bf03193735|doi-access=free}}</ref><ref>{{Cite journal|last1=Wong|first1=Michael|last2=Gnanakumaran|first2=Vishi|last3=Goldreich|first3=Daniel|date=2011-05-11|title=Tactile Spatial Acuity Enhancement in Blindness: Evidence for Experience-Dependent Mechanisms|journal=Journal of 
Neuroscience|volume=31|issue=19|pages=7028–7037|doi=10.1523/jneurosci.6461-10.2011|pmid=21562264|pmc=6703211}}</ref><ref>{{Cite journal|last1=Bhattacharjee|first1=Arindam|last2=Ye|first2=Amanda J.|last3=Lisak|first3=Joy A.|last4=Vargas|first4=Maria G.|last5=Goldreich|first5=Daniel|date=2010-10-27|title=Vibrotactile Masking Experiments Reveal Accelerated Somatosensory Processing in Congenitally Blind Braille Readers|journal=Journal of Neuroscience|volume=30|issue=43|pages=14288–14298|doi=10.1523/jneurosci.1447-10.2010|pmid=20980584|pmc=3449316}}</ref>


===Perception versus sensing===
==="Seeing with sound" sensory substitution===
While considering the physiological aspects of sensory substitution, it is essential to distinguish between sensing and perceiving. The general question posed by this differentiation is: Are blind people seeing or ''perceiving'' to see by putting together different sensory data? While sensation comes in one modality – visual, auditory, tactile etc. – perception due to sensory substitution is not one modality but a result of cross-modal interactions. It is therefore concluded that while sensory substitution for vision induces visual-like perception in ''sighted'' individuals, it induces auditory or tactile perception in ''blind'' individuals.<ref name = "eight">{{cite journal|vauthors=Poirier C, De Volder AG, Scheiber C|year=2007|title=What neuroimaging tells us about sensory substitution|journal=Neuroscience & Biobehavioral Reviews|volume=31|issue=7|pages=1064–1070|pmid=17688948|doi=10.1016/j.neubiorev.2007.05.010|s2cid=45083182}}</ref> In short, blind people ''perceive'' to see through touch and audition with sensory substitution.
In experiments with a [[Tactile-visual sensory substitution]] (TVSS) device developed by Bach-y-Rita, subjects described the perceptual experience of the TVSS as distinctly visual, such that objects were perceived as if located in external space and not on the back or skin. Further studies using the TVSS showed that such perceptual changes were only possible when the participants could actively explore their environment with the TVSS.<ref>{{cite journal |last1=Bach-Y-Rita |first1=P. |last2=Kercel |first2=S. W. |title=Sensory substitution and the human–machine interface |journal=Trends in Cognitive Sciences |date=2003 |volume=7 |issue=12 |pages=541–546|doi=10.1016/j.tics.2003.10.013 |pmid=14643370 |s2cid=2466391 }}</ref><ref>{{cite journal |last1=Bach-Y-Rita |first1=P. |title=Tactile sensory substitution studies |journal=Annals of the New York Academy of Sciences |date=2004 |volume=1013 |issue=1 |pages=83–91|doi=10.1196/annals.1305.006 |pmid=15194608 |bibcode=2004NYASA1013...83B |s2cid=44624575 }}</ref> These results have been underpinned by many other studies testing different substitution systems with blind subjects, such as vision-to-tactile substitution,<ref>{{cite journal |last1=Sampaio |first1=E. |last2=Maris |first2=S. |last3=Bach-Y-Rita |first3=P. |title=Brain plasticity: 'Visual' acuity of blind persons via the tongue |journal=Brain Research |date=2001 |volume=908 |issue=2 |pages=204–207|doi=10.1016/S0006-8993(01)02667-1 |pmid=11454331 |s2cid=17323161 }}</ref> vision-to-auditory substitution<ref>{{cite journal |last1=Abboud |first1=S. |last2=Hanassy |first2=S. |last3=Levy-Tzedek |first3=S. |last4=Maidenbaum |first4=S. |last5=Amedi |first5=A. |title=EyeMusic: Introducing a visual colorful experience for the blind using auditory sensory substitution |journal=Restor Neurol Neurosci |date=2014 |volume=32 |issue=2 |pages=247–257|doi=10.3233/RNN-130338 |pmid=24398719 |doi-access=free }}</ref><ref>{{cite journal |last1=Levy-Tzedek |first1=S. |last2=Hanassy |first2=S. |last3=Abboud |first3=S. |last4=Maidenbaum |first4=S. |last5=Amedi |first5=A. |title=Fast, accurate reaching movements with a visual-to-auditory sensory substitution device |journal=Restor Neurol Neurosci |date=2012 |volume=30 |issue=4 |pages=313–323|doi=10.3233/RNN-2012-110219 |pmid=22596353 |doi-access=free }}</ref> and vision-to-vestibular substitution.<ref>{{cite journal |last1=Tyler |first1=M. |last2=Danilo |first2=Y. |last3=Bach-Y-Rita |first3=P. |title=Closing an open-loop control system: vestibular substitution through the tongue |journal=Journal of Integrative Neuroscience |date=2003 |volume=2 |issue=3 |pages=159–164|doi=10.1142/S0219635203000263 |pmid=15011268 |s2cid=14565593 }}</ref> Such results have also been reported in blindfolded sighted subjects<ref>{{cite journal |last1=Schumann |first1=F. |last2=O'Regan |first2=K. |title=Sensory augmentation: integration of an auditory compass signal into human perception of space |journal=Scientific Reports |date=2017 |volume=7 |issue=1 |page=42197 |doi=10.1038/srep42197 |pmid=28195187 |pmc=5307328 |bibcode=2017NatSR...742197S }}</ref><ref>{{cite journal |last1=König |first1=S.U. |last2=Schumann |first2=F. |last3=Keyser |first3=J. |last4=Goeke |first4=C. |last5=Krause |first5=C. |last6=Wache |first6=S. |title=Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception |journal=PLOS ONE |date=2016 |volume=11 |issue=12 |pages=e0166647 |doi=10.1371/journal.pone.0166647|pmid=27959914 |pmc=5154504 |bibcode=2016PLoSO..1166647K |doi-access=free }}</ref> and deliver further support for the sensorimotor contingency theory.


==Different applications==
Applications are not restricted to disabled persons, but also include [[art]]istic presentations, [[game]]s, and [[augmented reality]]. Some examples are the substitution of visual stimuli with audio or tactile stimuli, and of audio stimuli with tactile stimuli. Among the best known are Paul Bach-y-Rita's Tactile Vision Sensory Substitution (TVSS), developed with Carter Collins at the [[Smith-Kettlewell Institute]], and [[Peter Bartus Leonard Meijer|Peter Meijer]]'s Seeing with Sound approach (The vOICe). Technical developments, such as [[miniaturization]] and [[electrical stimulation]], have helped advance sensory substitution devices.


In sensory substitution systems, sensors collect data from the external environment. This data is relayed to a coupling system that interprets and transduces the information and then relays it to a stimulator. The stimulator ultimately stimulates a functioning sensory modality.<ref name = "eight"/> After training, people learn to use the information gained from this stimulation to experience a perception of the sensation they lack instead of the actually stimulated sensation. For example, a leprosy patient, whose perception of peripheral touch was restored, was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient was able to experience data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.<ref name = "six"/>
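The sensor–coupling–stimulator pipeline described above can be sketched in a few lines of code. The channel count, threshold value, and function names here are illustrative assumptions, not taken from any actual device:

```python
# Minimal sketch of the sensor -> coupling system -> stimulator pipeline.
# Channel count, threshold, and names are illustrative assumptions only.

def sense(environment):
    """Sensor: sample raw contact values (e.g. glove pressure sensors)."""
    return [environment.get(i, 0.0) for i in range(16)]

def couple(raw_values, threshold=0.2):
    """Coupling system: interpret raw signals and map them to stimulator
    intensities in the 0..1 range, dropping sub-threshold noise."""
    return [min(1.0, v / threshold) if v >= threshold else 0.0
            for v in raw_values]

def stimulate(intensities):
    """Stimulator: emit one drive command per active channel (the channels
    would address receptors on a different body site, e.g. the forehead)."""
    return [f"channel {i}: {level:.2f}"
            for i, level in enumerate(intensities) if level > 0.0]

# A firm touch on sensor 3 and a faint one on sensor 7: only the firm
# touch survives the coupling stage and is re-expressed elsewhere.
commands = stimulate(couple(sense({3: 0.5, 7: 0.1})))
```

The key design point, as in the glove example, is that the coupling stage decides which environmental signals are strong enough to be re-expressed on the substitute body site.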


===Tactile systems===
To understand ''tactile sensory substitution'', it is essential to understand some basic physiology of the tactile receptors of the skin. There are five basic types of tactile receptors: [[Lamellar corpuscle|Pacinian corpuscle]], [[Meissner's corpuscle]], [[Bulbous corpuscle|Ruffini ending]]s, [[Merkel nerve ending]]s, and [[free nerve ending]]s. These receptors are mainly characterized by which type of stimuli best activates them, and by their rate of adaptation to sustained stimuli.<ref>{{cite journal|vauthors=Vallbo AB, Johansson RS|year=1984|title=Properties of cutaneous mechanoreceptors in the human hand related to touch sensation|journal=Human Neurobiology|volume=3|issue=1|pages=3–14|pmid=6330008}}</ref> Because of the rapid adaptation of some of these receptors to sustained stimuli, those receptors require rapidly changing tactile stimulation systems in order to be optimally activated.<ref name = "EV">{{cite journal|vauthors=Kaczmarek KA, Webster JG, Bach-y-Rita P, Tompkins WJ|year=1991|title= Electrotactile and vibrotactile displays for sensory substitution systems|journal=IEEE Transactions on Biomedical Engineering|volume=38|issue=1|pages=1–16|doi=10.1109/10.68204|pmid=2026426|s2cid=23047344|url=http://digital.library.wisc.edu/1793/9626}}</ref> Among these mechanoreceptors, the Pacinian corpuscle offers the highest sensitivity to high-frequency vibration, from a few tens of Hz to a few kHz, owing to its specialized [[mechanotransduction]] mechanism.<ref>{{cite journal |doi=10.1109/TOH.2014.2369422 |title=Vibrotactile Sensitivity Threshold: Nonlinear Stochastic Mechanotransduction Model of the Pacinian Corpuscle |year=2015 |last1=Biswas |first1=Abhijit |journal=IEEE Transactions on Haptics |volume=8 |issue=1 |pages=102–113 |pmid=25398183 |last2=Manivannan |first2=M. |last3=Srinivasan |first3=Mandyam A.|s2cid=15326972 |url=https://zenodo.org/record/894772 }}</ref><ref>{{cite journal |doi=10.1109/TOH.2014.2369416 |title=Multiscale Layered Biomechanical Model of the Pacinian Corpuscle |year=2015 |last1=Biswas |first1=Abhijit |journal=IEEE Transactions on Haptics |volume=8 |issue=1 |pages=31–42 |pmid=25398182 |last2=Manivannan |first2=M. |last3=Srinivasan |first3=Mandyam A.|s2cid=24658742 |url=https://zenodo.org/record/894776 }}</ref>
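Why rapidly adapting receptors call for changing stimulation can be illustrated with a toy model in which the response tracks the change in skin indentation rather than its absolute level. This is an illustrative sketch only, not a physiological simulation:

```python
# Toy model of a rapidly adapting mechanoreceptor: the response is a
# rectified first difference of the indentation samples, so a constant
# stimulus produces no output after its onset. Illustrative only.

def rapidly_adapting_response(indentation):
    """Respond only to increases between consecutive indentation samples."""
    return [max(0.0, b - a) for a, b in zip(indentation, indentation[1:])]

sustained = [1.0, 1.0, 1.0, 1.0, 1.0]   # constant pressure
vibrating = [0.0, 1.0, 0.0, 1.0, 0.0]   # changing stimulus

# The sustained press yields no response after onset, while the vibration
# keeps driving the receptor -- which is why displays targeting such
# receptors must keep their stimuli changing.
```

This is the behavior that makes the Pacinian corpuscle so responsive to vibration and so quickly "blind" to steady pressure.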

There are two types of stimulators: electrotactile and vibrotactile. Electrotactile stimulators use direct electrical stimulation of the nerve endings in the skin to initiate action potentials; the sensation triggered (burn, itch, pain, pressure, etc.) depends on the stimulating voltage. Vibrotactile stimulators use pressure and the properties of the mechanoreceptors of the skin to initiate action potentials. Both stimulation systems have advantages and disadvantages. With electrotactile stimulation systems, many factors affect the sensation triggered: stimulating voltage, current, waveform, electrode size, material, contact force, skin location, thickness, and hydration.<ref name = "EV"/> Electrotactile stimulation may involve direct stimulation of the nerves ([[percutaneous]]) or stimulation through the skin ([[transcutaneous]]). Percutaneous application causes additional distress to the patient and is a major disadvantage of this approach. Furthermore, stimulation of the skin without insertion requires high-voltage stimulation because of the high impedance of dry skin,<ref name = "EV"/> unless the tongue is used as a receptor, which requires only about 3% as much voltage.<ref name = "tong">{{cite journal|vauthors=Bach-y-Rita P, Kaczmarek KA, Tyler ME, Garcia-Lara J|title=Form perception with a 49-point electrotactile stimulus array on the tongue: a technical note|journal=J Rehabil Res Dev|year=1998|volume=35|issue=4|pages=427–30|url=http://web.gc.cuny.edu/cogsci/private/bach-y-rita-tongue.pdf|pmid=10220221}} See also [[Brainport]]</ref> This latter technique is undergoing clinical trials for various applications, and has been approved for assistance to the blind in the UK.<ref>{{cite web|url=http://science.howstuffworks.com/brainport2.htm|title=How BrainPort Works|work=HowStuffWorks|accessdate=July 21, 2016|author=Layton, Julia|date=2006-07-17}}</ref><ref>{{cite web|url=http://www.wicab.com/media/Wicab%20Press%20Release%203-19-2013.pdf|title=Wicab Announces European Market Approval for its Non-Invasive Assistive Aid for the Blind|type=Press Release|publisher=Wicab, Inc.|archive-url=https://web.archive.org/web/20130507045650/http://www.wicab.com/media/Wicab%20Press%20Release%203-19-2013.pdf|archive-date=May 7, 2013}}</ref> Alternatively, the roof of the mouth has been proposed as another area where low currents can be felt.<ref>{{cite journal|author1=Hui Tang |author2=D. J. Beebe|title=Design and microfabrication of a flexible oral electrotactile display|doi=10.1109/JMEMS.2002.807478|year=2003|journal=Journal of Microelectromechanical Systems|volume=12|issue=1|pages=29–36}}</ref>

[[Electrostatic]] arrays are being explored as [[human–computer interaction]] devices for [[touch screen]]s.<ref>{{cite web|url=http://www.hizook.com/blog/2010/08/11/electrotactile-arrays-texture-and-pressure-feedback-during-robotic-teleoperation|title=Electrotactile Arrays for Texture and Pressure Feedback During Robotic Teleoperation|date=August 11, 2010|author=Deyle, Travis|work=Hizook|accessdate=July 21, 2016}}</ref> These are based on a phenomenon called [[electrovibration]], which allows microampere-level currents to be felt as roughness on a surface.<ref>{{cite journal | doi = 10.1111/j.1748-1716.1983.tb07235.x| url = http://www.fys.uio.no/elg/bioimp/pdf/vibration.pdf | volume=118 | title=Electrovibration, cutaneous sensation of microampere current | year=1983 | journal=Acta Physiologica Scandinavica | pages=19–25 | author=Grimnes S | issue = 2 | pmid = 6624501 }}</ref><ref>{{cite journal|author1=Kurt A. Kaczmarek|author2=Krishnakant Nammi|author3=Abhishek K. Agarwal|author4=Mitchell E. Tyler|author5=Steven J. Haase|author6=David J. Beebec|title=Polarity effect in electrovibration for tactile display|journal=IEEE Transactions on Biomedical Engineering|year=2006|volume= 53|issue=10|pages=2047–2054|doi=10.1109/TBME.2006.881804|pmc=2582732|pmid=17019869}}</ref>

Vibrotactile systems use the properties of mechanoreceptors in the skin, so they have fewer parameters to monitor than electrotactile stimulation. However, vibrotactile stimulation systems need to account for the rapid adaptation of the tactile sense.

Another important aspect of tactile sensory substitution systems is the location of the tactile stimulation. Tactile receptors are abundant on the fingertips, face, and tongue, while sparse on the back, legs, and arms. It is essential to take into account the spatial resolution of the receptors, as it has a major effect on the resolution of the sensory substitution.<ref name = "EV"/> A high-resolution pin-array display is able to present spatial information via tactile symbols, such as city maps<ref name="Zeng, 2015">{{cite journal|author=Zeng|display-authors=etal|year=2015|title=Interactive Audio-haptic Map Explorer on a Tactile Display|journal=Interacting with Computers|volume=27|issue=4|pages=413–429|doi=10.1093/iwc/iwu006}}</ref> and obstacle maps.<ref name="Zeng, 2012">{{cite book|author=Zeng|title=Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility - ASSETS '12|display-authors=etal|year=2012|chapter=Exploration and avoidance of surrounding obstacles for the visually impaired|pages=111–118|doi=10.1145/2384916.2384936|isbn=9781450313216|s2cid=18623028}}</ref>

Some current tactile substitution systems are described below.

====Tactile–visual====
One of the earliest and best-known sensory substitution devices was Paul Bach-y-Rita's TVSS, which converted the image from a video camera into a tactile image and coupled it to the tactile receptors on the [[back]] of his blind subject.<ref name = "TVSS"/> Recently, several new systems have been developed that interface the tactile image with tactile receptors on different areas of the body, such as the chest, brow, fingertip, abdomen, and forehead.<ref name = "bach"/> The tactile image is produced by hundreds of activators placed on the person. The activators are [[solenoid]]s of one millimeter diameter. In experiments, [[blindness|blind]] (or [[blindfold]]ed) subjects equipped with the TVSS can learn to detect shapes and to orient themselves. In the case of simple geometric shapes, it took around 50 trials to achieve 100 percent correct recognition. Identifying objects in different orientations required several hours of learning.

A system using the tongue as the human–machine interface is the most practical. The [[human–computer interaction|tongue–machine interface]] is protected by the closed mouth, and the saliva in the mouth provides a good electrolytic environment that ensures good electrode contact.<ref name = "tong"/> Results from a study by Bach-y-Rita et al. show that electrotactile stimulation of the tongue requires only 3% of the voltage needed to stimulate the finger.<ref name = "tong"/> Also, since it is more practical to wear an orthodontic retainer holding the stimulation system than an apparatus strapped to other parts of the body, the tongue–machine interface is the most popular among TVSS systems.

This tongue TVSS system works by delivering electrotactile stimuli to the dorsum of the tongue via a flexible [[electrode array]] placed in the mouth. The electrode array is connected to a tongue display unit (TDU) via a ribbon cable passing out of the mouth. A video camera records a picture and transfers it to the TDU for conversion into a tactile image. The tactile image is then projected onto the tongue via the ribbon cable, where the tongue's receptors pick up the signal. After training, subjects are able to associate certain types of stimuli with certain types of visual images.<ref name = "bach"/><ref>Bach-y-Rita P, and Kaczmarek KA. (2002). ''Tongue placed tactile output device''. US Patent 6,430,450.</ref> In this way, tactile sensation can be used for visual perception.
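The camera-to-tactile-image conversion performed by the TDU can be sketched as simple downsampling. The 7×7 grid below matches the 49-point array of the cited tongue study, but the function name, pooling scheme, and scaling are otherwise illustrative assumptions:

```python
# Hedged sketch of the camera -> TDU conversion: a square grayscale frame
# is average-pooled onto the electrode grid and scaled to per-electrode
# intensities. Only the 7x7 grid size comes from the cited 49-point array;
# everything else is an illustrative assumption.

def to_tactile_image(frame, grid=7):
    """Downsample a square grayscale frame (values 0..255) to a
    grid x grid array of electrode intensities in the 0..1 range."""
    block = len(frame) // grid          # pixels per electrode, per axis
    tactile = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            pixels = [frame[y][x]
                      for y in range(gy * block, (gy + 1) * block)
                      for x in range(gx * block, (gx + 1) * block)]
            row.append(sum(pixels) / len(pixels) / 255.0)
        tactile.append(row)
    return tactile

# A bright vertical bar in a 14x14 camera frame becomes a vertical line
# of fully driven electrodes in column 3 of the 7x7 array.
frame = [[255 if 6 <= x < 8 else 0 for x in range(14)] for _ in range(14)]
tactile = to_tactile_image(frame)
```

The spatial layout of the camera image is thus preserved on the electrode array, which is what lets trained users associate tactile patterns with visual shapes.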

Sensory substitution has also benefited from the emergence of wearable haptic actuators such as vibrotactile motors, solenoids, and Peltier elements. At the [[Center for Cognitive Ubiquitous Computing]] at [[Arizona State University]], researchers have developed technologies that enable people who are blind to perceive social situational information using wearable vibrotactile belts<ref name = "Haptic Belt">{{cite conference|author1=T. McDaniel|author2=S. Krishna|author3=V. Balasubramanian |author4=D. Colbry|author5=S. Panchanathan|title=Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind|conference=IEEE International Workshop on Haptic, Audio and Visual Environments and Games, 2008|publisher=HAVE 2008|pages=13–18|year=2008|doi=10.1109/HAVE.2008.4685291}}</ref> (Haptic Belt) and gloves<ref>{{cite conference |doi=10.1049/cp.2012.1461 |title= Tactile Sensing Fabrics for Detecting Impairments in Leprosy Patients |year=2012 |last1= Paul |first1=Sathish Kumar |publisher=IEEE |book-title= Appropriate Healthcare Technologies for Developing Countries – AHT2012. The 7th International Conference – World Health and Wellbeing |location=London, UK |last2= Rekha |first2=V. |last3= Sivarasu |first3= Sudesh}}</ref><ref name="Haptic Glove">{{cite conference|author1=S. Krishna|author2=S. Bala|author3=T. McDaniel|author4=S. McGuire|author5=S. Panchanathan|title=VibroGlove: an assistive technology aid for conveying facial expressions|conference=Proceedings of the 28th of the international conference extended abstracts on Human factors in computing systems|location=Atlanta, Georgia, USA|publisher=ACM|year=2010|pages=3637–3642|doi=10.1145/1753846.1754031|url=http://dmrussell.net/CHI2010/docs/p3637.pdf}}</ref> (VibroGlove). Both technologies use miniature cameras mounted on a pair of glasses worn by the blind user.
The Haptic Belt provides vibrations that convey the direction and distance at which a person is standing in front of the user, while the VibroGlove uses spatio-temporal mapping of vibration patterns to convey the facial expressions of the interaction partner. Alternatively, it has been shown that even very simple cues indicating the presence or absence of obstacles (through small vibration modules located at strategic places on the body) can be useful for navigation, gait stabilization, and reduced anxiety when moving through an unknown space. This approach, called the "Haptic Radar",<ref>{{cite web|url=http://www.k2.t.u-tokyo.ac.jp/perception/HapticRadar/index-e.html|title=The Haptic Radar / Extended Skin Project|publisher=Ishikawa Watanabe Laboratory}}</ref> has been studied since 2005 by researchers at the [[University of Tokyo]] in collaboration with the [[University of Rio de Janeiro]].<ref name="Haptic Radar">{{cite journal|author1=A. Cassinelli|author2=E. Sampaio|author3=S.B. Joffily|author4=H.R.S. Lima|author5=B.P.G.R. Gusmão|title=Do blind people move more confidently with the Tactile Radar?|journal= Technology and Disability|volume=26|issue=2–3|pages=161–170|doi=10.3233/TAD-140414|url=http://www.k2.t.u-tokyo.ac.jp/members/alvaro/Publications/TechnologyDisability_Alvaro.pdf|access-date=2016-07-21|archive-url=https://web.archive.org/web/20170314214320/http://www.k2.t.u-tokyo.ac.jp/members/alvaro/Publications/TechnologyDisability_Alvaro.pdf|archive-date=2017-03-14|url-status=dead|year=2014}}</ref> Similar products include the Eyeronman vest and belt,<ref name="seeing eye vest">{{cite web |url=https://www.livescience.com/48760-vibrating-vest-blind-navigation.html |title=Seeing-Eye Vest? Vibrating Clothing Helps Blind Navigate |website=Live Science |date=14 November 2014 |accessdate=June 21, 2019}}</ref><ref name="vibrating vest">{{cite web |url=https://www.cbsnews.com/news/vibrating-vest-could-help-the-blind-avoid-obstacles/ |title=Vibrating vest could help the blind avoid obstacles |website=CBS News |date=13 June 2014 |accessdate=June 21, 2019}}</ref><ref name="vibrating vest business insider">{{cite web |url=https://www.businessinsider.com/vibrating-clothes-could-help-blind-people-navigate-2014-6 |title=Vibrating Clothes Could Help Blind People Navigate |website=Business Insider |accessdate=June 21, 2019}}</ref> and the forehead retina system.<ref name="forehead retina system">{{cite news |url=https://www.medgadget.com/2006/08/forehead_retina_1.html |title=Forehead Retina System |work=Medgadget |date=2006-08-08 |accessdate=June 21, 2019}}</ref>
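The direction-and-distance cue described for the Haptic Belt can be sketched as a simple mapping from a detected person's bearing and range to a tactor index and a vibration intensity. The motor count, range limit, and linear intensity law below are illustrative assumptions, not the published ASU design.

```python
# Hypothetical sketch of a Haptic Belt-style cue: bearing selects one of
# n_motors tactors around the waist; distance scales vibration strength.
# All parameters are assumptions for illustration.

def belt_cue(bearing_deg, distance_m, n_motors=8, max_range_m=5.0):
    """Return (motor index, intensity 0.0-1.0) for a person detected at
    bearing_deg (0 = straight ahead, clockwise) and distance_m away."""
    motor = round((bearing_deg % 360) / (360 / n_motors)) % n_motors
    intensity = max(0.0, 1.0 - min(distance_m, max_range_m) / max_range_m)
    return motor, intensity

# Someone standing 1 m away, slightly to the right: second tactor,
# fairly strong vibration.
motor, strength = belt_cue(45, 1.0)
```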

====Tactile–auditory====
Neuroscientist [[David Eagleman]] presented a new device for sound-to-touch hearing at TED in 2015;<ref>Eagleman, David (2015). [http://www.ted.com/talks/david_eagleman_can_we_create_new_senses_for_humans?language=en Can we create new senses for humans?] TED talks.</ref> his laboratory research then expanded into a company based in Palo Alto, California, called Neosensory.<ref>[https://Neosensory.com Neosensory, Inc]</ref> Neosensory devices capture sound and turn it into high-dimensional patterns of touch on the skin.<ref>[https://www.smithsonianmag.com/innovation/could-this-futuristic-vest-give-us-sixth-sense-180968852/ Could This Futuristic Vest Give Us a Sixth Sense?], Smithsonian Magazine, Apr 2018.</ref><ref>[https://www.telegraph.co.uk/technology/2019/01/16/meet-man-trying-give-humans-sixth-senseneuroscientist-dr-david/ Meet the man who wants to give humans a sixth sense], The Telegraph, Jan 2019.</ref>

Experiments by Schurmann et al. show that tactile stimuli can activate the human auditory cortex, and vibrotactile stimuli can currently be used to facilitate hearing in both normal-hearing and hearing-impaired people.<ref name = "TASS">{{cite journal|vauthors=Schurmann M, Caetano G, Hlushchuk Y, Jousmaki V, Hari R |year=2006|title=Touch activates human auditory cortex|journal=NeuroImage|volume=30|issue=4|pages=1325–1331|pmid=16488157|doi=10.1016/j.neuroimage.2005.11.020|s2cid=22045169}}</ref> To identify the auditory areas activated by touch, Schurmann et al. stimulated subjects' fingers and palms with vibration bursts and their fingertips with tactile pressure. They found that tactile stimulation of the fingers led to activation of the auditory belt area, suggesting a relationship between audition and tactition.<ref name = "TASS"/> Future research could therefore investigate the feasibility of a tactile–auditory sensory substitution system. One promising{{citation needed|reason=This claim needs a reliable source.|date=March 2016}} invention is the 'Sense organs synthesizer',<ref>{{cite web|url=http://www.freepatentsonline.com/20020173823.pdf|title=Sense organs synthesizer United States Patent Application 20020173823}}</ref> which aims to deliver a normal hearing range of nine octaves via 216 electrodes to sequential touch nerve zones next to the spine.

====Tactile–vestibular====
Some people with [[balance disorder]]s or adverse reactions to antibiotics develop bilateral vestibular damage (BVD). They experience difficulty maintaining posture, an unstable gait, and [[oscillopsia]].<ref name = "vest" >{{cite journal|vauthors=Tyler M, Danilov Y, Bach-y-Rita P|year=2003|title=Closing an open-loop control system: vestibular substitution through the tongue|journal=Journal of Integrative Neuroscience|volume=2|issue=2|pages=159–164|pmid=15011268|doi=10.1142/S0219635203000263|s2cid=14565593}}</ref> Tyler et al. studied the restoration of postural control through tactile-for-vestibular sensory substitution. Because BVD patients cannot integrate visual and tactile cues, they have great difficulty standing. Using a head-mounted [[accelerometer]] and a [[brain–computer interface]] that employs electrotactile stimulation of the tongue, information about head-body orientation was relayed to the patient, providing a new source of data with which to orient themselves and maintain good posture.<ref name = "vest" />
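The accelerometer-to-tongue idea above amounts to mapping a head-tilt estimate to a position on a small electrode grid, so the user feels which way they are leaning. The grid size, tilt range, and clamping below are hypothetical parameters for illustration, not the published device's values.

```python
# Hypothetical sketch of a tilt-to-tongue mapping (array size and tilt
# range are assumptions): the accelerometer's head-tilt estimate moves a
# stimulus point across a grid x grid tongue electrode array.

def tilt_to_electrode(tilt_x_deg, tilt_y_deg, grid=11, max_tilt_deg=20.0):
    """Map head tilt in degrees (0 = upright) to a (col, row) electrode
    on a grid x grid array; the centre electrode means 'upright'."""
    def axis(t):
        t = max(-max_tilt_deg, min(max_tilt_deg, t))  # clamp extreme tilts
        return int(round((t / max_tilt_deg + 1) / 2 * (grid - 1)))
    return axis(tilt_x_deg), axis(tilt_y_deg)

# Upright posture stimulates the centre of the array.
col, row = tilt_to_electrode(0, 0)
```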

====Tactile–tactile to restore peripheral sensation====
In touch-to-touch sensory substitution, information from the touch receptors of one region is used to perceive touch in another. In one experiment by Bach-y-Rita, touch perception was restored in a patient who had lost peripheral sensation from leprosy.<ref name = "six"/> The patient was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient was able to experience data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.<ref name = "six"/> After two days of training, one of the leprosy subjects reported "the wonderful sensation of touching his wife, which he had been unable to experience for 20 years."<ref name = "s">{{cite journal|author=Bach-y-Rita P|year=1999|title=Theoretical aspects of sensory substitution and of neurotransmission-related reorganization in spinal cord injury|journal=Spinal Cord|volume=37|issue=7|pages=465–474|pmid=10438112|doi=10.1038/sj.sc.3100873 |doi-access=free}}</ref>

====Tactile feedback system for prosthetic limbs====
New technologies have made it feasible to provide patients who have prosthetic arms with tactile and kinesthetic sensibility.<ref name="pros">{{cite journal|author=Riso RR|year=1999|title=Strategies for providing upper extremity amputees with tactile and hand position feedback – moving closer to the bionic arm|journal=Technology and Health Care|volume=7|issue=6|pages=401–409|pmid=10665673|url=http://www.smpp.northwestern.edu/savedLiterature/Riso(1999)TechAndHealthCare7p401-409.pdf|access-date=2016-07-21|archive-url=https://web.archive.org/web/20100627025008/http://www.smpp.northwestern.edu/savedLiterature/Riso(1999)TechAndHealthCare7p401-409.pdf|archive-date=2010-06-27|url-status=dead|doi=10.3233/THC-1999-7604}}</ref> While this is not purely a sensory substitution system, it uses the same principles to restore the perception of touch. One tactile feedback method for restoring a perception of touch to amputees is direct or micro-stimulation of the tactile nerve afferents.<ref name = "pros"/>

Other applications of sensory substitution systems can be seen in functional robotic prostheses for patients with high-level quadriplegia. These robotic arms have several mechanisms for slip, vibration, and texture detection, which they relay to the patient as feedback.<ref name = "s"/> With further research and development, the information from these arms could be used by patients to perceive that they are holding and manipulating objects while the robotic arm actually accomplishes the task.

===Auditory systems===
Like tactile sensory substitution systems, auditory sensory substitution systems aim to use one intact sensory modality to compensate for a missing one. With auditory sensory substitution, visual or tactile sensors detect and store information about the external environment, and an interface then transforms this information into sound. Most such systems are vision-to-auditory substitutions aimed at using the sense of hearing to convey visual information to the blind.

====The vOICe Auditory Display====
"The vOICe" converts live views from a video camera into soundscapes: patterns of scores of different tones emitted simultaneously at different volumes and pitches.<ref name = "voice">{{cite journal|author=Meijer PBL |year=1992|title=An Experimental System for Auditory Image Representations|journal=IEEE Transactions on Biomedical Engineering|volume=39|issue=2|pages=112–121|pmid=1612614|doi=10.1109/10.121642|s2cid=34811735}}</ref> The technology was invented in the 1990s by [[Peter Bartus Leonard Meijer|Peter Meijer]]; it uses a general video-to-audio mapping, associating height with pitch and brightness with loudness in a left-to-right scan of each video frame.<ref name = "bach"/>
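A minimal sketch of this kind of mapping (height to pitch, brightness to loudness, left-to-right scan) might look as follows. The sample rate, frequency range, and per-column duration are illustrative assumptions, not Meijer's actual parameters.

```python
import math

# Illustrative vOICe-style mapping: each image column becomes a short
# time slice; within a column, row height sets sine frequency and pixel
# brightness sets its amplitude. Parameters are assumptions.

def column_to_samples(column, t0, sr=8000, dur=0.01,
                      f_low=500.0, f_high=5000.0):
    """Render one image column (list of 0-255 pixels, index 0 = top)
    as `dur` seconds of summed sinusoids starting at time t0."""
    n = int(sr * dur)
    rows = len(column)
    samples = []
    for i in range(n):
        t = t0 + i / sr
        s = 0.0
        for r, px in enumerate(column):
            if px == 0:
                continue
            # top rows -> high frequencies, bottom rows -> low
            f = f_high - (f_high - f_low) * r / (rows - 1)
            s += (px / 255.0) * math.sin(2 * math.pi * f * t)
        samples.append(s / rows)  # normalize by row count
    return samples

def image_to_soundscape(image_cols):
    """Left-to-right scan: concatenate one time slice per column."""
    out = []
    for c, col in enumerate(image_cols):
        out.extend(column_to_samples(col, t0=c * 0.01))
    return out
```

A single bright pixel thus produces a short, pure tone whose pitch encodes its height and whose onset time encodes its horizontal position.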

====EyeMusic====
The EyeMusic user wears a miniature camera connected to a small computer (or smartphone) and stereo headphones. The images are converted into "soundscapes": high locations in the image are rendered as high-pitched musical notes on a pentatonic scale, and low locations as low-pitched notes.

The EyeMusic conveys color information by using different musical instruments for each of the following five colors: white, blue, red, green, yellow. The EyeMusic employs an intermediate resolution of 30×50 pixels.<ref>{{cite journal|last=Levy-Tzedek|first=Shelly|author2=Hanassy S. |author3=Abboud S. |author4=Maidenbaum S. |author5=Amedi A. |title=Fast, accurate reaching movements with a visual-to-auditory sensory substitution device|journal=Restorative Neurology and Neuroscience|date=January 1, 2012|volume=30|issue=4|pages=313–323|doi=10.3233/RNN-2012-110219|pmid=22596353|url=http://brain.huji.ac.il/publications/Levy-Tzedek_et_al_RNN_2012.pdf|doi-access=free}}</ref><ref>{{cite journal|last=Abboud|first=Sami|author2=Hanassy S |author3=Levy-Tzedek S |author4=Maidenbaum S |author5=Amedi A. |title=EyeMusic: Introducing a "visual" colorful experience for the blind using auditory sensory substitution|journal=Restorative Neurology and Neuroscience|date=2014|volume=32|issue=2|pages=247–257|doi=10.3233/RNN-130338|pmid=24398719|url=http://brain.huji.ac.il/publications/Abboud_RNN_2014.pdf|doi-access=free}}</ref><ref>{{cite journal|last=Maidenbaum|first=Shachar|author2=Abboud S. |author3=Amedi A. |title=Sensory substitution: Closing the gap between basic research and widespread practical visual rehabilitation|journal=Neuroscience & Biobehavioral Reviews|date=April 2014|volume=41|pages=3–15|doi=10.1016/j.neubiorev.2013.11.007|pmid=24275274|url=http://brain.huji.ac.il/publications/Maidenbaum_NBR_2013.pdf|doi-access=free}}</ref>
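The vertical-position-to-pentatonic-note and color-to-instrument scheme described above can be sketched as a small lookup. The specific note table and instrument assignment below are illustrative assumptions, not the published EyeMusic design.

```python
# Illustrative EyeMusic-style mapping; the MIDI note table and the
# color-to-instrument assignment are assumptions for illustration.

PENTATONIC_MIDI = [57, 60, 62, 64, 67, 69, 72, 74, 76, 79]  # a pentatonic run
COLOR_INSTRUMENT = {
    "white": "piano", "blue": "trumpet", "red": "organ",
    "green": "synth", "yellow": "violin",
}

def pixel_to_note(row, n_rows, color):
    """Map a pixel's vertical position (row 0 = top of image) and color
    to (MIDI note, instrument); higher pixels give higher notes."""
    idx = int((n_rows - 1 - row) / (n_rows - 1) * (len(PENTATONIC_MIDI) - 1))
    return PENTATONIC_MIDI[idx], COLOR_INSTRUMENT[color]

# A blue pixel at the top of a 30-row image: highest note, trumpet timbre.
note, instrument = pixel_to_note(0, 30, "blue")
```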

====LibreAudioView====
This project, presented in 2015,<ref>{{cite journal|last=Ambard|first=Maxime|author2=Benezeth Y. |author3=Pfister P. |title=Mobile video-to-audio transducer and motion detection for sensory substitution|journal=Frontiers in ICT |date=2015|doi=10.3389/fict.2015.00020|volume=2|doi-access=free}}</ref> proposes a versatile mobile device and a sonification method designed specifically for the pedestrian locomotion of the visually impaired. It sonifies, in real time, spatial information from a video stream acquired at a standard frame rate. The device consists of a miniature camera integrated into a glasses frame, connected to a battery-powered minicomputer worn around the neck on a strap. The audio signal is transmitted to the user via headphones. The system has two operating modes: when the user is static, only the edges of moving objects are sonified; when the user is moving, the edges of both static and moving objects are sonified. The video stream is thus simplified by extracting only the edges of objects that could become dangerous obstacles. The system enables the localization of moving objects, the estimation of trajectories, and the detection of approaching objects.
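The motion-filtering idea in the static mode, keeping only what changed between frames, can be sketched with simple frame differencing. The threshold and frame format are assumptions for illustration, not the published method.

```python
# Rough sketch of motion filtering before sonification (threshold and
# frame format are assumptions): frame differencing keeps only pixels
# that changed, so a static background drops out.

def moving_pixels(prev, curr, threshold=30):
    """Return a binary mask of pixels whose brightness changed by more
    than `threshold` between two grayscale frames (2D lists, 0-255)."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# An object appearing against a black background is the only thing kept.
prev = [[0] * 4 for _ in range(4)]
curr = [[0, 0, 0, 0],
        [0, 200, 200, 0],
        [0, 200, 200, 0],
        [0, 0, 0, 0]]
mask = moving_pixels(prev, curr)
```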

====PSVA====
Another successful visual-to-auditory sensory substitution device is the Prosthesis Substituting Vision for Audition (PSVA).<ref name = "pvsa">{{cite journal |vauthors=Capelle C, Trullemans C, Arno P, Veraart C | year = 1998 | title = A real-time experimental prototype for enhancement of vision rehabilitation using auditory substitution | journal = IEEE Transactions on Biomedical Engineering | volume = 45 | issue = 10| pages = 1279–1293 | doi=10.1109/10.720206| pmid = 9775542 | s2cid = 7865760 }}</ref> This system uses a head-mounted TV camera that allows real-time, online translation of visual patterns into sound. As the patient moves around, the device captures visual frames at high frequency and generates the corresponding complex sounds that allow recognition.<ref name = "bach"/> Visual stimuli are transduced into auditory stimuli using a pixel-to-frequency relationship that couples a rough model of the human retina with an inverse model of the cochlea.<ref name = "pvsa"/>

====The Vibe====
The sound produced by this software is a mixture of sinusoids produced by virtual "sources", each corresponding to a "receptive field" in the image, i.e. a set of localized pixels. Each source's amplitude is determined by the mean luminosity of the pixels in its receptive field, while its frequency and inter-aural disparity are determined by the center of gravity of the coordinates of those pixels in the image.<ref>Auvray M, Hanneton S, Lenay C, O'Regan K (2005). "There is something out there: distal attribution in sensory substitution, twenty years later". ''[[Journal of Integrative Neuroscience]]'' 4: 505–521.</ref> The Vibe is an open-source project hosted on SourceForge.
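The receptive-field mapping just described can be sketched as computing, for each virtual source, an amplitude from mean luminosity and a frequency and stereo position from the pixel centroid. The frequency range and panning law below are assumptions for illustration, not The Vibe's actual values.

```python
# Sketch of a Vibe-style receptive-field mapping; frequency range and
# panning law are assumptions for illustration.

def source_params(pixels, img_w, img_h, f_low=200.0, f_high=2000.0):
    """pixels: list of (x, y, luminosity 0-255) in one receptive field.
    Returns (amplitude 0-1, frequency in Hz, pan in -1..1 left-to-right)."""
    amp = sum(l for _, _, l in pixels) / (255.0 * len(pixels))
    cx = sum(x for x, _, _ in pixels) / len(pixels)   # centroid x
    cy = sum(y for _, y, _ in pixels) / len(pixels)   # centroid y
    freq = f_low + (f_high - f_low) * (1 - cy / (img_h - 1))  # top = high
    pan = 2.0 * cx / (img_w - 1) - 1.0                        # left = -1
    return amp, freq, pan

# A fully bright field in the top-left corner: loud, high-pitched,
# panned to the left.
amp, freq, pan = source_params([(0, 0, 255), (1, 0, 255)], img_w=64, img_h=64)
```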

====Other systems====
Other approaches to the substitution of hearing for vision use binaural directional cues, much as natural [[human echolocation]] does. An example of the latter approach is the "SeeHear" chip from Caltech.<ref>Nielson L, Mahowald M, Mead C (1989). "SeeHear," in ''Analog VLSI and Neural Systems,'' by C. Mead, Reading: Addison-Wesley, chapter 13, 207–227.</ref>

Other visual-auditory substitution devices deviate from the vOICe's greyscale mapping of images. Zach Capalbo's Kromophone uses a basic color spectrum correlating to different sounds and timbres to give users perceptual information beyond the vOICe's capabilities.<ref>[http://ppl.gordon.edu/sensory_substitution.php] {{dead link|date=July 2016}}</ref>

===Nervous system implants===
By means of stimulating electrodes implanted into the human nervous system, current pulses can be applied whose patterns the recipient learns to recognize reliably. [[Kevin Warwick]] showed experimentally that signals from force/touch sensors on a robot hand can be employed in this way as a means of communication.<ref>{{cite journal|vauthors=Warwick K, Gasson M, Hutt B, Goodhew I, Kyberd P, Schulzrinne H, Wu X |year=2004|title=Thought communication and control: A first step using radiotelegraphy|journal= IEE Proceedings - Communications|volume=151|issue=3|pages=185–189|doi=10.1049/ip-com:20040409|doi-broken-date=7 December 2024 |s2cid=15804806}}</ref>

==Terminology==
It has been argued that the term "substitution" is misleading, as it is merely an "addition" or "supplementation" not a substitution of a sensory modality.<ref>{{cite book|vauthors=Lenay C, Gapenne O, Hanneton S, Marque C, Geouelle C|year=2003|contribution=Sensory Substitution: limits and perspectives|title=Touching for Knowing, Cognitive psychology of haptic manual perception|pages=275–292|url=https://www.sfu.ca/~kathleea/docs/(A%20Guide%20to%20Sensory%20Substitution)%20-%20Sensory%20Substitution%20-%20limits%20and%20perspectives%20copy.pdf}}</ref>

==Sensory augmentation==
{{See also|Human enhancement|Remote sensing}}
Building upon sensory substitution research, investigations have begun into the possibility of ''augmenting'' the body's sensory apparatus. The intention is to extend the body's ability to sense aspects of the environment that are not normally perceivable in its natural state. Moreover, such new information about the environment could be used not to directly replace a sensory organ, but to offer sensory information usually perceived via another, potentially damaged, sensory modality. Sensory augmentation is thus widely used for rehabilitation as well as for investigating perceptual and cognitive neuroscience.

Active work in this direction is being conducted by, among others, the e-sense project<ref>{{cite web|url=http://esenseproject.info/ |title=Home - e-sense project |accessdate=2014-08-06 |url-status=dead |archive-url=https://web.archive.org/web/20140810201043/http://esenseproject.info/ |archive-date=2014-08-10 }}</ref> of the [[Open University]] and [[Edinburgh University]], the feelSpace project of the [[University of Osnabrück]], and the [https://www.nature.com/articles/srep42197 hearSpace project] at [[University of Paris]].

The findings of research into sensory augmentation (as well as sensory substitution in general) that investigate the emergence of perceptual experience (qualia) from the activity of neurons have implications for the understanding of consciousness.<ref name="Regan, JK 2001"/>


==See also==
*[[Biological neural network]]
*[[Brain implant]]
*[[Human echolocation]], blind people navigating by listening to the echo of sounds


==References==
{{Reflist}}


==External links==
*[http://www.wicab.com Tongue display for sensory substitution]
*[http://www.seeingwithsound.com The vOICe auditory display for sensory substitution]
*[https://web.archive.org/web/20070216024709/http://biomed.brown.edu/Courses/BI108/2006-108websites/group03retinalimplants/ Artificial Retinas]
*[http://membres.lycos.fr/ren1er/experiences.html Webpage of Laurent Renier]
*[https://web.archive.org/web/20061122061515/http://www.utc.fr/gsp/publi/Lenay03-SensorySubstitution.pdf Sensory Substitution:limits and perspectives C. Lenay et al.]
*[http://sourceforge.net/projects/thevibe/ The Vibe]
*[http://feelspace.cogsci.uni-osnabrueck.de feelSpace - The Magnetic Perception Group of the University of Osnabrück]
*[http://www.kromophone.com The Kromophone]
*[http://academicplatform.blogspot.com.tr/2009/05/deneme.html Sensory Substitution For Blind (Nihat Erim İnceoğlu)]
*[https://www.nature.com/articles/srep42197 Sensory augmentation: integration of an auditory compass signal into human perception of space]

{{BCI}}


[[Category:Neuroscience]]
[[Category:Cognitive neuroscience]]
[[Category:Engineering]]
[[Category:Biomedical engineering]]
[[Category:Neural engineering]]
[[Category:Neuroprosthetics]]

Latest revision as of 01:01, 13 December 2024


==History==

The idea of sensory substitution was introduced in the 1980s by Paul Bach-y-Rita as a means of using one sensory modality, mainly tactition, to gain environmental information for use by another sensory modality, mainly vision.[1][2] The field was subsequently discussed by Chaim-Meyer Scheff in "Experimental model for the study of changes in the organization of human sensory information processing through the design and testing of non-invasive prosthetic devices for sensory impaired people".[3] The first sensory substitution system was developed by Bach-y-Rita et al. as a demonstration of brain plasticity in congenitally blind individuals.[4] Since this historic invention, sensory substitution has been the basis of many studies in perceptual and cognitive neuroscience. Sensory substitution is often employed to investigate predictions of the embodied cognition framework; within that framework, the concept of sensorimotor contingencies[5] in particular is investigated using sensory substitution. Furthermore, sensory substitution has contributed to the study of brain function, human cognition and rehabilitation.[6]

==Physiology==

When a person becomes blind or deaf, they generally do not lose the ability to see or hear; rather, they lose the ability to transmit sensory signals from the periphery (the retina for vision, the cochlea for hearing) to the brain.[7] Since the visual processing pathways remain intact, a person who has lost the ability to retrieve data from the retina can still perceive subjective images by using data gathered from other sensory modalities such as touch or audition.[8]

In a regular visual system, the data collected by the retina is converted into an electrical stimulus in the optic nerve and relayed to the brain, which re-creates the image and perceives it. Because it is the brain that is responsible for the final perception, sensory substitution is possible. During sensory substitution, an intact sensory modality relays information to the visual perception areas of the brain so that the person can perceive sight. With sensory substitution, information gained from one sensory modality can reach brain structures physiologically related to other sensory modalities. Touch-to-visual sensory substitution transfers information from touch receptors to the visual cortex for interpretation and perception. Through fMRI, one can determine which parts of the brain are activated during sensory perception: in blind persons, while they are only receiving tactile information, the visual cortex is also activated as they perceive the objects visually.[9] Touch-to-touch sensory substitution is also possible, wherein information from touch receptors of one region of the body can be used to perceive touch in another region. For example, in one experiment by Bach-y-Rita, touch perception was restored in a patient who lost peripheral sensation due to leprosy.[10]

==Technological support==

In order to achieve sensory substitution and stimulate the brain without intact sensory organs to relay the information, machines can be used to do the signal transduction, rather than the sensory organs. This brain–machine interface collects external signals and transforms them into electrical signals for the brain to interpret. Generally, a camera or a microphone is used to collect visual or auditory stimuli that are used to replace lost sight and hearing, respectively. The visual or auditory data collected from the sensors is transformed into tactile stimuli that are then relayed to the brain for visual and auditory perception. Crucially, this transformation sustains the sensorimotor contingency inherent to the respective sensory modality. This and all types of sensory substitution are only possible due to neuroplasticity.[10]

==Brain plasticity==

Brain plasticity refers to the brain's ability to adapt to a changing environment, for instance to the absence or deterioration of a sense. It is conceivable that cortical remapping or reorganization in response to the loss of one sense is an evolutionary mechanism that allows people to adapt and compensate by using their other senses better. Brain imaging studies have shown that following visual impairment and blindness (especially in the first 12–16 years of life) the visual cortices undergo a huge functional reorganization such that they are activated by other sensory modalities.[11][12][13] Such cross-modal plasticity was also found through functional imaging of congenitally blind patients, which showed cross-modal recruitment of the occipital cortex during perceptual tasks such as Braille reading, tactile perception, tactual object recognition, sound localization, and sound discrimination.[6] This may suggest that blind people can use their occipital lobe, generally used for vision, to perceive objects through other sensory modalities. This cross-modal plasticity may explain the often-described tendency of blind people to show enhanced ability in the other senses.[14][15][16][17][18]

==Perception versus sensing==

While considering the physiological aspects of sensory substitution, it is essential to distinguish between sensing and perceiving. The general question posed by this differentiation is: are blind people seeing, or perceiving to see by putting together different sensory data? While sensation comes in one modality – visual, auditory, tactile, etc. – perception due to sensory substitution is not one modality but a result of cross-modal interactions. It is therefore concluded that while sensory substitution for vision induces visual-like perception in sighted individuals, it induces auditory or tactile perception in blind individuals.[19] In short, blind people perceive to see through touch and audition with sensory substitution. In experiments with a tactile–visual sensory substitution (TVSS) device developed by Bach-y-Rita, subjects described the perceptual experience of the TVSS as distinctly visual, such that objects were perceived as if located in external space and not on the back or skin. Further studies using the TVSS showed that such perceptual changes were only possible when the participants could actively explore their environment with the TVSS.[20][21] These results have been underpinned by many other studies testing different substitution systems with blind subjects, such as vision-to-tactile substitution,[22] vision-to-auditory substitution[23][24] and vision-to-vestibular substitution.[25] Such results are also reported in sighted subjects when blindfolded,[26][27] and they deliver further support for the sensorimotor contingency theory.

==Different applications==

Applications are not restricted to disabled persons; they also include artistic presentations, games, and augmented reality. Some examples are the substitution of visual stimuli by audio or tactile stimuli, and of audio stimuli by tactile stimuli. Among the best known are Paul Bach-y-Rita's tactile vision sensory substitution (TVSS), developed with Carter Collins at the Smith-Kettlewell Institute, and Peter Meijer's seeing-with-sound approach (The vOICe). Technical developments, such as miniaturization and electrical stimulation, help advance sensory substitution devices.

In sensory substitution systems, sensors collect data from the external environment and relay it to a coupling system, which interprets and transduces the information and passes it to a stimulator; the stimulator ultimately stimulates a functioning sensory modality.[19] After training, people learn to use the information gained from this stimulation to experience a perception of the missing sense rather than of the sense actually being stimulated. For example, a leprosy patient whose perception of peripheral touch was restored was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient was able to experience data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.[10]
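The sensor, coupling system, and stimulator stages described above can be sketched as a generic three-stage pipeline. The class and function names here are purely illustrative, not from any published system.

```python
# Generic sketch of the three-stage architecture described above
# (names are illustrative): a sensor produces raw readings, a coupling
# system transduces them, and a stimulator delivers the result to an
# intact sensory channel.

class SubstitutionPipeline:
    def __init__(self, sensor, coupler, stimulator):
        self.sensor = sensor          # () -> raw signal
        self.coupler = coupler        # raw signal -> stimulus pattern
        self.stimulator = stimulator  # stimulus pattern -> side effect

    def step(self):
        raw = self.sensor()
        pattern = self.coupler(raw)
        self.stimulator(pattern)
        return pattern

# Toy example: a "camera" yields brightness values, the coupler
# thresholds them, and the "stimulator" records which tactors would fire.
fired = []
pipe = SubstitutionPipeline(
    sensor=lambda: [10, 200, 30, 250],
    coupler=lambda raw: [1 if v > 128 else 0 for v in raw],
    stimulator=fired.append,
)
pattern = pipe.step()
```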

==Tactile systems==

To understand tactile sensory substitution, it is essential to understand some basic physiology of the tactile receptors of the skin. There are five basic types of tactile receptors: Pacinian corpuscles, Meissner's corpuscles, Ruffini endings, Merkel nerve endings, and free nerve endings. These receptors are characterized mainly by the type of stimulus that best activates them and by their rate of adaptation to sustained stimuli.[28] Because some of these receptors adapt rapidly to sustained stimuli, they require rapidly changing tactile stimulation in order to be optimally activated.[29] Among these mechanoreceptors, the Pacinian corpuscle offers the highest sensitivity to high-frequency vibration, from a few tens of hertz to a few kilohertz, thanks to its specialized mechanotransduction mechanism.[30][31]

There are two types of stimulators: electrotactile and vibrotactile. Electrotactile stimulators use direct electrical stimulation of the nerve endings in the skin to initiate action potentials; the sensation triggered (burn, itch, pain, pressure, etc.) depends on the stimulating voltage. Vibrotactile stimulators use pressure and the properties of the mechanoreceptors of the skin to initiate action potentials. Both stimulation systems have advantages and disadvantages. With electrotactile stimulating systems, many factors affect the sensation triggered: stimulating voltage, current, waveform, electrode size, material, contact force, skin location, thickness, and hydration.[29] Electrotactile stimulation may involve direct stimulation of the nerves (percutaneous) or stimulation through the skin (transcutaneous). Percutaneous application causes additional distress to the patient, a major disadvantage of this approach. Furthermore, stimulation of the skin without insertion requires high-voltage stimulation because of the high impedance of dry skin,[29] unless the tongue is used as a receptor, which requires only about 3% as much voltage.[32] This latter technique is undergoing clinical trials for various applications and has been approved for assistance to the blind in the UK.[33][34] Alternatively, the roof of the mouth has been proposed as another area where low currents can be felt.[35]

Electrostatic arrays are being explored as human–computer interaction devices for touch screens.[36] These are based on a phenomenon called electrovibration, which allows microampere-level currents to be felt as roughness on a surface.[37][38]

Vibrotactile systems use the properties of mechanoreceptors in the skin so they have fewer parameters that need to be monitored as compared to electrotactile stimulation. However, vibrotactile stimulation systems need to account for the rapid adaptation of the tactile sense.

Another important aspect of tactile sensory substitution systems is the location of the tactile stimulation. Tactile receptors are abundant on the fingertips, face, and tongue, but sparse on the back, legs, and arms. It is essential to take into account the spatial resolution of the receptors, as it has a major effect on the resolution of the sensory substitution.[29] A high-resolution pin-array display can present spatial information via tactile symbols, such as city maps[39] and obstacle maps.[40]

The following sections describe some current tactile substitution systems.

Tactile–visual


One of the earliest and best-known sensory substitution devices was Paul Bach-y-Rita's TVSS, which converted the image from a video camera into a tactile image and coupled it to the tactile receptors on the back of a blind subject.[1] More recently, several new systems have been developed that interface the tactile image with tactile receptors on different areas of the body, such as the chest, brow, fingertip, abdomen, and forehead.[7] The tactile image is produced by hundreds of activators placed on the person; the activators are solenoids one millimeter in diameter. In experiments, blind (or blindfolded) subjects equipped with the TVSS can learn to detect shapes and to orient themselves. For simple geometric shapes, it took around 50 trials to achieve 100 percent correct recognition; identifying objects in different orientations required several hours of learning.

The tongue has proven a particularly practical site for the human–machine interface: it is protected by the closed mouth, and the saliva provides a good electrolytic environment that ensures good electrode contact.[32] Results from a study by Bach-y-Rita et al. show that electrotactile stimulation of the tongue required only 3% of the voltage needed to stimulate the finger.[32] Also, since it is more practical to wear an orthodontic retainer holding the stimulation system than an apparatus strapped to other parts of the body, the tongue–machine interface has become popular among TVSS systems.

This tongue TVSS system works by delivering electrotactile stimuli to the dorsum of the tongue via a flexible electrode array placed in the mouth. The electrode array is connected to a Tongue Display Unit (TDU) via a ribbon cable passing out of the mouth. A video camera records a picture and transfers it to the TDU for conversion into a tactile image, which is then projected onto the tongue via the ribbon cable, where the tongue's receptors pick up the signal. After training, subjects are able to associate certain types of stimuli with certain types of visual images.[7][41] In this way, tactile sensation can be used for visual perception.

Sensory substitution has also benefited from the emergence of wearable haptic actuators such as vibrotactile motors, solenoids, and Peltier diodes. At the Center for Cognitive Ubiquitous Computing at Arizona State University, researchers have developed technologies that enable people who are blind to perceive social situational information using wearable vibrotactile belts[42] (Haptic Belt) and gloves[43][44] (VibroGlove). Both technologies use miniature cameras mounted on a pair of glasses worn by the blind user. The Haptic Belt provides vibrations that convey the direction and distance of a person standing in front of the user, while the VibroGlove uses spatio-temporal mapping of vibration patterns to convey the facial expressions of the interaction partner. Alternatively, it has been shown that even very simple cues indicating the presence or absence of obstacles (through small vibration modules located at strategic places on the body) can be useful for navigation, gait stabilization, and reduced anxiety when moving through an unknown space. This approach, called the "Haptic Radar",[45] has been studied since 2005 by researchers at the University of Tokyo in collaboration with the University of Rio de Janeiro.[46] Similar products include the Eyeronman vest and belt[47][48][49] and the forehead retina system.[50]

Tactile–auditory


Neuroscientist David Eagleman presented a new device for sound-to-touch hearing at TED in 2015;[51] his laboratory research then expanded into Neosensory, a company based in Palo Alto, California.[52] Neosensory devices capture sound and turn it into high-dimensional patterns of touch on the skin.[53][54]

Experiments by Schurmann et al. show that tactile stimuli can activate the human auditory cortex, and vibrotactile stimuli can currently be used to facilitate hearing in both normal-hearing and hearing-impaired people.[55] To identify the auditory areas activated by touch, Schurmann et al. tested subjects while stimulating their fingers and palms with vibration bursts and their fingertips with tactile pressure. They found that tactile stimulation of the fingers led to activation of the auditory belt area, suggesting a relationship between audition and tactition.[55] Future research can therefore investigate the feasibility of a tactile–auditory sensory substitution system. One promising[citation needed] invention is the 'Sense organs synthesizer',[56] which aims to deliver a normal hearing range of nine octaves via 216 electrodes to sequential touch nerve zones next to the spine.

Tactile–vestibular


Some people with balance disorders, or who have had adverse reactions to antibiotics, develop bilateral vestibular damage (BVD). They experience difficulty maintaining posture, unstable gait, and oscillopsia.[57] Tyler et al. studied the restitution of postural control through a tactile-for-vestibular sensory substitution system. Because BVD patients cannot integrate visual and tactile cues, they have great difficulty standing. Using a head-mounted accelerometer and a brain–computer interface that employs electrotactile stimulation of the tongue, information about head–body orientation was relayed to the patient, providing a new source of data for orienting themselves and maintaining good posture.[57]

Tactile–tactile to restore peripheral sensation


In touch-to-touch sensory substitution, information from the touch receptors of one region is used to perceive touch in another. In one experiment by Bach-y-Rita, touch perception was restored in a patient who had lost peripheral sensation from leprosy.[10] The patient was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient was able to experience data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.[10] After two days of training, one of the leprosy subjects reported "the wonderful sensation of touching his wife, which he had been unable to experience for 20 years."[58]

Tactile feedback system for prosthetic limbs


The development of new technologies has made it feasible to provide patients who have prosthetic arms with tactile and kinesthetic sensibility.[59] While this is not purely a sensory substitution system, it uses the same principles to restore perception of the senses. Tactile feedback methods for restoring a perception of touch to amputees include direct stimulation or microstimulation of the tactile nerve afferents.[59]

Other applications of sensory substitution systems can be seen in functional robotic prostheses for patients with high-level quadriplegia. These robotic arms have several mechanisms for detecting slip, vibration, and texture, which they relay to the patient through feedback.[58] After further research and development, the information from these arms could be used by patients to perceive that they are holding and manipulating objects while the robotic arm actually accomplishes the task.

Auditory systems


Like tactile sensory substitution systems, auditory sensory substitution systems use one sensory modality to compensate for the lack of another. In auditory sensory substitution, visual or tactile sensors detect and record information about the external environment, which interfaces then transform into sound. Most such systems are auditory-vision substitutions aimed at using the sense of hearing to convey visual information to the blind.

The vOICe Auditory Display


"The vOICe" converts live views from a video camera into soundscapes: patterns of scores of different tones emitted simultaneously at different volumes and pitches.[60] The vOICe technology was invented in the 1990s by Peter Meijer and uses a general video-to-audio mapping, associating height with pitch and brightness with loudness in a left-to-right scan of each video frame.[7]
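The mapping described, height to pitch and brightness to loudness in a left-to-right scan, can be illustrated with a minimal sketch. This is not the actual vOICe implementation; the frequency range, scan rate, and function names below are invented for illustration:

```python
import math

def column_to_samples(column, f_lo=500.0, f_hi=5000.0,
                      duration=0.01, rate=8000):
    """Render one image column (top-to-bottom brightness values in
    [0, 1]) as a short audio snippet: one sinusoid per pixel row,
    with brightness controlling that sinusoid's amplitude."""
    n_rows = len(column)
    samples = []
    for i in range(int(duration * rate)):
        t = i / rate
        s = 0.0
        for row, brightness in enumerate(column):
            # Higher image positions (smaller row index) -> higher pitch.
            frac = 1.0 - row / max(n_rows - 1, 1)
            freq = f_lo + frac * (f_hi - f_lo)
            s += brightness * math.sin(2 * math.pi * freq * t)
        samples.append(s / n_rows)  # crude normalization by row count
    return samples

def image_to_soundscape(image):
    """Scan columns left to right, concatenating their snippets."""
    n_cols = len(image[0])
    out = []
    for col in range(n_cols):
        out.extend(column_to_samples([row[col] for row in image]))
    return out
```

Each video frame thus becomes one left-to-right sweep of sound, with bright pixels near the top of the image heard as loud, high-pitched tones early or late in the sweep depending on their horizontal position.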

EyeMusic


The EyeMusic user wears a miniature camera connected to a small computer (or smartphone) and stereo headphones. The images are converted into "soundscapes": high locations in the image are rendered as high-pitched musical notes on a pentatonic scale, and low locations as low-pitched notes.

The EyeMusic conveys color information by using a different musical instrument for each of five colors: white, blue, red, green, and yellow. It employs an intermediate resolution of 30×50 pixels.[61][62][63]
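A sketch of this kind of mapping, vertical position to a pentatonic pitch and color to an instrument, might look as follows. The specific scale, base note, and color-to-instrument table here are illustrative assumptions, not the EyeMusic's actual tables:

```python
# C-major pentatonic degrees as semitone offsets within an octave.
PENTATONIC = [0, 2, 4, 7, 9]

# Hypothetical color-to-timbre table (for illustration only).
COLOR_TO_INSTRUMENT = {
    "white": "choir",
    "blue": "trumpet",
    "red": "organ",
    "green": "reed",
    "yellow": "violin",
}

def pixel_to_note(row, n_rows=30, base_midi=48):
    """Map a row index (0 = top of the image) to a pentatonic MIDI
    note, with higher image positions producing higher pitches."""
    steps_from_bottom = n_rows - 1 - row
    octave, degree = divmod(steps_from_bottom, len(PENTATONIC))
    return base_midi + 12 * octave + PENTATONIC[degree]

def pixel_to_event(row, color):
    """Combine the two mappings: one pixel -> (instrument, pitch)."""
    return (COLOR_TO_INSTRUMENT[color], pixel_to_note(row))
```

Restricting pitches to a pentatonic scale means that simultaneously sounding pixels never produce harsh dissonances, which is one plausible reason for the design choice.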

LibreAudioView


This project, presented in 2015,[64] proposes a versatile mobile device and a sonification method designed specifically for the pedestrian locomotion of the visually impaired. It sonifies, in real time, spatial information from a video stream acquired at a standard frame rate. The device consists of a miniature camera integrated into a glasses frame, connected to a battery-powered minicomputer worn around the neck on a strap. The audio signal is transmitted to the user via headphones. The system has two operating modes: in the first, used when the user is static, only the edges of moving objects are sonified; in the second, used when the user is moving, the edges of both static and moving objects are sonified. The video stream is thus simplified by extracting only the edges of objects that could become dangerous obstacles. The system enables the localization of moving objects, the estimation of trajectories, and the detection of approaching objects.
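The two-mode logic can be sketched with a trivial frame-difference motion test and gradient-threshold edge detector. This is an assumed simplification for illustration, not LibreAudioView's actual image processing:

```python
def edge_mask(frame, thresh=0.2):
    """frame[y][x]: luminosity in [0, 1]. Mark pixels with a strong
    horizontal gradient as edges (a deliberately crude detector)."""
    h, w = len(frame), len(frame[0])
    return [[x + 1 < w and abs(frame[y][x + 1] - frame[y][x]) > thresh
             for x in range(w)] for y in range(h)]

def sonifiable_pixels(prev, curr, user_moving, thresh=0.2):
    """Select which pixels feed the sonification stage.

    Static mode (user_moving=False): only edges of moving objects,
    i.e. edge pixels that also changed between frames.
    Moving mode (user_moving=True): edges of all objects."""
    edges = edge_mask(curr, thresh)
    h, w = len(curr), len(curr[0])
    if user_moving:
        return edges
    return [[edges[y][x] and abs(curr[y][x] - prev[y][x]) > thresh
             for x in range(w)] for y in range(h)]
```

In both modes the downstream sonification only ever sees a sparse set of edge pixels, which keeps the audio signal simple enough to interpret while walking.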

PSVA


Another successful visual-to-auditory sensory substitution device is the Prosthesis Substituting Vision for Audition (PSVA).[65] This system uses a head-mounted TV camera for real-time, online translation of visual patterns into sound. As the patient moves around, the device captures visual frames at a high frequency and generates the corresponding complex sounds that allow recognition.[7] Visual stimuli are transduced into auditory stimuli using a pixel-to-frequency relationship that couples a rough model of the human retina with an inverse model of the cochlea.[65]

The Vibe


The sound produced by this software is a mixture of sinusoids produced by virtual "sources", each corresponding to a "receptive field" in the image, i.e. a set of localized pixels. Each source's amplitude is determined by the mean luminosity of the pixels of the corresponding receptive field, while its frequency and inter-aural disparity are determined by the center of gravity of the coordinates of the receptive field's pixels in the image (see Auvray M., Hanneton S., Lenay C., O'Regan K., "There is something out there: distal attribution in sensory substitution, twenty years later", Journal of Integrative Neuroscience 4 (2005) 505–521). The Vibe is an open-source project hosted on SourceForge.
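The per-source parameter computation described above can be sketched directly from that description. The frequency range and function name are invented for illustration and are not taken from The Vibe's source code:

```python
def source_parameters(field, image, width, height,
                      f_lo=200.0, f_hi=2000.0):
    """Compute one virtual source's parameters from its receptive field.

    field:       list of (x, y) pixel coordinates in the image.
    image[y][x]: luminosity in [0, 1].
    Returns (amplitude, frequency, pan); pan in [-1, 1] stands in for
    the inter-aural disparity (left = -1, right = +1)."""
    # Amplitude: mean luminosity over the receptive field's pixels.
    amp = sum(image[y][x] for x, y in field) / len(field)
    # Center of gravity of the field's pixel coordinates.
    cx = sum(x for x, _ in field) / len(field)
    cy = sum(y for _, y in field) / len(field)
    # Higher fields (smaller cy) get higher frequencies.
    freq = f_lo + (1.0 - cy / (height - 1)) * (f_hi - f_lo)
    pan = 2.0 * cx / (width - 1) - 1.0
    return amp, freq, pan
```

The final soundscape is then simply the sum of one sinusoid per receptive field, each rendered with its (amplitude, frequency, pan) triple.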

Other systems


Other approaches to the substitution of hearing for vision use binaural directional cues, much as natural human echolocation does. An example of this approach is the "SeeHear" chip from Caltech.[66]

Other visual-to-auditory substitution devices deviate from the vOICe's greyscale mapping of images. Zach Capalbo's Kromophone uses a basic color spectrum, correlated with different sounds and timbres, to give users perceptual information beyond the vOICe's capabilities.[67]

Nervous system implants


By means of stimulating electrodes implanted into the human nervous system, it is possible to apply current pulses that the recipient can learn to reliably recognize. Kevin Warwick has shown experimentally that signals from force/touch sensors on a robot hand can be employed in this way as a means of communication.[68]

Terminology


It has been argued that the term "substitution" is misleading, as it is merely an "addition" or "supplementation" not a substitution of a sensory modality.[69]

Sensory augmentation


Building upon the research conducted on sensory substitution, investigations into the possibility of augmenting the body's sensory apparatus are now beginning. The intention is to extend the body's ability to sense aspects of the environment that are not normally perceivable by the body in its natural state. Moreover, such new information about the environment could be used not to replace a sensory organ directly, but to convey information usually perceived through another, potentially damaged, sensory modality. Sensory augmentation is therefore also widely used for rehabilitation as well as for investigating perceptual and cognitive neuroscience.

Active work in this direction is being conducted by, among others, the e-sense project[70] of the Open University and Edinburgh University, the feelSpace project of the University of Osnabrück, and the hearSpace project at the University of Paris.

The findings of research into sensory augmentation (as well as sensory substitution in general) that investigate the emergence of perceptual experience (qualia) from the activity of neurons have implications for the understanding of consciousness.[8]


References

  1. ^ a b Bach-y-Rita P, Collins CC, Saunders F, White B, Scadden L (1969). "Vision substitution by tactile the image projection". Nature. 221 (5184): 963–964. Bibcode:1969Natur.221..963B. doi:10.1038/221963a0. PMID 5818337. S2CID 4179427.
  2. ^ Nicholas Humphrey (1999). A History of the Mind: Evolution and the Birth of Consciousness. Springer. ISBN 978-0-387-98719-4.
  3. ^ Scheff, Chaim-Meyer (1 January 1986). "Experimental Model for the Study of Changes in the Organization of Human Sensory Information Processing Through the Design and Testing of Non-invasive Prosthetic Devices for Sensory Impaired People". ACM SIGCAPH Computers and the Physically Handicapped (36): 3–10. doi:10.1145/15711.15713. S2CID 11924232.
  4. ^ Bach-y-Rita P (2004). "Tactile sensory substitution studies". Annals of the New York Academy of Sciences. 1013 (1): 83–91. Bibcode:2004NYASA1013...83B. doi:10.1196/annals.1305.006. PMID 15194608. S2CID 44624575.
  5. ^ O'Regan, John Kevin; Noë, Alva (2001). "A sensorimotor account of vision and visual consciousness". Behavioral and Brain Sciences. 24 (5): 939–1031. doi:10.1017/S0140525X01000115. PMID 12239892.
  6. ^ a b Renier L, De Volder AG (2005). "Cognitive and brain mechanisms in sensory substitution of vision: a contribution to the study of human perception". Journal of Integrative Neuroscience. 4 (4): 489–503. doi:10.1142/S0219635205000999. PMID 16385643.
  7. ^ a b c d e Bach-y-Rita P, Kercel SW (2003). "Sensory substitution and the human–machine interface" (PDF). Trends in Cognitive Sciences. 7 (12): 541–546. CiteSeerX 10.1.1.159.9777. doi:10.1016/j.tics.2003.10.013. PMID 14643370. S2CID 2466391.
  8. ^ a b O'Regan, JK; Noe, A. (2001). "A sensorimotor account of vision and visual consciousness". Behavioral and Brain Sciences. 24 (5): 939–973. doi:10.1017/s0140525x01000115. PMID 12239892. S2CID 22606536.
  9. ^ Bach-y-Rita P. Brain Mechanisms in Sensory Substitution, Academic Press New York:1972.
  10. ^ a b c d e Bach-y-Rita P. Nonsynaptic Diffusion Neurotransmission and Late Brain Reorganization, Demos-Vermande, New York :1995.
  11. ^ Collignon, O.; Lassonde, M.; Lepore, F.; Bastien, D.; Veraart, C. (2007). "Functional cerebral reorganization for auditory spatial processing and auditory substitution of vision in early blind subjects". Cerebral Cortex. 17 (2): 457–465. doi:10.1093/cercor/bhj162. PMID 16581983.
  12. ^ Sadato, N.; Okada, T.; Honda, M.; Yonekura, Y. (2002). "Critical period for cross-modal plasticity in blind humans: a functional MRI study". NeuroImage. 16 (2): 389–400. doi:10.1006/nimg.2002.1111. PMID 12030824. S2CID 1927987.
  13. ^ Cohen, L.G.; Sadato, N.; Celnik, P.; Ishii, K.; Hallett, M. (1999). "Period of susceptibility for cross-modal plasticity in the blind". Annals of Neurology. 45 (4): 451–460. doi:10.1002/1531-8249(199904)45:4<451::AID-ANA6>3.0.CO;2-B. PMID 10211469. S2CID 22588834.
  14. ^ Van Boven, R. W.; Hamilton, R. H.; Kauffman, T.; Keenan, J. P.; Pascual-Leone, A. (2000-06-27). "Tactile spatial resolution in blind braille readers". Neurology. 54 (12): 2230–2236. doi:10.1212/wnl.54.12.2230. ISSN 0028-3878. PMID 10881245. S2CID 12053536.
  15. ^ Goldreich, Daniel; Kanics, Ingrid M. (2003-04-15). "Tactile acuity is enhanced in blindness". The Journal of Neuroscience. 23 (8): 3439–3445. doi:10.1523/JNEUROSCI.23-08-03439.2003. ISSN 1529-2401. PMC 6742312. PMID 12716952.
  16. ^ Goldreich, Daniel; Kanics, Ingrid M. (November 2006). "Performance of blind and sighted humans on a tactile grating detection task". Perception & Psychophysics. 68 (8): 1363–1371. doi:10.3758/bf03193735. ISSN 0031-5117. PMID 17378422.
  17. ^ Wong, Michael; Gnanakumaran, Vishi; Goldreich, Daniel (2011-05-11). "Tactile Spatial Acuity Enhancement in Blindness: Evidence for Experience-Dependent Mechanisms". Journal of Neuroscience. 31 (19): 7028–7037. doi:10.1523/jneurosci.6461-10.2011. PMC 6703211. PMID 21562264.
  18. ^ Bhattacharjee, Arindam; Ye, Amanda J.; Lisak, Joy A.; Vargas, Maria G.; Goldreich, Daniel (2010-10-27). "Vibrotactile Masking Experiments Reveal Accelerated Somatosensory Processing in Congenitally Blind Braille Readers". Journal of Neuroscience. 30 (43): 14288–14298. doi:10.1523/jneurosci.1447-10.2010. PMC 3449316. PMID 20980584.
  19. ^ a b Poirier C, De Volder AG, Scheiber C (2007). "What neuroimaging tells us about sensory substitution". Neuroscience & Biobehavioral Reviews. 31 (7): 1064–1070. doi:10.1016/j.neubiorev.2007.05.010. PMID 17688948. S2CID 45083182.
  20. ^ Bach-Y-Rita, P.; Kercel, S. W. (2003). "Sensory substitution and the human–machine interface". Trends in Cognitive Sciences. 7 (12): 541–546. doi:10.1016/j.tics.2003.10.013. PMID 14643370. S2CID 2466391.
  21. ^ Bach-Y-Rita, P. (2004). "Tactile sensory substitution studies". Annals of the New York Academy of Sciences. 1013 (1): 83–91. Bibcode:2004NYASA1013...83B. doi:10.1196/annals.1305.006. PMID 15194608. S2CID 44624575.
  22. ^ Sampaio, E.; Maris, S.; Bach-Y-Rita, P. (2001). "Brain plasticity: 'Visual' acuity of blind persons via the tongue". Brain Research. 908 (2): 204–207. doi:10.1016/S0006-8993(01)02667-1. PMID 11454331. S2CID 17323161.
  23. ^ Abboud, S.; Hanassy, S.; Levy-Tzedek, S.; Maidenbaum, S.; Amedi, A. (2014). "EyeMusic: Introducing a visual colorful experience for the blind using auditory sensory substitution". Restor Neurol Neurosci. 32 (2): 247–257. doi:10.3233/RNN-130338. PMID 24398719.
  24. ^ Levy-Tzedek, S.; Hanassy, S.; Abboud, S.; Maidenbaum, S.; Amedi, A. (2012). "Fast, accurate reaching movements with a visual-to-auditory sensory substitution device". Restor Neurol Neurosci. 30 (4): 313–323. doi:10.3233/RNN-2012-110219. PMID 22596353.
  25. ^ Tyler, M.; Danilo, Y.; Bach-Y-Rita, P. (2003). "Closing an open-loop control system: vestibular substitution through the tongue". Journal of Integrative Neuroscience. 2 (3): 159–164. doi:10.1142/S0219635203000263. PMID 15011268. S2CID 14565593.
  26. ^ Schumann, F.; O'Regan, K. (2017). "Sensory augmentation: integration of an auditory compass signal into human perception of space". Scientific Reports. 7 (1): 42197. Bibcode:2017NatSR...742197S. doi:10.1038/srep42197. PMC 5307328. PMID 28195187.
  27. ^ König, S.U.; Schumann, F.; Keyser, J.; Goeke, C.; Krause, C.; Wache, S. (2016). "Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception". PLOS ONE. 11 (12): e0166647. Bibcode:2016PLoSO..1166647K. doi:10.1371/journal.pone.0166647. PMC 5154504. PMID 27959914.
  28. ^ Vallbo AB, Johansson RS (1984). "Properties of cutaneous mechanoreceptors in the human hand related to touch sensation". Human Neurobiology. 3 (1): 3–14. PMID 6330008.
  29. ^ a b c d Kaczmarek KA, Webster JG, Bach-y-Rita P, Tompkins WJ (1991). "Electrotactile and vibrotactile displays for sensory substitution systems". IEEE Transactions on Biomedical Engineering. 38 (1): 1–16. doi:10.1109/10.68204. PMID 2026426. S2CID 23047344.
  30. ^ Biswas, Abhijit; Manivannan, M.; Srinivasan, Mandyam A. (2015). "Vibrotactile Sensitivity Threshold: Nonlinear Stochastic Mechanotransduction Model of the Pacinian Corpuscle". IEEE Transactions on Haptics. 8 (1): 102–113. doi:10.1109/TOH.2014.2369422. PMID 25398183. S2CID 15326972.
  31. ^ Biswas, Abhijit; Manivannan, M.; Srinivasan, Mandyam A. (2015). "Multiscale Layered Biomechanical Model of the Pacinian Corpuscle". IEEE Transactions on Haptics. 8 (1): 31–42. doi:10.1109/TOH.2014.2369416. PMID 25398182. S2CID 24658742.
  32. ^ a b c Bach-y-Rita P, Kaczmarek KA, Tyler ME, Garcia-Lara J (1998). "Form perception with a 49-point electrotactile stimulus array on the tongue: a technical note" (PDF). J Rehabil Res Dev. 35 (4): 427–30. PMID 10220221. See also Brainport
  33. ^ Layton, Julia (2006-07-17). "How BrainPort Works". HowStuffWorks. Retrieved July 21, 2016.
  34. ^ "Wicab Announces European Market Approval for its Non-Invasive Assistive Aid for the Blind" (PDF) (Press Release). Wicab, Inc. Archived from the original (PDF) on May 7, 2013.
  35. ^ Hui Tang; D. J. Beebe (2003). "Design and microfabrication of a flexible oral electrotactile display". Journal of Microelectromechanical Systems. 12 (1): 29–36. doi:10.1109/JMEMS.2002.807478.
  36. ^ Deyle, Travis (August 11, 2010). "Electrotactile Arrays for Texture and Pressure Feedback During Robotic Teleoperation". Hizook. Retrieved July 21, 2016.
  37. ^ Grimnes S (1983). "Electrovibration, cutaneous sensation of microampere current" (PDF). Acta Physiologica Scandinavica. 118 (2): 19–25. doi:10.1111/j.1748-1716.1983.tb07235.x. PMID 6624501.
  38. ^ Kurt A. Kaczmarek; Krishnakant Nammi; Abhishek K. Agarwal; Mitchell E. Tyler; Steven J. Haase; David J. Beebec (2006). "Polarity effect in electrovibration for tactile display". IEEE Transactions on Biomedical Engineering. 53 (10): 2047–2054. doi:10.1109/TBME.2006.881804. PMC 2582732. PMID 17019869.
  39. ^ Zeng; et al. (2015). "Interactive Audio-haptic Map Explorer on a Tactile Display". Interacting with Computers. 27 (4): 413–429. doi:10.1093/iwc/iwu006.
  40. ^ Zeng; et al. (2012). "Exploration and avoidance of surrounding obstacles for the visually impaired". Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility - ASSETS '12. pp. 111–118. doi:10.1145/2384916.2384936. ISBN 9781450313216. S2CID 18623028.
  41. ^ Bach-y-Rita P, and Kaczmarek KA. (2002). Tongue placed tactile output device. US Patent 6,430,450.
  42. ^ T. McDaniel; S. Krishna; V. Balasubramanian; D. Colbry; S. Panchanathan (2008). Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. IEEE International Workshop on Haptic, Audio and Visual Environments and Games, 2008. HAVE 2008. pp. 13–18. doi:10.1109/HAVE.2008.4685291.
  43. ^ Paul, Sathish Kumar; Rekha, V.; Sivarasu, Sudesh (2012). "Tactile Sensing Fabrics for Detecting Impairments in Leprosy Patients". Appropriate Healthcare Technologies for Developing Countries – AHT2012. The 7th International Conference – World Health and Wellbeing. London, UK: IEEE. doi:10.1049/cp.2012.1461.
  44. ^ S. Krishna; S. Bala; T. McDaniel; S. McGuire; S. Panchanathan (2010). VibroGlove: an assistive technology aid for conveying facial expressions (PDF). Proceedings of the 28th of the international conference extended abstracts on Human factors in computing systems. Atlanta, Georgia, USA: ACM. pp. 3637–3642. doi:10.1145/1753846.1754031.
  45. ^ "The Haptic Radar / Extended Skin Project". Ishikawa Watanabe Laboratory.
  46. ^ A. Cassinelli; E. Sampaio; S.B. Joffily; H.R.S. Lima; B.P.G.R. Gusmão (2014). "Do blind people move more confidently with the Tactile Radar?" (PDF). Technology and Disability. 26 (2–3): 161–170. doi:10.3233/TAD-140414. Archived from the original (PDF) on 2017-03-14. Retrieved 2016-07-21.
  47. ^ "Seeing-Eye Vest? Vibrating Clothing Helps Blind Navigate". 14 November 2014. Retrieved June 21, 2019.
  48. ^ "Vibrating vest could help the blind avoid obstacles". CBS News. 13 June 2014. Retrieved June 21, 2019.
  49. ^ "Vibrating Clothes Could Help Blind People Navigate". Business Insider. Retrieved June 21, 2019.
  50. ^ "Forehead Retina System". 2006-08-08. Retrieved June 21, 2019.
  51. ^ Eagleman, David (2015). Can we create new senses for humans? TED talks.
  52. ^ Neosensory, Inc
  53. ^ Could This Futuristic Vest Give Us a Sixth Sense?, Smithsonian Magazine, Apr 2018.
  54. ^ Meet the man who wants to give humans a sixth sense, The Telegraph, Jan 2019.
  55. ^ a b Schurmann M, Caetano G, Hlushchuk Y, Jousmaki V, Hari R (2006). "Touch activates human auditory cortex". NeuroImage. 30 (4): 1325–1331. doi:10.1016/j.neuroimage.2005.11.020. PMID 16488157. S2CID 22045169.
  56. ^ "Sense organs synthesizer United States Patent Application 20020173823" (PDF).
  57. ^ a b Tyler M, Danilov Y, Bach-y-Rita P (2003). "Closing an open-loop control system: vestibular substitution through the tongue". Journal of Integrative Neuroscience. 2 (2): 159–164. doi:10.1142/S0219635203000263. PMID 15011268. S2CID 14565593.
  58. ^ a b Bach-y-Rita P (1999). "Theoretical aspects of sensory substitution and of neurotransmission-related reorganization in spinal cord injury". Spinal Cord. 37 (7): 465–474. doi:10.1038/sj.sc.3100873. PMID 10438112.
  59. ^ a b Riso RR (1999). "Strategies for providing upper extremity amputees with tactile and hand position feedback – moving closer to the bionic arm" (PDF). Technology and Health Care. 7 (6): 401–409. doi:10.3233/THC-1999-7604. PMID 10665673. Archived from the original (PDF) on 2010-06-27. Retrieved 2016-07-21.
  60. ^ Meijer PBL (1992). "An Experimental System for Auditory Image Representations". IEEE Transactions on Biomedical Engineering. 39 (2): 112–121. doi:10.1109/10.121642. PMID 1612614. S2CID 34811735.
  61. ^ Levy-Tzedek, Shelly; Hanassy S.; Abboud S.; Maidenbaum S.; Amedi A. (January 1, 2012). "Fast, accurate reaching movements with a visual-to-auditory sensory substitution device" (PDF). Restorative Neurology and Neuroscience. 30 (4): 313–323. doi:10.3233/RNN-2012-110219. PMID 22596353.
  62. ^ Abboud, Sami; Hanassy S; Levy-Tzedek S; Maidenbaum S; Amedi A. (2014). "EyeMusic: Introducing a "visual" colorful experience for the blind using auditory sensory substitution" (PDF). Restorative Neurology and Neuroscience. 32 (2): 247–257. doi:10.3233/RNN-130338. PMID 24398719.
  63. ^ Maidenbaum, Shachar; Abboud S.; Amedi A. (April 2014). "Sensory substitution: Closing the gap between basic research and widespread practical visual rehabilitation" (PDF). Neuroscience & Biobehavioral Reviews. 41: 3–15. doi:10.1016/j.neubiorev.2013.11.007. PMID 24275274.
  64. ^ Ambard, Maxime; Benezeth Y.; Pfister P. (2015). "Mobile video-to-audio transducer and motion detection for sensory substitution". Frontiers in ICT. 2. doi:10.3389/fict.2015.00020.
  65. ^ a b Capelle C, Trullemans C, Arno P, Veraart C (1998). "A real-time experimental prototype for enhancement of Vision Rehabilitation–vision rehabilitation using auditory substitution". IEEE Transactions on Biomedical Engineering. 45 (10): 1279–1293. doi:10.1109/10.720206. PMID 9775542. S2CID 7865760.
  66. ^ Nielson L, Mahowald M, Mead C (1989). "SeeHear". In C. Mead, Analog VLSI and Neural Systems. Reading: Addison-Wesley, chapter 13, pp. 207–227.
  67. ^ [1] [dead link]
  68. ^ Warwick K, Gasson M, Hutt B, Goodhew I, Kyberd P, Schulzrinne H, Wu X (2004). "Thought communication and control: A first step using radiotelegraphy". IEE Proceedings - Communications. 151 (3): 185–189. doi:10.1049/ip-com:20040409 (inactive 7 December 2024). S2CID 15804806.{{cite journal}}: CS1 maint: DOI inactive as of December 2024 (link)
  69. ^ Lenay C, Gapenne O, Hanneton S, Marque C, Geouelle C (2003). "Sensory Substitution: limits and perspectives". Touching for Knowing, Cognitive psychology of haptic manual perception (PDF). pp. 275–292.
  70. ^ "Home - e-sense project". Archived from the original on 2014-08-10. Retrieved 2014-08-06.