
Psychology Wiki

Sensory substitution



Sensory substitution is the transformation of the characteristics of one sensory modality into stimuli of another sensory modality. It is hoped that sensory substitution systems can help handicapped people by restoring access to a defective sensory modality through information delivered via a functioning one. A sensory substitution system consists of three parts: a sensor, a coupling system, and a stimulator. The sensor records stimuli and relays them to the coupling system, which interprets these signals and transmits them to the stimulator. Because sensory substitution engages human perception and the plasticity of the human brain, it also allows these aspects of neuroscience to be studied through neuroimaging.

History

Sensory substitution was introduced in the 1960s by Paul Bach-y-Rita as a means of using one sensory modality, mainly tactition, to gain environmental information normally acquired by another sensory modality, mainly vision.[1][2] The first sensory substitution system was developed by Bach-y-Rita et al. as a demonstration of brain plasticity in congenitally blind individuals.[3] Since this historic invention, sensory substitution has been the basis of many studies in perceptual and cognitive neuroscience and has contributed to the study of brain function, human cognition, and rehabilitation.[4]

Physiology of sensory substitution

When a person becomes blind or deaf, they generally do not lose the ability to hear or see; they lose the ability to transmit sensory signals from the periphery (the retina for vision and the cochlea for hearing) to the brain.[5] Since the visual processing pathways remain intact, a person who has lost the ability to retrieve data from the retina can still form subjective images from data gathered through other sensory modalities such as touch or audition.[6]

In the normal visual system, the data collected by the retina are converted into an electrical stimulus in the optic nerve and relayed to the brain, which re-creates the image and perceives it. Because it is the brain that is responsible for the final percept, sensory substitution is possible: during sensory substitution, an intact sensory modality relays information to the visual perception areas of the brain so that the person can experience vision. Information gained from one sensory modality can thus reach brain structures physiologically associated with other sensory modalities. Touch-to-vision sensory substitution, for example, transfers information from touch receptors to the visual cortex for interpretation and perception. Through fMRI we can determine which parts of the brain are activated during sensory perception: in blind persons who are receiving only tactile information, the visual cortex is also activated as they perceive objects.[7] There is also touch-to-touch sensory substitution, in which information from the touch receptors of one region is used to perceive touch in another region. In one experiment, for example, Bach-y-Rita restored touch perception in a patient who had lost peripheral sensation from leprosy.[8]

To achieve sensory substitution and stimulate the brain without intact sensory organs to relay the information, machines must perform the signal transduction. In this brain-machine interface, external signals are collected and transduced into electrical signals for the brain to interpret. Generally a camera or a microphone collects the visual or auditory stimuli that replace the lost sensory information. The visual or auditory data collected by the sensors are transduced into tactile stimuli that are then relayed to the brain for visual or auditory perception. This type of sensory substitution is possible only because of the plasticity of the brain.[8]

Brain plasticity

Brain plasticity is the brain's ability to adapt to the complete absence or deterioration of a sense; sensory substitution is therefore most likely explained through its study. Cortical re-mapping or reorganization takes place when the brain experiences such deterioration, an evolutionary mechanism that allows people deprived of one sense to adapt and compensate with the others. Functional imaging of congenitally blind patients has shown cross-modal recruitment of the occipital cortex during the performance of perceptual tasks such as Braille reading, tactile perception, tactual object recognition, sound localization, and sound discrimination.[4] This shows that blind people can use their occipital lobe, generally devoted to vision, to perceive objects through other sensory modalities, which would explain their oft-displayed increased acuity in the remaining senses.

Perception versus sensing

When discussing the physiological aspects of sensory substitution, it is essential to distinguish between sensing and perceiving. The general question posed by this distinction is: are blind people seeing, or perceiving sight by putting together different sensory data? While sensation arrives in one modality (visual, auditory, tactile, etc.), perception due to sensory substitution is not confined to one modality but results from cross-modal interactions. We can therefore say that while sensory substitution for vision induces visual-like perception in sighted individuals, it induces auditory or tactile perception in blind individuals.[9] In short, with sensory substitution blind people perceive sight through touch and audition.

Different applications of sensory substitution

Applications are not restricted to handicapped persons; they also include artistic presentations, games, and augmented reality. Examples include the substitution of visual stimuli by audio or tactile stimuli, and of audio stimuli by tactile stimuli. Among the best known are Paul Bach-y-Rita's Tactile Vision Sensory Substitution (TVSS), developed with Carter Collins at the Smith-Kettlewell Institute, and Peter Meijer's seeing-with-sound approach (The vOICe). Technical developments, such as miniaturization and electrical stimulation, are helping to advance sensory substitution devices.

In sensory substitution systems, sensors generally collect data from the external environment. These data are relayed to a coupling system that interprets and transduces the information and then relays it to a stimulator, which ultimately stimulates a functioning sensory modality.[9] After training, people learn to use the information gained from this stimulation to experience a perception of the missing sense rather than of the sense actually being stimulated. For example, a leprosy patient whose perception of peripheral touch was restored was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient experienced data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.[8]
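The sensor, coupling system, and stimulator stages described above can be sketched in code. This is a minimal illustration only; the class names and the brightness-to-intensity mapping are hypothetical, and real devices run this loop in dedicated hardware rather than Python objects.

```python
class Sensor:
    """Collects raw data from the environment (here, a tiny grayscale frame)."""
    def read(self):
        # Placeholder: a 3x3 "image" with pixel values in 0..255.
        return [[0, 128, 255],
                [64, 192, 32],
                [255, 0, 128]]

class CouplingSystem:
    """Interprets sensor data and transduces it for the stimulator."""
    def transduce(self, frame):
        # Map each pixel's brightness to a stimulation intensity in 0..1.
        return [[pixel / 255.0 for pixel in row] for row in frame]

class Stimulator:
    """Drives the substituting modality (e.g. a tactile array)."""
    def apply(self, intensities):
        # A real device would drive electrodes or vibrators here;
        # this sketch just returns the pattern that would be delivered.
        return intensities

sensor, coupler, stimulator = Sensor(), CouplingSystem(), Stimulator()
pattern = stimulator.apply(coupler.transduce(sensor.read()))
print(pattern[0][2])  # brightest pixel maps to full intensity: 1.0
```

The essential point the sketch captures is that the stimulator never sees raw environmental data; everything passes through the coupling system's transduction step.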

Tactile sensory substitution systems

To understand tactile sensory substitution, it is essential to understand some basic physiology of the tactile receptors of the skin. There are six basic types of tactile receptors: Pacinian corpuscles, Meissner's corpuscles, Ruffini endings, Merkel nerve endings, free nerve endings, and tactile disks. These receptors are mainly characterized by their adaptation rates and their thresholds.[10] Because most of these receptors have relatively high thresholds and adapt rapidly to a constant stimulus, tactile stimulation systems must deliver rapidly changing stimulation.[11]

There are two types of stimulators: electrotactile and vibrotactile. Electrotactile stimulators use direct electrical stimulation of the nerve endings in the skin to initiate action potentials; the sensation triggered (burn, itch, pain, pressure, etc.) depends on the stimulating voltage. Vibrotactile stimulators use pressure and the properties of the skin's mechanoreceptors to initiate action potentials. Both stimulation systems have advantages and disadvantages. With electrotactile stimulation, many factors affect the sensation triggered: stimulating voltage, current, waveform, electrode size, material, contact force, skin location, thickness, and hydration.[11] Electrotactile stimulation can require direct stimulation of the nerves, and thus insertion of electrode needles into the skin,[12] which causes additional distress to the patient and is a major disadvantage of electrotactile arrays. Furthermore, stimulating the skin without insertion requires high voltages because of the high impedance of the skin.[11] Vibrotactile systems exploit the properties of the skin's mechanoreceptors and so have fewer parameters to monitor than electrotactile stimulation; however, they must account for the rapid adaptation of the tactile sense.

Another important aspect of tactile sensory substitution systems is the location of the tactile stimulation. Tactile receptors are abundant on the fingertips, face, and tongue but sparse on the back, legs, and arms. It is essential to take the spatial resolution of the receptors into account, as it has a major effect on the resolution of the sensory substitution.[11]

Below you can find some descriptions of current tactile substitution systems.

Tactile–vision substitution

One of the earliest and best known sensory substitution devices was Paul Bach-y-Rita's TVSS, which converted the image from a video camera into a tactile image and coupled it to the tactile receptors on the back of his blind subject.[1] Recently, several new systems have been developed that interface the tactile image with tactile receptors on different areas of the body, such as the chest, brow, fingertip, abdomen, and forehead.[5] The tactile image is produced by four hundred activators, solenoids one millimeter in diameter, placed on the subject. In experiments, blind (or blindfolded) subjects equipped with the TVSS can learn to detect shapes and to orient themselves. For simple geometric shapes, it took around 50 trials to achieve 100 percent correct recognition; identifying objects in different orientations requires several hours of learning.

A system using the tongue as the human-machine interface has proven most practical: the tongue-machine interface is protected by the closed mouth, and the saliva in the mouth provides a good electrolytic environment that ensures good electrode contact.[13] Results from a study by Bach-y-Rita et al. show that electrotactile stimulation of the tongue requires 3% of the voltage required to stimulate the finger.[13] Also, since it is more practical to wear an orthodontic retainer holding the stimulation system than an apparatus strapped to other parts of the body, the tongue-machine interface has become the most popular among TVSS systems.

This tongue TVSS system works by delivering electrotactile stimuli to the dorsum of the tongue via a flexible electrode array placed in the mouth. The electrode array is connected to a Tongue Display Unit (TDU) via a ribbon cable passing out of the mouth. A video camera records a picture and transfers it to the TDU for conversion into a tactile image, which is then projected onto the tongue via the ribbon cable, where the tongue's receptors pick up the signal. After training, subjects are able to associate certain types of stimuli with certain types of visual images.[5][14] In this way, tactile sensation can be used for visual perception.
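One step the TDU must perform is reducing a camera frame to the resolution of the electrode array. The sketch below illustrates this with simple block averaging over a hypothetical 7x7 grid (the size of the 49-point array of Bach-y-Rita et al.[13]); the actual TDU's image processing is not specified here, so treat this as one plausible scheme rather than the device's algorithm.

```python
def downsample(frame, out_rows, out_cols):
    """Average-pool a grayscale frame (list of lists) to out_rows x out_cols."""
    in_rows, in_cols = len(frame), len(frame[0])
    rh, cw = in_rows // out_rows, in_cols // out_cols  # block height/width
    out = []
    for r in range(out_rows):
        row = []
        for c in range(out_cols):
            # Mean brightness of the block feeding this electrode.
            block = [frame[r * rh + i][c * cw + j]
                     for i in range(rh) for j in range(cw)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 14x14 synthetic frame reduced to a 7x7 electrode pattern.
frame = [[(r + c) % 256 for c in range(14)] for r in range(14)]
pattern = downsample(frame, 7, 7)
print(len(pattern), len(pattern[0]))  # 7 7
```

Each output value would then set the stimulation strength of one tongue electrode.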

Tactile–auditory substitution

While no tactile-auditory substitution system is currently available, recent experiments by Schurmann et al. show that tactile senses can activate the human auditory cortex, and vibrotactile stimuli can already be used to facilitate hearing in normal and hearing-impaired people.[15] To test which auditory areas are activated by touch, Schurmann et al. stimulated subjects' fingers and palms with vibration bursts and their fingertips with tactile pressure. They found that tactile stimulation of the fingers led to activation of the auditory belt area, suggesting a relationship between audition and tactition.[15] Future research can therefore investigate the feasibility of a tactile-auditory sensory substitution system.

Tactile–vestibular substitution

Some people with balance disorders, or with adverse reactions to antibiotics, suffer from bilateral vestibular damage (BVD). They experience difficulty maintaining posture, unstable gait, and oscillopsia.[16] Tyler et al. studied the restoration of postural control through tactile substitution for vestibular sensation. Because BVD patients cannot integrate visual and tactile cues, they have great difficulty standing. Using a head-mounted accelerometer and a brain-machine interface that delivers electrotactile stimulation to the tongue, information about head-body orientation was relayed to patients, giving them a new source of data with which to orient themselves and maintain good posture.[16]
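The core mapping here is from accelerometer readings to a position on the tongue's electrode array, so the patient feels which way the head is leaning. The following is an illustrative sketch under stated assumptions (a 10x10 grid, tilt derived from gravity, +/- 45 degrees of lean spanning the grid), not the Tyler et al. implementation.

```python
import math

def tilt_to_electrode(ax, ay, az, grid=10):
    """Map accelerometer axes (in g units) to a (row, col) electrode position.

    Roll and pitch are estimated from the gravity vector; an upright head
    maps to the center of the grid.
    """
    roll = math.atan2(ay, az)                       # side-to-side lean, radians
    pitch = math.atan2(-ax, math.hypot(ay, az))     # forward-back lean, radians
    # Scale +/- 45 degrees of tilt across the grid, then clamp to valid cells.
    col = int(round((grid - 1) / 2 * (1 + roll / (math.pi / 4))))
    row = int(round((grid - 1) / 2 * (1 + pitch / (math.pi / 4))))
    clamp = lambda v: max(0, min(grid - 1, v))
    return clamp(row), clamp(col)

print(tilt_to_electrode(0.0, 0.0, 1.0))  # upright head maps to center: (4, 4)
```

Driving the electrode at the returned position closes the loop: as the patient leans, the stimulus moves across the tongue, and the patient learns to re-center it by adjusting posture.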

Tactile–tactile substitution to restore peripheral sensation

Touch-to-touch sensory substitution uses information from the touch receptors of one region to perceive touch in another. In one experiment, for example, Bach-y-Rita restored touch perception in a patient who had lost peripheral sensation from leprosy.[8] The patient was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient experienced data from the glove as if it originated in the fingertips, while ignoring the sensations in the forehead.[8] After two days of training, one of the leprosy subjects reported "the wonderful sensation of touching his wife, which he had been unable to experience for 20 years."[17]

Tactile feedback system for prosthetic limbs

The development of new technologies has made it plausible to provide patients who have prosthetic arms with tactile and kinesthetic sensibility.[18] While this is not purely a sensory substitution system, it uses the same principles to restore the perception of senses. Tactile feedback methods for restoring a perception of touch to amputees include direct stimulation or microstimulation of the tactile nerve afferents.[18]

Other applications of sensory substitution systems can be seen in functional robotic prostheses for patients with high-level quadriplegia. These robotic arms have several mechanisms for detecting slip, vibration, and texture, which they relay to the patient through feedback.[17] With further research and development, the information from these arms could let patients perceive that they are holding and manipulating objects while the robotic arm actually accomplishes the task.

Auditory sensory substitution systems

Auditory sensory substitution systems, like tactile ones, aim to use one sensory modality to compensate for the lack of another. In auditory sensory substitution, visual or tactile sensors detect and record information about the external environment; this information is then transduced by a brain-machine interface into auditory signals, which are relayed to the brain via the auditory receptors.

Auditory–vision substitution

The ultimate goal is to provide synthetic vision with truly visual sensations by exploiting the neural plasticity of the human brain. Neuroscience research has shown that the visual cortex of even adult blind people can become responsive to sound, and "seeing with sound" might reinforce this responsiveness in a visual sense using live video from a head-mounted camera encoded in sound. The extent to which cortical plasticity actually allows functionally relevant rewiring or remapping of the human brain is still largely unknown and is being investigated in open collaboration with research partners around the world.

The vOICe

The vOICe vision technology is one of several approaches to sensory substitution (vision substitution) for the blind that aims to provide synthetic vision to the user by means of a non-invasive visual prosthesis. The vOICe converts live views from a video camera into soundscapes.[19] The system uses a general video-to-audio mapping that associates height with pitch and brightness with loudness in a left-to-right scan of each video frame.[5] Views are typically refreshed about once per second, with an image resolution of up to 60 x 60 pixels, as can be verified by spectrographic analysis.[19] Neuroscience and psychology research indicates recruitment of relevant brain areas in seeing with sound, as well as functional improvement through training.[20][21][22]
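The height-to-pitch, brightness-to-loudness, left-to-right scan can be sketched as follows. This is a simplified illustration of the mapping principle only; the frequency range, column duration, and sample rate are assumed values, not the parameters of the actual vOICe software.

```python
import math

def column_to_samples(column, t0, sample_rate=8000, col_duration=0.02,
                      f_low=500.0, f_high=5000.0):
    """Render one image column (top row first, brightness 0..1) as audio.

    Each row drives a sinusoid: higher rows get higher frequencies, and
    brighter pixels get louder partials. A full frame would be rendered by
    calling this for each column in turn, left to right.
    """
    n_rows = len(column)
    n = int(sample_rate * col_duration)
    samples = []
    for k in range(n):
        t = t0 + k / sample_rate
        s = 0.0
        for row, brightness in enumerate(column):
            frac = 1.0 - row / max(1, n_rows - 1)       # 1 at top, 0 at bottom
            freq = f_low * (f_high / f_low) ** frac      # exponential pitch scale
            s += brightness * math.sin(2 * math.pi * freq * t)
        samples.append(s / n_rows)                       # loudness ~ brightness
    return samples

# One bright pixel at the top of a 4-row column yields a single high tone.
samples = column_to_samples([1.0, 0.0, 0.0, 0.0], t0=0.0)
print(len(samples))  # 160 samples for a 20 ms column at 8 kHz
```

Scanning a 60-column frame this way takes about 1.2 seconds at these assumed settings, which is in the same ballpark as the roughly one-second refresh described above.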

PSVA

Another successful visual-to-auditory sensory substitution device is the Prosthesis Substituting Vision for Audition (PSVA).[23] This system uses a head-mounted TV camera that allows real-time, online translation of visual patterns into sound. As the patient moves around, the device captures visual frames at high frequency and generates the corresponding complex sounds that allow recognition.[5] Visual stimuli are transduced into auditory stimuli using a pixel-to-frequency relationship that couples a rough model of the human retina with an inverse model of the cochlea.[23]

The Vibe

The sound produced by this software is a mixture of sinusoidal sounds produced by virtual "sources", each corresponding to a "receptive field" in the image, i.e. a set of localized pixels. Each sound's amplitude is determined by the mean luminosity of the pixels of its receptive field, while its frequency and inter-aural disparity are determined by the center of gravity of the coordinates of the receptive field's pixels in the image (see Auvray M, Hanneton S, Lenay C, O'Regan K (2005). "There is something out there: distal attribution in sensory substitution, twenty years later." Journal of Integrative Neuroscience, 4:505-521). The Vibe is an open-source project hosted on SourceForge.
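The per-source computation just described (amplitude from mean luminosity; frequency and pan from the centroid of the field's pixels) can be sketched as below. The numeric ranges and the centroid-to-frequency scaling are illustrative assumptions, not values taken from the Vibe software.

```python
def source_parameters(field_pixels, image, f_low=200.0, f_high=2000.0):
    """Compute (amplitude, frequency, pan) for one receptive field.

    field_pixels: list of (row, col) coordinates forming the receptive field;
    image: 2D list of luminosities in 0..1.
    """
    n_rows, n_cols = len(image), len(image[0])
    # Amplitude = mean luminosity over the field's pixels.
    amplitude = sum(image[r][c] for r, c in field_pixels) / len(field_pixels)
    # Center of gravity of the field's pixel coordinates.
    cy = sum(r for r, _ in field_pixels) / len(field_pixels)
    cx = sum(c for _, c in field_pixels) / len(field_pixels)
    # Higher fields in the image get higher frequencies (assumed convention).
    frequency = f_low + (f_high - f_low) * (1 - cy / (n_rows - 1))
    # Horizontal centroid sets the inter-aural balance: 0 = left, 1 = right.
    pan = cx / (n_cols - 1)
    return amplitude, frequency, pan

image = [[0.0, 0.0], [1.0, 1.0]]     # bright bottom row
field = [(1, 0), (1, 1)]             # receptive field covering the bottom row
amp, freq, pan = source_parameters(field, image)
print(amp, freq, pan)  # 1.0 200.0 0.5
```

Summing one sinusoid per receptive field, with these parameters, yields the mixed soundscape the text describes.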

Other systems

Other approaches to the substitution of hearing for vision use binaural directional cues, much as natural human echolocation does. An example of the latter approach is the "SeeHear" chip from Caltech.[24]

Nervous system implants

By means of stimulating electrodes implanted into the human nervous system, it is possible to apply current pulses that the recipient can learn to recognize reliably. Kevin Warwick has shown experimentally that signals from force/touch indicators on a robot hand can be employed as a means of communication.[25]

Criticism

It has been argued that the term "substitution" is misleading, since these systems merely provide an "addition" or "supplementation" to, rather than a substitution of, a sensory modality.[26]

See also

References

  1. Bach-y-Rita P, Collins CC, Saunders F, White B, Scadden L (1969). "Vision substitution by tactile image projection." Nature, 221:963–964.
  2. Humphrey N (1999). A History of the Mind: Evolution and the Birth of Consciousness, Springer.
  3. Bach-y-Rita P (2004). "Tactile sensory substitution studies." Annals of the New York Academy of Sciences, 1013:83–91.
  4. Renier L, De Volder AG (2005). "Cognitive and brain mechanisms in sensory substitution of vision: a contribution to the study of human perception." Journal of Integrative Neuroscience, 4(4):489–503.
  5. Bach-y-Rita P, Kercel SW (2003). "Sensory substitution and the human-machine interface." Trends in Cognitive Sciences, 7(12):541–546.
  6. O'Regan JK, Noe A (2001). "A sensorimotor account of vision and visual consciousness." Behavioral and Brain Sciences, 24(5):939–973.
  7. Bach-y-Rita P (1972). Brain Mechanisms in Sensory Substitution, Academic Press, New York.
  8. Bach-y-Rita P (1995). Nonsynaptic Diffusion Neurotransmission and Late Brain Reorganization, Demos-Vermande, New York.
  9. Poirier C, De Volder AG, Scheiber C (2007). "What neuroimaging tells us about sensory substitution." Neuroscience and Biobehavioral Reviews, 31:1064–1070.
  10. Vallbo AB, Johansson RS (1984). "Properties of cutaneous mechanoreceptors in the human hand related to touch sensation." Human Neurobiology, 3:3–14.
  11. Kaczmarek KA, Webster JG, Bach-y-Rita P, Tompkins WJ (1991). "Electrotactile and vibrotactile displays for sensory substitution systems." IEEE Transactions on Biomedical Engineering, 38(1):1–16.
  12. Blamey PJ, Clark GM (1985). "A wearable multiple-electrode electrotactile speech processor for the profoundly deaf." Journal of the Acoustical Society of America, 77:1619–1621.
  13. Bach-y-Rita P, Kaczmarek KA, Tyler ME, Garcia-Lara J (1998). "Form perception with a 49-point electrotactile stimulus array on the tongue." Journal of Rehabilitation Research and Development, 35:427–430.
  14. Bach-y-Rita P, Kaczmarek KA (2002). Tongue placed tactile output device. US Patent 6,430,459.
  15. Schurmann M, Caetano G, Hlushchuk Y, Jousmaki V, Hari R (2006). "Touch activates human auditory cortex." NeuroImage, 30:1325–1331.
  16. Tyler M, Danilov Y, Bach-y-Rita P (2003). "Closing an open-loop control system: vestibular substitution through the tongue." Journal of Integrative Neuroscience, 2:159–164.
  17. Bach-y-Rita P (1999). "Theoretical aspects of sensory substitution and of neurotransmission-related reorganization in spinal cord injury." Spinal Cord, 37:465–474.
  18. Riso RR (1999). "Strategies for providing upper extremity amputees with tactile and hand position feedback – moving closer to the bionic arm." Technology and Health Care, 7:401–409.
  19. Meijer PBL (1992). "An Experimental System for Auditory Image Representations." IEEE Transactions on Biomedical Engineering, 39:112–121.
  20. Amedi A, Stern W, Camprodon JA, Bermpohl F, Merabet L, Rotman S, Hemond C, Meijer P, Pascual-Leone A (2007). "Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex." Nature Neuroscience, 10(6):687–689.
  21. Auvray M, Hanneton S, O'Regan JK (2007). "Learning to perceive with a visuo-auditory substitution system: localisation and object recognition with 'The vOICe'." Perception, 36(3):416–430.
  22. Proulx MJ, Stoerig P, Ludowig E, Knoll I (2008). "Seeing 'Where' through the Ears: Effects of Learning-by-Doing and Long-Term Sensory Deprivation on Localization Based on Image-to-Sound Substitution." PLoS ONE, 3(3):e1840.
  23. Capelle C, Trullemans C, Arno P, Veraart C (1998). "A real-time experimental prototype for enhancement of vision rehabilitation using auditory substitution." IEEE Transactions on Biomedical Engineering, 45:1279–1293.
  24. Nielson L, Mahowald M, Mead C (1989). "SeeHear," in Analog VLSI and Neural Systems, by C. Mead, Reading: Addison-Wesley, chapter 13, 207–227.
  25. Warwick K, Gasson M, Hutt B, Goodhew I, Kyberd P, Schulzrinne H, Wu X (2004). "Thought communication and control: a first step using radiotelegraphy." IEE Proceedings on Communications, 151(3):185–189.
  26. Lenay C, Gapenne O, Hanneton S, Marque C, Geouelle C (2003). "Sensory Substitution: limits and perspectives," in Touching for Knowing: Cognitive Psychology of Haptic Manual Perception, 275–292.

External links


This page uses Creative Commons Licensed content from Wikipedia (view authors).
