Psychology Wiki

Hearing

Revision as of 10:04, August 3, 2013 by Dr Joe Kiff (Talk | contribs)




Hearing is one of the traditional five senses, and refers to the ability to detect sound. In humans and other vertebrates, hearing is performed primarily by the auditory system: sound is detected by the ear and transduced into nerve impulses that are perceived by the brain.

Like touch, audition requires sensitivity to the movement of molecules in the world outside the organism. Both hearing and touch are types of mechanosensation.[1]

Hearing in animals

Not all sounds are normally audible to all animals. Each species has a range of normal hearing for both loudness (amplitude) and pitch (frequency). Many animals use sound in order to communicate with each other and hearing in these species is particularly important for survival and reproduction. In species using sound as a primary means of communication, hearing is typically most acute for the range of pitches produced in calls and speech.

Frequencies capable of being heard by humans are called audio or sonic. Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic. Some bats use ultrasound for echolocation while in flight. Dogs are able to hear ultrasound, which is the principle of 'silent' dog whistles. Snakes sense infrasound through their bellies, and whales, giraffes and elephants use it for communication.
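These frequency bands can be expressed as a small helper function. This is an illustrative sketch only: the function name is invented for the example, and the 20 Hz / 20 kHz cutoffs are the conventional limits of the human audio band, while actual audible ranges vary widely by species and individual.

```python
def classify_frequency(freq_hz, low_hz=20.0, high_hz=20_000.0):
    """Label a frequency relative to a nominal audio band.

    The default 20 Hz - 20 kHz bounds are the conventional limits
    of human hearing; other species hear very different bands.
    """
    if freq_hz < low_hz:
        return "infrasonic"
    if freq_hz > high_hz:
        return "ultrasonic"
    return "sonic"

# A 40 kHz bat echolocation call is ultrasonic to humans,
# while a 10 Hz elephant rumble is infrasonic.
```

Relative to a dog's wider band (roughly up to 45 kHz), the same 'silent' whistle frequency would classify as sonic, which is the whole trick.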

The physiology of hearing in vertebrates is not yet fully understood. The molecular mechanism of sound transduction within the cochlea and the processing of sound by the brain (in the auditory cortex) are two areas that remain largely unknown.

Hearing in humans

Humans can generally hear sounds with frequencies between 20 Hz and 20 kHz. Human hearing is able to discriminate small differences in loudness (intensity) and pitch (frequency) over that large range of audible sound. This healthy human range of frequency detection varies significantly with age, occupational hearing damage, and gender; some individuals are able to hear pitches up to 22 kHz and perhaps beyond, while others are limited to about 16 kHz. The ability of most adults to hear sounds above about 8 kHz begins to deteriorate in early middle age.[2]

Mechanism

Main article: Auditory system

Human hearing takes place by a complex mechanism involving the transformation of sound waves into nerve impulses.

Outer ear

Main article: Outer ear

The visible portion of the outer ear in humans is called the auricle or the pinna. It is a convoluted cup that arises from the opening of the ear canal on either side of the head. The auricle helps direct sound to the ear canal. Both the auricle and the ear canal amplify and guide sound waves to the tympanic membrane or eardrum.

In humans, amplification of sound ranges from 5 to 20 dB for frequencies within the speech range (about 1.5–7 kHz). Since the shape and length of the human external ear preferentially amplify sound in the speech frequencies, the external ear also improves the signal-to-noise ratio for speech sounds.[3]
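The quoted 5–20 dB gain can be translated into a sound-pressure ratio using the standard decibel relation for pressure, dB = 20 * log10(p_out / p_in). A quick sketch:

```python
def db_to_pressure_ratio(gain_db):
    """Convert a decibel gain to a sound-pressure ratio.

    By definition, gain_db = 20 * log10(p_out / p_in),
    so the pressure ratio is 10 ** (gain_db / 20).
    """
    return 10 ** (gain_db / 20)

# The outer ear's 5-20 dB gain corresponds to roughly a
# 1.8x to 10x increase in sound pressure at the eardrum.
```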

Middle ear

Main article: Middle ear

The eardrum is stretched across the front of a bony air-filled cavity called the middle ear. Just as the tympanic membrane is like a drum head, the middle ear cavity is like a drum body.

Much of the middle ear's function in hearing is to convert sound waves in the air surrounding the body into vibrations of the fluid within the cochlea of the inner ear. Sound waves move the tympanic membrane, which moves the ossicles, which in turn move the fluid of the cochlea.

Inner ear

Main article: Inner ear

The cochlea is a snail-shaped, fluid-filled chamber, divided along almost its entire length by a membranous partition. The cochlea propagates mechanical signals from the middle ear as waves in fluid and membranes, and then transduces them into nerve impulses that are transmitted to the brain. The vestibular system, also housed in the inner ear, is responsible for the sensations of balance and motion.

Central auditory system

This sound information, now re-encoded, travels down the auditory nerve, through parts of the brainstem (for example, the cochlear nucleus and inferior colliculus), and is further processed at each waypoint. The information eventually reaches the thalamus, from where it is relayed to the cortex. In the human brain, the primary auditory cortex is located in the temporal lobe.

Representation of loudness, pitch, and timbre

Nerves transmit information through discrete electrical impulses known as "action potentials." As the loudness of a sound increases, the rate of action potentials in the auditory nerve fibre increases. Conversely, at lower sound intensities (low loudness), the rate of action potentials is reduced.
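This rate coding can be caricatured as a monotonic intensity-to-rate curve. The sketch below is a toy illustration only: the threshold, saturation, and maximum-rate parameters are invented for the example and are not physiological measurements.

```python
def firing_rate(intensity_db, threshold_db=20.0, saturation_db=80.0,
                max_rate_hz=300.0):
    """Toy rate-coding curve: the spike rate rises with sound
    intensity between a threshold and a saturation level.
    All parameter values are illustrative, not measured."""
    if intensity_db <= threshold_db:
        return 0.0
    if intensity_db >= saturation_db:
        return max_rate_hz
    fraction = (intensity_db - threshold_db) / (saturation_db - threshold_db)
    return max_rate_hz * fraction
```

Real auditory nerve fibres differ in threshold and dynamic range, so the population as a whole covers a far wider intensity range than any single fibre.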

Different repetition rates and spectra of sounds, that is, pitch and timbre, are represented on the auditory nerve by a combination of rate-versus-place and temporal-fine-structure coding. That is, different frequencies cause a maximum response at different places along the organ of Corti, while different repetition rates of low enough pitches (below about 1500 Hz) are represented directly by the repetition of neural firing patterns (also known as volley coding).

Loudness and duration of sound (within small time intervals) may also influence pitch to a small extent. For example, for sounds higher than 4000 Hz, as loudness increases, the perceived pitch also increases.

Localization of sound

Main article: sound localization

Humans are normally able to hear a variety of sound frequencies, from about 20 Hz to 20 kHz. Our ability to estimate where a sound is coming from, called sound localization, depends both on the hearing ability of each of the two ears and on the exact quality of the sound. Since the ears lie on opposite sides of the head, a sound reaches the closer ear first, and its amplitude is larger in that ear.

The shape of the pinna (outer ear) and of the head itself results in frequency-dependent variation in the amount of attenuation that a sound receives as it travels from the sound source to the ear; furthermore, this variation depends not only on the azimuthal angle of the source, but also on its elevation. This variation is described as the head-related transfer function, or HRTF. As a result, humans can locate sound in both azimuth and elevation. Most of the brain's ability to localize sound depends on interaural (between-ear) intensity differences and interaural temporal or phase differences. In addition, humans can estimate the distance of a sound source, based primarily on how reflections in the environment modify the sound, for example through room reverberation.
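The interaural temporal difference mentioned above can be approximated with Woodworth's classic spherical-head model. The sketch below is an illustration under textbook assumptions: the head radius is a typical adult value, and the formula ignores frequency dependence and elevation.

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, c_air_m_s=343.0):
    """Woodworth's spherical-head approximation of the interaural
    time difference: ITD = (r / c) * (sin(theta) + theta), where
    theta is the source azimuth measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c_air_m_s) * (math.sin(theta) + theta)

# A source directly to one side (90 degrees) gives an ITD of
# roughly 0.66 ms, near the maximum delay a human head produces.
```

Delays of this size, well under a millisecond, are what the brainstem's binaural circuits must resolve to localize sound in azimuth.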

Human echolocation is a technique used by some blind people to navigate within their environment by listening for echoes of the click or tap sounds that they emit.

Hearing and language

Human beings develop spoken language within the first few years of life, and hearing impairment can compromise not only the ability to talk but also the ability to understand the spoken word. By the time it is apparent that a severely hearing-impaired (deaf) child has a hearing deficit, problems with communication may already have caused difficulties within the family and hindered social skills, unless the child is part of a Deaf community where sign language is used instead of spoken language (see Deaf culture). In many developed countries, hearing is evaluated during the newborn period in an effort to prevent the inadvertent isolation of a deaf child in a hearing family.

Although sign language is a full means of communication, literacy depends on understanding speech: in the great majority of written languages, the sound of the word is coded in symbols. An individual who hears and learns to speak and read will retain the ability to read even if hearing later becomes too impaired to hear voices, but a person who never heard well enough to learn to speak is rarely able to read proficiently.[4] Most evidence points to early identification of hearing impairment as key if a child with very insensitive hearing is to learn spoken language. Listening also plays an important role in learning a second language.

Hearing tests

Hearing can be measured by behavioral tests using an audiometer. Electrophysiological tests of hearing can provide accurate measurements of hearing thresholds even in unconscious subjects. Such tests include auditory brainstem evoked potentials (ABR), otoacoustic emissions, and electrocochleography (ECochG). Technical advances in these tests have allowed hearing screening for infants to become widespread.

Hearing underwater

Hearing sensitivity and the ability to localize sound sources are both reduced underwater, where the speed of sound is faster than it is in air. Underwater hearing is by bone conduction, and localization of sound appears to depend on differences in amplitude detected by bone conduction.[5]
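One way to see why localization cues degrade underwater is to compare wavelengths: for a given frequency, the faster speed of sound in water stretches the wavelength by a factor of about four. A small sketch (the sound-speed values are typical figures; real values vary with temperature, pressure, and salinity):

```python
def wavelength_m(freq_hz, speed_m_s):
    """Wavelength of a sound wave: lambda = c / f."""
    return speed_m_s / freq_hz

# A 1 kHz tone spans about 0.34 m in air (c ~ 343 m/s) but about
# 1.48 m in water (c ~ 1482 m/s), so timing and phase differences
# between the ears are far smaller relative to the wavelength.
```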

References

  1. Kung, C. (2005). A possible unifying principle for mechanosensation. Nature, 436(7051), 647–654.
  2. http://www.nytimes.com/2006/06/12/technology/12ring.html?ex=1307764800&en=2b80d158770dccdf&ei=5088&partner=rssnyt&emc=rss
  3. Brugge, J. F., & Howard, M. A. (2002). Hearing. In Encyclopedia of the Human Brain (pp. 429–448). Elsevier. ISBN 0-12-227210-2.
  4. Morton, C. C., & Nance, W. E. (2006). Newborn hearing screening--a silent revolution. New England Journal of Medicine, 354(20), 2151–2164.
  5. Shupak, A., Sharoni, Z., Yanir, Y., Keynan, Y., Alfie, Y., & Halpern, P. (2005). Underwater hearing and sound localization with and without an air interface. Otology & Neurotology, 26(1), 127–130.
  • Gelfand, S. A. (2004). Hearing: An Introduction to Psychological and Physiological Acoustics (4th ed.). New York: Marcel Dekker.
  • Moore, B. C. (2004). An Introduction to the Psychology of Hearing (5th ed.). London: Elsevier Academic Press.
  • Yost, W. A. (2000). Fundamentals of Hearing: An Introduction (4th ed.). San Diego: Academic Press.

See also

Nervous system - Sensory system
  • Special senses: Visual system | Auditory system | Olfactory system | Gustatory system
  • Somatosensory system: Nociception | Thermoreception | Vestibular system | Mechanoreception (Pressure, Vibration & Proprioception) | Equilibrioception
  • Auditory perception

This page uses Creative Commons Licensed content from Wikipedia (view authors).
