Auditory localization or Sound localization is a listener's ability to identify the location or origin of a detected sound.
There are two general classes of cues for sound localization: binaural cues and monaural cues.
Binaural cues
Binaural localization relies on comparing the auditory input at two separate detectors; accordingly, most auditory systems feature two ears, one on each side of the head. The primary binaural cue is the split-second delay between the time when sound from a single source reaches the near ear and the time when it reaches the far ear, technically referred to as the interaural time difference (ITD); in humans the maximum ITD is about 0.63 ms. The other binaural cue, less significant in ground-dwelling animals, is the reduction in loudness when the sound reaches the far ear, known as the frequency-dependent interaural level difference (ILD), also called the interaural intensity difference (IID). The eardrums themselves are sensitive only to differences in sound pressure level.
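The maximum-ITD figure above follows from simple geometry: the extra path to the far ear divided by the speed of sound. A minimal sketch, assuming a typical effective inter-ear path of 0.215 m and 343 m/s for sound in air (round illustrative values, not measurements from the text):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C (assumed)
EAR_DISTANCE = 0.215     # m; assumed effective inter-ear path length

def itd(azimuth_deg):
    """Interaural time difference (s) for a source at the given azimuth,
    using the simple path-difference model ITD = d * sin(theta) / c."""
    return EAR_DISTANCE * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# Source directly to one side (90 degrees) gives the maximum ITD.
print(f"{itd(90) * 1000:.2f} ms")  # -> 0.63 ms
```

A source straight ahead (0 degrees) gives an ITD of zero, which is why front/back and elevation confusions are not resolved by this cue alone.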
Note that these cues aid in localizing only the sound source's azimuth (the angle between the source and the sagittal plane), not its elevation (the angle between the source and the horizontal plane through both ears), unless the two detectors are positioned at different heights in addition to being separated in the horizontal plane. In animals, however, rough elevation information is gained simply by tilting the head, provided that the sound lasts long enough to complete the movement. This explains the innate behavior of cocking the head to one side when trying to localize a sound precisely. To get instantaneous localization in more than two dimensions from time-difference or amplitude-difference cues requires more than two detectors. However, many animals show quite complex variations in the degree of attenuation a sound receives in travelling from the source to the eardrum: the frequency-dependent attenuation varies with both azimuthal angle and elevation. These variations can be summarised in the head-related transfer function, or HRTF. As a result, where the sound is wideband (that is, has its energy spread over the audible spectrum), an animal can estimate both azimuth and elevation simultaneously without tilting its head. Additional information can of course be gained by moving the head, so that the HRTF at both ears changes in a way known (implicitly) by the animal.
In vertebrates, interaural time differences are known to be calculated in the superior olivary nucleus of the brainstem. According to Jeffress, this calculation relies on delay lines: neurons in the superior olive which accept innervation from each ear through connecting axons of different lengths. Some cells are more directly connected to one ear than to the other, making them specific for a particular interaural time difference. This theory is equivalent to the mathematical procedure of cross-correlation. However, as Gaskell pointed out, Jeffress' theory cannot be entirely correct, because it fails to account for the precedence effect, in which only the first of multiple identical sounds is used to determine the sounds' location (thus avoiding confusion caused by echoes).
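The equivalence to cross-correlation can be illustrated directly: delaying a wideband signal and locating the peak of the cross-correlation recovers the delay, which is what the delay-line array computes neuronally. The sample rate, noise signal, and 20-sample delay below are arbitrary illustrative choices:

```python
import numpy as np

fs = 48_000                         # sample rate in Hz (illustrative)
t = np.arange(0, 0.01, 1 / fs)      # 10 ms burst -> 480 samples
rng = np.random.default_rng(0)
source = rng.standard_normal(t.size)    # wideband noise, like a click or hiss

delay_samples = 20                  # true ITD: 20 samples ~ 0.42 ms at 48 kHz
left = source
right = np.roll(source, delay_samples)  # far ear hears a delayed copy
                                        # (circular wrap of 20 samples is
                                        # negligible for this demonstration)

# Cross-correlate the two ear signals and find the lag of the peak --
# the operation Jeffress' delay-line array is equivalent to.
corr = np.correlate(left, right, mode="full")
lags = np.arange(-left.size + 1, left.size)
recovered = -lags[np.argmax(corr)]  # sign: `right` lags `left`
print(recovered)                    # -> 20
```

The peak sits at the true delay because the noise is broadband; for a narrowband tone the correlation is periodic and the lag becomes ambiguous, which is one reason ITD cues work best at low frequencies and for transients.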
The tiny parasitic fly Ormia ochracea has become a model organism in sound localization experiments because of its unique ear. The animal is too small for the time difference of sound arriving at the two ears to be calculated in the usual way, yet it can determine the direction of sound sources with exquisite precision. The tympanic membranes of the opposite ears are directly connected mechanically, allowing resolution of nanosecond time differences and requiring a new neural coding strategy. Ho showed that the coupled-eardrum system in frogs can produce increased interaural vibration disparities when only small arrival-time and intensity differences are available at the animal's head. Efforts to build directional microphones based on the coupled-eardrum structure are underway.
Monaural (filtering) cues
Monaural localization depends mostly on the filtering effects of external structures. In advanced auditory systems, these external filters include the head, shoulders, torso, and outer ear or "pinna", and their combined effect can be summarized as the head-related transfer function. Sounds are frequency-filtered in a way that depends on the angle from which they strike the various external filters. The most significant filtering cue for biological sound localization is the pinna notch, a notch-filtering effect resulting from destructive interference between waves reflected from the outer ear. The frequency that is selectively notch-filtered depends on the angle at which the sound strikes the outer ear. Instantaneous localization of sound-source elevation in advanced systems depends primarily on the pinna notch and other head-related filtering. These monaural effects also provide azimuth information, but it is inferior to that gained from binaural cues.
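The notch mechanism can be sketched as two-path interference: the direct wave plus one reflection delayed by τ cancels wherever the two arrive exactly out of phase, producing nulls at f = (2k + 1) / (2τ). The 2 cm extra path length and the equal-strength reflection below are illustrative assumptions, not measured pinna geometry:

```python
import numpy as np

SPEED_OF_SOUND = 343.0             # m/s (assumed)
extra_path = 0.02                  # m; assumed extra travel of the pinna reflection
tau = extra_path / SPEED_OF_SOUND  # reflection delay in seconds

# Direct sound plus one equal-strength delayed reflection:
#   H(f) = 1 + exp(-2j * pi * f * tau)
# |H(f)| is zero (a notch) wherever the reflection is out of phase,
# i.e. at f = (2k + 1) / (2 * tau).
freqs = np.linspace(100, 16_000, 1000)
magnitude = np.abs(1 + np.exp(-2j * np.pi * freqs * tau))

first_notch = 1 / (2 * tau)
print(f"first notch near {first_notch / 1000:.1f} kHz")  # -> near 8.6 kHz
```

Because τ changes with the angle of incidence on the pinna, the notch frequency sweeps with source elevation, which is what makes it usable as an elevation cue.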
In order to enhance filtering information, many animals have large, specially shaped outer ears. Many also have the ability to turn the outer ear at will, which allows for better sound localization and also better sound detection. Bats and barn owls are paragons of monaural localization in the animal kingdom, and have thus become model organisms.
Processing of head-related transfer functions for biological sound localization occurs in the auditory cortex.
Distance cues
Neither interaural time differences nor monaural filtering information provides good distance localization. Distance can theoretically be approximated through interaural amplitude differences or by comparing the relative head-related filtering at each ear: a combination of binaural and filtering information. The most direct cue to distance is sound amplitude, which decays with increasing distance. However, this is not a reliable cue, because in general it is not known how strong the sound source is. In the case of familiar sounds, such as speech, there is implicit knowledge of how strong the source should be, which enables a rough distance judgment to be made.
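The amplitude cue follows the free-field inverse-distance law for sound pressure, which a short sketch makes concrete; the 60 dB SPL reference for speech at 1 m is an assumed round number, not a figure from the text:

```python
import math

def level_drop_db(r1, r2):
    """dB change in sound pressure level moving from distance r1 to r2,
    assuming free-field inverse-distance (1/r) pressure decay."""
    return 20 * math.log10(r1 / r2)

def distance_from_level(level_db, ref_level_db, ref_distance=1.0):
    """Rough distance estimate for a familiar source whose level at
    ref_distance is known (e.g. conversational speech ~60 dB SPL at 1 m)."""
    return ref_distance * 10 ** ((ref_level_db - level_db) / 20)

print(f"{level_drop_db(1, 2):.1f} dB")        # doubling distance -> -6.0 dB
print(f"{distance_from_level(48, 60):.0f} m")  # speech 12 dB quieter -> ~4 m
```

The second function only works because the listener implicitly knows the source level; for an unfamiliar source the same received level is consistent with any distance.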
In general, humans are best at judging sound-source azimuth, then elevation, and worst at judging distance. Source distance is qualitatively obvious to a human observer when a sound is extremely close (the mosquito-in-the-ear effect), or when the sound is echoed by large structures in the environment (such as walls and ceiling). Such echoes provide reasonable cues to the distance of a sound source, in particular because the strength of the echoes does not depend on the distance of the source, while the strength of the sound arriving directly from the source weakens with distance. As a result, the ratio of direct to echoed sound alters the quality of the sound in a way to which humans are sensitive, making consistent, although not very accurate, distance judgments possible. This method generally fails outdoors, due to a lack of echoes, although some outdoor environments, such as mountains, do generate strong, discrete echoes. Outdoors, distance evaluation is largely based on the received timbre of the sound: high-frequency components are absorbed more strongly by the air, so distant sounds appear duller than normal (lacking in treble).
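The direct-to-echo cue can be sketched with an idealized model in which direct pressure falls as 1/r while the reverberant (echoed) field stays roughly constant across the room; the room constant below is an arbitrary placeholder, not a measured value:

```python
import math

def direct_to_reverberant_db(distance, reverb_pressure=0.05):
    """Direct-to-reverberant level ratio (dB) for a source at `distance` m.
    Direct pressure ~ 1/r; `reverb_pressure` is an assumed, distance-
    independent reverberant field strength (illustrative room constant)."""
    direct = 1.0 / distance
    return 20 * math.log10(direct / reverb_pressure)

# The ratio falls ~6 dB per doubling of distance, regardless of the
# (unknown) source strength -- which is why it is a usable distance cue.
for d in (1, 2, 4, 8):
    print(d, round(direct_to_reverberant_db(d), 1))
```

Unlike raw amplitude, this ratio does not require knowing how loud the source is, since the source strength cancels out of the direct/reverberant quotient.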
Auditory localization by species
- Animal echolocation
- Auditory acuity
- Auditory perception
- Coincidence detection in neurobiology
- Head-related transfer function
- Head shadow
- Human echolocation
- Rayleigh's duplex theory
- ↑ Jeffress, L.A. (1948). A place theory of sound localization. Journal of Comparative and Physiological Psychology 41, 35–39.
- ↑ Gaskell, H. (1983). The precedence effect. Hearing Research 11, 277–303.
- ↑ Miles, R.N., Robert, D., Hoy, R.R. (1995). Mechanically coupled ears for directional hearing in the parasitoid fly Ormia ochracea. Journal of the Acoustical Society of America 98(6), 3059–3070. PMID 8550933 DOI:10.1121/1.413830
- ↑ Robert, D., Miles, R.N., Hoy, R.R. (1996). Directional hearing by mechanical coupling in the parasitoid fly Ormia ochracea. Journal of Comparative Physiology A 179(1), 29–44. PMID 8965258 DOI:10.1007/BF00193432
- ↑ Mason, A.C., Oshinsky, M.L., Hoy, R.R. (2001). Hyperacute directional hearing in a microscale auditory system. Nature 410(6829), 686–690. PMID 11287954 DOI:10.1038/35070564
- ↑ Ho, C.C., Narins, P.M. (2006). Directionality of the pressure-difference receiver ears in the northern leopard frog, Rana pipiens pipiens. Journal of Comparative Physiology A 192(4), 417–429.
|This page uses Creative Commons Licensed content from Wikipedia (view authors).|