

The cognitive neuroscience of music is the scientific study of brain-based mechanisms involved in the cognitive processes underlying music. Methods include functional magnetic resonance imaging (fMRI), transcranial magnetic stimulation (TMS), magnetoencephalography (MEG), electroencephalography (EEG), and positron emission tomography (PET).

Neurological Bases

Melody Processing in the Secondary Auditory Cortex

Listeners automatically detect deviations in a melody, such as an out-of-tune pitch that does not fit their previous musical experience. This automatic processing occurs in the secondary auditory cortex. Brattico, Tervaniemi, Naatanen, and Peretz (2006) tested whether the detection of tones that violate such expectations can occur automatically[1]. They recorded event-related potentials (ERPs) in nonmusicians who were presented with unfamiliar melodies containing either an out-of-tune or an out-of-key pitch, while participants were either distracted from the sounds or attending to the melody. Both conditions revealed an early frontal negativity regardless of where attention was directed. This negativity originated in the auditory cortex, more precisely in the supratemporal lobe (corresponding to the secondary auditory cortex), with greater activity in the right hemisphere. The negativity was larger for out-of-tune than for out-of-key pitches, and ratings of musical incongruity were likewise higher for melodies with an out-of-tune pitch. In the focused-attention condition, out-of-key and out-of-tune pitches also produced a late parietal positivity. These findings suggest that properties of a melody are processed rapidly and automatically in the secondary auditory cortex[1]. That pitch incongruities were detected automatically even in unfamiliar melodies suggests an automatic comparison of incoming information with long-term knowledge of musical scale properties, such as culturally influenced rules of music.
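
The logic of such ERP analyses can be illustrated with a short sketch. The code below is not from Brattico et al. (2006); it is a minimal, hypothetical Python example that assumes epoched EEG from a single frontal electrode stored as NumPy arrays (trials x time samples) for standard (in-tune) and deviant (out-of-tune) melody endings. It averages the epochs, forms the deviant-minus-standard difference wave, and reports the most negative deflection in an early time window, which is how an early frontal negativity of this kind is typically quantified.

```python
import numpy as np

# Hypothetical epoched EEG data from one frontal electrode (e.g., Fz):
# rows = trials, columns = time samples. The data are simulated here so the
# sketch runs stand-alone; in practice these arrays would come from a
# recording segmented around the critical (in-tune vs. out-of-tune) note.
rng = np.random.default_rng(0)
sfreq = 500                                # sampling rate in Hz
times = np.arange(-0.1, 0.5, 1 / sfreq)    # -100 ms to +500 ms around the note

def simulate_epochs(n_trials, negativity_uv):
    """Simulate trials of noise plus an early frontal negativity (in microvolts)."""
    noise = rng.normal(0, 5, size=(n_trials, times.size))
    component = negativity_uv * np.exp(-((times - 0.15) ** 2) / (2 * 0.03 ** 2))
    return noise + component

standard_epochs = simulate_epochs(200, negativity_uv=0.0)   # in-tune endings
deviant_epochs = simulate_epochs(80, negativity_uv=-4.0)    # out-of-tune endings

# Average across trials to obtain the ERPs, then form the classic
# deviant-minus-standard difference wave.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
difference_wave = erp_deviant - erp_standard

# Quantify the early negativity in a 100-250 ms window after the note.
window = (times >= 0.10) & (times <= 0.25)
peak_idx = np.argmin(difference_wave[window])
peak_latency = times[window][peak_idx]
peak_amplitude = difference_wave[window][peak_idx]

print(f"Early negativity: {peak_amplitude:.2f} uV at {peak_latency * 1000:.0f} ms")
```

In real data the epochs would come from a segmented EEG recording rather than a simulation, and attributing the response to the supratemporal auditory cortex would additionally require source modelling.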

Role of Right Auditory Cortex in Fine Pitch Resolution

[Image: Brodmann areas 41 and 42. The primary auditory cortex is one of the main areas associated with superior pitch resolution.]

The right secondary auditory cortex has finer pitch resolution than the left. Hyde, Peretz and Zatorre (2008) used functional magnetic resonance imaging (fMRI) in their study to test the involvement of right and left auditory cortical regions in frequency processing of melodic sequences[2]. As well as finding superior pitch resolution in the right secondary auditory cortex, specific areas found to be involved were the planum temporale (PT) in the secondary auditory cortex, and the primary auditory cortex in the medial section of Heschl’s gyrus (HG).

Many neuroimaging studies have found evidence of the importance of right secondary auditory regions in aspects of musical pitch processing, such as melody[3]. Several of these studies, such as Patterson, Uppenkamp, Johnsrude and Griffiths (2002), also find evidence of a hierarchy of pitch processing. In an fMRI study, Patterson et al. (2002) used spectrally matched sounds that produced either no pitch, fixed pitch or melody, and found that all conditions activated HG and PT. Sounds with pitch activated more of these regions than sounds without pitch, and when the sounds formed a melody, activation spread to the superior temporal gyrus (STG) and planum polare (PP). These results support the existence of a pitch-processing hierarchy.

Music and Language[]

Certain aspects of language and melody have been shown to be processed in nearly identical functional brain areas. Brown, Martinez and Parsons (2006) examined the neurological similarities between music and language using positron emission tomography (PET)[4]. Both linguistic and melodic phrases produced activation in almost identical functional brain areas, including the primary motor cortex, supplementary motor area, Broca's area, anterior insula, primary and secondary auditory cortices, temporal pole, basal ganglia, ventral thalamus and posterior cerebellum. Differences were found in lateralization tendencies, as language tasks favoured the left hemisphere, but the majority of activations were bilateral, producing significant overlap across modalities[4].

Syntactic information in music and language appears to be processed by similar mechanisms in the brain. Jentschke, Koelsch, Sallat and Friederici (2008) investigated the processing of music in children with specific language impairment (SLI)[5]. Children with typical language development (TLD) showed ERP patterns different from those of children with SLI, reflecting the latter group's difficulty in processing music-syntactic regularities. Strong correlations between the amplitude of the early right anterior negativity (ERAN) and linguistic and musical abilities provide additional evidence for the relationship between syntactic processing in music and language[5].

However, production of melody and production of speech may be subserved by different neural networks. Stewart, Walsh, Frith and Rothwell (2001) studied the differences between speech production and song production using transcranial magnetic stimulation (TMS)[6]. They found that TMS applied to the left frontal lobe disturbs speech but not melody, supporting the idea that the two are subserved by different areas of the brain. The authors suggest that one reason for the difference is that speech generation can be localized well, whereas the underlying mechanisms of melodic production cannot. Alternatively, speech production may simply be less robust than melodic production and thus more susceptible to interference[6].

Musician vs. Non-musician Processing

[Image: Professional piano players show less cortical activation for complex finger movement tasks due to structural differences in the brain.]

Differences

Brain structure differs distinctly between musicians and non-musicians. Gaser and Schlaug (2003) compared the brain structures of professional musicians with those of non-musicians and discovered gray matter volume differences in motor, auditory and visual-spatial brain regions[7]. Specifically, positive correlations were found between musician status (professional, amateur or non-musician) and gray matter volume in the primary motor and somatosensory areas, premotor areas, anterior superior parietal areas and the inferior temporal gyrus bilaterally. This strong association between musician status and gray matter volume supports the notion that musicians' brains show use-dependent structural changes. Given that the differences span several brain regions, they are unlikely to be innate and more likely reflect the long-term acquisition and repeated rehearsal of musical skills.

Brains of musicians also show functional differences from those of non-musicians. Krings, Topper, Foltys, Erberich, Sparing, Willmes and Thron (2000) used fMRI to study brain-area involvement in professional piano players and a control group performing complex finger movements[8]. The professional piano players showed lower levels of cortical activation in motor areas of the brain, suggesting that fewer neurons needed to be activated in the piano players because of long-term motor practice, which results in different cortical activation patterns. Koeneke, Lutz, Wustenberg and Jancke (2004) reported similar findings in keyboard players[9]. Skilled keyboard players and a control group performed complex tasks involving unimanual and bimanual finger movements. During the task conditions, both non-musicians and keyboard players showed strong hemodynamic responses in the cerebellum, but the non-musicians' response was stronger. This finding indicates that different activation patterns emerge from long-term motor practice and supports previous evidence that musicians recruit fewer neurons to perform the same movements.

Similarities

Studies have shown that the human brain has an implicit musical ability[10][11]. Koelsch, Gunter, Friederici and Schoger (2000) investigated the influence of preceding musical context, the task relevance of unexpected chords and the probability of violation on music processing in both musicians and non-musicians[10]. Findings showed that the human brain unintentionally extrapolates expectations about impending auditory input, and that even in non-musicians these expectations are consistent with music theory. This ability to process information musically supports the idea of an implicit musical ability in the human brain. In a follow-up study, Koelsch, Schroger and Gunter (2002) investigated whether the ERAN and N5 could be evoked preattentively in non-musicians[11]. Both components could be elicited even when the musical stimulus was ignored by the listener, indicating a highly differentiated preattentive musicality in the human brain.

Gender Differences

Minor neurological differences in hemispheric processing exist between the brains of males and females. Koelsch, Maess, Grossmann and Friederici (2003) investigated music processing with EEG and ERPs and discovered gender differences[12]. Females processed music information bilaterally, whereas males processed music with a right-hemispheric predominance. However, the early negativity in males was also present over the left hemisphere, indicating that males do not exclusively use the right hemisphere for processing musical information. In a follow-up study, Koelsch, Grossmann, Gunter, Hahne, Schroger and Friederici (2003) found that boys showed lateralization of the early anterior negativity to the left hemisphere, whereas girls showed a bilateral effect[13]. This indicates a developmental effect, as the early negativity is lateralized to the right hemisphere in men but to the left hemisphere in boys.

Musical Imagery

Musical imagery refers to the experience of replaying music by imagining it inside the head[14]. Musicians show a superior ability for musical imagery due to intense musical training[15]. Herholz, Lappe, Knief and Pantev (2008) used magnetoencephalography (MEG) to investigate differences between musicians and non-musicians in the neural processing of a musical imagery task with familiar melodies. Specifically, the study examined whether the mismatch negativity (MMN) can be based solely on the imagery of sounds. Participants listened to the beginning of a melody, continued the melody in their heads, and then heard either a correct or an incorrect tone as a further continuation of the melody. In musicians, the imagery of these melodies was strong enough to elicit an early preattentive brain response to unanticipated violations of the imagined melodies. These results indicate that trained musicians rely on similar neural correlates for imagery and for perception, and suggest that intense musical training modifies the imagery mismatch negativity (iMMN), yielding a superior ability for imagery and preattentive processing of music.

Perceptual musical processes and musical imagery may share a neural substrate in the brain. A PET study by Zatorre, Halpern, Perry, Meyer and Evans (1996) investigated cerebral blood flow (CBF) changes related to auditory imagery and perceptual tasks[16]. These tasks examined the involvement of particular anatomical regions as well as functional commonalities between perceptual processes and imagery. Similar patterns of CBF change provided evidence that imagery processes share a substantial neural substrate with related perceptual processes. Bilateral activity in the secondary auditory cortex was associated with both perceiving and imagining songs, implying that processes within the secondary auditory cortex underlie the phenomenological impression of imagined sounds. The supplementary motor area (SMA) was active in both imagery and perceptual tasks, suggesting covert vocalization as an element of musical imagery. CBF increases in the inferior frontal polar cortex and right thalamus suggest that these regions may be related to the retrieval and/or generation of auditory information from memory.

Absolute Pitch

[Image: Musicians possessing absolute pitch can identify the pitch of musical tones without an external reference.]

Absolute pitch (AP) is defined as the ability to identify the pitch of a musical tone, or to produce a musical tone at a given pitch, without the use of an external reference pitch[17]. Neuroscientific research has not discovered a distinct activation pattern common to possessors of AP. Zatorre, Perry, Beckett, Westbury and Evans (1998) examined the neural foundations of AP using functional and structural brain imaging techniques[18]. Positron emission tomography (PET) was used to measure cerebral blood flow (CBF) in musicians with and without AP. When presented with musical tones, both groups showed similar patterns of increased CBF in auditory cortical areas, and both showed similar patterns of left dorsolateral frontal activity when performing relative pitch judgments. However, non-AP subjects showed activation in the right inferior frontal cortex, whereas AP possessors showed no such activity, suggesting that musicians with AP do not need to access working memory for such tasks. These findings imply that there is no regional activation pattern unique to AP; rather, the available processing mechanisms and the demands of the task determine which neural areas are recruited.

Emotion

Emotions induced by music activate frontal brain regions similar to those activated by emotions elicited by other stimuli. Schmidt and Trainor (2001) found that the valence (i.e., positive versus negative) of musical segments was distinguished by patterns of frontal EEG activity[19]. Joyful and happy musical segments were associated with increases in left frontal EEG activity, whereas fearful and sad musical segments were associated with increases in right frontal EEG activity. In addition, the intensity of emotions was differentiated by the pattern of overall frontal EEG activity, which increased as affective musical stimuli became more intense[19].
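
The frontal asymmetry measure behind such findings is conventionally computed from alpha-band power at homologous left and right frontal electrodes. The sketch below is a hypothetical illustration rather than Schmidt and Trainor's actual analysis; it assumes raw EEG segments from left (e.g., F3) and right (e.g., F4) frontal channels, estimates alpha power with Welch's method, and forms the standard log-ratio asymmetry index.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical single-trial EEG segments (in microvolts) from homologous
# frontal sites, e.g., F3 (left) and F4 (right), recorded while a listener
# hears a musical excerpt. Simulated here so the sketch runs on its own.
rng = np.random.default_rng(1)
sfreq = 256                          # sampling rate in Hz
n_samples = sfreq * 30               # a 30-second excerpt
left_f3 = rng.normal(0, 10, n_samples)
right_f4 = rng.normal(0, 10, n_samples)

def alpha_power(signal, sfreq, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band (Welch estimate)."""
    freqs, psd = welch(signal, fs=sfreq, nperseg=sfreq * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Conventional frontal asymmetry index: ln(right alpha) - ln(left alpha).
# Because alpha power varies inversely with cortical activity, more positive
# values are usually read as relatively greater LEFT frontal activity (the
# pattern reported for joyful/happy music) and more negative values as
# relatively greater RIGHT frontal activity (fearful/sad music).
asymmetry = np.log(alpha_power(right_f4, sfreq)) - np.log(alpha_power(left_f3, sfreq))
print(f"Frontal alpha asymmetry index: {asymmetry:.3f}")
```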

Music can also create an intensely pleasurable experience that can be described as "chills"[20]. Blood and Zatorre (2001) used PET to measure changes in cerebral blood flow while participants listened to music that they knew gave them "chills" or some other intensely pleasant emotional response. They found that as these chills increased, changes in cerebral blood flow appeared in brain regions such as the amygdala, orbitofrontal cortex, ventral striatum, midbrain and ventral medial prefrontal cortex. Many of these areas are linked to reward, motivation, emotion and arousal, and are also activated in other pleasurable situations[20].

Memory

Neuropsychology of Musical Memory

Musical memory involves both explicit and implicit memory systems[21]. Explicit musical memory is further differentiated into episodic memory (the where, when and what of the musical experience) and semantic memory (knowledge about music, including facts and emotional concepts). Implicit memory centers on the 'how' of music and involves automatic processes such as procedural memory and motor skill learning, in other words skills critical for playing an instrument. Baird and Samson (2009) found that the ability of musicians with Alzheimer's disease to play an instrument (implicit procedural memory) may be preserved.

Neural Correlates of Musical Memory

A PET study looking into the neural correlates of musical semantic and episodic memory found distinct activation patterns[22]. Semantic musical memory involves the sense of familiarity with songs. The semantic-memory-for-music condition resulted in bilateral activation in the medial and orbital frontal cortex, as well as activation in the left angular gyrus and the left anterior region of the middle temporal gyrus. These patterns support the functional asymmetry favouring the left hemisphere for semantic memory. The left anterior temporal and inferior frontal regions activated in the musical semantic memory task produced activation peaks specifically during the presentation of musical material, suggesting that these regions are somewhat functionally specialized for musical semantic representations.

Episodic memory for musical information involves the ability to recall the former context associated with a musical excerpt[22]. In the condition invoking episodic memory for music, activations were found bilaterally in the middle and superior frontal gyri and the precuneus, with activation predominant in the right hemisphere. Other studies have found the precuneus to be activated during successful episodic recall[23]; as it was activated in the familiar-memory condition of episodic memory, this activation may reflect successful recall of the melody.

For memory of pitch, a dynamic and distributed brain network appears to subserve pitch memory processes. Gaab, Gaser, Zaehle, Jancke and Schlaug (2003) examined the functional anatomy of pitch memory using fMRI[24]. An analysis of performance scores on a pitch memory task revealed a significant correlation between good task performance and activation in the supramarginal gyrus (SMG) as well as the dorsolateral cerebellum. These findings indicate that the dorsolateral cerebellum may act as a pitch discrimination processor and that the SMG may act as a short-term storage site for pitch information. Left-hemisphere regions were more prominently involved in the pitch memory task than right-hemisphere regions.
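
Brain-behaviour correlations of this kind are usually computed by relating each participant's task score to an activation estimate extracted from a region of interest. The sketch below is a simplified, hypothetical illustration (not Gaab et al.'s actual pipeline): it assumes per-subject pitch-memory accuracy scores and mean percent-signal-change values from a supramarginal gyrus ROI, and computes a Pearson correlation across subjects.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-subject data: pitch-memory task accuracy (% correct) and
# mean BOLD signal change (%) extracted from a supramarginal gyrus region of
# interest for each subject. Values are invented purely for illustration.
accuracy = np.array([62, 71, 55, 80, 74, 68, 90, 77, 59, 85], dtype=float)
smg_signal_change = np.array([0.21, 0.30, 0.18, 0.41, 0.33, 0.25,
                              0.52, 0.37, 0.20, 0.45])

# A positive, significant correlation would mirror the reported association
# between good task performance and supramarginal gyrus involvement.
r, p = pearsonr(accuracy, smg_signal_change)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```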

Impairment

Focal Hand Dystonia

Focal hand dystonia is a task-related movement disorder associated with occupational activities that require repetitive hand movements[25]. It is associated with abnormal processing in the premotor and primary sensorimotor cortices. An fMRI study examined five guitarists with focal hand dystonia[26]. The study reproduced task-specific hand dystonia by having the guitarists use a real guitar neck inside the scanner and perform a guitar exercise that triggered the abnormal hand movement. The dystonic guitarists showed significantly more activation of the contralateral primary sensorimotor cortex as well as bilateral underactivation of premotor areas. This activation pattern represents abnormal recruitment of the cortical areas involved in motor control. Even in professional musicians, widespread bilateral cortical involvement is necessary to produce complex hand movements such as scales and arpeggios. The abnormal shift from premotor to primary sensorimotor activation correlates directly with guitar-induced hand dystonia.

Music Agnosia

Main article: Music agnosia

Music agnosia, an auditory agnosia, is a syndrome of selective impairment in music recognition[27]. Dalla Bella and Peretz (1999) examined three cases of music agnosia: C.N., G.L. and I.R. All three patients had suffered bilateral damage to the auditory cortex, which resulted in musical difficulties while speech comprehension remained intact. Their impairment was specific to the recognition of once-familiar melodies; they remained able to recognize environmental sounds and lyrics. Peretz (1996) studied C.N.'s music agnosia further and reported an initial impairment of pitch processing with spared temporal processing[28]. C.N. later recovered pitch processing abilities but remained impaired in tune recognition and familiarity judgments.

Musical agnosias may be categorized according to the process that is impaired[29]. Apperceptive music agnosia involves an impairment at the level of perceptual analysis, an inability to encode musical information correctly. Associative music agnosia reflects an impaired representational system that disrupts music recognition. Many cases of music agnosia have resulted from surgery involving the middle cerebral artery. Patient studies have amassed a large amount of evidence indicating that the left hemisphere is more suited to holding long-term memory representations of music and that the right hemisphere is important for controlling access to these representations. Associative music agnosias tend to be produced by damage to the left hemisphere, while apperceptive music agnosia reflects damage to the right hemisphere.

Congenital Amusia

Congenital amusia, otherwise known as tone deafness, is a term for lifelong musical problems that are not attributable to intellectual disability, lack of exposure to music, deafness, or brain damage after birth[30]. Brain imaging studies have found that amusic brains have less white matter and a thicker cortex than controls in the right inferior frontal cortex. These differences suggest abnormal neuronal development in the auditory cortex and inferior frontal gyrus, two areas that are important in musical pitch processing.

Amygdala Damage

[Image: Damage to the amygdala may impair recognition of scary music.]

Damage to the amygdala can selectively impair the recognition of emotion in music. Gosselin, Peretz, Johnsen and Adolphs (2007) studied S.M., a patient with bilateral damage to the amygdala but an otherwise intact temporal lobe, and found that she was impaired in recognizing scary and sad music[31]. S.M.'s perception of happy music was normal, as was her ability to use cues such as tempo to distinguish between happy and sad music. It therefore appears that damage specific to the amygdala can selectively impair recognition of scary music.

Selective Deficit in Music Reading

Specific musical impairments may result from brain damage while other musical abilities remain intact. Cappelletti, Waley-Cohen, Butterworth and Kopelman (2000) reported the case of P.K.C., a professional musician who sustained damage to the left posterior temporal lobe as well as a small right occipitotemporal lesion[32]. After sustaining damage to these regions, P.K.C. was selectively impaired in reading, writing and understanding musical notation but maintained other musical skills. The ability to read aloud letters, words, numbers and symbols (including musical ones) was retained. However, P.K.C. was unable to read aloud musical notes on the staff, regardless of whether the task involved naming them with the conventional letters or singing or playing them. Despite this specific deficit, P.K.C. retained the ability to remember and play familiar and new melodies.

Auditory Arrhythmia

Arrhythmia in the auditory modality is defined as a disturbance of rhythmic sense and includes deficits such as the inability to perform music rhythmically, to keep time to music, or to discriminate between or reproduce rhythmic patterns[33]. A study investigating the elements of rhythmic function examined patient H.J., who acquired arrhythmia after sustaining a right temporoparietal infarct[33]. Damage to this region impaired H.J.'s central timing system, which appears to underlie his global rhythmic impairment; for example, H.J. was unable to generate steady pulses in a tapping task. These findings suggest that keeping a musical beat relies on the right temporal auditory cortex.



References

  1. Brattico, E., Tervaniemi, M., Naatanen, R., & Peretz, I. (2006). Musical scale properties are automatically processed in the human auditory cortex. Brain Research, 1117, 162-174.
  2. Hyde, K. L., Peretz, I., & Zatorre, R. J. (2008). Evidence for the role of the right auditory cortex in fine pitch resolution. Neuropsychologia, 46, 632-639.
  3. Patterson, R. D., Uppenkamp, S., Johnsrude, I. S., & Griffiths, T. D. (2002). The processing of temporal pitch and melody information in auditory cortex. Neuron, 36, 767-776.
  4. Brown, S., Martinez, M. J., & Parsons, L. M. (2006). Music and language side by side in the brain: a PET study of the generation of melodies and sentences. European Journal of Neuroscience, 23, 2791-2803.
  5. Jentschke, S., Koelsch, S., Sallat, S., & Friederici, A. D. (2008). Children with specific language impairment also show impairment of music-syntactic processing. Journal of Cognitive Neuroscience, 20(11), 1940-1951.
  6. Stewart, L., Walsh, V., Frith, U., & Rothwell, J. (2001). Transcranial magnetic stimulation produces speech arrest but not song arrest. Annals of the New York Academy of Sciences, 930, 433-435.
  7. Gaser, C., & Schlaug, G. (2003). Brain structures differ between musicians and non-musicians. The Journal of Neuroscience, 23(27), 9240-9245.
  8. Krings, T., Topper, R., Foltys, H., Erberich, S., Sparing, R., Willmes, K., & Thron, A. (2000). Cortical activation patterns during complex motor tasks in piano players and control subjects: a functional magnetic resonance imaging study. Neuroscience Letters, 278, 189-193.
  9. Koeneke, S., Lutz, K., Wustenberg, T., & Jancke, L. (2004). Long-term training affects cerebellar processing in skilled keyboard players. Neuroreport, 15(8), 1279-1282.
  10. Koelsch, S., Gunter, T., Friederici, A. D., & Schoger, E. (2000). Brain indices of music processing: "nonmusicians" are musical. Journal of Cognitive Neuroscience, 12(3), 520-541.
  11. Koelsch, S., Schroger, E., & Gunter, T. (2002). Music matters: preattentive musicality of the human brain. Psychophysiology, 39, 38-48.
  12. Koelsch, S., Maess, B., Grossmann, T., & Friederici, A. D. (2003). Electric brain responses reveal gender differences in music processing. Neuroreport, 14(5), 709-713.
  13. Koelsch, S., Grossmann, T., Gunter, T. C., Hahne, A., Schroger, E., & Friederici, A. D. (2003). Children processing music: electric brain responses reveal musical competence and gender differences. Journal of Cognitive Neuroscience, 15(5), 683-693.
  14. Halpern, A. R. (2001). Cerebral substrates of musical imagery. Annals of the New York Academy of Sciences, 930, 179-192.
  15. Herholz, S. C., Lappe, C., Knief, A., & Pantev, C. (2008). Neural basis of music imagery and the effect of musical expertise. European Journal of Neuroscience, 28, 2352-2360.
  16. Zatorre, R. J., Halpern, A. R., Perry, D. W., Meyer, E., & Evans, A. C. (1996). Hearing in the mind's ear: a PET investigation of musical imagery and perception. Journal of Cognitive Neuroscience, 8(1), 29-46.
  17. Takeuchi, A. H., & Hulse, S. H. (1993). Absolute pitch. Psychological Bulletin, 113, 345-361.
  18. Zatorre, R. J., Perry, D. W., Beckett, C. A., Westbury, C. F., & Evans, A. C. (1998). Functional anatomy of musical processing in listeners with absolute pitch and relative pitch. Proceedings of the National Academy of Sciences, 95, 3172-3177.
  19. Schmidt, L. A., & Trainor, L. J. (2001). Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cognition and Emotion, 15(4), 487-500.
  20. Blood, A. J., & Zatorre, R. J. (2001). Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proceedings of the National Academy of Sciences of the United States of America, 98(20), 11818-11823.
  21. Baird, A., & Samson, S. (2009). Memory for music in Alzheimer's disease: unforgettable? Neuropsychology Review, 19, 85-101.
  22. Platel, H., Baron, J.-C., Desgranges, B., Bernard, F., & Eustache, F. (2003). Semantic and episodic memory of music are subserved by distinct neural networks. NeuroImage, 20, 244-256.
  23. Kapur, S., Craik, F. I. M., Jones, C., Brown, G. M., Houle, S., & Tulving, E. (1995). Functional role of the prefrontal cortex in retrieval of memories: a PET study. NeuroReport, 6, 1880-1884.
  24. Gaab, N., Gaser, C., Zaehle, T., Jancke, L., & Schlaug, G. (2003). Functional anatomy of pitch memory – an fMRI study with sparse temporal sampling. NeuroImage, 19, 1417-1426.
  25. Chen, R., & Hallett, M. (1998). Focal dystonia and repetitive motion disorders. Occupational Health and Industrial Medicine, 39(3), 122.
  26. Pujol, J., Roset-Llobet, J., Rosines-Cubells, D., Deus, J., Narberhaus, B., Valls-Sole, J., Capdevila, A., & Pascual-Leone, A. (2000). Brain cortical activation during guitar-induced hand dystonia studied by functional MRI. NeuroImage, 12, 257-267.
  27. Dalla Bella, S., & Peretz, I. (1999). Music agnosias: selective impairments of music recognition after brain damage. Journal of New Music Research, 28(3), 209-216.
  28. Peretz, I. (1996). Can we lose memory for music? A case of music agnosia in a nonmusician. Journal of Cognitive Neuroscience, 8(6), 481-496.
  29. Ayotte, J., Peretz, I., Rousseau, I., Bard, C., & Bojanowski, M. (2000). Patterns of music agnosia associated with middle cerebral artery infarcts. Brain, 123, 1926-1938.
  30. Peretz, I. (2008). Musical disorders. Current Directions in Psychological Science, 17(5), 329-333.
  31. Gosselin, N., Peretz, I., Johnsen, E., & Adolphs, R. (2007). Amygdala damage impairs emotion recognition from music. Neuropsychologia, 45, 236-244.
  32. Cappelletti, M., Waley-Cohen, H., Butterworth, B., & Kopelman, M. (2000). A selective loss of the ability to read and write music. Neurocase, 6(4), 321-332.
  33. Wilson, S. J., Pressing, J. L., & Wales, R. J. (2002). Modelling rhythmic function in a musician post-stroke. Neuropsychologia, 40, 1494-1505.

This page uses Creative Commons Licensed content from Wikipedia.