Research in Music Cognition
Navigation in Tonal Chronotope
Melodic contour transformation
Mental Rotation in Visual and Musical Space
Marina Korsakova-Kreyn & W. Jay Dowling
Melodic Rotation flier SMPC - 2009
Literature Review for Melodic Rotation - 2005
Musicians habitually think about music in quasi-spatial terms.
In general, we seem to use spatial abilities in music perception by cognizing melodic objects within a tonal reference system (the scale). I speculated that the human brain transcends the modality of tonal-temporal patterns in the perception of musical structures, and that perhaps the same neural substrate is involved in visuo-spatial processing and music cognition. The hypothesis predicted that melodic transformation and mental rotation share the same neural substrate in the parietal lobe and that, at a certain level of computation, our mind transcends the modality of information.
To explore this hypothesis, I created a study that compared performance on two congruency tasks in two different modalities: the first task was mental rotation, and the second was a judgment of the congruency of melodic objects in tonal space (melodic transformation). The melodic objects (original melodies from polyphonic compositions by J. S. Bach) were carefully selected to serve as melodic analogues of the 3-D objects from the Shepard-Metzler study of mental rotation. The same melodies were used in a control task in which the participants judged timbral change. Performance on the visuo-spatial and music perception tasks was correlated. There was also a gender effect on all three tasks, with males performing better than females. Moreover, the timbre judgment task correlated with mental rotation for women but not for men. The pattern of correlations suggests the involvement of specific cognitive mechanisms for spatial and melodic transformations in men, and of more general cognitive mechanisms in women.
It may be that the brain has mechanisms for processing hierarchically organized information independent of the modality in which that information is delivered. In addition, it is possible that I stumbled upon a test of fluid intelligence.
My prediction that the melodic transformation task might generate neural activity in the parietal cortices (specifically BA 7) was confirmed by Foster & Zatorre (2010) and Zatorre, Halpern & Bouffard (2010).
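The melodic analogues of rigid spatial transformations mentioned above can be made concrete with a small sketch. This is not the study's stimulus-generation code; the pitch representation (MIDI note numbers), the function names, and the sample melody are illustrative assumptions.

```python
# Melodies as lists of MIDI pitch numbers; the three classic melodic
# transformations parallel rigid spatial motions: transposition
# (translation), inversion (mirror in pitch), retrograde (mirror in time).

def transpose(melody, semitones):
    """Shift every pitch by a fixed number of semitones."""
    return [p + semitones for p in melody]

def invert(melody):
    """Mirror each pitch around the melody's first note."""
    axis = melody[0]
    return [2 * axis - p for p in melody]

def retrograde(melody):
    """Reverse the melody in time."""
    return list(reversed(melody))

# A short hypothetical melody (C4, D4, E4, G4):
melody = [60, 62, 64, 67]
print(transpose(melody, 7))  # [67, 69, 71, 74]
print(invert(melody))        # [60, 58, 56, 53]
print(retrograde(melody))    # [67, 64, 62, 60]
```

A congruency trial of the kind described above could then ask whether a comparison melody is a faithful transformation of the standard or an altered ("incongruent") version, just as Shepard-Metzler trials ask whether a rotated figure matches the original.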
Critique of the Zatorre, Halpern, and Bouffard (2010) study: On retrograde
Affective Response to Tonal Modulation
Marina Korsakova-Kreyn, W. Jay Dowling, Joseph P. Dunlop, James C. Bartlett: Modulation
Modulation flier SMPC - 2009
It is generally accepted that music expresses and recreates emotional states. Music can be described as a pattern of tonal tension and relaxation arranged along the arrow of time. Music perception is a process of pattern recognition within a tonal system of reference in which the tones of the scale differ both in pitch (frequency) and in their degree of tonal attraction to a tonal center, the tonic. In the European musical tradition, the diatonic scale is the conventional tonal system of reference, and the major and minor modes represent two versions of the diatonic scale. Reorientation from one tonal center to another within the same musical composition is called tonal modulation.
Our study investigated affective response to tonal modulation to all 12 steps of the tonal schema (seven diatonic and five chromatic steps) in all modal conditions (Major to Major, Major to minor, minor to Major, and minor to minor), and compared responses to modulation to three selected steps (dominant, subdominant, and step 8) in the major mode only. The results demonstrate different affective responses to different degrees of modulation and agree with other studies on affective response to the major and minor modes. We found that affective responses agree with the theoretical model of pitch proximity and that they are influenced by the direction of modulation around the circle of fifths. In addition, the results indicate that responses were influenced by melodic contour patterns, and that in real music excerpts the affective influence of the degree of modulation can compete successfully with other expressive elements of music such as tempo, tessitura, and style.
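The notion of key distance around the circle of fifths used above can be illustrated with a short sketch (an illustration, not the study's analysis code): a modulation of k semitones lands 7k mod 12 positions along the circle of fifths, and the perceived distance corresponds to the shorter way around the circle.

```python
def fifths_distance(semitone_step):
    """Shortest distance on the circle of fifths for a modulation of
    `semitone_step` semitones above the tonic (0-11)."""
    position = (7 * semitone_step) % 12  # steps along the circle of fifths
    return min(position, 12 - position)  # take the shorter direction around

# Dominant (7 semitones) and subdominant (5 semitones) are the two
# closest keys; the tritone (6 semitones) is the most remote.
print(fifths_distance(7))  # 1
print(fifths_distance(5))  # 1
print(fifths_distance(6))  # 6
```

Because the dominant and subdominant sit one step away in opposite directions, a signed version of this distance would capture the directional (sharp-side versus flat-side) effect reported in the results.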
The findings suggest a connection between affective responses to music and perceived tonal tension. This connection is the basis for my model of “archaic” emotional processing. The archaic model employs Panksepp’s concept of a virtual body-image (the “virtual self” within our paleomammalian brain), which integrates minute somato- and viscero-motor responses to the environment. Following this concept, the archaic model proposes that our minute sensations of differences in perceived tension acquire affective properties because the sequencing of these sensations in music mimics the way the “virtual self” reflects and integrates the experience of the living organism. From this perspective, music can be explained as a temporally controlled sequence of tonal events that induces emotion by imitating the dynamics of the integration of somato- and viscero-motor information in the midbrain. The integration of these sensations according to the tonal-temporal program of a particular musical composition results in the generation of a particular emotion. These gut-felt sensations are not emotions; however, their artfully controlled pattern can trigger emotional responses at different levels of the physiological system. The interplay of different levels of perceived tension is related to the main morphological principles of music. (Together, time and tonal attraction define the acousmatic space of music: the Tonal Chronotope.) The value of these principles is revealed when they are utilized to affect our psychological state concurrently with the presentation of ideas that appeal to our aesthetic judgment.
Emotional Processing in Music
Summary of the Archaic Model for ICMPC-11
The proposed model of emotional processing attempts to explain music’s directness in communicating emotions by connecting perceived tension [1-5] to affective response [6]. This model draws on recognition of the strong pre-cognitive aspect of music perception [7-9] and suggests that affective responses to music are generated by the integration of “gut-felt” sensations produced by the temporally organized interplay of tonal tension and release. At the core of this model is Panksepp’s concept of a virtual body-image [10], or the “virtual self” within our paleomammalian brain, which integrates minute somato- and viscero-motor responses to the environment. Following this concept, the model of emotional processing proposes that the listener’s minute sensations of differences in perceived tension acquire affective properties because the sequencing of these sensations in music mimics the way the “virtual self” reflects and integrates the experience of the living organism. From this perspective, music can be explained as a sequence of tonal events that induces emotion by imitating the dynamics of the integration of somato- and viscero-motor information in the midbrain. The integration of these sensations according to the tonal-temporal program of a particular musical composition results in the generation of a particular emotion. These gut-felt sensations are not emotions; however, their artfully controlled pattern can trigger emotional responses at different levels of the psycho-physiological system. The emotion is built into a musical composition by employing the most primitive mechanism of a living organism’s reaction to its environment; the same primitive mechanism is engaged when the listener re-constructs the emotion while listening to the tonal-temporal program. The interplay of different levels of perceived tension is related to the main morphological principles of music: tonal attraction [11] and structured time [12].
The value of these principles is revealed when they are utilized to affect our psychological state concurrent with the presentation of forms and ideas that appeal to our aesthetic judgment.
1. Nielsen, F. V. (1983). Oplevelse af Musikalsk Spænding [The Experience of Musical Tension]. Copenhagen: Akademisk Forlag.
2. Krumhansl, C. L. (1996). A perceptual analysis of Mozart's Piano Sonata K. 282: Segmentation, tension, and musical ideas. Music Perception, 13, 401-432.
3. Bigand, E., Parncutt, R., & Lerdahl, F. (1996). Perception of musical tension in short chord sequences: The influence of harmonic function, sensory dissonance, horizontal motion, and musical training. Perception & Psychophysics, 58, 125-141.
4. Krumhansl, C. L., & Toiviainen, P. (2003). Tonal cognition. In I. Peretz & R. Zatorre (Eds.), The Cognitive Neuroscience of Music (pp. 95-108). Oxford University Press.
5. Korsakova-Kreyn, M. (2009). Affective Response to Tonal Modulation. Ph.D. dissertation, The University of Texas at Dallas.
6. Panksepp, J. (2004). Affective consciousness and the origins of human mind: A critical role of brain research on animal emotions. Impuls, 57, 47-60.
7. Shewmon, D. A., Holmes, D. A., & Byrne, P. A. (1999). Consciousness in congenitally decorticate children: Developmental vegetative state as self-fulfilling prophecy. Developmental Medicine and Child Neurology, 41, 364-374.
8. Panksepp, J., & Bernatsky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133-155.
9. Huron, D. (2006). Sweet Anticipation: Music and the Psychology of Expectation. Cambridge, MA: MIT Press.
10. Panksepp, J. (1998). The periconscious substrates of consciousness: Affective states and the evolutionary origins of the SELF. Journal of Consciousness Studies, 5, 566-582.
11. Krumhansl, C. L., & Shepard, R. N. (1979). Quantification of the hierarchy of tonal functions within a diatonic context. Journal of Experimental Psychology: Human Perception and Performance, 5, 579-594.
12. Scruton, R. (1997). The Aesthetics of Music. Oxford University Press.