Emotions evoked by music

 

Research on musically evoked emotions has grown substantially over the last decade. One challenge in this area of study is that emotions evoked by music are elusive: they often defy conventional emotion models, such as basic-emotion categories (e.g., sadness, anger, joy) or the dimensions of arousal and valence. To better understand and describe musically evoked emotions, we developed a domain-specific model (see Figure 1 below).

From this model we developed the Geneva Emotional Music Scale (GEMS), a tool for assessing musically evoked emotions. The GEMS comprises 45 emotion terms, nine emotion scales, and three superfactors. Shorter versions of the scale are also available. Detailed information about all scales can be found here.
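As a rough illustration of this hierarchy, the nine GEMS scales and their grouping into the three superfactors can be written down as a simple data structure. The grouping below follows Zentner et al. (2008); the variable names are illustrative and not part of any official GEMS release.

    # Minimal sketch of the GEMS hierarchy (after Zentner et al., 2008):
    # three superfactors, each grouping several of the nine emotion scales.
    GEMS_SUPERFACTORS = {
        "Sublimity": ["Wonder", "Transcendence", "Tenderness",
                      "Nostalgia", "Peacefulness"],
        "Vitality": ["Power", "Joyful activation"],
        "Unease": ["Tension", "Sadness"],
    }

    # Each scale is in turn measured by a subset of the 45 emotion terms;
    # the full item list is given in the original publication.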

This work has branched into several lines of inquiry. For example, neuroscience research has shown that several GEMS emotions are associated with distinct patterns of neurophysiological activation (Trost et al., 2012). Another line of research examines the applicability of the GEMS to optimizing the emotion metadata of audio files (so-called ‘tags’). Specifically, it investigates how digital music libraries can be organized by emotion and how music recommendation algorithms might be enhanced.
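To give a concrete flavour of such emotion tagging, the sketch below averages hypothetical per-scale GEMS ratings of a single track into superfactor scores and keeps the strongest ones as tags. It reuses the GEMS_SUPERFACTORS mapping sketched above; the function and data are our own illustrative assumptions and do not describe the project's actual method.

    from statistics import mean

    def emotion_tags(scale_ratings, superfactors, top_n=2):
        """Aggregate per-scale GEMS ratings (e.g., 1-5) into superfactor
        scores and return the strongest ones as emotion tags for a track.
        Illustrative sketch only."""
        scores = {
            name: mean(scale_ratings[s] for s in scales if s in scale_ratings)
            for name, scales in superfactors.items()
        }
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    # Hypothetical listener ratings for one track:
    ratings = {"Wonder": 4, "Nostalgia": 5, "Peacefulness": 3,
               "Power": 2, "Joyful activation": 2, "Tension": 1, "Sadness": 2}
    print(emotion_tags(ratings, GEMS_SUPERFACTORS))  # ['Sublimity', 'Vitality']

Tags derived this way could then be attached to tracks in a digital music library, allowing browsing or recommendation by emotional character rather than by genre alone.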

 


Figure 1. Illustration of the GEMS model (Zentner et al., 2008)




Literature:

Zentner, M., Grandjean, D., & Scherer, K. (2008). Emotions evoked by the sound of music: Characterization, classification, and measurement. Emotion, 8, 494-521.

Zentner, M. (2011). Homer’s Prophecy: An essay on music's primary emotions. Music Analysis, 29, 102-125.

Trost, W., Ethofer, T., Zentner, M., & Vuilleumier, P. (2012). Mapping aesthetic musical emotions in the brain. Cerebral Cortex, 22, 2769-2783.

Eerola, T. (PI), & Zentner, M. (Co-PI). Tagging online music contents for emotion: A systematic approach based on contemporary emotion research. [Link to the ESRC research project]