IMMA-Emo: A multimodal interface for visualising score- and audio-synchronised emotion annotations
Document Type
Conference Proceeding
Publication Date
8-23-2017
Abstract
Emotional response to music is often represented on a two-dimensional arousal-valence space without reference to score information that may provide critical cues for explaining the observed data. To bridge this gap, we present IMMA-Emo, an integrated software system for visualising emotion data aligned with music audio and score, providing an intuitive way to interactively visualise and analyse music emotion data. The visual interface also allows for the comparison of multiple emotion time series. The IMMA-Emo system builds on the online interactive Multi-modal Music Analysis (IMMA) system. Two examples demonstrating the capabilities of the IMMA-Emo system are drawn from an experiment set up to collect arousal-valence ratings based on participants' perceived emotions during a live performance. Direct observation of the corresponding score parts and aural input from the recording allow explanatory factors to be identified for the ratings and for changes in the ratings.
Publication Title
ACM International Conference Proceeding Series
Volume
Part F131930
Digital Object Identifier (DOI)
10.1145/3123514.3123545
ISBN
9781450353731
Citation Information
Herremans, D., Yang, S., Chuan, C.-H., Barthet, M., & Chew, E. (2017). IMMA-Emo: A Multimodal Interface for Visualising Score- and Audio-synchronised Emotion Annotations. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences, 1–8. https://doi.org/10.1145/3123514.3123545