Sonification of Emotion

Participants: R. Michael Winters
Ian Hattwick
Marcelo M. Wanderley (Principal Investigator)

Funding: NSERC Engage Grant, “Audio Environment for the Emotional Imaging Composer”

Project Type: Research and Development

Time Period: April 2012 - Sept. 2012. Status: ongoing.

Project Description

Purpose of the Grant

Emotional Imaging Inc. (EII) and researchers at McGill University will work to develop an extended interface for the Emotional Imaging Composer (EIC), a software system under development at EII. The extension will consist of an audio environment for the control of sound synthesis and the interactive sonification of human emotional states. The expanded interface addresses the limitations of EII's current visual-only interface and broadens the system's use to music performance and data analysis contexts.

Company-Specific Program

The EIC is a multimedia tool that translates bio-signals into emotionally responsive environments in real-time. The foundation of the EIC is a custom-built wireless finger sensor that captures seven time-varying physiological measures of the autonomic nervous system (ANS). Through a procedure involving signal processing, feature extraction, transformation, and machine recognition, these signals are mapped onto human affective states with over 90 percent accuracy [1]. The signals extracted from the ANS are particularly useful in that they are difficult to consciously control or fake. The EIC has tremendous potential for the field of affective computing, as it provides the computer with a reliable method for extracting emotional states from the user [2]. It has previously been released as a hardware interface for data visualization that generates a fluid collage of visual images corresponding directly to the user's internal emotional experience.
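The final recognition stage of such a pipeline can be sketched in miniature. The following is purely illustrative: the feature names, state labels, centroid values, and nearest-centroid classifier are all assumptions for the sake of example, not EII's actual measures or method.

```python
import math

# Illustrative per-state centroids in a 3-feature space
# (e.g. skin conductance, heart rate, skin temperature), z-scored.
# These labels and values are invented for this sketch.
CENTROIDS = {
    "calm":     (-0.8, -0.6,  0.5),
    "excited":  ( 0.9,  0.8,  0.1),
    "stressed": ( 0.7,  0.4, -0.7),
}

def classify(features):
    """Return the affective state whose centroid is nearest (Euclidean)."""
    def dist(centroid):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(features, centroid)))
    return min(CENTROIDS, key=lambda state: dist(CENTROIDS[state]))

# A feature vector close to the "excited" centroid:
print(classify((0.8, 0.7, 0.0)))  # → excited
```

A production system would of course use trained models rather than hand-set centroids, but the shape of the mapping, from a continuous feature vector to a discrete affective label, is the same.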

Benefits of the Audio Environment

Bio-signals provide intimate information about the emotional and physical states of musicians, information which can be used in powerful ways for a variety of musical purposes, including virtuosic performance [3], real-time composition [4], and pedagogy [5]. Research into gestural control of music has led to the use of sensor technology in novel musical instruments [6], and bio-signal sensors promise to contribute to the development of expressive musical performance systems [7]. However, there remains a need for commercially available integrated hardware and software systems that will enable musicians and researchers to easily and accurately incorporate bio-signals into their work [8]. The goal of sonification is to use sound to convey the structures and dynamics embedded within data in the most clear and accurate way [9, 10]. Characteristics of bio-signals, including their multi-dimensionality, complexity, and time-varying dynamics, point to the efficacy of sonification for bio-signal analysis. Sonification allows signals to be monitored "eyes-free," permitting the user to actively engage in another task and providing relief from a crowded visual interface [11], and it provides a means to dynamically explore a high-dimensional space, which has proven useful for uncovering non-obvious data relations [12].



More Information