Now available online: Carolina Brum Medeiros and Marcelo M. Wanderley (2014). A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression. Sensors 14, no. 8: 13556-13591.
Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in recent decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them arts, design and engineering. Balancing these areas is an essential task in DMI design, so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009–2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that the specifications for artistic tools are more demanding than those for military applications, this study raises a paradox: in most cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronic instrumentation design in DMIs. To this end, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits, and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments.
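As a minimal illustration of the kind of sensor fusion the abstract advocates (not code from the paper), the sketch below implements a standard complementary filter: it fuses a noisy accelerometer-derived tilt angle with a drift-prone gyroscope rate into a single, more stable estimate. The signal values and filter coefficient are invented for the example.

```python
def complementary_filter(accel_angles, gyro_rates, dt, alpha=0.98):
    """Fuse accelerometer tilt angles (noisy but drift-free) with
    gyroscope rates (smooth but drifting) into one stable estimate."""
    angle = accel_angles[0]  # initialise from the accelerometer
    fused = [angle]
    for acc, rate in zip(accel_angles[1:], gyro_rates[1:]):
        # Gyro integration tracks fast motion; the accelerometer term
        # slowly pulls the estimate back, cancelling long-term drift.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        fused.append(angle)
    return fused

# Simulated stationary sensor: true tilt is 10 degrees, the
# accelerometer reading jitters by +/-2 degrees, the gyro reads zero.
accel = [10 + ((-1) ** i) * 2.0 for i in range(50)]
gyro = [0.0] * 50
estimate = complementary_filter(accel, gyro, dt=0.01)
```

The fused output settles near the true 10-degree tilt with far less jitter than the raw accelerometer stream; in a DMI, the same idea lets a low-cost IMU report performer gesture more smoothly than either sensor alone.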
Now available online: Malloch, J., Sinclair, S., and Wanderley, M. M. (2014). Distributed tools for interactive design of heterogeneous signal networks. Multimedia Tools and Applications. DOI: 10.1007/s11042-014-1878-5
We introduce libmapper, an open source, cross-platform software library for flexibly connecting disparate interactive media control systems at run-time. This library implements a minimal, openly-documented protocol meant to replace and improve on existing schemes for connecting digital musical instruments and other interactive systems, bringing clarified, strong semantics to system messaging and description. We use automated discovery and message translation instead of imposed system-representation standards to approach “plug-and-play” usability without sacrificing design flexibility. System modularity is encouraged, and data are transported between peers without centralized servers.
Mike Winters will present his research on sonification at the CIRMMT Workshop on Symbolic Music Processing, Semantic Audio, and Music Information Retrieval. The talk, entitled “Sonification in MIR: Corpora Analysis and Music Emotion Recognition Model Matching,” concludes the workshop and will cover motivations and techniques for sonification in symbolic music analysis, as well as his research on the sonification of emotion, specifically emotion-model matching.
Masters student R. Michael Winters' research on the sonification of emotion is scheduled to appear in the April 2014 issue of Organised Sound. The paper is entitled “When the Data is Emotion: Strategies and Results from the Intersection of Sonification and Music,” and features the first computational design and evaluation of sonifications of emotion. He used the MIREmotion function from the MIRToolbox and designed two MATLAB frameworks for interactive exploration and iterative design. The paper will also appear as Chapter 4 of his manuscript-based thesis, “Exploring Music through Sound: Sonification of Emotion, Gesture, and Corpora.”