Unsounding Objects

Author:

John Sullivan



Description:

The goal of Unsounding Objects is to create digital musical instruments that use the timbral qualities of pre-existing objects to drive sound synthesis and compositional structure during a musical performance. Non-musical objects are frequently integrated into contemporary percussion performance practice. Through audio feature extraction, these objects can be used as intuitive interfaces for the control of sound synthesis. Percussionists are accustomed to intimate control of timbre through a wide variety of performance techniques, and our research will leverage this expertise to allow intuitive control of digital musical instruments in a percussion ensemble composition.

Piezo contact microphones are commonly used to amplify acoustically inert objects for musical performance. For our purposes, we will build a piezo-based contact microphone that allows for flexible and stable placement on a variety of objects with maximum rejection of ambient sound. The sound of the miked objects will not be amplified; instead, perceptually relevant audio features will be extracted from it.

Many different perceptual audio features can be extracted (spectral centroid, roughness, Bark coefficients, harmonic flux, etc.), and perceptual parameters have previously been extracted from live audio to control sound synthesis (cf. Tod Machover, Sparkler). To allow real-time control of sound synthesis, it is necessary to determine the smallest set of features that characterizes the timbre of an object. The available features will therefore need to be evaluated to determine which are most perceptually relevant for a given source object.
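To make the analysis step concrete, the sketch below computes two such frame-wise descriptors (spectral centroid and spectral flux) from a mono audio buffer with NumPy. The frame size, hop size, and windowing are illustrative defaults rather than values fixed by the project, and a real-time implementation would run the same arithmetic on a live input stream (e.g., in Max/MSP, Pd, or SuperCollider) rather than on a stored buffer.

import numpy as np

def frame_features(signal, sr, frame_size=1024, hop=512):
    """Per-frame spectral centroid and spectral flux for a mono signal.

    Frame and hop sizes are illustrative defaults, not project-specified values.
    """
    window = np.hanning(frame_size)
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sr)
    prev_mag = None
    centroids, fluxes = [], []

    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size] * window
        mag = np.abs(np.fft.rfft(frame))

        # Spectral centroid: magnitude-weighted mean frequency (a brightness correlate).
        total = mag.sum()
        centroids.append(float((freqs * mag).sum() / total) if total > 0 else 0.0)

        # Spectral flux: summed positive change in magnitude between successive
        # frames, useful as a rough attack/onset indicator.
        if prev_mag is None:
            fluxes.append(0.0)
        else:
            fluxes.append(float(np.sum(np.maximum(mag - prev_mag, 0.0))))
        prev_mag = mag

    return np.array(centroids), np.array(fluxes)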

The perceptual features will then be routed through two intermediate mapping layers. The first layer will extract characteristics of the performers’ gestures from the audio analysis and will be developed in tandem with the choice of instrumental gestures used for performance. The second layer will provide collaborative control, in which performers share joint control of synthesis. The parameters of this layer will be determined by the development of a compositional strategy. The varying timbral characteristics of the objects mean that identical mapping strategies and sound synthesis algorithms will produce different sonic results when played on different objects. One compositional approach will be to develop musical motives that are transformed by the objects on which they are played. Percussion ensemble compositions frequently employ open instrumentation (e.g., Xenakis’ Psappha); we will adopt this strategy: which objects are played is left open, while the mapping strategies and sound synthesis algorithms are predetermined.
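As a way of picturing the two layers, the sketch below lays out a hypothetical per-performer gesture layer followed by a shared collaborative layer. The descriptor names, smoothing factor, and combination rules (averaging brightness, taking the strongest strike) are placeholder assumptions of our own, not the project’s actual mapping.

import numpy as np

class GestureLayer:
    """Layer 1 (per performer): derive gesture descriptors from extracted audio features."""

    def __init__(self, smoothing=0.8):
        self.smoothing = smoothing   # placeholder one-pole smoothing coefficient
        self.brightness = 0.0

    def update(self, centroid, flux):
        # Smooth the spectral centroid into a slowly varying brightness descriptor,
        # and treat spectral flux as a proxy for strike intensity.
        self.brightness = (self.smoothing * self.brightness
                           + (1.0 - self.smoothing) * centroid)
        return {"brightness": self.brightness, "strike": flux}

class CollaborativeLayer:
    """Layer 2 (shared): combine all performers' gestures into joint synthesis controls."""

    def update(self, gestures):
        # Placeholder policy: average brightness drives a global filter cutoff,
        # the strongest strike drives a shared excitation amount.
        cutoff = float(np.mean([g["brightness"] for g in gestures]))
        excitation = float(np.max([g["strike"] for g in gestures]))
        return {"filter_cutoff": cutoff, "excitation": excitation}

performers = [GestureLayer() for _ in range(4)]
shared = CollaborativeLayer()
controls = shared.update([p.update(centroid=1800.0, flux=0.3) for p in performers])

In practice the outputs of the second layer would feed whatever synthesis parameters the composition calls for; the point of the sketch is only the separation between per-performer and shared control.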

One of our primary goals is the interdisciplinary development of the instrument (interface, feature extraction, mapping, sound synthesis) alongside the performance practice and the composition. To this end, we will hold weekly workshop meetings that will inform our individual research over the course of the project. At the end of the project, we will present conclusions regarding optimal audio feature extraction algorithms, mapping strategies for controlling sound synthesis with perceptual audio features, and a composition for percussion ensemble that uses non-musical objects to control sound synthesis and compositional structure.


IDMIL Participants:


External Participants:

Zach Hale

Preston Beebe

Fabrice Marandola

Philippe Leroux


Research Areas:


Funding:


Publications:



Images: