The Breakflute, currently under construction, is a digital musical instrument that displays vibrotactile feedback. The physical interface is similar to an open-tonehole flute, with vibration actuators inside the mouthpiece and toneholes. The interface will control a specialized digital synthesizer that will provide both sound and vibration feedback. These two high-resolution feedback channels, simultaneously active and tightly correlated, will provide the performer with a richer source of information for self-monitoring than sound feedback alone (O’Modhrain, 2000). A flute design has been chosen because open-tonehole flute players are in the unique position of having their lips and fingertips – the areas of the body most sensitive to vibration – in direct contact with the vibrating air column that radiates sound. The Breakflute will complement the growing repository of capable gestural controllers. The most important goal of this project is to document the fabrication of a musical instrument that is both systematically designed and a useful artistic tool.
The physical interface will include the following systems:
A body that is characteristically similar to an existing flute; the model used will depend on size and weight requirements of the interface.
A sensor system that detects: (a) embouchure gesture rather than discrete relative breath pressure; (b) tonehole coverage, providing some level of continuous finger proximity (rather than binary switches) using capacitive sensing techniques; and (c) bell acceleration, to capture posture and ancillary gestures.
A vibrotactile “reed” capable of producing vibrotactile sensations at the embouchure. This will be the second implementation of the concept, demonstrated previously with my Touch Flute (Birnbaum, 2004), this time focused on maximizing the information displayed to the performer.
A vibrotactile display for the fingertips, installed inside the toneholes of the instrument.
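As a sketch of how continuous tonehole coverage might be derived from a capacitive sensor, the following maps a raw capacitance reading to a coverage estimate in [0, 1]. The function name and the calibration scheme (baseline and full-touch readings taken in advance) are assumptions for illustration; the actual sensing hardware and calibration procedure are not yet specified.

```python
def tonehole_coverage(raw_count, baseline, full_touch):
    """Map a raw capacitance count to a continuous 0-1 coverage estimate.

    baseline:   reading with the finger far from the hole
    full_touch: reading with the hole fully covered
    (both would be measured during a calibration pass)
    """
    span = full_touch - baseline
    if span == 0:
        return 0.0
    coverage = (raw_count - baseline) / span
    # Clamp so out-of-range readings still yield a valid proximity value.
    return max(0.0, min(1.0, coverage))
```

Reporting a clamped continuous value, rather than thresholding to a binary switch, is what would allow half-holing and sliding gestures to reach the synthesizer.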
In the terminology of Cadoz et al. (Cadoz, 2000): excitation will utilize a reed-embouchure metaphor of energy input; selection gestures will consist of fingerings to select, play, and rearrange percussion samples using wavetable synthesis; and modification will be controlled by the instrument-body dynamic (Wanderley, 2001).
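To illustrate the selection layer, here is a minimal wavetable-playback sketch: a selected percussion sample is read out at a variable rate with linear interpolation. The function name, interpolation scheme, and sample representation are my own assumptions, not the Breakflute's actual synthesizer, which is yet to be implemented.

```python
def render_wavetable(table, length, rate=1.0):
    """Naive wavetable playback: read `length` output samples from `table`,
    advancing the read phase by `rate` each sample, with linear
    interpolation and wraparound at the table boundary."""
    out = []
    phase = 0.0
    n = len(table)
    for _ in range(length):
        i = int(phase)
        frac = phase - i
        # Interpolate between adjacent table entries, wrapping at the end.
        s = table[i % n] * (1.0 - frac) + table[(i + 1) % n] * frac
        out.append(s)
        phase = (phase + rate) % n
    return out
```

A fingering would select which table is active, while `rate` transposes the sample; rearranging samples then amounts to swapping table references between fingerings.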
Components of rhythm such as phrase, accent, and measure subdivision will be represented as tactile stimulation events (TSEs) (Rovan, 2000). The vibration synthesizer will process the musical signal for the vibrotactile channel, detecting perceptual musical variables such as brightness, noisiness, and spectral centroid, and encoding them as vibrations.
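One way such a feature extraction stage could work is sketched below: the spectral centroid (a common correlate of brightness) is estimated from an audio frame via a naive DFT, then mapped to a vibration drive level. Both function names and the linear centroid-to-amplitude mapping are hypothetical choices for illustration, not the Breakflute's actual encoding.

```python
import math

def spectral_centroid(frame, sample_rate):
    """Estimate the spectral centroid (Hz) of one audio frame using a
    naive DFT over the first half of the spectrum (real input)."""
    n = len(frame)
    half = n // 2
    mags = []
    for k in range(half):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    total = sum(mags)
    if total == 0:
        return 0.0
    bin_hz = sample_rate / n
    # Magnitude-weighted mean frequency of the spectrum.
    return sum(k * bin_hz * m for k, m in enumerate(mags)) / total

def centroid_to_vibration_amplitude(centroid_hz, max_hz=4000.0):
    """Map brightness to a vibration drive level in [0, 1]
    (linear mapping and 4 kHz ceiling are assumed values)."""
    return min(1.0, centroid_hz / max_hz)
```

In practice an FFT would replace the naive DFT, and noisiness would need its own estimator (e.g. spectral flatness), but the pipeline shape - frame, feature, vibration parameter - would be the same.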
This work is an extension of the Touch Flute, designed by the same author, which was a proof-of-concept prototype for a vibrotactile wind controller.
The Touch Flute was constructed to facilitate experiments to determine the usefulness of vibration information to wind performers. It provides a gestural control interface along with both audio and vibration output. The ability to arbitrarily control the frequency and amplitude of the instrument’s vibrations allows for empirical testing of musicians’ reactions to variability in the parameters of vibrotactile feedback as compared to auditory feedback. The effect of unexpected or impossible vibration parameters, such as inverting the frequency response, could also be observed.
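The kind of arbitrary parameter control described above can be sketched as a single mapping function: played pitch is clamped into the actuator's usable band, with an optional inverted mode in which higher notes produce lower vibrations. The function name, the band limits, and the inversion scheme are all assumptions for illustration; they are not taken from the Touch Flute's implementation.

```python
def vibration_frequency(note_hz, low=100.0, high=1000.0, invert=False):
    """Map a played pitch to an actuator drive frequency.

    low/high: assumed usable band of the vibration actuator.
    invert:   reverse the mapping so higher notes yield lower
              vibrations - an "impossible" feedback condition.
    """
    clamped = max(low, min(high, note_hz))
    if invert:
        # Reflect the pitch across the band: low maps to high and vice versa.
        return high - (clamped - low)
    return clamped
```

Because the mapping is a pure function of the control signal, conditions like inversion can be swapped in mid-experiment without touching the gestural interface.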
In the informal experiments that were conducted with the Touch Flute, users reported an improved feeling of “warmth” or “familiarity” with the instrument when vibrations were present, indicating that players of traditional woodwind instruments might prefer vibration feedback in wind controllers for increased realism.