Authors: Gregory Burlet, Marcelo M. Wanderley, Ichiro Fujinaga
Publication or Conference Title: Proceedings of the 2013 International Conference on New Interfaces for Musical Expression (NIME 2013)
Abstract: Sensor-based gesture recognition is investigated as a solution to the problem of managing an overwhelming number of audio effects during live guitar performance. A real-time gesture recognition system is presented that automatically toggles digital audio effects according to gestural information captured by an accelerometer attached to the body of a guitar. To supplement the several predefined gestures provided by the recognition system, users may train personalized gestures. Upon successful recognition of a gesture, the corresponding audio effects are applied to the guitar signal and visual feedback is provided to the user. An evaluation of the system yielded an average accuracy of 86% for user-independent recognition and 99% for user-dependent recognition.
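The abstract does not specify the recognition algorithm, so the following is only a hypothetical sketch of the pipeline it describes: extracting features from a window of accelerometer samples, matching them against gesture templates (here, a simple nearest-centroid classifier with a rejection threshold), and toggling the mapped audio effect on recognition. The feature set, classifier, threshold, and effect mapping are all illustrative assumptions, not the authors' method.

```python
import math

def features(window):
    """Per-axis mean and standard deviation of a window of (x, y, z) samples.
    Illustrative feature set; the paper does not specify its features."""
    n = len(window)
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        feats.extend([mean, math.sqrt(var)])
    return feats

def classify(feats, templates, threshold=0.5):
    """Nearest-centroid match over gesture templates; reject distant matches.
    Hypothetical classifier, standing in for the system's recognizer."""
    best, best_dist = None, float("inf")
    for label, centroid in templates.items():
        dist = math.dist(feats, centroid)
        if dist < best_dist:
            best, best_dist = label, dist
    return best if best_dist <= threshold else None

class EffectToggler:
    """Toggles digital audio effects on gesture recognition, mirroring the
    behavior described in the abstract (mapping is an assumption)."""

    def __init__(self, mapping):
        self.mapping = mapping   # gesture label -> effect name
        self.active = set()      # currently enabled effects

    def on_gesture(self, label):
        effect = self.mapping.get(label)
        if effect is None:
            return
        if effect in self.active:
            self.active.remove(effect)  # toggle off
        else:
            self.active.add(effect)     # toggle on
```

Personalized gestures would fit this sketch as additional entries in `templates`, e.g. centroids computed from a user's training windows.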