MoCap Data Exchange and the Establishment of a Preliminary Database of Music Performances (MoCap and Video)

Organizers: Isabelle Cossette (CIRMMT, McGill University), Marcelo M. Wanderley

Project Type: Hands-on Workshop during the SMPC 2007 Conference

Time: July 30, 2007, 9:30 am to 4:00 pm.

Location: New Music Building, Room A-832, McGill University, 555 Sherbrooke Street West, Montreal

Registration: Maximum 30 participants, on a first-come, first-served basis.

To register for this workshop (free of charge) or for more information, please contact the organizers directly at isabelle.cossette1[at]mcgill.ca and/or marcelo.wanderley[at]mcgill.ca


Workshop webpage

The workshop webpage contains the schedule and links to the slides of the talks.


Workshop Description

Motion Capture (MoCap) systems have traditionally been used in motor control research (e.g. gait and rehabilitation) and in computer animation (e.g. video games and movies). Thanks to substantial funding from various agencies and to falling technology costs, motion capture systems are becoming increasingly available in music performance research laboratories.

At McGill University alone, around a dozen research laboratories use motion capture systems (at least 3 in a musical context). Ideally, data acquired by one system should be easily readable by software from another manufacturer (or by specific Matlab routines). Although file formats such as C3D theoretically allow for such exchange, in practice it often fails because of differences in software (and also hardware) implementations. Systems often cannot open ancillary data (analog channels, video data) or actually lose information about marker labels and/or body models.
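These incompatibilities often begin at the binary level of the file itself. As an illustration only (not any vendor's implementation), here is a minimal Python sketch that parses the fixed 512-byte C3D header block with the standard-library struct module, assuming Intel (little-endian) byte order; real files may instead use DEC or SGI/MIPS byte orders, which is one source of the exchange problems discussed above. The synthetic header values (41 markers, 120 Hz) are made up for the example.

```python
import struct

def parse_c3d_header(buf):
    """Parse the fixed 512-byte C3D header block (Intel byte order).

    Returns the fields most often misread when moving files between
    vendors' software. Raises if the 0x50 key byte is missing."""
    (param_block, magic, n_points, analog_per_frame,
     first_frame, last_frame, max_gap) = struct.unpack_from('<BBHHHHH', buf, 0)
    if magic != 0x50:
        raise ValueError('not a C3D file (missing 0x50 key byte)')
    scale, = struct.unpack_from('<f', buf, 12)        # negative => float data
    data_start, analog_rate = struct.unpack_from('<HH', buf, 16)
    frame_rate, = struct.unpack_from('<f', buf, 20)
    return {
        'n_points': n_points,       # number of 3D marker trajectories
        'first_frame': first_frame,
        'last_frame': last_frame,
        'scale': scale,
        'frame_rate': frame_rate,
    }

# Build a tiny synthetic header for illustration: 41 markers,
# frames 1..3600 at 120 Hz, floating-point storage (scale < 0).
hdr = bytearray(512)
struct.pack_into('<BBHHHHH', hdr, 0, 2, 0x50, 41, 0, 1, 3600, 10)
struct.pack_into('<f', hdr, 12, -1.0)
struct.pack_into('<HH', hdr, 16, 3, 0)
struct.pack_into('<f', hdr, 20, 120.0)

info = parse_c3d_header(bytes(hdr))
print(info['n_points'], info['frame_rate'])  # 41 120.0
```

Note that marker labels and body-model definitions live in the separate parameter section, not in this header, which is why they are the first things lost when an importer handles only the header and the raw point data.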

MoCap Hands-on Session during the first Workshop. Performer: Kristie Ibrahim

The consequence is that researchers typically capture and analyze their own data with their own systems, so comparisons across studies and the reproduction of results are difficult. Although each scientific experiment is unique, exchanging data among laboratories could at the very least provide a benchmark for analysis tools. At best, it would allow similar experiments to be compared and foster research collaborations across laboratories.

For this workshop, we will ask potential participants to exchange sets of music performance data obtained with different systems before the event. We will verify which exchanges work directly (if any) and identify the bottlenecks in those that do not. The systems we currently use for measurements of full-body, hand, and chest-wall displacements include: Vicon System 460, Vicon MX, BTS Smart, NDI Optotrak, NDI Certus, Phoenix VisualEyez.

A second goal – given that data from various researchers will be available at the workshop – is to discuss a *basic* motion capture methodology that could be useful to many researchers (though, unfortunately, not to all). This includes the number and placement of markers (e.g. using Vicon's Plug-in Gait model, so that the body's center of mass can be calculated directly from the markers without the need for force plates), the position and requirements of the video camera(s) (e.g. a background shot without the performer, for background subtraction during analysis), performer clothing requirements (e.g. Lycra suits to avoid marker movement), etc.

Though this workshop might not produce scientific data immediately, it is a step toward enabling multidisciplinary, multi-level body-measurement collaborations among the various laboratories working with MoCap, which should result in increased productivity.

This workshop is a follow-up to CIRMMT's Workshop on Motion Capture for Music Performance, held at McGill University on October 30-31, 2006. See the News page on this site for more information.


More Information

Data exchange

Download data obtained with different MoCap systems.