Xsens Performance: Playing Music by The Rules

About my performance using a motion capture suit. Presented at the Interactive Media Arts Conference (IMAC) in Denmark, 2011.


Xsens Performance: Playing Music by the Rules

Yago de Quay
Faculty of Engineering
University of Porto
Rua Dr. Roberto Frias, s/n
4200-465 Porto, PORTUGAL
+351 966 089 603
[email protected]

Ståle Skogstad
fourMs group - Music, Mind, Motion, Machines
Department of Informatics
University of Oslo, NORWAY
+47 228 524 10
savskogs@ifi.uio.no

ABSTRACT
Recent studies suggest that current interactive music results could be improved by structuring its practice. This paper explores the impact of standardized motion capture and software architecture on knowledge transfer and efficiency, illustrated by a computer-based musical performance in which sounds are controlled by sensors on the dancer's body. It concludes that these factors help refocus attention on the artistic mission and improve communication between users.

Topic and Subject Descriptors
H.5.5 [Interfaces for Dance and Physical Expression]: Sound and Music Computing – methodologies and techniques, signal analysis, synthesis and processing.

Keywords
Motion Capture, Sonic Interaction, Xsens, Jamoma, Music Performance, Musical Controller

1. INTRODUCTION
Motion Capture (MoCap) is commonly defined as the process of recording human motion in a digital format. The most common MoCap technologies applied in entertainment are optical, relying on computer vision techniques; mechanical, flex sensors on limb joints; and magnetic, magnetic receivers positioned on the subject's body. However, applications are mostly limited to the film industry, the army and medicine [1-3].

Marshall and Wanderley [4] surveyed the MoCap interfaces for computer music submitted between 2001 and 2004 to the New Interfaces for Musical Expression (NIME) conference, which seeks to design alternative interfaces for musical performance. They found that the most popular sensors were accelerometers and force-sensing resistors.

The above papers expose a gap between the MoCap methods practiced by the entertainment industry and those used by academia and artists: the former employ industry standards while the latter develop idiosyncratic interfaces [5].

Lack of standardization is also apparent in the software architecture used for most computer-based live performances. Max/MSP is a visual programming language that has become a popular performance tool for artists [6]. However, there are no conventions on how patches (i.e. programs) should be internally structured, making it difficult to share high-level patches [7, 8].

This paper proposes a musical performance that explores implementing two standardized systems: the Xsens MVN MoCap technology and the Jamoma framework. Together they provide real-time information about a dancer's movements and trigger a broad range of musical events.

2. METHODOLOGY

2.1 Motion Capture
Xsens MVN is a commercial 3D motion tracking product used mainly by the army, industry, medicine and film. It consists of 17 inertial sensors and an application for recording and exporting data from the sensors. Each sensor encapsulates an accelerometer, a gyroscope and a magnetometer. When worn by the dancer, the sensors provide a detailed 3D virtual representation of the body as well as its orientation [9]. The advantages of this system over other MoCap systems are: 1) it is relatively lightweight and portable (suit: 1.9 kg, full system: 11 kg); 2) quick setup time (5-15 min); 3) communication with 3rd-party programs (UDP protocol); and 4) wireless operation (Bluetooth).

The network stream from the Xsens suit sends information about 23 body segments at a rate of 120 Hz, a total of 138 floating-point numbers per frame. Furthermore, the data can be processed over time to derive other properties such as jerk, acceleration, velocity and quantity of motion. To this end, a dedicated C++ Real Time Motion Capture Toolbox was developed to transform these numbers into usable values.

Body poses, thresholds and continuous movement were the three types of data used to trigger and manipulate sound. Body poses were described by the following Xsens parameters: left hand height, left arm angle, right hand height and right arm angle. Whenever these relative values matched a body pose description in the Real Time Motion Capture Toolbox, the match was sent to Max/MSP. By approximation, the software continuously compared the current pose with the stored ones, so we constantly received the identifier of the stored body pose that best described the subject's current pose. These poses, 8 in total, were used to trigger musical events and transitions between sections in the song.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
re-new'11, May 17-22, 2011, Copenhagen, Denmark.
Copyright 2011 IMAC
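The best-match pose detection described in Section 2.1 can be sketched as a nearest-neighbour lookup over the four Xsens parameters. The sketch below is a minimal illustration only: the pose names, parameter values and unit scaling are invented here, and the actual Real Time Motion Capture Toolbox is a C++ program, not Python.

```python
import math

# Hypothetical stored pose descriptions, each a tuple of
# (left hand height, left arm angle, right hand height, right arm angle).
# Heights in metres, angles in degrees; all values invented for illustration.
STORED_POSES = {
    "arms_down":   (0.8, 10.0, 0.8, 10.0),
    "arms_up":     (1.9, 170.0, 1.9, 170.0),
    "t_pose":      (1.4, 90.0, 1.4, 90.0),
    "left_raised": (1.9, 170.0, 0.8, 10.0),
}

def closest_pose(current, poses=STORED_POSES):
    """Return the identifier of the stored pose nearest to `current`.

    The angle terms are scaled down so heights and angles contribute
    comparably to the distance -- an arbitrary choice for this sketch.
    """
    def distance(a, b):
        return math.sqrt(
            (a[0] - b[0]) ** 2 + ((a[1] - b[1]) / 100.0) ** 2
            + (a[2] - b[2]) ** 2 + ((a[3] - b[3]) / 100.0) ** 2
        )
    return min(poses, key=lambda name: distance(current, poses[name]))
```

Because the lookup always returns the nearest stored pose, the system emits a best-match identifier on every frame, which mirrors the "by approximation" behaviour the paper describes; for example, `closest_pose((1.85, 160.0, 1.8, 165.0))` yields `"arms_up"`.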
2.2 Interaction Platform
Jamoma is an open-source modular framework for patching in Max/MSP. It enforces consistency in the patch without placing strong restrictions on developers. This project developed three new Jamoma modules—cues, mappings and transitions—borrowed from the finite-state machine. As illustrated in Figure 1, a performance sequences different scripts beforehand: short thematic compositions that in turn harbour various cues. Each cue determines: 1) mappings, the links between Xsens data and sound parameters; and 2) a transition, what is required to change to the next cue.

[Figure 1 shows a performance composed of Scripts 1-3, each containing cues 1-3 linked by mappings and transitions.]
Figure 1. Performance setup

2.3 Sound Manipulation
Sounds for the piece encompassed music compositions and sound effects. The audio engines used were Ableton Live 8 and Reason 4, chosen for their easy MIDI mapping and robust live performance capabilities. This audio software had the task of following instructions (i.e. MIDI messages) from Max. Instructions could be notes, continuous control values (e.g. to alter filter frequencies) or triggers (e.g. for audio clips). These messages were sent through LoopBe30, a software driver that makes it possible to exchange MIDI between programs. MIDI notes were sent to Reason 4 synthesizers and samplers, and to Ableton Live 8 to start and stop sound clips. Control values were used to turn tracks in Ableton Live 8 on and off and to manipulate faders. In order to provide a solid and reliable performance, the sonic piece was through-composed, that is, no cues or sections were repeated.

3. RESULTS
The performance was staged five times in Oslo, Norway and Porto, Portugal between September 2010 and April 2011, on stages ranging from formal concerts to night clubs. The overall reaction has been positive. All performances presented no problems except the third, which had some communication drops because of the distance between the Xsens suit and the Bluetooth receiver. However, the audience did not perceive any issues.

4. DISCUSSION
The aim of this project was to examine how the use of a commercial MoCap system and a standardized modular framework affected interactive musical performances. Controller interfaces and software design differ hugely across computer-based arts but, to our knowledge, this is the first study in the field of interactive musical performance to implement a commercial full-body inertial MoCap technology and a standardized software architecture.

There is evidence in the literature of the limited application of standard MoCap technologies, and of the inconsistency of Max/MSP patches in academia and the performative arts. The consequences of not implementing standards are twofold: 1) low exchange of high-level knowledge and techniques between users; 2) excessive focus on sensor development. These two issues might partially explain why interactive music work has remained out of reach of the general public [5, 10, 11].

5. CONCLUSION
Our exploratory musical performance suggests that using standardized MoCap methods and software architecture helps build strong development communities and enables more time to be spent on artistic decisions. However, further work is needed to extend and replicate these findings, and to understand how standardization can help usher interactive music into mainstream entertainment.

6. REFERENCES
[1] Furniss, M. Motion Capture. http://web.mit.edu/comm-forum/papers/furniss.html. Accessed on Jan 10, 2011.
[2] Skogstad, S. A. v. D., Jensenius, A. R. and Nymoen, K. Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction. The University of Oslo, Oslo, 2010.
[3] Kitagawa, M. and Windsor, B. MoCap for Artists: Workflow and Techniques for Motion Capture. Focal Press, Burlington, MA, 2008.
[4] Marshall, M. and Wanderley, M. Evaluation of Sensors as Input Devices for Computer Music Interfaces. Springer Berlin Heidelberg, 2006.
[5] Salter, C., Baalman, M. and Moody-Grigsby, D. Between Mapping, Sonification and Composition: Responsive Audio Environments in Live Performance. Springer Berlin / Heidelberg, 2008.
[6] Wikipedia. Max (software). http://en.wikipedia.org/wiki/Max_(software). Accessed on Jan 13, 2011.
[7] Place, T. and Lossius, T. Jamoma: A modular standard for structuring patches in Max. In Proceedings of the 2006 International Computer Music Conference, Geneva, 2006.
[8] Zicarelli, D. How I Learned to Love a Program That Does Nothing. Comput. Music J., 26(4), 44-51, 2002.
[9] Xsens. Xsens MVN - Inertial Motion Capture. http://www.xsens.com/. Accessed on Dec 26, 2010.
[10] Bertini, G., Magrini, M. and Tarabella, L. An Interactive Musical Exhibit Based on Infrared Sensors. Springer Berlin / Heidelberg, 2006.
[11] Lee, E., Nakra, T. M. and Borchers, J. You're the conductor: a realistic interactive conducting system for children. In Proceedings of the 2004 conference on New Interfaces for Musical Expression (Hamamatsu, Shizuoka, Japan, 2004). National University of Singapore.
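As a closing illustration, the cue sequencing of Section 2.2 behaves like a small finite-state machine: each cue maps incoming pose identifiers to musical events and names the condition for advancing to the next cue. The sketch below is hypothetical; cue names, trigger poses and events are invented, and the real cues, mappings and transitions modules are Jamoma patches in Max/MSP, not Python.

```python
# Hypothetical finite-state-machine sketch of the cue logic from
# Section 2.2: a script holds an ordered list of cues; each cue carries
# a mapping from pose identifiers to musical events, plus the pose that
# triggers the transition to the next cue. All names are invented.

class Cue:
    def __init__(self, name, mappings, transition_pose):
        self.name = name
        self.mappings = mappings                # pose id -> musical event
        self.transition_pose = transition_pose  # pose that advances the script

class Script:
    def __init__(self, cues):
        self.cues = cues
        self.index = 0

    @property
    def current(self):
        return self.cues[self.index]

    def handle_pose(self, pose):
        """Return the musical event for `pose`, advancing on a transition."""
        cue = self.current
        if pose == cue.transition_pose and self.index < len(self.cues) - 1:
            self.index += 1
            return f"transition:{self.current.name}"
        return cue.mappings.get(pose, "no-op")

# A two-cue script: the same pose produces different events per cue,
# and the transition pose moves the performance to the next cue.
script = Script([
    Cue("intro",  {"arms_up": "play_clip_1"}, transition_pose="t_pose"),
    Cue("chorus", {"arms_up": "play_clip_2"}, transition_pose="arms_down"),
])
```

One design point this makes concrete: because mappings are scoped to cues, the same body pose can trigger different sounds in different sections of the song, which is what lets eight stored poses drive an entire through-composed piece.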
