4 papers at NIME 2012

I was involved in no fewer than four papers at this year’s NIME conference in Ann Arbor, Michigan.

K. Nymoen, A. Voldsund, S. A. v. D. Skogstad, A. R. Jensenius, and J. Tørresen.
Comparing motion data from an iPod touch to a high-end optical infrared marker-based motion capture system

The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from the iPod’s internal sensors are compared to data from a high-end optical infrared marker-based motion capture system (Qualisys) in terms of latency, jitter, accuracy and precision. We identify some rotational drift in the iPod, and some time lag between the two systems. Still, the iPod motion data is quite reliable, especially for describing relative motion over a short period of time.
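As a rough illustration of one step in such a comparison, the time lag between two systems can be estimated by cross-correlating their acceleration streams. This is a hypothetical sketch, not the paper’s actual analysis code:

```python
# Hypothetical sketch: estimate the lag (in samples) between a reference
# acceleration stream and a delayed one by brute-force cross-correlation.
# Assumes both streams are already resampled to a common rate.

def estimate_lag(reference, delayed, max_lag):
    """Return the lag (in samples) at which `delayed` best matches `reference`."""
    best_lag, best_score = 0, float("-inf")
    n = min(len(reference), len(delayed))
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                score += reference[i] * delayed[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

In practice one would use an FFT-based correlation over resampled, filtered signals, but the principle is the same: the lag that maximises the correlation is the estimated offset between the two recordings.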

S. A. Skogstad, K. Nymoen, Y. de Quay, and A. R. Jensenius.
Developing the Dance Jockey system for musical interaction with the Xsens MVN suit.

In this paper we present the Dance Jockey System, developed for using a full-body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant postures and actions from the continuous data, and show how these postures and actions can be used to control sonic and musical features. The system has been used in several public performances, and we believe it has great potential for further exploration. However, to overcome the current practical and technical challenges of working with the system, the tools and software need further refinement to facilitate the making of new performance pieces.
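To give an idea of what extracting a discrete posture from continuous data can look like, here is a minimal sketch (not the actual Dance Jockey code) of a threshold-based detector with hysteresis, so that sensor noise near the threshold does not make the state flicker. The heights and thresholds are illustrative assumptions:

```python
# Hypothetical sketch: map a continuous stream of hand-height values (metres)
# to a discrete "raised"/"lowered" posture label, using a hysteresis band.

RAISE_ABOVE = 1.6   # enter "raised" state above this height (illustrative)
LOWER_BELOW = 1.4   # leave "raised" state below this height (illustrative)

def detect_postures(hand_heights):
    """Return one posture label per input sample."""
    state = "lowered"
    labels = []
    for h in hand_heights:
        if state == "lowered" and h > RAISE_ABOVE:
            state = "raised"
        elif state == "raised" and h < LOWER_BELOW:
            state = "lowered"
        labels.append(state)
    return labels
```

A posture transition (e.g. lowered → raised) can then be treated as an *action* event and mapped to a sonic change.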

J. Torresen, Ø. N. Hauback, D. Overholt, and A. R. Jensenius.
Development and evaluation of a ZigFlea-based wireless transceiver board for CUI32.

We present a new wireless transceiver board for the CUI32 sensor interface, aimed at creating a solution that is flexible, reliable, and low in power consumption. Communication with the board is based on the ZigFlea protocol, and it has been evaluated on a CUI32 using the StickOS operating system. Experiments show that the total sensor data collection time increases linearly with the number of sensor samples used. A data rate of 0.8 kbit/s is achieved for wirelessly transmitting three axes of a 3D accelerometer. Although this data rate is low compared to other systems, our solution benefits from ease-of-use and stability, and is useful for applications that are not time-critical.
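As a back-of-the-envelope check, a total data rate like the reported 0.8 kbit/s bounds the per-axis sample rate once a bit width per sample is assumed. The 16-bit sample width below is an assumption for illustration, not taken from the paper:

```python
# Hypothetical sketch: per-axis sample rate that fits in a given total
# data rate. The bits-per-sample value is an assumption; the actual
# payload format of the board may differ.

def max_sample_rate(total_bps, axes, bits_per_sample):
    """Samples per second per axis that fit in `total_bps` bits per second."""
    return total_bps / (axes * bits_per_sample)

# With 16-bit samples on 3 axes, 800 bit/s allows roughly 16.7 Hz per axis,
# which illustrates why such a link suits non-time-critical applications.
rate = max_sample_rate(800, 3, 16)
```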

A. R. Jensenius and A. Voldsund.
The music ball project: Concept, design, development, performance.

We report on the Music Ball Project, a long-term, exploratory project focused on creating novel instruments/controllers with a spherical shape as the common denominator. Besides a simple and attractive geometrical shape, balls afford many different types of use, including play. This has made our music balls popular among widely different groups of people, from toddlers to seniors, including those that would not otherwise engage with a musical instrument. The paper summarises our experience of designing, constructing and using a number of music balls of various sizes and with different types of sound-producing elements.

Mocap workshop in Trondheim

I will participate in a motion capture workshop at the Norwegian University of Science and Technology (NTNU) tomorrow. My contribution will consist of the following:

  • Lecture: Introduction to motion capture (in music analysis and performance)
  • Demo 1: Working with video analysis using the Musical Gestures Toolbox
  • Demo 2: The Xsens MVN BioMech mobile mocap suit
  • Workshop: Analysis and performance with Wii controllers, Phidgets accelerometers and Kinect

Below are various resources.

PDF of the presentation.

Demo 1: Musical Gestures Toolbox

The Musical Gestures Toolbox is a collection of modules and abstractions developed in and for the graphical programming environment Max. The toolbox is currently being developed within the Jamoma open platform for interactive art-based research and performance.

Download: Jamoma + UserLib and Max

The toolbox is probably most useful for people who are already familiar with Max programming. People looking for easier-to-use solutions can check out some of my standalone applications on the fourMs software page. These applications should work on most versions of OS X, as well as on Windows XP. I know that there are various issues with Windows Vista and Windows 7, and I will try to get these problems ironed out as soon as possible.

Demo 2: Xsens

Xsens MVN BioMech is a mobile mocap suit based on inertial sensors (accelerometers, gyroscopes, magnetometers). It excels over camera-based systems in that it is portable and allows for mobile motion capture. I will show how the system can be used both for analysis and performance:

  • Analysis: mocap recordings will be made with the internal Xsens software and exported to C3D files, which will then be imported into Matlab using the MoCap Toolbox.

  • Performance: I will also show how the system can be used in real time, passing data to Max, where the packets are parsed and used to control sound.
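To sketch what “parsing the packets” can involve on the receiving end, the snippet below unpacks a binary payload of position triples into floats before they would be mapped to sound parameters. The packet layout (big-endian float32 x/y/z triples) is purely an assumption for illustration, not the actual Xsens MVN network format:

```python
# Hypothetical sketch: unpack a binary payload of consecutive (x, y, z)
# float32 triples. The big-endian float32 layout is an assumed example
# format, not the real MVN streaming protocol.
import struct

def parse_position_packet(payload):
    """Return a list of (x, y, z) tuples from a raw byte payload."""
    triples = []
    for offset in range(0, len(payload) - 11, 12):
        triples.append(struct.unpack(">fff", payload[offset:offset + 12]))
    return triples
```

In the actual demo this kind of parsing happens inside Max; the sketch only shows the general shape of turning raw bytes into controllable parameters.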


During the hands-on workshop, participants will be able to try out the above-mentioned tools and systems, as well as:

  • Phidgets USB motion sensors.

  • Nintendo Wii controllers, which allow for wireless inertial motion sensing. Several tools are available:

    • OSCulator, a general-purpose tool for reading data from various human interface devices.
    • WiiDataCapture, which records data coming from OSCulator and formats them for easy import into the MoCap Toolbox.
    • junXion, an application for passing on either OSC or MIDI from Wii controllers and other human interface devices.
  • Microsoft Kinect sensor, which allows for inexpensive “3D” motion capture using depth-cameras.

    • Wiki page describing how to work with Kinect in Max