Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Title
Kinectofon: Performing with shapes in planes

Abstract
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which, in turn, are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by “touching” a virtual sound wall.
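
As a rough illustration of the sonification step described in the abstract, the sketch below treats each column of a motiongram as a magnitude spectrum and turns it into audio with an inverse FFT. This is a minimal Python/NumPy sketch of the general principle, not the Max/Jamoma implementation used for the Kinectofon; the frame size, random phases and overlap-add scheme are my own assumptions:

import numpy as np

def sonify_motiongram(motiongram, frame_len=1024, hop=512):
    """Inverse-FFT sonification of a motiongram (rows = image height, columns = time).

    Each column is interpreted as a magnitude spectrum, given random phases,
    transformed to a short audio frame with an inverse FFT, and overlap-added.
    """
    n_rows, n_frames = motiongram.shape
    audio = np.zeros(hop * n_frames + frame_len)
    window = np.hanning(frame_len)
    bins = frame_len // 2 + 1

    for t in range(n_frames):
        # Map the image column onto the FFT bin grid (assumes a linear
        # mapping from image rows to frequency bins).
        mags = np.interp(np.linspace(0.0, n_rows - 1, bins),
                         np.arange(n_rows), motiongram[:, t])
        phases = np.random.uniform(0.0, 2.0 * np.pi, bins)
        frame = np.fft.irfft(mags * np.exp(1j * phases), n=frame_len)
        audio[t * hop:t * hop + frame_len] += window * frame

    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio

Playing the result back at, say, 44.1 kHz gives a rough impression of how shapes in the image turn into spectral shapes in the sound.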

Reference
Jensenius, A. R. (2013). Kinectofon: Performing with shapes in planes. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX

@inproceedings{Jensenius:2013e,
   Address = {Daejeon, Korea},
   Author = {Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
   Pages = {196--197},
   Title = {Kinectofon: Performing with Shapes in Planes},
   Year = {2013}
}

[Kinectofon poster]

New Master Thesis 2: Music Kinection: Musical Sound and Motion in Interactive Systems

Yet another of my master students has graduated recently, and here is a link to his thesis:

Even has carried out a so-called “practical” master thesis, that is, one with an applied rather than purely theoretical focus. He has done a mocap analysis of how people move while playing computer games with a Kinect device, and has also prototyped several mocap instruments.

Abstract:

Sound is often used as a feedback modality in technological devices. Yet relatively little is known about the relation between sound and motion in interactive systems. This thesis examines what happens in the intersection between human-computer interaction, motion and sonic feedback. From the connection of music and motion, coupled by technology, we can draw the expression “Music Kinection”. A theoretical foundation accounts for the relationships that exist between sound and motion, and cognitive foundations for these relationships. This study of literature on music and motion, and music cognition theory, shows that there are many aspects that support various relationships between sound and motion. To see if it is possible to detect similarities between users of an interactive system, a user study was performed with 16 subjects playing commercially available video games for the Kinect platform. Motion capture data was recorded and analyzed. The user study showed that there is an overall similarity in the amount of motion performed by the users, but that there is some deviation in the amount of motion performed by body parts important to the gameplay. Many users will choose the same body part for one task, but will apply different tactics when using this limb. Knowledge from the theory and observation study was used in the practical explorations of sound-action relationships. Two installations, Kinect Piano and the Popsenteret Kinect installation, were made, together with two software prototypes, Soundshape and Music Kinection. The practical study showed that working with full-body motion capture and sound in human-computer interaction is dependent on good motion feature extraction algorithms and good mapping to sound engines.
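
The observation study compares the overall amount of motion across users. As an illustration of the kind of feature such an analysis relies on, the sketch below computes a simple quantity-of-motion curve from marker position data (summed frame-to-frame displacement). This is a generic Python formulation, not necessarily the exact feature used in the thesis:

import numpy as np

def quantity_of_motion(positions, fps=100.0):
    """Simple quantity-of-motion estimate from mocap position data.

    positions: array of shape (frames, markers, 3), in metres.
    Returns the summed marker displacement per second for each frame
    transition, i.e. a rough 'amount of motion' curve over time.
    """
    # Euclidean displacement of every marker between consecutive frames.
    displacement = np.linalg.norm(np.diff(positions, axis=0), axis=2)
    # Sum over all markers and scale by the frame rate to get metres/second.
    return displacement.sum(axis=1) * fps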

KinectRecorder

I am currently working on a paper describing some further exploration of the sonifyer technique and module that I have previously published on. The new thing is that I am now using the inputs from a Kinect device as the source material for the sonification, which also makes it possible to use the depth information in the image as an element in the process.

To be able to create figures for the paper, I needed to record the input from a Kinect to a regular video file. For that reason I have created a small Max patch called KinectRecorder, which allows for easy recording of one combined video file from the two inputs (regular video image and depth image) from the Kinect. As the screenshot below shows, there is not much more to the patch than starting the video input from the Kinect and then starting the recording. Files are stored with MJPEG compression and named with the current date and time.

[Screenshot of the KinectRecorder patch]
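
For readers who prefer text-based tools, the same recording logic (tiling the two Kinect streams into one frame, writing an MJPEG file named with the current date and time) can be sketched with OpenCV in Python. This is only an illustration, not the KinectRecorder patch itself, and get_rgb/get_depth are hypothetical stand-ins for whatever Kinect driver is in use:

import datetime
import cv2
import numpy as np

def record_combined(get_rgb, get_depth, fps=30.0, width=640, height=480):
    """Record Kinect RGB and depth frames side by side in one MJPEG video file.

    get_rgb and get_depth are placeholder callables returning the current
    colour image (height x width x 3, uint8) and depth map
    (height x width, uint16).
    """
    # Name the file with the current date and time, as the KinectRecorder patch does.
    filename = datetime.datetime.now().strftime("kinect_%Y-%m-%d_%H-%M-%S.avi")
    writer = cv2.VideoWriter(filename, cv2.VideoWriter_fourcc(*"MJPG"),
                             fps, (width * 2, height))
    try:
        while True:
            rgb = get_rgb()
            depth = get_depth()
            # Scale the 16-bit depth map to 8 bit and expand to three channels
            # so it can sit next to the RGB image in the same video frame.
            depth8 = cv2.convertScaleAbs(depth, alpha=255.0 / max(int(depth.max()), 1))
            writer.write(np.hstack([rgb, cv2.cvtColor(depth8, cv2.COLOR_GRAY2BGR)]))
    except KeyboardInterrupt:
        pass
    finally:
        writer.release()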

The patch is not particularly fancy, but I imagine it could be useful for other people interested in recording video from the Kinect, either for analytical applications or for testing performance setups at times when a Kinect device is not available. So here it is:

Below is a short video recorded with the patch, showing some basic movement patterns. This video is not particularly interesting in itself, but I can reveal that it actually leads to some interesting sonic results when run through my sonifyer technique. More on that later…

Mocap workshop in Trondheim

I will participate in a motion capture workshop at the Norwegian University of Science and Technology (NTNU) tomorrow. My contribution will consist of the following:

  • Lecture: Introduction to motion capture (in music analysis and performance)
  • Demo 1: Working with video analysis using the Musical Gestures Toolbox
  • Demo 2: The Xsens MVN BioMech mobile mocap suit
  • Workshop: Analysis and performance with Wii controllers, Phidgets accelerometers and Kinect

Below are various resources.

Introduction

PDF of the presentation.

Demo 1: Musical Gestures Toolbox

The Musical Gestures Toolbox is a collection of modules and abstractions developed in and for the graphical programming environment Max. The toolbox is currently being developed within the Jamoma open platform for interactive art-based research and performance.

Download: Jamoma + UserLib and Max

The toolbox is probably most useful for people who are already familiar with Max programming. People looking for easier-to-use solutions can check out some of my standalone applications on the fourMs software page. These applications should work on most versions of OS X, as well as on Windows XP. I know that there are various issues with Windows Vista and Windows 7, and I will try to get these problems ironed out as soon as possible.
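
The core video analysis in the toolbox builds on frame differencing: subtracting consecutive frames gives a motion image, and collapsing the motion images over time gives a motiongram. The sketch below illustrates that principle with OpenCV in Python, as a rough stand-in for the corresponding Max/Jamoma modules (the threshold value is just an example):

import cv2
import numpy as np

def video_to_motiongram(path, threshold=10):
    """Compute a simple motiongram from a video file.

    Consecutive greyscale frames are differenced to get a motion image,
    values below `threshold` are removed as noise, and each motion image
    is collapsed to one column by averaging along its rows. Stacking the
    columns gives an image with vertical position on the y-axis and time
    on the x-axis.
    """
    cap = cv2.VideoCapture(path)
    prev, columns = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
        if prev is not None:
            motion = np.abs(gray - prev).astype(np.uint8)  # motion image
            motion[motion < threshold] = 0                 # simple noise reduction
            columns.append(motion.mean(axis=1))            # one value per image row
        prev = gray
    cap.release()
    return np.column_stack(columns) if columns else np.empty((0, 0))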

Demo 2: Xsens

Xsens MVN BioMech is a mobile mocap suit based on inertial sensors (accelerometers, gyroscopes, magnetometers). It excels over camera-based systems in that it is portable and allows for mobile motion capture. I will show how the system can be used both for analysis and performance:

  • Analysis: mocap recordings will be made with the internal Xsens software and exported to C3D files, which will then be imported into Matlab using the MoCap Toolbox.

  • Performance: I will also show how the system can be used in real time, streaming data to Max, where the packets are parsed and used to control sound.
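
In my setup the parsing and mapping happen in Max. As a language-agnostic illustration of the same step, the sketch below receives OSC messages in Python (using python-osc) and maps a vertical position value to a normalized control value; the address /mocap/position and the 0-2 m range are hypothetical, not the actual Xsens network format:

from pythonosc import dispatcher, osc_server

def handle_position(address, x, y, z):
    # Hypothetical mapping: vertical position 0-2 m -> control value 0-1.
    control = min(max(z / 2.0, 0.0), 1.0)
    print(address, "->", round(control, 3))  # replace with a message to the sound engine

disp = dispatcher.Dispatcher()
disp.map("/mocap/position", handle_position)  # hypothetical OSC address

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), disp)
server.serve_forever()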

Workshop

During the hands-on workshop, participants will be able to try out the above mentioned tools and systems, as well as:

  • Phidgets USB motion sensors.

  • Nintendo Wii controllers, which allow for wireless inertial motion sensing. Several tools are available for working with them:

    • OSCulator, a general-purpose tool for reading data from various human interface devices.
    • WiiDataCapture can record data coming from OSCulator and format it so that it can easily be imported into the MoCap Toolbox (a rough sketch of this kind of OSC logging follows after the list).
    • junXion is an application for passing on either OSC or MIDI from Wii controllers and other human interface devices.
  • Microsoft Kinect sensor, which allows for inexpensive “3D” motion capture using depth-cameras.

    • Wiki page describing how to work with Kinect in Max
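
Returning to the Wii/OSCulator setup mentioned above: as a rough sketch of the kind of OSC logging that WiiDataCapture performs, the following Python script (using python-osc) writes every incoming OSC message to a text file with a timestamp. The port number and file name are placeholders, and OSCulator must be set up to forward OSC to that port:

import time
from pythonosc import dispatcher, osc_server

logfile = open("wii_data.txt", "w")  # placeholder output file

def log_message(address, *args):
    # One timestamped line per OSC message, easy to parse later.
    logfile.write("%.4f %s %s\n" % (time.time(), address, " ".join(str(a) for a in args)))
    logfile.flush()

disp = dispatcher.Dispatcher()
disp.set_default_handler(log_message)  # log every incoming address

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 8000), disp)  # port is an assumption
server.serve_forever()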