I am currently in beautiful Aldeburgh, a small town on the east coast of England, teaching at the Britten-Pears Young Artist Programme together with Rolf Wallin and Tansy Davies. This post summarises the topics I have been covering and provides links to various resources.

Theoretical stuff

My introductory lectures went through some of the theory of an embodied understanding of the experience of music. One aspect of this theory that I find very relevant for the development of interactive works is what I call action-sound relationships. By this I mean that we have an intuitive understanding of how certain actions may produce certain sounds. This is the cognitive basis for the fact that we can “hear” an action we only see, and “see” the action of a sound we can only hear. These ideas are presented and discussed more thoroughly in my PhD dissertation.

Examples of realtime motion capture

I went through a number of examples of how to use motion capture in musical contexts. Here are a few of them:

Transformation is an improvisation piece for electric violin and live electronics. It is based on the idea of letting the performer control a large collection of sound fragments while moving around on stage. The technical setup for the piece is based on a video-based motion tracking system developed in Jamoma, coupled to the playback of sounds using concatenative synthesis in CataRT. Transformation is described more thoroughly in a paper in the upcoming issue of Computer Music Journal (winter 2012).
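To give a flavour of how concatenative synthesis works, here is a minimal Python sketch of the selection step: each sound fragment ("unit") is annotated with audio descriptors, and playback picks the unit closest to a target point in descriptor space. The corpus, descriptor names, and values below are invented for illustration, and are much simpler than what CataRT actually uses.

```python
import math

# Each unit is a sound fragment described by a few audio features.
# These files and feature values are hypothetical placeholders.
corpus = [
    {"file": "unit01.wav", "pitch": 220.0, "loudness": 0.3},
    {"file": "unit02.wav", "pitch": 440.0, "loudness": 0.7},
    {"file": "unit03.wav", "pitch": 330.0, "loudness": 0.5},
]

def select_unit(target_pitch, target_loudness):
    """Return the unit whose descriptors are nearest the target."""
    def distance(unit):
        return math.hypot(unit["pitch"] - target_pitch,
                          unit["loudness"] - target_loudness)
    return min(corpus, key=distance)

# In a piece like Transformation, the performer's stage position
# could be mapped to the target point in descriptor space:
print(select_unit(400.0, 0.6)["file"])
```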

The SoundSaber motion capture instrument tracks the position of a rod in space using an infrared marker-based motion capture system. The setup is described in more detail in this NIME 2011 paper.
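Motion capture systems typically stream rigid-body data over the network, often as OSC. The following Python sketch (using the python-osc library) shows the general idea of receiving a 3D position and deriving speed, which could then be mapped to a synthesis parameter; the OSC address and port are assumptions for illustration, not the actual SoundSaber configuration.

```python
import time
from pythonosc import dispatcher, osc_server

# Keep the previous frame so we can estimate velocity between frames.
last = {"t": None, "pos": None}

def on_position(address, x, y, z):
    now = time.time()
    if last["pos"] is not None:
        dt = now - last["t"]
        dx, dy, dz = (x - last["pos"][0],
                      y - last["pos"][1],
                      z - last["pos"][2])
        speed = (dx * dx + dy * dy + dz * dz) ** 0.5 / dt
        # Height and speed are typical candidates for sound mapping.
        print(f"height: {y:.2f} m, speed: {speed:.2f} m/s")
    last["t"], last["pos"] = now, (x, y, z)

d = dispatcher.Dispatcher()
d.map("/soundsaber/position", on_position)  # hypothetical address
osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), d).serve_forever()
```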

Dance Jockey is a setup/piece in which the Xsens MVN inertial motion capture suit is used to control both sample playback and synthesis through a combination of posture and action recognition of the performer's full body. It is described in this NIME 2012 paper.
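As a rough illustration of the posture-recognition part, here is a minimal Python sketch that compares incoming joint data against stored posture templates and picks the closest match. The joint names, templates, and values are invented; the real setup works with the full Xsens MVN skeleton and more robust recognition.

```python
# Posture templates: each maps (hypothetical) joint features to
# target values. A real system would use many more joints.
POSTURES = {
    "arms_up":   {"left_hand_y": 1.8, "right_hand_y": 1.8},
    "arms_down": {"left_hand_y": 0.8, "right_hand_y": 0.8},
    "t_pose":    {"left_hand_y": 1.4, "right_hand_y": 1.4},
}

def classify(frame):
    """Return the posture label whose template best matches the frame."""
    def error(template):
        return sum((frame[k] - v) ** 2 for k, v in template.items())
    return min(POSTURES, key=lambda name: error(POSTURES[name]))

# The recognised posture could then trigger a sample or switch
# a synthesis preset:
print(classify({"left_hand_y": 1.75, "right_hand_y": 1.82}))  # arms_up
```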

Technical resources

The Max patches used in the course are available here (Aldeburgh-patches.zip). Here are pointers to other useful things:

  • Maxobjects.com is a database with pointers to a large number of third-party externals for Max. It is a great resource for finding the external you need.
  • Jamoma is a large collection of modules and externals, including video analysis and mapping tools.
  • CNMAT depot is a large collection of externals, tutorials, and example patches.

Wii

The Wii controllers are a great way of getting started with accelerometer data in Max. They are wireless (Bluetooth), and once you figure out the pairing with your computer they work quite well. There are several ways of using them with Max, amongst others:

  • ajh.wiimote is an external for getting data from the Wii controllers into Max.
  • OSCulator is a multipurpose control and mapping tool working with different types of controllers, including the Wii. Even though it takes a little more work to pass OSC messages into Max from a separate application, I find OSCulator the most stable way of working with Wii controllers in Max (see the sketch after this list).
  • Junxion is another multipurpose control and mapping tool, developed at STEIM. It works with lots of controllers and also does video tracking.
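Since OSCulator (and Junxion) forward controller data as OSC, the receiving end is the same in any environment: in Max a [udpreceive] object on the chosen port, elsewhere any OSC library. Here is a minimal Python sketch using the python-osc library, assuming OSCulator's /wii/... address scheme; the exact address and port depend on the routing you set up in OSCulator, so check there first.

```python
from pythonosc import dispatcher, osc_server

def on_accel(address, *values):
    # OSCulator can send pitch/roll/yaw and acceleration values;
    # print them so you can see what arrives on this address.
    print(address, values)

d = dispatcher.Dispatcher()
d.map("/wii/1/accel/pry", on_accel)  # assumed OSCulator routing
osc_server.BlockingOSCUDPServer(("127.0.0.1", 8000), d).serve_forever()
```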

Phidgets

The Phidgets kits are an inexpensive, user-friendly, and soldering-free way of getting started with musical electronics. The kits come with a sensor interface and a number of different types of sensors to test out.

  • First, install the driver so that your computer can talk to the interface.
  • Then use the Phidgets Max externals to get the data into Max (for a route outside Max, see the sketch after this list).
  • Phidgets2MIDI is a small Max application developed for working more easily with data from the Phidgets.
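If you prefer to read the sensors directly rather than through the Max externals, Phidgets also provides plain programming APIs. A minimal sketch using the Phidget22 Python library (a newer API than the one current when this post was written), assuming an analog sensor plugged into input 0 of an InterfaceKit-style board:

```python
from Phidget22.Devices.VoltageRatioInput import VoltageRatioInput

def on_change(ch, voltage_ratio):
    # voltage_ratio is the normalised analog sensor reading.
    print(f"sensor value: {voltage_ratio:.3f}")

ch = VoltageRatioInput()
ch.setChannel(0)  # analog input 0 (assumption for this sketch)
ch.setOnVoltageRatioChangeHandler(on_change)
ch.openWaitForAttachment(5000)  # wait up to 5 s for the device
input("Press Enter to stop...\n")
ch.close()
```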

Kinect

The MS Kinect controller is a great solution for getting started with full-body motion capture.

  • The jit.freenect object lets you grab the video image or depth image from the Kinect into Max. You can then use e.g. the Jamoma video modules or any other Jitter tools in Max.
  • Synapse is a standalone application that tracks a skeleton model from the Kinect. It communicates via OSC, so you need to set things up so that joint data is requested from Synapse and the resulting OSC messages are received in Max (see the sketch after this list).
  • Descriptions of more advanced uses of the Kinect can be found on the fourMs wiki.
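Because Synapse talks plain OSC, the same approach works outside Max too. A rough Python sketch of the handshake: as far as I recall, Synapse listens on port 12346, sends joint data to port 12345, and only streams a joint while you keep re-requesting it every couple of seconds; verify the ports and addresses against the Synapse documentation before relying on them.

```python
import threading
from pythonosc import dispatcher, osc_server
from pythonosc.udp_client import SimpleUDPClient

# Client for sending requests to Synapse (assumed listening port).
client = SimpleUDPClient("127.0.0.1", 12346)

def keep_alive():
    # Ask Synapse to stream the right hand in body-relative
    # coordinates, and re-send the request every 2 seconds,
    # since tracking times out otherwise.
    client.send_message("/righthand_trackjointpos", 1)
    threading.Timer(2.0, keep_alive).start()

def on_joint(address, x, y, z):
    print(f"{address}: {x:.1f} {y:.1f} {z:.1f}")

d = dispatcher.Dispatcher()
d.map("/righthand_pos_body", on_joint)  # assumed Synapse address

keep_alive()
osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), d).serve_forever()
```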