Tag Archives: motiongram

Presenting mocapgrams

Earlier today I gave the presentation “Reduced Displays of Multidimensional Motion Capture Data Sets of Musical Performance” at the ESCOM conference in Jyväskylä, Finland. The presentation included an overview of different approaches to the visualization of music-related movement, as well as our most recent method: mocapgrams.

While motiongrams are reduced displays created from video files, mocapgrams are intended to work in a similar way, but are created from motion capture data. The two are conceptually similar, yet quite different in how they are generated: in mocapgrams we map the XYZ coordinates of motion capture markers to RGB colours. The end result thus gives an impression of how the markers moved in 3D space over time, as seen below:

Example of a mocapgram generated from a 3D accelerometer recording. The XYZ values are mapped into an RGB colour space. The bottom image is generated by frame differencing the top one, and therefore shows how the regular mocapgram changes over time.
Still from the video recorded from the piano study.
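
In NumPy terms, the core of this mapping can be sketched roughly as follows. This is only an illustration of the idea, assuming an array of marker positions of shape (n_frames, n_markers, 3), not our actual implementation:

    import numpy as np

    def mocapgram(positions):
        """Map XYZ marker positions to RGB and stack them over time.

        `positions` is assumed to be a NumPy array of shape
        (n_frames, n_markers, 3) holding motion capture data.
        """
        # Normalise each marker's X, Y and Z to [0, 1] so they can be
        # used directly as R, G and B values.
        mins = positions.min(axis=0, keepdims=True)
        maxs = positions.max(axis=0, keepdims=True)
        rgb = (positions - mins) / (maxs - mins + 1e-9)
        # Rows = markers, columns = time, colour = 3D position.
        return np.transpose(rgb, (1, 0, 2))

    # The image can be displayed with e.g. matplotlib:
    # import matplotlib.pyplot as plt
    # plt.imshow(mocapgram(positions), aspect="auto")
    # plt.xlabel("time (frames)"); plt.ylabel("marker"); plt.show()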

Below is an example of two different types of mocapgrams (as well as a motiongram and spectrogram) generated from a motion capture recording of myself playing the piano (recorded at the IDMIL, McGill University).

Different plots from a short piano recording. Mocapgram2 is a frame-difference mocapgram, while mocapgram1 is a regular mocapgram. The motiongram is generated from the video recording, and the spectrogram from the sound.
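
The frame-difference variant (mocapgram2 above) is simply the absolute frame-to-frame difference of the regular mocapgram image; a sketch in the same NumPy setting:

    import numpy as np

    def diff_mocapgram(image):
        """Frame-difference a mocapgram of shape (markers, frames, 3).

        The result highlights where and when the regular mocapgram
        changes, i.e. which markers are moving at which times.
        """
        return np.abs(np.diff(image.astype(float), axis=1))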

No paper has been published based on the presentation, but the presentation PDF summarizes the main idea.

Citation: Jensenius, Alexander Refsum, Ståle Skogstad, Kristian Nymoen, Jim Torresen, and Mats Erling Høvin. “Reduced Displays of Multidimensional Motion Capture Data Sets of Musical Performance.” In Proceedings of the Conference of the European Society for the Cognitive Sciences of Music. Jyväskylä, Finland, 2009.

AudioVideoAnalysis

To allow everyone to watch their own synchronised spectrograms and motiongrams, I have made a small application called AudioVideoAnalysis.

It currently has the following features:

  • Draws a spectrogram from any connected microphone
  • Draws a motiongram/videogram from any connected camera
  • Press the escape key to toggle fullscreen mode

Built with Max/MSP by Cycling ’74 on OS X 10.5. I will probably make a Windows version at some point, but haven’t gotten that far yet.
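
For anyone who wants to experiment outside Max/MSP, a rough Python/OpenCV sketch of the motiongram part is shown below. This is only an illustration of the general idea (frame-difference the video and collapse each difference image into a single column), not the actual patch; the spectrogram half could be handled similarly with, for example, scipy.signal.spectrogram.

    import cv2
    import numpy as np

    def motiongram_from_camera(n_frames=300, device=0):
        """Build a motiongram from a connected camera.

        Each incoming frame is differenced against the previous one and
        averaged across its width, so every frame contributes one column
        to an image where time runs along the horizontal axis.
        """
        cap = cv2.VideoCapture(device)
        ok, prev = cap.read()
        if not ok:
            raise RuntimeError("Could not read from camera")
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        columns = []
        for _ in range(n_frames):
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            motion = cv2.absdiff(gray, prev)      # motion image
            columns.append(motion.mean(axis=1))   # collapse width -> one column
            prev = gray
        cap.release()
        return np.stack(columns, axis=1)

    # gram = motiongram_from_camera()
    # cv2.imwrite("motiongram.png", (255 * gram / max(gram.max(), 1e-6)).astype("uint8"))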

A snapshot of the main interface:

The main window of the AudioVideoAnalysis application

Fullscreen mode can be toggled with the escape key:

Fullscreen mode in the AudioVideoAnalysis application

There are, obviously, lots of things that can and will be improved in future versions. Please let me know of any problems you experience with the application, and if there is anything in particular you think should be included.

Sonification of Traveling Landscapes

I just heard a talk called “Real-Time Synaesthetic Sonification of Traveling Landscapes” (PDF) by Tim Pohle and Peter Knees from the Department of Computational Perception (great name!) in Linz. They have made an application that creates music from a moving video camera. The implementation is based on grabbing a one-pixel-wide column from each video frame, plotting these columns next to each other, and sonifying the resulting image. Interestingly enough, the images they get out of this (see below) are very close to the motiongrams and videograms I have been working on.
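
As an aside, the core of such a pipeline is easy to sketch: stack the one-pixel columns into an image and let each image row drive a sine oscillator, with pixel brightness as amplitude. The snippet below is my own illustration of that idea (assuming NumPy and the soundfile package), not Pohle and Knees’ implementation:

    import numpy as np
    import soundfile as sf

    def sonify_columns(columns, sr=44100, col_dur=0.05,
                       f_lo=100.0, f_hi=4000.0):
        """Additive synthesis from stacked greyscale pixel columns.

        `columns` has shape (height, n_frames) with values 0..255.
        Each row is mapped to one sine oscillator (top of the image =
        highest frequency); pixel brightness controls its amplitude.
        """
        n_rows, n_cols = columns.shape
        freqs = np.geomspace(f_hi, f_lo, n_rows)
        n = int(col_dur * sr)                      # samples per column
        t = np.arange(n_cols * n) / sr
        amps = np.repeat(columns / 255.0, n, axis=1)
        audio = np.sum(amps * np.sin(2 * np.pi * freqs[:, None] * t), axis=0)
        return audio / (np.abs(audio).max() + 1e-9)

    # image = stacked one-pixel-wide columns, shape (height, n_frames)
    # sf.write("landscape.wav", sonify_columns(image), 44100)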
