Visualising a Bach prelude played on Boomwhackers

I came across a fantastic performance of a Bach prelude played on Boomwhackers by Les Objets Volants.

It is really incredible how they manage to coordinate the sticks and make it into a beautiful performance. Given my interest in the visual aspects of music performance, I reached for the Musical Gestures Toolbox to create some video visualisations.

I started by creating an average image of the video:

Average image of the video.
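For reference, an average image is simply a pixel-wise mean over all frames. Here is a minimal sketch with OpenCV and NumPy, assuming a placeholder filename (this is not the toolbox code itself):

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("boomwhackers.mp4")  # placeholder filename
    acc, n = None, 0
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        frame = frame.astype(np.float64)
        acc = frame if acc is None else acc + frame
        n += 1
    cap.release()

    # Pixel-wise mean over all frames
    average = (acc / n).astype(np.uint8)
    cv2.imwrite("average.png", average)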

The average image is not particularly interesting. The performers moved around quite a bit, so it mainly shows the stage. An alternative spatial summary is a keyframe history image, created by extracting the keyframes of the video (approximately 50 frames) and combining them into one image:

Keyframe history image.
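A rough approximation of the idea: the toolbox extracts the actual codec keyframes, but sampling around 50 evenly spaced frames and blending them gives a similar result. The even sampling is my simplification, and the filename is a placeholder:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("boomwhackers.mp4")  # placeholder filename
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))

    frames = []
    for idx in np.linspace(0, total - 1, 50, dtype=int):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(idx))
        ret, frame = cap.read()
        if ret:
            frames.append(frame.astype(np.float64))
    cap.release()

    # Blend the sampled frames into one history image
    history = np.mean(frames, axis=0).astype(np.uint8)
    cv2.imwrite("keyframe_history.png", history)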

The keyframe history image summarizes how the performers moved around on stage and explains the spatial distribution of activity over time. But to get more into the temporal distribution of motion, we need to look at a spatiotemporal visualization. This is where motiongrams are useful:

Motiongram of vertical motion (time from left to right)
Motiongram of horizontal motion (time from top to bottom)

If you click on the images above, you can zoom in to look at the visual beauty of the performance.
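For those who want to try this themselves, here is a minimal motiongram sketch based on frame differencing and row/column averaging. This is my own simplification of what the Musical Gestures Toolbox does, and the filename is a placeholder:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("boomwhackers.mp4")  # placeholder filename
    prev, rows, cols = None, [], []
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
        if prev is not None:
            motion = np.abs(gray - prev)      # frame difference = motion image
            rows.append(motion.mean(axis=1))  # row averages -> vertical motion
            cols.append(motion.mean(axis=0))  # column averages -> horizontal motion
        prev = gray
    cap.release()

    # Horizontal motiongram: time left to right, shows vertical motion
    horizontal_mg = np.stack(rows, axis=1)
    # Vertical motiongram: time top to bottom, shows horizontal motion
    vertical_mg = np.stack(cols, axis=0)

    for name, mg in [("mg_vertical_motion.png", horizontal_mg),
                     ("mg_horizontal_motion.png", vertical_mg)]:
        cv2.imwrite(name, (255 * mg / (mg.max() + 1e-9)).astype(np.uint8))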

Analyzing a double stroke drum roll

Yesterday, PhD fellow Mojtaba Karbassi presented his research on impedance control in robotic drumming at RITMO. I will surely get back to discussing more of his research later. Today, I wanted to share the analysis of one of the videos he showed. Mojtaba is working on developing a robot that can play a double stroke drum roll. To explain what this is, he showed this video he had found online, made by John Wooton:

The double stroke roll is a standard technique for drummers, but not everyone manages to perform it as evenly as in this example. I was eager to have a look at the actions in a little more detail. We are currently beta-testing the next release of the Musical Gestures Toolbox for Python, so I thought this video would be a nice test case.

Motion video

I started the analysis by extracting the part of the video where he is showing the complete drum roll. Next, I generated a motion video of this segment:

This is already fascinating to look at. Since the background is removed, only the motion is visible. Obviously, the frame rate of the video cannot capture the speed at which he plays. I was therefore curious about the level of detail I could achieve in the further analysis.
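A motion video boils down to subtracting consecutive frames so that the static background cancels out. A minimal sketch, with a placeholder filename and a guessed noise threshold:

    import cv2

    cap = cv2.VideoCapture("drumroll_segment.mp4")  # placeholder filename
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter("motion.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    prev = None
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        if prev is not None:
            motion = cv2.absdiff(frame, prev)  # static background cancels out
            motion[motion < 10] = 0            # noise threshold (value is a guess)
            out.write(motion)
        prev = frame
    cap.release()
    out.release()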

Audio visualization

Before delving into the visualization of the video file, I made a spectrogram of the sound:

If you are used to looking at spectrograms, you can quite clearly see the change in frequency as the drummer is speeding up and then slowing down again. However, a tempogram of the audio is even clearer:

Here you can really see the change in both the frequency and the onset strength. The audio is sampled at a much higher rate (44.1 kHz) than the video (25 fps). Is it possible to see some of the same effects in the motion?
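Both plots are straightforward to reproduce with librosa. A minimal sketch, assuming a placeholder filename:

    import numpy as np
    import librosa
    import librosa.display
    import matplotlib.pyplot as plt

    y, sr = librosa.load("drumroll.wav", sr=44100)  # placeholder filename

    fig, ax = plt.subplots(2, 1, figsize=(10, 6))

    # Spectrogram (log-frequency axis)
    S = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
    librosa.display.specshow(S, sr=sr, x_axis="time", y_axis="log", ax=ax[0])
    ax[0].set_title("Spectrogram")

    # Tempogram computed from the onset strength envelope
    oenv = librosa.onset.onset_strength(y=y, sr=sr)
    tgram = librosa.feature.tempogram(onset_envelope=oenv, sr=sr)
    librosa.display.specshow(tgram, sr=sr, x_axis="time", y_axis="tempo", ax=ax[1])
    ax[1].set_title("Tempogram")

    plt.tight_layout()
    plt.savefig("audio_visualization.png")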

Motiongrams

I then moved on to create a motiongram of the video:

There are two problems with this motiongram. First, the recording is composed of alternating shots from two different camera angles, and these shot changes can clearly be seen in the motiongram (marked Camera 1 and 2). Second, this horizontal motiongram only reveals the vertical motion in the video image. Since averaging over each row collapses the horizontal dimension, the left-hand and right-hand motion end up superimposed. For such a recording, it is therefore more relevant to look at the vertical motiongram, which shows the horizontal motion:

In this motiongram, we can more clearly see the patterns of each hand. Still, we have the problem of the alternating shots. If we “zoom” in on the part called Camera 2b, it is possible to see the evenness of the motion in the most rapid part:

I also find it fascinating to “zoom” in on the part called Camera 2c, which shows the gradual slow-down of motion:

Finally, let us consider the slowest part of the drum roll (Camera 1d):

Here it is possible to see the beauty of the double strokes very clearly.
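As a side note, the camera cuts that complicate the motiongrams above could be located automatically, since a cut shows up as a spike in the global frame difference. A minimal sketch, with a placeholder filename and an untested threshold:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("drumroll.mp4")  # placeholder filename
    prev, diffs = None, []
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
        if prev is not None:
            diffs.append(np.abs(gray - prev).mean())  # global motion per frame
        prev = gray
    cap.release()

    diffs = np.array(diffs)
    cuts = np.flatnonzero(diffs > diffs.mean() + 5 * diffs.std())  # threshold is a guess
    print("Probable shot changes at frames:", cuts + 1)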

Visualizing some videos from the AIST Dance Video Database

Researchers from AIST have released an open database of dance videos, and I got very excited to try out some visualization methods on some of the files. This was also a good chance to test out some new functionality in the Musical Gestures Toolbox for Matlab that we are developing at RITMO. The AIST collection contains a number of videos. I selected one hip-hop dance video based on a very steady rhythmic pattern, and a contemporary dance video that is more fluid in both motion and music.

Hip-hop dance

I have looked at a couple of different files. Let us start with this one:

We can start by looking at the motion video. While motion videos give less information about context, I often find them interesting to study since they reveal the essentials of what is going on.

And from the motion video we can look at the motiongrams and average image:

The horizontal motiongram reveals the repetitiveness of the dance motion, but also some of the variation throughout the different parts. I also really like the “bump” in the vertical motiongram, which is caused by the couple of side-steps he does midway through the session. The “line” that runs throughout the horizontal motiongram is caused by the cable at the back of the video.

Contemporary dance

And then I looked at another video, with a very different character:

From this we get the following motion video (wait a few seconds, since there is no dance in the beginning…):

The average image and motiongrams from this video reveal the spatial distribution of the dancer’s motion on stage. It is also possible to see an artifact of the video compression algorithm at the beginning of the motiongrams.

I really look forward to continuing to explore this wonderful new and open database. Thanks to the AIST researchers for sharing!

Motiongram of high-speed violin bowing

I came across a high-speed recording of bowing on a violin string today and thought it would be interesting to analyze it with the new version of the Musical Gestures Toolbox for Python. This was inspired by the motiongrams of a high-speed guitar recording that I made some years ago.

Here is the original video:

From this I generated the following motion video:

And from this we get the following motiongram showing the vertical motion of the string (time running from left to right):

This motiongram shows the horizontal motion of the string (time running downwards):

Great example of a sound-producing action!

New publication: Non-Realtime Sonification of Motiongrams

Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works.

Title
Non-Realtime Sonification of Motiongrams

Abstract
The paper presents a non-realtime implementation of the sonomotiongram method, a method for the sonification of motiongrams. Motiongrams are spatiotemporal displays of motion from video recordings, based on frame-differencing and reduction of the original video recording. The sonomotiongram implementation presented in this paper is based on turning these visual displays of motion into sound using FFT filtering of noise sources. The paper presents the application ImageSonifyer, accompanied by video examples showing the possibilities of the sonomotiongram method for both analytic and creative applications.
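To give an impression of the method, here is a minimal sonomotiongram-style sketch. This is my own interpretation rather than the ImageSonifyer code: each motiongram column serves as a spectral envelope that filters a windowed noise burst, and the bursts are overlap-added into an audio signal. Filenames and FFT settings are assumptions:

    import numpy as np
    import cv2
    from scipy.io import wavfile

    # Read a motiongram image; rows map to frequency, columns to time
    mg = cv2.imread("motiongram.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
    mg /= mg.max() + 1e-9

    sr, n_fft, hop = 44100, 2048, 512
    n_bins = n_fft // 2 + 1
    # Flip so low image rows become low frequencies, and resize rows to FFT bins
    env = cv2.resize(np.ascontiguousarray(np.flipud(mg)), (mg.shape[1], n_bins))

    out = np.zeros(hop * env.shape[1] + n_fft)
    window = np.hanning(n_fft)
    rng = np.random.default_rng(0)
    for t in range(env.shape[1]):
        # Filter a noise burst with the motiongram column, then overlap-add
        spec = np.fft.rfft(rng.standard_normal(n_fft) * window) * env[:, t]
        out[t * hop : t * hop + n_fft] += np.fft.irfft(spec) * window

    out /= np.abs(out).max() + 1e-9
    wavfile.write("sonomotiongram.wav", sr, (out * 32767).astype(np.int16))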

Reference
Jensenius, A. R. (2013). Non-realtime sonification of motiongrams. In Proceedings of Sound and Music Computing, pages 500–505, Stockholm.

BibTeX

@inproceedings{Jensenius:2013f,
  Address = {Stockholm},
  Author = {Jensenius, Alexander Refsum},
  Booktitle = {Proceedings of Sound and Music Computing},
  Pages = {500--505},
  Title = {Non-Realtime Sonification of Motiongrams},
  Year = {2013}}