Documentation of the NIME project at the Norwegian Academy of Music

From 2007 to 2011 I had a part-time research position at the Norwegian Academy of Music in a project called New Instruments for Musical Exploration (NIME). This project was also the reason I ended up organising the NIME conference in Oslo in 2011.

The NIME project focused on creating an environment for musical innovation at the Norwegian Academy of Music by exploring the design of new physical and electronic instruments. Three people were involved in the project: percussionist/electro-improviser Kjell Tore Innervik, composer Ivar Frounberg, and myself, and we had a great time creating and performing with a number of new instruments.

A slogan for the project was to create instruments “for the many and for the few”. We approached the “for the many” part through the creation of the Oslo Laptop Orchestra and the Oslo Mobile Orchestra, and through a series of music balls. The “for the few” part was targeted at creating specific instruments for professional musicians. Some of these were glass instruments, for which we also carried out historical and analytical studies that were presented at NIME 2010.

As an artistic research project, we were careful to document all the processes we were involved in, and we ended up creating a final series of video documentaries reflecting on the process and the artistic outcomes. Kjell Tore has written more about all of this on his own web page. Here I would like to mention three short documentaries we created, reflecting on the roles of technologist, performer, and composer in the project. Creating these documentaries was in itself an interesting exercise. As an academic researcher, I am used to writing formal research papers about my findings. However, as artistic researchers in the NIME project, we all felt that a discussion-based reflection was more suitable. The documentaries are, unfortunately, only in Norwegian, but we hope to be able to add English subtitles at some point.

Visualisations of a timelapse video

Yesterday I posted a blog entry about my TimeLapser application and how it was used to document my mother's process of making the sculpture Hommage til kaffeselskapene. The final timelapse video looks like this:

Now I have run this timelapse video through my VideoAnalysis application to see what types of analysis material can come out of such a video.

The average image displays a “summary” of the entire video recording, somewhat similar to an “open shutter” in traditional photography. This image makes it possible to see what has, and has not, been moving throughout the entire sequence.

Average image
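
For the curious, the idea behind the average image is simple enough to sketch in a few lines of Python/OpenCV. This is not the actual VideoAnalysis implementation, and the filenames are hypothetical; it simply accumulates the per-pixel mean of all frames:

import cv2
import numpy as np

# Compute the per-pixel mean of all frames: an "open shutter" summary of the video.
cap = cv2.VideoCapture("timelapse.mp4")  # hypothetical filename
acc, count = None, 0
while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = frame.astype(np.float64)
    acc = frame if acc is None else acc + frame
    count += 1
cap.release()

if count > 0:
    average = (acc / count).astype(np.uint8)
    cv2.imwrite("average_image.png", average)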

The motion average image is somewhat similar to the average image, but it summarises the motion images over the entire sequence, that is, only the parts of the image that changed from frame to frame.

Motion average image
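
A rough sketch of the idea, again in Python/OpenCV rather than the actual VideoAnalysis code: the motion image is approximated here as the absolute difference between consecutive greyscale frames, and these difference images are averaged over the whole video.

import cv2
import numpy as np

# Average the motion images (absolute frame differences) over the whole video.
cap = cv2.VideoCapture("timelapse.mp4")  # hypothetical filename
prev, acc, count = None, None, 0
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
    if prev is not None:
        motion = np.abs(gray - prev)  # only the parts of the image that changed
        acc = motion if acc is None else acc + motion
        count += 1
    prev = gray
cap.release()

if count > 0:
    motion_average = cv2.normalize(acc / count, None, 0, 255, cv2.NORM_MINMAX)
    cv2.imwrite("motion_average.png", motion_average.astype(np.uint8))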

What I call a motion history image is the motion average image overlaid on a single frame from the original video. I typically create such motion history images using both the first and the last frame of the video, as can be seen below.

Motion history image, based on first video frame
Motion history image, based on last video frame
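
Conceptually, this is just an alpha blend of the motion average image on top of a chosen frame. A minimal sketch, assuming the two images from the previous steps have been saved to disk with these hypothetical filenames (and have the same resolution):

import cv2

# Blend the motion average image on top of a single video frame.
frame = cv2.imread("first_frame.png")  # first (or last) frame, grabbed beforehand
motion = cv2.imread("motion_average.png", cv2.IMREAD_GRAYSCALE)
motion_bgr = cv2.cvtColor(motion, cv2.COLOR_GRAY2BGR)

history = cv2.addWeighted(frame, 0.5, motion_bgr, 0.5, 0)  # 50/50 overlay
cv2.imwrite("motion_history.png", history)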

Finally, I have also created both horizontal and vertical motiongrams of the timelapse video. The horizontal motiongram displays vertical motion, which in this case is how the sculptor moved back and forth while sitting at the table. The edge of the table can be seen as the “stripe” running through the image.

Horizontal motiongram, displaying vertical motion

The vertical motiongram, on the other hand, displays horizontal motion, that is, how the artist moved sideways throughout the process. Here it is interesting to note the rhythmic swaying, as the sculptor moved from side to side in what seems to be a periodic pattern.

Vertical motiongram, displaying horizontal motion
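
Both motiongrams can be sketched with the same loop: each motion image is collapsed to a single column (row means) for the horizontal motiongram and a single row (column means) for the vertical one, and these are stacked over time. Again, this is a simplified Python/OpenCV illustration with a hypothetical filename, not the VideoAnalysis implementation:

import cv2
import numpy as np

cap = cv2.VideoCapture("timelapse.mp4")  # hypothetical filename
prev, cols, rows = None, [], []
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
    if prev is not None:
        motion = np.abs(gray - prev)
        cols.append(motion.mean(axis=1))  # row means -> one column per frame
        rows.append(motion.mean(axis=0))  # column means -> one row per frame
    prev = gray
cap.release()

horizontal = np.stack(cols, axis=1)  # height x time: displays vertical motion
vertical = np.stack(rows, axis=0)    # time x width: displays horizontal motion
for name, gram in [("horizontal_motiongram.png", horizontal),
                   ("vertical_motiongram.png", vertical)]:
    img = cv2.normalize(gram, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite(name, img)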

I also have some more motion data, which will be interesting to study in more detail in Matlab.

TimeLapser

TimeLapser screenshot

I have recently started moving my development efforts over to GitHub, to keep everything in one place. Now I have also uploaded a small application I developed for a project by my mother, Norwegian sculptor Grete Refsum. She wanted to create a timelapse video of herself making a new sculpture, “Hommage til kaffeselskapene”, for her installation piece Tante Vivi, fange nr. 24 127 Ravensbrück.

There is lots of timelapse software available, but none of it fitted my needs, so I developed a small Max patch called TimeLapser. TimeLapser grabs an image from a webcam at a regular interval (one minute). Each image is saved with the timecode as the filename, making it easy to use the images for documentation purposes or to assemble them into timelapse videos. The application was originally developed for an art project, but can probably be useful for other timelapse tasks as well.
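
The core logic is simple enough to sketch in a few lines. This is not the Max patch itself, just an equivalent Python/OpenCV illustration of the capture loop:

import time
from datetime import datetime
import cv2

INTERVAL = 60  # seconds between captures (the patch uses one minute)

# Grab one webcam frame per interval and save it with a timestamped filename,
# so the files sort chronologically and double as documentation.
cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ret, frame = cap.read()
        if ret:
            filename = datetime.now().strftime("%Y-%m-%d_%H-%M-%S") + ".png"
            cv2.imwrite(filename, frame)
        time.sleep(INTERVAL)
finally:
    cap.release()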

The application only stores separate image files, which can easily be assembled into timelapse movies using, for example, QuickTime.
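
If you prefer a scripted approach to QuickTime, the stills can also be stitched together programmatically. A minimal sketch, assuming the timestamped PNG files sit in the current directory:

import glob
import cv2

# Assemble the saved stills into a timelapse movie at 25 fps.
files = sorted(glob.glob("*.png"))  # timestamped names sort chronologically
height, width = cv2.imread(files[0]).shape[:2]
writer = cv2.VideoWriter("timelapse.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"), 25, (width, height))
for f in files:
    writer.write(cv2.imread(f))
writer.release()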

Below is a video showing the final timelapse of my mother’s sculpture:

Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Title
Kinectofon: Performing with shapes in planes

Abstract
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which, again, are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by “touching” a virtual sound wall.

Reference
Jensenius, A. R. (2013). Kinectofon: Performing with shapes in planes. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX

@inproceedings{Jensenius:2013e,
   Address = {Daejeon, Korea},
   Author = {Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
   Pages = {196--197},
   Title = {Kinectofon: Performing with Shapes in Planes},
   Year = {2013}
}

Kinectofon poster

ImageSonifyer

Earlier this year, before I started as head of department, I was working on a non-realtime implementation of my sonomotiongram technique (a sonomotiongram is a sonic display of motion from a video recording, created by sonifying a motiongram). Now I have finally found some time to wrap it up and make it available as an OS X application called ImageSonifyer. The Max patch is also available, for those who want to look at what is going on.

I am working on a paper that will describe everything in more detail, but the main point can hopefully be understood by looking at some of the videos I have posted in the sonomotiongram playlist on YouTube. In its most basic form, ImageSonifyer works more or less like MetaSynth, sonifying an image. Here is a basic example showing how an image is sonified by being “played” from left to right.
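
To give an idea of what “playing” an image from left to right means here, the sketch below treats each image column as a magnitude spectrum, runs an inverse FFT per column with random phase, and overlap-adds the resulting frames into a sound file. This is a simplified stand-in for what ImageSonifyer does in Max, with hypothetical filenames and parameters:

import cv2
import numpy as np
from scipy.io import wavfile

SR, N = 44100, 2048  # sample rate and FFT size
HOP = N // 4         # overlap-add hop size
BINS = N // 2 + 1    # frequency bins per image column

img = cv2.imread("image.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)
img /= img.max() if img.max() > 0 else 1.0
# Resize so each column holds BINS values; flip so the top of the image maps to high pitch.
spec = cv2.resize(img, (img.shape[1], BINS))[::-1, :]

window = np.hanning(N)
out = np.zeros(HOP * spec.shape[1] + N)
for i in range(spec.shape[1]):
    phase = np.exp(1j * np.random.uniform(0, 2 * np.pi, BINS))  # random phase
    frame = np.fft.irfft(spec[:, i] * phase, n=N)
    out[i * HOP : i * HOP + N] += frame * window

peak = np.abs(out).max()
if peak > 0:
    out /= peak
wavfile.write("sonification.wav", SR, (out * 32767).astype(np.int16))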

But my main idea is to use motiongrams as the source material for the sonification. Here is a sonification of the high-speed guitar recordings I have written about earlier, first played at a rate of 10 seconds:

and then played at a rate of 1 second, which is about the original recording speed.
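
In a sketch like the one above, the playback rate is simply a function of the hop size: a motiongram with W columns lasts W · HOP / SR seconds, so playing the same image over 10 seconds or 1 second only requires scaling the hop accordingly. This is my own simplified illustration, not necessarily how the application handles playback rate:

# Choose a hop size so that an image of a given width plays back in `duration` seconds.
def hop_for_duration(width_in_columns, duration_seconds, sr=44100):
    return max(1, int(duration_seconds * sr / width_in_columns))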