NIME publication and performance: Vrengt

My PhD student Cagri Erdem developed a performance together with dancer Katja Henriksen Schia. The piece was first performed with Qichao Lan and me during the RITMO opening, and again during MusicLab vol. 3. This week Cagri, Katja, and I performed a version of the piece, Vrengt, at NIME in Porto Alegre. We also presented a paper describing the development of the instrument/piece:...

June 6, 2019 · 2 min · 262 words · ARJ

Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Title: Kinectofon: Performing with shapes in planes
Links: Paper (PDF) · Poster (PDF) · Software · Videos (coming soon)

Abstract: The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device....
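To give a flavour of how a depth stream can support this kind of free-hand interaction, here is a minimal Python sketch that gates a frame-difference motion image with a depth band, so that only objects within the interaction zone (e.g. the performer's hands) contribute motion energy. The function name, the depth thresholds, and the assumption that frames arrive as NumPy arrays from some Kinect driver are all illustrative, not the Kinectofon's actual code.

```python
import numpy as np

def depth_gated_motion(rgb, prev_rgb, depth, near_mm=500, far_mm=1200):
    """Motion image gated by a depth band (hypothetical parameters).

    rgb, prev_rgb: (H, W, 3) uint8 frames; depth: (H, W) uint16 in mm.
    Pixels outside [near_mm, far_mm] are zeroed, so only objects in the
    interaction zone produce motion energy.
    """
    gray = rgb.mean(axis=2)
    prev_gray = prev_rgb.mean(axis=2)
    motion = np.abs(gray - prev_gray)            # frame difference
    mask = (depth > near_mm) & (depth < far_mm)  # keep the hand region only
    motion[~mask] = 0.0
    motion[motion < 10] = 0.0                    # crude noise threshold
    return motion
```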

May 28, 2013 · 1 min · 193 words · ARJ

ImageSonifyer

Earlier this year, before I started as head of department, I was working on a non-realtime implementation of my sonomotiongram technique (a sonomotiongram is a sonic display of motion from a video recording, created by sonifying a motiongram). Now I have finally found some time to wrap it up and make it available as an OS X application called ImageSonifyer. The Max patch is also available, for those who want to look at what is going on....
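To illustrate what sonifying an image can look like in code, here is a minimal non-realtime sketch in Python (NumPy and Pillow) that reads a greyscale image as if it were a spectrogram: rows are mapped to sinusoid frequencies, and pixel brightness along each row becomes that sinusoid's amplitude envelope. The parameter choices (log-spaced frequencies, duration, frequency range) are illustrative assumptions, not ImageSonifyer's internals.

```python
import numpy as np
from PIL import Image

def sonify_image(path, duration=5.0, sr=44100, fmin=50.0, fmax=8000.0):
    """Offline image sonification sketch: read an image as a spectrogram."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float) / 255.0
    n_rows, n_cols = img.shape
    n = int(duration * sr)
    t = np.arange(n) / sr
    # Log-spaced frequencies; the top image row gets the highest frequency,
    # as in a spectrogram display.
    freqs = np.geomspace(fmin, fmax, n_rows)[::-1]
    # Stretch each row's brightness curve into an amplitude envelope.
    env_x = np.linspace(0, n_cols - 1, n)
    out = np.zeros(n)
    for row, f in zip(img, freqs):
        env = np.interp(env_x, np.arange(n_cols), row)
        out += env * np.sin(2 * np.pi * f * t)
    return out / max(np.abs(out).max(), 1e-9)  # normalize to [-1, 1]
```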

April 6, 2013 · 1 min · 200 words · ARJ

Record videos of sonification

I got a question the other day about how to record a sonified video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper at this year’s ACHI conference about the technique. It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max....
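Outside of Max, a similar result can be approximated by recording the video and the sonified audio to separate files and muxing them afterwards. A small Python sketch using ffmpeg (assumed to be installed; the file names are hypothetical):

```python
import subprocess

def mux(video_path, audio_path, out_path):
    """Combine a silent video file and a separately recorded audio file.

    A non-Max alternative to jit.vcr: ffmpeg muxes the two streams into
    one movie without re-encoding the video.
    """
    subprocess.run([
        "ffmpeg", "-y",
        "-i", video_path,  # original (silent) video
        "-i", audio_path,  # sonified audio track
        "-c:v", "copy",    # keep the video stream as-is
        "-c:a", "aac",     # encode audio for the container
        "-shortest",       # stop at the shorter stream
        out_path,
    ], check=True)

mux("motiongram.mov", "sonification.wav", "sonified.mp4")
```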

June 25, 2012 · 1 min · 159 words · ARJ

Sonification of motiongrams

A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper. See below for the full paper and video examples.

The Paper: Download paper (PDF 2MB)

Abstract: The paper presents a method for sonification of human body motion based on motiongrams....
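To make the motiongram step concrete, here is a minimal Python/OpenCV sketch along the lines the paper describes: frame differencing produces a motion image, each motion image is averaged across its width into a one-pixel-wide strip, and the strips are stacked over time. A sketch of the general idea, not the Jamoma implementation:

```python
import cv2
import numpy as np

def motiongram(video_path):
    """Build a motiongram from a video file.

    Each frame is differenced against the previous one to get a motion
    image; the motion image is averaged across its columns, leaving a
    one-pixel-wide strip; strips are stacked left to right over time.
    """
    cap = cv2.VideoCapture(video_path)
    prev, strips = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
        if prev is not None:
            motion = np.abs(gray - prev)        # motion image
            strips.append(motion.mean(axis=1))  # average over width
        prev = gray
    cap.release()
    return np.column_stack(strips)  # rows: image height, columns: time
```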

February 3, 2012 · 2 min · 398 words · ARJ

Sonification of motiongrams

I have made a new Jamoma module for sonification of motiongrams, called jmod.sonifyer~. From a live video input, the program generates a motion image, which is in turn transformed into a motiongram. This is then used as the source for the sound synthesis, being “read” as a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram. See the demonstration video below. The module is available from the Jamoma source repository, and will probably make it into an official release at some point....
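The “read as a spectrogram” step could look roughly like this in Python/NumPy: each motiongram column is treated as a magnitude spectrum, paired with random phase, and turned into a short windowed grain by an inverse FFT, with the grains overlap-added. This is a stand-in for the module's actual DSP; hop size, windowing, and phase handling are arbitrary choices here.

```python
import numpy as np

def motiongram_to_sound(mgram, sr=44100, hop=512):
    """Resynthesize a motiongram (rows: height, cols: time) as sound."""
    n_bins, n_frames = mgram.shape
    n_fft = 2 * (n_bins - 1) if n_bins > 1 else 2
    rng = np.random.default_rng(0)
    win = np.hanning(n_fft)
    out = np.zeros(hop * n_frames + n_fft)
    for i in range(n_frames):
        mag = mgram[::-1, i]  # flip so the top image row is the highest bin
        phase = rng.uniform(0, 2 * np.pi, n_bins)
        grain = np.fft.irfft(mag * np.exp(1j * phase), n=n_fft)
        out[i * hop : i * hop + n_fft] += win * grain
    return out / max(np.abs(out).max(), 1e-9)
```

Feeding it a motiongram like the one built in the sketch further up gives an offline approximation of what the module does frame by frame in real time.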

November 9, 2010 · 1 min · 88 words · ARJ

Sonification of Traveling Landscapes

I just heard a talk called “Real-Time Synaesthetic Sonification of Traveling Landscapes” (PDF) by Tim Pohle and Peter Knees from the Department of Computational Perception (great name!) in Linz. They have made an application that creates music from a moving video camera. The implementation is based on grabbing a one-pixel-wide column from each video frame, plotting these columns over time, and sonifying the resulting image. Interestingly enough, the images they get out of this (see below) are very close to the motiongrams and videograms I have been working on....
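Their column-grabbing approach is easy to sketch in Python/OpenCV: take one column from every frame (the center column is one plausible choice) and stack the columns left to right:

```python
import cv2
import numpy as np

def slit_scan(video_path):
    """Stack a one-pixel-wide column from every frame into one image.

    The center column of each frame becomes one column of the output,
    so camera travel paints an image over time (closely related to a
    videogram).
    """
    cap = cv2.VideoCapture(video_path)
    cols = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cols.append(frame[:, frame.shape[1] // 2])  # center column (H, 3)
    cap.release()
    return np.stack(cols, axis=1)  # (H, n_frames, 3) image
```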

May 15, 2008 · 1 min · 86 words · ARJ