New publication: Non-Realtime Sonification of Motiongrams

Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works....

August 1, 2013 · 2 min · 225 words · ARJ

Timelapser

I have recently started moving my development efforts over to GitHub, to keep everything in one place. Now I have also uploaded a small application I developed for a project by my mother, Norwegian sculptor Grete Refsum. She wanted to create a timelapse video of her making a new sculpture, “Hommage til kaffeselskapene”, for her installation piece Tante Vivi, fange nr. 24 127 Ravensbrück. There is plenty of timelapse software available, but none of it fitted my needs....

June 25, 2013 · 1 min · 184 words · ARJ

Sonification of motiongrams

A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper. See below for the full paper and video examples. The Paper Download paper (PDF 2MB) Abstract: The paper presents a method for sonification of human body motion based on motiongrams....

February 3, 2012 · 2 min · 398 words · ARJ

Sonification of motiongrams

I have made a new Jamoma module for sonification of motiongrams called jmod.sonifyer~. From a live video input, the program generates a motion image, which is in turn transformed into a motiongram. This is then used as the source of the sound synthesis, being “read” as a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram. See the demonstration video below: The module is available from the Jamoma source repository, and will probably make it into an official release at some point....
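The pipeline described above (video frames → motion image → motiongram → motiongram read as a spectrogram) can be sketched in NumPy. This is an illustrative reconstruction of the idea, not the jmod.sonifyer~ code: the function names, the logarithmic frequency mapping, and all parameter values are my assumptions.

```python
import numpy as np

def motiongram(frames):
    """Compute a motiongram from grayscale video.
    frames: array of shape (T, H, W) with values in [0, 1].
    Frame-differencing yields a motion image per frame pair;
    averaging each motion image across its width collapses it
    to one column, and stacking columns gives an (H, T-1) image."""
    diffs = np.abs(np.diff(frames, axis=0))  # motion images, (T-1, H, W)
    return diffs.mean(axis=2).T              # motiongram, (H, T-1)

def sonify(mgram, sr=44100, dur_per_col=0.05, fmin=100.0, fmax=8000.0):
    """Read the motiongram as a spectrogram: each row drives a sine
    oscillator (top row = highest frequency, spaced logarithmically
    between fmin and fmax), with pixel brightness as amplitude."""
    h, t = mgram.shape
    freqs = np.geomspace(fmax, fmin, h)      # row 0 is the top of the image
    n = int(sr * dur_per_col)                # samples per motiongram column
    out = np.zeros(t * n)
    phase = np.zeros(h)                      # keep oscillators phase-continuous
    ts = np.arange(n) / sr
    for col in range(t):
        seg = (mgram[:, col, None] *
               np.sin(2 * np.pi * freqs[:, None] * ts + phase[:, None])).sum(0)
        phase = (phase + 2 * np.pi * freqs * n / sr) % (2 * np.pi)
        out[col * n:(col + 1) * n] = seg
    peak = np.abs(out).max()
    return out / peak if peak > 0 else out   # normalise to [-1, 1]
```

In the real module this runs on a live video stream; here the whole clip is processed offline, which is closer in spirit to the non-realtime ImageSonifyer mentioned above.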

November 9, 2010 · 1 min · 88 words · ARJ

New motiongram features

Inspired by the work Static no. 12 by Daniel Crooks that I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice: About motiongrams A motiongram is a way of displaying motion (e.g. human motion) in the time-domain, somewhat similar to how we are used to working with time-representations of audio (e....
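The new single-column scan is essentially a slit-scan: instead of collapsing the whole motion image to one column, it keeps one raw pixel column per frame and stacks those over time. A minimal NumPy sketch of this idea (my own reconstruction, not the Jamoma code; the function name and array layout are assumptions):

```python
import numpy as np

def slit_scan(frames, x):
    """Slit-scan image: extract column x from every frame and stack
    the columns over time. frames: (T, H, W) grayscale -> (H, T).
    Unlike the regular motiongram, which averages each (motion) image
    across its full width, this samples a single fixed column."""
    return frames[:, :, x].T
```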

July 2, 2010 · 1 min · 171 words · ARJ

Updated software

I was at the Musical Body conference at the University of London last week and presented my work on visualisation of music-related movements. For my PhD I developed the Musical Gestures Toolbox as a collection of components and modules for Max/MSP/Jitter, and most of this has been merged into Jamoma. However, many potential users are not familiar with Max, so over the last couple of years I have been developing standalone applications for some of the main tasks....

April 27, 2009 · 1 min · 194 words · ARJ