From Basic Music Research to Medical Tool

The Research Council of Norway is currently evaluating research in the humanities, and all institutions were asked to submit cases demonstrating societal impact. Obviously, basic research is by definition not aiming at societal impact in the short run, and my research definitely falls into that category. Still, it is interesting to see that some of my basic research is, indeed, on the verge of making a societal impact in the sense that policy makers like to think about....

November 22, 2016 · 5 min · 866 words · ARJ

Performing with the Norwegian Noise Orchestra

Yesterday, I performed with the Norwegian Noise Orchestra at Betong in Oslo, at a concert organised by Dans for Voksne. The orchestra is an ad-hoc group of noisy improvisers, and I immediately felt at home. The performance lasted for 12 hours, from noon to midnight, and I performed for two hours in the afternoon. For the performance I used my Soniperforma patch based on the sonifyer technique and the Jamoma module I developed a couple of years ago (jmod....

December 13, 2012 · 1 min · 207 words · ARJ

Paper #1 at SMC 2012: Evaluation of motiongrams

Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen. Abstract Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above mentioned variables....

July 12, 2012 · 1 min · 166 words · ARJ

Record videos of sonification

I got a question the other day about how to record a sonified video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper at this year’s ACHI conference about the technique. It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max....

June 25, 2012 · 1 min · 159 words · ARJ

Sonification of motiongrams

A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper. See below for the full paper and video examples. The Paper Download paper (PDF 2MB) Abstract: The paper presents a method for sonification of human body motion based on motiongrams....

February 3, 2012 · 2 min · 398 words · ARJ

Concert: Victoria Johnson

Together with Victoria Johnson I have developed Transformation, a piece in which we use video analysis to control sound selection and spatialisation. We have been developing the setup and the piece over the last couple of years, and have performed variations of it at MIC, the Opera house and at the music academy last year. The piece will be performed again today, Monday 28 March 2011 at 19:00 at the Norwegian Academy of Music....

March 28, 2011 · 1 min · 90 words · ARJ

Sonification of motiongrams

I have made a new Jamoma module for sonification of motiongrams called jmod.sonifyer~. From a live video input, the program generates a motion image, which is in turn transformed into a motiongram. This is then used as the source of the sound synthesis, “read” as a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram. See the demonstration video below: The module is available from the Jamoma source repository, and will probably make it into an official release at some point....
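The idea of “reading” a motiongram as a spectrogram can be sketched outside of Max as well. The following is a minimal, hypothetical Python/NumPy illustration (not the jmod.sonifyer~ implementation): each column of the motiongram is treated as a magnitude spectrum, with rows mapped to sine oscillators between assumed `fmin` and `fmax` frequencies, and columns rendered in sequence so that time in the image becomes time in the sound.

```python
import numpy as np

def sonify_motiongram(mg, sr=44100, col_dur=0.05, fmin=100.0, fmax=8000.0):
    """Sonify a motiongram by reading it as a spectrogram.

    mg: array of shape (H, T). Row i is mapped to a sine oscillator at a
    frequency between fmin and fmax; the pixel value gives its amplitude.
    Each of the T columns is rendered for col_dur seconds.
    """
    H, T = mg.shape
    freqs = np.linspace(fmin, fmax, H)          # one frequency per image row
    n = int(sr * col_dur)                       # samples per column
    t = np.arange(n) / sr
    out = np.zeros(T * n)
    for col in range(T):
        amps = mg[:, col]
        # sum of sinusoids weighted by pixel intensity in this column
        frame = (amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        out[col * n:(col + 1) * n] = frame
    peak = np.abs(out).max()
    return out / peak if peak > 0 else out      # normalise to [-1, 1]

# A single bright row produces a steady sine at one mapped frequency
mg = np.zeros((16, 10))
mg[4, :] = 1.0
audio = sonify_motiongram(mg)
```

A real implementation would add per-column envelopes (or overlap-add) to avoid clicks at column boundaries, but the mapping is the same.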

November 9, 2010 · 1 min · 88 words · ARJ

GDIF recording and playback

Kristian Nymoen has updated the Jamoma modules for recording and playing back GDIF data in Max 5. The modules are based on the FTM library (beta 12; betas 13–15 do not work), and can be downloaded here. We have also made available three use cases in the (soon to be expanded) fourMs database: simple mouse recording, sound saber and a short piano example. See the video below for a quick demonstration of how it works:

July 3, 2010 · 1 min · 74 words · ARJ

New motiongram features

Inspired by the work Static no. 12 by Daniel Crooks, which I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice: About motiongrams A motiongram is a way of displaying motion (e.g. human motion) in the time-domain, somehow similar to how we are used to working with time-representations of audio (e....
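The core of a motiongram is easy to sketch: compute a motion image (the absolute difference of consecutive frames), collapse each motion image to a single column, and stack the columns over time. Below is a minimal, hypothetical NumPy sketch of that reduction (the function name `motiongram` and the toy frames are assumptions for illustration, not the jmod.motiongram% code):

```python
import numpy as np

def motiongram(frames):
    """Compute a horizontal motiongram from a stack of grayscale frames.

    frames: array of shape (T, H, W). The motion image for each frame pair
    is the absolute difference of consecutive frames; averaging it over the
    width gives one H-pixel column per pair, and stacking those columns
    over time yields an (H, T-1) motiongram.
    """
    frames = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(frames, axis=0))   # (T-1, H, W) motion images
    columns = motion.mean(axis=2)              # (T-1, H): one column per pair
    return columns.T                           # (H, T-1): rows = image rows

# Toy example: a bright horizontal line moving downward over 5 frames
T, H, W = 5, 8, 8
frames = np.zeros((T, H, W))
for t in range(T):
    frames[t, t, :] = 1.0
mg = motiongram(frames)
```

In the toy example the diagonal trace of the moving line shows up directly in `mg`; scanning a single column instead of averaging, as described above, simply replaces the `mean` reduction with indexing one column of each frame.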

July 2, 2010 · 1 min · 171 words · ARJ

Liquid Vapor

I performed in the open form piece Liquid Vapor by Else Olsen S. yesterday. The performance was special in many ways. First, electronic music pioneer Pauline Oliveros was also performing in the piece. She played electric accordion and live electronics, a great combination. Second, the performance took place in the magnificent foyer of the new Oslo opera house. Even though it is a large space, we managed to fill it up with all the different stations, equipment and instruments....

November 27, 2009 · 2 min · 227 words · ARJ