Tag Archives: publication

New publication: Non-Realtime Sonification of Motiongrams

Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works.

Non-Realtime Sonification of Motiongrams


The paper presents a non-realtime implementation of the sonomotiongram method, a method for the sonification of motiongrams. Motiongrams are spatiotemporal displays of motion from video recordings, based on frame-differencing and reduction of the original video recording. The sonomotiongram implementation presented in this paper is based on turning these visual displays of motion into sound using FFT filtering of noise sources. The paper presents the application ImageSonifyer, accompanied by video examples showing the possibilities of the sonomotiongram method for both analytic and creative applications.
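The core idea, turning a visual display of motion into sound by FFT filtering of noise, can be sketched in a few lines. This is a minimal illustration of the principle, not the ImageSonifyer implementation: it assumes the motiongram is a greyscale matrix with frequency bins along the vertical axis, and uses each column as a spectral envelope for a frame of white noise. All function names and parameters here are my own illustrative choices.

```python
import numpy as np

def sonify_motiongram(motiongram, n_fft=512):
    """Sketch of the sonomotiongram idea: each column of the motiongram
    (freq_bins x time_frames, values 0..1) shapes the spectrum of a
    white-noise frame, so vertical position maps to frequency and
    brightness maps to amplitude."""
    n_bins = motiongram.shape[0]
    rng = np.random.default_rng(0)
    frames = []
    for col in motiongram.T:  # one column per video frame
        noise = rng.standard_normal(n_fft)
        spectrum = np.fft.rfft(noise)
        # Stretch the column to the spectrum length and use it as a filter
        envelope = np.interp(np.linspace(0, n_bins - 1, len(spectrum)),
                             np.arange(n_bins), col)
        frames.append(np.fft.irfft(spectrum * envelope, n_fft))
    return np.concatenate(frames)
```

A column that is black (no motion) produces silence for that frame; bright regions high up in the motiongram produce energy in the corresponding high-frequency bins.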

Jensenius, A. R. (2013). Non-realtime sonification of motiongrams. In Proceedings of Sound and Music Computing, pages 500–505, Stockholm.


@inproceedings{Jensenius2013,
    Address = {Stockholm},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of Sound and Music Computing},
    Pages = {500--505},
    Title = {Non-Realtime Sonification of Motiongrams},
    Year = {2013}}

New publication: Performing the Electric Violin in a Sonic Space

I am happy to announce that a paper I wrote together with Victoria Johnson has just been published in Computer Music Journal. The paper is based on the experiences that Victoria and I gained while working on the piece Transformation for electric violin and live electronics (see video of the piece below).

A. R. Jensenius and V. Johnson. Performing the electric violin in a sonic space. Computer Music Journal, 36(4):28–39, 2012.

This article presents the development of the improvisation piece Transformation for electric violin and live electronics. The aim of the project was to develop an “invisible” technological setup that would allow the performer to move freely on stage while still being in full control of the electronics. The developed system consists of a video-based motion-tracking system, with a camera hanging in the ceiling above the stage. The performer’s motion and position on stage are used to control the playback of sonic fragments from a database of violin sounds, using concatenative synthesis as the sound engine. The setup allows the performer to improvise freely together with the electronic sounds being played back as she moves around the “sonic space.” The system has been stable in rehearsal and performance, and the simplicity of the approach has been inspiring to both the performer and the audience.
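The selection step at the heart of such a setup can be illustrated with a toy nearest-neighbour lookup: the performer's tracked (x, y) position selects the fragment whose stored stage position is closest. This is only a sketch of the mapping idea, not the concatenative-synthesis engine used in the piece, and the database contents and names below are hypothetical.

```python
import math

def select_fragment(position, database):
    """Return the sound fragment whose stored (x, y) stage position is
    nearest to the performer's tracked position. `database` maps (x, y)
    tuples to fragment identifiers (illustrative names only)."""
    nearest = min(database, key=lambda p: math.dist(p, position))
    return database[nearest]
```

In practice a concatenative system would select among many short fragments per region and crossfade between them, but the mapping from stage position to sound material follows the same principle.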

The PDF will be available in the University of Oslo public repository after the six-month embargo. Until then, it is available through either MIT Press or Project MUSE.

BibTeX entry
@article{Jensenius2012,
    Author = {Jensenius, Alexander Refsum and Johnson, Victoria},
    Journal = {Computer Music Journal},
    Number = {4},
    Pages = {28--39},
    Title = {Performing the Electric Violin in a Sonic Space},
    Volume = {36},
    Year = {2012}}

Video of the piece Transformation.

Paper #1 at SMC 2012: Evaluation of motiongrams

Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen.


Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above-mentioned variables.
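The basic construction of a motiongram, frame-differencing followed by reduction of each motion image to a single column, can be sketched as follows. This is a minimal illustration assuming a greyscale video stored as a (time, height, width) array; the actual implementation evaluated in the paper includes filtering and other processing steps not shown here.

```python
import numpy as np

def motiongram(frames):
    """Sketch of a horizontal motiongram: frame-difference consecutive
    greyscale frames (time x height x width), then average each motion
    image over the width so every video frame collapses to one column.
    The result is height x (time - 1), with time running left to right."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))  # motion images
    return diffs.mean(axis=2).T
```

Averaging over the width keeps vertical motion visible along the image's height while the horizontal axis becomes a timeline, which is what makes motiongrams useful as spatiotemporal displays.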


  • Full paper [PDF]
  • Poster [PDF]


Jensenius, A. R. (2012). Evaluating how different video features influence the visual quality of resultant motiongrams. In Proceedings of the 9th Sound and Music Computing Conference, pages 467–472, Copenhagen.


@inproceedings{Jensenius2012smc,
    Address = {Copenhagen},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of the 9th Sound and Music Computing Conference},
    Pages = {467--472},
    Title = {Evaluating How Different Video Features Influence the Visual Quality of Resultant Motiongrams},
    Year = {2012}}

Building low-cost music controllers

New publication on our Cheapstick music controller:



A. R. Jensenius, R. Koehly, and M. M. Wanderley. Building low-cost music controllers. In R. Kronland-Martinet, T. Voinier, and S. Ystad, editors, CMMR 2005, LNCS 3902, pages 123–129. Berlin Heidelberg: Springer-Verlag, 2006. (PDF from Springer)


This paper presents our work on building low-cost music controllers intended for educational and creative use. The main idea was to build an electronic music controller, including sensors and a sensor interface, on a “10 euro” budget. We have experimented with turning commercially available USB game controllers into generic sensor interfaces, and making sensors from cheap conductive materials such as latex, ink, porous materials, and video tape. Our prototype controller, the CheapStick, is comparable to interfaces built with commercially available sensors and interfaces, but at a fraction of the price.
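When a homemade resistive sensor is read through a game-controller axis, the raw readings rarely span the full range, so a quick calibration pass and a clamped rescaling are needed before the values are musically useful. The sketch below shows only that normalisation step; the range values are illustrative, and reading the actual controller hardware is outside its scope.

```python
def normalize(raw, lo, hi):
    """Map a raw axis reading from a homemade sensor to 0..1, clamped.
    `lo` and `hi` are the minimum and maximum readings observed during a
    calibration pass (illustrative values, not from the paper)."""
    if hi == lo:
        return 0.0  # degenerate calibration: avoid division by zero
    x = (raw - lo) / (hi - lo)
    return max(0.0, min(1.0, x))
```

Clamping matters because cheap conductive materials drift: a reading can fall outside the calibrated range mid-performance without breaking the mapping.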



@incollection{Jensenius2006,
	Author = {Jensenius, Alexander Refsum and Koehly, Rodolphe and Wanderley, Marcelo M.},
	Booktitle = {CMMR 2005, LNCS 3902},
	Editor = {Kronland-Martinet, R. and Voinier, T. and Ystad, S.},
	Pages = {123--129},
	Address = {Berlin Heidelberg},
	Publisher = {Springer-Verlag},
	Title = {Building Low-Cost Music Controllers},
	Year = {2006}}