New publication: “To Gesture or Not” (NIME 2014)

This week I am participating in the NIME conference, organised at Goldsmiths, University of London. I am doing some administrative work as chair of the NIME steering committee, and I am also happy to present a paper tomorrow:

Title
To Gesture or Not? An Analysis of Terminology in NIME Proceedings 2001–2013

Links
Paper (PDF)
Presentation (HTML)
Spreadsheet with summary of data (ODS)
OS X shell script used for the analysis

Abstract
The term ‘gesture’ has represented a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its usage in other fields of study? This paper presents a linguistic analysis of the motion-related terminology used in all of the papers published in the NIME conference proceedings to date (2001–2013). The results show that ‘gesture’ is in fact used in 62% of all NIME papers, which is a significantly higher percentage than in other music conferences (ICMC and SMC), and much more frequently than it is used in the HCI and biomechanics communities. The results from a collocation analysis support the claim that ‘gesture’ is used broadly in the NIME community, and indicate that it ranges from the description of concrete human motion and system control to quite metaphorical applications.
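For the curious, the core of the analysis is simple enough to sketch. The shell script linked above does the actual counting; the snippet below is a minimal, hypothetical Python equivalent, assuming the proceedings have already been converted to one plain-text file per paper in a directory named nime_txt (the directory name and function are mine, not from the paper):

import re
from pathlib import Path

def share_of_papers_using(term, textdir):
    """Return the fraction of plain-text papers in textdir that mention term."""
    files = list(Path(textdir).glob("*.txt"))
    # Match the term as a whole word, optionally pluralised, case-insensitively
    pattern = re.compile(r"\b%ss?\b" % re.escape(term), re.IGNORECASE)
    hits = sum(1 for f in files if pattern.search(f.read_text(errors="ignore")))
    return hits / len(files)

# e.g. the share of papers mentioning 'gesture' (or 'gestures')
print("%.0f%%" % (100 * share_of_papers_using("gesture", "nime_txt")))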

Reference
Jensenius, A. R. (2014). To gesture or not? An analysis of terminology in NIME proceedings 2001–2013. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 217–220, London.

BibTeX

@inproceedings{Jensenius:2014c,
    Address = {London},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    Pages = {217--220},
    Title = {To Gesture or Not? {A}n Analysis of Terminology in {NIME} Proceedings 2001--2013},
    Year = {2014}}

Documentation of the NIME project at the Norwegian Academy of Music

From 2007 to 2011 I had a part-time research position at the Norwegian Academy of Music in a project called New Instruments for Musical Exploration (NIME). This project was also the reason I ended up organising the NIME conference in Oslo in 2011.

The NIME project focused on creating an environment for musical innovation at the Norwegian Academy of Music by exploring the design of new physical and electronic instruments. Three people were involved in the project: percussionist/electro-improviser Kjell Tore Innervik, composer Ivar Frounberg, and myself. We had a great time creating and performing with a number of new instruments.

A slogan for the project was to create instruments “for the many and for the few”. We approached the “for the many” part through the creation of the Oslo Laptop Orchestra and the Oslo Mobile Orchestra, and through a series of music balls. The “for the few” part targeted instruments for professional musicians. Some of these were glass instruments, for which we also carried out historical and analytical studies that were presented at NIME 2010.

As an artistic research project, we were careful to document all the processes we were involved in, and we ended up creating a final series of video documentaries reflecting on the process and the artistic outcomes. Kjell Tore has written more about all of this on his own web page. Here I would like to mention three short documentaries we created, reflecting on the roles of technologist, performer, and composer in the project. Creating these documentaries was in itself an interesting exercise. As an academic researcher, I am used to writing formal research papers about my findings. However, as artistic researchers in the NIME project, we all felt that a discussion-based reflection was more suitable. The documentaries are, unfortunately, only in Norwegian, but we hope to add English subtitles at some point.

NIME 2013

Back from a great NIME 2013 conference in Daejeon + Seoul! For Norwegian readers out there, I have written a blog post about the conference on my head of department blog. I would have loved to write some more about the conference in English, but I think these images from my Flickr account will have to do for now:

[Photo gallery: snapshots from NIME 2013 in Daejeon, 26–27 May 2013]

On the last day of the conference it was also announced that next year’s conference will be held in London, hosted by the Embodied AudioVisual Interaction Group at Goldsmiths. Future chair Atau Tanaka presented this teaser video:

Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Title
Kinectofon: Performing with shapes in planes

Links

Abstract
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which, again, are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by “touching” a virtual sound wall.
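The core idea behind the sonification is straightforward to sketch. A motiongram is a matrix in which each column summarises the motion in one video frame; the sonomotiongram technique treats each column as a magnitude spectrum and resynthesises it with an inverse FFT. Below is a minimal illustration of that idea, assuming the motiongram is stored as a NumPy array of frequency bins × time frames; a real implementation would add windowing and overlap-add, which are omitted here:

import numpy as np

def sonify_motiongram(motiongram):
    """Turn a motiongram (bins x frames) into audio, column by column,
    treating each column as a magnitude spectrum (phase set to zero)."""
    n_bins, n_frames = motiongram.shape
    frame_len = 2 * (n_bins - 1)  # output length of the real-valued inverse FFT
    audio = np.zeros(n_frames * frame_len)
    for t in range(n_frames):
        frame = np.fft.irfft(motiongram[:, t].astype(float), n=frame_len)
        audio[t * frame_len:(t + 1) * frame_len] = frame
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio  # normalise to [-1, 1]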

Reference
Jensenius, A. R. (2013). Kinectofon: Performing with shapes in planes. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX

@inproceedings{Jensenius:2013e,
   Address = {Daejeon, Korea},
   Author = {Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
   Pages = {196--197},
   Title = {Kinectofon: Performing with Shapes in Planes},
   Year = {2013}
}

[Poster: Kinectofon]

Filtering motion capture data for real-time applications

We have three papers from our fourMs group at this year’s NIME conference in Daejeon. The first one was presented today by Ståle Skogstad, and is based on his work on trying to minimize the delay when filtering motion capture data.

Title
Filtering motion capture data for real-time applications

Links

Abstract
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier research we found effective methods to design nearly optimal filters for realtime applications. However, to be able to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This will again allow us to determine a reasonable cutoff frequency for the filters. We have therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data together with a method similar to the residual analysis method were then used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, which correspond to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers. In particular, low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
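To illustrate the kind of filtering the abstract describes, here is a minimal streaming low-pass filter in Python. The paper designs its own custom low-delay filters, so the off-the-shelf Butterworth design below is not their method, only a sketch of the real-time structure, with filter state carried from call to call so samples can be processed as they arrive:

import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

class RealtimeLowpass:
    """Sample-by-sample low-pass filter; cutoffs of 5, 10 and 15 Hz
    correspond to the paper's heavy, medium and light filtering."""
    def __init__(self, cutoff_hz, sample_rate, order=2):
        self.b, self.a = butter(order, cutoff_hz, fs=sample_rate)
        self.zi = 0.0 * lfilter_zi(self.b, self.a)  # start from rest

    def process(self, sample):
        # Filter one incoming sample, carrying state to the next call
        y, self.zi = lfilter(self.b, self.a, [sample], zi=self.zi)
        return y[0]

# e.g. medium (10 Hz) smoothing of a 100 Hz motion capture stream
lp = RealtimeLowpass(cutoff_hz=10.0, sample_rate=100.0)
smoothed = [lp.process(x) for x in np.random.randn(500)]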

Reference
Skogstad, S. A., Nymoen, K., Høvin, M., Holm, S., and Jensenius, A. R. (2013). Filtering motion capture data for real-time applications. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX

@inproceedings{Skogstad:2013,
   Address = {Daejeon, Korea},
   Author = {Skogstad, St{\aa}le A. and Nymoen, Kristian and H{\o}vin, Mats and Holm, Sverre and Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
   Pages = {196--197},
   Title = {Filtering Motion Capture Data for Real-Time Applications},
   Year = {2013}
}

[Figure: illustration of filter delay]