Filtering motion capture data for real-time applications

We have three papers from our fourMs group at this year’s NIME conference in Daejeon. The first one was presented today by Ståle Skogstad, and is based on his work on minimizing the delay when filtering motion capture data.

Title
Filtering motion capture data for real-time applications

Abstract:
In this paper we present some custom-designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier research we found effective methods to design nearly optimal filters for real-time applications. However, to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This in turn allows us to determine a reasonable cutoff frequency for the filters. We therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data, together with a method similar to residual analysis, were then used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, corresponding to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers, in particular low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
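
As a rough illustration of the kind of filtering the abstract describes, here is a minimal Python sketch. It uses a standard causal Butterworth low-pass filter and a simple low-pass differentiator rather than the paper’s own custom filter designs, and the 100 Hz sample rate and all variable names are assumptions made for the example.

# Minimal sketch: causal low-pass filtering of mocap data at the three
# proposed cutoffs. A standard Butterworth filter stands in for the
# paper's custom low-delay filters; the sample rate is assumed.
import numpy as np
from scipy.signal import butter, lfilter

fs = 100.0  # assumed motion capture sample rate (Hz), not from the paper

def lowpass(x, cutoff_hz, order=2):
    # Causal filtering (lfilter, not zero-phase filtfilt) keeps the
    # filter usable in real time, at the cost of some phase delay.
    b, a = butter(order, cutoff_hz / (fs / 2.0))
    return lfilter(b, a, x)

def lowpass_velocity(x, cutoff_hz):
    # Simple low-pass differentiator: smooth first, then differentiate.
    return np.gradient(lowpass(x, cutoff_hz)) * fs

# Noisy 1-D hand position as stand-in data
t = np.arange(0.0, 2.0, 1.0 / fs)
pos = np.sin(2 * np.pi * 1.5 * t) + 0.05 * np.random.randn(t.size)

heavy = lowpass(pos, 5.0)          # heavy filtering
medium = lowpass(pos, 10.0)        # medium filtering
light = lowpass(pos, 15.0)         # light filtering
vel = lowpass_velocity(pos, 10.0)  # velocity estimate (units/s)

In a true real-time setting one would filter sample by sample, carrying the filter state between calls (for example via the zi argument of scipy.signal.lfilter), but the batch version above keeps the idea easy to see.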

Reference:
Skogstad, S. A., Nymoen, K., Høvin, M., Holm, S., and Jensenius, A. R. (2013). Filtering motion capture data for real-time applications. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX:

@inproceedings{Skogstad:2013,
   Address = {Daejeon, Korea},
   Author = {Skogstad, St{\aa}le A. and Nymoen, Kristian and H{\o}vin, Mats and Holm, Sverre and Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
   Pages = {196--197},
   Title = {Filtering Motion Capture Data for Real-Time Applications},
   Year = {2013}
}

[Figure: illustration of filter delay]

NIME panel at CHI

This week the huge ACM SIGCHI Conference on Human Factors in Computing Systems (also known as CHI) is taking place in Paris. This is the largest conference in the field of human-computer interaction, and it is also the conference at which the NIME conference series started. I will participate in a panel session called “Music, Technology, and Human-Computer Interaction” on Wednesday. This is a great opportunity to show musical HCI to the broader HCI community, and I am very much looking forward to participating.

As a preview of what we are going to discuss, next year’s NIME organisers at Goldsmiths in London have prepared a short NIME teaser.

NIME 2013 deadline approaching

Here is a little plug for the submission deadline for this year’s NIME conference. I usually don’t write so much about deadlines here, but as the current chair of the international steering committee for the conference series, I feel that I should do my share in helping to spread the word. The NIME conference is a great place to meet academics, designers, technologists, and artists, all working on creating weird instruments and music. Here is some more information about this year’s conference:

NIME 2013 will be hosted by the Graduate School of Culture Technology at KAIST (Korea Advanced Institute of Science and Technology) in Daejeon, Korea, and will also feature a series of special events in Seoul.

There are four submission categories, all with a deadline of 1 February.

Moog on Google

Probably a coincidence, but a nice one nonetheless: on the last day of this year’s International Conference on New Interfaces for Musical Expression (NIME) in Ann Arbor, Michigan, Google celebrates Robert Moog’s 78th birthday.

The interesting thing is that Google not only has a picture of a Moog synthesizer, but also an interactive model up and running, where it is possible to play the keyboard and tweak the knobs. The synth is quite CPU-hungry, so I had problems grabbing a screencast while playing it. But it is worth trying out if you read this before it disappears.