
ICMC 2006 proceedings details

A colleague of mine recently asked if I could help her find the bibliographic details of the ICMC 2006 proceedings. Apparently this information is not easily available online, and she had spent a great deal of research time trying to track it down.

I was lucky enough to participate in this wonderful event at Tulane University, and still have the paper version of the proceedings in my office. So here is the relevant information, in case anyone else also wonders about these details:

  • Editors (Paper chairs): Georg Essl and Ichiro Fujinaga
  • Dates: November 6–11, 2006
  • Publisher: International Computer Music Association, San Francisco, CA & The Music Department, Tulane University, New Orleans, LA
  • ISBN: 0-9713192-4-3

New publication: “To Gesture or Not” (NIME 2014)

This week I am participating at the NIME conference, organised at Goldsmiths, University of London. I am doing some administrative work as chair of the NIME steering committee, and I am also happy to present a paper tomorrow:

Title
To Gesture or Not? An Analysis of Terminology in NIME Proceedings 2001–2013

Links
Paper (PDF)
Presentation (HTML)
Spreadsheet with summary of data (ODS)
OSX shell script used for analysis

Abstract
The term ‘gesture’ has represented a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its usage in other fields of study? This paper presents a linguistic analysis of the motion-related terminology used in all of the papers published in the NIME conference proceedings to date (2001–2013). The results show that ‘gesture’ is in fact used in 62% of all NIME papers, which is a significantly higher percentage than in other music conferences (ICMC and SMC), and much more frequently than it is used in the HCI and biomechanics communities. The results from a collocation analysis support the claim that ‘gesture’ is used broadly in the NIME community, and indicate that it ranges from the description of concrete human motion and system control to quite metaphorical applications.
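The actual analysis was done with the OSX shell script linked above. As a rough, hypothetical illustration of the basic idea, the Python sketch below estimates the share of papers in a folder that mention a given term; it assumes the PDFs have already been converted to plain-text files, and the folder name is made up:

import glob

def term_share(term, folder):
    """Return the fraction of .txt files in `folder` that contain `term`."""
    files = glob.glob(folder + "/*.txt")
    if not files:
        return 0.0
    hits = 0
    for path in files:
        with open(path, encoding="utf-8", errors="ignore") as f:
            if term.lower() in f.read().lower():
                hits += 1
    return hits / len(files)

# Example (hypothetical folder of plain-text NIME papers):
print("gesture: {:.0%}".format(term_share("gesture", "nime_txt")))

The collocation part of the analysis, looking at which words tend to appear next to ‘gesture’, is not covered by this sketch.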

Reference
Jensenius, A. R. (2014). To gesture or not? An analysis of terminology in NIME proceedings 2001–2013. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 217–220, London.

BibTeX

@inproceedings{Jensenius:2014c,
    Address = {London},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    Pages = {217--220},
    Title = {To Gesture or Not? {A}n Analysis of Terminology in {NIME} Proceedings 2001--2013},
    Year = {2014}}

New publication: “How still is still? Exploring human standstill for artistic applications”

I am happy to announce a new publication titled How still is still? Exploring human standstill for artistic applications (PDF of preprint), published in the International Journal of Arts and Technology. The paper is based on the Sverm project, and was written and accepted two years ago. Sometimes academic publishing takes absurdly long, as this case illustrates, but I am happy that the publication is finally out in the wild.

Abstract

We present the results of a series of observation studies of ourselves standing still on the floor for 10 minutes at a time. The aim has been to understand more about our own standstill, and to develop a heightened sensitivity for micromovements and how they can be used in music and dance performance. The quantity of motion, calculated from motion capture data of a head marker, reveals remarkably similar results for each person, and also between persons. The best results were obtained with the feet at the width of the shoulders, locked knees, and eyes open. No correlation was found between different types of mental strategies employed and the quantity of motion of the head marker, but we still believe that different mental strategies have an important subjective and communicative impact. The findings will be used in the development of a stage performance focused on micromovements.
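The quantity of motion mentioned in the abstract was calculated from motion capture data of a head marker. As a hedged illustration of one common way to compute such a measure, here is a minimal Python sketch that takes an (N, 3) array of marker positions and returns the mean frame-to-frame speed; the array name, units and sampling rate are assumptions, not the exact procedure used in the paper:

import numpy as np

def quantity_of_motion(positions, fs):
    """Mean speed (m/s) of a marker: frame-to-frame displacement times sample rate."""
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return float(displacements.mean() * fs)

# Example: 10 minutes of simulated standstill sampled at 100 Hz, with tiny sway
fs = 100.0
t = np.arange(0, 600, 1 / fs)
positions = np.column_stack([
    0.001 * np.sin(0.3 * t),          # slow sway along x (metres)
    0.001 * np.cos(0.2 * t),          # slow sway along y
    1.70 + 0.0005 * np.sin(0.1 * t),  # near-constant head height along z
])
print("QoM = {:.4f} m/s".format(quantity_of_motion(positions, fs)))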

Reference

Jensenius, A. R., Bjerkestrand, K. A. V., and Johnson, V. (2014). How still is still? Exploring human standstill for artistic applications. International Journal of Arts and Technology, 7(2/3):207–222.

BibTeX

@article{Jensenius:2014a,
    Author = {Jensenius, Alexander Refsum and Bjerkestrand, Kari Anne Vadstensvik and Johnson, Victoria},
    Journal = {International Journal of Arts and Technology},
    Number = {2/3},
    Pages = {207--222},
    Title = {How Still is Still? Exploring Human Standstill for Artistic Applications},
    Volume = {7},
    Year = {2014}}

MultiControl on GitHub

Screenshot of MultiControl v0.6.2

Today I have added MultiControl to my GitHub account. Initially, I did not intend to release the source code for MultiControl, because it is so old and messy. The whole patch is based on bpatchers and on hiding things away, as one had to in the pre-Max 5 days, before the presentation view existed.

I originally developed the Max patch back in 2004, mainly so that I could distribute a standalone application for my students to use. I have only incrementally updated it to work with newer versions of Max and OSX, but have never given it a proper overhaul.

The reason I decided to release the code now is that I get so many questions about the program. Even though there are several good alternatives out there, a lot of people download the application each month, and I get lots of positive feedback from happy users. I also receive bug reports, and occasionally some feature requests. While I do not really have time to update the patch myself, hopefully someone else might pick it up and improve it.

Happy multicontrolling!

If you did not understand anything about the above, here is a little screencast showcasing some of the functionality of MultiControl:

New fourMs PhD thesis: Ståle Skogstad

Ståle Skogstad with his poster at the Verdikt conference a few years ago.

I am happy to announce that Ståle Skogstad defended his PhD today. Ståle was a PhD student in the project Sensing Music-related Actions (SMA) in our fourMs group, and I served as one of his supervisors.

The thesis is titled Methods and Technologies for Using Body Motion for Real-Time Musical Interaction and is available from the UiO archive. Abstract:

There are several strong indications for a profound connection between musical sound and body motion. Musical embodiment, meaning that our bodies play an important role in how we experience and understand music, has become a well accepted concept in music cognition. Today there are increasing numbers of new motion capture (MoCap) technologies that enable us to incorporate the paradigm of musical embodiment into computer music. This thesis focuses on some of the challenges involved in designing such systems. That is, how can we design digital musical instruments that utilize MoCap systems to map motion to sound?

The first challenge encountered when wanting to use body motion for musical interaction is to find appropriate MoCap systems. Given the wide availability of different systems, it has been important to investigate the strengths and weaknesses of such technologies. This thesis includes evaluations of two of the technologies available: an optical marker-based system known as OptiTrack V100:R2; and an inertial sensor-based system known as the Xsens MVN suit.

Secondly, to make good use of the raw MoCap data from the above technologies, it is often necessary to process them in different ways. This thesis presents a review and suggestions towards best practices for processing MoCap data in real time. As a result, several novel methods and filters that are applicable for processing MoCap data for real-time musical interaction are presented in this thesis. The most reasonable processing approach was found to be utilizing digital filters that are designed and evaluated in the frequency domain. To determine the frequency content of MoCap data, a frequency analysis method has been developed. An experiment that was carried out to determine the typical frequency content of free hand motion is also presented. Most remarkably, it has been necessary to design filters with low time delay, which is an important feature for real-time musical interaction. To be able to design such filters, it was necessary to develop an alternative filter design method. The resulting noise filters and differentiators are more low-delay optimal than those produced by the established filter design methods.

Finally, the interdisciplinary challenge of making good couplings between motion and sound has been targeted through the Dance Jockey project. During this project, a system was developed that has enabled the use of a full-body inertial motion capture suit, the Xsens MVN suit, in music/dance performances. To my knowledge, this is one of the first attempts to use a full body MoCap suit for musical interaction, and the presented system has demonstrated several hands-on solutions for how such data can be used to control sonic and musical features. The system has been used in several public performances, and the conceptual motivation, development details and experience of using the system are presented.
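The thesis develops its own frequency-domain filter design method, which is not reproduced here. As a generic, hedged illustration of the kind of low-delay processing discussed in the abstract, this Python sketch smooths a 1-D position signal with a causal one-pole filter and estimates velocity with a backward difference; the parameter values and signal names are assumptions, not the filters from the thesis:

import numpy as np

def smooth_and_differentiate(signal, fs, alpha=0.2):
    """Return (smoothed, velocity) for a 1-D position signal sampled at fs Hz."""
    smoothed = np.empty(len(signal), dtype=float)
    smoothed[0] = signal[0]
    for i in range(1, len(signal)):
        # Causal one-pole (exponential) smoother: uses only past samples, so delay stays low.
        smoothed[i] = alpha * signal[i] + (1.0 - alpha) * smoothed[i - 1]
    # Backward difference: a velocity estimate is available as soon as each sample arrives.
    velocity = np.zeros_like(smoothed)
    velocity[1:] = np.diff(smoothed) * fs
    return smoothed, velocity

# Example: two seconds of noisy hand position sampled at 240 Hz
fs = 240.0
t = np.arange(0, 2, 1 / fs)
position = np.sin(2 * np.pi * 1.5 * t) + 0.01 * np.random.randn(len(t))
pos_smooth, velocity = smooth_and_differentiate(position, fs)

The trade-off illustrated here is the same one the abstract points to: more aggressive smoothing (smaller alpha) gives cleaner signals but more delay, which matters for real-time musical interaction.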