New paper: MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction

Yesterday, I presented my microinteraction paper here at the NIME conference (New Interfaces for Musical Expression), organised at Louisiana State University, Baton Rouge, LA. Today I am presenting a poster based on a paper written together with two of my colleagues at UiO.

Title
MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction

Authors
Kristian Nymoen, Mari Romarheim Haugen, Alexander Refsum Jensenius

Abstract
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has the potential to become a new “standard” controller in the NIME community.

Files

BibTeX

@inproceedings{nymoen_mumyo_2015,
    address = {Baton Rouge, LA},
    title = {{MuMYO} -- {Evaluating} and {Exploring} the {MYO} {Armband} for {Musical} {Interaction}},
    abstract = {The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband's sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has the potential to become a new ``standard'' controller in the NIME community.},
    booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    author = {Nymoen, Kristian and Haugen, Mari Romarheim and Jensenius, Alexander Refsum},
    year = {2015}
}

New publication: “From experimental music technology to clinical tool”

I have written a chapter called From experimental music technology to clinical tool in the newly published anthology Music, Health, Technology and Design, edited by Karette A. Stensæth from the Norwegian Academy of Music. Here is the summary of the book:

This anthology presents a compilation of articles that explore the many intersections of music, health, technology and design. The first and largest part of the book includes articles deriving from the multidisciplinary research project called RHYME (www.rhyme.no). They engage with the study of the design, development, and use of digital and musical ‘co-creative tangibles’ for the potential health benefit of families with a child having physical or mental needs.

And here is the abstract of my chapter:

Human body motion is integral to all parts of musical experience, from performance to
perception. But how is it possible to study body motion in a systematic manner? This
article presents a set of video-based visualisation techniques developed for the analysis
of music-related body motion, including motion images, motion-history images and
motiongrams. It includes examples of how these techniques have been used in studies of
music and dance performances, and how they, quite unexpectedly, have become useful
in laboratory experiments on ADHD and clinical studies of CP. Finally, it includes
reflections regarding what music researchers can contribute to the study of human
motion and behaviour in general.
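The visualisation techniques mentioned in the abstract boil down to simple pixel arithmetic. As an illustration only (not the actual toolbox code from the chapter; all names and the threshold value are hypothetical), here is a minimal NumPy sketch: a motion image is a thresholded frame difference, and collapsing each motion image to a one-pixel strip, then stacking those strips over time, gives a motiongram.

```python
import numpy as np

def motion_image(prev_frame, frame, threshold=0.05):
    """Absolute frame difference, thresholded to suppress sensor noise."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    diff[diff < threshold] = 0.0
    return diff

def motiongram_strip(motion_img):
    """Average the motion image across the horizontal axis, giving one
    value per image row. Stacking these strips over time yields a
    motiongram showing vertical motion distribution against time."""
    return motion_img.mean(axis=1)

# Tiny synthetic example: a single bright pixel that moves between frames.
f0 = np.zeros((4, 4)); f0[1, 1] = 1.0
f1 = np.zeros((4, 4)); f1[2, 2] = 1.0
mi = motion_image(f0, f1)     # nonzero at both the old and new position
strip = motiongram_strip(mi)
```

A motion-history image would instead keep a decaying sum of successive motion images, so older motion fades out gradually.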

New publication: “To Gesture or Not” (NIME 2014)

This week I am participating at the NIME conference, organised at Goldsmiths, University of London. I am doing some administrative work as chair of the NIME steering committee, and I am also happy to present a paper tomorrow:

Title
To Gesture or Not? An Analysis of Terminology in NIME Proceedings 2001–2013

Links
Paper (PDF)
Presentation (HTML)
Spreadsheet with summary of data (ODS)
OSX shell script used for analysis

Abstract
The term ‘gesture’ has represented a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its usage in other fields of study? This paper presents a linguistic analysis of the motion-related terminology used in all of the papers published in the NIME conference proceedings to date (2001–2013). The results show that ‘gesture’ is in fact used in 62% of all NIME papers, which is a significantly higher percentage than in other music conferences (ICMC and SMC), and much more frequently than it is used in the HCI and biomechanics communities. The results from a collocation analysis support the claim that ‘gesture’ is used broadly in the NIME community, and indicate that it ranges from the description of concrete human motion and system control to quite metaphorical applications.
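The counting behind a figure like “62% of all NIME papers” can be sketched in a few lines. This is not the shell script linked above, just a hypothetical Python illustration of the basic idea: given each paper as plain text, test for whole-word, case-insensitive occurrences of a term and report the fraction of papers that contain it.

```python
import re

def papers_using_term(papers, term):
    """Fraction of papers whose text contains `term` as a whole word,
    case-insensitively. `papers` is a list of plain-text strings."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    hits = sum(1 for text in papers if pattern.search(text))
    return hits / len(papers)

# Toy corpus: note that "Gestural" does not match the whole word "gesture".
corpus = [
    "We map gesture to sound synthesis parameters.",
    "A new sensor interface for live coding.",
    "Gestural control is discussed, but the word itself varies.",
]
share = papers_using_term(corpus, "gesture")  # 1 of 3 papers match
```

A real pipeline would first convert each proceedings PDF to text (e.g. with a command-line extractor) before counting; the collocation analysis additionally looks at which words co-occur near the term.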

Reference
Jensenius, A. R. (2014). To gesture or not? An analysis of terminology in NIME proceedings 2001–2013. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 217–220, London.

BibTeX

@inproceedings{Jensenius:2014c,
    Address = {London},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    Pages = {217--220},
    Title = {To Gesture or Not? {A}n Analysis of Terminology in {NIME} Proceedings 2001--2013},
    Year = {2014}}

Analyzing correspondence between sound objects and body motion

New publication:

Title 
Analyzing correspondence between sound objects and body motion

Authors
Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius, and Jim Tørresen. The article has now been published in ACM Transactions on Applied Perception.

Abstract
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data has been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
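Of the four methods, Spearman’s ρ is the easiest to illustrate compactly. The sketch below is a hypothetical illustration, not the study’s actual pipeline: it rank-transforms a sound feature and a motion feature and computes the Pearson correlation of the ranks (ties are ignored for simplicity, and the feature names are invented).

```python
import numpy as np

def rank(x):
    """Assign ranks 1..n (assumes no ties, for simplicity)."""
    order = np.argsort(x)
    r = np.empty(len(x))
    r[order] = np.arange(1, len(x) + 1)
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = rank(np.asarray(x, float)), rank(np.asarray(y, float))
    rx -= rx.mean(); ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical per-frame features: a sound feature vs. vertical hand position.
brightness = [0.1, 0.4, 0.2, 0.8, 0.6]
hand_height = [0.2, 0.5, 0.3, 0.9, 0.7]
rho = spearman_rho(brightness, hand_height)  # same ordering -> rho == 1.0
```

Because only the rank ordering matters, ρ captures any monotone relation between a sound feature and a motion feature, not just a linear one, which is useful for the nonlinear mappings participants produce in sound-tracing tasks.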

Reference
Nymoen, K., Godøy, R. I., Jensenius, A. R., and Torresen, J. (2013). Analyzing correspondence between sound objects and body motion. ACM Transactions on Applied Perception, 10(2).

BibTeX

@article{Nymoen:2013,
 Author = {Nymoen, Kristian and God{\o}y, Rolf Inge and Jensenius, Alexander Refsum and Torresen, Jim},
 Journal = {ACM Transactions on Applied Perception},
 Number = {2},
 Title = {Analyzing correspondence between sound objects and body motion},
 Volume = {10},
 Year = {2013}}

Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Title
Kinectofon: Performing with shapes in planes

Links

Abstract
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which, again, are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by “touching” a virtual sound wall.
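The inverse-FFT sonification step can be sketched roughly as follows. This is a toy illustration of the general idea, not the actual sonomotiongram implementation: each motiongram column is treated as a magnitude spectrum, given a (here, random) phase, and inverse-transformed into one short frame of audio. All function names and parameter values are assumptions.

```python
import numpy as np

def sonify_column(column, n_fft=64):
    """Treat one motiongram column as a magnitude spectrum and return
    one frame of real-valued audio via inverse FFT."""
    # Resample the column onto the n_fft//2 + 1 bins of a real spectrum.
    mags = np.interp(np.linspace(0, 1, n_fft // 2 + 1),
                     np.linspace(0, 1, len(column)), column)
    # Random phase per bin (a simple choice for spectral sonification).
    rng = np.random.default_rng(0)
    phases = np.exp(1j * rng.uniform(0, 2 * np.pi, mags.shape))
    return np.fft.irfft(mags * phases, n=n_fft)

# One bright region in the motiongram column -> energy at one frequency band.
column = np.zeros(16); column[4] = 1.0
frame = sonify_column(column)
```

Concatenating (or overlap-adding) the frames produced from successive columns yields a continuous sound whose spectral content follows the motion over time: motion high in the image maps to one frequency region, motion low in the image to another.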

Reference
Jensenius, A. R. (2013). Kinectofon: Performing with shapes in planes. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

BibTeX

@inproceedings{Jensenius:2013e,
   Address = {Daejeon, Korea},
   Author = {Jensenius, Alexander Refsum},
   Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
   Pages = {196--197},
   Title = {Kinectofon: Performing with Shapes in Planes},
   Year = {2013}
}
