Tag Archives: motion

New paper: MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction

Yesterday, I presented my microinteraction paper here at the NIME conference (New Interfaces for Musical Expression), organised at Louisiana State University, Baton Rouge, LA. Today I am presenting a poster based on a paper written together with two of my colleagues at UiO.

Title
MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction

Authors
Kristian Nymoen, Mari Romarheim Haugen, Alexander Refsum Jensenius

Abstract
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has the potential of becoming a new “standard” controller in the NIME community.
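
For illustration, here is a minimal Python sketch of the kind of mapping a MuMYO-style instrument might use: armband orientation controls pitch and muscle (EMG) activity controls loudness. It assumes the MYO data is forwarded as OSC messages by a bridge application; the OSC addresses, value ranges and port below are hypothetical, not part of any official Thalmic Labs API, and this is not the actual MuMYO implementation.

# Minimal sketch of a MuMYO-style mapping: orientation -> pitch,
# muscle (EMG) activity -> loudness. Assumes a bridge application
# forwarding MYO data as OSC; addresses and ranges are hypothetical.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

state = {"pitch": 60.0, "loudness": 0.0}

def on_orientation(address, roll, pitch, yaw):
    # Map arm pitch (assumed range -pi/2..pi/2 rad) onto a MIDI-style pitch range.
    state["pitch"] = 48 + (pitch + 1.5708) / 3.1416 * 36

def on_emg(address, *channels):
    # Crude muscle-activation estimate: mean absolute value of the
    # (assumed) eight EMG channels, normalised to 0..1.
    state["loudness"] = min(1.0, sum(abs(c) for c in channels) / (len(channels) * 128))

dispatcher = Dispatcher()
dispatcher.map("/myo/orientation", on_orientation)  # hypothetical address
dispatcher.map("/myo/emg", on_emg)                  # hypothetical address

if __name__ == "__main__":
    # Listen for the bridged MYO stream; a synth would poll `state`.
    BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher).serve_forever()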

Files

BibTeX

@inproceedings{nymoen_mumyo_2015,
    address = {Baton Rouge, LA},
    title = {{MuMYO} - {Evaluating} and {Exploring} the {MYO} {Armband} for {Musical} {Interaction}},
    abstract = {The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband's sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has potential of becoming a new ``standard'' controller in the NIME community.},
    booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    author = {Nymoen, Kristian and Haugen, Mari Romarheim and Jensenius, Alexander Refsum},
    year = {2015}
}

New publication: “To Gesture or Not” (NIME 2014)

This week I am participating at the NIME conference, organised at Goldsmiths, University of London. I am doing some administrative work as chair of the NIME steering committee, and I am also happy to present a paper tomorrow:

Title
To Gesture or Not? An Analysis of Terminology in NIME Proceedings 2001–2013

Links
Paper (PDF)
Presentation (HTML)
Spreadsheet with summary of data (ODS)
OSX shell script used for analysis

Abstract
The term ‘gesture’ has represented a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its usage in other fields of study? This paper presents a linguistic analysis of the motion-related terminology used in all of the papers published in the NIME conference proceedings to date (2001–2013). The results show that ‘gesture’ is in fact used in 62% of all NIME papers, which is a significantly higher percentage than in other music conferences (ICMC and SMC), and much more frequently than it is used in the HCI and biomechanics communities. The results from a collocation analysis support the claim that ‘gesture’ is used broadly in the NIME community, and indicate that it ranges from the description of concrete human motion and system control to quite metaphorical applications.
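
The counting itself was done with the shell script linked above. As a rough illustration of the approach, the Python sketch below computes the share of papers that mention ‘gesture’ at least once, assuming the proceedings PDFs have already been converted to plain text (for example with pdftotext); the directory layout and file naming are hypothetical.

# Rough sketch of the term-frequency count: what share of papers
# mention 'gesture' (or a derived form) at least once? Assumes the
# PDFs have been converted to .txt files in one folder (hypothetical).
import pathlib
import re

TERM = re.compile(r"\bgestur\w*", re.IGNORECASE)  # gesture, gestures, gestural, ...

def share_of_papers_with_term(txt_dir):
    papers = list(pathlib.Path(txt_dir).glob("*.txt"))
    hits = sum(1 for p in papers if TERM.search(p.read_text(errors="ignore")))
    return hits / len(papers) if papers else 0.0

if __name__ == "__main__":
    print(f"{share_of_papers_with_term('proceedings/nime') * 100:.1f}% of papers")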

Reference
Jensenius, A. R. (2014). To gesture or not? An analysis of terminology in NIME proceedings 2001–2013. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 217–220, London.

BibTeX

@inproceedings{Jensenius:2014c,
    Address = {London},
    Author = {Jensenius, Alexander Refsum},
    Booktitle = {Proceedings of the International Conference on New Interfaces For Musical Expression},
    Pages = {217--220},
    Title = {To Gesture or Not? {A}n Analysis of Terminology in {NIME} Proceedings 2001--2013},
    Year = {2014}}

Analyzing correspondence between sound objects and body motion

New publication in ACM Transactions on Applied Perception:

Title 
Analyzing correspondence between sound objects and body motion

Authors
Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen

Abstract
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data has been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
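
To make two of the four methods concrete, here is a small Python sketch of (c) Spearman’s ρ between a single sound feature and a single motion feature, and (d) canonical correlation between the two feature sets, using SciPy and scikit-learn. The random arrays, feature names and shapes are stand-ins for the real feature time series, not data from the study.

# Illustrative sketch of two of the four analysis methods.
import numpy as np
from scipy.stats import spearmanr
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_frames = 500
sound_features = rng.standard_normal((n_frames, 3))   # e.g. loudness, pitch, brightness
motion_features = rng.standard_normal((n_frames, 4))  # e.g. position, velocity, ...

# (c) Spearman's rho: rank correlation between a single feature pair.
rho, p = spearmanr(sound_features[:, 0], motion_features[:, 1])
print(f"Spearman rho = {rho:.3f} (p = {p:.3f})")

# (d) Canonical correlation: best-correlated linear combinations of
# the two multidimensional feature sets.
cca = CCA(n_components=1)
u, v = cca.fit_transform(sound_features, motion_features)
print(f"First canonical correlation = {np.corrcoef(u[:, 0], v[:, 0])[0, 1]:.3f}")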

Reference
Nymoen, K., Godøy, R. I., Jensenius, A. R., and Torresen, J. (2013). Analyzing correspondence between sound objects and body motion. ACM Transactions on Applied Perception, 10(2).

BibTeX

@article{Nymoen:2013,
 Author = {Nymoen, Kristian and God{\o}y, Rolf Inge and Jensenius, Alexander Refsum and Torresen, Jim},
 Journal = {ACM Transactions on Applied Perception},
 Number = {2},
 Title = {Analyzing correspondence between sound objects and body motion},
 Volume = {10},
 Year = {2013}}

New PhD Thesis: Kristian Nymoen

I am happy to announce that fourMs researcher Kristian Nymoen has successfully defended his PhD dissertation, and that the dissertation is now available in the DUO archive. I have had the pleasure of co-supervising Kristian’s project, and also of working closely with him on several of the papers included in the dissertation (and a few others).

Reference

Abstract

There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation is focused on technologies and methods of studying lower-level features of motion, and how people relate motion to sound. Two experiments on so-called sound-tracing, meaning representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies.

In order to determine the quality of the data that has been recorded, these technologies themselves are also a subject of research in this thesis. A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion. They include the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, as well as the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art to low-cost and ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and analysis of music-related motion.

The process of extracting features from motion data is discussed in the thesis, along with motion features used in analysis of sound-tracing experiments, including time-varying features and global features. Features for real-time use are also discussed in relation to the development of a new motion-based musical instrument: the SoundSaber.

Finally, four papers on sound-tracing experiments present results and methods of analysing people’s bodily responses to short sound objects. These papers cover two experiments, presenting various analytical approaches. In the first experiment participants moved a rod in the air to mimic the sound qualities in the motion of the rod. In the second experiment the participants held two handles and a different selection of sound stimuli was used. In both experiments optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches: (1) a pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarity in motion patterns exhibited by participants; (2) Spearman’s ρ correlation was applied to analyse the correlation between individual sound and motion features; (3) canonical correlation analysis was applied in order to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments; (4) traditional statistical tests were applied to compare sound-tracing strategies between a variety of sounds and participants differing in levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses.
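
As a small illustration of the feature-extraction step discussed in the thesis, the Python sketch below computes a time-varying feature (the speed of a marker) and a global feature (total distance travelled) from a position trace. The synthetic 3-D path and the 100 Hz sampling rate are assumptions for illustration only, not the features or data used in the dissertation.

# Sketch of low-level motion features: a time-varying feature (marker
# speed) and a global feature (total distance travelled). The position
# trace and sampling rate are synthetic stand-ins.
import numpy as np

fs = 100.0  # assumed motion-capture sampling rate in Hz
t = np.arange(0, 5, 1 / fs)
positions = np.column_stack([np.sin(t), np.cos(2 * t), 0.1 * t])  # fake marker path (metres)

# Time-varying feature: instantaneous speed via finite differences.
velocity = np.gradient(positions, 1 / fs, axis=0)
speed = np.linalg.norm(velocity, axis=1)

# Global feature: cumulative distance over the whole recording.
total_distance = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))

print(f"mean speed = {speed.mean():.2f} m/s, total distance = {total_distance:.2f} m")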

List of papers

New publication: Some video abstraction techniques for displaying body movement in analysis and performance

Today the MIT Press journal Leonardo has published my paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance”. The paper is a summary of my work on different types of visualisation techniques of music-related body motion. Most of these techniques were developed during my PhD, but have been refined over the course of my post-doc fellowship.

The paper is available from the Leonardo web page (or MUSE), and will also be posted in the digital archive at UiO after the six-month embargo period.

Citation
A. R. Jensenius. Some video abstraction techniques for displaying body movement in analysis and performance. Leonardo, 46(1):53–60, 2013.

Abstract
This paper presents an overview of techniques for creating visual displays of human body movement based on video recordings. First a review of early movement and video visualization techniques is given. Then follows an overview of techniques that the author has developed and used in the study of music-related body movements: motion history images, motion average images, motion history keyframe images and motiongrams. Finally, examples are given of how such visualization techniques have been used in empirical music research, in medical research and for creative applications.
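
To give a flavour of how such displays can be computed, here is a minimal Python/OpenCV sketch of two of the techniques: a motion average image (the mean of all frame-difference images) and a horizontal motiongram (each frame-difference image collapsed to a single column and stacked over time). The input file name is hypothetical, and this is only a rough approximation of the methods described in the paper (no noise filtering or thresholding of the motion images).

# Rough sketch of a motion average image and a horizontal motiongram.
import cv2
import numpy as np

def motion_images(video_path):
    cap = cv2.VideoCapture(video_path)
    prev, diffs, columns = None, [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if prev is not None:
            diff = np.abs(gray - prev)           # motion image (frame difference)
            diffs.append(diff)
            columns.append(diff.mean(axis=1))    # average each row -> one column per frame
        prev = gray
    cap.release()
    motion_average = np.mean(diffs, axis=0)      # motion average image
    motiongram = np.stack(columns, axis=1)       # height x time
    return motion_average, motiongram

if __name__ == "__main__":
    avg, gram = motion_images("dance.mp4")  # hypothetical input file
    cv2.imwrite("motion_average.png",
                cv2.normalize(avg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8))
    cv2.imwrite("motiongram.png",
                cv2.normalize(gram, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8))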

BibTeX

@article{Jensenius:2013,
   Author = {Jensenius, Alexander Refsum},
   Journal = {Leonardo},
   Number = {1},
   Pages = {53--60},
   Title = {Some video abstraction techniques for displaying body movement in analysis and performance},
   Volume = {46},
   Year = {2013}}