Tag Archives: motion capture

New MOOC: Music Moves

Together with several colleagues, and with great practical and financial support from the University of Oslo, I am happy to announce that we will soon kick off our first free online course (a so-called MOOC) called Music Moves.

Music Moves: Why Does Music Make You Move?

Learn about the psychology of music and movement, and how researchers study music-related movements, with this free online course.

Go to course – starts 1 Feb

About the course

Music is movement. A bold statement, but one that we will explore in this free online course. Together we will study music through different types of body movement, from the sound-producing keyboard actions of a pianist to the energetic dance moves in a club.

You will learn about the theoretical foundations for what we call embodied music cognition and why body movement is crucial for how we experience the emotional moods in music. We will also explore different research methods used at universities and conservatories. These include advanced motion capture systems and sound analysis methods.

You will be guided by a group of music researchers from the University of Oslo, with musical examples from four professional musicians. The course is rich in high-quality text, images, video, audio and interactive elements.

Join us to learn more about terms such as entrainment and musical metaphors, and why it is difficult to sit still when you experience a good groove.

  • FREE online course
  • 3 hours per week
  • Certificates available

Educators

Alexander Refsum Jensenius

Diana Kayser (Mentor)

Hans T. Zeiner-Henriksen

Kristian Nymoen

Requirements

This course is open to everyone. No technical knowledge of music or dance is required.

Get a personalised, digital and printed certificate

You can buy a Statement of Participation for this course — a personalised certificate in both digital and printed formats — to show that you’ve taken part.

Join the conversation on social media

Use the hashtag #FLmusicmoves to join and contribute to social media conversations about this course.

Go to course – starts 1 Feb

New publication: “How still is still? Exploring human standstill for artistic applications”

I am happy to announce a new publication titled How still is still? Exploring human standstill for artistic applications (PDF of preprint), published in the International Journal of Arts and Technology. The paper is based on the Sverm project, and was written and accepted two years ago. Academic publishing can take absurdly long, as this paper shows, but I am happy that the publication is finally out in the wild.

Abstract

We present the results of a series of observation studies of ourselves standing still on the floor for 10 minutes at a time. The aim has been to understand more about our own standstill, and to develop a heightened sensitivity for micromovements and how they can be used in music and dance performance. The quantity of motion, calculated from motion capture data of a head marker, reveals remarkably similar results for each person, and also between persons. The best results were obtained with the feet at the width of the shoulders, locked knees, and eyes open. No correlation was found between different types of mental strategies employed and the quantity of motion of the head marker, but we still believe that different mental strategies have an important subjective and communicative impact. The findings will be used in the development of a stage performance focused on micromovements.
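For readers curious about the measure itself, here is a minimal sketch (my own illustration, not the code used in the study) of how a quantity of motion can be computed from the 3D position data of a single head marker. The function name and the 100 Hz sampling rate are assumptions for the example:

import numpy as np

def quantity_of_motion(positions, fs=100.0):
    # positions: (n_frames, 3) array of x, y, z marker coordinates in metres
    # fs: sampling rate in Hz (assumed here; not specified in the abstract)
    # Returns the cumulative Euclidean displacement per second.
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    duration = (len(positions) - 1) / fs
    return displacements.sum() / duration

# Example: synthetic 10 minutes of near-standstill data at 100 Hz
rng = np.random.default_rng(0)
head = np.cumsum(rng.normal(0, 1e-5, size=(60_000, 3)), axis=0)
print(f"QoM: {quantity_of_motion(head):.6f} m/s")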

Reference

Jensenius, A. R., Bjerkestrand, K. A. V., and Johnson, V. (2014). How still is still? Exploring human standstill for artistic applications. International Journal of Arts and Technology, 7(2/3):207–222.

BibTeX

@article{Jensenius:2014a,
    Author = {Jensenius, Alexander Refsum and Bjerkestrand, Kari Anne Vadstensvik and Johnson, Victoria},
    Journal = {International Journal of Arts and Technology},
    Number = {2/3},
    Pages = {207--222},
    Title = {How Still is Still? Exploring Human Standstill for Artistic Applications},
    Volume = {7},
    Year = {2014}}

Analyzing correspondence between sound objects and body motion

New publication in ACM Transactions on Applied Perception:

Title
Analyzing correspondence between sound objects and body motion

Authors
Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen

Abstract
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data has been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
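To illustrate two of the four methods, here is a hedged sketch using synthetic stand-in data (not the study’s data or code): Spearman’s ρ between one sound feature and one motion feature, and a canonical correlation between the full feature sets, using SciPy and scikit-learn. The feature names in the comments are assumptions for the example:

import numpy as np
from scipy.stats import spearmanr
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n = 200  # analysis frames

# Synthetic stand-ins for time-varying features
sound = rng.normal(size=(n, 3))  # e.g. loudness, brightness, pitch
motion = sound @ rng.normal(size=(3, 4)) + rng.normal(scale=0.5, size=(n, 4))

# (c) Spearman's rho between a single sound and a single motion feature
rho, p = spearmanr(sound[:, 0], motion[:, 0])
print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")

# (d) Canonical correlation between the combined feature sets
cca = CCA(n_components=1)
u, v = cca.fit_transform(sound, motion)
print(f"First canonical correlation = {np.corrcoef(u[:, 0], v[:, 0])[0, 1]:.2f}")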

Reference
Nymoen, K., Godøy, R. I., Jensenius, A. R., and Torresen, J. (2013). Analyzing correspondence between sound objects and body motion. ACM Transactions on Applied Perception, 10(2).

BibTeX

@article{Nymoen:2013,
 Author = {Nymoen, Kristian and God{\o}y, Rolf Inge and Jensenius, Alexander Refsum and Torresen, Jim},
 Journal = {ACM Transactions on Applied Perception},
 Number = {2},
 Title = {Analyzing correspondence between sound objects and body motion},
 Volume = {10},
 Year = {2013}}

New PhD Thesis: Kristian Nymoen

I am happy to announce that fourMs researcher Kristian Nymoen has successfully defended his PhD dissertation, and that the dissertation is now available in the DUO archive. I have had the pleasure of co-supervising Kristian’s project and of working closely with him on several of the papers included in the dissertation (and a few others).

Abstract

There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation is focused on technologies and methods of studying lower-level features of motion, and how people relate motion to sound. Two experiments on so-called sound-tracing, meaning representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies. In order to determine the quality of the data that has been recorded, these technologies themselves are also a subject of research in this thesis.

A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion. They include the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, as well as the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art to low-cost and ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and analysis of music-related motion.

The process of extracting features from motion data is discussed in the thesis, along with motion features used in analysis of sound-tracing experiments, including time-varying features and global features. Features for real-time use are also discussed in relation to the development of a new motion-based musical instrument: The SoundSaber.

Finally, four papers on sound-tracing experiments present results and methods of analysing people’s bodily responses to short sound objects. These papers cover two experiments, presenting various analytical approaches. In the first experiment, participants moved a rod in the air to mimic the qualities of the sound in the motion of the rod. In the second experiment, the participants held two handles and a different selection of sound stimuli was used. In both experiments optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches. (1) A pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarity in motion patterns exhibited by participants. (2) Spearman’s ρ correlation was applied to analyse the correlation between individual sound and motion features. (3) Canonical correlation analysis was applied in order to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments. (4) Traditional statistical tests were applied to compare sound-tracing strategies between a variety of sounds and participants differing in levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses.
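The synchronisation problem the toolbox addresses — aligning streams with different data types and sampling rates — can be illustrated with a small sketch. This is my own simplification, not Nymoen’s toolbox; the rates and stream contents are assumptions, and the alignment is done by linear interpolation onto a shared timeline:

import numpy as np

def resample_to(timestamps, values, target_times):
    # Linearly interpolate a recorded stream onto a common timeline.
    # timestamps: (n,) recording times in seconds
    # values: (n, d) recorded data (e.g. marker positions, accelerations)
    # target_times: (m,) shared timeline to resample onto
    return np.column_stack([
        np.interp(target_times, timestamps, values[:, i])
        for i in range(values.shape[1])
    ])

# Two streams: mocap at 100 Hz, inertial sensor at 60 Hz, over 5 seconds
t_mocap = np.arange(0, 5, 1 / 100)
t_accel = np.arange(0, 5, 1 / 60)
mocap = np.sin(t_mocap)[:, None].repeat(3, axis=1)  # fake 3D positions
accel = np.cos(t_accel)[:, None]                    # fake 1D acceleration

common = np.arange(0, 5, 1 / 100)  # shared 100 Hz timeline
aligned = np.hstack([resample_to(t_mocap, mocap, common),
                     resample_to(t_accel, accel, common)])
print(aligned.shape)  # (500, 4): one synchronised multi-system recording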

New Master Thesis 2: Music Kinection: Musical Sound and Motion in Interactive Systems

Yet another of my master students has graduated recently, and here is a link to his thesis:

Even has carried out a so-called “practical” master thesis, meaning one with an applied focus. He has conducted a motion capture analysis of how people move while playing computer games with a Kinect device, and has also prototyped several mocap instruments.

Abstract:

Sound is often used as a feedback modality in technological devices. Yet relatively little is known about the relation between sound and motion in interactive systems. This thesis examines what happens in the intersection between human-computer interaction, motion and sonic feedback. From the connection of music and motion, coupled by technology, we can draw the expression “Music Kinection”. A theoretical foundation accounts for the relationships that exist between sound and motion, and the cognitive foundations for these relationships. This study of literature on music and motion, and music cognition theory, shows that there are many aspects that support various relationships between sound and motion. To see if it is possible to detect similarities between users of an interactive system, a user study was performed with 16 subjects playing commercially available video games for the Kinect platform. Motion capture data was recorded and analyzed. The user study showed that there is an overall similarity in the amount of motion performed by the users, but some deviation in the amount of motion performed by body parts important to the gameplay. Many users will choose the same body part for one task, but will apply different tactics when using this limb. Knowledge from the theory and the observation study was used in practical explorations of sound-action relationships. Two installations, Kinect Piano and the Popsenteret Kinect installation, were made, together with two software prototypes, Soundshape and Music Kinection. The practical study showed that working with full-body motion capture and sound in human-computer interaction depends on good motion feature extraction algorithms and good mapping to sound engines.
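The abstract’s point about per-body-part differences suggests a simple analysis: computing the amount of motion separately for each tracked joint. Here is a hedged sketch, assuming Kinect-style skeleton data as a (frames, joints, 3) array; the joint list, function name and 30 Hz rate are illustrative, not from the thesis:

import numpy as np

JOINTS = ["head", "left_hand", "right_hand", "left_foot", "right_foot"]

def per_joint_qom(skeleton, fs=30.0):
    # skeleton: (n_frames, n_joints, 3) array of joint positions in metres
    # fs: sampling rate in Hz (a typical Kinect rate, assumed here)
    # Returns displacement per second for each joint, so joints that are
    # important to the gameplay can be compared across players.
    steps = np.linalg.norm(np.diff(skeleton, axis=0), axis=2)  # (n-1, n_joints)
    return steps.sum(axis=0) * fs / (len(skeleton) - 1)

# Example with synthetic data: 30 seconds of random drift at 30 Hz
rng = np.random.default_rng(2)
data = np.cumsum(rng.normal(0, 0.01, size=(900, len(JOINTS), 3)), axis=0)
for name, q in zip(JOINTS, per_joint_qom(data)):
    print(f"{name:>10s}: {q:.3f} m/s")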