Visualising a Bach prelude played on Boomwhackers

I came across a fantastic performance of a Bach prelude played on Boomwhackers by Les Objets Volants.

It is really incredible how they manage to coordinate the sticks and make it into a beautiful performance. Given my interest in the visual aspects of music performance, I reached for the Musical Gestures Toolbox to create some video visualisations.

I started with creating an average image of the video:

Average image of the video.
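The Musical Gestures Toolbox does this in a single call, but the underlying idea is simple: accumulate all frames in floating point and divide by the frame count. Here is a minimal NumPy sketch of that idea (the function name and interface are my own, not the toolbox's API):

```python
import numpy as np

def average_image(frames):
    """Average a sequence of frames (uint8 arrays of equal shape).
    Accumulate in float64 to avoid uint8 overflow, then convert back."""
    acc = None
    count = 0
    for frame in frames:
        f = frame.astype(np.float64)
        acc = f if acc is None else acc + f
        count += 1
    return (acc / count).astype(np.uint8)
```

In practice the frames would come from a video reader such as OpenCV's `cv2.VideoCapture`; the toolbox wraps the reading and averaging for you.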

This image is not particularly interesting. The performers moved around quite a bit, so the average image mainly shows the stage. An alternative spatial summary is a keyframe history image, created by extracting the keyframes of the video (approximately 50 frames) and combining them into one image:


Keyframe history image.
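The combination step can be sketched in a few lines of NumPy. The exact blend the toolbox uses may differ (averaging is another option); here I use a per-pixel maximum ("lighten" blend) so that activity anywhere on stage shows up in the summary image. The function name is my own, for illustration:

```python
import numpy as np

def blend_keyframes(keyframes):
    """Combine a list of keyframes (arrays of equal shape) into one
    summary image using a per-pixel maximum ("lighten" blend)."""
    combined = None
    for frame in keyframes:
        combined = frame.copy() if combined is None else np.maximum(combined, frame)
    return combined
```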

The keyframe history image summarizes how the performers moved around on stage and shows the spatial distribution of activity over time. But to get at the temporal distribution of motion, we need a spatiotemporal visualization. This is where motiongrams are useful:

Motiongram of vertical motion (time from left to right)
Motiongram of horizontal motion (time from top to bottom)
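A motiongram is built from the "motion image": the absolute difference between consecutive frames. Each motion image is then collapsed to a single column (for vertical motion, averaging across the horizontal axis) or a single row (for horizontal motion), and the columns or rows are stacked over time. A plain NumPy sketch of this idea, with my own function name and defaults rather than the toolbox's API:

```python
import numpy as np

def motiongram(frames, axis="vertical"):
    """Compute a motiongram from a sequence of frames.
    Vertical: each frame difference becomes one column, stacked left to right.
    Horizontal: each frame difference becomes one row, stacked top to bottom."""
    slices = []
    prev = None
    for frame in frames:
        gray = frame.mean(axis=2) if frame.ndim == 3 else frame
        gray = gray.astype(np.float64)
        if prev is not None:
            motion = np.abs(gray - prev)  # motion image: abs frame difference
            if axis == "vertical":
                slices.append(motion.mean(axis=1))  # collapse to one column
            else:
                slices.append(motion.mean(axis=0))  # collapse to one row
        prev = gray
    if axis == "vertical":
        return np.stack(slices, axis=1)  # time runs left to right
    return np.stack(slices, axis=0)      # time runs top to bottom
```

In a real pipeline one would also threshold the motion image to suppress video noise before collapsing it, which the toolbox supports.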

If you click on the images above, you can zoom in to look at the visual beauty of the performance.

New run of Music Moves

I am happy to announce a new run (the 6th) of our free online course Music Moves: Why Does Music Make You Move?. Here is a 1-minute welcome that I recorded for Twitter:

The course starts on Monday (25 January 2021) and will run for six weeks. You will learn about the psychology of music and movement, and about how researchers study music-related movements.

We developed the course five years ago, but the content is still relevant. I also try to keep it up to date by recording new weekly wrap-ups with interviews with researchers here at UiO.

I highly recommend joining the course on FutureLearn, since that is the only way to get all the content: videos, articles, quizzes, and, most importantly, the dialogue with other learners. But if you are only interested in watching the videos, they are all available on this UiO page and this YouTube playlist.

New publication: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music

After several years of hard work, we are very happy to announce a new publication coming out of the MICRO project that I am leading: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music (Frontiers in Psychology).

The setup of the experiment in which we tested the effects of listening through headphones and speakers.

This is the first journal article of my PhD student Agata Zelechowska, and it reports on a standstill study conducted a couple of years ago. It is slightly different from the paradigm we have used for the Championships of Standstill. While the latter is based on single head markers on multiple people, Agata's experiment was conducted with full-body motion capture of individuals.

The most exciting thing about this new study is that we investigated whether there are any differences in people's micromotion when they listen through either headphones or speakers. Is there a difference? Yes, there is! People move (a little) more when listening through headphones.

Want to know more? The article is Open Access, so you can read the whole thing here. Here is a short summary:

Previous studies have shown that music may lead to spontaneous body movement, even when people try to stand still. But are spontaneous movement responses to music similar if the stimuli are presented using headphones or speakers? This article presents results from an exploratory study in which 35 participants listened to rhythmic stimuli while standing in a neutral position. The six different stimuli were 45 s each and ranged from a simple pulse to excerpts from electronic dance music (EDM). Each participant listened to all the stimuli using both headphones and speakers. An optical motion capture system was used to calculate their quantity of motion, and a set of questionnaires collected data about music preferences, listening habits, and the experimental sessions. The results show that the participants on average moved more when listening through headphones. The headphones condition was also reported as being more tiresome by the participants. Correlations between participants’ demographics, listening habits, and self-reported body motion were observed in both listening conditions. We conclude that the playback method impacts the level of body motion observed when people are listening to music. This should be taken into account when designing embodied music cognition studies.
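The "quantity of motion" mentioned in the summary is, in its basic form, the summed displacement of all tracked markers between consecutive frames. The paper's exact definition may include additional normalization, so treat this NumPy sketch as a generic illustration of the idea rather than the study's implementation:

```python
import numpy as np

def quantity_of_motion(positions):
    """positions: array of shape (frames, markers, 3) with marker
    coordinates from optical motion capture. Returns, for each
    frame-to-frame transition, the summed Euclidean displacement
    of all markers."""
    step = np.diff(positions, axis=0)    # (frames-1, markers, 3)
    dist = np.linalg.norm(step, axis=2)  # per-marker displacement
    return dist.sum(axis=1)              # (frames-1,)
```

Averaging this series over a stimulus gives a single per-trial motion measure that can be compared across listening conditions.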

Method chapter freely available

I am a big supporter of Open Access publishing, but for various reasons some of my publications are not openly available by default. This is the case for the chapter Methods for Studying Music-Related Body Motion that I have contributed to the Springer Handbook of Systematic Musicology.

I am very happy to announce that the embargo on the book ran out today, which means that a pre-print version of my chapter is finally freely available in UiO’s digital repository. This chapter is a summary of my experiences with music-related motion analysis, and I often recommend it to students. Therefore it is great that it is finally available to download from everywhere.

Abstract

This chapter presents an overview of some methodological approaches and technologies that can be used in the study of music-related body motion. The aim is not to cover all possible approaches, but rather to highlight some of the ones that are more relevant from a musicological point of view. This includes methods for video-based and sensor-based motion analyses, both qualitative and quantitative. It also includes discussions of the strengths and weaknesses of the different methods, and reflections on how the methods can be used in connection to other data in question, such as physiological or neurological data, symbolic notation, sound recordings and contextual data.

NIME publication and performance: Vrengt

My PhD student Cagri Erdem developed a performance together with dancer Katja Henriksen Schia. The piece was first performed together with Qichao Lan and myself during the RITMO opening and also during MusicLab vol. 3. See here for a teaser of the performance:

This week Cagri, Katja and I performed a version of the piece Vrengt at NIME in Porto Alegre.

We also presented a paper describing the development of the instrument/piece:

Erdem, Cagri, Katja Henriksen Schia, and Alexander Refsum Jensenius. “Vrengt: A Shared Body-Machine Instrument for Music-Dance Performance.” In Proceedings of the International Conference on New Interfaces for Musical Expression. Porto Alegre, 2019.

Abstract:

This paper describes the process of developing a shared instrument for music–dance performance, with a particular focus on exploring the boundaries between standstill vs motion, and silence vs sound. The piece Vrengt grew from the idea of enabling a true partnership between a musician and a dancer, developing an instrument that would allow for active co-performance. Using a participatory design approach, we worked with sonification as a tool for systematically exploring the dancer’s bodily expressions. The exploration used a “spatiotemporal matrix,” with a particular focus on sonic microinteraction. In the final performance, two Myo armbands were used for capturing muscle activity of the arm and leg of the dancer, together with a wireless headset microphone capturing the sound of breathing. In the paper we reflect on multi-user instrument paradigms, discuss our approach to creating a shared instrument using sonification as a tool for the sound design, and reflect on the performers’ subjective evaluation of the instrument.