Strings On-Line installation

We presented the installation Strings On-Line at NIME 2020. It was originally planned as a physical installation at the conference, which was to be held in Birmingham, UK.

Due to the corona crisis, the conference went online, and we decided to redesign the proposed physical installation into an online one instead. The installation ran continuously from 21 to 25 July last year, and hundreds of people “came by” to interact with it.

I finally got around to editing a short (1-minute) video promo of the installation:

I have also made a short (10-minute) “behind the scenes” mini-documentary about the installation. In it, researchers from RITMO at the University of Oslo talk about the setup, which features 6 self-playing guitars, 3 remote-controlled robots, and a 24/7 high-quality, low-latency audiovisual stream.

We are planning a new installation for the RPPW conference this year. So if you are interested in exploring such an online installation live, please stay tuned.

What is a musical instrument?

A piano is an instrument. So is a violin. But what about the voice? Or a fork? Or a mobile phone? So what is (really) a musical instrument? That was the title of a short lecture I gave at UiO’s Open Day today.

The 15-minute lecture is a very quick tour of some of the concepts I have been working on for a new book project. In it, I present a model for understanding what a musical instrument is and how new technology changes how we make and experience music.

The original lecture was in Norwegian, but I got inspired and recorded an English version right afterwards:

If you prefer the original Norwegian version, here it is:

And, if you do want to learn more about these things, you can apply for one of our study programmes before 15 April: bachelor or master of musicology, or master of music, communication and technology.

Visualising a Bach prelude played on Boomwhackers

I came across a fantastic performance of a Bach prelude played on Boomwhackers by Les Objets Volants.

It is really incredible how they manage to coordinate the tubes and turn it into a beautiful performance. Given my interest in the visual aspects of music performance, I reached for the Musical Gestures Toolbox to create some video visualisations.

I started by creating an average image of the video:

Average image of the video.
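For those curious about how such an image can be computed outside the toolbox, here is a minimal sketch using OpenCV and NumPy. This is not the toolbox’s own code, and the filename is a placeholder:

```python
import cv2
import numpy as np

# Read every frame, accumulate a running sum, and divide by the frame
# count. The filename is a placeholder for the performance video.
cap = cv2.VideoCapture("boomwhacker_performance.mp4")
acc, count = None, 0
while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = frame.astype(np.float64)
    acc = frame if acc is None else acc + frame
    count += 1
cap.release()

average = (acc / count).astype(np.uint8)
cv2.imwrite("average_image.png", average)
```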

The average image is not particularly interesting here. The performers moved around quite a bit, so it mainly shows the stage. An alternative spatial summary is a keyframe history image, created by extracting the keyframes of the video (approximately 50 frames) and combining them into one image:

Keyframe history image.
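A rough approximation of this can be sketched by sampling a fixed number of frames and blending them. The toolbox combines actual keyframes; even sampling is a simplification here, and the filename is again a placeholder:

```python
import cv2
import numpy as np

# Approximate a keyframe history image by sampling about 50 evenly
# spaced frames and averaging them into a single picture.
cap = cv2.VideoCapture("boomwhacker_performance.mp4")
total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
frames = []
for i in np.linspace(0, total - 1, num=50, dtype=int):
    cap.set(cv2.CAP_PROP_POS_FRAMES, int(i))
    ret, frame = cap.read()
    if ret:
        frames.append(frame.astype(np.float64))
cap.release()

history = np.mean(frames, axis=0).astype(np.uint8)
cv2.imwrite("keyframe_history.png", history)
```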

The keyframe history image summarises how the performers moved around on stage, showing the spatial distribution of activity over time. To get at the temporal distribution of motion, however, we need a spatiotemporal visualisation. This is where motiongrams are useful:

Motiongram of vertical motion (time from left to right)
Motiongram of horizontal motion (time from top to bottom)

If you click on the images above, you can zoom in to look at the visual beauty of the performance.
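For the technically inclined, here is a minimal sketch of how a motiongram can be computed: each frame is differenced against the previous one, and the resulting motion image is collapsed to a single column or row that is stacked over time. Again, this is not the toolbox’s code, and the filename and threshold are placeholders:

```python
import cv2
import numpy as np

# Frame-difference consecutive frames to get a motion image, then
# collapse it to a column (mean over each pixel row) for the vertical
# motiongram and to a row (mean over each pixel column) for the
# horizontal one.
cap = cv2.VideoCapture("boomwhacker_performance.mp4")
prev, columns, rows = None, [], []
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
    if prev is not None:
        motion = np.abs(gray - prev)         # motion image
        motion[motion < 10] = 0              # simple noise threshold
        columns.append(motion.mean(axis=1))  # vertical distribution
        rows.append(motion.mean(axis=0))     # horizontal distribution
    prev = gray
cap.release()

vertical = np.stack(columns, axis=1)   # time runs left to right
horizontal = np.stack(rows, axis=0)    # time runs top to bottom
for name, img in [("motiongram_vertical.png", vertical),
                  ("motiongram_horizontal.png", horizontal)]:
    cv2.imwrite(name, (255 * img / img.max()).astype(np.uint8))
```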

New run of Music Moves

I am happy to announce a new run (the 6th) of our free online course Music Moves: Why Does Music Make You Move? Here is a 1-minute welcome video that I recorded for Twitter:

The course starts on Monday (25 January 2021) and will run for six weeks. In the course, you will learn about the psychology of music and movement, and how researchers study music-related movements.

We developed the course 5 years ago, but the content is still valid. I also try to keep it up to date by recording new weekly wrap-ups featuring interviews with researchers here at UiO.

I highly recommend joining the course on FutureLearn, which is the only way to get all the content, including videos, articles, quizzes, and, most importantly, the dialogue with other learners. But if you are only interested in watching the videos, all of them are available on this UiO page and this YouTube playlist.

New publication: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music

After several years of hard work, we are very happy to announce a new publication coming out of the MICRO project that I am leading: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music (Frontiers in Psychology).

From the setup of the experiment in which we tested the effects of listening through headphones and speakers.

This is the first journal article by my PhD student Agata Zelechowska, and it reports on a standstill study conducted a couple of years ago. It is slightly different from the paradigm we have used for the Championships of Standstill. While the latter is based on single markers on the heads of multiple people, Agata’s experiment was conducted with full-body motion capture of individuals.

The most exciting thing about this new study is that we have investigated whether there are any differences in people’s micromotion when they listen through headphones or speakers. Is there a difference? Yes, there is! People move (a little) more when listening through headphones.

Want to know more? The article is Open Access, so you can read the whole thing here. The short summary is here:

Previous studies have shown that music may lead to spontaneous body movement, even when people try to stand still. But are spontaneous movement responses to music similar if the stimuli are presented using headphones or speakers? This article presents results from an exploratory study in which 35 participants listened to rhythmic stimuli while standing in a neutral position. The six different stimuli were 45 s each and ranged from a simple pulse to excerpts from electronic dance music (EDM). Each participant listened to all the stimuli using both headphones and speakers. An optical motion capture system was used to calculate their quantity of motion, and a set of questionnaires collected data about music preferences, listening habits, and the experimental sessions. The results show that the participants on average moved more when listening through headphones. The headphones condition was also reported as being more tiresome by the participants. Correlations between participants’ demographics, listening habits, and self-reported body motion were observed in both listening conditions. We conclude that the playback method impacts the level of body motion observed when people are listening to music. This should be taken into account when designing embodied music cognition studies.
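The abstract mentions that an optical motion capture system was used to calculate the participants’ quantity of motion. For readers curious about what such a measure can look like, here is a minimal sketch of one common way to compute it: summing the Euclidean displacement of all markers between consecutive frames. The paper’s exact computation may differ, and all numbers in the example are made up:

```python
import numpy as np

def quantity_of_motion(positions):
    """Per-frame quantity of motion from full-body motion capture.

    positions: array of shape (frames, markers, 3) with 3D marker
    coordinates. Returns the summed Euclidean displacement of all
    markers between consecutive frames.
    """
    displacement = np.linalg.norm(np.diff(positions, axis=0), axis=2)
    return displacement.sum(axis=1)

# Illustrative call with random data standing in for a real recording
# (here: 45 s at 200 Hz with 21 markers -- all values are assumptions).
rng = np.random.default_rng(0)
positions = np.cumsum(rng.normal(scale=0.1, size=(9000, 21, 3)), axis=0)
qom = quantity_of_motion(positions)
print(qom.mean())
```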