Lecture-performance setup

I have not been very good at blogging recently, primarily because I have been so busy starting up both RITMO and MCT. As things are calming down a bit now, I am also trying to do some digital cleaning up: archiving files, organizing photos, and so on.

As part of the cleanup, I came across this picture of my setup for a lecture-performance held at the humanities library earlier this fall. It consists of a number of sound makers: various acoustic ones and some electronic. Note that I am not using a computer, and there was no projector, so the entire thing is based on talking and playing. It feels very “unplugged” and gives me (and hopefully the audience) a sense of performing more than lecturing.

I have been using a similar setup in several lectures over the past year, testing out some ideas that are part of a book project that I am working on. The short story is that I am trying to create a coherent theoretical model for both acoustic and electronic instruments. More on that later!

Musical Gestures Toolbox for Matlab

Yesterday I presented the Musical Gestures Toolbox for Matlab in the late-breaking demo session at the ISMIR conference in Paris.

The Musical Gestures Toolbox for Matlab (MGT) aims to help music researchers import, preprocess, analyze, and visualize video, audio, and motion capture data in a coherent manner within Matlab.

Most of the concepts in the toolbox are based on the Musical Gestures Toolbox that I first developed for Max more than a decade ago. Much of the Matlab coding for the new version was done as part of Bo Zhou’s master’s thesis.

The new MGT is available on GitHub, and there is a more or less complete introduction to the main features in the Software Carpentry workshop Quantitative Video analysis for Qualitative Research.
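
To give a flavor of the kind of video analysis the toolbox supports, here is a minimal Python/OpenCV sketch (not the toolbox’s own Matlab API) that estimates a simple quantity-of-motion curve from a video by frame differencing. The file name and threshold are placeholders.

# Minimal sketch (Python + OpenCV/NumPy), not the MGT API:
# estimate a quantity-of-motion (QoM) curve from a video by frame differencing.
import cv2
import numpy as np

def quantity_of_motion(video_path, threshold=10):
    """Return one QoM value per frame: the number of pixels that changed."""
    cap = cv2.VideoCapture(video_path)
    qom = []
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            diff = cv2.absdiff(gray, prev)           # motion image
            motion = (diff > threshold).astype(np.uint8)
            qom.append(int(motion.sum()))            # crude QoM estimate
        prev = gray
    cap.release()
    return qom

# Example usage (file name is a placeholder):
# qom = quantity_of_motion("dance_recording.mp4")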

Deciding on author names in publications

Publications are important for researchers. Deciding who should be named as an author on an academic publication is therefore a topic that often leads to discussion. The ordering of the author names can also spark heated debate, particularly when you work in interdisciplinary teams with different traditions, as can be seen in the PhD Comics strip below.

Here is a task I have developed as a point of departure for discussing this issue in research groups. We have used it successfully at RITMO, and hopefully others can make use of it too.

Publication case

Consider the following scenario:

  • Professor Pia secures funding for a large project with a brilliant overarching research idea.
  • Professor Per leads a sub-project focusing on an empirical investigation of the brilliant research idea. He hires PhD student Siri and Postdoc Palle to work on the experiment.
  • PhD student Siri and Postdoc Palle design and carry out the experiment.
  • Administrator Anton helps with recruiting all the participants.
  • PhD student Sofie provides all the sound material used in the study, and a preliminary analysis of the sound.
  • Research assistant Anders helps with all the recordings for the experiment, including post-processing all the data.
  • Lab engineer Erik programs the system used for data collection.
  • Statistician Svein helps with the analysis of the data.
  • A large part of the analysis is done using a toolbox made by Postdoc Penelope.
  • Professor Pernille suggests an alternative analysis method during a seminar where preliminary results are presented. This alternative method turns out to be very promising and is therefore included in the paper.
  • PhD student Siri writes the main part of the paper.
  • Postdoc Palle makes all the figures and writes some of the text.
  • Professor Per reads the paper and comments on a few things.

Question:

Who gets on the publication list, and in which order?

New article: “Correspondences Between Music and Involuntary Human Micromotion During Standstill”

I am happy to announce a new journal article coming out of the MICRO project:

Victor E. Gonzalez-Sanchez, Agata Zelechowska and Alexander Refsum Jensenius
Correspondences Between Music and Involuntary Human Micromotion During Standstill
Front. Psychol., 07 August 2018 | https://doi.org/10.3389/fpsyg.2018.01382

Abstract: The relationships between human body motion and music have been the focus of several studies characterizing the correspondence between voluntary motion and various sound features. The study of involuntary movement to music, however, is still scarce. Insight into crucial aspects of music cognition, as well as characterization of the vestibular and sensorimotor systems could be largely improved through a description of the underlying links between music and involuntary movement. This study presents an analysis aimed at quantifying involuntary body motion of a small magnitude (micromotion) during standstill, as well as assessing the correspondences between such micromotion and different sound features of the musical stimuli: pulse clarity, amplitude, and spectral centroid. A total of 71 participants were asked to stand as still as possible for 6 min while being presented with alternating silence and music stimuli: Electronic Dance Music (EDM), Classical Indian music, and Norwegian fiddle music (Telespringar). The motion of each participant’s head was captured with a marker-based, infrared optical system. Differences in instantaneous position data were computed for each participant and the resulting time series were analyzed through cross-correlation to evaluate the delay between motion and musical features. The mean quantity of motion (QoM) was found to be highest across participants during the EDM condition. This musical genre is based on a clear pulse and rhythmic pattern, and it was also shown that pulse clarity was the metric that had the most significant effect in induced vertical motion across conditions. Correspondences were also found between motion and both brightness and loudness, providing some evidence of anticipation and reaction to the music. Overall, the proposed analysis techniques provide quantitative data and metrics on the correspondences between micromotion and music, with the EDM stimulus producing the clearest music-induced motion patterns. The analysis and results from this study are compatible with embodied music cognition and sensorimotor synchronization theories, and provide further evidence of the movement inducing effects of groove-related music features and human response to sound stimuli. Further work with larger data sets, and a wider range of stimuli, is necessary to produce conclusive findings on the subject.
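
For readers curious about the kind of analysis described in the abstract, here is a minimal Python sketch (not the paper’s actual analysis pipeline) of cross-correlating a micromotion time series with a sound-feature time series to estimate the lag between them. The sampling rate and variable names are assumptions, and both series are assumed to be equally long and sampled at the same rate.

# Minimal sketch (Python + NumPy/SciPy), not the paper's actual pipeline:
# cross-correlate a motion time series (e.g. differenced head position)
# with a sound-feature time series (e.g. pulse clarity) to estimate a lag.
import numpy as np
from scipy.signal import correlate, correlation_lags

def estimate_lag(motion, sound_feature, sr):
    """Return the lag (in seconds) at which the two series correlate most."""
    m = (motion - np.mean(motion)) / np.std(motion)
    s = (sound_feature - np.mean(sound_feature)) / np.std(sound_feature)
    xcorr = correlate(m, s, mode="full")
    lags = correlation_lags(len(m), len(s), mode="full")
    return lags[np.argmax(xcorr)] / sr

# Example usage with hypothetical series sampled at 100 Hz:
# lag = estimate_lag(qom_series, pulse_clarity_series, sr=100)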

Nordic Sound and Music Computing Network up and running

I am super excited about our new Nordic Sound and Music Computing Network, which has just started up with funding from the Nordic Research Council.

This network brings together a group of internationally leading sound and music computing researchers from institutions in five Nordic countries: Aalborg University, Aalto University, KTH Royal Institute of Technology, University of Iceland, and University of Oslo. The network covers the field of sound and music from the “soft” to the “hard”, spanning the arts and humanities, the social and natural sciences, and engineering, and involves a high level of technological competency.

At the University of Oslo we have one open PhD fellowship connected to the network, with application deadline 4 April 2018. We invite PhD proposals that focus on sound/music interaction with periodic/rhythmic human body motion (walking, running, training, etc.). The appointed candidate is expected to carry out observation studies of human body motion in real-life settings, using different types of mobile motion capture systems (full-body suit and individual trackers). Results from the analysis of these observation studies should form the basis for the development of prototype systems for using such periodic/rhythmic motion in musical interaction.

The appointed candidate will benefit from the combined expertise within the NordicSMC network, and is expected to carry out one or more short-term scientific missions to the other partners. At UiO, the candidate will be affiliated with RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion. This interdisciplinary centre focuses on rhythm as a structuring mechanism for the temporal dimensions of human life. RITMO researchers span the fields of musicology, psychology and informatics, and have access to state-of-the-art facilities in sound/video recording, motion capture, eye tracking, physiological measurements, various types of brain imaging (EEG, fMRI), and rapid prototyping and robotics laboratories.