Completing the MICRO project

I wrote up the final report on the project MICRO – Human Bodily Micromotion in Music Perception and Interaction before Christmas. Now I have finally gotten around to wrapping up the project pages. With the touch of a button, the project’s web page now says “completed”. But even though the project is formally over, its results will live on.

Aims and objectives

The MICRO project investigated the close relationships between musical sound and human bodily micromotion. Micromotion here refers to the smallest motion that we can produce and perceive, typically occurring at speeds below 10 mm/s.
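To make the 10 mm/s threshold concrete, here is a minimal sketch of how frame-to-frame speed can be estimated from motion capture position data and checked against that limit. The function names and the synthetic data are my own illustration, not taken from the project's software:

```python
import numpy as np

def mean_speed_mm_per_s(positions_mm, sample_rate_hz):
    """Estimate mean speed (mm/s) from a sequence of 3D marker positions in mm."""
    # Frame-to-frame displacement vectors between consecutive samples
    displacements = np.diff(positions_mm, axis=0)
    # Euclidean distance per frame, scaled by the sample rate to get mm/s
    frame_speeds = np.linalg.norm(displacements, axis=1) * sample_rate_hz
    return frame_speeds.mean()

def is_micromotion(positions_mm, sample_rate_hz, threshold_mm_per_s=10.0):
    """Micromotion: average speed below roughly 10 mm/s."""
    return mean_speed_mm_per_s(positions_mm, sample_rate_hz) < threshold_mm_per_s

# Hypothetical example: a marker drifting 0.05 mm per frame at 100 Hz,
# i.e. 5 mm/s, which falls below the micromotion threshold.
t = np.arange(1000, dtype=float)
drift = np.column_stack([0.05 * t, np.zeros_like(t), np.zeros_like(t)])
print(is_micromotion(drift, sample_rate_hz=100))  # True (5 mm/s < 10 mm/s)
```

Real standstill recordings are noisier than this drift example, so in practice the speed signal would typically be filtered before averaging.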

Example plots of the micromotion observed in the motion capture data of a person standing still for 10 minutes.

Recent decades have seen an increasing focus on the role of the human body in both the performance and the perception of music. Until now, however, the micro-level of these experiences has received little attention.

The main objective of MICRO was broken down into three secondary objectives:

  1. Define a set of sub-categories of music-related micromotion.
  2. Understand more about how musical sound influences the micromotion of perceivers and which musical features (such as melody, harmony, rhythm, timbre, loudness, spatialization) come into play.
  3. Develop conceptual models for controlling sound through micromotion, and develop prototypes of interactive music systems based on these models.


The project completed most of its planned activities, plus several that were not originally planned:

  1. The scientific results include many insights about human music-related micromotion. The results have been presented in one doctoral dissertation, two master’s theses, several journal papers, and numerous conference presentations. As hypothesized, music influences human micromotion. This has been verified with different types of music in all the collected datasets. We have also found that music with a regular and strong beat, particularly electronic dance music, leads to more motion. Our data also support the idea that music with a pulse of around 120 beats per minute is more motion-inducing than music with slower or faster tempi. In addition, we found that people generally moved more when listening with headphones. Towards the end of the project, we began studying whether there are individual differences in these responses. One study found that people who score high on empathic concern move more to music than others. This aligns with findings from recent studies of larger-scale music-related body motion.
  2. Data collected in the project has been released openly in the Oslo Standstill Database. The database contains data from all the Championships of Standstill, the Headphones-Speakers study, and the Sverm project that preceded MICRO.
  3. Software developed during the project has been made openly available. This includes various analysis scripts implemented in Jupyter Notebooks. Several of the developed software modules have been collected in the Musical Gestures Toolbox for Python.
  4. The scientific results have inspired a series of artistic explorations, including several installations and performances with the Self-playing Guitars, Oslo Muscle Band, and the Micromotion Apps.
  5. The project and its results have been featured in many media appearances, including a number of newspaper stories and several times on national TV and radio.

Open Research

MICRO has been an Open Research flagship project, following the principle of making the entire project as open as possible but as closed as necessary. The project shares publications, data, source code, applications, and other parts of the research process openly.

Summing up

I am very happy about the outcomes of the MICRO project. This is largely thanks to the fantastic team, particularly postdoctoral fellow Victor Gonzalez Sanchez and doctoral fellow Agata Zelechowska.

Results from the Sverm project inspired the MICRO project, and many lines of thought will continue in my new AMBIENT project. I am looking forward to researching unconscious and involuntary micromotion in the years to come.

NIME publication and performance: Vrengt

My PhD student Cagri Erdem developed a performance together with dancer Katja Henriksen Schia. The piece was first performed, together with Qichao Lan and me, during the RITMO opening and again during MusicLab vol. 3. See here for a teaser of the performance:

This week, Cagri, Katja, and I performed a version of the piece Vrengt at NIME in Porto Alegre.

We also presented a paper describing the development of the instrument/piece:

Erdem, Cagri, Katja Henriksen Schia, and Alexander Refsum Jensenius. “Vrengt: A Shared Body-Machine Instrument for Music-Dance Performance.” In Proceedings of the International Conference on New Interfaces for Musical Expression. Porto Alegre, 2019.


This paper describes the process of developing a shared instrument for music–dance performance, with a particular focus on exploring the boundaries between standstill vs motion, and silence vs sound. The piece Vrengt grew from the idea of enabling a true partnership between a musician and a dancer, developing an instrument that would allow for active co-performance. Using a participatory design approach, we worked with sonification as a tool for systematically exploring the dancer’s bodily expressions. The exploration used a “spatiotemporal matrix,” with a particular focus on sonic microinteraction. In the final performance, two Myo armbands were used for capturing muscle activity of the arm and leg of the dancer, together with a wireless headset microphone capturing the sound of breathing. In the paper we reflect on multi-user instrument paradigms, discuss our approach to creating a shared instrument using sonification as a tool for the sound design, and reflect on the performers’ subjective evaluation of the instrument.
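The core mapping idea in the paper, muscle activity driving sound, can be illustrated with a minimal sketch of EMG envelope following for amplitude control. This is a hypothetical illustration, not the project's actual sonification code; the function names and the simulated signal are assumptions of mine:

```python
import numpy as np

def emg_envelope(emg, sample_rate_hz, window_s=0.1):
    """Rectify a raw EMG signal and smooth it with a moving-average window."""
    window = max(1, int(window_s * sample_rate_hz))
    kernel = np.ones(window) / window
    # Full-wave rectification followed by moving-average smoothing
    return np.convolve(np.abs(emg), kernel, mode="same")

def map_to_amplitude(envelope, floor=0.0, ceiling=1.0):
    """Normalize an envelope to a control signal between floor and ceiling."""
    span = envelope.max() - envelope.min()
    if span == 0:
        return np.full_like(envelope, floor)
    return floor + (envelope - envelope.min()) / span * (ceiling - floor)

# Simulated EMG burst: noise whose variance rises and falls over two seconds
rng = np.random.default_rng(1)
t = np.linspace(0, 2, 2000)
emg = rng.normal(0, 1, t.size) * np.sin(np.pi * t / 2)

# Control signal that could drive a synthesizer's amplitude parameter
amplitude = map_to_amplitude(emg_envelope(emg, sample_rate_hz=1000))
```

In a real shared-instrument setting, a control signal like this would be sent on to a synthesis engine, with the smoothing window tuned to balance responsiveness against jitter.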

New publication: Sonic Microinteraction in “the Air”

I am happy to announce a new book chapter based on the artistic-scientific research in the Sverm and MICRO projects.

Citation: Jensenius, A. R. (2017). Sonic Microinteraction in “the Air.” In M. Lesaffre, P.-J. Maes, & M. Leman (Eds.), The Routledge Companion to Embodied Music Interaction (pp. 431–439). New York: Routledge.
Abstract: This chapter looks at some of the principles involved in developing conceptual methods and technological systems concerning sonic microinteraction, a type of interaction with sounds that is generated by bodily motion at a very small scale. I focus on the conceptualization of interactive systems that can exploit the smallest possible micromotion that people are able to both perceive and produce. It is also important that the interaction that is taking place allow for a recursive element via a feedback loop from the sound produced back to the performer producing it.

Starting up my new project: MICRO

I am super excited about starting up my new project – MICRO – Human Bodily Micromotion in Music Perception and Interaction – these days. Here is a short trailer explaining the main points of the project:

Now I have also been able to recruit two great researchers to join me: postdoctoral researcher Victor Evaristo Gonzalez Sanchez and PhD fellow Agata Zelechowska. Together we will work on human micromotion, how music influences such micromotion, and how we can move towards microinteraction in digital musical instruments. Great fun!

This week we have already made some progress, both in terms of analysis and synthesis. A sneak peek below, more to come…