New Book Chapter: Gestures in ensemble performance

I am happy to announce that Cagri Erdem and I have written a chapter titled “Gestures in ensemble performance” in the new book Together in Music: Coordination, Expression, Participation, edited by Renee Timmers, Freya Bailes, and Helena Daffern.

Video Teaser

For the book launch, Cagri and I recorded a short video teaser:

Abstract

The more formal abstract is:

The topic of gesture has received growing attention among music researchers over recent decades. Some of this research has been summarized in anthologies on “musical gestures”, such as those by Gritten and King (2006), Godøy and Leman (2010), and Gritten and King (2011). There have also been a couple of articles reviewing how the term gesture has been used in various music-related disciplines (and beyond), including those by Cadoz and Wanderley (2000) and Jensenius et al. (2010). Much empirical work has been performed since these reviews were written, aided by better motion capture technologies, new machine learning techniques, and a heightened awareness of the topic. Still, there are a number of open questions as to the role of gestures in music performance in general, and in ensemble performance in particular. This chapter aims to clarify some of the basic terminology of music-related body motion and to draw up some perspectives on how one can think about gestures in ensemble performance. This is, obviously, only one way of looking at the very multifaceted concept of gesture, but it may lead to further interest in this exciting and complex research domain.

Ten years after Musical Gestures

We began writing this ensemble gesture chapter in 2020, about ten years after the publication of the chapter Musical gestures: Concepts and methods in research. That musical gesture chapter has, to my surprise, become my most-cited publication to date. When I began working on musical gestures with Rolf Inge Godøy back in the early 2000s, it was still a relatively new topic. Most music researchers I spoke to didn’t understand why we were interested in the body.

Fast forward to today, and it is hard to find music researchers who are not interested in the body in one way or another. So I am thrilled about the possibility of expanding some of the “old” thoughts about musical gestures into the ensemble context in the newly published book chapter.

Releasing the Musical Gestures Toolbox for Python

After several years in the making, we finally “released” the Musical Gestures Toolbox for Python at the NordicSMC Conference this week. The toolbox is a collection of modules targeted at researchers working with video recordings.

Below is a short video in which Bálint Laczkó and I briefly describe the toolbox:

About MGT for Python

The Musical Gestures Toolbox for Python includes video visualization techniques such as creating motion videos, motion history images, and motiongrams. These visualizations allow for studying video recordings from different temporal and spatial perspectives. The toolbox also includes basic computer vision methods, and it is designed to integrate well with audio analysis toolboxes.
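
As a rough illustration of what such an analysis can look like in code, here is a minimal sketch. It assumes that the package is published on PyPI as musicalgestures and that it exposes a video class with methods for the visualizations mentioned above; the class and method names below are illustrative, so check the toolbox documentation for the exact API.

    # Minimal sketch of a video analysis with MGT for Python.
    # Assumptions: the package is installed with `pip install musicalgestures`
    # and exposes an MgVideo class; the method names below are illustrative.
    import musicalgestures

    # Load a video recording (the file name is just an example)
    mg = musicalgestures.MgVideo("dance.avi")

    # Motion video: keep only the pixels that change between frames
    mg.motion()

    # Motion history image: summarize recent motion in a single image
    mg.history()

    # Motiongrams: collapse the motion frames over rows/columns to show motion over time
    mg.motiongrams()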

It is possible to run the toolbox from the terminal:

Example of running MGT for Python in a terminal.
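
For a terminal workflow along the lines of the screenshot above, the same illustrative calls can be typed into an ipython session or saved in a small script and run with the standard Python interpreter. A hedged sketch, using the same assumed API as above:

    # terminal_example.py -- illustrative script; run with: python terminal_example.py
    # (or paste the lines into an ipython session, as in the screenshot above)
    import musicalgestures

    video = musicalgestures.MgVideo("performance.mp4")  # example file name
    video.motion()       # assumed to write a motion video next to the source file
    video.motiongrams()  # assumed to write motiongram images next to the source file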

Many people would probably prefer to run it in a Jupyter notebook:

Screenshots from the example Jupyter Notebook.
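
In a notebook the calls are the same; the main advantage is that the resulting images and videos can be inspected inline. A small sketch, still assuming the illustrative API above and that the visualizations are written as image files next to the source video (the exact file naming here is an assumption):

    # Notebook cell (illustrative): run an analysis and view a result inline.
    import musicalgestures
    from IPython.display import Image

    mg = musicalgestures.MgVideo("dance.avi")
    mg.motiongrams()  # assumed to write e.g. dance_mgx.png and dance_mgy.png

    # Display one of the generated motiongrams directly in the notebook
    Image(filename="dance_mgx.png")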

The MGT was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.

History

This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. The latest version was primarily developed by Bálint Laczkó, Frida Furmyr, and Marcus Widmer.

Read more

To learn more about the Musical Gestures Toolbox for Python, take a look at our paper presented at NordicSMC: