The EyesWeb Week takes place in the facilities of the DIST group in the beautiful Casa Paganini, which includes a large auditorium next to the laboratories. This allows for an ecological setting for experiments, since performers can actually perform on a real stage in front of a real audience. I wish we could have something like this in Oslo!
Here is a picture from an experimental setup in which we are looking at the synchronisation between the musicians in a string trio.
The Peretz group has made available a set of musical excerpts with emotion ratings. Perhaps not the most exciting musical collection, but I think it is very important that the community starts building data sets that can be used as references for various types of analysis.
We really need to create a set of music recordings that include motion capture and video, but this first requires developing a common format for synchronisation and sharing. GDIF, do I need to say anything more…
I just heard Esteban Maestre from UPF present his project on creating a database of instrumental actions on bowed instruments, for use in the synthesis of score-based material. They have come up with a very interesting solution to recording and synchronising audio with movement data: a VST plugin that records motion capture data from a Polhemus Liberty together with bow sensing through an Arduino. This makes it possible to load the plugin inside regular audio sequencing software and do the recording from there.
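The appeal of recording inside an audio plugin is that the host's sample clock becomes the master timeline for both streams. A minimal sketch of that idea, in Python rather than actual VST/C++ code: for each audio buffer the host delivers, work out which motion-capture frames fall inside it, timestamped on the audio clock. The sample rate, buffer size, and mocap rate below are illustrative assumptions, not details of Maestre's actual implementation.

```python
import math

SR = 44100        # assumed audio sample rate (Hz)
BLOCK = 512       # assumed audio buffer size delivered by the host
MOCAP_HZ = 240    # assumed Polhemus Liberty update rate (Hz)

def mocap_frames_in_block(block_index):
    """Return timestamps (seconds, on the audio clock) of the
    mocap frames that fall inside a given audio block."""
    t_start = block_index * BLOCK / SR
    t_end = (block_index + 1) * BLOCK / SR
    # index of the first mocap frame at or after t_start,
    # and (exclusively) the first frame at or after t_end
    first = math.ceil(t_start * MOCAP_HZ)
    last = math.ceil(t_end * MOCAP_HZ)
    return [i / MOCAP_HZ for i in range(first, last)]

# Each block claims a disjoint slice of mocap frames, so starting and
# stopping the transport in the sequencer keeps the two streams aligned.
print(mocap_frames_in_block(0))   # frames 0, 1, 2 of the mocap stream
```

The point is simply that no separate synchronisation step is needed afterwards: the motion data inherit the audio timeline at recording time.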
Esteban played an example of a synthesised version of Pachelbel’s Canon, and it was amazing how much more lively it sounded when the performance actions were also synthesised and used to control the sound. However, as Antonio Camurri noted in the discussion, the performance sounded a bit rushed and lacked breathing. This is probably because the model is, so far, based only on recording and synthesising the instrumental actions (excitation and modification), and does not take into account the various types of ancillary movements (e.g. support and phrasing) that typically create the larger-scale shapes.
Working on a book chapter, I am trying to clarify some terminology. Right now I am thinking about the differences between “musical” and “music-related” movements/actions/gestures. What is the difference? I find that it makes sense to think about whether the action is direct or indirect. In other words:
- Musical actions: actions involved in music making, e.g. performing an instrument (i.e. sound-producing actions).
- Music-related actions: actions that are the result of, or influenced by, music, e.g. dancing, walking in pace, etc.
While I believe the two are strongly connected, I think it makes sense to also make a clear distinction between them.
I heard about the initiative last year at Music & Gesture 2 in Manchester, and now I see that the new online journal Music Performance Research is actually up and running.
Music Performance Research is an international peer-reviewed journal whose purpose is to disseminate theoretical and empirical research on the nature of music performance. The journal publishes contributions from all disciplines relevant to music performance, including archaeology, cultural studies, composition, computer science, education, ethnomusicology, history, medicine, music theory and analysis, musicology, philosophy, physics, psychology, neuroscience and sociology. […]
Sounds like a place where I should consider submitting a manuscript.