New article: Group behaviour and interpersonal synchronization to electronic dance music

I am happy to announce the publication of a follow-up study to our previous paper on group dancing to EDM and to our technical paper on motion capture of groups of people. In this new study we successfully tracked groups of 9-10 people dancing in a semi-ecological setup in our motion capture lab. We also found several interesting patterns in how people synchronize both with the music and with each other.

Citation:
Solberg, R. T., & Jensenius, A. R. (2017). Group behaviour and interpersonal synchronization to electronic dance music. Musicae Scientiae.

Abstract:
The present study investigates how people move and relate to each other – and to the dance music – in a club-like setting created within a motion capture laboratory. Three groups of participants (29 in total) each danced to a 10-minute-long DJ mix consisting of four tracks of electronic dance music (EDM). Two of the EDM tracks had little structural development, while the two others included a typical “break routine” in the middle of the track, consisting of three distinct passages: (a) “breakdown”, (b) “build-up” and (c) “drop”. The motion capture data show similar bodily responses for all three groups in the break routines: a sudden decrease and increase in the general quantity of motion. More specifically, the participants demonstrated an improved level of interpersonal synchronization after the drop, particularly in their vertical movements. Furthermore, the participants’ activity increased and became more pronounced after the drop. This may suggest that the temporal removal and reintroduction of a clear rhythmic framework, as well as the use of intensifying sound features, have a profound effect on a group’s beat synchronization. Our results further suggest that the musical passages of EDM efficiently lead to the entrainment of a whole group, and that a break routine effectively “re-energizes” the dancing.
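
The two measures mentioned in the abstract, quantity of motion and interpersonal synchronization of vertical movement, can be illustrated with a minimal sketch. The code below is not from the study; it assumes marker positions stored as a NumPy array of shape (frames, markers, 3), a 200 Hz capture rate, and illustrative function names.

import numpy as np

def quantity_of_motion(positions, fps=200.0):
    """Rough quantity-of-motion estimate from marker trajectories.

    positions : array of shape (frames, markers, 3), in mm.
    Returns the summed marker speed per frame (mm/s), a common proxy
    for overall movement activity.
    """
    displacement = np.diff(positions, axis=0)           # frame-to-frame motion
    speed = np.linalg.norm(displacement, axis=2) * fps  # mm/s per marker
    return speed.sum(axis=1)                            # one value per frame

def vertical_sync(pos_a, pos_b, fps=200.0):
    """Correlation of two dancers' vertical (z) velocities,
    a simple stand-in for interpersonal synchronization."""
    vz_a = np.diff(pos_a[:, :, 2].mean(axis=1)) * fps
    vz_b = np.diff(pos_b[:, :, 2].mean(axis=1)) * fps
    return np.corrcoef(vz_a, vz_b)[0, 1]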


New publication: Pleasurable and Intersubjectively Embodied Experiences of Electronic Dance Music

I am happy to announce a new publication, this time with my colleague Ragnhild Torvanger Solberg. Best of all, this is also a gold open access publication, freely available for everyone:

Citation:
Solberg, R. T., & Jensenius, A. R. (2017). Pleasurable and Intersubjectively Embodied Experiences of Electronic Dance Music. Empirical Musicology Review, 11(3–4), 301–318.

Abstract:
How do dancers engage with electronic dance music (EDM) when dancing? This paper reports on an empirical study of dancers’ pleasurable engagement with three structural properties of EDM: (1) breakdown, (2) build-up, and (3) drop. Sixteen participants danced to a DJ mix in a club-like environment, and the group’s bodily activity was recorded with an infrared, marker-based motion capture system. After they danced, the subjects filled out questionnaires about the pleasure they experienced and their relative desire to move while dancing. Subsequent analyses revealed associations between the group’s quantity of motion and self-reported experiences of pleasure. Associations were also found between certain sonic features and dynamic changes in the dancers’ movements. Pronounced changes occurred in the group’s quantity of motion during the breakdown, build-up, and drop sections, suggesting a high level of synchronization between the group and the structural properties of the music. The questionnaire confirmed this intersubjective agreement: participants perceived the musical passages consistently and marked the build-up and drop as particularly pleasurable and motivational in terms of dancing. Self-reports demonstrated that the presence and activity of other participants were also important in the shaping of one’s own experience, thus supporting the idea of clubbing as an intersubjectively embodied experience.
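
As a purely illustrative sketch of how associations between sonic features and movement could be examined (this is not the analysis pipeline used in the paper), one could compare a simple RMS loudness envelope of the audio with the group's quantity-of-motion curve on a common time grid. The window length, sampling rates, and function names below are assumptions.

import numpy as np

def rms_envelope(audio, sr, hop=0.5):
    """RMS loudness proxy computed in hop-second windows."""
    win = int(sr * hop)
    n = len(audio) // win
    frames = audio[: n * win].reshape(n, win)
    return np.sqrt((frames ** 2).mean(axis=1))

def movement_sound_correlation(qom, audio, sr, hop=0.5, mocap_fps=200.0):
    """Pearson correlation between a quantity-of-motion curve and the
    RMS envelope, after averaging both onto the same hop-second grid."""
    env = rms_envelope(audio, sr, hop)
    win = int(mocap_fps * hop)
    n = min(len(env), len(qom) // win)
    qom_win = qom[: n * win].reshape(n, win).mean(axis=1)
    return np.corrcoef(qom_win, env[:n])[0, 1]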

New Book: “A NIME Reader”

I am happy to announce that Springer has now released a book that I have co-edited with Michael J. Lyons: “A NIME Reader: Fifteen Years of New Interfaces for Musical Expression”. From the book cover:

What is a musical instrument? What are the musical instruments of the future? This anthology presents thirty papers selected from the fifteen year long history of the International Conference on New Interfaces for Musical Expression (NIME). NIME is a leading music technology conference, and an important venue for researchers and artists to present and discuss their explorations of musical instruments and technologies.

Each of the papers is followed by commentaries written by the original authors and by leading experts. The volume covers important developments in the field, including the earliest reports of instruments like the reacTable, Overtone Violin, Pebblebox, and Plank. There are also numerous papers presenting new development platforms and technologies, as well as critical reflections, theoretical analyses and artistic experiences.

The anthology is intended for newcomers who want to get an overview of recent advances in music technology. The historical traces, meta-discussions and reflections will also be of interest for longtime NIME participants. The book thus serves both as a survey of influential past work and as a starting point for new and exciting future developments.

The ebook (PDF/EPUB) is a free download for all institutions/libraries affiliated with SpringerLink.

New SMC paper: Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music

My colleague Ragnhild Torvanger Solberg and I presented a paper at the Sound and Music Computing conference in Hamburg last week: “Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music”.

This is a methodological paper summarizing our experiences with using our Qualisys motion capture system for group dance studies. We have two other papers in the pipeline that describe the actual data from these experiments. The happy story in the SMC paper is that it is indeed possible to get good tracking of multiple people, although it requires quite a bit of fine-tuning of the system.
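
As a concrete, hypothetical example of what evaluating tracking quality can look like in practice (this code is not from the paper), one could quantify marker drop-out in the optical recordings, assuming missing samples are exported as NaN and a 200 Hz capture rate.

import numpy as np

def tracking_completeness(positions):
    """Share of frames in which each marker is fully tracked.

    positions : array of shape (frames, markers, 3); occluded or
    untracked samples are assumed to be stored as NaN.
    Returns per-marker completeness in the range 0-1.
    """
    valid = ~np.isnan(positions).any(axis=2)   # (frames, markers) boolean
    return valid.mean(axis=0)

def longest_gap(positions, fps=200.0):
    """Longest continuous drop-out per marker, in seconds."""
    missing = np.isnan(positions).any(axis=2)  # (frames, markers)
    gaps = np.zeros(missing.shape[1])
    for m in range(missing.shape[1]):
        run, longest = 0, 0
        for lost in missing[:, m]:
            run = run + 1 if lost else 0
            longest = max(longest, run)
        gaps[m] = longest / fps
    return gaps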

Download: Fulltext (PDF)

Abstract: What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.

Citation: Solberg, R. T., & Jensenius, A. R. (2016). Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music. In Proceedings of the Sound and Music Computing Conference (pp. 469–474). Hamburg.

BibTeX
@inproceedings{solberg_optical_2016,
    address = {Hamburg},
    title = {Optical or {Inertial}? {Evaluation} of {Two} {Motion} {Capture} {Systems} for {Studies} of {Dancing} to {Electronic} {Dance} {Music}},
    isbn = {978-3-00-053700-4},
    abstract = {What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.},
    booktitle = {Proceedings of the {Sound} and {Music} {Computing} {Conference}},
    author = {Solberg, Ragnhild Torvanger and Jensenius, Alexander Refsum},
    year = {2016},
    pages = {469--474},
}

New NIME paper: “The ‘Virtualmonium’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra”

The third NIME contribution from the fourMs lab this year was the paper:

The ‘Virtualmonium’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra

Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ‘classical’ loudspeaker orchestra. Although such orchestras remain common within UK institutions as well as in France, they are in decline in the rest of the world. In contrast, permanent, high-density loudspeaker arrays are on the rise, as is the practical application of 3-D audio technologies. Looking to the future, we need to reconcile the performance of historical and new stereo works, side-by-side native 3-D compositions. In anticipation of this growing need, we have designed and tested a prototype ‘Virtualmonium’. The Virtualmonium is an instrument for classical diffusion performance over an acousmonium emulated in higher-order Ambisonics. It allows composers to custom-design loudspeaker orchestra emulations for the performance of their works, rehearse and refine performances off-site, and perform classical repertoire alongside native 3-D formats in the same concert. This paper describes the technical design of the Virtualmonium, assesses the success of the prototype in some preliminary listening tests and concerts, and speculates how the instrument can further composition and performance practice.
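
The Virtualmonium itself uses higher-order Ambisonics; as a deliberately simplified illustration of the underlying idea of emulating a loudspeaker orchestra as encoded virtual sources, here is a first-order (B-format) sketch in Python. All names, parameters, and the left/right channel split are illustrative assumptions, not details of the actual implementation.

import numpy as np

def encode_virtual_speaker(signal, azimuth_deg, elevation_deg=0.0):
    """Encode a mono signal as a virtual source in first-order
    Ambisonics (traditional B-format channels W, X, Y, Z)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = signal / np.sqrt(2.0)              # omnidirectional component
    x = signal * np.cos(az) * np.cos(el)
    y = signal * np.sin(az) * np.cos(el)
    z = signal * np.sin(el)
    return np.stack([w, x, y, z])

def encode_orchestra(stereo, speaker_azimuths):
    """Spread a stereo work over a ring of virtual loudspeakers.
    Simplified routing: positive azimuths (left half of the ring, in the
    convention assumed here) get the left channel, the rest the right."""
    left, right = stereo
    mix = np.zeros((4, len(left)))
    for az in speaker_azimuths:
        src = left if az > 0 else right
        mix += encode_virtual_speaker(src, az)
    return mix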

Reference
Barrett, N., & Jensenius, A. R. (2016). The “Virtualmonium”: an instrument for classical sound diffusion over a virtual loudspeaker orchestra. In Proceedings of the International Conference on New Interfaces For Musical Expression (pp. 55–60). Brisbane.

BibTeX

@inproceedings{barrett_virtualmonium:_2016,
    address = {Brisbane},
    title = {The ‘{Virtualmonium}’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra},
    abstract = {Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ‘classical’ loudspeaker orchestra. Although such orchestras remain common within UK institutions as well as in France, they are in decline in the rest of the world. In contrast, permanent, high-density loudspeaker arrays are on the rise, as is the practical application of 3-D audio technologies. Looking to the future, we need to reconcile the performance of historical and new stereo works, side-by-side native 3-D compositions. In anticipation of this growing need, we have designed and tested a prototype ‘Virtualmonium’. The Virtualmonium is an instrument for classical diffusion performance over an acousmonium emulated in higher-order Ambisonics. It allows composers to custom-design loudspeaker orchestra emulations for the performance of their works, rehearse and refine performances off-site, and perform classical repertoire alongside native 3-D formats in the same concert. This paper describes the technical design of the Virtualmonium, assesses the success of the prototype in some preliminary listening tests and concerts, and speculates how the instrument can further composition and performance practice.},
    booktitle = {Proceedings of the {International} {Conference} on {New} {Interfaces} {For} {Musical} {Expression}},
    author = {Barrett, Natasha and Jensenius, Alexander Refsum},
    year = {2016},
    pages = {55--60},
}