Tag Archives: Music

Music Moves on YouTube

We have been running our free online course Music Moves a couple of times on the FutureLearn platform. The course consists of a number of videos, as well as articles, quizzes, etc., all of which help create a great learning experience for the people who take part.

One great thing about the FutureLearn model (similar to Coursera, etc.) is the focus on creating a complete course. There are many benefits to such a model, not least that it creates a virtual student group that interacts in a somewhat similar way to campus students. The downside, of course, is that the material is not accessible to others when the course is not running.

We spent a lot of time and effort on making all the material for Music Moves, and we see that some of it could also be useful in other contexts. This semester, for example, I am teaching a course called Interactive Music, in which some of the videos on motion capture would be very relevant for the students.

For that reason we have now decided to upload all the Music Moves videos to YouTube, so that everyone can access them. We still encourage interested people to enroll in the complete course, though. The next run on FutureLearn is scheduled to start in September.

Starting up my new project: MICRO

These days I am super excited about starting up my new project MICRO – Human Bodily Micromotion in Music Perception and Interaction. Here is a short trailer explaining the main points of the project:

I have now also been able to recruit two great researchers to join me: postdoctoral researcher Victor Evaristo Gonzalez Sanchez and PhD fellow Agata Zelechowska. Together we will work on human micromotion, how music influences such micromotion, and how we can move towards microinteraction in digital musical instruments. Great fun!

This week we have already made some progress, both in terms of analysis and synthesis. A sneak peek below, more to come…

New SMC paper: Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music

My colleague Ragnhild Torvanger Solberg and I presented a paper at the Sound and Music Computing conference in Hamburg last week, called “Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music”.

This is a methodological paper, summarizing our experiences with using our Qualisys motion capture system for group dance studies. We have two other papers in the pipeline that describe the actual data from the experiments in question. The happy story in the SMC paper is that it is, indeed, possible to get good tracking of multiple people, although it requires quite a bit of fine-tuning of the system.

Download: Fulltext (PDF)

Abstract: What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.

Citation: Solberg, R. T., & Jensenius, A. R. (2016). Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music. In Proceedings of the Sound and Music Computing Conference (pp. 469–474). Hamburg.

BibTeX
@inproceedings{solberg_optical_2016,
address = {Hamburg},
title = {Optical or {Inertial}? {Evaluation} of {Two} {Motion} {Capture} {Systems} for {Studies} of {Dancing} to {Electronic} {Dance} {Music}},
isbn = {978-3-00-053700-4},
abstract = {What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.},
booktitle = {Proceedings of the {Sound} and {Music} {Computing} {Conference}},
author = {Solberg, Ragnhild Torvanger and Jensenius, Alexander Refsum},
year = {2016},
pages = {469--474}
}

New paper: “NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs”

At NIME we have a large archive of the conference proceedings, but we do not (yet) have a proper repository for instrument designs. For that reason I took part in a workshop on Monday with the aim of laying the groundwork for a new repository:

NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs [PDF]

This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME’s reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.

Reference
McPherson, A. P., Berdahl, E., Lyons, M. J., Jensenius, A. R., Bukvic, I. I., & Knudsen, A. (2016). NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs. In Proceedings of the International Conference on New Interfaces For Musical Expression. Brisbane.

BibTeX

@inproceedings{mcpherson_nimehub:_2016,
    address = {Brisbane},
    title = {{NIMEhub}: {Toward} a {Repository} for {Sharing} and {Archiving} {Instrument} {Designs}},
    abstract = {This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME's reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.},
    booktitle = {Proceedings of the {International} {Conference} on {New} {Interfaces} {For} {Musical} {Expression}},
    author = {McPherson, Andrew P. and Berdahl, Edgar and Lyons, Michael J. and Jensenius, Alexander Refsum and Bukvic, Ivica Ico and Knudsen, Arve},
    year = {2016},
}

New paper: Exploring Sound-Motion Similarity in Musical Experience

New paper in the Journal of New Music Research:

Exploring Sound-Motion Similarity in Musical Experience (fulltext)
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum

Abstract: People tend to perceive many and also salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening to music, and as has been documented by a number of studies in the past couple of decades. The so-called motor theory of perception has claimed that these similarity relationships are deeply rooted in human cognitive faculties, and that people perceive and make sense of what they hear by mentally simulating the body motion thought to be involved in the making of sound. In this paper, we survey some basic theories of sound-motion similarity in music, and in particular the motor theory perspective. We also present findings regarding sound-motion similarity in musical performance, in dance, in so-called sound-tracing (the spontaneous body motions people produce in tandem with musical sound), and in sonification, all in view of providing a broad basis for understanding sound-motion similarity in music.

Citation
Godøy, R. I., Song, M.-H., Nymoen, K., Haugen, M. R., & Jensenius, A. R. (2016). Exploring Sound-Motion Similarity in Musical Experience. Journal of New Music Research. ISSN 0929-8215. doi: 10.1080/09298215.2016.1184689

BibTeX

@article{godoy:2016,
author = {Rolf Inge Godøy and Min-Ho Song and Kristian Nymoen and Mari Romarheim Haugen and Alexander Refsum Jensenius},
title = {Exploring Sound-Motion Similarity in Musical Experience},
journal = {Journal of New Music Research},
pages = {1--13},
year = {2016},
doi = {10.1080/09298215.2016.1184689},
URL = {http://dx.doi.org/10.1080/09298215.2016.1184689},
eprint = {http://dx.doi.org/10.1080/09298215.2016.1184689},
abstract = { People tend to perceive many and also salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening to music, and as has been documented by a number of studies in the past couple of decades. The so-called motor theory of perception has claimed that these similarity relationships are deeply rooted in human cognitive faculties, and that people perceive and make sense of what they hear by mentally simulating the body motion thought to be involved in the making of sound. In this paper, we survey some basic theories of sound-motion similarity in music, and in particular the motor theory perspective. We also present findings regarding sound-motion similarity in musical performance, in dance, in so-called sound-tracing (the spontaneous body motions people produce in tandem with musical sound), and in sonification, all in view of providing a broad basis for understanding sound-motion similarity in music. }
}