Completing the MICRO project

I wrote up the final report on the project MICRO - Human Bodily Micromotion in Music Perception and Interaction before Christmas. Now I finally got around to wrapping up the project pages. With the touch of a button, the project’s web page now says “completed”. But even though the project is formally over, its results will live on. Aims and objectives: The MICRO project sought to investigate the close relationships between musical sound and human bodily micromotion. Micromotion is here used to describe the smallest motion that we can produce and experience, typically at a rate lower than 10 mm/s. ...

February 16, 2022 · 3 min · 595 words · ARJ

MusicTestLab as a Testbed of Open Research

Many people talk about “opening” the research process these days. Due to initiatives like Plan S, much has happened when it comes to Open Access to research publications. Things are also happening when it comes to sharing data openly (or at least FAIR). Unfortunately, there is currently more talk about Open Research than action. At RITMO, we are actively exploring different strategies for opening our research. The most extreme case is that of MusicLab. In this blog post, I will reflect on yesterday’s MusicTestLab - Slow TV. ...

October 30, 2020 · 6 min · 1172 words · ARJ

New PhD Fellowship in the fourMs group

Come work with us in the fourMs group at the University of Oslo: a Doctoral Research Fellowship is available in the fourMs group. All proposals within the area of music cognition will be considered, but we are particularly looking for projects in the topical areas of the fourMs group, such as music-related body motion, cross-modal relationships between sound and motion, rhythm studies, and music and emotions. The appointed candidate will get full access to the world-class fourMs lab, with state-of-the-art motion capture systems and sound spatialisation facilities. The candidate is expected to work on an independent project and will be supervised by one or more members of the fourMs group, as well as other researchers at the Department of Musicology, depending on the particular focus of the project. ...

December 15, 2016 · 1 min · 133 words · ARJ

New fourMs video

Not only do we have a new Department video, but we have also made a short video documentary about our fourMs group. It is in Norwegian (subtitles coming soon), but even if you do not understand the language, the video has lots of nice shots from the labs, and the background music is made by Professor Rolf Inge Godøy.

February 25, 2014 · 1 min · 59 words · ARJ

Analyzing correspondence between sound objects and body motion

New publication: the article “Analyzing correspondence between sound objects and body motion” by Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen has now been published in ACM Transactions on Applied Perception. Abstract: Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data have been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment. ...
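As a small illustration of method (c), Spearman’s ρ can be computed as the Pearson correlation of the ranks. The sketch below is not the paper’s actual analysis pipeline; the feature names and values are invented for illustration, and it assumes no tied values (real pipelines use tie-corrected ranks).

```python
def spearman_rho(x, y):
    """Spearman's rho as the Pearson correlation of ranks.

    Minimal sketch for untied data; not the published pipeline.
    """
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order):
            r[idx] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mean = (n - 1) / 2.0
    # With no ties, both rank vectors have the same variance,
    # so rho reduces to rank covariance over rank variance.
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var

# A monotonically increasing relationship gives rho = 1.0 even when it
# is nonlinear (hypothetical sound/motion feature values):
sound_loudness = [0.1, 0.3, 0.4, 0.8]
hand_velocity = [1.0, 2.5, 7.0, 50.0]
print(spearman_rho(sound_loudness, hand_velocity))  # → 1.0
```

The rank-based measure is a natural fit here because it captures monotonic sound–motion correspondences without assuming the relationship is linear.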

June 3, 2013 · 2 min · 235 words · ARJ

ImageSonifyer

Earlier this year, before I started as head of department, I was working on a non-realtime implementation of my sonomotiongram technique (a sonomotiongram is a sonic display of motion from a video recording, created by sonifying a motiongram). Now I finally found some time to wrap it up and make it available as an OS X application called ImageSonifyer. The Max patch is also available, for those who want to look at what is going on. ...
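For readers curious about the underlying idea, the general technique can be sketched in a few lines: frame-difference a video and collapse each difference image to a column to get a motiongram, then treat each column as a magnitude spectrum and inverse-FFT it into a short audio grain. This is only a toy sketch under simplifying assumptions (NumPy, grayscale frames, no windowing or grain overlap), not the actual ImageSonifyer implementation.

```python
import numpy as np

def motiongram(frames):
    """Frame-difference a (T, H, W) grayscale video and average over
    width, giving one (H,) motion column per consecutive frame pair."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=2)  # shape (T - 1, H)

def sonify(mg, grain_len=512):
    """Toy sonification: treat each motiongram column as a magnitude
    spectrum and inverse-FFT it into a short audio grain."""
    grains = []
    for column in mg:
        # Resample the column onto the rFFT bin grid for grain_len samples.
        spectrum = np.interp(np.linspace(0, 1, grain_len // 2 + 1),
                             np.linspace(0, 1, column.size), column)
        grains.append(np.fft.irfft(spectrum))
    return np.concatenate(grains)

# Hypothetical tiny video: 10 frames of 8x8 noise, sonified into
# 9 grains of 64 samples each.
video = np.random.rand(10, 8, 8)
audio = sonify(motiongram(video), grain_len=64)
print(audio.shape)  # (576,)
```

Vertical position in the image thus maps to frequency and time in the video maps to time in the sound, so motion in different parts of the frame is heard at different pitches.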

April 6, 2013 · 1 min · 201 words · ARJ

Musikkteknologidagene 2012

Last week I held a keynote lecture at the Norwegian music technology conference Musikkteknologidagene, by (and at) the Norwegian Academy of Music and NOTAM. The talk was titled “Embodying the human body in music technology”, and was an attempt at explaining why I believe we need to put more emphasis on human-friendly technologies, and why the field of music cognition is very much connected to that of music technology. I got a comment that it would have been better to exchange “embodying” with “embedding” in my title, and I totally agree. So now I already have a title for my next talk! ...

October 30, 2012 · 3 min · 576 words · ARJ

fourMs videos

Over the years we have uploaded various videos of our fourMs lab activities to YouTube. Some of these videos were uploaded through a shared YouTube account, others by myself or by colleagues. I just realised that a good way of gathering all the different videos is to create a playlist and add all relevant videos there. It should then also be possible to embed this playlist in web pages, like below: ...

August 16, 2012 · 1 min · 73 words · ARJ

Paternity leave

After spending a lot of time organizing NIME 2011 and building up our new lab facilities at fourMs over the last academic year, I will be on vacation and paternity leave from now through the fall semester. I will teach and supervise a little during the fall, but will otherwise be taking care of my daughter. I will be reading e-mails, but I find it quite difficult to find time to reply, so if you contact me, please don’t expect a rapid reply.

June 27, 2011 · 1 min · 85 words · ARJ

GDIF recording and playback

Kristian Nymoen has updated the Jamoma modules for recording and playing back GDIF data in Max 5. The modules are based on the FTM library (beta 12; betas 13–15 do not work) and can be downloaded here. We have also made available three use cases in the (soon to be expanded) fourMs database: a simple mouse recording, the sound saber, and a short piano example. See the video below for a quick demonstration of how it works:

July 3, 2010 · 1 min · 74 words · ARJ