New publication: Moving to the Beat

I am happy to announce that I have a new publication out, written together with two of my colleagues, Anne Danielsen and Mari Romarheim Haugen:

Title: Moving to the Beat: Studying Entrainment to Micro-Rhythmic Changes in Pulse by Motion Capture
Authors: Anne Danielsen, Mari Romarheim Haugen and Alexander Refsum Jensenius
Source: Timing & Time Perception
Publication year: 2015
DOI: 10.1163/22134468-00002043

Abstract: Pulse is a fundamental reference for the production and perception of rhythm. In this paper, we study entrainment to changes in the micro-rhythmic design of the basic pulse of the groove in ‘Left & Right’ by D’Angelo. In part 1 of the groove the beats have one specific position; in part 2, on the other hand, the different rhythmic layers specify two simultaneous but alternative beat positions that are approximately 50-80 ms apart. We first anticipate listeners’ perceptual response using the theories of entrainment and dynamic attending as points of departure. We then report on a motion capture experiment aimed at engaging listeners’ motion patterns in response to the two parts of the tune. The results show that when multiple onsets are introduced in part 2, the half note becomes a significant additional level of entrainment and the temporal locations of the perceived beats are drawn towards the added onsets. ...

March 16, 2015 · 2 min · 372 words · ARJ

New publication: How still is still? exploring human standstill for artistic applications

I am happy to announce a new publication titled How still is still? exploring human standstill for artistic applications (PDF of preprint), published in the International Journal of Arts and Technology. The paper is based on the Sverm project, and was written and accepted two years ago. Sometimes academic publishing takes absurdly long, and this is a case in point, but I am happy that the publication is finally out in the wild. ...

May 1, 2014 · 2 min · 295 words · ARJ

Analyzing correspondence between sound objects and body motion

New publication: Analyzing correspondence between sound objects and body motion, by Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen, has now been published in ACM Transactions on Applied Perception. Abstract: Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data have been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment. ...
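As a small illustration of method (c) above, Spearman’s ρ simply rank-transforms both feature series before correlating them, which makes it robust to nonlinear but monotonic relationships between a sound feature and a motion feature. The sketch below uses invented loudness/hand-height arrays as stand-ins for real sound and motion descriptors; it is not the paper’s actual analysis code.

```python
# Spearman's rho, sketched by hand: rank both series, then take the
# Pearson correlation of the ranks. No tie correction, which is fine
# for continuous feature data like the stand-ins generated here.
import numpy as np

def spearman_rho(x, y):
    rank = lambda v: np.argsort(np.argsort(v))
    return np.corrcoef(rank(x), rank(y))[0, 1]

rng = np.random.default_rng(0)
# Hypothetical features: a rising loudness curve and a hand height
# that roughly tracks it, both with a little measurement noise.
loudness = np.linspace(0.0, 1.0, 200) + 0.05 * rng.standard_normal(200)
hand_height = 0.8 * loudness + 0.1 * rng.standard_normal(200)

rho = spearman_rho(loudness, hand_height)
print(f"Spearman rho = {rho:.2f}")
```

With strongly related features like these, ρ comes out close to 1; uncorrelated features would give a value near 0.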

June 3, 2013 · 2 min · 235 words · ARJ

New PhD Thesis: Kristian Nymoen

I am happy to announce that fourMs researcher Kristian Nymoen has successfully defended his PhD dissertation, and that the dissertation is now available in the DUO archive. I have had the pleasure of co-supervising Kristian’s project, and of working closely with him on several of the papers included in the dissertation (and a few others). Reference K. Nymoen. Methods and Technologies for Analysing Links Between Musical Sound and Body Motion. PhD thesis, University of Oslo, 2013. Abstract There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation is focused on technologies and methods of studying lower-level features of motion, and how people relate motion to sound. Two experiments on so-called sound-tracing, meaning representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies. In order to determine the quality of the data that has been recorded, these technologies themselves are also a subject of research in this thesis. A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion. They include the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, as well as the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art to low-cost and ubiquitous mobile devices.
Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and analysis of music-related motion. The process of extracting features from motion data is discussed in the thesis, along with motion features used in analysis of sound-tracing experiments, including time-varying features and global features. Features for real-time use are also discussed in relation to the development of a new motion-based musical instrument: The SoundSaber. Finally, four papers on sound-tracing experiments present results and methods of analysing people’s bodily responses to short sound objects. These papers cover two experiments, presenting various analytical approaches. In the first experiment participants moved a rod in the air, mimicking the qualities of the sound in the motion of the rod. In the second experiment the participants held two handles and a different selection of sound stimuli was used. In both experiments optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches. (1) A pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarity in motion patterns exhibited by participants. (2) Spearman’s ρ correlation was applied to analyse the correlation between individual sound and motion features. (3) Canonical correlation analysis was applied in order to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments. (4) Traditional statistical tests were applied to compare sound-tracing strategies between a variety of sounds and participants differing in levels of musical training.
Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses. ...
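Of the four approaches above, canonical correlation analysis (approach 3) is perhaps the least familiar: it finds the linear combinations of the sound features and of the motion features that correlate most strongly with each other. The sketch below is not the dissertation’s code; it uses synthetic sound/motion feature matrices that share one hypothetical latent source, computed with a standard QR-plus-SVD formulation.

```python
# Canonical correlations between two feature sets, via QR decomposition
# of the centred matrices followed by an SVD of the cross-product.
# The singular values are the canonical correlations (all in [0, 1]).
import numpy as np

def canonical_correlations(X, Y):
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[: min(X.shape[1], Y.shape[1])]

rng = np.random.default_rng(0)
latent = rng.standard_normal(200)  # shared source, e.g. perceived "effort"
# Hypothetical features: three sound features and two motion features,
# each set containing one column driven by the shared latent source.
sound = np.column_stack([latent + 0.1 * rng.standard_normal(200),
                         rng.standard_normal(200),
                         rng.standard_normal(200)])
motion = np.column_stack([0.5 * latent + 0.1 * rng.standard_normal(200),
                          rng.standard_normal(200)])

rhos = canonical_correlations(sound, motion)
print("canonical correlations:", np.round(rhos, 2))
```

Because one column in each set is driven by the same latent source, the first canonical correlation is high while the second stays near chance level.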

February 20, 2013 · 5 min · 917 words · ARJ

New Master Thesis 2: Music Kinection: Musical Sound and Motion in Interactive Systems

Yet another of my master students has graduated recently, and here is a link to his thesis: Even Bekkedal: Music Kinection: Musical Sound and Motion in Interactive Systems. Even has carried out a so-called “practical” master thesis. He has done a mocap analysis of how people move while playing computer games with a Kinect device, and has also prototyped several mocap instruments. ...

February 14, 2013 · 2 min · 332 words · ARJ

Musikkteknologidagene 2012

Last week I held a keynote lecture at the Norwegian music technology conference Musikkteknologidagene, by (and at) the Norwegian Academy of Music and NOTAM. The talk was titled “Embodying the human body in music technology”, and was an attempt at explaining why I believe we need to put more emphasis on human-friendly technologies, and why the field of music cognition is very much connected to that of music technology. I got a comment that it would have been better to replace “embodying” with “embedding” in my title, and I totally agree. So now I already have a title for my next talk! ...

October 30, 2012 · 3 min · 576 words · ARJ

Paper #2 at SMC 2012: Noise level in IR mocap systems

Yesterday I presented a paper on motiongrams at the Sound and Music Computing conference in Copenhagen. Today I will present the paper A study of the noise-level in two infrared marker-based motion capture systems. This is a quite nerdy, in-depth study of the noise level of two of our motion capture systems. Abstract: With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested how various features (calibration volume, marker size, sampling frequency, etc.) influence the noise level of markers lying still, and fixed to subjects standing still. The conclusion is that the motion observed in humans standing still is usually considerably higher than the noise level of the systems. Depending on the system and its calibration, however, the signal-to-noise ratio may in some cases be problematic. ...
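The core comparison in a study like this can be sketched as follows: record a marker lying still to estimate the system’s noise floor, record a marker on a person standing still, and compare the RMS displacements. This is a minimal illustration, not the paper’s analysis code, and the numbers below are made-up stand-ins for real Qualisys/OptiTrack recordings.

```python
# Comparing mocap system noise to human postural sway via RMS displacement.
# Synthetic signals: a "still marker" with ~0.05 mm sensor noise, and a
# marker on a standing person with slow sway plus the same sensor noise.
import numpy as np

rng = np.random.default_rng(1)
fs = 100  # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)  # one minute of data

still_marker = 0.05 * rng.standard_normal(t.size)  # system noise only, mm
person = (2.0 * np.sin(2 * np.pi * 0.3 * t)        # slow postural sway, mm
          + 0.05 * rng.standard_normal(t.size))    # plus sensor noise

def rms(x):
    """Root-mean-square deviation from the mean position."""
    return np.sqrt(np.mean((x - x.mean()) ** 2))

noise_rms = rms(still_marker)
sway_rms = rms(person)
snr_db = 20 * np.log10(sway_rms / noise_rms)
print(f"noise RMS = {noise_rms:.3f} mm, "
      f"sway RMS = {sway_rms:.3f} mm, SNR = {snr_db:.1f} dB")
```

With millimetre-scale sway against sub-0.1 mm sensor noise, the human motion dominates by well over an order of magnitude, which matches the paper’s conclusion that standstill motion usually sits well above the system noise floor.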

July 13, 2012 · 2 min · 241 words · ARJ

Workshops at Art.on.Wires

Yesterday I held a workshop on music-related motion capture at this year’s Art.on.Wires. The workshop was quite similar to the one I held in Trondheim a couple of weeks ago. Trond Lossius held workshops on Jamoma and surround sound, and there were many other interesting workshops as well. Below are some pictures from the festival:

May 3, 2012 · 1 min · 55 words · ARJ

Motionlessness

Yesterday Miles Phillips suggested that the word “motionlessness” may be what I am after when it comes to describing the act of standing still. He further pointed me to a web site with a list of the world records for motionlessness. The rules for competing in motionlessness are as follows: The record is for continuously standing motionless. You must stand: sitting is not allowed. No facial movements are allowed other than the involuntary blinking of the eyes. Deep breathing is permitted provided it does not involve observable movement notably greater than that in normal breathing. No rest breaks are allowed at any point during the event. The venue for such an event should be such that the general public can view it. But from my point of view, being interested in micromovements, I would be very curious to see how still these record holders actually were. ...

November 10, 2011 · 2 min · 379 words · ARJ

Standing still

In between organizing a little conference, teaching (MUS2006, MUS2860, MUS4830), and finalizing some publications, I have started a new research/artistic project with Kari Anne Bjerkestrand. I’ll write a lot more on this later, but for now I just wanted to share a plot from a motion capture recording of a single marker placed on my neck (C7). The recording is of me standing still for 10 minutes. Quite a lot of motion for someone standing still… To be continued. ...

March 21, 2011 · 1 min · 79 words · ARJ