New paper: Who Moves to Music? Empathic Concern Predicts Spontaneous Movement Responses to Rhythm and Music

A few days after Agata Zelechowska defended her PhD dissertation, we got the news that her last paper was finally published in Music & Science. It is titled Who Moves to Music? Empathic Concern Predicts Spontaneous Movement Responses to Rhythm and Music and was co-authored by Victor Gonzalez Sanchez, Bruno Laeng, Jonna Vuoskoski, and myself. The paper is based on Agata’s headphones-speakers experiment. We have previously published a paper showing that people move more when listening on headphones. This time, however, the focus was on the data gathered on individual differences. Many variables were tested, but only empathic concern turned out to be a predictor of motion. ...

December 23, 2020 · 2 min · 350 words · ARJ

New publication: Moving to the Beat

I am happy to announce that I have a new publication out, written together with two of my colleagues, Anne Danielsen and Mari Romarheim Haugen: Moving to the Beat: Studying Entrainment to Micro-Rhythmic Changes in Pulse by Motion Capture. Authors: Anne Danielsen, Mari Romarheim Haugen and Alexander Refsum Jensenius. Source: Timing & Time Perception. Publication year: 2015. DOI: 10.1163/22134468-00002043. **Abstract:** Pulse is a fundamental reference for the production and perception of rhythm. In this paper, we study entrainment to changes in the micro-rhythmic design of the basic pulse of the groove in ‘Left & Right’ by D’Angelo. In part 1 of the groove the beats have one specific position; in part 2, on the other hand, the different rhythmic layers specify two simultaneous but alternative beat positions that are approximately 50–80 ms apart. We first anticipate listeners’ perceptual response using the theories of entrainment and dynamic attending as points of departure. We then report on a motion capture experiment aimed at capturing listeners’ motion patterns in response to the two parts of the tune. The results show that when multiple onsets are introduced in part 2, the half note becomes a significant additional level of entrainment and the temporal locations of the perceived beats are drawn towards the added onsets. ...

March 16, 2015 · 2 min · 372 words · ARJ

New publication: Non-Realtime Sonification of Motiongrams

Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works. ...

August 1, 2013 · 2 min · 225 words · ARJ

New publication: Some video abstraction techniques for displaying body movement in analysis and performance

Today the MIT Press journal Leonardo published my paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance”. The paper summarises my work on different techniques for visualising music-related body motion. Most of these techniques were developed during my PhD, but have been refined over the course of my post-doc fellowship. The paper is available from the Leonardo web page (or MUSE), and will also be posted in the digital archive at UiO after the six-month embargo period. ...

January 14, 2013 · 2 min · 231 words · ARJ

New publication: Performing the Electric Violin in a Sonic Space

I am happy to announce that a paper I wrote together with Victoria Johnson has just been published in Computer Music Journal. The paper is based on the experiences that Victoria and I gained while working on the piece Transformation for electric violin and live electronics (see video of the piece below). Citation: A. R. Jensenius and V. Johnson. Performing the electric violin in a sonic space. Computer Music Journal, 36(4):28–39, 2012. **Abstract:** This article presents the development of the improvisation piece Transformation for electric violin and live electronics. The aim of the project was to develop an “invisible” technological setup that would allow the performer to move freely on stage while still being in full control of the electronics. The developed system consists of a video-based motion-tracking system, with a camera hanging in the ceiling above the stage. The performer’s motion and position on stage are used to control the playback of sonic fragments from a database of violin sounds, using concatenative synthesis as the sound engine. The setup allows the performer to improvise freely together with the electronic sounds being played back as she moves around the “sonic space.” The system has been stable in rehearsal and performance, and the simplicity of the approach has been inspiring to both the performer and the audience. ...
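To make the mapping idea a bit more concrete, here is a minimal sketch of position-to-sound selection in the spirit of the piece. It is not the actual setup, which ran as a real-time system with video tracking and concatenative synthesis; the nearest-neighbour lookup, the fragment database, and all names and values below are illustrative assumptions.

```python
# Hypothetical sketch: map a performer's tracked (x, y) stage position to the
# nearest sound fragment in a database laid out across the "sonic space".
# This stands in for the real video-tracking + concatenative-synthesis setup.
import numpy as np

rng = np.random.default_rng(1)

# Illustrative database: 200 violin fragments, each tagged with a stage position (metres).
fragment_positions = rng.uniform(0.0, 5.0, size=(200, 2))

def select_fragment(performer_xy):
    """Return the index of the fragment whose tag is closest to the performer."""
    distances = np.linalg.norm(fragment_positions - performer_xy, axis=1)
    return int(np.argmin(distances))

# Simulated tracking output: the performer walks across the stage.
for x in np.linspace(0.5, 4.5, 5):
    position = np.array([x, 2.5])
    print(f"position {position} -> play fragment {select_fragment(position)}")
```

In the actual piece the selection and playback were handled by the sound engine in real time; the point here is only the idea of mapping stage position into a database of sounds.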

January 8, 2013 · 2 min · 290 words · ARJ

Paper #1 at SMC 2012: Evaluation of motiongrams

Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen. **Abstract:** Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above-mentioned variables. ...
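For readers unfamiliar with the technique, here is a minimal sketch of how a motiongram can be computed, assuming the common recipe of frame differencing followed by collapsing each motion image along one axis. The published implementation includes additional processing options, and the synthetic frames below are purely illustrative so the example runs without a video file.

```python
# A minimal, illustrative motiongram: absolute frame differences ("motion
# images"), each collapsed over the image width and stacked over time.
import numpy as np

def motiongram(frames):
    """frames: array of shape (time, height, width), greyscale values in [0, 1]."""
    motion_images = np.abs(np.diff(frames, axis=0))  # frame differencing
    return motion_images.mean(axis=2)                # collapse width -> (time-1, height)

# Synthetic video: a bright horizontal bar moving downwards over 60 frames.
t, h, w = 60, 120, 160
frames = np.zeros((t, h, w))
for i in range(t):
    frames[i, i:i + 10, :] = 1.0

mg = motiongram(frames)
print(mg.shape)  # (59, 120): one row per frame transition, one column per image row
```

Plotted with time along the horizontal axis (i.e. the transpose of `mg`), the moving bar shows up as a diagonal stripe, which is the kind of trace the paper evaluates under different video conditions.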

July 12, 2012 · 1 min · 166 words · ARJ

New Book: Musical Gestures: Sound, Movement, and Meaning

I am happy to announce the publication of the new book Musical Gestures: Sound, Movement, and Meaning edited by Rolf Inge Godøy and Marc Leman (2009). The book is published by Routledge and was released last Friday, although it may take a few days before it hits the book shelves (including Amazon). The book is based on research carried out in the EU COST Action 287 ConGAS (Gesture controlled audio systems) that ran from 2004 to 2007. I have contributed a chapter called Musical Gestures: concepts and methods in research, co-authored with Marcelo M. Wanderley, Rolf Inge Godøy, and Marc Leman. ...

December 21, 2009 · 2 min · 274 words · ARJ

Motiongrams

Challenge: Traditional keyframe displays of videos are not particularly useful when studying single-shot studio recordings of music-related movements, since they mainly show static postural information and no motion. Using motion images of various kinds helps in visualising what is going on in the recording. Below can be seen (from left): motion image, with noise reduction, with edge detection, with “trails” and added to the original image. ...
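As a rough illustration of those processing steps, here is a small sketch using OpenCV; the specific filters and parameter values are my own illustrative choices, not the ones used in the original implementation.

```python
# Illustrative sketch of the motion-image variants listed above.
import numpy as np
import cv2

def motion_image_variants(prev_frame, frame, trail=None, decay=0.9):
    """prev_frame, frame: greyscale uint8 images of the same size."""
    motion = cv2.absdiff(frame, prev_frame)                            # plain motion image
    _, denoised = cv2.threshold(motion, 20, 255, cv2.THRESH_TOZERO)    # simple noise reduction
    edges = cv2.Canny(denoised, 50, 150)                               # edge detection
    if trail is None:                                                  # decaying "trails"
        trail = denoised
    else:
        trail = cv2.max(denoised, (trail * decay).astype(np.uint8))
    overlay = cv2.addWeighted(frame, 0.7, denoised, 0.3, 0)            # motion added to original
    return motion, denoised, edges, trail, overlay

# Tiny synthetic example: a square that moves a few pixels between frames.
a = np.zeros((100, 100), np.uint8); a[20:40, 20:40] = 255
b = np.zeros((100, 100), np.uint8); b[25:45, 25:45] = 255
outputs = motion_image_variants(a, b)
print([o.shape for o in outputs])
```

Each returned image corresponds to one of the variants named above, from the plain motion image to the version overlaid on the original frame.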

November 1, 2006 · 2 min · 373 words · ARJ

Building low-cost music controllers

New publication on our CheapStick music controller. Reference: A. R. Jensenius, R. Koehly, and M. M. Wanderley. Building low-cost music controllers. In R. Kronland-Martinet, T. Voinier, and S. Ystad, editors, CMMR 2005, LNCS 3902, pages 123–129. Berlin Heidelberg: Springer-Verlag, 2006. (PDF from Springer) **Abstract:** This paper presents our work on building low-cost music controllers intended for educational and creative use. The main idea was to build an electronic music controller, including sensors and a sensor interface, on a “10 euro” budget. We have experimented with turning commercially available USB game controllers into generic sensor interfaces, and making sensors from cheap conductive materials such as latex, ink, porous materials, and video tape. Our prototype controller, the CheapStick, is comparable to interfaces built with commercially available sensors and interfaces, but at a fraction of the price. ...
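To give a flavour of the game-controller-as-sensor-interface idea, here is a small sketch that polls axis and button values from a USB game controller using pygame. The library choice and polling loop are my own illustration, not part of the paper; the point is that custom sensors wired into such a controller would appear to the host computer as ordinary axis and button values.

```python
# Illustrative sketch: read a USB game controller as a generic sensor interface.
import time
import pygame

pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No game controller found.")

stick = pygame.joystick.Joystick(0)
stick.init()

for _ in range(100):                 # poll for a few seconds
    pygame.event.pump()              # let pygame update device state
    axes = [stick.get_axis(i) for i in range(stick.get_numaxes())]
    buttons = [stick.get_button(i) for i in range(stick.get_numbuttons())]
    print(axes, buttons)
    time.sleep(0.05)
```

From there, the incoming values could be mapped to sound parameters in whatever environment one prefers.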

June 1, 2006 · 1 min · 183 words · ARJ