Below you will find pages that use the taxonomy term “motion”
January 7, 2022
Try not to headbang challenge
I recently came across a video of the so-called Try not to headbang challenge, where the idea is, well, not to headbang while listening to music. This immediately caught my attention. After all, I have been researching music-related micromotion for several years and have run the Norwegian Championship of Standstill since 2012.
Here is an example of Nath & Johnny trying the challenge:
https://www.youtube.com/watch?v=-I4CBsDT37I
As seen in the video, they are doing OK, although they are far from sitting still.
January 22, 2021
New run of Music Moves
I am happy to announce a new run (the 6th) of our free online course Music Moves: Why Does Music Make You Move?. Here is a 1-minute welcome video:
The course starts on Monday (25 January 2021) and will run for six weeks. In the course, you will learn about the psychology of music and movement, and how researchers study music-related movements.
We developed the course 5 years ago, but the content is still valid.
March 22, 2020
Method chapter freely available
I am a big supporter of Open Access publishing, but for various reasons some of my publications are not openly available by default. This is the case for the chapter Methods for Studying Music-Related Body Motion that I have contributed to the Springer Handbook of Systematic Musicology.
I am very happy to announce that the embargo on the book ran out today, which means that a pre-print version of my chapter is finally freely available in UiO’s digital repository.
January 18, 2018
New Publication: Analyzing Free-Hand Sound-Tracings of Melodic Phrases
We have done several sound-tracing studies at the University of Oslo before, and here is a new one focusing on free-hand sound-tracings of melodies. I am happy to say that this is a gold open access publication, and that all the data are also available. So it is both free and “free”!
Kelkar, Tejaswinee; Jensenius, Alexander Refsum.
Analyzing Free-Hand Sound-Tracings of Melodic Phrases.
Applied Sciences 2018, 8, 135. (Special Issue Sound and Music Computing)
December 13, 2017
Come work with me! Lots of new positions at University of Oslo
I recently mentioned that I have been busy setting up the new MCT master’s programme. But I have been even busier preparing the startup of our new Centre of Excellence, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion. This is a large undertaking, and a collaboration between researchers from musicology, psychology and informatics. A visual “abstract” of the centre can be seen in the figure to the right.
October 9, 2017
And we're off: RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
I am happy to announce that RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion officially started last week. This is a new centre of excellence funded by the Research Council of Norway.
Even though we have formally started, this mainly means that the management group has begun its work. Establishing a centre with 50-60 researchers is not done in a few days, so we will more or less spend the coming year getting up to speed.
September 11, 2017
Sverm-Resonans - Installation at Ultima Contemporary Music Festival
I am happy to announce the opening of our new interactive art installation at the Ultima Contemporary Music Festival 2017: Sverm-resonans.
Time and place: Sep. 12, 2017 12:30 PM - Sep. 14, 2017 3:30 PM, Sentralen
Conceptual information
The installation is as much haptic as audible.
An installation that gives you access to heightened sensations of stillness, sound and vibration.
Stand still. Listen. Locate the sound. Move. Stand still. Listen. Hear the tension.
July 20, 2017
SMC paper based on data from the first Norwegian Championship of Standstill
We have carried out three editions of the Norwegian Championship of Standstill over the years, but only with the new resources in the MICRO project have we finally been able to properly analyze all the data. The first publication coming out of the (growing) data set was published at SMC this year:
Reference: Jensenius, Alexander Refsum; Zelechowska, Agata & Gonzalez Sanchez, Victor Evaristo (2017). The Musical Influence on People’s Micromotion when Standing Still in Groups. In Tapio Lokki; Jukka Pa…
March 16, 2017
New Centre of Excellence: RITMO
I am happy to announce that the Research Council of Norway has awarded funding to establish RITMO Centre of Excellence for Interdisciplinary Studies in Rhythm, Time and Motion. The centre is a collaboration between the Departments of Musicology, Psychology and Informatics at the University of Oslo.
Project summary
Rhythm is omnipresent in human life, as we walk, talk, dance and play; as we tell stories about our past; and as we predict the future.
February 5, 2017
Music Moves on YouTube
We have run our free online course Music Moves a couple of times on the FutureLearn platform. The course consists of a number of videos, as well as articles, quizzes, etc., all of which help create a great learning experience for the people who take part.
One great thing about the FutureLearn model (similar to Coursera, etc.) is the focus on creating a complete course. There are many benefits to such a model, not least the creation of a virtual student group that interacts in a way somewhat similar to campus students.
January 24, 2016
New MOOC: Music Moves
Together with several colleagues, and with great practical and economic support from the University of Oslo, I am happy to announce that we will soon kick off our first free online course (a so-called MOOC) called Music Moves.
Music Moves: Why Does Music Make You Move? Learn about the psychology of music and movement, and how researchers study music-related movements, with this free online course.
Go to course – starts 1 Feb
August 3, 2015
New paper: Test–retest reliability of computer-based video analysis of general movements in healthy term-born infants
I have for several years collaborated with researchers at NTNU in Trondheim on developing video analysis tools for studying the movement patterns of infants. This has resulted in several papers, international testing (and a TV documentary). Now there is a new paper out, with data testing the reliability of the video analysis method:
Reference:
Valle, Susanne Collier, Ragnhild Støen, Rannei Sæther, Alexander Refsum Jensenius, and Lars Adde.
June 2, 2015
New paper: MuMYO - Evaluating and Exploring the MYO Armband for Musical Interaction
Yesterday, I presented my microinteraction paper here at the NIME conference (New Interfaces for Musical Expression), organised at Louisiana State University, Baton Rouge, LA. Today I am presenting a poster based on a paper written together with two of my colleagues at UiO.
Title
MuMYO - Evaluating and Exploring the MYO Armband for Musical Interaction
Authors
Kristian Nymoen, Mari Romarheim Haugen, Alexander Refsum Jensenius
Abstract
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform.
June 30, 2014
New publication: To Gesture or Not (NIME 2014)
This week I am participating at the NIME conference, organised at Goldsmiths, University of London. I am doing some administrative work as chair of the NIME steering committee, and I am also happy to present a paper tomorrow:
Title
To Gesture or Not? An Analysis of Terminology in NIME Proceedings 2001–2013
Links
Paper (PDF)
Presentation (HTML)
Spreadsheet with summary of data (ODS)
OSX shell script used for analysis
Abstract
The term ‘gesture’ has been a buzzword in the NIME community since the beginning of its conference series.
June 3, 2013
Analyzing correspondence between sound objects and body motion
New publication, now published in ACM Transactions on Applied Perception:
Title
Analyzing correspondence between sound objects and body motion
Authors
Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen
Abstract
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present.
February 20, 2013
New PhD Thesis: Kristian Nymoen
I am happy to announce that fourMs researcher Kristian Nymoen has successfully defended his PhD dissertation, and that the dissertation is now available in the DUO archive. I have had the pleasure of co-supervising Kristian’s project, and also to work closely with him on several of the papers included in the dissertation (and a few others).
Reference K. Nymoen. Methods and Technologies for Analysing Links Between Musical Sound and Body Motion.
January 14, 2013
New publication: Some video abstraction techniques for displaying body movement in analysis and performance
Today the MIT Press journal Leonardo has published my paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance”. The paper is a summary of my work on different types of visualisation techniques of music-related body motion. Most of these techniques were developed during my PhD, but have been refined over the course of my post-doc fellowship.
The paper is available from the Leonardo web page (or MUSE), and will also be posted in the digital archive at UiO after the 6-month embargo period.
November 1, 2012
Definitions: Motion, Action, Gesture
I have been discussing definitions of the terms motion/movement, action and gesture several times before on this blog (for example here and here). Here is a summary of my current take on these three concepts:
Motion: displacement of an object in space over time. This object could be a hand, a foot, a mobile phone, a rod, whatever. Motion is an objective entity, and can be recorded with a motion capture system.
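To make the definition concrete, here is a minimal sketch (the function and data are hypothetical, not from any of my toolboxes) of how motion as displacement over time can be computed from a series of recorded positions, such as a single mocap marker:

```python
import math

def displacement(positions):
    """Total displacement of an object from a list of (x, y, z)
    position samples, e.g. from a motion capture recording:
    the sum of the distances between consecutive samples."""
    total = 0.0
    for p, q in zip(positions, positions[1:]):
        total += math.dist(p, q)
    return total

# A hand moving 1 m along the x-axis over four samples:
path = [(0.0, 0.0, 0.0), (0.25, 0.0, 0.0), (0.5, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(displacement(path))  # 1.0
```

If the samples are equally spaced in time, dividing each distance by the sampling interval gives the instantaneous speed instead.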
August 13, 2012
Hi-speed guitar recording
I was in Hamburg last week, teaching at the International Summer School in Systematic Musicology (ISSSM). While there, I was able to test a newly acquired high-speed video camera (Phantom V711) at the Department of Musicology.
The beautiful building of the Department of Musicology in Hamburg.
They have some really cool drawings in the ceiling at the entrance of the Department of Musicology in Hamburg.
November 10, 2011
Motionlessness
Yesterday Miles Phillips suggested that the word “motionlessness” may be what I am after when it comes to describing the act of standing still. He further pointed me to a web site with a list of the world records for motionlessness. The rules for competing in motionlessness are as follows:
The record is for continuously standing motionless. You must stand: sitting is not allowed. No facial movements are allowed other than the involuntary blinking of the eye.
October 5, 2011
Audio recordings as motion capture
I spend a lot of time walking around the city with my daughter these days, and have been wondering how much I move and how the movement is distributed over time. To answer these questions, and to try out a method for easy and cheap motion capture, I decided to record today’s walk to the playground.
I could probably have recorded the accelerometer data in my phone, but I wanted to try an even more low-tech solution: an audio recorder.
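The basic idea can be sketched in a few lines: compute a windowed RMS envelope of the recording, so that louder stretches (footsteps, rustling clothes) stand out as "more motion". This is a hypothetical illustration of the approach, not the actual analysis I used:

```python
import math

def rms_envelope(samples, win=1024):
    """Windowed RMS of an audio signal: louder rustling and
    footsteps give higher values, a rough proxy for how much
    one is moving."""
    env = []
    for i in range(0, len(samples) - win + 1, win):
        frame = samples[i:i + win]
        env.append(math.sqrt(sum(s * s for s in frame) / win))
    return env

# A quiet stretch followed by a 'busier' (louder) stretch:
quiet = [0.01] * 2048
busy = [0.5] * 2048
print(rms_envelope(quiet + busy))  # rises from ~0.01 to ~0.5
```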
October 2, 2011
Difference between the terms movement and motion
Terminology is always challenging. I have previously written about definitions of action and gesture several times (e.g. here, here, and here), and in chapter 2 of the book Musical gestures: sound, movement, and meaning (Routledge, 2010).
Movement vs motion
There are, however, two words/terms that I still find very challenging to define properly and to differentiate: movement and motion. In Norwegian, we only have one word (bevegelse) for describing movement/motion, which makes everything much simpler.
March 21, 2011
Standing still
In between organizing a little conference, teaching (MUS2006, MUS2860, MUS4830), and finalizing some publications, I have started a new research/artistic project with Kari Anne Bjerkestrand. I’ll write a lot more on this later, but for now I just wanted to share a plot from a motion capture recording of a single marker placed on my neck (C7). The recording is of me standing still for 10 minutes. Quite a lot of motion for someone standing still… To be continued.
July 1, 2010
Quantity of motion of an arbitrary number of inputs
In video analysis I have been working with what is often referred to as “quantity of motion” (which should not be confused with momentum, the product of mass and velocity p=mv), i.e. the sum of all active pixels in a motion image. In this sense, QoM is 0 if there is no motion, and has a positive value if there is motion in any direction.
Working with various types of sensor and motion capture systems, I see the same need to know how much motion there is in the system, independent of the number of variables and dimensions in the system studied.
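One simple way to generalize this, sketched below under my own assumptions (not a definitive implementation), is to sum the absolute frame-to-frame differences across all variables, whatever they represent:

```python
def quantity_of_motion(frames):
    """Per frame step, sum the absolute first differences across
    all dimensions: 0 when nothing changes, positive otherwise.
    Works for any number of variables (pixels, sensor channels,
    mocap marker coordinates)."""
    qom = []
    for prev, cur in zip(frames, frames[1:]):
        qom.append(sum(abs(c - p) for p, c in zip(prev, cur)))
    return qom

still = [[1.0, 2.0, 3.0]] * 3                     # no motion at all
moving = [[0.0, 0.0], [1.0, 0.5], [1.0, 0.5]]     # motion, then stillness
print(quantity_of_motion(still))   # [0.0, 0.0]
print(quantity_of_motion(moving))  # [1.5, 0.0]
```

Dividing by the number of variables would additionally make the values comparable between systems of different dimensionality.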
July 17, 2008
Black box in the lab
Last week we started setting up a “black box” in the new lab space. It is great to finally have a more permanent motion lab set up that we can use for various types of observation studies and recording sessions.
May 23, 2008
Janer's dissertation
I had a quick read of Jordi Janer’s dissertation today: Singing-Driven Interfaces for Sound Synthesizers. The dissertation presents a good overview of various types of voice analysis techniques, and suggestions for various ways of using the voice as a controller for synthesis. I am particularly interested in his suggestion of a GDIF namespace for structuring parameters for voice control:
/gdif/instrumental/excitation/loudness x
/gdif/instrumental/modulation/pitch x
/gdif/instrumental/modulation/formants x1 x2
/gdif/instrumental/modulation/breathiness x
/gdif/instrumental/selection/phoneticclass x
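As a toy illustration of how such a namespace could be filled with values, here is a hypothetical helper (the function name and the numbers are mine; only the address paths come from the dissertation):

```python
def gdif_messages(features):
    """Format voice-analysis values as OSC-style strings under
    the /gdif/instrumental namespace proposed above."""
    msgs = []
    for path, values in features.items():
        args = " ".join(str(v) for v in values)
        msgs.append(f"/gdif/instrumental/{path} {args}")
    return msgs

msgs = gdif_messages({
    "excitation/loudness": [0.8],
    "modulation/formants": [700, 1220],
})
print(msgs[0])  # /gdif/instrumental/excitation/loudness 0.8
```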
May 12, 2008
Kickoff-seminar
Some pictures from the kickoff-seminar for the Sensing Music-related Actions project last week:
Project leader Rolf Inge Godøy started with a short presentation of the new project.
Then Marcelo M. Wanderley (McGill, Montreal) gave an overview of various types of motion capture solutions, and the pros and cons of each of them. He stressed two main challenges he had faced over the years: synchronisation of various types of mo-cap data with audio, video, music notation, etc.
May 12, 2008
Optitrack motion capture
I held a guest lecture at the speech, music and hearing group at KTH in Stockholm a couple of weeks ago, and got a tour of the lab afterwards. There I got a demonstration of the Optitrack optical motion capture system, which, as compared to other similar systems, is an amazingly cheap solution starting at $4999. Obviously, it has lower accuracy and precision than the larger systems, but then it also costs 1/20 of the price… However, 100 Hz speed and millimeter precision is decent for a USB-based system, and the cameras are really portable (10x5 cm or so each).
May 8, 2008
Motion Capture System Using Accelerometers
Came across a student project from Cornell on doing motion capture using accelerometers, based on the Atmel controller. It is a nice overview of many of the challenges faced when working with accelerometers, and the implementation seems to work well.
May 5, 2008
Softkinetic
Dutch company Softkinetic offers what they call natural interfaces, i.e. interfaces where you don’t have to put on any sensors to interact:
Softkinetic operates with a single depth sensing camera, requires no marker (no gamepad, no wiimote, no special gloves or clothing, no headset - nothing), and works under all lighting conditions and scene settings (at home, in a fitness center, an amusement park, a classroom, a game cafe, an industrial simulation room - anywhere).
April 24, 2008
Sensing Music-related Actions
The web page for our new research project called Sensing Music-related Actions is now up and running. This is a joint research project of the departments of Musicology and Informatics, and has received external funding through the VERDIKT program of the The Research Council of Norway. The project runs from July 2008 until July 2011.
The focus of the project will be on basic issues of sensing and analysing music-related actions, and creating various prototypes for testing the control possibilities of such actions in enactive devices.
February 25, 2008
Apple tries to patent gestures
Wired reports that Apple has filed around 200 patent applications related to multitouch and gesture control:
Yet it appears that the company is not trying to patent the entire multitouch concept, but rather trying to protect certain uses of it – specifically the methods to interpret gestures, and in some cases, the gestures themselves.
It is interesting to see that they mention the interpretation of a gesture. This means that they separate between gesture and action, i.
February 15, 2008
Recordings in Casa Paganini
The location of the EyesWeb Week is the facilities of the DIST group in the beautiful Casa Paganini, including a large auditorium next to the laboratories. This allows for an ecological setting for experiments, since performers can actually perform on a real stage with a real audience. I wish we could have something like this in Oslo!
Here is a picture from an experimental setup where we are looking at the synchronisation between the musicians in a string trio.
February 15, 2008
Tactile experience
On the plane down to Genova I had an interesting tactile experience. It turned out that the box that the lunch was served in had this 3D ornament of a walnut on the top cover (not so easy to represent in a picture, but you get the idea). Interesting how much this enhanced the experience!
February 14, 2008
Emotional music examples
The Peretz group has made available a set of musical excerpts with emotion ratings. Perhaps not the most exciting musical collection, but I think it is very important that the community starts building data sets that can be used as references for various types of analyses.
We really need to create a set of music recordings including motion capture and video, but this first requires that we develop some common format that can be used for synchronisation and sharing.
February 14, 2008
Syncing Movement and Audio using a VST-plugin
I just heard Esteban Maestre from UPF present his project on creating a database of instrumental actions of bowed instruments, for use in the synthesis of score-based material. They have come up with a very interesting solution to the recording and synchronisation of audio with movement data: Building a VST plugin which implements recording of motion capture data from a Polhemus Liberty, together with bow sensing through an Arduino. This makes it possible to load the VST-plugin inside regular audio sequencing software and do the recording from there.
February 14, 2008
TRIL centre, Emobius and Shimmer
I just heard a presentation by a group of researchers from the Tril centre (Technology Research for Independent Living) in Dublin. They have developed Emobius (or EyesWeb Mobius), a set of blocks for various types of biomedical processing, as well as a graphical front-end to the forthcoming EyesWeb XMI. It is fascinating to see how the problems they are working on in applications for older persons are so similar to what we are dealing with in music research.
February 13, 2008
Motiongrams in EyesWeb!
We had a programming session this morning, and Paolo Coletta implemented a block for creating motiongrams in EyesWeb. It will be available in the new EyesWeb XMI release at the end of this week. Great!
January 7, 2008
Windows Mobile 7 and Touch Gesture
All the big companies are launching new devices with “gesture control” these days, and Microsoft is following along. This insider story presents some of the new features in Windows Mobile 7, which is supposed to take on the iPhone later this year.
The most interesting part of the “leak”, I think, is the differentiation between touch gestures and motion gestures, where the former is related to what I would call manipulation, such as in this example of one and two finger strokes:
December 11, 2007
Mapping and conditioning
The concept of “mapping” is frequently used in the computer music community these days, and has also been used over the last couple of days during the Jamoma workshop. This reminded me about the distinction between mapping and conditioning, as frequently pointed out by Marcelo Wanderley:
Conditioning: filtering, scaling and normalizing signals in a 1-to-1 mapping.
Mapping: creating couplings between multidimensional data sets, e.g. MxN.
For clarity’s sake it is probably useful to distinguish between the two.
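The distinction can be sketched in code (a hypothetical illustration, with function names and numbers of my own choosing):

```python
def condition(x, in_lo, in_hi):
    """Conditioning: a 1-to-1 operation on a single signal,
    here clamping and normalizing it to the range 0..1."""
    x = max(in_lo, min(in_hi, x))
    return (x - in_lo) / (in_hi - in_lo)

def map_many_to_many(inputs, weights):
    """Mapping: coupling M inputs to N outputs, here as a
    weighted sum through an N-by-M matrix of weights."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

pressure = condition(700, 0, 1000)          # one sensor -> one parameter
outs = map_many_to_many([0.5, 1.0],         # two sensors -> three parameters
                        [[1.0, 0.0], [0.5, 0.5], [0.0, 2.0]])
print(pressure)  # 0.7
print(outs)      # [0.5, 0.75, 2.0]
```

In practice one would typically condition each signal first and then feed the cleaned-up values into the mapping.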
November 8, 2007
Musical vs. Music-related
Working on a book chapter, I am trying to clarify some terminology. Right now I am thinking about the differences between “musical” and “music-related” movements/actions/gestures. What is the difference? I find that it makes sense to think about whether the action is direct or indirect. In other words:
Musical actions: actions involved in music making, e.g. performing an instrument (i.e. sound-producing actions).
Music-related actions: actions that are the result of, or influenced by, music, e.
November 5, 2007
Bodibeat
Yamaha has announced a product called Bodibeat, a music player that adjusts the tempo of the music to follow your walking or running. It is based on an accelerometer and, I guess, an algorithm detecting peaks in the continuous movement signal. This gives the pace of the walker or runner, which can then be used to control the playback speed of a music file. Apparently, the technology is based on research by BMAT, which is a spinoff from UPF.
September 25, 2007
Sumo wrestlers
Came across Sumotori Dreams, a sumo wrestling game based on physical models. The movements of the wrestlers seem to be based on some kind of physical modelling, and their presence is astonishingly real and far-fetched at the same time. Great fun!
August 30, 2007
Panel at ICMC
Yesterday I chaired a panel session at the International Computer Music Conference in Copenhagen entitled: “The Need of Formats for Streaming and Storing Music-Related Movement and Gesture Data”. Participants of the panel were:
Alexander R. Jensenius (Oslo)
Benjamin Knapp [Antonio Camurri] (SARC / Genova)
Nicolas Castagné (Acroe, Grenoble)
Esteban Maestre (Pompeu Fabra)
Joseph Malloch (McGill)
Stuart Pullinger [Douglas McGilvray] (Glasgow)
Diemo Schwarz (IRCAM)
Matthew Wright (UC Berkeley / Stanford)
The discussion was lively and the conclusion was to work towards more collaborative efforts when it comes to establishing standards and formats in the community.
August 26, 2007
Interview on ADHD
On Friday I appeared in an interview in Aftenposten, one of the larger newspapers in Norway. The interview describes a recently started collaboration between the Musical Gestures group and Terje Sagvolden’s group working on ADHD. More precisely, they are interested in using my Musical Gestures Toolbox and motiongrams for studying the movements of rats and children with ADHD.
June 12, 2007
ICMC panel
My panel proposal for ICMC 2007 in Copenhagen has been accepted. The title of the panel is The Need of Formats for Streaming and Storing Music-Related Movement and Gesture Data, and that more or less sums up what we are going to discuss.
The other participants in the panel will be Antonio Camurri (Genova), Nicolas Castagne (ACROE, Grenoble), Esteban Maestre (Pompeu Fabra), Joseph Malloch (McGill), Douglas McGilvray (Glasgow), Diemo Schwarz (IRCAM) and Matthew Wright (UC Berkeley / Stanford).
May 10, 2007
Björk to tour with Reactable
The MTG group at Pompeu Fabra reports that Björk will use the Reactable in her upcoming tour:
With her first tour concert at the Coachella Festival in California, the Icelandic singer Björk introduced the reactable for the first time to a mainstream audience. Our instrument will form a key element of the artist’s current world tour “Volta” which will appear at numerous music festivals during the next 18 months.
I have tried the Reactable at various conferences and it is great that this innovative collaborative instrument gets some attention outside the music tech community.
May 2, 2007
Surface computing
A Microsoft demo of surface computing, showing several prototypes of “gesture control” (what I would call action control) in software.
March 26, 2007
Étienne-Jules Marey (1830 - 1904)
Came across a great website with lots of references to the work of Étienne-Jules Marey (1830 - 1904), a pioneer in early photography. I particularly like his chronophotographies.
March 15, 2007
ISSSM 2007
Students in musicology, music cognition and technology should consider ISSSM 2007:
Following on the success of the first international summer school in systematic musicology (ISSSM 2006), the summer school will be held for the second time at IPEM, the research centre of the Department of Musicology of Ghent University (Belgium). This year courses will focus on current topics in the research field such as embodied music cognition, music information retrieval and music and interactive media.
March 14, 2007
EMMA: Extensible MultiModal Annotation markup language
Strange that I didn’t see this before. Apparently, W3C has made a draft for multimodal annotation called EMMA: Extensible MultiModal Annotation markup language. The abstract of the document reads:
The W3C Multimodal Interaction working group aims to develop specifications to enable access to the Web using multimodal interaction. This document is part of a set of specifications for multimodal systems, and provides details of an XML markup language for containing and annotating the interpretation of user input.
February 22, 2007
Fidgeting
Yesterday, Jeroen Arendsen introduced me to the concept of fidgeting, the stuff that happens in between actions/gestures in a continuous flux of movement. I have been looking for a good word to describe this type of movement (which I have been calling “movement-noise”), and I am happy to finally have a better word for it. I made a small sketch showing how fidgeting fits into my movement-flux diagram to celebrate the new discovery:
February 21, 2007
Movement, Action (and Gesture) revisited
Ok, so I have been discussing the concepts of movement, action and gesture with various people since I posted this entry, and I have come to disagree with myself. Marcelo pointed out that an action doesn’t necessarily have to involve a movement, as touch and other types of manipulation should also be considered an action. After all, holding down the keys on a piano after the attack results in no movement, but it is certainly an action.
February 17, 2007
Action/Gesture Units
While I’m at it, here’s a sketch of Kendon’s action unit, which I think is equally valid for music-related actions.
February 17, 2007
Movement and Action
Just to clarify: I am using action to denote chunks of movement:
Action is thus highly subjective: it is a mental construct (for either the performer or perceiver, or both) of chunks in the continuous flux of movement. Acknowledging the fact that our brain works at multiple speeds and resolutions, there could also be actions that are chunks of smaller actions.
It is these actions (i.e. movement chunks) that will be the basis for gestures (i.
February 17, 2007
Movement, action, gesture
Ever since I started my PhD project I have been struggling with the word gesture. Now as I am working on a theory chapter for my dissertation, I have had to really try and decide on some terminology, and this is my current approach:
I use movement as the general term to describe the act of changing physical position of body parts related to music performance or perception. Action is used to denote goal-directed movements that form a separate unit.
January 16, 2007
NOVINT Falcon
NOVINT has finally gotten around to releasing the Falcon, the much-awaited first cheap haptic controller. I have my doubts about how solid the thing is, at least knowing how fragile the many times more expensive Phantoms are. Nevertheless, the Falcon will finally introduce haptics to everyone.
January 11, 2007
Gestures and technology
What I find most fascinating about Apple’s new iPhone, is the shift from buttons to body. Getting away from the paradigm of pressing buttons to make a call or to navigate, the iPhone boasts a large multi-touch screen where the user will be able to interact by pointing at pictures and objects. Furthermore, the built-in rotation sensor will sense the direction of the device and rotate the screen accordingly, somehow similar to how new digital cameras rotate the pictures you take automatically.
January 11, 2007
Music for One Apartment and Six Drummers
A charming little Swedish Stomp-inspired video:
January 6, 2007
Tim Place on parameter control
Gregory Taylor has interviewed Tim Place about Hipno. It is interesting how he comments on the Hipnoscope control:
The Hipnoscope does something that I’m quite proud of, which is that it allows you to quickly audition a plug-in and some of its possibilities. But at the same time it really rewards those patient explorers who spend time really focusing on the subtleties it offers. I still find myself surprised at the results I get sometimes - the Hipnoscope creates this palette where there is an almost infinite range of subtlety with some of the plug-ins.
January 5, 2007
Visual Acoustics
Christian Frisson pointed me to Visual Acoustics, a wonderful little web-based music improvisation tool. Very simple and elegant.
December 20, 2006
Movement-Sound Couplings
I am working on the theory chapter of my dissertation, and am trying to pin down some terminology. For a long time I have been using the concept of gesture-sound relationships to denote the intimate links between a physical movement and the resultant sound. However, since I am throwing away gesture for now, I also need to reconsider the rest of my vocabulary.
Hodgins (2004) uses the term music-movement structural correspondences, which I find problematic since it places music first.
December 1, 2006
Guest lecture: Benoît Bardy
Benoît Bardy held a very interesting guest lecture on the topic “Perception-Action Dynamics Underlying Gesture Classification” yesterday.
An interesting opening remark was on terminology. He commented that in his field (kinesiology) they never use the term gesture at all, while in the ConGAS community no one seems to talk about movement. He suggested the following definitions for some key terms:
Gesture: non-verbal communication, body language, sign, expressive movements
Movement: change in position/orientation
Action: goal-directed movement
Skill: capacity to reach a goal with efficient performance
I have tried to understand if there is a difference between movement and motion, but he couldn’t enlighten me there.
October 9, 2006
Gypsy MIDI controller
Nick Rothwell reviews the Gypsy MIDI controller in Sound on Sound. An excerpt from his conclusion:
I know some artists who could build great live performances around a Gypsy MIDI suit, and others who would merely look like plonkers. As to the first question, here at Cassiel Central we’ve been through all manner of MIDI controllers and sensing systems, from fader boxes (motorised and not) through accelerometers, ultrasound systems, camera tracking, joysticks, game controllers and Buchla devices, and some common issues emerge.
August 22, 2006
Apple Remote Control
I am getting adjusted to my new MacBook and have realized that the remote control is a funny little thing. Cool features:
- Works with Keynote
- Holding down the play button puts the computer to sleep
- Shows up as "Apple IR" using HI in Max/MSP, so it can be used for controlling anything there

The only problem is that I can't turn off the system functions while using it in Max. To avoid people taking control of a presentation, here is a short description of how it is possible to pair the remote:
July 31, 2006
Khronos Projector
The Khronos Projector by Alvaro Cassinelli is an interactive-art installation allowing people to explore pre-recorded movie content in an entirely new way. […] The goal of the Khronos Projector is to go beyond these forms of exclusive temporal control, by giving the user an entirely new dimension to play with: by touching the projection screen, the user is able to send parts of the image forward or backwards in time. By actually touching a deformable projection screen, shaking it or curling it, separate “islands of time” as well as “temporal waves” are created within the visible frame.
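The temporal-displacement idea can be sketched in a few lines. This is a toy version, assuming frames are 2D greyscale arrays and the screen deformation arrives as a per-pixel depth map; it does not reflect the installation's actual implementation:

```python
# Toy sketch of the Khronos Projector idea: pressing deeper at a pixel
# pulls that pixel further back in time within the pre-recorded frames.

def khronos_frame(frames, depth_map, t):
    """Compose an output frame where each pixel is sampled from a
    different point in time, offset by how far the screen is pushed."""
    height = len(depth_map)
    width = len(depth_map[0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # Clamp the time index so the offset stays inside the recording.
            ti = max(0, min(len(frames) - 1, t - depth_map[y][x]))
            row.append(frames[ti][y][x])
        out.append(row)
    return out

# Tiny 2x2 example: three "frames" of constant brightness 0, 1, 2.
frames = [[[k] * 2 for _ in range(2)] for k in range(3)]
depth = [[0, 1], [2, 0]]  # two pixels are pushed back in time
print(khronos_frame(frames, depth, 2))  # [[2, 1], [0, 2]]
```

The clamping keeps "islands of time" from reaching outside the recorded material; the real installation of course does this per pixel on full video.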
July 17, 2006
New book: New Digital Musical Instruments: Control and Interaction Beyond the Keyboard
Eduardo Miranda and Marcelo M. Wanderley have just released a new book called New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. The chapters are:
- Musical Gestures: Acquisition and Mapping
- Gestural Controllers
- Sensors and Sensor-to-Computer Interfaces
- Biosignal Interfaces
- Toward Intelligent Musical Instruments

So far, most publications in this field have been in conference proceedings, so it is great to have a book that can be used in teaching.
July 15, 2006
Electromyography
For some experiments we are conducting on piano playing I have been looking for a way of measuring muscle activity, or electromyography as it is more properly called:
Electromyography (EMG) is a medical technique for evaluating and recording physiologic properties of muscles at rest and while contracting. EMG is performed using an instrument called an electromyograph to produce a record called an electromyogram. An electromyograph detects the electrical potential generated by muscle cells when these cells contract, and also when the cells are at rest.
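Raw EMG oscillates around zero, so a common first processing step is to compute a moving RMS envelope, which tracks overall muscle activation. A minimal plain-Python sketch, not tied to any particular EMG device:

```python
# Moving root-mean-square envelope of an EMG signal: a standard way to
# turn a noisy, zero-centred signal into a smooth activation measure.
import math

def rms_envelope(signal, window):
    """Sliding-window RMS of a signal (list of floats)."""
    env = []
    for i in range(len(signal) - window + 1):
        chunk = signal[i:i + window]
        env.append(math.sqrt(sum(x * x for x in chunk) / window))
    return env

emg = [0.0, 1.0, -1.0, 1.0, -1.0, 0.0]
print(rms_envelope(emg, 2))
```

Longer windows give smoother envelopes at the cost of temporal resolution; in practice the raw signal would also be band-pass filtered first.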
June 27, 2006
Emotionally intelligent interfaces
Peter Robinson (University of Cambridge) is working on emotionally intelligent interfaces, and has made a setup for a summer show at a science museum in London where they can track 20 different types of emotional responses using computer vision:
Can you read minds? The answer is most likely ‘yes’. You may not consider it mind reading but our ability to understand what people are thinking and feeling from their facial expressions and gestures is just that.
June 22, 2006
NIME 06 Installations
Still trying to get through all my notes from Resonances… Of the many installations at NIME 06, I found three of them particularly interesting:
Musical Loom by Kingsley Ng was based around an old loom standing in a dark room (or rather a “tent” built between the entrances to the toilets…). It was possible to “play” the loom and sounds and images would appear. The technical setup was built with a combination of infrared cameras and ultrasound sensors, and using EyesWeb for control.
June 21, 2006
ICMC papers
My paper entitled “Using motiongrams in the study of musical gestures” was accepted to ICMC 06 in New Orleans. The abstract is:
Navigating through hours of video material is often time-consuming, and it is similarly difficult to create good visualization of musical gestures in such a material. Traditional displays of time-sampled video frames are not particularly useful when studying single-shot studio recordings, since they present a series of still images and very little movement related information.
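The underlying idea can be illustrated with a toy sketch, much simplified from the paper's actual implementation: take the absolute difference between consecutive frames (the motion image), then collapse each motion image to a single column by averaging across its rows. Stacking these columns over time gives an image where vertical motion becomes visible.

```python
# Toy motiongram: one column of row averages per frame transition.

def motiongram(frames):
    """frames: list of 2D greyscale images (lists of rows of numbers).
    Returns one column (list of row averages) per frame transition."""
    columns = []
    for prev, cur in zip(frames, frames[1:]):
        # Motion image: per-pixel absolute difference between frames.
        motion = [[abs(c - p) for p, c in zip(prow, crow)]
                  for prow, crow in zip(prev, cur)]
        # Collapse each motion image to a single column of row averages.
        columns.append([sum(row) / len(row) for row in motion])
    return columns

# Two 2x2 frames: only the top row changes between them.
f0 = [[0, 0], [0, 0]]
f1 = [[10, 10], [0, 0]]
print(motiongram([f0, f1]))  # [[10.0, 0.0]]
```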
June 21, 2006
Interaction Design
We have started a collaboration between UiO and AHO, and some of the music technology students followed courses with the interaction designers at AHO this spring semester. This was a great success, and I was impressed with what came out of it.
Henrik Marstrander has worked on a table interface where he can control various musical parameters, and Jon Olav Eikenes and Marie Wennesland have made a multi-touch interface modelled after Jeff Han.
June 4, 2006
NIME Workshop: Dance and Technology
Choreographer Dawn Stoppiello and composer/media artist Mark Coniglio of Troika Ranch talked about their work. They are currently using EyesWeb for tracking, and Isadora for video and audio generation.

Marc Downie presented his work developing tools for working with visuals in a dance context. He has been working with realtime motion capture on stage (both Vicon and Motion Analysis). He will release his Fluid system under GPL in October 2006.
May 29, 2006
United States Patent Application: 0060107822
Apple has recently filed an interesting US Patent Application:
The invention generally pertains to a hand-held computing device. More particularly, the invention pertains to a computing device that is capable of controlling the speed of the music so as to affect the mood and behavior of the user during an activity such as exercise. By way of example, the speed of the music can be controlled to match the pace of the activity (synching the speed of the music to the activity of the user) or alternatively it can be controlled to drive the pace of the activity (increasing or decreasing the speed of the music to encourage a greater or lower pace).
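The synching idea in the patent text can be sketched as a playback-rate computation. The ratio formula and the `drive` parameter below are my own illustrative assumptions, not anything claimed in the application:

```python
# Sketch of matching music speed to activity pace: playback rate as the
# ratio between step cadence and track tempo, with an optional nudge
# ("drive") to encourage a faster or slower pace.

def playback_rate(step_cadence_spm, track_bpm, drive=0.0):
    """Rate 1.0 = normal speed. Positive `drive` pushes the music
    slightly ahead of the user's current pace."""
    return step_cadence_spm / track_bpm + drive

print(playback_rate(150, 120))       # 1.25: speed the music up to match pace
print(playback_rate(120, 120, 0.1))  # 1.1: push the user to go faster
```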
May 23, 2006
Nike+iPod
Apple and Nike have teamed up and released the Nike+iPod package, which allows using an iPod nano as a pedometer and sharing the training information online. It is based on a wireless accelerometer (1.37 x 0.95 x 0.30 inches, 0.23 ounces, using a proprietary protocol at 2.4 GHz) and a receiver that connects to the iPod (1.03 x 0.62 x 0.22 inches, 0.12 ounces). The suggested price is US$29, which is very cheap considering the included accelerometer.
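A pedometer built on such an accelerometer boils down to step detection. Here is a toy threshold-crossing counter; the threshold value and the detection scheme are illustrative assumptions, not Nike's actual algorithm:

```python
# Toy step counter: count upward crossings of a threshold in the
# acceleration magnitude. Real pedometers filter and debounce, too.
import math

def count_steps(samples, threshold=1.5):
    """samples: list of (x, y, z) accelerations in g. A step is counted
    each time the magnitude rises above the threshold from below."""
    steps = 0
    above = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1
        above = mag > threshold
    return steps

walk = [(0, 0, 1.0), (0, 0, 1.8), (0, 0, 1.0), (0, 0, 1.9), (0, 0, 1.0)]
print(count_steps(walk))  # 2
```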
May 19, 2006
int.lib by Oli Larkin
int.lib is a set of abstractions/javascripts for Cycling 74’s Max/MSP software that facilitates the control of multiple parameters by navigating a two-dimensional visual environment. It implements a gravitational system, allowing the user to represent presets with variable-sized balls. As the user moves around the space, the size of the balls and their proximity to the mouse cursor affect the weight of each preset in the interpolated output.
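The gravitational scheme can be read as inverse-distance weighting. Here is a sketch under the assumption that each preset's weight is its mass (ball size) divided by the squared distance to the cursor; the actual int.lib weighting may differ:

```python
# Inverse-distance preset interpolation: each preset is a point in 2D
# with a "mass", and the cursor position blends the presets by weight.

def interpolate(cursor, presets):
    """presets: list of (x, y, mass, values). Returns a parameter list
    blended by mass / squared distance to the cursor."""
    cx, cy = cursor
    weights, rows = [], []
    for x, y, mass, values in presets:
        d2 = (x - cx) ** 2 + (y - cy) ** 2
        if d2 == 0:  # cursor exactly on a preset: return it unblended
            return list(values)
        weights.append(mass / d2)
        rows.append(values)
    total = sum(weights)
    return [sum(w * row[i] for w, row in zip(weights, rows)) / total
            for i in range(len(rows[0]))]

presets = [(0, 0, 1.0, [0.0, 100.0]), (2, 0, 1.0, [1.0, 200.0])]
print(interpolate((1, 0), presets))  # midway between equal masses: [0.5, 150.0]
```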
May 18, 2006
Nintendo Wii
Nintendo Wii features a wireless controller, with rumbling, sound and some kind of motion sensing (probably a 3D accelerometer?). It is good to see that such things are finally making their way into commercial products, and it will be interesting to see if we can use this for music making as well.
April 27, 2006
Sidney Fels lecture
Just went to a lecture by Sidney Fels from the Human Communication Technologies lab and MAGIC at the University of British Columbia (interestingly enough located in the Forest Sciences Centre…). He was talking on the topic of intimate control of musical instruments, and presented some different projects:
- GloveTalkII: “a system that translates hand gestures to speech through an adaptive interface.”
- Iamascope: a kaleidoscope-like thing, where users see themselves on a big screen while controlling a simple sound synthesis.
April 24, 2006
Turntable-Controlled Vibrating Chaise Longue
Daito Manabe has developed a Turntable-Controlled Vibrating Chaise Longue where it is possible to feel 34 sounds played back through a vibrating chaise longue. Lots of pictures of the making process are available on Daito’s web page under works/chair.
April 23, 2006
Yves Guiard and bimanual action
Yves Guiard should have held a lecture at McGill last week, but unfortunately could not make it. Reading on his web page and looking up some of the references, I found some interesting comments about bimanual control. He writes:
During the nineteen eighties, I spent a lot of time trying to understand the logic of division of labour between the left and the right hands in human movements. I came to believe there is something deeply misleading to the concept of hand dominance, central to established thinking in the field of human laterality.
April 22, 2006
Palindrome
Found some interesting dance/performance examples at the web site of German/American performance company Palindrome. They are also developing the EyeCon video software for interactive performance.
March 28, 2006
The 5 Rhythms
I recently got to know about the concept of 5 rhythms, and the Norwegian group doing this.
Gabrielle Roth’s The 5 Rhythms are an exhilarating and liberating approach to the exploration of improvised movement and dance that is authentic, inspired and catalytic. The 5 Rhythms (Flowing, Staccato, Chaos, Lyrical, Stillness) are a map which can take you on an ecstatic journey, opening you to the inherent wisdom, creativity and energy of your body.
February 21, 2006
Olympic Figure Skating
Watching the ladies’ figure skating competition from the Olympics, I am amazed by the total lack of connection between gestures and music. To start off with, I am not very impressed by the music accompanying the programmes, most of it being massively layered, romantic orchestral music, but the fact that it is also recorded by a microphone in front of a moderate PA system in the skating hall does not make for a good listening experience.
February 10, 2006
Metadata Hootenanny
Metadata Hootenanny is a tool for easily adding metadata (annotations and chapters) to QuickTime files. It also has a nice timeline function, showing the frames (or only keyframes) of the movie file, where it is easy to navigate and add chapter information. It seems like a quick way of adding information to movie files, although it does not have the more advanced features found in dedicated annotation software.
February 5, 2006
Video Annotation Software
A short overview of various video annotation software:
- Anvil by Michael Kipp is a Java-based program for storing several layers of annotations, like a text sequencer. It can only use AVI files. Intended for gesture research (understood as gestures used when talking).
- Transana from University of Wisconsin, Madison, is developed mainly as a tool for transcribing and describing video and audio content. Seems like it is mainly intended for behavioural studies.
December 2, 2005
In-shoe dynamic pressure measuring
“The pedar system is an accurate and reliable pressure distribution measuring system for monitoring local loads between the foot and the shoe.”
www.novel.de
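One quantity commonly derived from such pressure grids is the centre of pressure, the pressure-weighted mean position under the foot. A minimal sketch, not tied to the pedar system's own software:

```python
# Centre of pressure over a grid of local pressure readings: the
# pressure-weighted average of the sensor positions.

def centre_of_pressure(grid):
    """grid: 2D list of pressures. Returns (row, col) of the COP."""
    total = sum(sum(row) for row in grid)
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * p for row in grid for j, p in enumerate(row)) / total
    return r, c

print(centre_of_pressure([[0, 0], [0, 4]]))  # (1.0, 1.0)
```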