Gesture Workshop

Gesture Workshop, Vannes, France, 18-20 May 2005

This interdisciplinary conference gathered people working with gestures from many angles, covering fields such as sign language, linguistics, behavioural psychology, sports movement, human-computer interaction and musical interaction. We presented a paper on "air playing", in which Rolf Inge Godoy gave a theoretical introduction and described our observational studies, and I presented the Musical Gestures Toolbox.

It was interesting to get an overview of what is going on in the fields of sign language and gesture linguistics. Many groups are working on ways of generating sign language directly from speech, while others try to analyse sign language and generate text or speech from it. The latter is more interesting for my project, I think, since it involves computer vision and recognition.

Things I found interesting

  • Martin Kaltenbrunner talked about his music table project, which I hope to see in Barcelona this fall. He had also ported PDa to a 2G iPod!
  • Frédéric Bevilacqua showed IRCAM’s new wireless Ethersense, a very nice device communicating over regular WiFi or USB. Too bad it costs around €800 + tax. He also showed a beta version of MnM (Music is not mapping), which uses HMMs for gesture recognition, and FTM for matrix operations in Max. It seems very promising!
  • University of Aachen was represented by three PhD students showing different types of video analysis. They have made available databases of sign language videos called BOSTON 50 and BOSTON 201. Morteza Zahedi talked about density thresholding and tangent distances for computer vision.
  • José Miguel Salles Dias showed a nice system for displaying hand trajectories using video analysis.
  • Anne Marie Burns presented her finger tracking system developed with EyesWeb.
  • Xavier Rodet presented the IRCAM Phase project, which is controlled by a haptic arm. It has been used for installations, but it would be very interesting to explore this as a musical instrument in performance.
  • Thomas Moeslund showed some computer vision work, and a trick of turning motion vectors into 4D Euler space. I didn’t really understand how this works, but it seems smart.
  • Nicolas Rasamimanana presented work on the IRCAM augmented violin, something similar to the hyperinstrument work of Joe Paradiso and Diana Young, and showed graphs clustering different types of bow strokes.
  • Ginevra Castellano presented her Master’s project on studying emotional response to music by analysing people’s movement of a laser pointer in 2D. She presented some results based on static analysis of the material, and I am looking forward to seeing the results from the functional analysis that she is currently working on.
  • They are doing a lot of interesting things at Helsinki University of Technology, such as virtual snow fighting and swimming.
  • Irene Kambara from the McNeill lab showed some interesting conversation studies.
  • Kristoffer Jensen showed a system for controlling additive synthesis with a laser pointer.
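The HMM-based gesture recognition mentioned in Bevilacqua’s talk can be illustrated with a toy sketch: train (or here, hand-specify) one HMM per gesture, then classify an incoming observation sequence by which model assigns it the highest likelihood via the forward algorithm. All gesture names, states and probability values below are invented for illustration; a real system like MnM would learn these parameters from recorded gesture data.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (per-step scaling for stability)."""
    alpha = pi * B[:, obs[0]]          # initial state beliefs weighted by emission
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate through transitions, weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Two hypothetical 2-state gesture models over 3 quantised motion symbols
# (0 = still, 1 = slow, 2 = fast). All numbers are made up.
pi = np.array([0.8, 0.2])
A_updown = np.array([[0.7, 0.3], [0.3, 0.7]])
B_updown = np.array([[0.1, 0.2, 0.7],   # state 0: mostly fast motion
                     [0.6, 0.3, 0.1]])  # state 1: mostly still
A_circle = np.array([[0.9, 0.1], [0.1, 0.9]])
B_circle = np.array([[0.2, 0.7, 0.1],   # state 0: mostly slow motion
                     [0.3, 0.4, 0.3]])

obs = [2, 2, 1, 0, 0]  # fast burst that comes to rest
scores = {"updown": forward_loglik(obs, pi, A_updown, B_updown),
          "circle": forward_loglik(obs, pi, A_circle, B_circle)}
print(max(scores, key=scores.get))  # → updown
```

The fast-then-still sequence fits the "updown" model better, so it wins; with models trained on real motion features the same maximum-likelihood comparison performs the recognition.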

Published by

Alexander Refsum Jensenius is a music researcher and research musician living in Oslo, Norway.