GDIF recording and playback

Kristian Nymoen has updated the Jamoma modules for recording and playing back GDIF data in Max 5. The modules are based on the FTM library (beta 12; betas 13-15 do not work) and can be downloaded here. We have also made three use cases available in the (soon to be expanded) fourMs database: simple mouse recording, sound saber and a short piano example. See the video below for a quick demonstration of how it works:
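
The modules themselves run in Max/FTM, but the basic idea behind GDIF recording is simple: timestamp incoming OSC messages so the stream can be stored and replayed later. Below is a minimal Python sketch of that idea; it assumes the python-osc library and a receiver on port 9000, and it is not the FTM/Jamoma implementation.

```python
# Minimal sketch of GDIF-style recording: timestamp every incoming OSC
# message so the stream can be replayed later. Not the FTM/Jamoma modules;
# the port and the in-memory list are assumptions for illustration.
import time
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

recording = []                      # (seconds since start, address, arguments)
start = time.time()

def record(address, *args):
    recording.append((time.time() - start, address, args))

dispatcher = Dispatcher()
dispatcher.set_default_handler(record)   # record every incoming message
server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()                   # stop with Ctrl-C; 'recording' now holds the take
```

Playback is then the inverse: step through the list, wait until each stored timestamp, and re-send the message.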

July 3, 2010 · 1 min · 74 words · ARJ

New GDIF + SpatDIF wiki: xDIF

Today I am starting my post-semester activities. Not that all the grading is finished, or that the administrative meetings are over for a while, but we had the last part of the official teaching program yesterday, so now I at least feel that the university summer has started. This means that I will (finally) start focusing more on doing research again, and I have several papers that I will try to finish over the coming months. ...

June 3, 2010 · 1 min · 203 words · ARJ

IRCAM workshop on GDIF & SpatDIF

IRCAM is proposing a workshop on GDIF & SpatDIF on 20-21 May. The invitation e-mail sent to the GDIF mailing list is attached below. Interested people should contact Baptiste Caramiaux directly to show their interest. We propose to organize at IRCAM a meeting/workshop about the GDIF and SpatDIF projects. Such an event could be very interesting to make an update of the current issues and choices made by the different research groups, and possibly to decide on further collaborative actions. ...

March 11, 2010 · 2 min · 220 words · ARJ

New GDIF web page and mailing list

Courtesy of BEK, we have set up a new web page for the Gesture Description Interchange Format (GDIF) at gdif.org. The web page is accompanied by the new GDIF mailing list that can be used for discussion. Happy GDIF’ing!

June 15, 2009 · 1 min · 39 words · ARJ

Papers at ICMC 2008

Last week I was in Belfast for the International Computer Music Conference (ICMC 2008). The conference was hosted by SARC, and it was great to finally be able to see (and hear!) the Sonic Lab they have installed in their new building. I was involved in two papers, the first one being a Jamoma-related paper called “Flexible Control of Composite Parameters in Max/MSP” (PDF), written by Tim Place, Trond Lossius, Nils Peters and myself. Below is a picture of Trond giving the presentation. The main point of the paper is the suggestion that parameters should have properties and methods. This is both a general suggestion and a specific one that we have started implementing in Jamoma using OSC. ...
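
The composite-parameter idea is easiest to see in code. Here is a minimal sketch, in Python rather than Max/MSP, of a parameter exposing properties through OSC-style sub-addresses; the address scheme is illustrative only and not the syntax proposed in the paper.

```python
# A parameter with properties (value, ramp time, range) that can be
# addressed through OSC-style paths. Illustrative only; not the Jamoma syntax.
class Parameter:
    def __init__(self, value=0.0):
        self.properties = {"value": value, "ramp": 0,
                           "range/min": 0.0, "range/max": 1.0}

    def dispatch(self, address, *args):
        # '/gain/ramp 500' addresses the 'ramp' property of the 'gain'
        # parameter; a bare '/gain 0.8' addresses the default 'value' property.
        parts = address.strip("/").split("/")
        prop = "/".join(parts[1:]) if len(parts) > 1 else "value"
        if prop not in self.properties:
            raise KeyError(f"unknown property: {prop}")
        self.properties[prop] = args[0] if len(args) == 1 else list(args)

gain = Parameter()
gain.dispatch("/gain/ramp", 500)   # set ramp time (ms) without touching the value
gain.dispatch("/gain", 0.8)        # set the value itself
```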

September 4, 2008 · 1 min · 211 words · ARJ

Janer's dissertation

I had a quick read of Jordi Janer’s dissertation today: Singing-Driven Interfaces for Sound Synthesizers. The dissertation presents a good overview of various types of voice analysis techniques, and suggestions for various ways of using the voice as a controller for synthesis. I am particularly interested in his suggestion of a GDIF namespace for structuring parameters for voice control:

/gdif/instrumental/excitation/loudness x
/gdif/instrumental/modulation/pitch x
/gdif/instrumental/modulation/formants x1 x2
/gdif/instrumental/modulation/breathiness x
/gdif/instrumental/selection/phoneticclass x

Here he is using Cadoz’ division of various types of instrumental “gestures”: excitation, modulation and selection, something which would also make sense for describing other types of instrumental actions. ...
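
To make the namespace concrete, here is a minimal sketch that streams one frame of voice features using these addresses; it assumes the python-osc library, a receiver on port 9000 and made-up feature values, none of which come from the dissertation.

```python
# One frame of voice features sent with Janer's proposed GDIF namespace.
# Host, port and feature values are placeholders for illustration.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)
client.send_message("/gdif/instrumental/excitation/loudness", 0.7)
client.send_message("/gdif/instrumental/modulation/pitch", 220.0)
client.send_message("/gdif/instrumental/modulation/formants", [700.0, 1220.0])
client.send_message("/gdif/instrumental/modulation/breathiness", 0.3)
client.send_message("/gdif/instrumental/selection/phoneticclass", "a")
```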

May 23, 2008 · 1 min · 130 words · ARJ

Hyperinstruments workshop

A couple of weeks ago, we organised a workshop on hyperinstruments and GDIF at the Norwegian Academy of Music. This workshop was an initiative of the NIME strategic research project at the Academy, and I have put up some pictures from the workshop on the new blog site for the project.

May 10, 2008 · 1 min · 51 words · ARJ

Some thoughts on GDIF

We had a meeting about GDIF at McGill yesterday, and I realised that people had very different thoughts about what it is and what it can be used for. While GDIF is certainly intended for formalising the way we code movement and gesture information for realtime usage in NIME using OSC, it is also supposed to be used for offline analysis. I think the best way of doing this is to have a three-level approach, as sketched here: ...

February 20, 2007 · 2 min · 224 words · ARJ

NIME paper on GDIF

Here is the poster I presented at NIME 2006 in Paris based on the paper Towards a Gesture Description Interchange Format. The paper was written together with Tellef Kvifte, and the abstract reads: This paper presents our need for a Gesture Description Interchange Format (GDIF) for storing, retrieving and sharing information about music-related gestures. Ideally, it should be possible to store all sorts of data from various commercial and custom made controllers, motion capture and computer vision systems, as well as results from different types of gesture analysis, in a coherent and consistent way. This would make it possible to use the information with different software, platforms and devices, and also allow for sharing data between research institutions. We present some of the data types that should be included, and discuss issues which need to be resolved. ...

July 5, 2006 · 1 min · 139 words · ARJ

ICMC papers

My paper entitled “Using motiongrams in the study of musical gestures” was accepted to ICMC 06 in New Orleans. The abstract is: Navigating through hours of video material is often time-consuming, and it is similarly difficult to create good visualization of musical gestures in such a material. Traditional displays of time-sampled video frames are not particularly useful when studying single-shot studio recordings, since they present a series of still images and very little movement related information. We have experimented with different types of motion displays, and present how we use motiongrams in our study of musical gestures. ...
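
The abstract does not spell out the algorithm, but the core idea of a motiongram is to reduce each frame's motion image (for example a frame difference) to a single column and stack those columns over time, so movement becomes visible along a horizontal time axis. Below is a minimal sketch using OpenCV and NumPy; the filename and the brightness scaling are placeholders.

```python
# Build a simple (vertical) motiongram: frame-difference consecutive frames,
# average every row to one pixel, and stack the resulting columns over time.
# 'recording.avi' and the *4 brightness scaling are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("recording.avi")
columns, prev = [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    if prev is not None:
        motion = np.abs(gray - prev)         # motion image: absolute frame difference
        columns.append(motion.mean(axis=1))  # collapse each row to a single value
    prev = gray
cap.release()

motiongram = np.stack(columns, axis=1)       # shape: (image height, number of frames)
cv2.imwrite("motiongram.png", np.clip(motiongram * 4, 0, 255).astype(np.uint8))
```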

June 21, 2006 · 1 min · 213 words · ARJ