Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.

Links: Paper (PDF) · Poster (PDF) · Software · Videos (coming soon)

Abstract: The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which, in turn, are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by “touching” a virtual sound wall. ...
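The published implementation is the linked software, but the core idea of the sonification step can be sketched compactly: treat each motiongram column as a magnitude spectrum and inverse-FFT it into a short audio frame, overlap-adding the frames into a signal. Here is a minimal NumPy sketch under that assumption; the function name and parameters are hypothetical, not the published module's API.

```python
import numpy as np

def sonify_motiongram(motiongram, n_fft=1024, hop=512, sr=44100):
    """Sonify a grayscale motiongram (rows = image height, columns = time)
    by treating each column as a magnitude spectrum."""
    n_frames = motiongram.shape[1]
    out = np.zeros(hop * (n_frames - 1) + n_fft)
    window = np.hanning(n_fft)
    for i in range(n_frames):
        # Flip the column so the top of the image maps to high frequencies,
        # then resample it onto the n_fft // 2 + 1 rFFT bins.
        column = motiongram[::-1, i].astype(float)
        mags = np.interp(np.linspace(0, 1, n_fft // 2 + 1),
                         np.linspace(0, 1, column.size), column)
        # Random phase avoids the buzzy sound of a zero-phase spectrum.
        phase = np.exp(2j * np.pi * np.random.rand(mags.size))
        frame = np.fft.irfft(mags * phase, n=n_fft)
        out[i * hop : i * hop + n_fft] += window * frame  # overlap-add
    peak = max(np.max(np.abs(out)), 1e-9)
    return (out / peak).astype(np.float32), sr
```

Each image column becomes one FFT frame of audio, so the horizontal resolution of the motiongram directly sets the temporal resolution of the resulting sound.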

May 28, 2013 · 1 min · 193 words · ARJ

New Master Thesis 2: Music Kinection: Musical Sound and Motion in Interactive Systems

Yet another of my master students has graduated recently, and here is a link to his thesis: Even Bekkedal: Music Kinection: Musical Sound and Motion in Interactive Systems. Even has written a so-called “practical” master thesis: he carried out a mocap analysis of how people move while playing computer games with a Kinect device, and also prototyped several mocap instruments. ...

February 14, 2013 · 2 min · 332 words · ARJ

KinectRecorder

I am currently working on a paper describing some further exploration of the sonifyer technique and module that I have previously published on. The new thing is that I am now using the input from a Kinect device as the source material for the sonification, which also makes it possible to use the depth information in the image as an element in the process. To be able to create figures for the paper, I needed to record the input from a Kinect to a regular video file. For that reason I have created a small Max patch called KinectRecorder, which makes it easy to record the two inputs from the Kinect (regular video image and depth image) into one combined video file. As the screenshot below shows, there is not much more to the patch than starting the video input from the Kinect and then starting the recording. Files are stored with JPEG compression and named with the current date and time. ...
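The patch itself is Max-only, but the combining step is straightforward to illustrate outside Max: stack the RGB frame and the depth frame side by side and write the result to one JPEG-compressed video file. Below is a minimal OpenCV sketch of that idea; `get_rgb_frame` and `get_depth_frame` are hypothetical stand-ins for whatever Kinect driver is in use, and the two frames are assumed to share the same resolution.

```python
import time
import cv2
import numpy as np

def record_combined(get_rgb_frame, get_depth_frame, n_frames=300, fps=30):
    # Name the file with the current date and time, like the patch does.
    out_path = time.strftime("kinect_%Y-%m-%d_%H%M%S.avi")
    writer = None
    for _ in range(n_frames):
        rgb = get_rgb_frame()      # H x W x 3, uint8 (BGR for OpenCV)
        depth = get_depth_frame()  # H x W, raw 11-bit Kinect depth values
        # Scale the depth map to 8 bits and expand it to 3 channels so it
        # can sit next to the RGB image in the same frame.
        depth8 = cv2.convertScaleAbs(depth, alpha=255.0 / 2048.0)
        depth3 = cv2.cvtColor(depth8, cv2.COLOR_GRAY2BGR)
        combined = np.hstack([rgb, depth3])
        if writer is None:
            h, w = combined.shape[:2]
            fourcc = cv2.VideoWriter_fourcc(*"MJPG")  # Motion-JPEG compression
            writer = cv2.VideoWriter(out_path, fourcc, fps, (w, h))
        writer.write(combined)
    writer.release()
    return out_path
```

The MJPG codec compresses each frame as a JPEG image, which roughly matches the patch's .jpg-compressed output.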

January 22, 2013 · 2 min · 252 words · ARJ

Mocap workshop in Trondheim

I will participate in a motion capture workshop at the Norwegian University of Science and Technology (NTNU) tomorrow. My contribution will consist of the following:

- Lecture: Introduction to motion capture (in music analysis and performance)
- Demo 1: Working with video analysis using the Musical Gestures Toolbox
- Demo 2: The Xsens MVN BioMech mobile mocap suit
- Workshop: Analysis and performance with Wii controllers, Phidgets accelerometers and Kinect

Below are various resources. ...

April 17, 2012 · 3 min · 429 words · ARJ