uOSC

micro-OSC (uOSC) was made public yesterday at NIME:

micro-OSC (uOSC) is a firmware runtime system for embedded platforms designed to remain as small as possible while also supporting evolving trends in sensor interfaces, such as regulated 3.3 Volt high-resolution sensors, mixed analog and digital multi-rate sensor interfacing, and n > 8-bit data formats.

uOSC supports the Open Sound Control protocol directly on the microprocessor, and the completeness of the implementation makes it a functional reference platform for research and development of the OSC protocol.

The design philosophy of micro-OSC is “by musicians, for musicians”—it is used at CNMAT as a component in prototypes of new sensor-based musical instruments as well as a research platform for the study of realtime protocols and signal-quality issues related to musical gestures.

I have only skimmed the NIME paper so far, but one interesting aspect is their focus on implementing OSC bundles with time tags, something rarely found in OSC applications. I am looking forward to testing this on the CUI.
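For context, a time-tagged bundle in OSC 1.0 is simply the string "#bundle", a 64-bit NTP-style time tag, and a sequence of size-prefixed messages. Below is a rough Python sketch of how such a bundle could be packed by hand according to the spec; the /sensor/accel/x address and the 10 ms scheduling offset are made-up examples, not anything taken from uOSC.

```python
import struct
import time

# Offset between the NTP epoch (1900) and the Unix epoch (1970), in seconds.
NTP_EPOCH_OFFSET = 2208988800

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and zero-padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, value: float) -> bytes:
    """A minimal OSC message with a single float32 argument (type tag ',f')."""
    return osc_string(address) + osc_string(",f") + struct.pack(">f", value)

def osc_timetag(unix_time: float) -> bytes:
    """64-bit NTP-style time tag: seconds since 1900 plus a 32-bit fraction."""
    seconds = int(unix_time) + NTP_EPOCH_OFFSET
    fraction = int((unix_time - int(unix_time)) * 2**32)
    return struct.pack(">II", seconds, fraction)

def osc_bundle(timetag: bytes, *elements: bytes) -> bytes:
    """An OSC bundle: the string '#bundle', a time tag, then size-prefixed elements."""
    out = osc_string("#bundle") + timetag
    for e in elements:
        out += struct.pack(">i", len(e)) + e
    return out

# Tag a (hypothetical) sensor reading to be acted on 10 ms into the future.
bundle = osc_bundle(osc_timetag(time.time() + 0.010),
                    osc_message("/sensor/accel/x", 0.42))
print(len(bundle), "bytes")
```

The interesting part is the time tag: a receiver that honours it can schedule the message for a precise point in time rather than acting on it at the (jittery) moment of arrival, which is exactly what matters for musical gestures.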

Gumstix and PDa

Another post from the Mobile Music Workshop in Vienna. Yesterday I saw a demo of the Audioscape project by Mike Wozniewski (McGill). He was using a Gumstix, a really small embedded computer running OpenEmbedded Linux. On it he ran PDa (a port of Pure Data for embedded devices) and was able to process sensor data and run audio directly on the small device.

Sensing Music-related Actions

The web page for our new research project called Sensing Music-related Actions is now up and running. This is a joint research project of the departments of Musicology and Informatics, and has received external funding through the VERDIKT programme of the Research Council of Norway. The project runs from July 2008 until July 2011.

The focus of the project will be on basic issues of sensing and analysing music-related actions, and on creating various prototypes for testing the control possibilities of such actions in enactive devices.

We are organising a kickoff seminar on Tuesday 6 May with the following programme:

  • 10:15-10:30: Rolf Inge Godøy (UiO): The Sensing Music-related Actions project
  • 10:30-11:30: Marcelo M. Wanderley (McGill): Motion capture of music-related actions
  • 11:30-12:30: Ben Knapp (Queen's University Belfast): Biosensing of music-related actions
  • 13:30-17:00: Workshop with various biosensors and motion capture equipment

Thus we will be able to discuss music-related actions from both an “internal” (i.e. biosignals) and an “external” (i.e. movement) point of view. Please come by if you are in Oslo!