Below you will find pages that utilize the taxonomy term “Max”
October 16, 2017
Working with an Arduino Mega 2560 in Max
I am involved in a student project which uses some Arduino Mega 2560 sensor interfaces in an interactive device. It has been a while since I worked with Arduinos myself, as I am mainly working with Belas these days. Also, I have never worked with the Mega before, so I had to look around a little to figure out how to set it up with Cycling ‘74’s Max.
I have previously used Maxuino for interfacing Arduinos with Max.
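Incidentally, the serial data coming from the board is easy to inspect outside Max as well. Here is a small hypothetical Python sketch of parsing sensor readings line by line (the “PIN VALUE” line format is an assumption for illustration; Maxuino/Firmata actually use a binary protocol):

```python
# Hypothetical sketch: parse "PIN VALUE" lines as an Arduino sketch
# might print them over the serial port, e.g. "A0 512".
# NOTE: this line format is made up for illustration.

def parse_sensor_line(line):
    """Split one 'PIN VALUE' line into (pin, value scaled to 0..1)."""
    pin, raw = line.split()
    return pin, int(raw) / 1023.0  # the Mega's ADC is 10-bit (0-1023)

# In a real setup the lines would come from pyserial, e.g.
# serial.Serial("/dev/tty.usbmodem…", 115200).readline().
pin, value = parse_sensor_line("A0 512")  # value is roughly 0.5
```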
November 22, 2016
From Basic Music Research to Medical Tool
The Research Council of Norway is evaluating the research being done in the humanities these days, and all institutions were given the task of submitting cases of societal impact. Obviously, basic research is by definition not aiming at societal impact in the short run, and my research definitely falls into this category. Still, it is interesting to see that some of my basic research is, indeed, on the verge of making a societal impact in the sense that policy makers like to think about.
March 29, 2014
MultiControl on GitHub
Screenshot of MultiControl v0.6.2
Today I have added MultiControl to my GitHub account. Initially, I did not intend to release the source code for MultiControl, because it is so old and dirty. The whole patch is based on bpatchers and on trying to hide things away, as in the pre-Max 5 days the presentation view did not exist.
I originally developed the Max patch back in 2004, mainly so that I could distribute a standalone application for my students to use.
August 1, 2013
New publication: Non-Realtime Sonification of Motiongrams
Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works.
June 25, 2013
Timelapser
I have recently started moving my development efforts over to GitHub, to keep everything in one place. Now I have also uploaded a small application I developed for a project by my mother, Norwegian sculptor Grete Refsum. She wanted to create a timelapse video of her making a new sculpture, “Hommage til kaffeselskapene”, for her installation piece Tante Vivi, fange nr. 24 127 Ravensbrück.
There is lots of timelapse software available, but none of it fitted my needs.
January 22, 2013
KinectRecorder
I am currently working on a paper describing some further exploration of the sonifyer technique and module that I have previously published on. The new thing is that I am now using the input from a Kinect device as the source material for the sonification, which opens up for also using the depth of the image as an element in the process.
To be able to create figures for the paper, I needed to record the input from a Kinect to a regular video file.
December 18, 2012
MultiControl v.0.6.2
MultiControl is by far the most popular software application I have created, as can be seen in the web traffic here on my site, and also on the download site at the University of Oslo where the app resides. This is a tiny application that passes on data from a human interface device (mouse, game controller) through either OSC or MIDI. When I first created it back in 2004, there were not so many other options.
December 13, 2012
Performing with the Norwegian Noise Orchestra
Yesterday, I performed with the Norwegian Noise Orchestra at Betong in Oslo, at a concert organised by Dans for Voksne. The orchestra is an ad-hoc group of noisy improvisers, and I immediately felt at home. The performance lasted for 12 hours, from noon to midnight, and I performed for two hours in the afternoon.
For the performance I used my Soniperforma patch based on the sonifyer technique and the Jamoma module I developed a couple of years ago (jmod.
September 5, 2012
Teaching in Aldeburgh
I am currently in beautiful Aldeburgh, a small town on the east coast of England, teaching at the Britten-Pears Young Artist Programme together with Rolf Wallin and Tansy Davies. This post is mainly to summarise the things I have been going through, and provide links for various things.
Theoretical stuff
My introductory lectures went through some of the theory of an embodied understanding of the experience of music. One aspect of this theory that I find very relevant for the development of interactive works is what I call action-sound relationships.
July 12, 2012
Paper #1 at SMC 2012: Evaluation of motiongrams
Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen.
Abstract
Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above mentioned variables.
June 25, 2012
Record videos of sonification
I got a question the other day about how to record a sonified video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper at this year’s ACHI conference about the technique.
It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max.
February 9, 2012
New Laptop Orchestra Piece: Click-It
Yesterday I was teaching a workshop on laptop orchestra performance for the students in Live Electronics at the Norwegian Academy of Music. I usually start such workshops by playing the piece Clix by Ge Wang (see e.g. here for a performance of it). It is a fun piece to play, and it is nice to show the students something other than Max patches.
Unfortunately, while setting up for the workshop I had problems getting ChucK to work on my new laptop.
February 3, 2012
Sonification of motiongrams
A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper.
See below for the full paper and video examples.
The Paper
Download paper (PDF 2MB)
Abstract: The paper presents a method for sonification of human body motion based on motiongrams.
April 1, 2011
Demonstration videos on using Phidgets electronic kits
I am using the Phidgets electronics kits when teaching sound programming, and have now made two small videos demonstrating some basic principles.
First, there is a getting started with Phidgets in Max video in Norwegian:
And I have also made a video demonstrating the Phidgets2MIDI application that I developed earlier this year:
I am planning to make some videos showing some more musically interesting use of the electronics and software.
November 9, 2010
Sonification of motiongrams
I have made a new Jamoma module for sonification of motiongrams called jmod.sonifyer~. From a live video input, the program generates a motion image which is again transformed into a motiongram. This is then used as the source of the sound synthesis, and “read” as a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram.
See the demonstration video below:
The module is available from the Jamoma source repository, and will probably make it into an official release at some point.
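For those who do not use Max, the chain can be sketched in plain Python. This is a simplified illustration of the idea (frame differencing, row-wise reduction, oscillator-bank resynthesis), not the actual implementation inside jmod.sonifyer~:

```python
import math

def motion_image(prev, curr, threshold=0.1):
    """Absolute frame difference, thresholded to keep only 'active' pixels."""
    return [[abs(c - p) if abs(c - p) > threshold else 0.0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def motiongram_column(motion):
    """Collapse a motion image to one column: the mean of each row.
    Stacking these columns over time gives the motiongram."""
    return [sum(row) / len(row) for row in motion]

def sonify(column, sr=44100, dur=0.05, fmin=100.0, fmax=5000.0):
    """Read one motiongram column as a spectral frame: each row index maps
    to an oscillator frequency, its motion value to that oscillator's
    amplitude."""
    n = len(column)
    freqs = [fmin + (fmax - fmin) * i / (n - 1) for i in range(n)]
    samples = []
    for t in range(int(sr * dur)):
        s = sum(a * math.sin(2 * math.pi * f * t / sr)
                for a, f in zip(column, freqs))
        samples.append(s / n)
    return samples
```

A static frame pair yields an all-zero column and therefore silence; any motion produces a positive column and thus audible partials at the corresponding frequencies.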
October 11, 2010
AudioAnalysis v0.5
I am teaching a course in sound theory this semester, and therefore thought it was time to update a little program I developed several years ago, called SoundAnalysis. While there are many excellent sound analysis programs out there (SonicVisualiser, Praat, etc.), they all work on pre-recorded sound material. That is certainly the best approach to sound analysis, but it is not ideal in a pedagogical setting where you want to explain things in realtime.
October 11, 2010
Many lines in a text file
I am trying to debug a Max patch that does video analysis. For some reason many of the exported text files containing the analysis results contain exactly 4314 lines. This is an odd number for a computer program to dislike, so I am currently going through the patch to figure out what is wrong.
The first thing I thought about was the text object, which is used for storing the data and writing it to a text file.
July 3, 2010
GDIF recording and playback
Kristian Nymoen has updated the Jamoma modules for recording and playing back GDIF data in Max 5. The modules are based on the FTM library (beta 12; betas 13-15 do not work), and can be downloaded here.
We have also made available three use cases in the (soon to be expanded) fourMs database: simple mouse recording, sound saber and a short piano example. See the video below for a quick demonstration of how it works:
July 2, 2010
New motiongram features
Inspired by the work Static no. 12 by Daniel Crooks, which I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice:
About motiongrams
A motiongram is a way of displaying motion (e.g. human motion) in the time-domain, somehow similar to how we are used to working with time-representations of audio (e.
July 1, 2010
Quantity of motion of an arbitrary number of inputs
In video analysis I have been working with what is often referred to as “quantity of motion” (QoM), which should not be confused with momentum, the product of mass and velocity (p = mv). It is the sum of all active pixels in a motion image, so QoM is 0 if there is no motion, and has a positive value if there is motion in any direction.
Working with various types of sensor and motion capture systems, I see the same need to know how much motion there is in the system, independent of the number of variables and dimensions in the system studied.
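As a sketch of both cases in Python (treating the sensor-data variant as the Euclidean norm of first differences is my own shorthand here, not an established definition):

```python
import math

def qom_video(motion_image):
    """Quantity of motion for a video frame: the sum of all active
    pixels in the motion (frame-difference) image."""
    return sum(sum(row) for row in motion_image)

def qom_general(prev_sample, curr_sample):
    """A comparable quantity for arbitrary sensor/mocap data: the
    Euclidean norm of the first difference across all dimensions.
    It is 0 when nothing changes, positive for motion in any direction,
    and works for any number of inputs."""
    return math.sqrt(sum((c - p) ** 2
                         for p, c in zip(prev_sample, curr_sample)))
```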
November 27, 2009
Liquid Vapor
I performed in the open form piece Liquid Vapor by Else Olsen S. yesterday. The performance was special in many ways.
First, electronic music pioneer Pauline Oliveros was also performing in the piece. She played electric accordion and live electronics, a great combination.
Second, the performance took place in the magnificent foyer of the new Oslo opera house. Even if it is a large space, we managed to fill it up with all the different stations, equipment and instruments.
November 11, 2009
Jamoma 0.5 released
After extensive testing, Jamoma 0.5 has finally been released. Even though the version number is low, this release has been worked on for around 18 months, and is actively used in both teaching and performance.
What is Jamoma?
A platform for interactive art-based research and performance. It consists of several parallel development efforts:
Jamoma Modular: a structured approach to the development and control of modules in the graphical media environment Max.
Jamoma DSP: an object-oriented, reflective application programming interface for C++, with an emphasis on real-time signal processing.
October 7, 2009
Testing control of CataRT from video analysis
I am working with Victoria Johnson on a piece involving movement in physical and sonic space. Here is a screenshot of a patch where I use analysis output of some of my video modules from Jamoma to control the cursor navigating in the 2D-space in CataRT. The video camera is hanging in the ceiling, and this makes it possible for Victoria to explore sounds “spread out” on the floor. For one, CataRT is an amazing tool (thanks to Diemo for sharing it!
June 26, 2009
STSM at KTH
I am currently in Stockholm carrying out a Short Term Scientific Mission (STSM) in the Speech, Music and Hearing group at KTH through the COST Action Sonic Interaction Design (SID). The main objective of the STSM is to work on preparations for some experiments on action-sound couplings that will be carried out in the SID project in the fall.
The first part of the SID experiments will involve studying how people move to sound, and the second part will look at how this knowledge can be used to create sound through movement.
April 27, 2009
Updated software
I was at the Musical Body conference at University of London last week and presented my work on visualisation of music-related movements. For my PhD I developed the Musical Gestures Toolbox as a collection of components and modules for Max/MSP/Jitter, and most of this has been merged into Jamoma. However, lots of potential users are not familiar with Max, so over the last couple of years I have decided to develop standalone applications for some of the main tasks.
October 28, 2008
Three workshops in a row
The last few weeks have been quite busy here in Oslo. We opened the new lab just about a month ago, and since then I have organised several workshops, guest lectures and concerts both at UiO and at NMH. I was planning to post some longer descriptions of what has been going on, but decided to go for a summary instead.
First we had a workshop called the embedded systems workshop, which I have retroactively renamed the RaPMIC workshop (Rapid Prototyping of Music Instruments and Controllers).
October 23, 2008
Some thoughts on data signal processing in Max
We are having a Jamoma workshop at the fourMs lab this week. Most of the time is being spent on making Jamoma 0.5 stable, but we are also discussing some other issues. Throughout these discussions, particularly about how to handle multichannel audio in Max, I have realised that we should also start thinking about data signals as a type in itself.
Jamoma is currently, as is Max, split into three different “types” of modules and processing: control, audio and video.
September 9, 2008
Max/MSP and databases
Andrew Benson just posted a short tutorial on how to use the built-in SQL tools in Max 5. Databases seem to be the future, also in music, and it is incredibly cool to have the ability to work with this from within Max.
September 4, 2008
Papers at ICMC 2008
Last week I was in Belfast for the International Computer Music Conference (ICMC 2008). The conference was hosted by SARC, and it was great to finally be able to see (and hear!) the sonic lab which they have installed in their new building.
I was involved in two papers, the first one being a Jamoma-related paper called “Flexible Control of Composite Parameters in Max/MSP” (PDF) written by Tim Place, Trond Lossius, Nils Peters and myself.
June 17, 2008
AudioVideoAnalysis
To allow everyone to watch their own synchronised spectrograms and motiongrams, I have made a small application called AudioVideoAnalysis.
Download AudioVideoAnalysis for OS X (8MB)
It currently has the following features:
Draws a spectrogram from any connected microphone
Draws a motiongram/videogram from any connected camera
Press the escape button to toggle fullscreen mode
Built with Max/MSP by Cycling ‘74 on OS X 10.5.
I will probably make a Windows version at some point, but haven’t gotten that far yet.
June 16, 2008
NIME paper
A group of Jamoma-developers presented a paper suggesting an extension to OSC at this year’s NIME in Genova two weeks ago:
Reference:
Place, T., T. Lossius, A. R. Jensenius, N. Peters and P. Baltazar (2008): Proceedings of the 2008 International Conference on New Interfaces for Musical Expression, 5-7 June 2008, Genova.
Downloads:
Full paper
Poster
Abstract:
An approach for creating structured Open Sound Control (OSC) messages by separating the addressing of node values and node properties is suggested.
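To make the idea concrete, here is a minimal Python sketch that encodes plain OSC messages and uses separate addresses for a node’s value and for one of its properties. The address scheme shown is purely illustrative, not the exact syntax proposed in the paper:

```python
import struct

def _osc_string(s):
    """OSC strings are null-terminated and padded to 4-byte boundaries."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    """Encode a basic OSC message (float and int arguments only)."""
    tags = "," + "".join("f" if isinstance(a, float) else "i" for a in args)
    data = b"".join(struct.pack(">f", a) if isinstance(a, float)
                    else struct.pack(">i", a) for a in args)
    return _osc_string(address) + _osc_string(tags) + data

# Addressing a node's value vs. one of its properties as two distinct
# addresses (illustrative names, not Jamoma's actual namespace):
value_msg = osc_message("/degrade/bitdepth", 8.0)
prop_msg = osc_message("/degrade/bitdepth/ramp", 200.0)
```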
June 11, 2008
Motiongrams sync'ed to spectrograms
One of my reasons for developing motiongrams was to have a solution for visualising movement in a way that would be compatible to spectrograms. That way it would be possible to study how movement is evolving over time, in relation to how the audio is changing over time.
In my current implementation of motiongrams in Max/MSP/Jitter (and partially in EyesWeb), there has been no way to synchronise them with a spectrogram. The problem was that the built-in spectrogram in Max/MSP was running much faster than the motiongram, so the two were out of sync from the start.
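The underlying fix is simply to match the analysis rates: if each FFT hop corresponds to exactly one video frame, the spectrogram and motiongram columns advance in lockstep. A quick Python sketch of that calculation (the sample rate and frame rates are just example values):

```python
def hop_for_framerate(sample_rate, video_fps):
    """FFT hop size (in samples) that makes spectrogram columns
    arrive at the same rate as video frames."""
    return round(sample_rate / video_fps)

print(hop_for_framerate(44100, 25))  # 44.1 kHz audio, 25 fps video -> 1764
print(hop_for_framerate(48000, 30))  # -> 1600
```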
June 8, 2008
NIME Jamoma workshop
Some pictures from our Jamoma workshop after NIME:
Pascal showing the ramping and mapping magic in Jamoma.
Tim showing that Jamoma is soon to be working in Max 5.
May 15, 2008
Mobile Python on S60 to Max/MSP
Richard Widerberg held a workshop today on using mobile Python on Nokia phones running Symbian OS S60. He has gathered some links to everything that is needed to get a connection up and running with PD. I now have a simple script up and running and communicating with Max/MSP through the serial object. It works, but it feels a bit limiting to only have one-dimensional control (joystick up/down) plus the number keys for interaction.
May 14, 2008
Max 5 improvements
Andrew Benson has written up a list of some tips for “Improving Your Patching Workflow”. I particularly like the trick of option-dragging media files from the file browser to create a file player + media file.
April 23, 2008
Max 5
Cycling ‘74 has released Max 5! I have been beta-testing the software for some time now, and can highly recommend the update. There has been a lot of discussion about the new “rounded corners”, but Max 5 is about so much more: improvements and workflow. What I like the most about Max 5 is the presentation mode, which allows for separating the “code” from the user interface. This makes it much easier to create complex patches with a neat little interface on top, since you can easily select which objects to include in the presentation, and then rearrange them as you like.
February 20, 2008
Ali Momeni lecture
Here’s a video of a lecture by Ali Momeni (now at the Interdisciplinary Program for Collaborative Arts at the University of Minnesota in Minneapolis), where he gives an overview of his work. Lots of interesting things!
February 13, 2008
Motiongrams in EyesWeb!
We had a programming session this morning, and Paolo Coletta implemented a block for creating motiongrams in EyesWeb. It will be available in the new EyesWeb XMI release, which will happen at the end of this week. Great!
February 12, 2008
Free Software
I am participating in the EyesWeb Week in Genoa this week. This morning Nicola Bernardini held a lecture about Free Software. I have heard him talk on this topic several times before, but as I now have some more experience participating in a Free Software project (i.e. Jamoma), I got more out of his ideas.
Some main points from the talk:
Use Free Software!
Freeware and shareware may have nothing to do with Free Software.
February 7, 2008
SubEthaEdit and Max externals
I have had problems with SubEthaEdit making Max/MSP .mxo externals show up as folders instead of packages. I am typing up the solution here so that I won’t forget it again (thanks to jasch for figuring this out):
Right-click on SubEthaEdit and select “Show Package Contents”
Open the Info.plist file in a text editor
Remove the line with mxo
Save and restart
January 18, 2008
Open Sound Control
The newly refurbished OSC forum web site has sparked off some discussions on the OSC_dev mailing list. One interesting note was a reply from Andy W. Schmeder on how OSC should be spelled out correctly:
The short answer is, use “Open Sound Control”. The other form one may encounter is “OpenSound Control”, but we don’t use that anymore. Any additional forms you may encounter are probably unintentional.
I have been using various versions over the years (including OpenSoundControl), so I guess this is the official answer, since Andy works at CNMAT.
December 11, 2007
Coordinate systems
I am updating the GDIF messaging in the jmod.mouse module in Jamoma. Trond suggested using the OpenGL convention for ranges and coordinate systems, which should give something like this:
This means that values on the vertical axis would fall within [-1, 1], while values on the horizontal axis would depend on the size of the screen. For my screen (1280x800) this gives a range of [-1.6 1.
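In code, this convention could look something like the following Python sketch (assuming square pixels and aspect = width/height; the function name is mine):

```python
def normalize_mouse(x, y, width, height):
    """Map pixel coordinates to an OpenGL-style frame: the vertical axis
    spans [-1, 1] (up positive), the horizontal axis spans
    [-aspect, aspect] where aspect = width / height."""
    aspect = width / height
    nx = (2.0 * x / width - 1.0) * aspect
    ny = 1.0 - 2.0 * y / height  # flip: screen y grows downward
    return nx, ny

# A 1280x800 screen gives a horizontal range of [-1.6, 1.6]:
print(normalize_mouse(0, 0, 1280, 800))      # top-left    -> (-1.6, 1.0)
print(normalize_mouse(1280, 800, 1280, 800)) # bottom-right -> (1.6, -1.0)
```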
December 11, 2007
Mapping and conditioning
The concept of “mapping” is frequently used in the computer music community these days, and has also been used over the last couple of days during the Jamoma workshop. This reminded me about the distinction between mapping and conditioning, as frequently pointed out by Marcelo Wanderley:
Conditioning: filtering, scaling and normalizing signals in a 1-to-1 mapping.
Mapping: creating couplings between multidimensional data sets, e.g. MxN.
For clarity’s sake it is probably useful to separate between the two.
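A minimal Python sketch of the distinction, with normalization standing in for conditioning and a weight matrix standing in for an MxN mapping (both are just illustrative choices):

```python
def condition(x, in_min, in_max):
    """Conditioning: a 1-to-1 operation on a single signal,
    here normalizing it to [0, 1] with clipping."""
    y = (x - in_min) / (in_max - in_min)
    return min(max(y, 0.0), 1.0)

def map_many_to_many(inputs, weights):
    """Mapping: coupling an M-dimensional input to an N-dimensional
    output, here through a simple M-by-N weight matrix."""
    return [sum(w_row[j] * x for w_row, x in zip(weights, inputs))
            for j in range(len(weights[0]))]
```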
December 10, 2007
Jamoma Workshop in Brussels
We have a Jamoma workshop in Brussels this week. Some of the major things we will be talking about, and working on, during these days are:
FunctionLib: for handling various types of mathematical functions in a consistent manner.
UnitLib: for converting between different types of units.
Timing and structuring in modules and patches.
Most of the time the first day, though, has been spent on general error solving and more random discussions.
November 12, 2007
Contracting
Following an interesting thread on careers in Max/MSP, I came across a link to Josh’s Rules of Database Contracting. I particularly like these ones:
Ask Not What’s Possible: the question is not what you can do, the question is how much the client is willing to pay for it and how long they will wait.
Time Substitutes for Money on a Logarithmic Scale: e.g. cutting the time by 20% will require doubling the budget.
October 16, 2007
Peter Elsea Max tutorials
Peter Elsea recently announced an updated version of his Max/MSP tutorials. They are great at explaining many of the basics of Max/MSP, and I am very happy he has made the effort to write them up so neatly. However, I can’t help thinking about the “old-school” approach to computer music, focusing on pitch classes, harmony, rhythm and MIDI control.
September 20, 2007
Giant Music Ball 2
We have been working on the Giant Music Ball all day. Besides soldering, aligning all the sensors was the biggest challenge. Here is a picture from early on in the process:
I love cables…
Crossing the Karl Johan street in the city centre of Oslo:
Lots of people stopped by and wondered what we were doing.
If you are around, please stop by booth 33 and check it out: Friday 9-16, Saturday 10-17.
August 26, 2007
Interview on ADHD
On Friday I appeared in an interview in Aftenposten, one of the larger newspapers in Norway. The interview describes a recently started collaboration between the Musical Gestures group and Terje Sagvolden’s group working on ADHD. More precisely, they are interested in using my Musical Gestures Toolbox and motiongrams for studying the movements of rats and children with ADHD.
June 12, 2007
Keyframe
Henrik Marstrander will present his master’s thesis project tomorrow. This is an interesting visual table for controlling musical sound.
Details: Wednesday 13 June at 12:30, in the room on the left-hand side of the corridor on the way to Salen.
June 11, 2007
LiveAPI
There is now a solution for using OSC with Ableton Live available from LiveAPI.org. This is one of the first things I am going to check out as soon as I’m done with my dissertation.
May 13, 2007
MultiControl is updated
I have received a constant stream of questions about an Intel version of the little MultiControl software I made many years ago. MultiControl makes it easy to use any standard game controller (HID compatible) to control music software through OSC or MIDI. Still only for OS X, but a Windows version will follow at some point.
A UB version of the application is now available here, and I will try to get around to making a Windows version one day.
April 11, 2007
Cooperating Computer Music Languages
Brad Garton has posted an interesting blog entry about how he works with many different computer music languages from within Max/MSP, including RTcmix, ChucK, Csound, SuperCollider and Lisp. He could also have used Java, JavaScript, Python, etc.
I particularly like the idea of using Max as the basis for exposing students to many different types of programming languages and approaches, and will see if I can include some of this in my Max course in the fall.
April 5, 2007
Choosing the Right Video Format
The discussion about video standards for live processing has been summarised as:
Codec: Motion JPEG (for interlaced footage) or Photo JPEG.
Compression ratio/quality: Quality 80 is a decent baseline for JPEG, though you can crank it as high as 97 to improve quality.
Keyframes: Encode a keyframe on every frame so it’s ‘scratch-ready’.
Alpha channels: For video containing alpha channels, PNG is the codec of choice.
Sounds like more or less the same conclusion that has been reached in the Jitter forum when this question comes up there once in a while.
March 29, 2007
Cycling '74 collaborating with Ableton
David Zicarelli has posted an interesting note about a new collaboration between Cycling ‘74 and Ableton. Ableton Live is one of the most popular and easy-to-use live electronics software tools, and Max/MSP is one of the most powerful and flexible, so this sounds very interesting.
March 24, 2007
Jamoma Workshops
We rounded off the “Jamoma week” with a workshop for a small crowd of power Max users at Ars Longa in Paris today. This time I think we were more successful in explaining that Jamoma is not just a set of ready-made patches; it is really mostly about creating a systematic approach to Max patching, in addition to improving communication in and between Max and similar environments.
The week in Albi was to a great extent spent fixing bugs in the Jamoma 0.
March 22, 2007
User Community and ROI
Reading this post, and viewing the picture below, I came to think about the dynamics of the Max/MSP community:
March 21, 2007
GUI and control
The idea behind the current implementation of Jamoma is based on separating GUI from algorithm. Currently this is solved by having the algorithm in a separate file which is included in the module file containing the GUI. This is better than having everything in one patch, but I don’t think that we could say that the GUI is separated enough from the algorithm.
We have discussed this a bit over the last couple of days, and I have been trying to think about various ways of dealing with this problem.
March 21, 2007
Technical Parameters
I have been thinking a lot about GUIs, namespaces and control parameters over the last couple of days. One of the big challenges we are facing is how to make technology more human-friendly. Often it seems that technology controls us more than we control the technology.
Creating a user interface of any kind is very similar to how we think about mapping in musical instruments. In essence, any type of control is one, or several, layers of mapping between one set of parameters and another.
March 19, 2007
Jamoma workshop day 1
A couple of pictures from the Jamoma workshop which started today in Albi, a lovely little town in southern France.
Most of the day has been spent on getting people up and running with Jamoma and introductions. I hope we will have enough time to also discuss some pressing development issues. Some of the points high up on my lists are:
UnitLib: creating a separate library of units that can handle conversions between various units.
March 12, 2007
SCPLugin
Tim’s post on Version Control Options made me install SCPlugin, which is somewhat similar to TortoiseSVN for Windows. The nice thing is that it is integrated with the Finder, so that I can do checkouts and commits by right-clicking on a folder. Much simpler than using the command line or svnX.
March 10, 2007
Jamoma 0.4
After a lot of hard work, Jamoma 0.4 has been released. This is a major upgrade from 0.3, the most important being that now all the core objects have been ported to C which makes everything run much faster. Lots of other things have improved as well, the only drawback being that 0.4 breaks compatibility with 0.3.
Installers for Mac and Windows are available from the Jamoma page.
February 27, 2007
MIT: MAS.960 Principles of Electronic Music Controllers
Came across the web site of the MIT course MAS.960 Principles of Electronic Music Controllers, which has some interesting references and links to various resources on NIME development. It is also worth checking out many of the student projects.
February 22, 2007
MSP tilde in LaTeX
I spent a couple of minutes trying to figure out how to create a nice tilde (~) for writing the name of Max/MSP externals in LaTeX (e.g. dac~), so I figured I could post the solution in case anyone else wonders. First I tried using \tilde{} and \widetilde{}, but they didn’t look nice. However, this little thing does the trick:
$\sim$
I guess you need the math environment to get this working.
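If the object names come up often, it may be worth wrapping the trick in a macro. A small sketch (the macro name \maxobj is my own invention, not a standard command):

```latex
% Typeset a Max external name with a trailing tilde,
% e.g. \maxobj{dac} prints dac~
\newcommand{\maxobj}[1]{\texttt{#1}$\sim$}
```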
February 12, 2007
Managing Complex Patches in Max
Arne Eigenfeldt has written a Cycling ‘74 blog entry about Managing Complex Patches in Max:
One of the beauties of Max is its simplicity: the ability to quickly create a patch that does something artistically interesting. Part of this has to do with its visual programming style - patchcords allow us to see the relationship between graphic objects. However, unless you limit yourself to creating only straightforward patches, your patch can become a spaghetti-like series of connections that confound attempts at debugging.
January 26, 2007
Wormhole in Jamoma
Tim Place has posted a very interesting idea about handling presets and time:
We both came up with what is essentially the same idea, expressed in different ways: an interface where the Z-axis represents the progression of time. In terms of what the user sees at any given time, it is like a single frame of a movie. Each frame would then represent the state of the relevant Max patches or Jamoma module.
January 16, 2007
NOVINT Falcon
NOVINT has finally gotten around to releasing the Falcon, the much-awaited first cheap haptic controller. I have my doubts about how solid the thing is, at least knowing how fragile the many times more expensive Phantoms are. Nevertheless, the Falcon will finally introduce haptics to everyone.
January 6, 2007
Tim Place on parameter control
Gregory Taylor has done an interview with Tim Place about Hipno. It is interesting how he comments on the Hipnoscope control:
The Hipnoscope does something that I’m quite proud of, which is that it allows you to quickly audition a plug-in and some of its possibilities. But at the same time it really rewards those patient explorers who spend time really focusing on the subtleties it offers. I still find myself surprised at the results I get sometimes - the Hipnoscope creates this palette where there is an almost infinite range of subtlety with some of the plug-ins.
December 8, 2006
Music troll performance
I performed with the music troll yesterday. It has been resting in the lab for a couple of months, and was a bit “rusty” to start up. What caused the biggest headache was getting my performance patches to work on my new MacBook. Last month I found that PeRColate had been released as UB, but I hadn’t tested the externals. First I had problems getting Max to find them, which seemed to be because the source folder was still in the search path (thanks to mzed for the tip).
December 4, 2006
v001
Vade has posted a system called v001 for Max/MSP:
v001 is meant to help bring a structured method for building modular and reusable performance patches for Max/MSP and Jitter - it isn't a complete low level programming system like Jamoma (nor meant to replace it), rather an immediately useful set of pre-made objects and patching methodologies. I have found this architecture invaluable for rapidly creating new modules to experiment with and integrate into my existing performance system, as well as being able to drastically alter the functionality of my performance patch while still being able to retain all of the previous effects I've generated, without having to re-code anything.
December 4, 2006
WiiMote used as a mouse on Windows
This video shows a WiiMote used as a mouse on Windows.
December 4, 2006
YouOS: A Web Operating System
Jamie just pointed me to YouOS, an operating system running entirely within a web browser:
YouOS and its applications run entirely within a web browser, but have the look and feel of desktop applications. An application’s code and data reside remotely but are executed and modified locally. This model allows for a great deal of freedom. You can edit a document at home in a text editor and then go to school or work and instantly access the same text editor and document.
November 3, 2006
Pd *is* a programming language!
Many people question whether Max/MSP, Pd and other graphical environments can be considered programming languages. Claude Heiland-Allen proves that Pd is a full programming language by implementing an interpreter in vanilla Pd, using no externals.
November 3, 2006
Tapestrea
TAPESTREA (or taps) is a unified framework for interactively analyzing, transforming and synthesizing complex sounds. Given one or more recordings, it provides well-defined means to:
- identify points of interest in the sound and extract them into reusable templates
- transform sound components independently of the background and/or other events
- continually resynthesize the background texture in a perceptually convincing manner
- controllably place event templates over backgrounds, using a novel graphical user interface and/or scripts written in the ChucK audio programming language
- leverage similarity-based retrieval to locate other interesting sound components

Taps provides a new way to completely transform a sound scene, dynamically generate soundscapes of unlimited length, and compose and design sound by combining elements from different recordings.
November 3, 2006
The Phase Vocoder
Richard Dudas has written a great introduction to the phase vocoder and some Max/MSP implementation details.
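Dudas’s article covers the Max/MSP details; as a rough companion sketch, the core of a phase-vocoder time-stretch can be written in a few lines of Python/NumPy. This is my own simplification of the general technique, not code from the article, and the `stretch` function and its parameters are invented:

```python
import numpy as np

def stretch(x, rate, n_fft=1024, hop=256):
    """Time-stretch signal x by factor 1/rate with a basic phase vocoder."""
    win = np.hanning(n_fft)
    # analysis: windowed FFT frames at a fixed hop size
    frames = [x[i:i + n_fft] * win for i in range(0, len(x) - n_fft, hop)]
    stft = np.array([np.fft.rfft(f) for f in frames])
    # expected phase advance per hop for each bin
    omega = 2 * np.pi * np.arange(n_fft // 2 + 1) * hop / n_fft
    steps = np.arange(0, len(stft) - 1, rate)   # synthesis read positions
    phase = np.angle(stft[0])
    out = np.zeros(len(steps) * hop + n_fft)
    for k, t in enumerate(steps):
        i = int(t)
        # deviation of the measured phase advance from the expected one
        dphi = np.angle(stft[i + 1]) - np.angle(stft[i]) - omega
        dphi -= 2 * np.pi * np.round(dphi / (2 * np.pi))  # wrap to [-pi, pi]
        frame = np.fft.irfft(np.abs(stft[i]) * np.exp(1j * phase))
        out[k * hop:k * hop + n_fft] += frame * win       # overlap-add
        phase += omega + dphi
    return out
```

Reading the analysis frames at a different rate than they were written, while keeping the per-bin phases coherent, is exactly the trick the pfft~-based Max implementations perform.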
November 2, 2006
Arduino
Seems like the Arduino community is growing quickly.
Arduino is an open-source physical computing platform based on a simple i/o board, and a development environment for writing Arduino software. The Arduino programming language is an implementation of Wiring, itself built on Processing.
At the moment I am very happy with the Phidgets interfacekits for my electronics work, but as soon as I am done with my dissertation I will get into the Arduino/Wiring/Processing world.
November 2, 2006
Audacity 1.3
There’s a new beta of Audacity 1.3 out. Previous versions have been somewhat unstable and lacking in features, but now it is starting to improve:
- New selection bar and improved selection tools
- Dockable toolbars
- New “Repair” effect, other improved effects
- Auto-save and automatic crash recovery
October 25, 2006
UB drivers for Phidgets
Phidgets just released a new library and drivers for Intel Macs. This was the last thing I had been missing since I got my new MacBook this summer.
October 17, 2006
Compare or merge two folders' contents
I had some SVN problems, and figured it would probably be faster to do a full new checkout rather than trying to figure out the mysteries of the .svn folders. But then I realized that I had made some uncommitted changes in the old folder, and started brushing up on my unix diff commands. While looking for some hints on this, I came across this Mac OS X Hint on how to compare two folders.
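For a scriptable take on the same task, Python’s standard library can do a recursive folder comparison too; here is a minimal sketch (the `changed_files` helper is my own, a stdlib alternative to `diff -r`):

```python
import filecmp

def changed_files(a, b, prefix=""):
    """Recursively list files that differ between folders a and b,
    plus files that exist on only one side."""
    cmp = filecmp.dircmp(a, b)
    changed = [prefix + name for name in cmp.diff_files]
    changed += [prefix + name + " (only in a)" for name in cmp.left_only]
    changed += [prefix + name + " (only in b)" for name in cmp.right_only]
    for name, sub in cmp.subdirs.items():
        changed += changed_files(sub.left, sub.right, prefix + name + "/")
    return changed
```

Handy for spotting uncommitted edits before throwing away an old working copy.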
October 11, 2006
Jade under LGPL
Tim Place just announced that Jade is now open sourced under LGPL.
Jade is a flexible, relatively easy-to-use, environment for composition and performance. For many Jade has made approaching the complexities of interactive music possible. Now it is available for free as an open-source project.
The basic unit in Jade is a Module. Jade has a number of ready-made modules for analyzing, generating, and processing audio and video. These modules can be created using Cycling’74’s Max/MSP/Jitter authoring environment.
October 11, 2006
Lego instruments
A group of German students is working on a project called Stekgreif that packages a number of popular sensors as Lego blocks. Routing power through the Lego bricks makes it possible to build instruments and other fun things entirely out of Lego.
September 19, 2006
Myron - Computer Vision for Artists
Myron is the cross-platform, cross-language, open source, video capture and computer vision plugin. One core C object gets cross-compiled as a handful of high level language “wrapper” libraries. The wrapper for Java and Processing is called JMyron. The wrapper for Macromedia Director is called WebCamXtra. The aim of the project is to keep computer vision free and easy for the new media education and arts community.
I will have to look into whether it is possible to use this with Max/MSP/Jitter.
September 19, 2006
Photonic textiles
Philips Research is currently showing off their new Photonic Textiles at the IFA Consumer Electronics Fair in Berlin. The Photonic Fabric integrates flexible arrays of multicolored LEDs into the weave, allowing the fabric to give off light and display programmable patterns like text messages, without compromising the softness of the cloth. Philips’ Photonic Textile prototypes include an “SMS pillow” and an “SMS backpack” (send a text message to it and the words scroll across it).
September 19, 2006
Sequencer Programming
In a Sneak Peek for OS X Leopard, Apple shows a debugging mode in Xcode that works something like music sequencers.
Taking its interface cues from timeline editors such as GarageBand, now you can visualize application performance like nothing you’ve seen before. Add different instruments so you can instantly see the results of code analyzers. Truly track read/write actions, UI events, and CPU load at the same time, so you can more easily determine relationships between them.
September 15, 2006
Mac Mini and VNC
As part of the Musikkball project, we are making a “music ball troll” for the science fair Forskningstorget in Oslo next week. I have been looking for a solution to make the setup as self-contained as possible, and this includes building a Mac mini into the speaker box forming the base of the “troll”. What is great about the mini is that it can run headless (without an attached mouse, keyboard or monitor) and be controlled using Chicken of the VNC from my MacBook.
August 22, 2006
Apple Remote Control
I am getting adjusted to my new MacBook and have realized that the remote control is a funny little thing. Cool features:
- Works with Keynote
- Holding down the play button puts the computer to sleep
- Shows up as “Apple IR” using HI in Max/MSP, so it can be used for controlling anything there

The only problem is that I can’t turn off the system functions while using it in Max. To avoid people taking control over a presentation, here’s a short description of how it is possible to pair the remote:
August 2, 2006
Microsoft Live Labs: Photosynth
Researchers at Microsoft Live Labs are working on Photosynth, based on Photo Tourism from the University of Washington. By structuring the photos based on their relative positions, it is possible to navigate a large photo collection in a 3D-style way. The system looks very responsive in the video, but I would be curious to see how it works in a real-world context.
It would be very interesting to create similar navigation tools for audio.
August 1, 2006
RtFFT: A realtime spectrum analyzer
RtFFT by Gary P. Scavone is a fairly basic realtime spectrum analyzer. It can simultaneously display an arbitrary number of FFT signals, which correspond to the spectra of data input from one or more channels of your computer soundcard. The plot window can be zoomed to any arbitrary limits. Controls are provided for the FFT size, the window type, and window averaging.
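The core computation of such an analyzer (windowing, FFT magnitude, averaging, dB conversion) is compact. A NumPy sketch of the idea, not Scavone’s code, with a function name of my own:

```python
import numpy as np

def db_spectrum(frames, window=np.hanning):
    """Average the magnitude spectra of several equal-length frames
    and convert to dB -- the heart of a simple spectrum analyzer."""
    n = len(frames[0])
    w = window(n)
    mags = [np.abs(np.fft.rfft(f * w)) for f in frames]
    avg = np.mean(mags, axis=0)       # window averaging smooths the display
    return 20 * np.log10(avg + 1e-12)  # small offset avoids log(0)
```

Averaging over a handful of recent frames is what keeps a realtime display readable instead of flickering.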
July 31, 2006
Khronos Projector
The Khronos Projector by Alvaro Cassinelli is an interactive-art installation allowing people to explore pre-recorded movie content in an entirely new way. […] The goal of the Khronos Projector is to go beyond these forms of exclusive temporal control, by giving the user an entirely new dimension to play with: by touching the projection screen, the user is able to send parts of the image forward or backwards in time. By actually touching a deformable projection screen, shaking it or curling it, separate “islands of time” as well as “temporal waves” are created within the visible frame.
July 11, 2006
BEAM Foundation
In a discussion of “the most complex Max patch”, Barry Threw pointed to the patch used by TrioMetrik, the ensemble of the BEAM Foundation. There’s also a video with shots of musicians and patches.
July 9, 2006
Reverse Engineering Autechre with Max/MSP and Reaktor
Came across a web page with reverse engineered Autechre Max/MSP and Reaktor patches. Interesting.
June 26, 2006
Occam - OSC-MIDI converter
Occam takes OSC messages and converts them to MIDI. It exports a MIDI source to CoreMIDI which can be used in any Mac OS X application that accepts MIDI. It broadcasts the existence of this OSC-to-MIDI service using Rendezvous (Zero-Conf).
June 8, 2006
Misc. OSC stuff
I got to learn about a number of interesting OSC-related projects during the OSC developers’ meeting at NIME today:
- HID2OSC from the KeyWorx people
- OSCmap from Rémy Muller (IRCAM), where you can either build your OSC namespace manually as a tree or let the OSC learn function do the job for you. You can also save the namespaces for later reuse, and then link source and target addresses. It is available for both OSX and Windows.
June 4, 2006
NIME Workshop: Dance and Technology
Choreographer Dawn Stoppiello and composer/media artist Mark Coniglio of Troika Ranch talked about their work. They are currently using EyesWeb for tracking, and Isadora for video and audio generation.

Marc Downie presented his work developing tools for working with visuals in a dance context. He has been working with realtime motion capture on stage (both Vicon and Motion Analysis). He will release his Fluid system under GPL in October 2006.
May 27, 2006
java plug-in for pure-data
Nils pointed me to a new java plug-in for pure-data. It is modeled after mxj in Max/MSP, which should make it possible to exchange java classes between the platforms.
May 20, 2006
Sonic Visualiser
Sonic Visualiser from Queen Mary’s is yet another software tool for visualizing audio content. However, there are some features that stand out:
- Cross-platform: available for OS X, Linux, Windows
- GPL’ed
- Native support for AIFF, WAV, MP3 and Ogg (but what about AAC?)
- Annotations: support for adding labelled time points and defining segments, point values and curves. The annotations can be overlaid on top of waveforms and spectrograms
- Time-stretching

Vamp plugins are at the core of Sonic Visualiser, and it seems like they want this to become a standard for non-realtime audio plugins.
May 19, 2006
int.lib by Oli Larkin
int.lib is a set of abstractions/javascripts for Cycling 74’s Max MSP software that facilitates the control of multiple parameters by navigating a two-dimensional visual environment. It implements a gravitational system, allowing the user to represent presets with variable-sized balls. As the user moves around the space, the size of the balls and their proximity to the mouse cursor affect the weight of each preset in the interpolated output.
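The idea of distance-weighted preset interpolation is easy to prototype. Here is a Python sketch of one plausible gravitational weighting scheme (size over squared distance); this is my guess at the kind of model involved, not Larkin’s actual code:

```python
def interpolate(cursor, presets, eps=1e-9):
    """Blend preset parameter lists: each preset's weight is its ball
    size divided by its squared distance to the cursor, so large and
    nearby presets dominate the interpolated output."""
    cx, cy = cursor
    weights = [size / ((x - cx) ** 2 + (y - cy) ** 2 + eps)
               for (x, y), size, _ in presets]
    total = sum(weights)
    n_params = len(presets[0][2])
    return [sum(w * p[2][k] for w, p in zip(weights, presets)) / total
            for k in range(n_params)]
```

Landing the cursor exactly on a ball makes that preset's weight explode, so the output snaps to its stored values - a nice property for auditioning presets before exploring the space between them.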
May 11, 2006
DesignKlicks
Nils showed me this nice picture site called DesignKlicks from Spiegel Online. It is this 3d-picture space where you can move around and look at similar pictures. Unfortunately it is based on keyword descriptions and not on picture content. I really look forward to the day we get picture (and also music and video) browsers like this working on media content itself.
May 9, 2006
Cycling '74: MaxMSP => Working with Max is not easy
Found an interesting thread on the Max list entitled Working with Max is not easy. But what is easy? Before we learn something we find it difficult; once we know it we find it easy. I guess a problem with Max, if it can be called a problem, is that its low entry barrier (at least compared to many other programming languages) might mislead the user into thinking that this is something that can be mastered in two weeks.
May 9, 2006
Wireless Networking in the Developing World
From the web site of the Wireless Networking in the Developing World project:
The massive popularity of wireless networking has caused equipment costs to continually plummet, while equipment capabilities continue to increase. By applying this technology in areas that are badly in need of critical communications infrastructure, more people can be brought online than ever before, in less time, for very little cost. We hope to not only convince you that this is possible, but also show how we have made such networks work, and to give you the information and tools you need to start a network project in your local community.
May 3, 2006
Novint Falcon
We are currently working with the Phantom Omni haptic devices at McGill, but unfortunately they are rather expensive. I have been looking forward to test the Novint Falcon which is supposed to sell for around $100, but after being in touch with the company it seems like they will not start shipping devices before next year.
I really think such devices will change the way we work with computers. The computer experience has been two-dimensional for way too long, and my initial testing of 3D haptic devices shows how much potential lies in this type of human-computer interaction.
May 2, 2006
Google suggest
I have started to like Google suggest. It is surprisingly fast, even for unconventional search names. It would be even better if it could also incorporate some quick info about the top results of the searches that you are browsing through.
May 1, 2006
MIDI Specification
In a discussion on how to create an OSC namespace for MIDI, Trond pointed me to a site hosting a detailed MIDI Specification.
May 1, 2006
Trigonometry
I had to brush up on my trigonometry to solve some mapping issues, and found this nice overview. Strange how much I have forgotten about these things, I really need to get back to my linear algebra books! I never really understood the point of learning those vector transformation things back when I studied maths, but now as I have to implement some 3d gesture models I see that it is actually very useful.
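As a reminder of how those vector transformations look in practice, here is the standard 2D rotation, the building block that most gesture-mapping geometry reduces to (a generic worked example, not tied to any particular mapping of mine):

```python
import math

def rotate2d(x, y, angle):
    """Rotate point (x, y) around the origin by `angle` radians,
    applying the rotation matrix [[cos, -sin], [sin, cos]]."""
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)
```

Rotating (1, 0) by 90 degrees gives (0, 1), and composing three such rotations around different axes is all a basic 3D orientation model needs.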
April 27, 2006
Sidney Fels lecture
Just went to a lecture by Sidney Fels from the Human Communication Technologies lab and MAGIC at the University of British Columbia (interestingly enough located in the Forest Sciences Centre…). He was talking on the topic of intimate control of musical instruments, and presented some different projects:
- GloveTalkII: “a system that translates hand gestures to speech through an adaptive interface.”
- Iamascope: a kaleidoscope-like installation where users see themselves on a big screen while also controlling a simple sound synthesis.
April 26, 2006
MIDI network on OS X
In a discussion on using OSC to communicate over networks, Darryl just mentioned that OS X (apparently starting with Tiger) can send MIDI messages over the network. I wonder how I have managed to overlook this feature, since it is sitting right there as an option in Audio MIDI Setup. The help file reads:
You can use the MIDI network driver to send and receive MIDI information between computers over a network.
April 25, 2006
OSC - MIDI address space
My post over at the Open Sound Control forum:
I guess we are all trying to get rid of MIDI, but as long as we have tons of gear around, it would be good to have a generic way of describing MIDI information in OSC. Perhaps I am missing something obvious, but I have looked around and haven’t found any suggestions for a full implementation of MIDI messages as an OSC address space.
April 23, 2006
A Tour of Microsoft's Mac Lab
A Tour of Microsoft’s Mac Lab with pictures of the hundreds of macs they have running, plus descriptions of how they have automated the testing process.
April 20, 2006
Music and Audio Users On Course in Intel Mac Transition
Interesting comment from Create Digital Music:
What I find especially interesting is how far ahead of the curve music software is — just the opposite of what you might expect. We have most drivers already shipping, with nearly all software either shipping already for Intel or promised in the next few months. While support is a bit spotty at the moment if you use a lot of plug-ins, I think most Mac musicians will be able to comfortably switch to Intel by the summer.
April 19, 2006
monome
The monome 40h is a reconfigurable grid of sixty-four backlit buttons, connecting with USB and communicating both MIDI and OSC (Create Digital Music Review).
April 5, 2006
Theater Max
There seems to be a lot of initiatives for making “higher-level” abstractions for working in Max/MSP these days. Now, I just came across a project at UCLA intended mainly for theater productions:
Theater Max is the result of several years of work, lots of trial and error, and far too many hours of programming for us to count. What we now call Theater Max got its start in 2001 with a production of Eugene Ionesco’s Macbett.
April 2, 2006
SPEAR
SPEAR is an application for audio analysis, editing and synthesis. The analysis procedure (which is based on the traditional McAulay-Quatieri technique) attempts to represent a sound with many individual sinusoidal tracks (partials), each corresponding to a single sinusoidal wave with time varying frequency and amplitude.
It offers some great features, and I particularly like the possibility to easily select single partials and edit them directly. Most controls also work in realtime.
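The first step of such an analysis - finding the spectral peaks in each frame that may become partial tracks - fits in a few lines. A rough NumPy sketch of the idea, not SPEAR’s implementation:

```python
import numpy as np

def spectral_peaks(frame, sr, n_peaks=5):
    """Return (frequency, magnitude) pairs for the strongest local
    maxima in one windowed frame -- the raw material that partial
    tracking then links across frames."""
    n = len(frame)
    spec = np.abs(np.fft.rfft(frame * np.hanning(n)))
    maxima = [i for i in range(1, len(spec) - 1)
              if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
    maxima.sort(key=lambda i: spec[i], reverse=True)
    return [(i * sr / n, spec[i]) for i in maxima[:n_peaks]]
```

The McAulay-Quatieri method then matches peaks between consecutive frames (birth, continuation, death of partials), which is what gives SPEAR its editable tracks.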
April 2, 2006
Teatrix
Last week I participated in the Teatrix workshop organized by BEK at USF Verftet in Bergen. The idea was to explore technology in a stage setting. The people participating were: Paola Tognazzi, H.C. Gilje, Gisle Frøysland, Marie Nerland, Trond Lossius, Thorolf Thuestad, Tim Place, Iver Findlay, Linda Birkedal, Alexander Refsum Jensenius, Georges Gagneré, Anders Gogstad.
The most interesting for me was the chance to work together with Tim Place and Trond Lossius on Jamoma, and during the week we had the chance to discuss and develop quite a lot.
April 2, 2006
VLDCMCaR
Bob L. Sturm at UC Santa Barbara:
VLDCMCaR (pronounced vldcmcar) is a MATLAB application for exploring concatenative audio synthesis using six independent matching criteria. The entire application is encompassed in a graphical user interface (GUI). Using this program a sound or composition can be concatenatively synthesized using audio segments from a corpus database of any size. Mahler can be synthesized using hours of Lawrence Welk; howling monkeys can approximate President Bush’s speech; and a Schoenberg string quartet can be remixed using Anthony Braxton playing alto saxophone.
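The basic loop of concatenative synthesis is simple to sketch: segment a target sound, then for each segment pick the corpus segment that best matches some feature. Here is a toy Python version using RMS energy as the single matching criterion (names and simplifications are mine; the real program uses six criteria and a proper database):

```python
import numpy as np

def concat_synth(target, corpus, seg=1024):
    """Rebuild `target` from fixed-size corpus segments, choosing for
    each target segment the corpus segment with the closest RMS."""
    rms = lambda s: float(np.sqrt(np.mean(s ** 2)))
    pool = [corpus[i:i + seg] for i in range(0, len(corpus) - seg + 1, seg)]
    out = [min(pool, key=lambda s: abs(rms(s) - rms(target[i:i + seg])))
           for i in range(0, len(target) - seg + 1, seg)]
    return np.concatenate(out)
```

Swapping in spectral features (centroid, MFCCs) and overlap-adding the chosen segments is what turns this toy into something closer to “Mahler out of Lawrence Welk”.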
March 28, 2006
PLOrk: Princeton Laptop Orchestra
Dan Trueman and Perry Cook at Princeton have set up an undergrad course called PLOrk: Princeton Laptop Orchestra, where they have 15 workstations consisting of Powerbooks, sound cards, sensor interfaces and spherical speakers. The idea is to give students the chance to improvise and experiment with electronic music in a really hands-on way (more info). Great idea! We should try and set up something like that in Oslo.
March 28, 2006
The Silent Speaker
Forbes.com writes about Charles Jorgensen who is working on what he calls subvocal speech recognition. He attaches a set of electrodes to the skin of his throat and his words are recognized by a computer even when he is not producing any sound.
March 27, 2006
MøB
I’m participating in a workshop in Bergen, and got to meet Gisle Frøysland, who is developing MøB, software for installations and realtime manipulation of digital media in GNU/Linux-based networks. I am looking forward to seeing it in action during the workshop.
February 23, 2006
Preview of Jamoma 0.3
I have joined in as a developer of Jamoma, and am currently porting parts of the Musical Gestures Toolbox to this framework. One of the most exciting new things is that Trond Lossius has now rewritten/patched everything so that the whole framework uses Open Sound Control messages for all communication. This helps organize the inner workings of the patches and the flow between modules in Max, and also opens up easier communication with other programming environments and platforms.
February 22, 2006
UBC Max/MSP/Jitter Toolbox
Just came across the UBC Max/MSP/Jitter Toolbox which seems to be quite similar to Jamoma. The UBC Max/MSP/Jitter Toolbox is a collection of modules for creating and processing audio in Max/MSP and manipulating video and 3D graphics using Jitter. I have just briefly tested it, and here are some screenshots from one of the example patches.
February 20, 2006
dbv
dbv is a customizable VJ tool built with Max/MSP/Jitter. Simple, but with some nice implementation details. I particularly like the way it displays video thumbnails, adding extra pages if you have more videos than there is space for in the preview pane.
February 20, 2006
traer.physics
traer.physics is a particle system physics engine for the Processing programming environment. The user community of Processing seems to be growing rapidly these days, and from my few tests of the language it seems to be stable and efficient.
Would be interesting to see if it is possible to combine Processing with Max/MSP/Jitter. OSC is one option, but it would be nice if someone made a wrapper so that it could be possible to run Processing from a Max object.
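OSC itself is simple enough that a message can be assembled by hand in almost any language. As an illustration of how little is involved, here is a stdlib Python sketch that packs an OSC message (int32/float32 arguments only) ready to send over UDP to a Max patch - my own minimal encoder, not a full implementation of the spec:

```python
import struct

def osc_message(address, *args):
    """Pack a minimal OSC message: padded address string, padded type
    tag string, then big-endian int32/float32 argument data."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "".join("i" if isinstance(a, int) else "f" for a in args)
    data = b"".join(struct.pack(">i" if isinstance(a, int) else ">f", a)
                    for a in args)
    return pad(address.encode()) + pad(tags.encode()) + data
```

Sending the resulting bytes with a plain UDP socket is all a Processing or Python sketch needs to talk to a Max patch listening on the other end.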
February 17, 2006
Nord Modular
Clavia has recently released a new version of their software for the Nord Modular, which now includes the possibility to create new settings based on evolutionary algorithms. These algorithms were part of the PhD work of my colleague Palle Dahlstedt from Göteborg, and make it possible to create new settings from a set of “parents”. Very interesting stuff! The software is available as a free download for both Windows and OSX, but of course you need a Clavia synth to really appreciate this…
February 2, 2006
HCI at Stanford University: d.tools
d.tools is a hardware and software system that enables designers to rapidly prototype the bits (the form) and the atoms (the interaction model) of physical user interfaces in concert. d.tools was built to support design thinking rather than implementation tinkering. With d.tools, designers place physical controllers (e.g., buttons, sliders), sensors (e.g., accelerometers), and output devices (e.g., LEDs, LCD screens) directly onto form prototypes, and author their behavior visually in our software workbench.
January 24, 2006
Integrated sensing display
Apple has patented a new Integrated sensing display:
On Jan. 12, the US Patent & Trademark Office revealed Apple’s new patent application titled “Integrated sensing display.” This is certainly the year of the integrated camera, as this patent presents.
An integrated sensing display is disclosed. The sensing display includes display elements integrated with image sensing elements. As a result, the integrated sensing device can not only output images (e.g., as a display) but also input images (e.
January 23, 2006
Quartz Composer
I just got to know about Apple’s Quartz Composer, which has been hiding secretly on my computer for a long time (it is installed with the developer’s tools). Found some great examples from Futurismo Zugakousaku, which really shows the power of the system.
January 15, 2006
New Cycling '74 forum
Just found out that Cycling ‘74 has released a brand new forum. Looks very promising, and it is nice that everything is available as RSS feeds.
December 30, 2005
Quintet.net
Georg Hajdu has just released a new version of his Quintet.net performance system.
“Quintet.net is an interactive network performance environment invented and developed by composer and computer musician Georg Hajdu. It enables up to five performers to play music over the Internet under the control of a “conductor.” The environment, which was programmed with the graphical programming language Max/MSP consists of four components: the Server, the Client, the Conductor and the Listener; the latter component enables the Internet/network audience to follow the performance […].
November 28, 2005
ChucK : Concurrent, On-the-fly Audio Programming Language
I finally got around to downloading and trying ChucK: Concurrent, On-the-fly Audio Programming Language by Ge Wang. It feels a bit strange, but I guess I need to work a little more with it. The readme says something about graphical tools, and I’m looking forward to that.
August 20, 2004
Interaktiv messe
Idea: Imagine Betong full of people who, simply by their presence, help define both the form and content of a communion service. This is Interaktiv messe (Interactive Mass), a non-linear multimedia service organized by Norges kristelige studentforbund on Sunday 29 August at 21:00 at Betong.
Interaktiv messe joins the series of experimental services from Norges kristelige studentforbund. This time, new technology is used to change both the form and content of the service.
December 13, 2001
Laser dance
Working with choreographer Mia Habib, I created the piece Laser Dance, which was shown on 30 November and 1 December 2001 at the Norwegian Academy of Ballet and Dance in Oslo.
The theme of the piece was “Light”, and the choreographer wanted to use direct light sources as the point of departure for the interaction. Mia had decided to work with laser beams, one along the backside of the stage and one on the diagonal, facing towards the audience.
November 28, 2001
Master exam concert
Last week I performed my master exam concert at the Department of Music and Theatre, University of Oslo. The program consisted of improvisations for piano and live electronics. Different MIDI, audio, and video processing techniques were used. Here I describe the different pieces.
Performa: It is incredible how many exciting sounds one can get from a piano, and mallets are a nice change from playing on the keys. The computer helps with temporal adjustments and background sounds.