Below you will find pages that utilize the taxonomy term “motiongram”
May 20, 2023
The effect of skipping frames for video visualization
I have been exploring different video visualizations as part of my annual stillstanding project. Some of these I post as part of my daily Mastodon updates, while others I only test for future publications.
Most of the video visualizations and analyses are made with the Musical Gestures Toolbox for Python and structured as Jupyter Notebooks. I have been pondering whether skipping frames is a good idea. The 360-degree videos that I create visualizations from are shot at 25 fps.
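Whether skipping frames matters depends on the temporal resolution you need from the visualization. A minimal sketch of the trade-off (the skip convention here — keep 1 of every skip+1 frames — is my own assumption, not necessarily what the toolbox does):

```python
def effective_fps(source_fps, skip):
    """Frame rate after keeping 1 of every (skip + 1) frames.

    skip=0 keeps every frame, skip=1 keeps every second frame, etc.
    """
    return source_fps / (skip + 1)

# For 25 fps source material:
rates = {skip: effective_fps(25, skip) for skip in (0, 1, 4)}
print(rates)  # {0: 25.0, 1: 12.5, 4: 5.0}
```

At skip=4 the effective rate drops to 5 fps, which is probably fine for slow standstill material but would alias fast motion.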
May 10, 2023
Visualization of Musique de Table
Musique de Table is a wonderful piece written by Thierry de Mey. I have seen it performed live several times, and here I came across a one-shot video recording that I thought would be interesting to analyse:
I tested it with some of the video visualization tools in the Musical Gestures Toolbox for Python.
For running the commands below, you first need to import the toolbox in Python:
import musicalgestures as mg

I started the process by importing the source video:
December 17, 2021
Flamenco video analysis
I continue my testing of the new Musical Gestures Toolbox for Python. It is one thing to use the toolbox on controlled recordings with stationary cameras and non-moving backgrounds (see examples of visualizations of AIST videos), but it is also interesting to explore “real world” videos (such as the Bergensbanen train journey).
I came across a great video of flamenco dancer Selene Muñoz, and wondered how I could visualize what is going on there:
December 15, 2021
Kayaking motion analysis
Like many others, I bought a kayak during the pandemic, and I have had many nice trips in the Oslo fjord over the last year. Working at RITMO, I think a lot about rhythm these days, and the rhythmic nature of kayaking made me curious to investigate the pattern a little more.
Capturing kayaking motion
My spontaneous investigations into kayak motion began with simply recording a short video of myself kayaking.
February 4, 2021
Visualising a Bach prelude played on Boomwhackers
I came across a fantastic performance of a Bach prelude played on Boomwhackers by Les Objets Volants.
https://www.youtube.com/watch?v=Y5seI0eJZCg
It is really incredible how they manage to coordinate the sticks and make it into a beautiful performance. Given my interest in the visual aspects of music performance, I reached for the Musical Gestures Toolbox to create some video visualisations.
I started with creating an average image of the video:
This image is not particularly interesting.
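Conceptually, an average image is just the per-pixel mean over all frames of the video. A minimal numpy sketch of the idea (not the toolbox code):

```python
import numpy as np

def average_image(frames):
    """Average image: per-pixel mean over all frames.

    frames: iterable of (H, W, 3) uint8 arrays.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0).astype(np.uint8)

# Synthetic three-frame 'video' of flat grey images:
frames = [np.full((4, 4, 3), v, dtype=np.uint8) for v in (0, 100, 200)]
avg = average_image(frames)
print(avg[0, 0])  # each pixel is the mean of 0, 100, 200 -> 100
```

With a static camera, anything stationary stays sharp in the average while moving elements blur out, which is why a mostly static scene gives an uninteresting result.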
January 28, 2021
Analyzing a double stroke drum roll
Yesterday, PhD fellow Mojtaba Karbassi presented his research on impedance control in robotic drumming at RITMO. I will surely get back to discussing more of his research later. Today, I wanted to share the analysis of one of the videos he showed. Mojtaba is working on developing a robot that can play a double stroke drum roll. To explain what this is, he showed this video he had found online, made by John Wooton:
September 8, 2020
Motiongrams of rhythmic chimpanzee swaying
I came across a very interesting study on the Rhythmic swaying induced by sound in chimpanzees. The authors have shared the videos recorded in the study (Open Research is great!), so I was eager to try out some analyses with the Musical Gestures Toolbox for Matlab.
Here is an example of one of the videos from the collection:
The video quality is not very good, so I had my doubts about what I could find.
March 20, 2020
Pixel array images of long videos in FFmpeg
Continuing my explorations of FFmpeg for video visualization, today I came across this very nice blog post on creating “pixel array” images of videos. The idea is to reduce every single frame to only one pixel, and to plot these pixels next to each other on a line. Of course, I wanted to try this out myself.
I find that creating motiongrams or videograms is a good way to visualize the content of videos.
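The pixel-array idea above boils down to one mean colour per frame, laid out along a line. A numpy sketch of that reduction (an illustration of the concept, not the FFmpeg recipe from the blog post):

```python
import numpy as np

def pixel_array(frames):
    """One pixel per frame: the mean colour of each frame, laid out
    on a single line, returned as a (T, 3) array of RGB values.
    """
    return np.stack(
        [f.reshape(-1, 3).mean(axis=0) for f in frames]
    ).astype(np.uint8)

# Synthetic 'video': three flat-coloured frames
frames = [np.full((2, 2, 3), v, dtype=np.uint8) for v in (10, 120, 250)]
line = pixel_array(frames)
# one row per frame: [10 10 10], [120 120 120], [250 250 250]
```

For a long video this compresses hours of footage into a single strip where scene changes show up as colour shifts.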
February 21, 2020
Visualizing some videos from the AIST Dance Video Database
Researchers from AIST have released an open database of dance videos, and I got very excited to try out some visualization methods on some of the files. This was also a good chance to test out some new functionality in the Musical Gestures Toolbox for Matlab that we are developing at RITMO. The AIST collection contains a number of videos. I selected one hip-hop dance video based on a very steady rhythmic pattern, and a contemporary dance video that is more fluid in both motion and music.
January 24, 2020
Motiongram of high-speed violin bowing
I came across a high-speed recording of bowing on a violin string today, and thought it would be interesting to try to analyze it with the new version of the Musical Gestures Toolbox for Python. This is inspired by results from the creation of motiongrams of a high-speed guitar recording that I did some years ago.
Here is the original video:
From this I generated the following motion video:
And from this we get the following motiongram showing the vertical motion of the string (time running from left to right):
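The pipeline above (video, motion video, motiongram) can be sketched in a few lines of numpy. This is an illustration of the idea, not the Musical Gestures Toolbox implementation; the threshold value is an assumption:

```python
import numpy as np

def motiongram(frames, thresh=8):
    """Motiongram sketch: each pair of frames gives a motion image
    (absolute difference), which is collapsed across the width into a
    single column; the columns are stacked left to right, so the
    y-axis keeps vertical position and the x-axis is time."""
    cols = []
    for prev, cur in zip(frames, frames[1:]):
        motion = np.abs(cur.astype(int) - prev.astype(int))
        motion[motion < thresh] = 0       # crude noise reduction
        cols.append(motion.mean(axis=1))  # collapse width -> (H,)
    return np.stack(cols, axis=1)         # (H, T-1)

# A 'string' (bright row) moving down one pixel per frame:
frames = [np.zeros((6, 4), dtype=np.uint8) for _ in range(3)]
for i, f in enumerate(frames):
    f[i + 1, :] = 255
mgram = motiongram(frames)
print(mgram.shape)  # (6, 2)
```

Because only the vertical axis survives the reduction, the vibrating string traces its up-and-down displacement over time.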
August 1, 2013
New publication: Non-Realtime Sonification of Motiongrams
Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works.
May 28, 2013
Kinectofon: Performing with shapes in planes
Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique.
Title
Kinectofon: Performing with shapes in planes
Links
Paper (PDF)
Poster (PDF)
Software
Videos (coming soon)

Abstract
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device.
April 6, 2013
ImageSonifyer
Earlier this year, before I started as head of department, I was working on a non-realtime implementation of my sonomotiongram technique (a sonomotiongram is a sonic display of motion from a video recording, created by sonifying a motiongram). Now I finally found some time to wrap it up and make it available as an OSX application called ImageSonifyer. The Max patch is also available, for those that want to look at what is going on.
February 21, 2013
Are you jumping or bouncing?
One of the most satisfying things about being a researcher is seeing that the ideas, theories, methods, software, and other things you come up with are useful to others. Today I received the master’s thesis of Per Erik Walslag, titled Are you jumping or bouncing? A case-study of jumping and bouncing in classical ballet using the motiongram computer program, in which he has made excellent use of my motiongram technique and my VideoAnalysis software.
January 14, 2013
New publication: Some video abstraction techniques for displaying body movement in analysis and performance
Today the MIT Press journal Leonardo has published my paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance”. The paper is a summary of my work on different types of visualisation techniques of music-related body motion. Most of these techniques were developed during my PhD, but have been refined over the course of my post-doc fellowship.
The paper is available from the Leonardo web page (or MUSE), and will also be posted in the digital archive at UiO after the 6 month embargo period.
December 13, 2012
Performing with the Norwegian Noise Orchestra
Yesterday, I performed with the Norwegian Noise Orchestra at Betong in Oslo, at a concert organised by Dans for Voksne. The orchestra is an ad-hoc group of noisy improvisers, and I immediately felt at home. The performance lasted for 12 hours, from noon to midnight, and I performed for two hours in the afternoon.
For the performance I used my Soniperforma patch based on the sonifyer technique and the Jamoma module I developed a couple of years ago (jmod.
August 13, 2012
Hi-speed guitar recording
I was in Hamburg last week, teaching at the International Summer School in Systematic Musicology (ISSSM). While there, I was able to test a newly acquired high-speed video camera (Phantom V711) at the Department of Musicology.
The beautiful building of the Department of Musicology in Hamburg.
They have some really cool drawings in the ceiling at the entrance of the Department of Musicology in Hamburg.
July 12, 2012
Paper #1 at SMC 2012: Evaluation of motiongrams
Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen.
Abstract
Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above mentioned variables.
June 25, 2012
Record videos of sonification
I got a question the other day about how it is possible to record a sonifyed video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper at this year’s ACHI conference about the technique.
It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max.
February 3, 2012
Sonification of motiongrams
A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper.
See below for the full paper and video examples.
The Paper
Download paper (PDF 2MB)

Abstract: The paper presents a method for sonification of human body motion based on motiongrams.
July 13, 2011
Difference between videogram and motiongram
For some upcoming blog posts on videograms, I will start by explaining the difference between a motiongram and a videogram. Both are temporal (image) representations of video content (as explained here), and are produced almost in the same way. The difference is that videograms start with the regular video image, and motiongrams start with a motion image.
So for a video of my hand like this:
we will get this horizontal videogram:
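The two reductions differ only in their starting point, which a short numpy sketch makes concrete (an illustration of the concept, not my actual implementation; which axis is collapsed for the “horizontal” variant is an assumption here):

```python
import numpy as np

def videogram(frames):
    """Videogram sketch: collapse each ORIGINAL frame to one column
    (mean over the width) and stack the columns left to right.
    A motiongram does the same reduction, but starts from motion
    images (frame differences) instead of the frames themselves."""
    return np.stack([f.mean(axis=1) for f in frames], axis=1)

frames = [np.full((4, 3), v, dtype=np.uint8) for v in (0, 128, 255)]
vgram = videogram(frames)
print(vgram.shape)  # (4, 3): height preserved, one column per frame
```

Because the videogram keeps the original pixel values, static content (like a background) remains visible as constant stripes, while a motiongram shows only what changed.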
November 9, 2010
Sonification of motiongrams
I have made a new Jamoma module for sonification of motiongrams called jmod.sonifyer~. From a live video input, the program generates a motion image which is again transformed into a motiongram. This is then used as the source of the sound synthesis, and “read” as a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram.
See the demonstration video below:
The module is available from the Jamoma source repository, and will probably make it into an official release at some point.
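The “read as a spectrogram” step can be sketched as additive synthesis: each motiongram column becomes one spectral frame, with row position mapped to frequency and brightness to amplitude. This is a sketch of the idea only, not the jmod.sonifyer~ implementation; the frequency range, frame duration, and row-to-frequency mapping are my assumptions:

```python
import numpy as np

def sonify_column(column, sr=8000, dur=0.05, fmin=100.0, fmax=3000.0):
    """Turn one motiongram column into a short audio frame: each row
    drives a sinusoid (here, low rows -> low frequencies), with pixel
    brightness as amplitude."""
    t = np.arange(int(sr * dur)) / sr
    freqs = np.linspace(fmin, fmax, len(column))
    amps = np.asarray(column, dtype=float)
    peak = amps.max()
    if peak > 0:
        amps = amps / peak
    frame = np.zeros_like(t)
    for a, f in zip(amps, freqs):
        frame += a * np.sin(2 * np.pi * f * t)
    return frame / len(column)  # keep the sum within [-1, 1]

# 'Play' an 8-row, 4-column motiongram, one column at a time:
mgram = np.random.default_rng(0).random((8, 4))
audio = np.concatenate(
    [sonify_column(mgram[:, i]) for i in range(mgram.shape[1])]
)
print(len(audio))  # 4 frames x 400 samples = 1600
```

Vertical motion in the image thus comes out as pitch movement in the sound, which is what makes the result a sonification of the original motion.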
July 2, 2010
New motiongram features
Inspired by Static no. 12 by Daniel Crooks, a work I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice:
About motiongrams
A motiongram is a way of displaying motion (e.g. human motion) in the time-domain, somehow similar to how we are used to working with time-representations of audio (e.
August 14, 2009
Presenting mocapgrams
Earlier today I held the presentation “Reduced Displays of Multidimensional Motion Capture Data Sets of Musical Performance” at the ESCOM conference in Jyväskylä, Finland. The presentation included an overview of different approaches to visualization of music-related movement, and also our most recent method: mocapgrams.
While motiongrams are reduced displays created from video files, mocapgrams are intended to work in a similar way, but created from motion capture data. They are conceptually similar, but otherwise quite different in the way they are generated.
August 26, 2008
Open lab
We have slowly been moving into our new lab spaces over the last weeks. The official opening of the labs is scheduled for Friday 26 September, but we had a pre-opening “Open lab” for the new music students last week, and here are some of the pictures shot by Anne Cathrine Wesnes during the presentation.
Here I am telling the students a little about our new research group, and showing the main room:
June 17, 2008
AudioVideoAnalysis
To allow everyone to watch their own synchronised spectrograms and motiongrams, I have made a small application called AudioVideoAnalysis.
Download AudioVideoAnalysis for OS X (8MB)

It currently has the following features:
- Draws a spectrogram from any connected microphone
- Draws a motiongram/videogram from any connected camera
- Press the escape button to toggle fullscreen mode

Built with Max/MSP by Cycling ’74 on OS X 10.5. I will probably make a Windows version at some point, but haven’t gotten that far yet.
June 11, 2008
Motiongrams sync'ed to spectrograms
One of my reasons for developing motiongrams was to have a solution for visualising movement in a way that would be compatible with spectrograms. That way it would be possible to study how movement evolves over time, in relation to how the audio changes over time.
In my current implementation of motiongrams in Max/MSP/Jitter (and partially in EyesWeb), there has been no way to synchronise with a spectrogram. The problem was that the built-in spectrogram in Max/MSP was running much faster than the motiongram, and they were therefore out of sync from the start.
May 15, 2008
Sonification of Traveling Landscapes
I just heard a talk called “Real-Time Synaesthetic Sonification of Traveling Landscapes” (PDF) by Tim Pohle and Peter Knees from the Department of Computational Perception (great name!) in Linz. They have made an application creating music from a moving video camera. The implementation is based on grabbing a one-pixel-wide column from the video, plotting these columns next to each other, and sonifying the resulting image. Interestingly enough, the images they get out of this (see below) are very close to the motiongrams and videograms I have been working on.
February 13, 2008
Motiongrams in EyesWeb!
We had a programming session this morning, and Paolo Coletta implemented a block for creating motiongrams in EyesWeb. It will be available in the new EyesWeb XMI release which will happen in the end of this week. Great!
November 1, 2006
Motiongrams
Challenge
Traditional keyframe displays of videos are not particularly useful when studying single-shot studio recordings of music-related movements, since they mainly show static postural information and no motion.
Using motion images of various kinds helps in visualizing what is going on in the image. Below can be seen (from left): motion image, with noise reduction, with edge detection, with “trails” and added to the original image.
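The first two of those variants can be sketched in a few lines of numpy. This is an illustration of the concepts only; the threshold value and the gradient-based edge detector are my assumptions, not what the toolbox uses:

```python
import numpy as np

def motion_image(prev, cur, thresh=0):
    """Motion image: absolute difference between consecutive frames.
    A thresh > 0 gives the simple noise reduction mentioned above."""
    m = np.abs(cur.astype(int) - prev.astype(int))
    m[m < thresh] = 0
    return m.astype(np.uint8)

def edge_magnitude(img):
    """Crude edge detection via gradient magnitude (illustration only)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

# One changed pixel between two otherwise identical frames:
prev = np.zeros((5, 5), dtype=np.uint8)
cur = prev.copy()
cur[2, 2] = 200
m = motion_image(prev, cur, thresh=50)
print(int(m[2, 2]))  # 200
```

“Trails” and overlay-on-original variants follow the same pattern: accumulate successive motion images with a decay factor, or add the motion image back onto the current frame.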
Making Motiongrams
We are used to visualizing audio with spectrograms, and have been exploring different techniques for visualizing music-related movements in a similar manner.