I have for several years been collaborating with researchers at NTNU in Trondheim on developing video analysis tools for studying the movement patterns of infants. This collaboration has resulted in several papers, international testing, and a TV documentary. Now a new paper is out, presenting very promising results on the reliability of the video analysis method:
Valle, Susanne Collier, Ragnhild Støen, Rannei Sæther, Alexander Refsum Jensenius, and Lars Adde. “Test–retest reliability of computer-based video analysis of general movements in healthy term-born infants.” Early Human Development 91, no. 10 (October 2015): 555–558. doi:10.1016/j.earlhumdev.2015.07.001.
The highlights from the paper:

- Test–retest reliability of computer-based video analysis of general movements was evaluated.
- Results showed high reliability in healthy term-born infants.
- There was a significant association between computer-based video analysis and the temporal organization of fidgety movements.
One of the most satisfying things about being a researcher is seeing that ideas, theories, methods, software and other things you come up with are useful to others. Today I received the master’s thesis of Per Erik Walslag, titled Are you jumping or bouncing? A case-study of jumping and bouncing in classical ballet using the motiongram computer program, in which he makes excellent use of my motiongram technique and my VideoAnalysis software. The thesis was completed at NTNU last year within the program Nordic Master’s in Dance (NoMAds). That master’s program is in itself a great example of how a group of fairly small departments can create an excellent collaborative study program. I was invited to give a guest lecture in the program back in 2009, and am very happy to see that it inspired some thoughts and ideas in the students.
Master’s student Niko Plath was kind enough to set aside some time to set up the system and do a test recording. Niko has been doing fascinating work on measuring the motion of individual piano strings using the high-speed camera. For this type of study, a camera-based approach makes it possible to measure the vibrations of individual strings without having to attach anything to the string or the soundboard.
While Niko has recorded the piano strings at very high speed (500 kHz!) and low resolution (124 x 8 pixels), I was interested in seeing how the camera worked at its maximum resolution (1280 x 800 pixels). At this resolution, the maximum speed is 7 500 frames per second, and the maximum recording duration is 1.1 seconds.
Even though the recording is short, processing and exporting the file (21 GB) takes quite some time. So I only had time to make one recording to try things out: a single strum across all the (open) strings of a guitar, filming the vibrating strings over the soundboard.
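A quick back-of-envelope calculation shows why such a short recording produces such a large file. (The 16 bits per pixel is my assumption for illustration, not a documented figure for this camera.)

```python
# Rough raw data volume of a 1.1 s recording at full resolution.
# The 16 bits (2 bytes) per pixel figure is an assumption.
fps = 7500
duration_s = 1.1
width, height = 1280, 800

frames = int(fps * duration_s)        # 8250 frames in 1.1 seconds
pixels = frames * width * height      # ~8.4 billion pixels in total
gigabytes = pixels * 2 / 1e9          # 2 bytes per pixel

print(frames)                 # 8250
print(round(gigabytes, 1))    # 16.9
```

That is in the same ballpark as the 21 GB file the camera actually produced, the rest presumably being container overhead and a higher bit depth.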
This was just a quick test, so there are several minor problems with the recording: one being that the guitar was placed upside down, so the lower strings appear at the bottom of the recording. Also, I did not hit the upper string very well, so it only resonates a little in the beginning and decays quickly. Still, there is nothing as beautiful as watching high-speed recordings in slow motion. Here you can see a version of the recording played back at 100 frames per second:
Of course, I was interested in creating a motiongram of the recording. Rather than using a regular averaging technique, I used a slit-scan approach, selecting a single pixel column in the middle of the sound hole of the guitar. This was done with a few Jamoma modules in Max, and the patch looked like this:
The full motiongram is available here (TIFF, 11 069 x 800 pixels), and below is a JPEG version in a more screen-friendly format. Even though the recording is only a little more than one second long, it is still possible to see the decay of the vibrations of the strings, particularly for the first strings (from the top).
Below is a version showing only the beginning of the motiongram, and how the individual strings were strummed. Notice the differences in the “cut-off” of the wave shape of each string.
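The slit-scan idea is simple: instead of collapsing each frame with an average, you keep a single pixel column per frame and stack the columns side by side, so the x-axis becomes time. The actual implementation is the Max/Jamoma patch above; this is just a minimal sketch in Python with NumPy, using synthetic frames:

```python
import numpy as np

def slitscan_motiongram(frames, column):
    """Stack one pixel column from each frame into an image
    whose x-axis is time: a slit-scan motiongram."""
    return np.stack([frame[:, column] for frame in frames], axis=1)

# Synthetic example: five 4x6 grayscale frames with a bright
# spot moving down the column we sample (column 3).
frames = [np.zeros((4, 6), dtype=np.uint8) for _ in range(5)]
for t, frame in enumerate(frames):
    frame[t % 4, 3] = 255

mg = slitscan_motiongram(frames, column=3)
print(mg.shape)  # (4, 5): image height x number of frames
```

For the guitar recording, the sampled column runs across all six strings, so each string’s vibration shows up as its own oscillating band in the motiongram.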
No new scientific insights here, but it is always fun to see periodic motion with the naked eye. It is a good reminder that auditory and visual phenomena (and the perception of them) are related. Thanks to Niko for helping out, and to his supervisor Rolf Bader for letting me try the system.
Transformation is a piece in which we use video analysis to control sound selection and spatialisation. We have been developing the setup and the piece over the last couple of years, and performed variations of it at MIC, the Opera house and the music academy last year.
After working happily with FireWire products for many years, the recent disappearance of FireWire ports has made me look for USB-based solutions. For hard drives the switch has been easy, and I also recently got my first USB-based sound card. The hardest part has been figuring out how to handle video cameras.
I have been using various Unibrain cameras for years, and have gotten used to the simplicity of being able to hook up multiple cameras to one computer. Last year, when I tried hooking up multiple USB-based webcams to a computer (running Windows, since they did not work on OSX at all), only one would work at a time. I was therefore pleasantly surprised to find that Logitech’s QuickCam Vision Pro for Mac actually works well on OSX, and you can even have several of them running at the same time (see screenshot)! Now the only problem is the auto-focus and auto-contrast, which tend to cause problems in video analysis (particularly when doing background subtraction).
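To illustrate why auto-contrast is a problem, here is a minimal sketch of plain background subtraction (not the analysis code I actually use): each frame is compared against a reference image, so a global exposure shift makes every pixel look like motion.

```python
import numpy as np

def foreground_mask(frame, background, threshold=20):
    """Flag pixels that differ from the background reference
    by more than `threshold` as motion."""
    return np.abs(frame.astype(int) - background.astype(int)) > threshold

# A static scene: the current frame equals the background reference.
background = np.full((4, 4), 100, dtype=np.uint8)
frame = background.copy()
print(foreground_mask(frame, background).sum())    # 0: nothing moves

# If the camera's auto-contrast brightens the whole image by 30,
# all 16 pixels are wrongly flagged as moving.
shifted = frame + 30
print(foreground_mask(shifted, background).sum())  # 16
```

With a fixed exposure the difference image only lights up where something actually moved, which is why being able to lock these automatic settings matters so much for analysis.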