I have been using Ubuntu as my main OS for the past year, but I have often relied on my old MacBook for various tasks that I have not easily figured out how to do in Linux. One of those is trimming video files non-destructively. This is quite simple in QuickTime, although Apple now forces you to save the file in a QuickTime container (.mov), even though the video inside is still just MPEG-4 compressed (H.264).
There are numerous Linux video editors available, but most of them offer far more features than I need, and they insist on re-compressing the files. I have, however, found two solutions that work well.
The first one, ffmpeg, should be obvious, although I had not realised that it could also do trimming. However, I often prefer GUI software, and I have found that Avidemux can do what I need very easily. Just open a file, add start and stop markers for the section to be trimmed, and click save. Unlike QuickTime, it also allows saving directly to MPEG-4 files (.mp4) without re-encoding.
There was only one thing I had to look up: the trim section has to start on a keyframe in the video. This makes sense when you want to avoid re-encoding, but unfortunately Avidemux does not explain it; it only gives an error message. The trick was to use the >> arrows to jump to the next keyframe, after which the file saved nicely.
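For the ffmpeg route, a lossless trim boils down to stream copying between two points. A minimal sketch, where the filenames and timestamps are just placeholders:

```shell
# Copy (not re-encode) a 20-second section starting at 0:10 of input.mp4.
# With -c copy, ffmpeg can only cut cleanly at keyframes, so the actual
# start point may snap to the nearest keyframe before 0:10 -- the same
# keyframe constraint that Avidemux complains about.
ffmpeg -ss 00:00:10 -i input.mp4 -t 20 -c copy output.mp4
```

Because no decoding or encoding happens, this finishes in seconds even for large files.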
I am happy to announce that I have a new publication out, written together with two of my colleagues, Anne Danielsen and Mari Romarheim Haugen:
Moving to the Beat: Studying Entrainment to Micro-Rhythmic Changes in Pulse by Motion Capture
Pulse is a fundamental reference for the production and perception of rhythm. In this paper, we study entrainment to changes in the micro-rhythmic design of the basic pulse of the groove in ‘Left & Right’ by D’Angelo. In part 1 of the groove the beats have one specific position; in part 2, on the other hand, the different rhythmic layers specify two simultaneous but alternative beat positions that are approximately 50-80 ms apart. We first anticipate listeners’ perceptual response using the theories of entrainment and dynamic attending as points of departure. We then report on a motion capture experiment aimed at engaging listeners’ motion patterns in response to the two parts of the tune. The results show that when multiple onsets are introduced in part 2, the half note becomes a significant additional level of entrainment and the temporal locations of the perceived beats are drawn towards the added onsets.
I just got a message from Google Scholar that my Ph.D. dissertation has been cited 100 times! Writing a dissertation is a lot of hard work, and I am very happy that other people find it worth reading and citing.
I have written a chapter called From experimental music technology to clinical tool in the newly published anthology Music, Health, Technology and Design, edited by Karette A. Stensæth from the Norwegian Academy of Music. Here is the summary of the book:
This anthology presents a compilation of articles that explore the many intersections of music, health, technology and design. The first and largest part of the book includes articles deriving from the multidisciplinary research project called RHYME (www.rhyme.no). They engage with the study of the design, development, and use of digital and musical ‘co-creative tangibles’ for the potential health benefit of families with a child having physical or mental needs.
And here is the abstract of my chapter:
Human body motion is integral to all parts of musical experience, from performance to perception. But how is it possible to study body motion in a systematic manner? This article presents a set of video-based visualisation techniques developed for the analysis of music-related body motion, including motion images, motion-history images and motiongrams. It includes examples of how these techniques have been used in studies of music and dance performances, and how they, quite unexpectedly, have become useful in laboratory experiments on ADHD and clinical studies of CP. Finally, it includes reflections regarding what music researchers can contribute to the study of human motion and behaviour in general.
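To give a rough idea of what these visualisations involve (this is my own minimal sketch, not the chapter's actual implementation): a motion image is the pixel-wise difference between consecutive video frames, and a motiongram stacks collapsed motion images over time. The function names, the noise threshold, and the tiny synthetic "video" below are all illustrative assumptions.

```python
import numpy as np

def motion_image(prev, curr, threshold=0.05):
    """Absolute pixel-wise difference between two consecutive grayscale
    frames, with small differences (camera noise) zeroed out."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    diff[diff < threshold] = 0.0
    return diff

def motiongram(frames):
    """Collapse each motion image to a single column (mean over the
    horizontal axis) and stack the columns over time, visualising
    vertical motion. Shape: (height, n_frames - 1)."""
    cols = [motion_image(a, b).mean(axis=1)
            for a, b in zip(frames[:-1], frames[1:])]
    return np.stack(cols, axis=1)

# Tiny synthetic example: a 2x2 bright "object" moving down an 8x8 frame.
frames = []
for t in range(5):
    f = np.zeros((8, 8))
    f[t:t + 2, 3:5] = 1.0  # object occupies rows t and t+1
    frames.append(f)

mg = motiongram(frames)
print(mg.shape)  # (8, 4): 8 pixel rows, 4 frame-to-frame transitions
```

In the resulting motiongram, the bright band drifts downwards over the four columns, tracing the object's vertical trajectory through time.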
A couple of weeks ago, NRK, the Norwegian broadcasting company, screened a documentary about my research together with the physiotherapists at NTNU in the CIMA project. The short story is that we have developed the tools I first made for the Musical Gestures Toolbox during my PhD into a system that aims to detect signs of cerebral palsy in infants.
The documentary was made for the science program Schrödingers Katt, and I am very happy that they spent so much time on developing the story, filming and editing. The video can be seen (in 3 parts) on NRK Nett-TV (at least within Norway), and below are a few screenshots.