Releasing the Musical Gestures Toolbox for Python

After several years in the making, we finally “released” the Musical Gestures Toolbox for Python at the NordicSMC Conference this week. The toolbox is a collection of modules targeted at researchers working with video recordings.

Below is a short video in which Bálint Laczkó and I briefly describe the toolbox:

About MGT for Python

The Musical Gestures Toolbox for Python includes video visualization techniques such as creating motion videos, motion history images, and motiongrams. These visualizations allow for studying video recordings from different temporal and spatial perspectives. The toolbox also includes basic computer vision methods, and it is designed to integrate well with audio analysis toolboxes.
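As a rough sketch, a typical analysis might look something like this (assuming the package is installed from PyPI as musicalgestures and exposes an MgVideo class with motion() and history() methods; check the documentation for the exact, current API):

```python
# Minimal sketch of an MGT for Python workflow.
# NOTE: assumes the toolbox installs as `musicalgestures` and exposes an
# `MgVideo` class with `motion()` and `history()` methods -- consult the
# toolbox documentation for the exact, current API.
import musicalgestures

# Load a video recording (the file name is a placeholder).
mg = musicalgestures.MgVideo("dance.avi")

# Create a motion video and motiongrams based on frame differencing.
mg.motion()

# Create a motion history video, showing where motion happened recently.
mg.history()
```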

It is possible to run the toolbox from the terminal:

Example of running MGT for Python in a terminal.

Many people would probably prefer to run it in a Jupyter notebook:

Screenshots from the example Jupyter Notebook.

The MGT was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.

History

This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. The latest version was primarily developed by Bálint Laczkó, Frida Furmyr, and Marcus Widmer.

Read more

To learn more about the Musical Gestures Toolbox for Python, take a look at our paper presented at NordicSMC.

RaveForce: A Deep Reinforcement Learning Environment for Music Generation

My PhD student Qichao Lan is at SMC in Malaga this week, presenting the paper:

Lan, Qichao, Jim Tørresen, and Alexander Refsum Jensenius. “RaveForce: A Deep Reinforcement Learning Environment for Music Generation.” Proceedings of the Sound and Music Computing Conference. Malaga, 2019.

The framework that Qichao has developed runs nicely with a bridge between Jupyter Notebook and SuperCollider. This opens up many possibilities for interesting experiments in the years to come.

Abstract:

RaveForce is a programming framework designed for a computational music generation method that involves audio-sample-level evaluation of generated symbolic music representations. It comprises a Python module and a SuperCollider quark. When connected with deep learning frameworks in Python, RaveForce can send the symbolic music representation generated by the neural network as Open Sound Control messages to SuperCollider for non-realtime synthesis. SuperCollider can convert the symbolic representation into an audio file, which is sent back to Python as the input of the neural network. Through this iterative training, the neural network can be improved with deep reinforcement learning algorithms, taking the quantitative evaluation of the audio file as the reward. In this paper, we find that the proposed method can be used to search for new synthesis parameters for a specific timbre of an electronic music note or loop.
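To make the training loop a bit more concrete, here is a conceptual sketch of the cycle described above. Note that this is not the actual RaveForce API: the OSC address, the rendered file path, and the reward function are placeholders, and the real toolkit handles the SuperCollider side with its own quark.

```python
# Conceptual sketch of the iterative loop described in the abstract.
# NOTE: this is NOT the RaveForce API -- the OSC address, file path, and
# reward function below are placeholders for illustration only.
import time
import numpy as np
import soundfile as sf
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # default SuperCollider port

def step(params, target_audio, rendered_path="/tmp/render.wav"):
    """Send synthesis parameters to SuperCollider, wait for the rendered
    audio file, and return a reward based on how close it is to a target."""
    client.send_message("/raveforce/step", list(params))  # placeholder address
    time.sleep(1.0)  # in practice: wait for a "done" message or poll the file
    audio, sr = sf.read(rendered_path)
    # Placeholder reward: negative mean squared error against a target sound.
    n = min(len(audio), len(target_audio))
    return -float(np.mean((audio[:n] - target_audio[:n]) ** 2))
```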

Mobile Python on S60 to Max/MSP

Richard Widerberg held a workshop today on using Mobile Python on Nokia phones running Symbian OS S60. He has gathered links to everything needed to get a connection up and running with Pd. I now have a simple script up and running, communicating with Max/MSP through the serial object. It works, but it feels a bit limiting to only have one-dimensional control (joystick up/down) plus the number keys for interaction. I should definitely get a phone with an accelerometer to test some more motion-related applications.
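For anyone curious, the phone-side script can be quite small. Here is a rough sketch, assuming PyS60 1.4.x-style modules (appuifw, e32, key_codes, and the Bluetooth-enabled socket module) and a placeholder RFCOMM channel, of how key presses could be forwarded over a Bluetooth serial connection and read in Max/MSP with the serial object:

```python
# Rough sketch of a PyS60 (Python for S60) script that sends key presses
# over a Bluetooth serial connection, to be read in Max/MSP via [serial].
# NOTE: assumes PyS60 1.4.x-style modules (appuifw, e32, key_codes, socket
# with AF_BT); the RFCOMM channel number is a placeholder.
import appuifw, e32, socket
from key_codes import EKeyUpArrow, EKeyDownArrow

sock = socket.socket(socket.AF_BT, socket.SOCK_STREAM)
address, services = socket.bt_discover()  # pick the computer to connect to
sock.connect((address, 1))                # RFCOMM channel 1 (placeholder)

def handle_event(event):
    # Forward arrow keys and number keys as single characters.
    if event["type"] == appuifw.EEventKey:
        if event["keycode"] == EKeyUpArrow:
            sock.send("U")
        elif event["keycode"] == EKeyDownArrow:
            sock.send("D")
        elif 0x30 <= event["keycode"] <= 0x39:  # number keys 0-9
            sock.send(chr(event["keycode"]))

appuifw.app.body = appuifw.Canvas(event_callback=handle_event)
app_lock = e32.Ao_lock()
app_lock.wait()  # keep the script running until the app is closed
```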