Participating in the opening of The Guild

I participated in the opening of the Guild of Research Universities in Brussels yesterday. The Guild is

a transformative network of research-led universities from across the European continent, formed to strengthen the voice of universities in Europe, and to lead the way through new forms of collaboration in research, innovation and education.

The topic of the opening symposium was Open Innovation, a hot topic these days and something the European Commission is pushing hard for. I was invited to present an example of how open research can lead to innovation, and to participate in a panel discussion. Below is an image of the setting, in the lovely Solvay Library in the heart of Brussels (and great to see that the 360-degree plugin works in WordPress!):


Ole Petter Ottersen, Chair of The Guild and Rector of the University of Oslo, opened the symposium (click and drag to rotate image).

From basic music research to hospital application

At the symposium I showed a shortened version of the TV documentary that tells the unlikely story of how my basic music research has led to medical innovation. In 2005 I developed a method for visualizing the movements of dancers – motiongrams – with a set of accompanying software tools. As an open source advocate, I made these software tools freely available online, and witnessed how my code was picked up by artists, designers, hackers and researchers. Now my method is at the core of the system Computer-based Infant Movement Assessment (CIMA). This is a clinical system currently being tested in hospitals around the world, with the aim of detecting prematurely born infants’ risk of developing cerebral palsy.

Panel discussion

The panel discussion centered mainly on policy, and it was great to see that both European university leaders and the Commission embrace openness in all its forms. Head of Cabinet Antonio Vicente argued convincingly that Europe started late, but is quickly catching up in its push for openness (access, data, research, innovation). The question now is how we actually do this.

I think the EU should get a lot of credit for its bold push for open research, but individual universities need to push for the same type of openness throughout their institutions. Perhaps the biggest challenge is changing the mentality of peers, who ultimately are the ones deciding who gets project funding, appointments and promotions. I see that we often fail to recruit young researchers with an inclination towards open research. Such applicants consistently get evaluated as “weaker” in comparison with researchers following more traditional academic pathways.

Moving forward, we need to continue the (inter)national push, but we should not forget the need for a culture change among individuals. This is something we need to work on at the institutional level.


A view from my panel position during the symposium (click and drag to rotate image).


From Basic Music Research to Medical Tool

The Research Council of Norway is currently evaluating research in the humanities, and all institutions were asked to submit cases of societal impact. Obviously, basic research is by definition not aiming at societal impact in the short run, and my research definitely falls into that category. Still, it is interesting to see that some of my basic research is, indeed, on the verge of making a societal impact in the sense that policy makers like to think about. So I submitted the impact case “From Music to Medicine”, based on the system Computer-based Infant Movement Assessment (CIMA).

Musical Gestures Toolbox

CIMA is based on the Musical Gestures Toolbox (MGT), which started its life in the early 2000s, and which (in different forms) has been shared publicly since 2005.

My original aim in developing the MGT was to study musicians’ and dancers’ motion in a simple and holistic way. The focus was always on capturing as much relevant information as possible from a regular video recording, with a particular eye on the temporal development of human motion.
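
To give a concrete idea of the kind of analysis this involves, here is a minimal motiongram sketch in Python (using OpenCV and NumPy). It only illustrates the principle of frame differencing followed by row-wise averaging; the actual toolbox runs in Max/Jamoma and differs in many details.

# Minimal motiongram sketch (illustration only, not the actual toolbox code):
# frame differencing followed by row-wise averaging, so that each video
# frame becomes one column and time runs along the horizontal axis.
import cv2
import numpy as np

def motiongram(video_path, threshold=10):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("Could not read video: " + video_path)
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    columns = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        motion = cv2.absdiff(gray, prev)       # motion image
        motion[motion < threshold] = 0         # simple noise reduction
        columns.append(motion.mean(axis=1))    # average each row -> one column
        prev = gray
    cap.release()
    return np.array(columns).T                 # rows = image rows, columns = time

# Hypothetical usage: save a horizontal motiongram of a dance video as an image
# mg = motiongram("dancer.mp4")
# cv2.imwrite("motiongram.png",
#             cv2.normalize(mg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8))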

The MGT was first developed as a set of standalone modules in the graphical programming environment Max, and was merged into the Jamoma framework in 2006. Jamoma is a modular system developed and used by an international group of artists, led by Timothy Place and Trond Lossius. The video analysis tools have since been used in a number of music/dance productions worldwide and are also actively used in arts education.

Studying ADHD

In 2006, I presented this research at the annual celebration of Norwegian research in the Oslo Concert Hall, after which Professor Terje Sagvolden asked to test the video analysis system in his research on ADHD/ADD at Oslo University Hospital. This eventually led to a collaboration in which the Musical Gestures Toolbox was used to analyse 16 rat cages in his lab. The system was also tested in the large-scale clinical ADHD study at Ullevål University Hospital in 2008 (1000+ participants). The collaboration ended abruptly with Sagvolden’s death in 2011.

Studying Cerebral Palsy

The unlikely collaboration between researchers in music and medicine was featured in a newspaper article and a TV documentary in 2008, after which physiotherapist Lars Adde from the Department of Laboratory Medicine, Women’s and Children’s Health at the Norwegian University of Science and Technology (NTNU) called me to ask whether the tools could also be used to study infants. This has led to a long and fruitful collaboration and the development of the prototype Computer-based Infant Movement Assessment (CIMA), which is currently being tested in hospitals in Norway, the USA, India, China and Turkey. A pre-patent has been filed, and the aim is to provide a complete video-based solution for screening infants for the risk of developing cerebral palsy (CP).

It is documented that up to 18% of surviving infants who are born extremely preterm develop cerebral palsy (CP), and the total rate of neurological impairments is up to 45%. Specialist examination can be used to detect infants at risk of developing CP, but this resource is only available at some hospitals. CIMA aims to offer a standardised and affordable computer-based screening solution so that a much larger group of infants can be screened at an early stage, and those found to be at risk can be referred for further specialist examination. Early intervention is critical to improving the motor capacities of these infants. The success of the CIMA methods developed on the MGT framework is to a large part based on the original focus on studying human motion through a holistic, simple and time-based approach.
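
To illustrate the kind of simple, time-based video feature such an approach builds on, here is a small Python sketch (OpenCV and NumPy) that tracks the centroid of the motion image from frame to frame and summarises its variability over a recording. This is only an illustration, not the actual CIMA pipeline; the clinical methods and their validation are described in the references below.

# Illustrative only (not the actual CIMA pipeline): track the centroid of
# the motion image from frame to frame, then summarise its variability.
import cv2
import numpy as np

def centroid_trajectory(video_path, threshold=10):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("Could not read video: " + video_path)
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    centroids = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        motion = cv2.absdiff(gray, prev).astype(float)
        motion[motion < threshold] = 0          # simple noise reduction
        total = motion.sum()
        if total > 0:                           # skip frames without motion
            ys, xs = np.indices(motion.shape)
            centroids.append((np.sum(xs * motion) / total,
                              np.sum(ys * motion) / total))
        prev = gray
    cap.release()
    return np.array(centroids)

# Hypothetical usage: variability of the motion centroid over a recording
# c = centroid_trajectory("recording.mp4")
# print(c.std(axis=0))   # standard deviation in x and y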

The unlikely collaboration was featured in a new TV documentary in 2014.

References

  • Valle, S. C., Støen, R., Sæther, R., Jensenius, A. R., & Adde, L. (2015). Test–retest reliability of computer-based video analysis of general movements in healthy term-born infants. Early Human Development, 91(10), 555–558. http://doi.org/10.1016/j.earlhumdev.2015.07.001
  • Jensenius, A. R. (2014). From experimental music technology to clinical tool. In K. Stensæth (Ed.), Music, health, technology, and design. Oslo: Norwegian Academy of Music. Retrieved from http://urn.nb.no/URN:NBN:no-46186
  • Adde, L., Helbostad, J., Jensenius, A. R., Langaas, M., & Støen, R. (2013). Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings. Physiotherapy Theory and Practice, 29(6), 469–475. http://doi.org/10.3109/09593985.2012.757404
  • Jensenius, A. R. (2013). Some video abstraction techniques for displaying body movement in analysis and performance. Leonardo, 46(1), 53–60. http://urn.nb.no/URN:NBN:no-38076
  • Adde, L., Langaas, M., Jensenius, A. R., Helbostad, J. L., & Støen, R. (2011). Computer Based Assessment of General Movements in Young Infants using One or Two Video Recordings. Pediatric Research, 70, 295–295. http://doi.org/10.1038/pr.2011.520
  • Adde, L., Helbostad, J. L., Jensenius, A. R., Taraldsen, G., Grunewaldt, K. H., & Støen, R. (2010). Early prediction of cerebral palsy by computer-based video analysis of general movements: a feasibility study. Developmental Medicine & Child Neurology, 52(8), 773–778. http://doi.org/10.1111/j.1469-8749.2010.03629.x
  • Adde, L., Helbostad, J. L., Jensenius, A. R., Taraldsen, G., & Støen, R. (2009). Using computer-based video analysis in the study of fidgety movements. Early Human Development, 85(9), 541–547. http://doi.org/10.1016/j.earlhumdev.2009.05.003
  • Jensenius, A. R. (2007). Action–Sound: Developing Methods and Tools to Study Music-Related Body Movement (PhD thesis). University of Oslo. http://urn.nb.no/URN:NBN:no-18922

New SMC paper: Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music

My colleague Ragnhild Torvanger Solberg and I presented a paper at the Sound and Music Computing Conference in Hamburg last week: “Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music”.

This is a methodological paper summarizing our experiences with using our Qualisys motion capture system for group dance studies. We have two other papers in the pipeline that describe the actual data from the experiments in question. The happy conclusion in the SMC paper is that it is, indeed, possible to get good tracking of multiple people, although it requires quite a bit of fine-tuning of the system.

Download: Fulltext (PDF)

Abstract: What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.
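
One practical issue when running the two systems side by side is that they deliver different quantities: the optical system gives marker positions, while the inertial trackers give accelerations. The following Python sketch shows one way such streams could be compared, assuming hypothetical data arrays and sample rates; it is not code from the paper.

# Hypothetical comparison of an optical and an inertial data stream:
# differentiate marker position twice to estimate acceleration, then
# correlate with the accelerometer signal. Illustrative only; note that
# a real accelerometer also measures gravity, which is ignored here.
import numpy as np
from scipy.signal import resample

def compare_streams(marker_pos, fs_optical, accel):
    """marker_pos: (N, 3) marker positions in metres at fs_optical Hz.
       accel: (M, 3) accelerometer readings in m/s^2."""
    dt = 1.0 / fs_optical
    est_accel = np.gradient(np.gradient(marker_pos, dt, axis=0), dt, axis=0)
    accel_rs = resample(accel, est_accel.shape[0])   # match the optical frame count
    a = np.linalg.norm(est_accel, axis=1)            # acceleration magnitudes
    b = np.linalg.norm(accel_rs, axis=1)
    return np.corrcoef(a, b)[0, 1]

# Hypothetical usage, with a made-up optical sample rate:
# r = compare_streams(qualisys_xyz, 200, axivity_xyz)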

Citation: Solberg, R. T., & Jensenius, A. R. (2016). Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music. In Proceedings of the Sound and Music Computing Conference (pp. 469–474). Hamburg.

BibTeX
@inproceedings{solberg_optical_2016,
address = {Hamburg},
title = {Optical or {Inertial}? {Evaluation} of {Two} {Motion} {Capture} {Systems} for {Studies} of {Dancing} to {Electronic} {Dance} {Music}},
isbn = {978-3-00-053700-4},
abstract = {What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group studies of dancers. This is exemplified through experiments with a Qualisys infrared motion capture system being used alongside a set of small inertial trackers from Axivity and regular video recordings. The conclusion is that it is possible to fine-tune an infrared tracking system to work satisfactory for group studies of complex body motion in a “club-like” environment. For ecological studies in a real club setting, however, inertial tracking is the most scalable and flexible solution.},
booktitle = {Proceedings of the {Sound} and {Music} {Computing} {Conference}},
author = {Solberg, Ragnhild Torvanger and Jensenius, Alexander Refsum},
year = {2016},
pages = {469--474},
}

New NIME paper: “The ‘Virtualmonium’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra”

The third NIME contribution from the fourMs lab this year was the paper:

The ‘Virtualmonium’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra

Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ‘classical’ loudspeaker orchestra. Although such orchestras remain common within UK institutions as well as in France, they are in decline in the rest of the world. In contrast, permanent, high-density loudspeaker arrays are on the rise, as is the practical application of 3-D audio technologies. Looking to the future, we need to reconcile the performance of historical and new stereo works, side-by-side native 3-D compositions. In anticipation of this growing need, we have designed and tested a prototype ‘Virtualmonium’. The Virtualmonium is an instrument for classical diffusion performance over an acousmonium emulated in higher-order Ambisonics. It allows composers to custom-design loudspeaker orchestra emulations for the performance of their works, rehearse and refine performances off-site, and perform classical repertoire alongside native 3-D formats in the same concert. This paper describes the technical design of the Virtualmonium, assesses the success of the prototype in some preliminary listening tests and concerts, and speculates how the instrument can further composition and performance practice.
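
As a toy illustration of the basic principle of emulating loudspeakers as virtual sources in an Ambisonics scene, here is a minimal first-order (B-format) encoding sketch in Python. The Virtualmonium itself uses higher-order Ambisonics with proper decoding and rendering, so this is only meant to convey the idea, with assumed loudspeaker positions.

# Toy first-order (B-format) encoder: each emulated loudspeaker of the
# 'orchestra' becomes a virtual mono source at a chosen azimuth/elevation.
# The Virtualmonium uses higher-order Ambisonics; this only shows the idea.
import numpy as np

def encode_foa(signal, azimuth_deg, elevation_deg=0.0):
    """Encode a mono signal into traditional B-format (W, X, Y, Z)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = signal / np.sqrt(2.0)
    x = signal * np.cos(az) * np.cos(el)
    y = signal * np.sin(az) * np.cos(el)
    z = signal * np.sin(el)
    return np.stack([w, x, y, z])

# Hypothetical layout: a stereo work sent to two virtual loudspeakers at +/-30 degrees
# scene = encode_foa(left_channel, 30) + encode_foa(right_channel, -30)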

Reference
Barrett, N., & Jensenius, A. R. (2016). The “Virtualmonium”: an instrument for classical sound diffusion over a virtual loudspeaker orchestra. In Proceedings of the International Conference on New Interfaces For Musical Expression (pp. 55–60). Brisbane.

BibTeX

@inproceedings{barrett_virtualmonium:_2016,
    address = {Brisbane},
    title = {The ‘{Virtualmonium}’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra},
    abstract = {Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ‘classical’ loudspeaker orchestra. Although such orchestras remain common within UK institutions as well as in France, they are in decline in the rest of the world. In contrast, permanent, high-density loudspeaker arrays are on the rise, as is the practical application of 3-D audio technologies. Looking to the future, we need to reconcile the performance of historical and new stereo works, side-by-side native 3-D compositions. In anticipation of this growing need, we have designed and tested a prototype ‘Virtualmonium’. The Virtualmonium is an instrument for classical diffusion performance over an acousmonium emulated in higher-order Ambisonics. It allows composers to custom-design loudspeaker orchestra emulations for the performance of their works, rehearse and refine performances off-site, and perform classical repertoire alongside native 3-D formats in the same concert. This paper describes the technical design of the Virtualmonium, assesses the success of the prototype in some preliminary listening tests and concerts, and speculates how the instrument can further composition and performance practice.},
    booktitle = {Proceedings of the {International} {Conference} on {New} {Interfaces} {For} {Musical} {Expression}},
    author = {Barrett, Natasha and Jensenius, Alexander Refsum},
    year = {2016},
    pages = {55--60},
}

New paper: “NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs”

At NIME we have a large archive of the conference proceedings, but we do not (yet) have a proper repository for instrument designs. For that reason I took part in a workshop on Monday aimed at laying the groundwork for such a repository:

NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs [PDF]

This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME’s reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.
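
As a purely hypothetical illustration (no schema was decided at the workshop), a repository entry for an instrument design could carry metadata along these lines; all field names below are invented for the example:

# Hypothetical metadata record for a DMI repository entry.
# All field names are invented for illustration purposes only.
instrument_entry = {
    "name": "example-instrument",          # placeholder name
    "authors": ["A. Designer", "B. Builder"],
    "year": 2016,
    "description": "Short description of the instrument and how it is played.",
    "hardware": ["schematics/board.pdf", "cad/enclosure.stl"],
    "software": ["firmware/", "patches/main.maxpat"],
    "media": ["audio/demo.wav", "video/performance.mp4"],
    "license": "CC-BY-4.0",
    "related_publications": ["(DOI or URL of a related NIME paper)"],
}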

Reference
McPherson, A. P., Berdahl, E., Lyons, M. J., Jensenius, A. R., Bukvic, I. I., & Knudsen, A. (2016). NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs. In Proceedings of the International Conference on New Interfaces For Musical Expression. Brisbane.

BibTeX

@inproceedings{mcpherson_nimehub:_2016,
    address = {Brisbane},
    title = {{NIMEhub}: {Toward} a {Repository} for {Sharing} and {Archiving} {Instrument} {Designs}},
    abstract = {This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME's reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.},
    booktitle = {Proceedings of the {International} {Conference} on {New} {Interfaces} {For} {Musical} {Expression}},
    author = {McPherson, Andrew P. and Berdahl, Edgar and Lyons, Michael J. and Jensenius, Alexander Refsum and Bukvic, Ivica Ico and Knudsen, Arve},
    year = {2016},
}