Starting afresh

After four years as Head of Department (of Musicology at UiO), I am going back to my regular associate professor position in January. It has been both a challenging and rewarding period as HoD, during which I have learned a lot about managing people, managing budgets, understanding huge organizations, developing strategies, talking to all sorts of people at all levels in the system, and much more.

I am happy to hand over a Department in growth to the new HoD (Peter Edwards). We have implemented a new bachelor’s program, launched UiO’s first MOOC (Music Moves), and hired a number of new people, just to mention a few of the things I have worked on over the last few years. I am also proud that we got our new appointment plan approved just before Christmas, aiming at hiring up to seven new professors within the next five years. Humanities departments are under a lot of pressure these days, so I am very grateful that we are in a position to expand in the coming years!

I have only been teaching sporadically while being HoD, so I am excited about getting back to running the course Interactive Music that I started up a while back. This is a so-called “practical-theoretical” course, aiming at giving students a holistic perspective on designing musical instruments and systems. I published a paper on the design of this course a few years ago (An action–sound approach to teaching interactive music), and have since gathered more ideas that I want to test out, combining the teaching of music cognition and technology around some concrete designs. If successful, I hope these ideas will turn into my next book project.

I am also excited about starting up my new research project MICRO – Human Bodily Micromotion in Music Perception and Interaction, in which we will focus on how music influences us when we are at rest. Fortunately, the fourMs lab is really getting up to speed now, so we will be able to study micromotion in great detail.

In getting ready for my new working life, I decided to wipe my main computer (a Lenovo Yoga Pro 2) yesterday. I have been running various versions of Ubuntu over the last years (Ubuntu Studio, Ubuntu GNOME, and Linux Mint), but decided to go for the regular Ubuntu 16.10 this time around. I think Unity has matured quite a bit now, and works very well on the Yoga’s multitouch HiDPI display. This was my first complete reinstall since I got the laptop almost three years ago, and it was definitely needed. I always test a lot of different software and settings, so the system had gotten clogged up by lots of weird stuff on top of each other. The new clean system definitely feels smooth and well-functioning. It feels like a digital and mental “shower”, getting ready for the new year!

New paper: “NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs”

At NIME we have a large archive of the conference proceedings, but we do not (yet) have a proper repository for instrument designs. For that reason I took part in a workshop on Monday with the aim of laying the groundwork for a new repository:

NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs [PDF]

This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME’s reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.

Reference
McPherson, A. P., Berdahl, E., Lyons, M. J., Jensenius, A. R., Bukvic, I. I., & Knudsen, A. (2016). NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs. In Proceedings of the International Conference on New Interfaces for Musical Expression. Brisbane.

BibTeX

@inproceedings{mcpherson_nimehub:_2016,
    address = {Brisbane},
    title = {{NIMEhub}: {Toward} a {Repository} for {Sharing} and {Archiving} {Instrument} {Designs}},
    abstract = {This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, open datasets, established evaluation methods and community standards for research practice. NIME could benefit from similar practices, both to share ideas amongst geographically distant researchers and to maintain instrument designs after their first performances. However, the needs of NIME are different from other communities on account of NIME's reliance on custom hardware designs and the interdependence of technology and arts practice. This half-day workshop will promote a community discussion of the potential benefits and challenges of a DMI repository and plan concrete steps toward its implementation.},
    booktitle = {Proceedings of the {International} {Conference} on {New} {Interfaces} {For} {Musical} {Expression}},
    author = {McPherson, Andrew P. and Berdahl, Edgar and Lyons, Michael J. and Jensenius, Alexander Refsum and Bukvic, Ivica Ico and Knudsen, Arve},
    year = {2016}
}

New MOOC: Music Moves

Together with several colleagues, and with great practical and economic support from the University of Oslo, I am happy to announce that we will soon kick off our first free online course (a so-called MOOC) called Music Moves.

Music Moves: Why Does Music Make You Move?

Learn about the psychology of music and movement, and how researchers study music-related movements, with this free online course.

Go to course – starts 1 Feb

About the course

Music is movement. A bold statement, but one that we will explore in this free online course. Together we will study music through different types of body movement. This includes everything from the sound-producing keyboard actions of a pianist to the energetic dance moves in a club.

You will learn about the theoretical foundations for what we call embodied music cognition and why body movement is crucial for how we experience the emotional moods in music. We will also explore different research methods used at universities and conservatories. These include advanced motion capture systems and sound analysis methods.

You will be guided by a group of music researchers from the University of Oslo, with musical examples from four professional musicians. The course is rich in high-quality text, images, video, audio and interactive elements.

Join us to learn more about terms such as entrainment and musical metaphors, and why it is difficult to sit still when you experience a good groove.

  • FREE online course
  • 3 hours per week
  • Certificates available

Educators

Alexander Refsum Jensenius

Diana Kayser (Mentor)

Hans T. Zeiner-Henriksen

Kristian Nymoen

Requirements

This course is open to everyone. No technical knowledge of music or dance is required.

Get a personalised, digital and printed certificate

You can buy a Statement of Participation for this course — a personalised certificate in both digital and printed formats — to show that you’ve taken part.

Join the conversation on social media

Use the hashtag #FLmusicmoves to join and contribute to social media conversations about this course.


New publication: An Action-Sound Approach to Teaching Interactive Music

My paper titled An action–sound approach to teaching interactive music has recently been published in Organised Sound. The paper is based on some of the theoretical ideas on action–sound couplings developed in my PhD, combined with a description of how I designed the course Interactive Music around such an approach to music technology.

Abstract
The conceptual starting point for an “action-sound approach” to teaching music technology is the acknowledgment of the couplings that exist in acoustic instruments between sounding objects, sound-producing actions and the resultant sounds themselves. Digital music technologies, on the other hand, are not limited to such natural couplings, but allow for arbitrary new relationships to be created between objects, actions and sounds. The endless possibilities of such virtual action-sound relationships can be exciting and creatively inspiring, but they can also lead to frustration among performers and confusion for audiences. This paper presents the theoretical foundations for an action-sound approach to electronic instrument design and discusses the ways in which this approach has shaped the undergraduate course titled “Interactive Music” at the University of Oslo. In this course, students start out by exploring various types of acoustic action-sound couplings before moving on to designing, building, performing and evaluating both analogue and digital electronic instruments from an action-sound perspective.

Reference
Jensenius, A. R. (2013). An action–sound approach to teaching interactive music. Organised Sound, 18(2):178–189.

BibTeX

@article{Jensenius:2013b,
 Author = {Jensenius, Alexander Refsum},
 Journal = {Organised Sound},
 Number = {2},
 Pages = {178--189},
 Title = {An Action--Sound Approach to Teaching Interactive Music},
 Volume = {18},
 Year = {2013}}
 

Teaching in Aldeburgh

I am currently in beautiful Aldeburgh, a small town on the east coast of England, teaching at the Britten-Pears Young Artist Programme together with Rolf Wallin and Tansy Davies. This post is mainly to summarise the things I have been going through, and provide links for various things.

Theoretical stuff

My introductory lectures went through some of the theory of an embodied understanding of the experience of music. One aspect of this theory that I find very relevant for the development of interactive works is what I call action-sound relationships. By this I mean that we have an intuitive understanding of how certain actions may produce certain sounds. This is the cognitive basis for the fact that we can “hear” an action we only see, and “see” the action of a sound we can only hear. These ideas are presented and discussed more thoroughly in my PhD dissertation.

Examples of realtime motion capture

I went through a number of examples of how to use motion capture in musical contexts. Here are but a few of the examples:

Transformation is an improvisation piece for electric violin and live electronics. It is based on the idea of letting the performer control a large collection of sound fragments while moving around on stage. The technical setup for the piece is based on a video-based motion tracking system developed in Jamoma, coupled to the playback of sounds using concatenative synthesis in CataRT. Transformation is described more thoroughly in a paper in the upcoming issue of Computer Music Journal (winter 2012).

The SoundSaber motion capture instrument tracks the position of a rod in space using an infrared marker-based motion capture system. The setup is described in more detail in this NIME 2011 paper.

Dance Jockey is a setup/piece in which the Xsens MVN inertial motion capture suit is used to control both sample playback and synthesis using a combination of posture and action recognition of the full body of the performer. It is described in this NIME 2012 paper.

Technical resources

The Max patches used in the course are available here (Aldeburgh-patches.zip). Here are pointers to other useful things:

  • Maxobjects.com is a database with pointers to a number of third party externals for Max. It is a great resource to find whatever you need.
  • Jamoma is a large collection of modules and externals, including video analysis and mapping tools
  • CNMAT depot is a large collection of externals, tutorials, and example patches.

Wii

The Wii controllers are a great way of getting started with accelerometer data in Max. They are wireless (Bluetooth), and once you figure out how to pair them with your computer they work quite well. There are several ways of using them with Max, amongst others:

  • ajh.wiimote is an external for getting data from the Wii controllers into Max.
  • OSCulator is a multipurpose control and mapping tool working with different types of controllers, including the Wii. Even though passing OSC messages into Max from a separate application is a little more work, OSCulator is, in my experience, the most stable way of working with Wii controllers in Max.
  • Junxion is another multipurpose control and mapping tool, developed at STEIM. It works with many controllers and also does video tracking.
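If you are curious about what the OSC messages sent by a tool like OSCulator actually look like on the wire, here is a minimal Python sketch that decodes the raw bytes of a simple OSC message by hand. The `/wii/1/accel/xyz` address and the float values are hypothetical examples; the actual addresses depend on how you configure your OSC routing.

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message (address + int/float arguments) from raw bytes."""
    def read_padded_string(buf, offset):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        # OSC strings are null-terminated and padded to a 4-byte boundary
        offset = (end + 4) & ~3
        return s, offset

    address, offset = read_padded_string(data, 0)
    type_tags, offset = read_padded_string(data, offset)
    args = []
    for tag in type_tags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
        elif tag == "i":  # 32-bit big-endian integer
            (value,) = struct.unpack_from(">i", data, offset)
            args.append(value)
            offset += 4
    return address, args

# A hypothetical accelerometer message with three float arguments
msg = (b"/wii/1/accel/xyz\x00\x00\x00\x00"
       b",fff\x00\x00\x00\x00"
       + struct.pack(">fff", 0.1, 0.5, 0.9))
address, args = parse_osc_message(msg)
print(address, [round(a, 2) for a in args])  # -> /wii/1/accel/xyz [0.1, 0.5, 0.9]
```

In practice you would receive these bytes from a UDP socket; the point here is just to show that OSC is a simple, inspectable format, which makes debugging controller setups much easier.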

Phidgets

The Phidgets kits are an inexpensive, user-friendly and soldering-free way of getting started with musical electronics. The kits come with a sensor interface and a number of different types of sensors to test out.

  • The driver is necessary to get the data into your computer.
  • Then use the Phidgets Max externals to get data into Max.
  • Phidgets2MIDI is a small Max application developed for working more easily with data from the Phidgets.
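As a small illustration of the kind of mapping involved, here is a hedged Python sketch that scales a raw analogue sensor reading (Phidgets interface kits typically report values in the 0–1000 range) to the 0–127 range used by MIDI control change messages. The function name and default ranges are my own illustration, not part of the Phidgets API or Phidgets2MIDI.

```python
def sensor_to_midi(value, in_min=0, in_max=1000):
    """Linearly map a raw sensor reading to the 0-127 MIDI CC range.

    Out-of-range input is clamped so the result is always a valid CC value.
    """
    value = max(in_min, min(in_max, value))
    return round((value - in_min) * 127 / (in_max - in_min))

print(sensor_to_midi(0))     # -> 0
print(sensor_to_midi(1000))  # -> 127
print(sensor_to_midi(2000))  # clamped -> 127
```

The same scale-and-clamp pattern shows up everywhere in sensor-to-sound mapping, whether you do it in Max, in an external application, or on a microcontroller.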

Kinect

The MS Kinect controller is a great solution for getting started with full-body motion capture.

  • The jit.freenect object lets you grab the video image or depth image from the Kinect into Max. You can then use e.g. the Jamoma video modules or any other Jitter tools in Max.
  • Synapse is a standalone application for tracking a skeleton model. You need to set it up so that you can get OSC messages back into Max.
  • Descriptions of more advanced uses of the Kinect can be found at the fourMs wiki.
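Skeleton data from the Kinect tends to be jittery, so some smoothing usually helps before mapping joint positions to sound. One common approach is an exponential moving average, sketched here in Python; the class and parameter names are my own illustration, not part of Synapse or jit.freenect.

```python
class JointSmoother:
    """Exponential moving average for noisy 3D joint coordinates."""

    def __init__(self, alpha=0.2):
        # 0 < alpha <= 1: lower values give smoother but laggier output
        self.alpha = alpha
        self.state = None

    def update(self, xyz):
        """Feed one (x, y, z) reading and get back the smoothed position."""
        if self.state is None:
            self.state = list(xyz)
        else:
            self.state = [self.alpha * new + (1 - self.alpha) * old
                          for new, old in zip(xyz, self.state)]
        return self.state

smoother = JointSmoother(alpha=0.5)
print(smoother.update((0.0, 0.0, 0.0)))  # -> [0.0, 0.0, 0.0]
print(smoother.update((2.0, 2.0, 2.0)))  # -> [1.0, 1.0, 1.0]
```

The trade-off between responsiveness and stability (the alpha value) is worth tuning by ear for each piece: dance-like movements tolerate more lag than sound-producing gestures do.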