What is a musical instrument?

A piano is an instrument. So is a violin. But what about the voice? Or a fork? Or a mobile phone? So what is (really) a musical instrument? That was the title of a short lecture I gave at UiO's Open Day today.

The 15-minute lecture is a quick tour of some of the concepts I have been working on for a new book project. In it, I present a model for understanding what a musical instrument is and how new technology changes how we make and experience music.

The original lecture was in Norwegian, but I got inspired and recorded an English version right afterwards:

If you prefer the original Norwegian version, here it is:

And if you want to learn more about these things, you can apply for one of our study programmes before 15 April: bachelor or master of musicology, or master of music, communication and technology.

Reflections on the roles of instrument builder, composer, and performer

One thing that has occurred to me over recent years is how the new international trend of developing music controllers and instruments, most notably seen at the annual NIME conferences, challenges many traditional roles in music. A traditional Western view has been that of a clear separation between instrument builder, composer, and performer. The idea is that the builder makes the instrument, the composer writes the score, the performer plays the score on the instrument, and the perceiver experiences the performance, as illustrated in the figure below.

Traditional chain in the musical ecosphere.

However, as we often see in the community surrounding the NIME conferences, many people take on all three of these roles themselves: they build their own instruments, compose the music, and perform it too. This new trend also challenges the traditionally separate concepts of instrument and composition. Using various types of neurophysiological, physiological, or biomechanical sensors, performers may themselves become part of the instrument. Similarly, the instrument may become part of the composition through various types of algorithmic processing. The perceivers may also become part of both the instrument and the composition in systems based on audience participation and collaborative performance. As such, the notion of the traditional concert is changing, since many "instruments" and "compositions" may be used as installations in which the perceivers take an active part. In this way perceivers are turned into performers, and the composers end up as perceivers of the performance.

I find this change of roles exciting, but it is also a challenge to traditional (music) institutions that are built around the very idea of separating these elements. So it is perhaps not too surprising that a lot of NIME activity is happening outside traditional music arenas. I don't have any empirical evidence for this, but my feeling is that there are more people developing, composing, and performing with NIMEs in computer science departments, architecture schools, fine art academies, or entirely outside of any institution than within music academies. It will be interesting to see whether this will change over the years, and whether we will see more interdisciplinary work within the musical ecosphere as well.