I am happy to announce that my book Sound Actions - Conceptualizing Musical Instruments is now published! I am also thrilled that this is an open access book, meaning that it is free to download and read. You are, of course, also welcome to pick up a paper copy!

Here is a quick video summary of the book’s content:

In the book, I combine perspectives from embodied music cognition and interactive music technology. The approach is what I call “embodied music technology”.

My main argument is that new music technologies — what I call musicking technologies — fundamentally change how music is performed and perceived.

Book cover

Four parts

The book is split into four parts, each containing three chapters.

PART I: MUSICKING

In the first part of the book, I present the Musicking Quadrant as a way to understand the different musical roles of instrument makers, composers, producers, performers, perceivers, and analysts.

Then I introduce the embodied music cognition framework and argue how this can be used to understand more about new musicking technologies.

Musical instruments are at the core of musical creation and experience, mediating between action and sound. Traditionally, they have been studied within an object-oriented organology. I call for an embodied organology centered on the interaction with sound-producing objects.

PART II: EMBODIMENT

The second part of my book explains some key terminology related to embodiment, including the differences between motion, action, and gesture. I also explain how the concept of "degrees-of-freedom" helps in understanding both the technical complexity and the cognitive load of various music technologies. This is at the heart of my techno-cognitive reasoning.

Then I move on to describe the differences between sound-producing, sound-facilitating, sound-accompanying, and communicative actions. The first three are closely connected to sound production, while the last is primarily meant for extra-musical communication.

Music-related body motion can be represented in many ways. I argue that traditional musical scores are based on “action notation”. The same is the case for the MIDI standard, which encodes notes as discrete pitch and velocity values. This differs from newer standards that allow continuous control of various sonic and musical features.
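To give a rough idea of the difference, here is a minimal sketch (not from the book) contrasting a single discrete MIDI note-on message with a stream of continuous pitch-bend messages; the byte values and the glide are invented for the example:

```python
# A discrete MIDI note-on: one event carrying pitch and velocity (0-127 each).
NOTE_ON = 0x90                          # status byte, channel 1
note_on = bytes([NOTE_ON, 60, 100])     # middle C at velocity 100

# Continuous control, by contrast, means many small messages per second.
# Here a stream of pitch-bend messages approximates the kind of continuous
# shaping of sound that newer protocols (e.g. MPE, OSC) make more natural.
PITCH_BEND = 0xE0

def pitch_bend(value: int) -> bytes:
    """Encode a 14-bit pitch-bend value (0-16383) as a 3-byte MIDI message."""
    lsb, msb = value & 0x7F, (value >> 7) & 0x7F
    return bytes([PITCH_BEND, lsb, msb])

# A slow upward glide: a stream of messages rather than one discrete note event.
glide = [pitch_bend(8192 + step * 64) for step in range(64)]
```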

PART III: INTERACTION

The third part of my book draws out the differences between action–sound couplings and action–sound mappings. I argue that an action–sound coupling is based on the interaction of physical objects. Such couplings can be explained through the laws of mechanics and acoustics. Acoustic musical instruments can be thought of as having varying action–sound separation: from embodied on one side to conceptual on the other.

A similar action–sound separation can be found for action–sound mappings. However, in such instruments, the mapping is designed and constructed using controllers, sound engines, and speakers. I argue that many new electroacoustic instruments may confuse both performers and perceivers because they violate basic principles of physical sound interaction.
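To give a rough idea of what such a designed mapping can look like, here is a minimal sketch with a freely invented mapping from a normalized controller value to the frequency and amplitude of a simple sine-tone sound engine; none of this code is from the book:

```python
import math

SAMPLE_RATE = 44100

def map_sensor_to_sound(sensor: float) -> tuple[float, float]:
    """Map a normalized controller reading (0.0-1.0) to frequency (Hz) and amplitude."""
    frequency = 220.0 + sensor * 660.0   # 220-880 Hz
    amplitude = 0.2 + sensor * 0.6       # louder with larger sensor values
    return frequency, amplitude

def synthesize(sensor: float, duration: float = 0.5) -> list[float]:
    """Render a short sine tone for the mapped parameters."""
    frequency, amplitude = map_sensor_to_sound(sensor)
    n = int(SAMPLE_RATE * duration)
    return [amplitude * math.sin(2 * math.pi * frequency * i / SAMPLE_RATE)
            for i in range(n)]

samples = synthesize(0.75)   # e.g. a hard press on a hypothetical controller
```

Unlike an acoustic coupling, nothing in the physics of the controller forces this particular relationship; the mapping could just as well have been inverted or replaced by something entirely different.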

The widespread use of sound amplification is an interesting case of how new technologies can alter our experience of spatiotemporality. Passing sound through cables makes it possible to reduce the spatial distance between performer and perceiver. However, the latency added by new networked music technologies also creates perceptual challenges.

PART IV: AFFECTION

In the fourth part of my book, I describe my journey from playing classical piano as a child to exploring various multidimensional keyboard controllers in my research.

I describe some of the unconventional instruments I have built over the years, including Cheapstick, Music Balls, and the Music Troll. The aim has been to break with the conventions of music technology design. I have wanted to develop soft and colourful musical instruments without square corners.

More recently, I have been fascinated by the possibilities of air instruments and how it is possible to break all the rules I have presented in the previous chapters. In the Sverm air instruments, I have explored inverse sonic interaction: you create sound by not moving. The Self-playing guitars are a meeting point between action-sound couplings and mappings: digitally produced sound resonating in physical guitar bodies.
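As an illustration of the idea of inverse sonic interaction (not the actual Sverm implementation), here is a small sketch in which a sound fades in only while an invented "quantity of motion" value stays below a threshold:

```python
def update_amplitude(amplitude: float, quantity_of_motion: float,
                     threshold: float = 0.05, rate: float = 0.01) -> float:
    """Fade the sound in while motion stays below a threshold, out otherwise."""
    if quantity_of_motion < threshold:
        return min(1.0, amplitude + rate)        # stillness lets the sound grow
    return max(0.0, amplitude - 5 * rate)        # movement silences it quickly

# Simulated control loop: the performer gradually comes to a standstill.
amplitude = 0.0
for motion in [0.4, 0.2, 0.1, 0.04, 0.02, 0.01, 0.01]:
    amplitude = update_amplitude(amplitude, motion)
```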

Postlude

My new book describes my journey to become an interdisciplinary scholar, combining theories and methods from the arts and humanities and the social and natural sciences with various design and engineering approaches.

New musicking technologies bring many exciting opportunities, but they also require attention to diversity, accessibility, and sustainability.

All in all, new instruments change how music is created, performed, perceived, and understood. Want to learn more? Read my book!

Table of Contents

  • Acknowledgments
  • Prelude
  • Part I: Musicking
    • 1: Music as an Active Process
    • 2: Music as an Embodied Process
    • 3: Musical Instruments
  • Part II: Embodiment
    • 4: Music-Related Body Motion
    • 5: Functional Aspects
    • 6: Representations of Sound Actions
  • Part III: Interaction
    • 7: Action–Sound Couplings
    • 8: Action–Sound Mappings
    • 9: Spatiotemporality
  • Part IV: Affection
    • 10: From Ivory to Silicone
    • 11: Unconventional Instruments
    • 12: Performing in the Air
  • Postlude
  • Bibliography
  • Index