Below you will find pages that utilize the taxonomy term “music”
May 10, 2023
Visualization of Musique de Table
Musique de Table is a wonderful piece written by Thierry de Mey. I have seen it performed live several times, and recently came across a one-shot video recording that I thought would be interesting to analyse:
I tested it with some of the video visualization tools in the Musical Gestures Toolbox for Python.
For running the commands below, you first need to import the toolbox in Python:
import musicalgestures as mg
I started the process by importing the source video:
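For readers curious about what such motion visualizations compute under the hood, the core idea is frame differencing. Here is a minimal numpy sketch of the principle (an illustration of the general technique, not the toolbox's own implementation; the function names and array shapes are my own assumptions):

```python
import numpy as np

def motion_frames(frames):
    """Absolute difference between consecutive grayscale frames.

    frames: array of shape (T, H, W) with values in [0, 1].
    Returns an array of shape (T-1, H, W) where bright pixels
    indicate motion between frames.
    """
    return np.abs(np.diff(frames.astype(float), axis=0))

def motiongram(frames):
    """Collapse each motion frame over the horizontal axis,
    giving a (T-1, H) image that shows vertical motion over time."""
    return motion_frames(frames).mean(axis=2)
```

The toolbox wraps this kind of processing behind convenience methods on the imported video object, so in practice one rarely needs to write it by hand.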
December 13, 2022
New Book: Sound Actions - Conceptualizing Musical Instruments
I am happy to announce that my book Sound Actions - Conceptualizing Musical Instruments is now published! I am also thrilled that this is an open access book, meaning that it is free to download and read. You are, of course, also welcome to pick up a paper copy!
Here is a quick video summary of the book’s content:
In the book, I combine perspectives from embodied music cognition and interactive music technology.
January 7, 2022
New online course: Motion Capture
After two years in the making, I am happy to finally introduce our new online course: Motion Capture: The art of studying human activity.
The course will run on the FutureLearn platform and is for everyone interested in the art of studying human movement. It has been developed by a team of RITMO researchers in close collaboration with the pedagogical team and production staff at LINK – Centre for Learning, Innovation & Academic Development.
January 7, 2022
Try not to headbang challenge
I recently came across a video of the so-called Try not to headbang challenge, where the idea is to, well, not headbang while listening to music. This immediately caught my attention. After all, I have been researching music-related micromotion over the last few years and have run the Norwegian Championship of Standstill since 2012.
Here is an example of Nath & Johnny trying the challenge:
https://www.youtube.com/watch?v=-I4CBsDT37I
As seen in the video, they are doing ok, although they are far from sitting still.
November 27, 2021
New Book Chapter: Gestures in ensemble performance
I am happy to announce that Cagri Erdem and I have written a chapter titled "Gestures in ensemble performance" in the new book Together in Music: Coordination, Expression, Participation edited by Renee Timmers, Freya Bailes, and Helena Daffern.
Video Teaser
For the book launch, Cagri and I recorded a short video teaser:
https://youtu.be/Fd2kIAeorRk
Abstract
The more formal abstract is:
The topic of gesture has received growing attention among music researchers over recent decades.
November 19, 2021
Rigorous Empirical Evaluation of Sound and Music Computing Research
At the NordicSMC conference last week, I was part of a panel discussing the topic Rigorous Empirical Evaluation of SMC Research. This was the original description of the session:
The goal of this session is to share, discuss, and appraise the topic of evaluation in the context of SMC research and development. Evaluation is a cornerstone of every scientific research domain, but is a complex subject in our context due to the interdisciplinary nature of SMC coupled with the subjectivity involved in assessing creative endeavours.
October 26, 2021
MusicLab Copenhagen
After nearly three years of planning, we can finally welcome people to MusicLab Copenhagen. This is a unique "science concert" involving the Danish String Quartet, one of the world's leading classical ensembles. Tonight, they will perform pieces by Bach, Beethoven, Schnittke and folk music in a normal concert setting at Musikhuset in Copenhagen. However, the concert is anything but normal.
Live music research
During the concert, about twenty researchers from RITMO and partner institutions will conduct investigations and experiments informed by phenomenology, music psychology, complex systems analysis, and music technology.
September 22, 2021
Can AI replace humans?
Or, more specifically: can AI replace an artist? That is the question posed in a short documentary that I have contributed to for this year’s Research Days.
We were contacted before summer about trying to create a new song based on the catalogue of the Norwegian artist Ary. The idea was to use machine learning to generate the song. This has turned out to be an exciting project.
I was busy finishing the manuscript for my new book, so I wasn’t much involved in the development part myself.
July 1, 2021
Sound and Music Computing at the University of Oslo
This year’s Sound and Music Computing (SMC) Conference has opened for virtual lab tours. When we cannot travel to visit each other, this is a great way to showcase how things look and what we are working on.
Stefano Fasciani and I teamed up a couple of weeks ago to walk around some of the labs and studios at the Department of Musicology and RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion.
June 27, 2021
Running a hybrid conference
There are many ways to run conferences. Here is a summary of how we ran the Rhythm Production and Perception Workshop 2021 at RITMO this week. RPPW is called a workshop, but it is really a full-blown conference. Almost 200 participants enjoyed 100 talks and posters, 2 keynote speeches, and 3 music performances spread across 4 days.
A hybrid format
We started planning RPPW as an on-site event back in 2019.
June 17, 2021
New publication: NIME and the Environment
This week I presented the paper NIME and the Environment: Toward a More Sustainable NIME Practice at the International Conference on New Interfaces for Musical Expression (NIME) in Shanghai/online with Raul Masu, Adam Pultz Melbye, and John Sullivan. Below is our 3-minute video summary of the paper.
And here is the abstract:
This paper addresses environmental issues around NIME research and practice. We discuss the formulation of an environmental statement for the conference as well as the initiation of a NIME Eco Wiki containing information on environmental concerns related to the creation of new musical instruments.
April 26, 2021
Strings On-Line installation
We presented the installation Strings On-Line at NIME 2020. It was supposed to be a physical installation at the conference to be held in Birmingham, UK.
Due to the corona crisis, the conference went online, and we decided to redesign the proposed physical installation into an online installation instead. The installation ran continuously from 21-25 July last year, and hundreds of people “came by” to interact with it.
I finally got around to editing a short (1-minute) video promo of the installation:
March 11, 2021
What is a musical instrument?
A piano is an instrument. So is a violin. But what about the voice? Or a fork? Or a mobile phone? So what is (really) a musical instrument? That was the title of a short lecture I held at UiO’s Open Day today.
The 15-minute lecture is a very quick version of some of the concepts I have been working on for a new book project. Here I present a model for understanding what a musical instrument is and how new technology changes how we make and experience music.
February 4, 2021
Visualising a Bach prelude played on Boomwhackers
I came across a fantastic performance of a Bach prelude played on Boomwhackers by Les Objets Volants.
https://www.youtube.com/watch?v=Y5seI0eJZCg
It is really incredible how they manage to coordinate the sticks and make it into a beautiful performance. Given my interest in the visual aspects of music performance, I reached for the Musical Gestures Toolbox to create some video visualisations.
I started by creating an average image of the video:
This image is not particularly interesting.
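For reference, an average image is simply the per-pixel mean over all frames, which is why a performance with a constant background tends to yield a static-looking, blurry result. A minimal numpy sketch of the computation (the function name and array shapes are assumptions for illustration, not the toolbox's code):

```python
import numpy as np

def average_image(frames):
    """Per-pixel mean over all video frames.

    frames: array of shape (T, H, W) for grayscale video,
    or (T, H, W, 3) for colour.
    The static background dominates the result, while
    moving elements are smeared out.
    """
    return frames.astype(float).mean(axis=0)
```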
January 22, 2021
New run of Music Moves
I am happy to announce a new run (the 6th) of our free online course Music Moves: Why Does Music Make You Move?. Here is a 1-minute welcome video:
The course starts on Monday (25 January 2021) and will run for six weeks. In the course, you will learn about the psychology of music and movement, and how researchers study music-related movements.
We developed the course 5 years ago, but the content is still valid.
November 22, 2020
Music and AI
Last week I was interviewed about music and artificial intelligence (AI). This led to several different stories on radio, TV, and as text. The reason for the sudden media interest in this topic was a story by The Guardian on the use of deep learning for creating music. They featured an example of the creation of Sinatra-inspired music made using a deep learning algorithm:
After these stories were published, I was asked about participating in a talk-show on Friday evening.
April 22, 2020
New publication: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music
After several years of hard work, we are very happy to announce a new publication coming out of the MICRO project that I am leading: Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music (Frontiers in Psychology).
This is the first journal article of my PhD student Agata Zelechowska, and it reports on a standstill study conducted a couple of years ago. It is slightly different from the paradigm we have used for the Championships of Standstill.
March 22, 2020
Method chapter freely available
I am a big supporter of Open Access publishing, but for various reasons some of my publications are not openly available by default. This is the case for the chapter Methods for Studying Music-Related Body Motion that I have contributed to the Springer Handbook of Systematic Musicology.
I am very happy to announce that the embargo on the book ran out today, which means that a pre-print version of my chapter is finally freely available in UiO’s digital repository.
June 6, 2019
NIME publication and performance: Vrengt
My PhD student Cagri Erdem developed a performance together with dancer Katja Henriksen Schia. The piece was first performed together with Qichao Lan and myself during the RITMO opening and also during MusicLab vol. 3. See here for a teaser of the performance:
This week, Cagri, Katja and I performed a version of the piece Vrengt at NIME in Porto Alegre.
We also presented a paper describing the development of the instrument/piece:
August 7, 2018
New article: Correspondences Between Music and Involuntary Human Micromotion During Standstill
I am happy to announce a new journal article coming out of the MICRO project:
Victor E. Gonzalez-Sanchez, Agata Zelechowska and Alexander Refsum Jensenius
Correspondences Between Music and Involuntary Human Micromotion During Standstill
Front. Psychol., 07 August 2018 | https://doi.org/10.3389/fpsyg.2018.01382
Abstract: The relationships between human body motion and music have been the focus of several studies characterizing the correspondence between voluntary motion and various sound features. The study of involuntary movement to music, however, is still scarce.
March 12, 2018
Nordic Sound and Music Computing Network up and running
I am super excited about our new Nordic Sound and Music Computing Network, which has just started up with funding from the Nordic Research Council.
This network brings together a group of internationally leading sound and music computing researchers from institutions in five Nordic countries: Aalborg University, Aalto University, KTH Royal Institute of Technology, University of Iceland, and University of Oslo. The network covers the field of sound and music from the “soft” to the “hard,” including the arts and humanities, and the social and natural sciences, as well as engineering, and involves a high level of technological competency.
December 13, 2017
Come study with me! New master's programme: Music, Communication and Technology
It has been fairly quiet here on the blog recently. One reason for this is that I am spending quite some time setting up the new Music, Communication and Technology master's programme. This is an exciting collaborative project with our colleagues at NTNU. The whole thing is focused on network-based communication, and the students will use, learn about, develop and evaluate technologies for musical communication between the two campuses in Oslo and Trondheim.
December 13, 2017
Come work with me! Lots of new positions at University of Oslo
I recently mentioned that I have been busy setting up the new MCT master's programme. But I have been even busier preparing the startup of our new Centre of Excellence RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion. This is a large undertaking, and a collaboration between researchers from musicology, psychology and informatics. A visual "abstract" of the centre can be seen in the figure to the right.
October 9, 2017
And we're off: RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
I am happy to announce that RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion officially started last week. This is a new centre of excellence funded by the Research Council of Norway.
Even though we have formally taken off, this mainly means that the management group has started to work. Establishing a centre with 50-60 researchers is not done in a few days, so we will more or less spend the coming year getting up to speed.
September 11, 2017
Sverm-Resonans - Installation at Ultima Contemporary Music Festival
I am happy to announce the opening of our new interactive art installation at the Ultima Contemporary Music Festival 2017: Sverm-resonans.
Time and place: Sep. 12, 2017 12:30 PM - Sep. 14, 2017 3:30 PM, Sentralen
Conceptual information
The installation is as much haptic as audible.
An installation that gives you access to heightened sensations of stillness, sound and vibration.
Stand still. Listen. Locate the sound. Move. Stand still. Listen. Hear the tension.
July 20, 2017
SMC paper based on data from the first Norwegian Championship of Standstill
We have carried out three editions of the Norwegian Championship of Standstill over the years, but only with the new resources in the MICRO project have we finally been able to properly analyze all the data. The first publication coming out of the (growing) data set was published at SMC this year:
Reference: Jensenius, Alexander Refsum; Zelechowska, Agata & Gonzalez Sanchez, Victor Evaristo (2017). The Musical Influence on People’s Micromotion when Standing Still in Groups, In Tapio Lokki; Jukka Pa?
June 22, 2017
New Master's Programme: Music, Communication & Technology
We are happy to announce that "Music, Communication & Technology" will be the very first joint degree between NTNU and UiO, the two biggest universities in Norway. The programme is now approved by the UiO board and will soon be approved by the NTNU board.
www.uio.no/mct-master
www.ntnu.edu/studies/mct
This is a different Master's programme. Music is at the core, but the scope is larger. The students will be educated as technological humanists, with technical, reflective and aesthetic skills.
May 3, 2017
New publication: Sonic Microinteraction in the Air
I am happy to announce a new book chapter based on the artistic-scientific research in the Sverm and MICRO projects.
Citation: Jensenius, A. R. (2017). Sonic Microinteraction in "the Air." In M. Lesaffre, P.-J. Maes, & M. Leman (Eds.), The Routledge Companion to Embodied Music Interaction (pp. 431–439). New York: Routledge.
Abstract: This chapter looks at some of the principles involved in developing conceptual methods and technological systems concerning sonic microinteraction, a type of interaction with sounds that is generated by bodily motion at a very small scale.
March 16, 2017
New Centre of Excellence: RITMO
I am happy to announce that the Research Council of Norway has awarded funding to establish RITMO Centre of Excellence for Interdisciplinary Studies in Rhythm, Time and Motion. The centre is a collaboration between the Departments of Musicology, Psychology and Informatics at the University of Oslo.
Project summary
Rhythm is omnipresent in human life, as we walk, talk, dance and play; as we tell stories about our past; and as we predict the future.
March 10, 2017
New Book: A NIME Reader
I am happy to announce that Springer has now released a book that I have been co-editing with Michael J. Lyons: “A NIME Reader: Fifteen Years of New Interfaces for Musical Expression”. From the book cover:
What is a musical instrument? What are the musical instruments of the future? This anthology presents thirty papers selected from the fifteen-year history of the International Conference on New Interfaces for Musical Expression (NIME).
February 5, 2017
Music Moves on YouTube
We have been running our free online course Music Moves a couple of times on the FutureLearn platform. The course consists of a number of videos, as well as articles, quizzes, etc., all of which help create a great learning experience for the people who take part.
One great thing about the FutureLearn model (similar to Coursera, etc.) is that they focus on creating a complete course. There are many benefits to such a model, not least creating a virtual student group that interacts in a somewhat similar way to campus students.
February 3, 2017
Starting up the MICRO project
I am super excited about starting up my new project - MICRO - Human Bodily Micromotion in Music Perception and Interaction - these days. Here is a short trailer explaining the main points of the project:
Now I have also been able to recruit two great researchers to join me: postdoctoral researcher Victor Evaristo Gonzalez Sanchez and PhD fellow Agata Zelechowska. Together we will work on human micromotion, how music influences such micromotion, and how we can move towards microinteraction in digital musical instruments.
September 7, 2016
New SMC paper: Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music
My colleague Ragnhild Torvanger Solberg and I presented a paper at the Sound and Music Computing conference in Hamburg last week called: “Optical or Inertial? Evaluation of Two Motion Capture Systems for Studies of Dancing to Electronic Dance Music”.
This is a methodological paper, trying to summarize our experiences with using our Qualisys motion capture system for group dance studies. We have two other papers in the pipeline that describe the actual data from the experiments in question.
July 15, 2016
New paper: NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs
At NIME we have a large archive of the conference proceedings, but we do not (yet) have a proper repository for instrument designs. For that reason, I took part in a workshop on Monday with the aim of laying the groundwork for a new repository:
NIMEhub: Toward a Repository for Sharing and Archiving Instrument Designs [PDF]
This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs.
July 2, 2016
New paper: Exploring Sound-Motion Similarity in Musical Experience
New paper in Journal of New Music Research:
Exploring Sound-Motion Similarity in Musical Experience (fulltext)
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum
Abstract: People tend to perceive many and also salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening to music, and as has been documented by a number of studies in the past couple of decades.
March 13, 2016
New project Funding: MICRO!
I am happy to announce that I have received funding from the Norwegian Research Council's program Young Research Talents for the project: MICRO - Human Bodily Micromotion in Music Perception and Interaction. This is a 4-year project, and I will be looking for both a PhD and a postdoctoral fellow to join the team. The call will be out later this year, but please do not hesitate to contact me right away if you are interested.
January 24, 2016
New MOOC: Music Moves
Together with several colleagues, and with great practical and economic support from the University of Oslo, I am happy to announce that we will soon kick off our first free online course (a so-called MOOC) called Music Moves.
Music Moves: Why Does Music Make You Move? Learn about the psychology of music and movement, and how researchers study music-related movements, with this free online course.
[Go to course – starts 1 Feb](https://www.
June 2, 2015
New publication: Microinteraction in Music/Dance Performance
This week I am participating at the NIME conference (New Interfaces for Musical Expression), organised at Louisiana State University, Baton Rouge, LA. I am doing some administrative work as chair of the NIME steering committee, and I was happy to present a paper yesterday:
Title
Microinteraction in Music/Dance Performance
Abstract
This paper presents the scientific-artistic project Sverm, which has focused on the use of micromotion and microsound in artistic practice. Starting from standing still in silence, the artists involved have developed conceptual and experiential knowledge of microactions, microsounds and the possibilities of microinteracting with light and sound.
December 13, 2014
New publication: From experimental music technology to clinical tool
I have written a chapter called From experimental music technology to clinical tool in the newly published anthology Music, Health, Technology and Design, edited by Karette A. Stensæth from the Norwegian Academy of Music. Here is the summary of the book:
This anthology presents a compilation of articles that explore the many intersections of music, health, technology and design. The first and largest part of the book includes articles deriving from the multidisciplinary research project called RHYME (www.
November 5, 2014
My research on national TV
A couple of weeks ago, NRK, the Norwegian broadcasting company, screened a documentary about my research together with the physiotherapists at NTNU in the CIMA project. The short story is that we have developed the tools I first made for the Musical Gestures Toolbox during my PhD into a system with the ambition of detecting signs of cerebral palsy in infants.
The documentary was made for the science program Schrödingers Katt, and I am very happy that they spent so much time on developing the story, filming and editing.
May 1, 2014
New publication: How still is still? exploring human standstill for artistic applications
I am happy to announce a new publication titled How still is still? exploring human standstill for artistic applications (PDF of preprint), published in the International Journal of Arts and Technology. The paper is based on the Sverm project, and was written and accepted two years ago. Sometimes academic publishing takes absurdly long, which this is an example of, but I am happy that the publication is finally out in the wild.
July 15, 2013
Documentation of the NIME project at Norwegian Academy of Music
From 2007 to 2011 I had a part-time research position at the Norwegian Academy of Music in a project called New Instruments for Musical Exploration, and with the acronym NIME. This project was also the reason why I ended up organising the NIME conference in Oslo in 2011.
The NIME project focused on creating an environment for musical innovation at the Norwegian Academy of Music, through exploring the design of new physical and electronic instruments.
February 20, 2013
New PhD Thesis: Kristian Nymoen
I am happy to announce that fourMs researcher Kristian Nymoen has successfully defended his PhD dissertation, and that the dissertation is now available in the DUO archive. I have had the pleasure of co-supervising Kristian’s project, and also to work closely with him on several of the papers included in the dissertation (and a few others).
Reference
K. Nymoen. Methods and Technologies for Analysing Links Between Musical Sound and Body Motion.
February 14, 2013
New Master Thesis: Freestyle Dressage: an equipage riding to music
I am happy to announce that the dissertation of one of my master's students has just been made available in the DUO archive:
Catherine Støver: Freestyle Dressage: an equipage riding to music
Catherine wrote about the importance and influence of music in freestyle dressage. Most of my students work on more music technology-oriented topics, and I can clearly say that supervising Catherine was both fun and a great learning experience for myself.
January 17, 2013
NIME 2013 deadline approaching
Here is a little plug for the submission deadline for this year's NIME conference. I usually don't write so much about deadlines here, but as the current chair of the international steering committee for the conference series, I feel that I should do my share in helping to spread the word. The NIME conference is a great place to meet academics, designers, technologists, and artists, all working on creating weird instruments and music.
December 13, 2012
Performing with the Norwegian Noise Orchestra
Yesterday, I performed with the Norwegian Noise Orchestra at Betong in Oslo, at a concert organised by Dans for Voksne. The orchestra is an ad-hoc group of noisy improvisers, and I immediately felt at home. The performance lasted for 12 hours, from noon to midnight, and I performed for two hours in the afternoon.
For the performance I used my Soniperforma patch based on the sonifyer technique and the Jamoma module I developed a couple of years ago (jmod.
September 5, 2012
Teaching in Aldeburgh
I am currently in beautiful Aldeburgh, a small town on the east coast of England, teaching at the Britten-Pears Young Artist Programme together with Rolf Wallin and Tansy Davies. This post is mainly to summarise the things I have been going through, and provide links for various things.
Theoretical stuff
My introductory lectures went through some of the theory of an embodied understanding of the experience of music. One aspect of this theory that I find very relevant for the development of interactive works is what I call action-sound relationships.
August 16, 2012
Reflections on the roles of instrument builder, composer, performer
One thing that has occurred to me over recent years is how the new international trend of developing music controllers and instruments, as most notably seen at the annual NIME conferences, challenges many traditional roles in music. A traditional Western view has been that of a clear separation between instrument constructor, musician and composer. The idea has been that the constructor makes the instrument, the composer makes the score, the performer plays the score with the instrument, and the perceiver experiences the performance, as illustrated in the figure below.
February 3, 2012
Sonification of motiongrams
A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper.
See below for the full paper and video examples.
The Paper
Download paper (PDF 2MB)
Abstract: The paper presents a method for sonification of human body motion based on motiongrams.
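The sonification approach builds on the idea of reading an image as if it were a spectrogram. As a rough illustration of that general technique (a sketch only, not the paper's or the Jamoma module's implementation; the function name and parameters are my own), each row of a motiongram can be treated as the magnitude spectrum of one short audio frame and inverse-transformed:

```python
import numpy as np

def sonify_motiongram(motiongram, frame_len=512):
    """Treat each row of a motiongram as a magnitude spectrum
    and inverse-FFT it into a short block of audio samples.

    motiongram: array of shape (T, H) with non-negative values.
    Returns a 1-D signal of length T * frame_len.
    """
    T, H = motiongram.shape
    n_bins = frame_len // 2 + 1
    blocks = []
    for row in motiongram:
        # Stretch the row to the number of FFT bins needed.
        spectrum = np.interp(
            np.linspace(0, H - 1, n_bins), np.arange(H), row)
        # Random phase keeps the result noise-like, as is
        # common in simple image-to-sound mappings.
        phase = np.exp(2j * np.pi * np.random.rand(n_bins))
        blocks.append(np.fft.irfft(spectrum * phase, n=frame_len))
    return np.concatenate(blocks)
```

Rows with little motion produce near-silence, while rows with lots of motion produce broadband noise, which is what makes the mapping intuitively audible.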
March 28, 2011
Concert: Victoria Johnson
Together with Victoria Johnson I have developed the piece Transformation, a piece where we are using video analysis to control sound selection and spatialisation. We have been developing the setup and piece during the last couple of years, and performed variations of the piece at MIC, the Opera house and at the music academy last year.
The piece will be performed again today, Monday 28 March 2011 at 19:00 at the Norwegian Academy of Music.
October 25, 2010
Music is not only sound
After working with music-related movements for some years, and thereby arguing that movement is an integral part of music, I tend to react when people use “music” as a synonym for either “score” or “sound”.
I certainly agree that sound is an important part of music, and that scores (if they exist) are related to both musical sound and music in general. But I do not agree that music is sound.
August 6, 2009
Book manuscript ready
Over the last year I have been working on a textbook based on my dissertation. It started out as a translation of my dissertation into Norwegian, but I quickly realized that an educational text is much more useful. So in practice I have written a totally new book, although it draws on research from my dissertation. The title of the book is Musikk og bevegelse (Music and movement), and that is exactly what it is about.
June 6, 2008
Virtual slide guitar
Jyri Pakarinen just presented a paper on the Virtual Slide Guitar (VSG) here at NIME in Genova.
They used a commercial 6DOF head tracking solution from Naturalpoint called TrackIR 4 Pro. The manufacturer promises:
Experience real time 3D view control in video games and simulations just by moving your head! The only true 6DOF head tracking system of its kind. TrackIR takes your PC gaming to astonishing new levels of realism and immersion!
May 28, 2008
MT9 format
It seems like the new MT9 format, or Music 2.0 as the company Audizen calls it, is all over the news these days. The idea is simple and has been explored for years in the research community: distribute multichannel audio so that the end user has control over the single tracks. The problem, of course, is to make this into a standard, and I see many challenges in how this could be implemented:
May 8, 2008
OLPC Sound Samples
I am doing some “house-cleaning” on my computer, and came across the link to the OLPC Sound Samples which were announced last month. This collection covers a lot of different sounds, ranging from the Berklee samples to sets created by people in the CSound community. Obviously, not all the 10GB is equally interesting, but the initiative is excellent, and along with the Freesound project, it makes a great resource for various projects.
April 1, 2008
David Huron: Listening Styles and Listening Strategies
In a presentation at the Society for Music Theory 2002 Conference, David Huron proposed 21 listening modes: distracted, tangential, metaphysical, signal, sing-along, lyric, programmatic, allusive, reminiscent, identity, retentive, fault, feature, innovation, memory scan, directed, distance, ecstatic, emotional, kinesthetic, and performance listening. He concludes: "This list is not intended to be exhaustive.
February 15, 2008
Recordings in Casa Paganini
The location of the EyesWeb Week is the facilities of the DIST group in the beautiful Casa Paganini, including a large auditorium next to the laboratories. This allows for an ecological setting for experiments, since performers can actually perform on a real stage with a real audience. I wish we could have something like this in Oslo!
Here is a picture from an experimental setup where we are looking at the synchronisation between the musicians in a string trio.
February 14, 2008
Emotional music examples
The Peretz group has made available a set of musical excerpts with emotion ratings. Perhaps not the most exciting musical collection, but I think it is very important that the community starts building data sets that can be used as reference for various types of analyses.
We really need to create a set of music recordings including motion capture and video, but this first requires that we develop some common format that can be used for synchronisation and sharing.
February 14, 2008
Syncing Movement and Audio using a VST-plugin
I just heard Esteban Maestre from UPF present his project on creating a database of instrumental actions of bowed instruments, for use in the synthesis of score-based material. They have come up with a very interesting solution to the recording and synchronisation of audio with movement data: Building a VST plugin which implements recording of motion capture data from a Polhemus Liberty, together with bow sensing through an Arduino. This makes it possible to load the VST-plugin inside regular audio sequencing software and do the recording from there.
November 8, 2007
Musical vs. Music-related
Working on a book chapter, I am trying to clarify some terminology. Right now I am thinking about the differences between “musical” and “music-related” movements/actions/gestures. What is the difference? I find that it makes sense to think about whether the action is direct or indirect. In other words:
- Musical actions: actions involved in music making, e.g. performing an instrument (i.e. sound-producing actions).
- Music-related actions: actions that are the result of, or influenced by, music, e.
October 23, 2007
Music Performance Research
I heard about the initiative last year at Music & Gesture 2 in Manchester, and now I see that the new online journal Music Performance Research is actually up and running.
Music Performance Research is an international peer-reviewed journal that disseminates theoretical and empirical research on the performance of music. Its purpose is to disseminate research on the nature of music performance from both theoretical and empirical perspectives. The journal publishes contributions from all disciplines that are relevant to music performance, including archaeology, cultural studies, composition, computer science, education, ethnomusicology, history, medicine, music theory and analysis, musicology, philosophy, physics, psychology, neuroscience and sociology.
October 10, 2007
Debut of the NMH Laptop Orchestra
As part of the Ultima festival and the opening of this year’s Musikkteknologidagene, Kjell Tore Innervik and I organised the debut of the NMH Laptop Orchestra. Inspired by PLORK, we lined up with laptops and performed two pieces by Alan Tormey and Ge Wang. This was an immediate success, and we hope to establish this as a permanent ensemble from now.
September 28, 2007
Towards Active Music... (or not)
I am doing some background research for a paper on “active music” and have been testing various audio software over the last few days. I was very excited about testing GarageBand ’08, since Apple has been shouting loudly about its new “magic” features. I have to say that I had some expectations that we would actually see some novel features here, especially since they promise a “hand-picked” band on a virtual stage.
June 12, 2007
Keyframe
Henrik Marstrander will present his master thesis project tomorrow. This is an interesting visual table for controlling musical sound.
Details: Wednesday 13 June at 12:30, in the room on the left-hand side of the corridor on the way to Salen.
May 15, 2007
Journal of interdisciplinary music studies
There is a new music journal out titled Journal of interdisciplinary music studies, which seems to be freely available online. I was particularly pleased to read Richard Parncutt’s opening paper on the history and future of systematic musicology. While it has been overshadowed (and to some extent suppressed) by historical musicology for the last decade, there seems to be a growing interest in systematic musicology today.
However, as Parncutt argues, much of this research is carried out under other names and in other departments, e.
March 19, 2007
Active Music
Tod Machover’s article Shaping Minds Musically is an interesting read, summarising much of the work on hyperinstruments at the MIT Media Lab during the last ten years. The main point he is trying to make is that music should be active rather than passive. This comes from the observation that most people’s involvement with music is on the reception side rather than the production side.
There is more music than ever in the air, but fewer of us actually play music, sing music, or create our own music.
March 15, 2007
ISSSM 2007
Students in musicology, music cognition and technology should consider ISSSM 2007:
Following on the success of the first international summer school in systematic musicology (ISSSM 2006), the summer school will be held for the second time at IPEM, the research centre of the Department of Musicology of Ghent University (Belgium). This year courses will focus on current topics in the research field such as embodied music cognition, music information retrieval and music and interactive media.
March 8, 2007
Open Form Workshop
In between everything else I will be participating in the Open Form Workshop at the Music Academy this weekend. Christian Wolff, the last living member of the “New York composers”, is visiting Oslo and we will be working with him during the workshop.
I have only had time to participate in some of the rehearsals so far, and it is very interesting. The pieces range from very strict to very open, leaving most things up to the performers.
February 20, 2007
Recording Hoax
Craig Sapp (formerly at CCARH now at CHARM) writes:
I have been analyzing the performances of Chopin Mazurkas and have been noticing an unusual occurrence: the performances of the same two pianists always matched whenever I do an analysis for a particular mazurka. In fact, they matched as well as two different re-releases of the same original recording.
The full story about how the tracks have been slightly time-stretched, panned and EQed before being re-released is covered in a recent story in Gramophone.
February 17, 2007
Bob Ludwig on Surround Mixing
I went to a talk on surround mixing (5.1) last night by Bob Ludwig of Gateway Mastering. He spent a lot of time talking about gear and the technicalities of mastering, and also discussed the different stages in mastering for various formats (SACD, DVD-Audio, etc.). An interesting thing he commented on is the fact that when Dolby Digital is downmixed to stereo in consumer gear, the LFE channel is left out. So he advised to use the LFE (.
February 17, 2007
Movement, action, gesture
Ever since I started my PhD project I have been struggling with the word gesture. Now as I am working on a theory chapter for my dissertation, I have had to really try and decide on some terminology, and this is my current approach:
I use movement as the general term to describe the act of changing physical position of body parts related to music performance or perception. Action is used to denote goal-directed movements that form a separate unit.
February 17, 2007
Trond Lossius' fellowship report
I spent my flight to Montreal (which became much longer than I expected when I was rescheduled through Chicago) reading Trond Lossius’ report for the Fellowship in the arts program. He addresses a number of interesting topics:
Commenting on the necessity for carrying out research for instead of on art, he discusses the concept of “art as code”:
It is not only a question of developing tools. [..] Programming code becomes a meta-medium, and creating the program is creating the art work.
February 12, 2007
Brad Garton
I came across Brad Garton’s blog via Tim. It starts:
Last week I was diagnosed with multiple myeloma, a fairly bad cancer of the bone marrow. The good news is that I am relatively young to be diagnosed with this disease and it seems that it was detected early. The bad news is that, well, it’s a ‘bad’ cancer to have. I think I’m about to embark on yet another life adventure.
February 8, 2007
MSc in Music Tech at Georgia Tech
Georgia Tech has been hiring a young and interesting music tech faculty over the last few years, and now they are starting a Master of Science program in music tech with a focus on the design and development of novel enabling music technologies. This is yet another truly interdisciplinary music tech program to appear over the last couple of years, accepting students from a number of different backgrounds, including music, computing and engineering.
February 8, 2007
Windows Vista soundscape
I wrote this blog entry several months ago, but never posted it because I thought I would have time to go back and evaluate the sounds more. Since I don’t see that happening any time before I finish my dissertation, I will just go ahead and post it now:
Microsoft has posted some info and examples of the Vista soundscape. The sounds are designed by Robert Fripp and will, before long, be some of the most well-known sounds on the planet.
January 12, 2007
Vibrating Plates
Derek Kverno and Jim Nolen have studied the vibration of circular, square and rectangular plates with unbound edges, and have posted some very nice images of radiation patterns of vibrating plates.
January 11, 2007
Music for One Apartment and Six Drummers
A charming little Swedish Stomp-inspired video:
January 5, 2007
Visual Acoustics
Christian Frisson pointed me to Visual Acoustics, a wonderful little web-based music improvisation tool. Very simple and elegant.
December 31, 2006
Noise
If you ever wanted some nice, pink noise in the background while working on your computer, Noise is the tool! Apparently, lots of people use this to try to shut out more distracting sounds. While I would prefer a program doing noise-cancelling (which would probably be tricky using the built-in microphone, since it would also pick up your own sounds while typing on the keyboard), this actually works ok.
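As an aside, pink noise is easy to approximate yourself. Here is a minimal Python sketch of the Voss-McCartney algorithm, a standard trick (not necessarily how the Noise app does it): sum a handful of random sources, each refreshed half as often as the previous one, which gives a roughly 1/f spectrum.

```python
import random

def voss_pink_noise(n_samples, n_sources=8):
    """Approximate pink noise with the Voss-McCartney algorithm:
    sum several random sources, each updated half as often as the last."""
    sources = [random.uniform(-1, 1) for _ in range(n_sources)]
    out = []
    for i in range(n_samples):
        for j in range(n_sources):
            # source j is refreshed every 2**j samples
            if i % (2 ** j) == 0:
                sources[j] = random.uniform(-1, 1)
        # normalize so the sum stays within [-1, 1]
        out.append(sum(sources) / n_sources)
    return out

samples = voss_pink_noise(4096)
```

Writing `samples` to a sound file (for example with the standard `wave` module) gives a usable background-noise loop.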
December 20, 2006
Linear presentations
I have been thinking about what I wrote about improvisation a couple of weeks ago. While preparing for a presentation last week, I was thinking about how linear my presentation software (Apple’s Keynote) is. It is as bad as PowerPoint when it comes to locking you into a linear presentation style. This is fine if you have a clear idea of what you would like to say and which order you want to say things in, but I often find that I have several sections that could be organized differently depending on the audience, the time constraints, etc.
December 20, 2006
Movement-Sound Couplings
I am working on the theory chapter of my dissertation, and am trying to pin down some terminology. For a long time I have been using the concept of gesture-sound relationships to denote the intimate links between a physical movement and the resultant sound. However, since I am throwing away gesture for now, I also need to reconsider the rest of my vocabulary.
Hodgins (2004) uses the term music-movement structural correspondences, which I find problematic since it places music first.
December 8, 2006
Music troll performance
I performed with the music troll yesterday. It has been resting in the lab for a couple of months and was a bit “rusty” to start up. What caused the biggest headache was getting my performance patches to work on my new MacBook. Last month I found that PeRColate was released as UB, but I hadn’t tested them. First I had problems making Max find them, which seemed to be because the source folder was still in the path (thanks to mzed for the tip).
December 6, 2006
On Improvisation
Yesterday, someone commented that improvisation is all about being able to play some random stuff, in realtime. My experience is really the opposite. Learning to improvise on a musical instrument is really all about learning scales, phrases, motifs, and getting experienced in putting them together in a structured way. In realtime.
The same is true for improvised presentations and speeches. After holding a number of presentations on my research lately, I have been thinking about how similar the preparation process for a presentation is to a music performance.
November 16, 2006
M-AUDIO - MidAir
M-Audio has released MidAir a wireless MIDI transmitter and receiver system.
{width=“460” height=“250”}
The system is also able to synchronize between several performers.
I just wish that some of these large companies would start to use OSC one day…
November 3, 2006
Tapestrea
TAPESTREA (or taps) is a unified framework for interactively analyzing, transforming and synthesizing complex sounds. Given one or more recordings, it provides well-defined means to:
- identify points of interest in the sound and extract them into reusable templates
- transform sound components independently of the background and/or other events
- continually resynthesize the background texture in a perceptually convincing manner
- controllably place event templates over backgrounds, using a novel graphical user interface and/or scripts written in the ChucK audio programming language
- leverage similarity based retrieval to locate other interesting sound components

Taps provides a new way to completely transform a sound scene, dynamically generate soundscapes of unlimited length, and compose and design sound by combining elements from different recordings.
November 2, 2006
Audacity 1.3
There’s a new beta of Audacity 1.3 out. Previous versions have been somewhat unstable and lacking features, but now it starts to improve:
- New selection bar and improved selection tools
- Dockable toolbars
- New “Repair” effect, other improved effects
- Auto-save and automatic crash recovery
October 16, 2006
NoMuTe 2006
Just back from the 1st Nordic Music Technology Conference, organized by NTNU in connection with Trondheim MatchMaking, organized by TEKS. This is the follow-up to Musikkteknologidagene, which I organized in Oslo last year as an attempt to gather people working within the field.
Ola Nordahl has posted some nice pictures from the Opening day, where Paul Lansky held a great keynote about his compositions (check out his music page for examples of his work).
September 29, 2006
Norwegian Science Fair
Last weekend we participated (again) with a stand at a big science fair down in the city centre of Oslo during the Norwegian Research Days.
The most interesting thing, and also what I have spent the most time on lately, was a “music troll” I have been making together with Einar Sneve Martinussen and Arve Voldsund. The troll is basically a box with four speakers on the sides and four arms sticking out, with heads containing sensors.
August 22, 2006
Soundflower
Soundflower from Cycling ‘74, a small freeware utility allowing internal audio routing under OS X, is available as a Universal Binary for MacTel computers. Soundflower is similar to Jack, and while the latter has some more advanced features, I find Soundflower easier to use. They are both perfect for recording, for example, streaming audio.
August 18, 2006
Lasse - Hyperactive
Lasse - Hyperactive is a very simple and low-cost videomusic production, but also very powerful and funny.
August 2, 2006
Unhappy Hour
I found (via Trond’s blog) the funny story Unhappy Hour about a group of people getting stuck with a jukebox playing Brian Eno’s Thursday Afternoon. I bought the DVD not too long ago, and it has become one of my favourites.
Eno writes in the liner notes: These pieces represent a response to what is presently the most interesting challenge of video: how does one make something that can be seen again and again in the way that a record can be listened to repeatedly?
July 17, 2006
New book: New Digital Musical Instruments: Control and Interaction Beyond the Keyboard
Eduardo Miranda and Marcelo M. Wanderley have just released a new book called New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. The chapters are:
- Musical Gestures: Acquisition and Mapping
- Gestural Controllers
- Sensors and Sensor-to-Computer Interfaces
- Biosignal Interfaces
- Toward Intelligent Musical Instruments

So far most publications in this field have been in conference proceedings, so it is great to have a book that can be used in teaching.
July 15, 2006
Electromyography
For some experiments we are conducting on piano playing I have been looking for a way of measuring muscle activity, or electromyography as it is more properly called:
Electromyography (EMG) is a medical technique for evaluating and recording physiologic properties of muscles at rest and while contracting. EMG is performed using an instrument called an electromyograph to produce a record called an electromyogram. An electromyograph detects the electrical potential generated by muscle cells when these cells contract, and also when the cells are at rest.
July 11, 2006
BEAM Foundation
In a discussion of “the most complex Max patch”, Barry Threw pointed to the patch used by TrioMetrik, the ensemble of the BEAM Foundation. There’s also a video with shots of musicians and patches.
July 11, 2006
Reactive Sound System
The Reactive Sound System adds sounds to the current soundscape, either to mask for example speech, or to make unpleasant sounds more pleasant. They have also developed an acoustic curtain with a microphone and flat speakers which can work with the system.
July 9, 2006
Reverse Engineering Autechre with Max/MSP and Reaktor
Came across a web page with reverse engineered Autechre Max/MSP and Reaktor patches. Interesting.
June 22, 2006
NIME 06 Installations
Still trying to get through all my notes from Resonances… Of the many installations at NIME 06, I found three of them particularly interesting:
Musical Loom by Kingsley Ng was based around an old loom standing in a dark room (or rather a “tent” built between the entrances to the toilets…). It was possible to “play” the loom, and sounds and images would appear. The technical setup was built with a combination of infrared cameras and ultrasound sensors, using EyesWeb for control.
June 21, 2006
Interaction Design
We have started a collaboration between UiO and AHO, and some of the music technology students followed courses with the interaction designers at AHO this spring semester. This was a great success, and I was impressed with what came out of it.
Henrik Marstrander has worked on a table interface where he can control various musical parameters, and Jon Olav Eikenes and Marie Wennesland have made a multi-touch interface modelled after Jeff Han’s.
June 8, 2006
NIME 06 Concerts
There were lots of concerts at NIME 06, and many interesting things to comment about:
- Ben Neill played Mutantrumpet, a hybrid acoustic and electronic instrument, which was very interesting.
- Circumference Cycles by Chris Strollo and Tina Blaine was very captivating. Glass plates suspended by metal wires, with amplification and some effects, sounded great!
- Mari Kimura’s two pieces (Polytopia and Tricot) were great and showed how well electronics can be used together with an acoustic instrument (violin).
May 29, 2006
United States Patent Application: 0060107822
Apple has recently filed an interesting US Patent Application:
The invention generally pertains to a hand-held computing device. More particularly, the invention pertains to a computing device that is capable of controlling the speed of the music so as to affect the mood and behavior of the user during an activity such as exercise. By way of example, the speed of the music can be controlled to match the pace of the activity (synching the speed of the music to the activity of the user) or alternatively it can be controlled to drive the pace of the activity (increasing or decreasing the speed of the music to encourage a greater or lower pace).
May 27, 2006
Deep Listening Institute, Ltd.
Doug pointed me to Deep Listening:
Deep Listening is a philosophy and practice developed by Pauline Oliveros that distinguishes the difference between the involuntary nature of hearing and the voluntary selective nature of listening. The result of the practice cultivates appreciation of sounds on a heightened level, expanding the potential for connection and interaction with one’s environment, technology and performance with others in music and related arts.
May 22, 2006
Political Eurovision Song Contest
The Eurovision Song Contest (or Melodi Grand Prix, as it is often called) is a bizarre annual music competition broadcast across the whole of Europe. The music is rarely in focus, and most people tend to “love to hate” the concept. However, for many of the newer countries in Europe, the contest is important for showing their own existence and bonding with their allies, as this Norwegian commentator writes.
May 21, 2006
KORE Universal Sound Platform
Native Instruments states that KORE should be the new universal sound platform solving “all problems” in large music software setups. Basically, it works as a generic host for plugins (VST and AU) that can be used in sequencers, and it comes with a hardware controller to facilitate the control.
The argumentation is convincing and the pictures nice, but it seems like this “new” product only scratches the surface of the real problem.
May 21, 2006
USB Guitar
Seems like everything is getting USB connectivity these days. The Samson condenser microphone has been out for a while, and now Behringer is releasing the iAXE393 USB guitar, the Ultimate Electric Guitar with Built-In USB Port to Connect Straight to Your Computer: Jam and Record with Killer Modeling Amps and Stomp Boxes. It seems like it only outputs digital audio, though. It would have been interesting if it had had a built-in audio-to-MIDI (or, even better, audio-to-OSC) converter.
May 20, 2006
Sonic Visualiser
Sonic Visualiser from Queen Mary’s is yet another software tool for visualizing audio content. However, there are some features that stand out:
- Cross-platform: available for OS X, Linux, Windows
- GPL’ed
- Native support for aiff, wav, mp3 and ogg (but what about AAC?)
- Annotations: support for adding labelled time points and defining segments, point values and curves. The annotations can be overlaid on top of waveforms and spectrograms
- Time-stretch

Vamp Plugins are at the core of the Sonic Visualiser, and it seems like they want this to become a standard for non-realtime audio plugins.
May 19, 2006
int.lib by Oli Larkin
int.lib is a set of abstractions/javascripts for Cycling 74’s Max MSP software that facilitates the control of multiple parameters by navigating a two dimensional visual environment. It implements a gravitational system, allowing the user to represent presets with variable sized balls. As the user moves around the space, the size of the balls and their proximity to the mouse cursor affects the weight of each preset in the interpolated output. int.
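The gravitational idea can be approximated with inverse-distance weighting: each preset pulls on the output with a force proportional to its size and inversely proportional to its squared distance from the cursor. A hypothetical Python sketch of that principle (not Oli Larkin’s actual Max/JavaScript code; the preset data is made up):

```python
def interpolate_presets(cursor, presets):
    """Weight each preset by size / squared distance to the cursor and
    return the weighted mix of its parameter values."""
    cx, cy = cursor
    weights = []
    for p in presets:
        dx, dy = p["pos"][0] - cx, p["pos"][1] - cy
        d2 = dx * dx + dy * dy
        if d2 == 0:  # cursor exactly on a preset: use it alone
            return dict(p["params"])
        weights.append(p["size"] / d2)
    total = sum(weights)
    keys = presets[0]["params"].keys()
    return {k: sum(w * p["params"][k] for w, p in zip(weights, presets)) / total
            for k in keys}

# Two hypothetical presets with a single "cutoff" parameter
presets = [
    {"pos": (0.0, 0.0), "size": 1.0, "params": {"cutoff": 200.0}},
    {"pos": (1.0, 0.0), "size": 1.0, "params": {"cutoff": 2000.0}},
]
mid = interpolate_presets((0.5, 0.0), presets)  # halfway: equal weights
```

Making one ball bigger shifts the interpolated output towards that preset everywhere in the space, which matches the “variable sized balls” description above.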
May 15, 2006
Laser Sound Performance
A memorable show during the Elektrafestival was the Laser Sound Performance by Edwin van der Heide. He used two lasers and (I think) motorized mirrors and filters to create laser patterns on the wall and in the smoke filling the space. The sound was mostly sine tones, sawtooths and various types of noise at an extremely loud level (even with ear plugs). Not really sure how he did it, but there was a really tight synch between the movement of the lasers and the sounds.
May 13, 2006
Marnix de Nijs, media artist
The installation Spatial Sounds (100dB at 100km/h) by Marnix de Nijs and Edwin van der Heide was set up at Usine-C during the Elektrafestival.
A speaker is mounted on a metallic arm, rotating around at different speeds depending on the people in the room. Ultrasonic sensors detect the distance to people in the space and change the sound being played as well as the speed of rotation (more technical info here).
May 11, 2006
Why do they play so loud?
I often go to concerts, and too often I find the need to use ear plugs because of loud sound levels. I really don’t get it: why is it necessary to play so loud all the time? Usually lots of people around me agree that the music is unpleasantly loud, and I often see other people using ear plugs.
I have bought expensive ear plugs a couple of times, but I always tended to forget them (eventually losing them…), so now I have just bought lots of really cheap ones so that I can have a pair in every pocket.
May 9, 2006
Frank A. Russo
Came across the web page of Frank A. Russo, and found a very interesting paper on Hearing Aids and Music discussing the auditory design of hearing aids:
Whether the hearing aid wearer is a musician or merely someone who likes to listen to music, the electronic and electro-acoustic parameters described can be optimized for music as well as for speech. That is, a hearing aid optimally set for music can be optimally set for speech, even though the converse is not necessarily true.
April 28, 2006
Live images on björk's MEDÚLLA web page
There are some simple gif animations that start playing when you hover over some of the images on björk’s MEDÚLLA web page. Nowadays, with lots of flash graphics everywhere, you rarely see such low-quality gifs anymore. However, for some reason I really found these small gifs appealing. They remind me of David Crawford’s Stop Motion Studies.
April 27, 2006
Sidney Fels lecture
Just went to a lecture by Sidney Fels from the Human Communication Technologies lab and MAGIC at the University of British Columbia (interestingly enough located in the Forest Sciences Centre…). He was talking on the topic of intimate control of musical instruments, and presented some different projects:
- GloveTalkII: “a system that translates hand gestures to speech through an adaptive interface.”
- Iamascope: a kaleidoscope-like thing, where users would see themselves on a big screen, as well as controlling a simple sound synthesis.
April 26, 2006
MIDI network on OS X
In a discussion on using OSC to communicate over networks, Darryl just mentioned that OS X (apparently starting from Tiger) has the possibility to send MIDI messages over the network. I wonder how I have managed to overlook this feature, since it is sitting there as an option right in the Audio MIDI setup. The help file reads:
You can use the MIDI network driver to send and receive MIDI information between computers over a network.
April 25, 2006
OSC - MIDI address space
My post over at the Open Sound Control forum:
I guess we are all trying to get rid of MIDI, but as long as we have tons of gear around, it would be good to have a generic way of describing MIDI information in OSC. Perhaps I am missing something obvious, but I have looked around and haven’t found any suggestions for a full implementation of MIDI messages as an OSC address space.
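To make the idea concrete, here is one way MIDI channel voice messages could be laid out in an OSC address space, sketched in Python. The scheme (`/midi/<channel>/<message>`) is just my own suggestion for illustration, not an agreed standard:

```python
def midi_to_osc(status, data1, data2):
    """Map a raw three-byte MIDI message to a hypothetical OSC address + arguments."""
    kind = status & 0xF0          # upper nibble: message type
    channel = (status & 0x0F) + 1  # lower nibble: channel (1-16)
    names = {0x80: "noteoff", 0x90: "noteon", 0xB0: "cc", 0xE0: "pitchbend"}
    if kind not in names:
        # fall back to passing the raw bytes through
        return ("/midi/raw", [status, data1, data2])
    if kind == 0xE0:
        # 14-bit pitch bend: combine the two 7-bit data bytes
        return (f"/midi/{channel}/pitchbend", [(data2 << 7) | data1])
    return (f"/midi/{channel}/{names[kind]}", [data1, data2])

addr, args = midi_to_osc(0x90, 60, 100)  # note-on, channel 1, middle C
```

A full address space would of course also need program change, aftertouch, system messages and so on; the point is only that the mapping is mechanical once a scheme is agreed on.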
April 24, 2006
Turntable-Controlled Vibrating Chaise Longue
Daito Manabe has developed a Turntable-Controlled Vibrating Chaise Longue where it is possible to feel 34 sounds played back through a vibrating chaise longue. Lots of pictures of the making process are available on Daito’s web page under works/chair the difference.
April 24, 2006
Visual Scratch
Jesse Kriss has developed Visual Scratch, a realtime visualization of scratch DJ performance, built using Processing, Max/MSP, Ms. Pinky, and MaxLink.
April 23, 2006
Art of Cobra
I went to see the McGill improv ensemble perform Art of Cobra by John Zorn. Usually, I find it more interesting to play free improvisation than listening to it, but this time it was quite entertaining.
Cobra is a rule game, explained by Zorn as “I’m going to hold up some cards and they’re going to play something.” The prompter holds up a cue card, points at the performers, and then they play something.
April 23, 2006
WFS in electronic music
Today I went to a guest lecture by Marije Baalman on Wave Field Synthesis (a spatial sound reproduction principle based on the Huygens principle) over at Concordia. I heard a demonstration of WFS at IRCAM a couple of years back, and it was good to (finally) get a good theoretical introduction to the field.
They are usually testing it with 24 speakers, but they are now going to make a permanent 900 speaker setup at the Technical University in Berlin for creating a surround WFS setup.
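The Huygens principle behind WFS is easy to sketch in its crudest form: each speaker in the array re-emits the virtual source’s signal with a delay and attenuation given by its distance to the source. The toy calculation below ignores the actual WFS driving function (which also involves filtering and amplitude corrections), so treat it as an illustration only; the geometry is invented:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def wfs_delays_gains(source, speakers):
    """Per-speaker delay (seconds) and 1/r gain for a virtual point source
    behind a speaker array (toy model of the Huygens principle)."""
    out = []
    for sx, sy in speakers:
        r = math.hypot(sx - source[0], sy - source[1])
        out.append((r / SPEED_OF_SOUND, 1.0 / max(r, 1e-6)))
    return out

# A line of 24 speakers along the x-axis, 20 cm apart,
# with a virtual source 2 m behind the array
speakers = [(x * 0.2, 0.0) for x in range(24)]
params = wfs_delays_gains((2.0, -2.0), speakers)
```

The speaker closest to the virtual source fires first and loudest; the curvature of the delays across the array is what reconstructs the source’s wavefront in the listening area.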
April 23, 2006
Yves Guiard and bimanual action
Yves Guiard should have held a lecture at McGill last week, but unfortunately could not make it. Reading on his web page and looking up some of the references, I found some interesting comments about bimanual control. He writes:
During the nineteen eighties, I spent a lot of time trying to understand the logic of division of labour between the left and the right hands in human movements. I came to believe there is something deeply misleading to the concept of hand dominance, central to established thinking in the field of human laterality.
April 22, 2006
Palindrome
Found some interesting dance/performance examples at the web site of German/American performance company Palindrome. They are also developing the EyeCon video software for interactive performance.
April 21, 2006
LibriVox
LibriVox is a voluntary project set up to record all books in the public domain and make them available, for free, in audio format on the internet. Besides the joy of having audio books, this is also very interesting from a speech/voice research perspective.
Another source for open-source text files is the French Incipit blog. Interestingly enough, I found a French version of Nicholas Cook’s introduction to music!
April 20, 2006
Ball State University Interactive Wireless Sculpture
Ball State University Interactive Wireless Sculpture is an outdoor interactive digital installation interpreting the wireless data infrastructure at Ball State University. Beginning the evening of April 18 and running through April 19, this digital media sculpture, consisting of 4 projection screens, computers, speakers and lights, will broadcast interactive media that reacts to the amount of traffic on the campus’ 15 wireless zones. The sculpture will contain its own wireless access points, sensing local interactions of viewers using wireless devices.
April 19, 2006
monome
The monome 40h is a reconfigurable grid of sixty-four backlit buttons, connecting with USB and communicating both MIDI and OSC (Create Digital Music Review).
April 19, 2006
Sounds Like Bach
Douglas Hofstadter is discussing music and artificial intelligence:
Back when I was young – when I wrote “Gödel, Escher, Bach” – I asked myself the question “Will a computer program ever write beautiful music?”, and then proceeded to speculate as follows: “There will be no new kinds of beauty turned up for a long time by computer music-composing programs… To think – and I have heard this suggested – that we might soon be able to command a preprogrammed mass-produced mail-order twenty-dollar desk-model ‘music box’ to bring forth from its sterile circuitry pieces which Chopin or Bach might have written had they lived longer is a grotesque and shameful misestimation of the depth of the human spirit.
April 19, 2006
Why Is That Thing Beeping? A Sound Design Primer
Came across a nice introduction to sound design by Max Lord: Why Is That Thing Beeping? A Sound Design Primer - Boxes and Arrows
April 5, 2006
Theater Max
There seems to be a lot of initiatives for making “higher-level” abstractions for working in Max/MSP these days. Now, I just came across a project at UCLA intended mainly for theater productions:
Theater Max is the result of several years of work, lots of trial and error, and far too many hours of programming for us to count. What we now call Theater Max got its start in 2001 with a production of Eugene Ionesco’s Macbett.
April 2, 2006
SPEAR
SPEAR is an application for audio analysis, editing and synthesis. The analysis procedure (which is based on the traditional McAulay-Quatieri technique) attempts to represent a sound with many individual sinusoidal tracks (partials), each corresponding to a single sinusoidal wave with time varying frequency and amplitude.
It offers some great features, and I particularly like the possibility to easily select single partials and edit them directly. Most controls also work in realtime.
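The resynthesis side of such a sinusoidal model is plain additive synthesis: sum one sine per partial. A simplified sketch (SPEAR’s partials have time-varying breakpoints; for brevity this keeps each frequency and amplitude constant):

```python
import math

def synth_partials(partials, duration, sr=44100):
    """Additive synthesis: sum sinusoids with fixed frequency (Hz)
    and amplitude. Returns a list of samples."""
    n = int(duration * sr)
    return [
        sum(a * math.sin(2 * math.pi * f * t / sr) for f, a in partials)
        for t in range(n)
    ]

# A rough odd-harmonic spectrum on 220 Hz, as an example partial list
tone = synth_partials([(220, 0.5), (660, 0.25), (1100, 0.12)], duration=0.1)
```

Editing a single partial in SPEAR corresponds to changing one `(f, a)` track here before resynthesis, which is why selecting and tweaking individual partials works so directly.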
April 2, 2006
VLDCMCaR
Bob L. Sturm at UC Santa Barbara:
VLDCMCaR (pronounced vldcmcar) is a MATLAB application for exploring concatenative audio synthesis using six independent matching criteria. The entire application is encompassed in a graphical user interface (GUI). Using this program a sound or composition can be concatenatively synthesized using audio segments from a corpus database of any size. Mahler can be synthesized using hours of Lawrence Welk; howling monkeys can approximate President Bush’s speech; and a Schoenberg string quartet can be remixed using Anthony Braxton playing alto saxophone.
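The core idea of concatenative synthesis can be sketched in Python (my own simplified illustration, not Sturm’s MATLAB code): each segment of a target sound is replaced by the closest-matching segment from a corpus, here using plain Euclidean distance on raw samples as the single matching criterion, where VLDCMCaR offers six.

```python
import numpy as np

def concatenative_resynthesis(target, corpus, seg_len):
    """Rebuild `target` segment by segment, each time picking the
    corpus segment with the smallest Euclidean distance to it."""
    usable = len(corpus) // seg_len * seg_len
    segments = corpus[:usable].reshape(-1, seg_len)
    out = np.empty(len(target) // seg_len * seg_len)
    for i in range(len(out) // seg_len):
        t = target[i * seg_len:(i + 1) * seg_len]
        best = np.argmin(np.linalg.norm(segments - t, axis=1))
        out[i * seg_len:(i + 1) * seg_len] = segments[best]
    return out

# Sanity check: if the corpus happens to contain the target, the match is exact.
rng = np.random.default_rng(0)
corpus = rng.standard_normal(1000)
target = corpus[100:300]
rebuilt = concatenative_resynthesis(target, corpus, 50)
```

Swapping the raw-sample distance for a spectral or perceptual distance is where the interesting matching criteria come in.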
March 30, 2006
Apple - Sound and Hearing
John Lazarro writes on the Auditory list:
Apple released a software update today for iPods that lets users set a maximum dB level for the device, and lets parents lock down the maximum dB level of their children’s iPod with a combination lock. Apple also put up a website on how to use the feature to limit long-term hearing damage.
March 28, 2006
PLOrk: Princeton Laptop Orchestra
Dan Trueman and Perry Cook at Princeton have set up an undergrad course called PLOrk: Princeton Laptop Orchestra, where they have 15 workstations consisting of Powerbooks, sound cards, sensor interfaces and spherical speakers. The idea is to give students the chance to improvise and experiment with electronic music in a really hands-on way (more info). Great idea! We should try and set up something like that in Oslo.
March 28, 2006
The 5 Rhythms
I recently got to know about the concept of 5 rhythms, and the Norwegian group doing this.
Gabrielle Roth’s The 5 Rhythms are an exhilarating and liberating approach to the exploration of improvised movement and dance that is authentic, inspired and catalytic. The 5 Rhythms (Flowing, Staccato, Chaos, Lyrical, Stillness) are a map which can take you on an ecstatic journey, opening you to the inherent wisdom, creativity and energy of your body.
March 17, 2006
sCrAmBlEd?HaCkZ!
sCrAmBlEd?HaCkZ! is a Realtime-Mind-Music-Video-Re-De-Construction-Machine. It is conceptual software that makes it possible to work with samples in a completely new way by making them available in a manner that does justice to their nature as concrete musical memories.
February 22, 2006
UBC Max/MSP/Jitter Toolbox
Just came across the UBC Max/MSP/Jitter Toolbox which seems to be quite similar to Jamoma. The UBC Max/MSP/Jitter Toolbox is a collection of modules for creating and processing audio in Max/MSP and manipulating video and 3D graphics using Jitter. I have just briefly tested it, and here are some screenshots from one of the example patches.
February 21, 2006
Olympic Figure Skating
Watching the ladies’ figure skating competition from the Olympics, I am amazed by the total lack of connection between gestures and music. To start with, I am not very impressed by the music accompanying the programmes, most of it being massively layered, romantic orchestral music. The fact that it is also recorded by a microphone in front of a moderate PA system in the skating hall does not make for a good listening experience either.
February 17, 2006
Nord Modular
Clavia has recently released a new version of their software for the Nord Modular, which now includes the possibility to create new settings based on evolutionary algorithms. These algorithms were part of the PhD work of my colleague Palle Dahlstedt from Göteborg and make it possible to create new settings from a set of “parents”. Very interesting stuff! The software is available as a free download for both Windows and OSX, but of course you need to have a Clavia synth to really appreciate this…
February 4, 2006
Access Hidden Files on iPod
I found a way of getting access to the music files on my Windows-formatted iPod on a Mac over at Ecoustics:
Launch the Terminal application and type:
find /Volumes/[iPod'sNAME]/iPod_Control/Music -print | awk '{ gsub(/ /, "\\ "); print }'
Substitute the name of your iPod for [iPod'sNAME]; any spaces in the name should be replaced with underscores (_). This will print a list of all the songs inside the Music folder, with each space in a path escaped with a backslash.
January 23, 2006
Into Great Silence
Film director Philip Groening has made Into Great Silence, a film about a Carthusian monastery where the monks live in complete silence:
Only in complete silence, one starts to hear. Only when language resigns, one starts to see.
About 160 minutes of next to total silence. How can that work in a cinema? How silent can it be? Will sound suddenly burst out, without warning? How dark can it be amid the unlit masses of monks in their sanctuary?
January 16, 2006
Intelligent MIDI Sequencing with Hamster Control
I first came across the Intelligent MIDI Sequencing with Hamster Control project a couple of years ago, and I still find it very funny!
January 15, 2006
New Cycling '74 forum
Just found out that Cycling ’74 has released a brand new forum. It looks very promising, and it is nice that everything is available as RSS feeds.
January 14, 2006
Digital thoughts by Paul Lansky
I came across the piece Notjustmoreidlechatter by composer Paul Lansky, showcasing a fascinating use of voice for creating musical rhythm and texture. And then I found the article Digital thoughts where he explains some of his compositional ideas throughout the years.
January 12, 2006
Demonstrations of Auditory Illusions
I came across a nice site with demonstrations of auditory illusions. Diana Deutsch’s page is also worth a visit.
December 30, 2005
Quintet.net
Georg Hajdu has just released a new version of his Quintet.net performance system.
“Quintet.net is an interactive network performance environment invented and developed by composer and computer musician Georg Hajdu. It enables up to five performers to play music over the Internet under the control of a “conductor”. The environment, which was programmed in the graphical programming language Max/MSP, consists of four components: the Server, the Client, the Conductor and the Listener; the latter component enables the Internet/network audience to follow the performance […].”
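Quintet.net itself is built in Max/MSP, but the underlying client/server idea can be illustrated with a few lines of Python using stdlib sockets. The address-style message and its contents below are my own invention for illustration, not Quintet.net’s actual protocol.

```python
import socket

# A "server" that performers' clients send note events to over UDP.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # let the OS pick a free port
addr = server.getsockname()

# A "client" sending one note event (invented format: path, pitch, velocity).
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"/client/1/note 60 96", addr)

# The server receives and parses the event before distributing it.
data, _ = server.recvfrom(1024)
parts = data.decode().split()
client.close()
server.close()
```

In the real system, a conductor component would then decide how such events are forwarded to the other clients and to the listeners.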
December 30, 2005
Web Phases
I have been reading up on hypertext and hypermedia theory and looked around for papers on hypermusic. One of the few papers I found on the topic was by John Maxwell Hobbs describing his 1998 piece Web Phases.
November 29, 2005
Practising electronics
I think Kurt Ralske puts it very well in “The Pianist: A Note on Digital Technique”
“For the classical pianist, the tedium of endless hours of practicing scales takes on an aura of nobility; it’s a virtuous, character-building activity. Instead of practicing scales, the digital artist learns software and hardware, learns programming languages, learns the techniques of creating digital models of sound, image, information, and intelligence.”
I wonder when music technologists will be employed in orchestras as musicians.
November 28, 2005
ChucK : Concurrent, On-the-fly Audio Programming Language
I finally got around to downloading and trying ChucK: Concurrent, On-the-fly Audio Programming Language by Ge Wang. It feels a bit strange, but I guess I need to work a little more with it. The readme says something about graphical tools, and I am looking forward to that.
December 13, 2001
Laser dance
Working with choreographer Mia Habib, I created the piece Laser Dance, which was shown on 30 November and 1 December 2001 at the Norwegian Academy of Ballet and Dance in Oslo.
The theme of the piece was “Light”, and the choreographer wanted to use direct light sources as the point of departure for the interaction. Mia had decided to work with laser beams, one along the backside of the stage and one on the diagonal, facing towards the audience.