Even has carried out a so-called “practical” master's thesis, one with an applied focus. He performed a motion capture analysis of how people move while playing computer games with a Kinect device, and also prototyped several motion capture instruments.
Sound is often used as a feedback modality in technological devices, yet relatively little is known about the relation between sound and motion in interactive systems. This thesis examines what happens at the intersection of human-computer interaction, motion and sonic feedback. From the connection of music and motion, coupled by technology, we derive the expression “Music Kinection”. A theoretical foundation accounts for the relationships that exist between sound and motion, and for the cognitive basis of these relationships. This study of the literature on music and motion, and of music cognition theory, shows that many aspects support various relationships between sound and motion. To see whether it is possible to detect similarities between users of an interactive system, a user study was performed with 16 subjects playing commercially available video games for the Kinect platform. Motion capture data was recorded and analyzed. The user study showed that there is an overall similarity in the amount of motion performed by the users, but some deviation in the amount of motion performed by body parts important to the gameplay. Many users will choose the same body part for a given task, but will apply different tactics when using this limb. Knowledge from the theory and from the observation study was used in practical explorations of sound-action relationships. Two installations, Kinect Piano and the Popsenteret Kinect installation, were made, together with two software prototypes, Soundshape and Music Kinection. The practical study showed that working with full-body motion capture and sound in human-computer interaction depends on good motion feature extraction algorithms and good mappings to sound engines.
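The kind of motion feature extraction and sound mapping referred to above can be sketched in a few lines. The following Python example is purely illustrative and not the thesis's actual algorithm: it computes a simple quantity-of-motion (QoM) feature from marker positions, then maps it to a hypothetical filter-cutoff parameter using an assumed exponential scaling.

```python
import numpy as np

def quantity_of_motion(positions, dt):
    """Per-frame sum of marker speeds, a common rough QoM measure.

    positions: array of shape (frames, markers, 3), in metres.
    dt: time step between frames, in seconds.
    Returns an array of length frames - 1.
    """
    velocities = np.diff(positions, axis=0) / dt       # (frames-1, markers, 3)
    speeds = np.linalg.norm(velocities, axis=2)        # (frames-1, markers)
    return speeds.sum(axis=1)                          # one scalar per frame

def map_to_cutoff(qom, lo=200.0, hi=8000.0):
    """Map normalized QoM onto a filter cutoff in Hz (assumed mapping).

    Exponential scaling is used because perceived frequency is roughly
    logarithmic; the lo/hi range is an arbitrary illustrative choice.
    """
    q = (qom - qom.min()) / (np.ptp(qom) + 1e-9)       # normalize to [0, 1]
    return lo * (hi / lo) ** q
```

For example, a single marker moving steadily at 1 m/s yields a constant QoM of 1.0, and a rising QoM series sweeps the cutoff from 200 Hz toward 8 kHz. In a real installation these values would be streamed per frame to a sound engine (e.g. over OSC) rather than computed offline.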