Gestures and technology

What I find most fascinating about Apple’s new iPhone is the shift from buttons to body. Moving away from the paradigm of pressing buttons to make a call or to navigate, the iPhone boasts a large multi-touch screen that lets the user interact by pointing at pictures and objects. Furthermore, the built-in rotation sensor detects the orientation of the device and rotates the screen accordingly, somewhat similar to how new digital cameras automatically rotate the pictures you take. It seems like this will be the year when technology finally gets a bit more human-friendly. Nintendo started by launching the motion-sensing game console Wii, and now Apple follows up by removing the buttons from a mobile media phone.

This new commercial turn comes as no big surprise to anyone who has been involved in technological research in recent years. Human-computer interaction has grown into a large research field dealing with exactly these questions, and the industry has plenty of prototypes to build on when developing new devices. Motion sensing and analysis is a hot research topic and is certainly the future of human-computer interaction.

Another hot research topic that will probably also make it into commercial products before long is semantics. Rather than one-to-one systems where you press a button and get a response, future devices will be able to sense and feel in a human-like fashion and respond accordingly. Then we can finally start talking about “human-friendly” devices.

Published by


Alexander Refsum Jensenius is a music researcher and research musician living in Oslo, Norway.
