A first test of the WordPress application for iPhone. Typing on the iPhone is slow, so I probably won’t use it that much, but it could come in handy.
It took me a while to realize that the microphone on the iPhone headset actually works when plugged into the headphone jack on my MacBook. Once plugged in, the microphone shows up in the Sound preference pane. Cool, no need to buy a splitter then, and it saves me carrying around an extra headset for Skype calls.
I recently upgraded to iWork ’09. I don’t use Pages or Numbers for anything, but Keynote is my preferred presentation software, so I thought it would be good to have the latest version. The new animation features are OK, but the most interesting addition so far is the ability to control presentations from an iPhone.
While the first version of Keynote crashed once in a while, I can’t remember a single crash with the ’07 version. I guess that is one of the reasons I wasn’t careful enough while working on a presentation last night, and apparently forgot to save what seems to be half an hour of editing. That backfired when Keynote, for some reason, decided to die while I was switching between two slides in edit mode. Spending time recreating the slides is annoying, but I am still puzzled that none of the iWork applications has an autosave function. Is there any logical reason Apple has chosen to omit this kind of functionality?
The whole incident reminded me of a blog post Hans Steiner wrote some years back, called Ditch File -> Save!, in which he asks why we have a save function at all. In fact, several of the programs I use daily, e.g. Journler, have no save button: you change something and it is there.
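To make the idea concrete: a save-less application really just persists on every change, so a crash can never eat your edits. Here is a minimal sketch in Python of that pattern (the class name and file format are mine, purely for illustration, and real applications would batch or debounce the writes):

```python
import json
import os
import tempfile

class AutosaveDocument:
    """Toy illustration of the 'no save button' idea: every edit is
    written straight to disk, so there is nothing to lose on a crash."""

    def __init__(self, path):
        self.path = path
        self.data = {}
        if os.path.exists(path):
            with open(path) as f:
                self.data = json.load(f)

    def set(self, key, value):
        self.data[key] = value
        self._persist()  # persist on every change; no explicit save step

    def _persist(self):
        # Write to a temp file first, then atomically replace the old file,
        # so a crash mid-write never corrupts the existing document.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(self.path) or ".")
        with os.fdopen(fd, "w") as f:
            json.dump(self.data, f)
        os.replace(tmp, self.path)
```

If the program dies right after `doc.set("title", "draft")`, reopening the same file gets the edit back, which is exactly the behavior I miss in Keynote.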
All the big companies are launching new devices with “gesture control” these days, and Microsoft is following along. This insider story presents some of the new features in Windows Mobile 7, which is supposed to take on the iPhone later this year.
The most interesting part of the “leak”, I think, is the differentiation between touch gestures and motion gestures, where the former relates to what I would call manipulation, as in this example of one- and two-finger strokes:
From the text and the accompanying images, motion gestures denote actions involving movement of the device itself, such as moving it left or right, but also more complex actions, such as shaking.
I think the separation between “touch” and “motion” gestures makes sense in this context, and it is in line with a more extensive set of definitions I have been working on. However, I find the chosen words somewhat misleading: obviously, you have to touch the device when shaking it, and your fingers are in motion when dragging on the screen.
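For what it’s worth, the distinction itself can be pinned down quite simply; here is one way to sketch it, with my own category names standing in for “touch” and “motion” (the gesture names are hypothetical examples, not Microsoft’s list):

```python
from enum import Enum

class GestureClass(Enum):
    SURFACE = "surface"  # fingers acting on the screen: strokes, pinches
    DEVICE = "device"    # moving the device itself: tilting, shaking

# Hypothetical examples, just to make the split concrete.
GESTURES = {
    "one_finger_stroke": GestureClass.SURFACE,
    "two_finger_stroke": GestureClass.SURFACE,
    "pinch": GestureClass.SURFACE,
    "tilt_left": GestureClass.DEVICE,
    "shake": GestureClass.DEVICE,
}
```

The point is that the dividing line is *what moves relative to what*, not whether touch or motion is involved, since both are present in every gesture.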
As I have mentioned elsewhere, I am thrilled that various sensing technologies have become so cheap that they are being incorporated everywhere. As could be seen from the presentation of Apple’s new iPhone, it includes an accelerometer to sense the tilt of the device (and also movement, if they decide to use that for anything), a proximity sensor (ultrasound?) to turn off the display when the phone is held to the ear, and a light sensor to adjust the brightness of the screen (?).