One of the first posts on this weblog was about iPhone accessibility, and touch screen accessibility in general, which has been something I’ve had on the back burner for a long time. At the start of this year I was thinking about what my priorities should be in 2009 (wedding clearly being number one!) in an attempt to get more done and, by coincidence, the Royal Institution Christmas lectures included a demo of the Dasher project, which I think could be quite useful: presenting the most likely letters first, rather than showing them in bigger boxes, for example. A circular gesture to cycle through the letters in order of probability might work quite well.
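That circular gesture could be sketched roughly like this. Everything here is an illustrative assumption: the letter ranking is a common English frequency ordering, and the 30-degree step per letter is an arbitrary choice, not anything from Dasher itself.

```python
# Hypothetical sketch: a circular gesture cycles through letters in
# order of probability. The frequency ranking and 30-degree step size
# are illustrative assumptions.

# Letters ranked by (assumed) English letter frequency, most likely first.
RANKED_LETTERS = list("etaoinshrdlcumwfgypbvkjxqz")

STEP_DEGREES = 30  # one letter per 30 degrees of rotation (assumption)

def letter_for_rotation(total_degrees):
    """Map cumulative rotation to a letter.

    0-29 degrees selects the most probable letter, 30-59 the next,
    and so on, wrapping around after a full pass through the alphabet.
    """
    step = int(total_degrees // STEP_DEGREES)
    return RANKED_LETTERS[step % len(RANKED_LETTERS)]
```

So a small turn selects ‘e’, a slightly larger one ‘t’, and so on, with each letter announced as the user rotates; lifting the finger would confirm the selection.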
Not everyone can see the point of considering this kind of thing on what, to the sighted, is a very visual device. I think that’s a pretty limited view of what is also a powerful, connected, personal computing device with a very flexible input device. Fortunately other people see it this way as well and are investigating ideas to make devices like the iPhone accessible.
I also found a site dedicated to mobile accessibility, via Tim’s blog, so perhaps a little momentum is building at last. One of my colleagues even twittered about accessibility of touch screen phones while I was writing this post!
Maybe I should get Andy to mock something up now he’s a pro at iPhone apps…
I have largely avoided all the iPhone excitement (being a child of the ’70s, I’m more than happy with my pocket calculator) but I recently found a post about the problems it has with accessibility.
A few years ago I had an idea about making touch screen devices more accessible, after I noticed the rise of self-service kiosks that had no other way to provide input. The same idea could be ideal for the iPhone, so I emailed email@example.com to see if Apple would be interested.
The full description can be found on the IP.com prior art database by searching for document ID IPCOM000125721D, but the basic idea is for a touch screen device, like the iPhone, to support an accessible mode where, instead of the usual graphical buttons and layouts, large areas of the screen are used with a telephone-prompt-style system to interact with the user. For example, an audio prompt to, “Press the top right of the screen to make a call” and so on. High-contrast blocks of colour would make it possible to find the right area with very little vision, and completely blind users could find the edge of the screen by touch, with small modifications to the case if necessary. Numeric input, to enter a phone number for example, could be handled with simple tactile markers arranged around the outside of the screen. Switching to the accessible mode could be as simple as pressing anywhere on the screen for a few seconds, or recognising a pre-paired Bluetooth headset. The main advantage of all this is that it is largely software based with no additional external controls required, so ideal for small devices, or touch screen kiosks that have already been installed… and easy to add to something like the iPhone.
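The heart of the idea is just mapping a touch to a coarse screen region and speaking a prompt for it. A minimal sketch, assuming an original-iPhone screen size and a made-up set of four regions and actions (none of this is a real iPhone API):

```python
# Hypothetical sketch of the "accessible mode" described above: the
# screen is divided into a few large regions, each tied to an action
# and a spoken prompt. The screen size, region layout, and actions are
# all illustrative assumptions.

SCREEN_W, SCREEN_H = 320, 480  # original iPhone screen in points

# Action announced for each coarse region (top/bottom x left/right).
REGION_ACTIONS = {
    ("top", "left"): "open contacts",
    ("top", "right"): "make a call",
    ("bottom", "left"): "read new messages",
    ("bottom", "right"): "hang up",
}

def region_for_touch(x, y):
    """Map a touch coordinate to one of four large screen regions."""
    vertical = "top" if y < SCREEN_H / 2 else "bottom"
    horizontal = "left" if x < SCREEN_W / 2 else "right"
    return (vertical, horizontal)

def prompt_for_touch(x, y):
    """Return the audio prompt that would be spoken for a touch."""
    action = REGION_ACTIONS[region_for_touch(x, y)]
    return f"Press here to {action}"
```

Because the regions are so large, a user only needs a rough sense of where their finger is relative to the screen edges, which is exactly what the high-contrast blocks and tactile case markers are there to support.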
So it looks like I’m interested in the iPhone like everyone else now!