Molly Watt has Usher Syndrome; she’s deafblind. I couldn’t imagine being in her shoes, trying to use technology the way she must, but the way she’s documented her experiences put a smile on my face:
> So far for me the most useful App on the Apple Watch is Maps – on my iPhone I can plan my journey from one destination to another, for me it will be on foot with Unis my guidedog.
>
> This is where Haptics really come into its own – I can be directed without hearing or sight, but by a series of taps via the watch onto my wrist – 12 taps means turn right at the junction or 3 pairs of 2 taps means turn left, I’m still experimenting with this but so far very impressed – usher syndrome accessible!
I’m fortunate enough to have my full vision and hearing, so I tend to think of interfaces from that perspective. Watt’s post (indeed, all of her posts) is a welcome reminder that there are people who experience technology completely differently.