As mentioned recently on BBC Click, touch screen devices have caused problems for blind and visually impaired people – as you can imagine, they don't get the same contact and recognition as they would with physical buttons. Various companies have produced physical add-ons to represent the placement of keyboard buttons on devices such as the iPhone, but this is simply not a long-term solution. The researchers behind Tesla Touch, however, may provide the answer.
Developed at Disney Research in Pittsburgh, Tesla Touch is based on the electrovibration principle, which can programmatically vary the friction between sliding fingers and a touch panel. It requires no mechanical parts, is inexpensive, and consumes little power. The technology allows the user to physically feel the action they are performing rather than just sliding their fingers across a glass screen, which gives no apparent output reaction. This is of course not just a benefit for those with impaired vision. Lately I have been researching interactivity in products currently available (iPad, Kinect etc.) and what is always missing is the ability to feel real physical feedback from a physical action – an important feature of how we interact with objects every day. I wonder if Tesla Touch will really take off, or if people will become irritated with the feedback and switch back to senseless media because it's what they're used to (especially digital natives).
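To make the "programmatically vary the friction" idea concrete, here is a minimal sketch of how software might map a desired friction level to a periodic drive-voltage signal. Everything here is an illustrative assumption – the 115 V peak, the square-root mapping (electrostatic attraction scales roughly with voltage squared), the 400 Hz carrier, and the virtual "edge" are my own hypothetical parameters, not TeslaTouch's actual implementation.

```python
import math

def drive_amplitude(friction, v_max=115.0):
    """Map a desired relative friction level (0..1) to a drive-voltage
    amplitude. Electrostatic force grows roughly with voltage squared,
    so friction maps to amplitude via a square root. The 115 V peak is
    an illustrative assumption, not TeslaTouch's real figure."""
    friction = min(max(friction, 0.0), 1.0)
    return v_max * math.sqrt(friction)

def texture_waveform(positions, edge=0.5, carrier_hz=400.0, sample_rate=8000):
    """Return one drive-voltage sample per finger position sample:
    high friction on one side of a hypothetical virtual edge, low on
    the other, modulated by a sinusoidal carrier so the finger feels
    a texture change as it crosses the boundary."""
    samples = []
    for i, x in enumerate(positions):
        friction = 0.9 if x >= edge else 0.1  # rough/smooth regions
        amp = drive_amplitude(friction)
        t = i / sample_rate
        samples.append(amp * math.sin(2 * math.pi * carrier_hz * t))
    return samples
```

The point of the sketch is just that the panel's "feel" becomes a software-defined signal: change the amplitude mapping or carrier frequency and the same glass surface simulates a different texture, with no moving parts.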
In other news, the Computing Aesthetics book I'm reading is a lot to take in at once, but fantastic! I've begun watching some MIT lectures online for computing and a bit of C++ learning, and I ordered an Arduino Uno starter pack and an infrared distance sensor for myself for Christmas – I hope they arrive soon! And lastly, now that I'm back in Liverpool, I need to get down to the Nam June Paik exhibition as soon as I can!
A good quote from John Maeda in my Maeda & Media book (2000, p. 320) about how we link actions with sound (similar to, though distinct from, physical feedback):
Sound as an effect, versus a necessity, is a difficult distinction to make. We expect most of our actions to result in reactions that are not only visual but aural. Without sound, one is distanced from the reaction, making it more abstract. Once a sound is made, however, your body receives confirmation that the event is not imagined, but real.