Usher syndrome advocate Molly Watt recently wrote about her experience with the new Apple Watch. Considering she was born deaf and is legally blind, you might think the tiny touchscreen would be off-limits for her.
However, her experience shows that smartwatches may be more useful than some folks have imagined, especially for the deaf community.
Usher Syndrome is a genetic disorder that causes hearing loss and visual impairment. Watt is severely deaf and only has a small tunnel of vision in her right eye. She was impressed with the features of Apple Watch, especially the Digital Touch and Haptics features.
In her blog, Watt explains that she and her friends have devised a code system using the Digital Touch feature, which lets one user sketch something and send it to another. Watt notes that her mother uses Digital Touch to get her attention when she is at home in her room without her hearing aids in: Mom sends a sketch, daughter gets a tap on the wrist and knows to check in.
The Haptics feature also lets someone with a visual or auditory impairment receive an alert and know about it immediately. When a text message, phone call, email, or other notification arrives, the watch delivers a vibration to the wrist, a bit like someone poking you. So the wearer always knows when a notification is coming in, no matter where the smartphone is at the time.
Watt explains that Maps on her Apple Watch has proven very useful when getting location directions. “I can be directed without hearing or sight, but by a series of taps via the watch onto my wrist.”
Although Watt is speaking about the Apple Watch, the vibration feature is not exclusive to it; it comes standard on most smartwatches. The potential for creating useful apps for the deaf and hard-of-hearing, as well as the blind community, is great.
Back in 2013, South Korea-based Moneual developed a smartwatch that recognizes sound, sends a vibration to the user, and displays an image of what that sound is. For example, a horn honking or a doorbell ringing will alert the user both visually and with a tap.
Pebble, Samsung, and LG smartwatches have an app called “ISeeWhatYouSay,” which acts as a speech-to-text relay: a smartphone captures what one person is saying and sends it to the user’s wrist as text.
It would even be possible to create an app that translates speech into a series of taps, so that a person with dual sensory impairment (such as deafblindness) could communicate with others more easily.
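To make that idea concrete, here is a minimal sketch of one way text (for example, the output of a speech-to-text engine) could be encoded as a sequence of wrist taps. The article does not describe any specific encoding; this sketch simply assumes a Morse-style tap alphabet, where a short tap is a dot and a long tap is a dash, and the durations are arbitrary placeholder values:

```python
# Hypothetical sketch only: nothing here comes from a real smartwatch SDK.
# Assumption: a Morse-style tap alphabet (short tap = dot, long tap = dash)
# with placeholder timings, applied to text from a speech-to-text engine.

MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
    "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
    "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
    "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
    "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
    "z": "--..",
}

SHORT_TAP_MS = 100   # assumed duration of a "dot" tap
LONG_TAP_MS = 300    # assumed duration of a "dash" tap
LETTER_GAP_MS = 300  # assumed longer pause between letters

def text_to_taps(text):
    """Convert text into a list of (tap_duration_ms, pause_after_ms) pairs."""
    taps = []
    for letter in text.lower():
        code = MORSE.get(letter)
        if code is None:
            continue  # skip characters with no Morse equivalent
        for symbol in code:
            duration = SHORT_TAP_MS if symbol == "." else LONG_TAP_MS
            taps.append((duration, SHORT_TAP_MS))  # short inter-symbol pause
        if taps:
            # stretch the pause after the letter's final tap
            taps[-1] = (taps[-1][0], LETTER_GAP_MS)
    return taps
```

On a real device, each pair would drive the watch's vibration motor: buzz for the first value, rest for the second. For instance, `text_to_taps("s")` yields three short taps, with a longer rest after the last one to mark the end of the letter.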
Say what you will about wearable computers, but there is real potential for them to become very useful for the deaf, and even the blind, community.