Developers and hackers have been using the Xbox Kinect in several ways, some of which demonstrate useful applications for assistive technology. One such application involves the Kinect’s speech recognition capabilities, as we examined in a previous entry. Another involves the sensor’s ability to track physical gestures: users with speech or language impairments can perform gestures as a substitute for spoken words.
A case in point: Chad Ruble, a talented developer, has been working on ways to help his mother maintain contact with her family. As the result of a stroke, she suffers from aphasia, an impairment in the ability to use or understand words. Ruble recently used a Kinect sensor to “bridge the digital keyboard gap,” so that his mother could once again send emails. He designed a program that allows her to select simple icons associated with different emotions, which are converted into text.
The first step, Ruble explains, was coming up with a visual “dashboard” to help her compose simple messages. Each icon is associated with a specific emotion, which can then be qualified by an amount. In his own words: “I used a Kinect with the SimpleOpenNI library for Processing, along with some gesture recognition code from Matt Richardson, to track the position of my mom’s hand. I then used a sample Processing sketch from Daniel Shiffman to generate and send the email via the green arrow button. The red ‘X’ resets the screen.”
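Ruble has not published this exact code, but the core idea behind such a dashboard, mapping a tracked hand position to one of several on-screen icon regions, can be sketched roughly as follows. This is a minimal illustration, not his implementation: all class and method names here are hypothetical, and in a real Processing sketch the hand coordinates would come from SimpleOpenNI’s skeleton tracking rather than being passed in directly.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the dashboard's hit-testing logic: given a
// hand position (as a Kinect skeleton tracker would report it, projected
// to screen coordinates), find which emotion icon the hand is over.
public class Dashboard {
    // One selectable emotion icon, defined by its screen rectangle.
    static class Icon {
        final String emotion;
        final int x, y, w, h;
        Icon(String emotion, int x, int y, int w, int h) {
            this.emotion = emotion;
            this.x = x; this.y = y; this.w = w; this.h = h;
        }
        boolean contains(int px, int py) {
            return px >= x && px < x + w && py >= y && py < y + h;
        }
    }

    private final List<Icon> icons = new ArrayList<>();

    void addIcon(String emotion, int x, int y, int w, int h) {
        icons.add(new Icon(emotion, x, y, w, h));
    }

    // Returns the emotion whose icon the hand is hovering over,
    // or null if the hand is not over any icon.
    String iconAt(int handX, int handY) {
        for (Icon icon : icons) {
            if (icon.contains(handX, handY)) return icon.emotion;
        }
        return null;
    }
}
```

In practice, a sketch like this would also require the hand to dwell over an icon for a short time (or perform a push gesture) before registering a selection, so that simply moving the hand across the screen does not trigger icons accidentally.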
The proliferation of new technologies such as the Xbox Kinect, along with the admirable work of developers like Ruble, is allowing computer users with disabilities to communicate with others in ways that would have been impossible twenty years ago.