Music allows people around the world to creatively express their thoughts and feelings. For people who share this passion, losing the ability to make music is a depressing and scary proposition. David Anderson is one of these people. He has ALS, a neurological disorder that progressively weakens the body's muscles. ALS typically does not affect cognitive abilities, even though many people with ALS eventually lose all ability to control voluntary movement. David's ALS has progressed to the point where he can physically control only the movement of his eyes. He and a group of friends launched the 1 Blink Equals Yes project on Kickstarter to fund a software solution that would allow David to perform a DJ set on New Year's Eve 2012. The design challenge was to determine how to enable a person who has control only of his eye movements to successfully DJ.
Eye tracking technology enables people who cannot use traditional input methods like the mouse and keyboard to control software. The most common design pattern is to have the mouse cursor follow the user's gaze across the screen. Eye tracking systems apply image-processing techniques to video from a connected camera to determine where the user is directing his or her attention, and the software can simulate a mouse click when it identifies that the user's gaze has settled on a particular location. The following video demonstrates a user typing messages and controlling the UI with an assistive technology called the Eyegaze Edge.
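The "simulate a click when attention settles" pattern is usually called dwell clicking: if the gaze stays within a small radius for long enough, the software registers a click at that spot. A minimal sketch of the idea follows; the sample format, radius, and dwell time are illustrative assumptions, not any particular tracker's SDK.

```python
import math

def dwell_click(samples, radius=30.0, dwell_time=0.8):
    """Return a click position when the gaze dwells, else None.

    `samples` is an iterable of (timestamp_seconds, x, y) gaze points --
    a stand-in for whatever stream the eye tracking hardware delivers.
    A 'click' fires when the gaze stays within `radius` pixels of the
    fixation start for at least `dwell_time` seconds.
    """
    anchor = None  # (t, x, y) where the current fixation began
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > radius:
            anchor = (t, x, y)        # gaze moved away: restart the dwell timer
        elif t - t0 >= dwell_time:
            return (x0, y0)           # fixation held long enough: click here
    return None
```

In a real assistive application this loop would run continuously against the tracker's live stream, with visual feedback (such as a shrinking circle) so the user knows a click is about to fire.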
Ableton Live is a software tool favored by many of the world's top DJs. Ableton's user interface is a great example of device independence: it was designed to work with whatever type of device users bring to it. Many musicians do not want to be limited to a mouse and keyboard, so Ableton provides an API that allows hardware and software manufacturers to write custom interfaces to control the underlying functionality of the software. That makes Ableton a perfect fit for an accessibility project that needs to imagine new interfaces optimized for eye tracking control.
Ableton's interface comprises launchable clips arranged in a grid, along with rotary dials and sliders that set the values of various parameters between their minimum and maximum. These parameters can be modified directly with a mouse or keyboard inside the Ableton user interface.
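However a control surface is built, each dial or slider ultimately maps a normalized position onto a parameter's minimum-to-maximum range. A small sketch of that mapping (the function name and the example ranges are ours, not Ableton's API):

```python
def scale_to_range(normalized, lo, hi):
    """Map a 0.0-1.0 control position onto a parameter's [lo, hi] range,
    clamping out-of-range input so the parameter never overshoots."""
    normalized = max(0.0, min(1.0, normalized))
    return lo + normalized * (hi - lo)

# e.g. a pan dial spanning -1.0 (left) to 1.0 (right):
center = scale_to_range(0.5, -1.0, 1.0)   # dial at 12 o'clock -> 0.0 (center)
```

An eye-controlled interface can reuse exactly this mapping: a gaze-driven slider only needs to report a 0-1 position, and the same scaling delivers a valid parameter value.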
These parameters can also be accessed through a hardware controller such as the APC40. With this type of interface, the user does not need a mouse or keyboard and can instead work directly on a piece of hardware with physical buttons, dials, and sliders. Many users appreciate having a more tactile interaction with the music they create.
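Hardware controllers like the APC40 talk to Live over MIDI: every button press or dial turn is a short binary message. As an illustration, a clip-launch button typically sends a three-byte Note On message; the specific channel/note mapping below is hypothetical, shown only to make the wire format concrete.

```python
def note_on(channel, note, velocity=127):
    """Build a 3-byte MIDI Note On message.

    Status byte is 0x90 plus the 4-bit channel; note and velocity
    are 7-bit values, as defined by the MIDI 1.0 specification.
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Hypothetical grid mapping: channel selects the track column,
# note number selects the clip row.
press = note_on(channel=0, note=0x35)   # e.g. track 1, top clip-launch button
```

Because the protocol is this simple, any software — including an eye-tracking layer — can impersonate a hardware controller by emitting the same bytes.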
Ableton Live can also be accessed through touch tablet interfaces. TouchOSC is an example of software that can be used to build different visual layouts. Most UI layouts developed for a touch interface reduce the number of controls on the screen and make them larger so they are easier to physically operate. Additionally, touch control of Ableton means the user does not have to be seated in front of a mouse and keyboard to create music.
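TouchOSC layouts communicate using the Open Sound Control (OSC) protocol, where each on-screen control sends small packets to a named address. A sketch of the encoding, using only the standard library (the `/1/fader1` address is a typical TouchOSC-style example, not anything this project documented):

```python
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message with float32 arguments.

    Per the OSC 1.0 spec, strings are NUL-terminated and padded to a
    4-byte boundary, the type-tag string starts with ',', and float
    arguments are big-endian IEEE 754 (type tag 'f').
    """
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)   # always at least one NUL
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", float(a))
    return msg

# A fader update as a touch (or gaze-driven) layout might send it,
# typically shipped over UDP to the OSC bridge controlling Live:
packet = osc_message("/1/fader1", 0.75)
```

Because OSC is transport-agnostic and trivially generated from software, the same channel that serves touch tablets is an obvious path for an eye-tracking interface to drive Live.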
The 1 Blink Equals Yes project is a success: KARE11 has video from New Year's Eve 2012 of David Anderson DJing using only his eyes. He has decided that he will not let his physical condition stop him from communicating with the world through music. We are excited to see how this technology evolves over time and to learn more about the technical implementation. The Tobii Rex is one example of a new hardware device being manufactured so that gaze interaction can be built into software more seamlessly. We look forward to seeing how we all will benefit from research and innovation in this area.