The Leap Motion Controller: Good for Users with Motor Differences?

Equal Entry
December 8, 2017

Leap Motion app home screen

by Sam Berman

The Leap Motion Controller is an exciting new navigation and interaction device for virtual reality and personal computing on both Mac and Windows. It promises to let users navigate and interact with content using only their hands, rather than holding a physical object such as a mouse. My time with the device showed me that it has a lot of potential, but also that significant improvements are needed before I give up conventional navigation tools.

The device itself is a minimal motion-detecting camera that’s smaller than a smartphone. To set it up, simply plug it into a USB port on your computer, place it on your desk or table within arm’s reach, and download the software for your operating system of choice.
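For developers who want to confirm the sensor is actually connected and seeing their hands, here is a minimal sketch that polls the controller and prints what it tracks. It assumes the classic Leap Motion SDK v2 Python bindings (the `Leap` module bundled with the SDK, not a package on PyPI); other SDK versions expose different APIs, so treat the calls as illustrative rather than definitive.

```python
# A minimal sketch of talking to the sensor from code, assuming the classic
# Leap Motion SDK v2 Python bindings (the "Leap" module bundled with the SDK).
# Other SDK versions expose different APIs; this is illustrative only.
import sys
import time

import Leap


def main():
    controller = Leap.Controller()

    # Give the background tracking service a moment to connect.
    for _ in range(50):
        if controller.is_connected:
            break
        time.sleep(0.1)
    else:
        sys.exit("No Leap Motion Controller detected; check the USB cable and service.")

    # Poll a handful of frames and report what the sensor currently sees.
    for _ in range(10):
        frame = controller.frame()
        print("frame %d: %d hand(s), %d finger(s)"
              % (frame.id, len(frame.hands), len(frame.fingers)))
        time.sleep(0.2)


if __name__ == "__main__":
    main()
```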

Once the software is installed, you can launch a program called App Home. This program is essentially an app store with programs designed specifically for the sensor. When App Home first launches, it recommends that you start with an application called Playground, a gamified tutorial that aims to orient new users to the device.

When I opened Playground, I ran into my first issue: a smudge on the sensor that prevented the camera from tracking my hand movements. Once I cleaned it off, I was presented with my first task: place four rectangular heads on four robotic figures. This was deceptively challenging. As soon as I lifted my arm to bring my hand into the range of the sensor, I sent the heads flying across the screen.

It took me a few attempts before I could pick up one of the heads with my pointer finger and thumb without accidentally knocking it out of reach first. It took another few tries before I could place a head on a robot without unintentionally knocking the robot down or out of the way. I noticed the sensor is very sensitive to subtle movements and prone to registering motions I never intended as input.

For example, on numerous occasions it registered the movement of my arm as input while I was simply bringing my hand into the device’s range. This caused me to knock the objects around the screen, making them even harder to interact with.
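On the development side, one way an application could compensate for this sensitivity is to ignore momentary gestures and only act on ones that are sustained for several frames. The sketch below is hypothetical, not how Playground actually works; it assumes the Leap Motion SDK v2 Python bindings, and the threshold and frame count are made-up tuning values.

```python
# Hypothetical sketch: treat a pinch as intentional only after it has been
# held for several consecutive frames, so incidental motion while the arm
# enters the field of view is ignored. Assumes the Leap Motion SDK v2 Python
# bindings; PINCH_THRESHOLD and HOLD_FRAMES are illustrative tuning values.
import sys

import Leap

PINCH_THRESHOLD = 0.8  # hand.pinch_strength ranges from 0.0 (open) to 1.0 (full pinch)
HOLD_FRAMES = 10       # consecutive frames the pinch must be sustained before firing


class DebouncedPinchListener(Leap.Listener):
    def on_init(self, controller):
        self.frames_pinching = 0
        self.pinch_active = False

    def on_frame(self, controller):
        frame = controller.frame()
        if frame.hands.is_empty:
            self.frames_pinching = 0
            self.pinch_active = False
            return

        hand = frame.hands[0]
        if hand.pinch_strength >= PINCH_THRESHOLD:
            self.frames_pinching += 1
        else:
            self.frames_pinching = 0
            self.pinch_active = False

        # Fire once the pinch is sustained, rather than on a momentary spike.
        if self.frames_pinching >= HOLD_FRAMES and not self.pinch_active:
            self.pinch_active = True
            print("intentional pinch detected")


def main():
    listener = DebouncedPinchListener()
    controller = Leap.Controller()
    controller.add_listener(listener)

    print("Press Enter to quit...")
    sys.stdin.readline()
    controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```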

Once the robots had their heads attached to their bodies, knocking into them had further repercussions. First, their heads could be knocked off again. Second, the robots could become dizzy (their eyes would display little swirls), which made them less stable on their wheeled feet and prone to bumping into the other robots.

The next activity in Playground had me pulling petals off a flower, “she loves me, she loves me not” style. This forced me to be more gentle and delicate with my movements: it was very easy to grab the whole plant by the stem and rip it from its roots. During both of these activities, I saw the sensor incorrectly map my hand and finger movements multiple times.

There were times when my hand was open but the device showed it as closed, and times when my hand was closed but the device showed it as open. It seemed to have the most difficulty when I held out one or two fingers with the rest of my hand closed.
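A related, equally hypothetical mitigation for this open/closed flicker would be hysteresis: switch to “closed” only on a clearly closed hand, switch back only on a clearly open one, and skip frames the SDK itself flags as low confidence. Again, this assumes the SDK v2 Python bindings, and all three thresholds are made-up values for illustration.

```python
# Hypothetical sketch: reduce open/closed flicker with hysteresis, and skip
# frames the SDK reports as low confidence. Assumes the Leap Motion SDK v2
# Python bindings; all three thresholds are illustrative, not Leap defaults.
CLOSE_AT = 0.9        # report "closed" once hand.grab_strength rises past this
OPEN_AT = 0.4         # report "open" again only after it falls below this
MIN_CONFIDENCE = 0.5  # hand.confidence below this means the hand model is a guess


def update_hand_state(hand, currently_closed):
    """Return the new closed/open state, switching only on a clear signal."""
    if hand.confidence < MIN_CONFIDENCE:
        return currently_closed  # untrustworthy frame: keep the previous state
    if not currently_closed and hand.grab_strength >= CLOSE_AT:
        return True
    if currently_closed and hand.grab_strength <= OPEN_AT:
        return False
    return currently_closed
```

Called once per tracking frame with the hand from the current frame, a function like this would keep a single misread from toggling the state back and forth.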

A light-infused hand illuminates darkness. Text onscreen says "Place hands above the Leap Motion Controller".

In hindsight, these activities would have been considerably easier to accomplish had I been able to use both of my hands: one to hold the robot or flower steady while the other placed the heads or pulled the petals.

There are a variety of reasons why people may not have full dexterity in both hands, and these differences should be acknowledged and accommodated. It should not be assumed that every user can perform the same movements and gestures with equal ease.

Beyond cerebral palsy, which in my case hinders the fine motor skills of my right hand, I can see amputees and people with arthritis benefiting from the gesture-based modes of interaction that Leap Motion provides. Users with these motor differences could, in theory, still use their affected hand or arm to select options presented onscreen or in virtual reality.

This would not only improve overall performance during gameplay, for example, but could also encourage people to use a limb they may otherwise underutilize. Overall, gesture-based interaction could make switching between typing and navigation much more efficient for people with certain types of motor differences.

The developers of the Leap Motion Controller have created a very compelling navigation interface. By incorporating natural behaviors, the device relies on human intuition to teach users how it works. Its simple design is clever but currently falls short of its potential. I would be interested to hear other people’s perspectives on using this technology (please share your thoughts in the comments!), and I look forward to seeing what future updates to Leap Motion may bring.

Sam Berman has two years of experience as an educator and advocate for people with disabilities and for the aging community, teaching computer literacy. In the spring of 2016, he attended a web development boot camp at the New York Code & Design Academy with the hope of combining his passions for technology and advocacy into a career. As a consultant for Equal Entry, he taps into these experiences to make the web easier to use for people with disabilities.

Equal Entry
An accessibility technology company offering services including accessibility audits, training, and expert witness work on cases related to digital accessibility.

One comment:

  1. I work with people with severe levels of cerebral palsy and have a hard time with hand recognition. Their contracted hands, which they open with effort, seem to be badly recognized by the sensor, which loses tracking very often, leading to a frustrating experience. I guess Leap programmers need to give more attention to getting the hand model to recognize these cases, maybe by training an ML model with hands of people with cerebral palsy and providing an option in the Leap driver to select a “normal” or a cerebral palsy mode.
