The State of Accessibility and VR, Part 1

Equal Entry
November 1, 2017
Fruit Ninja for PlayStation VR

The most popular application of VR right now is gaming. Never has this been more obvious than with the opening of VR World, New York City’s first virtual reality arcade. While gaming is the most natural and apparent use, we will see VR incorporated into an increasingly broad array of industries as the hardware and software become cheaper to produce.

In the meantime, gaming presents an excellent opportunity for a case study of VR’s accessibility. While spending a few hours immersing myself in alternate realities at VR World, I had mixed experiences as a one-handed user.

One of the most profound aspects of virtual reality is the immersive experience it creates. In a video for Big Think, Wired co-founder Kevin Kelly explains, “VR and MR [mixed reality] operate on a different part of your brain than when you’re watching on a screen.”

According to Kelly, “The VR is working on a lower, different part of your brainstem, it’s much more primeval. When you take your VR goggles off, you remember not having seen something, but having experienced it.” This distinction between watching and experiencing is what makes virtual reality such a powerful platform.

However, as a one-handed user, I found that having so much of the interaction depend on both hands took away from the experience. Only two of the games I played, Thumper and Fruit Ninja, made me feel as if I weren’t missing out by playing with one hand. In both cases, that seemed to have more to do with the simplicity of the gameplay than with the games’ user interfaces.

Regardless of the complexity of the gameplay or the interface, games should be completely playable by users of all abilities. Additionally, my disability should not have to be represented in the game, and it should not affect my ability to excel and succeed in the game.

In reality, my inability to press the buttons and triggers on the controller in my right hand directly affected my ability to succeed in the majority of the games I played. While in some games I could get by with one hand, it frequently felt like mere survival.

Part of the appeal of playing video games can be the aspect of escapism, of getting out of your normal day-to-day. This is exemplified by the recent push for VR in the senior community, where it is being presented to older adults as an alternative to passively watching TV, and even as a change of scenery for those with limited mobility who can’t get out and travel. (Articles about this trend were recently published by NPR and Forbes.)

In one game, I found myself in a cave with monsters charging toward me at a feverish pace. Again, due to my lack of fine motor skills, I could only defend myself with one gun, whereas if I had increased dexterity with my right hand, I could have more effectively battled the onslaught with two guns. The overall experience of these games simply wasn’t as immersive as it could have been since I was constantly being reminded of my real-world limitations in these virtual worlds.

Clearly, there are many opportunities for improvement. Alternative interaction methods need to be implemented in VR platforms so that those of us with physical disabilities aren’t hindered by these differences in “virtuality”. One potential alternative input method is thought control.

At first, this suggestion may seem outlandish, but bear with me: there are companies already working on it. Neurable is building brain-computer interfaces (BCIs) that combine electroencephalography (EEG) with custom algorithms that learn from your behavior. In its prototype game, you wear a VR headset along with EEG sensors. Before starting the game itself, you train the algorithms to recognize when you are focusing your attention on an object.

According to The New York Times, “A pulse of light bounces around the virtual room, and each time it hits a small colored ball in front of you, you think about the ball. At that moment, when you focus on the light and it stimulates your brain, the system reads the electrical spikes of your brain activity.” While still in its early stages of development, this is a promising way to supplement one-handed gameplay.
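Neurable’s actual algorithms are proprietary, but the general idea of stimulus-locked attention detection can be sketched in a few lines. The toy Python below is purely illustrative: synthetic data stands in for a real headset, a plain logistic-regression classifier stands in for Neurable’s models, and every number and object name is invented for the example.

```python
# Toy sketch of attention detection from stimulus-locked EEG windows.
# Synthetic data only; real BCIs use dense electrode arrays and far more
# sophisticated signal processing.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N_CHANNELS, EPOCH_SAMPLES = 8, 64  # EEG channels, samples per post-flash window

def synth_epoch(attended: bool) -> np.ndarray:
    """Simulate one EEG window following a light flash on an object."""
    epoch = rng.normal(0.0, 1.0, (N_CHANNELS, EPOCH_SAMPLES))
    if attended:
        # Attended flashes evoke a small positive deflection (a P300-like bump).
        epoch[:, 30:45] += 0.8
    return epoch.ravel()

# --- Calibration: the player focuses on known targets while lights flash ---
X = np.array([synth_epoch(a) for a in ([True] * 100 + [False] * 100)])
y = np.array([1] * 100 + [0] * 100)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# --- Gameplay: score each flashed object and "select" the most likely one ---
objects = ["sword", "shield", "potion"]
scores = {obj: clf.predict_proba(synth_epoch(obj == "shield").reshape(1, -1))[0, 1]
          for obj in objects}
print("selected:", max(scores, key=scores.get))  # should usually pick "shield"
```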

Another, more obvious, alternative is to bring an already common mode of computer input into virtual environments: typing. I think the most seamless way to accomplish this would be with a tool such as TAP.

TAP is a wearable that allows the user to turn any surface into a keyboard. Currently, this device works with smartphones, tablets, and desktop/laptop computers. I can easily see this device being adapted and extended to VR.
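To make the idea concrete, here is a purely hypothetical sketch of how chorded taps might be routed into a VR text field. The finger-to-character table and event names below are invented for illustration and are not TAP’s actual mapping or API.

```python
# Hypothetical pipeline: finger chord -> character -> focused VR text field.
# The chord table below is invented for illustration only.
FINGER_CHORDS = {
    frozenset({"thumb"}): "a",
    frozenset({"index"}): "e",
    frozenset({"middle"}): "i",
    frozenset({"thumb", "index"}): "n",
    frozenset({"index", "middle"}): "space",
}

class VRTextField:
    """Stand-in for whatever text box currently has focus in the VR scene."""
    def __init__(self) -> None:
        self.value = ""

    def type_char(self, char: str) -> None:
        self.value += " " if char == "space" else char

def handle_tap_event(fingers: set, field: VRTextField) -> None:
    """Translate one tap (a set of fingers touching the surface) into a keystroke."""
    char = FINGER_CHORDS.get(frozenset(fingers))
    if char is not None:
        field.type_char(char)

field = VRTextField()
for chord in [{"index"}, {"thumb", "index"}, {"index", "middle"}, {"thumb"}]:
    handle_tap_event(chord, field)
print(field.value)  # -> "en a"
```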

A third alternative input method is to implement gesture-based interaction more broadly with a device such as Leap Motion, which removes the need to hold a physical controller in order to have your movements tracked. The controllers themselves can be a barrier to inclusion.

Head-tracking is another viable way to navigate interactive menus in VR. The company Sesame Enable has already developed smartphones that can be navigated by tracking the movements of the user’s head. Eye-tracking and voice control are two additional ways to navigate and interact with virtual worlds; both Oculus and IBM Watson are already working on VR voice interfaces.
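As a rough illustration of how head-tracking could drive a menu, the sketch below maps hypothetical (yaw, pitch) head readings onto menu items and selects one once the player’s gaze has dwelled on it long enough. The angles, thresholds, and menu names are all invented; a real headset would supply orientation through its own SDK.

```python
# Minimal dwell-based menu selection driven by head orientation samples.
MENU = {"Play": (-20.0, 0.0), "Options": (0.0, 0.0), "Quit": (20.0, 0.0)}
DWELL_SECONDS = 1.0   # how long the head must rest on an item to select it
TOLERANCE_DEG = 8.0   # angular radius counted as "pointing at" an item

def item_under_gaze(yaw: float, pitch: float):
    for name, (item_yaw, item_pitch) in MENU.items():
        if abs(yaw - item_yaw) <= TOLERANCE_DEG and abs(pitch - item_pitch) <= TOLERANCE_DEG:
            return name
    return None

def select_by_dwell(samples):
    """samples: iterable of (timestamp, yaw, pitch) head readings."""
    current, since = None, None
    for t, yaw, pitch in samples:
        target = item_under_gaze(yaw, pitch)
        if target != current:
            current, since = target, t          # gaze moved to a new item (or off the menu)
        elif current is not None and t - since >= DWELL_SECONDS:
            return current                      # held long enough: select it
    return None

# Simulated stream: the player turns toward "Quit" and holds their gaze there.
stream = [(0.0, 0.0, 0.0), (0.3, 10.0, 0.0), (0.6, 19.0, 1.0),
          (1.0, 20.5, 0.0), (1.8, 20.0, -1.0)]
print("selected:", select_by_dwell(stream))  # -> "Quit"
```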

In theory, at least, all of these interaction techniques could work simultaneously and in tandem with one another, letting players use whatever method best suits their individual needs.
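One way to picture this is an input-abstraction layer: each method, whether a controller, voice, head-tracking, or a BCI, is wrapped in a small adapter that emits the same abstract game actions, so the game never needs to know which device produced them. The sketch below is illustrative only; the adapters, action names, and events are invented rather than any vendor’s actual API.

```python
# Sketch of an "any input, same actions" layer: adapters publish abstract
# actions onto a shared bus, and the game subscribes to actions alone.
from typing import Callable, Iterable, List

ActionHandler = Callable[[str], None]

class ActionBus:
    """Fan-in point: any adapter can publish, the game subscribes once."""
    def __init__(self) -> None:
        self._handlers: List[ActionHandler] = []

    def subscribe(self, handler: ActionHandler) -> None:
        self._handlers.append(handler)

    def publish(self, action: str) -> None:
        for handler in self._handlers:
            handler(action)

def controller_adapter(bus: ActionBus, buttons: Iterable[str]) -> None:
    mapping = {"trigger": "fire", "grip": "grab"}
    for button in buttons:
        bus.publish(mapping.get(button, "noop"))

def voice_adapter(bus: ActionBus, phrases: Iterable[str]) -> None:
    mapping = {"fire": "fire", "grab that": "grab", "pause": "pause"}
    for phrase in phrases:
        bus.publish(mapping.get(phrase.lower(), "noop"))

def head_dwell_adapter(bus: ActionBus, dwelled_items: Iterable[str]) -> None:
    for item in dwelled_items:
        bus.publish(f"select:{item}")

# The game cares only about actions, never about which body part produced them.
bus = ActionBus()
bus.subscribe(lambda action: print("game received:", action))
controller_adapter(bus, ["trigger"])        # one-handed controller input
voice_adapter(bus, ["grab that", "pause"])  # spoken commands fill in the rest
head_dwell_adapter(bus, ["Options"])        # head-tracking drives the menus
```

The point is architectural: if the game consumes actions rather than raw device events, supporting a new input method for a player who needs one becomes a matter of writing one more adapter.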

The field of virtual reality is still in its infancy, but we are seeing an increasing number of major tech companies incorporate VR into their platforms in various ways and to varying degrees. There is no doubt that it will play a huge role in the future of computing. For this reason, UX designers and developers need to ensure that there are adequate, if not ample, alternative input methods.

Sam Berman has two years of experience teaching computer literacy as an educator and advocate for people with disabilities and the aging community. In the spring of 2016, he attended a web development boot camp at the New York Code & Design Academy with the hope of combining his passions for technology and advocacy into a career. As a consultant for Equal Entry, he draws on these experiences to make the web easier to use for people with disabilities.

Equal Entry
An accessibility technology company that offers services including accessibility audits, training, and expert witness work on cases related to digital accessibility.
