Mobile Accessibility Tools: Camera Switches and More

Image Description: Futuristic image of an open field with accessibility devices available to all

This article is based on a talk Shafik Quoraishee gave at A11yNYC.

Mobile has its own accessibility guidelines and standards, which align with the web accessibility guidelines and standards. Before diving in, here’s a quick history of accessibility on mobile devices.

Mobile devices haven’t been around for that long. The very first mobile devices were Nokia phones, BlackBerrys, personal digital assistants (PDAs), and Palm Pilots. The only accessibility features they had were text-to-speech and voice recognition.

The iPhone came out in 2007. Then in 2009, Apple added VoiceOver to the iPhone. That’s when mobile devices first had a built-in screen reader, touchscreen accessibility, and other accessibility tools.

Android followed in 2008 with its first commercially available device. In 2011, Google started adding TalkBack and more comprehensive accessibility options. The difference with Android is that the ecosystem is fragmented: there are many different Android devices from many manufacturers.

So, it took time to unify the Android accessibility options across all those devices. Later, in 2018, the W3C released the WCAG 2.1 standard, which incorporated mobile-specific accessibility guidelines.

Accessibility Shortcuts

Going through the settings menus every time you want to turn on an accessibility option, such as magnification or a screen reader, can be complicated. Setting up accessibility shortcuts makes toggling those options easy.

The standards are uniform across devices, and users can tailor the shortcuts to their preferences. You can trigger shortcuts with a hardware button, such as pressing and holding both volume buttons on Android or triple-clicking the side (power) button on an iPhone.

There are also advanced features, like AssistiveTouch on iOS and the Accessibility Menu on Android, that open accessibility options without a hardware button. For example, you can swipe down, tap the back of an iPhone, or tap an onscreen button to open the accessibility options.

With this variety of ways to create shortcuts, there’s an option that works for everyone. For example, some people may not want to press a button three times; instead, they can use an onscreen button or swipe down for accessibility options.

Switches

What’s interesting about Camera Switch Access is that the device can be configured to read facial expressions. You use your facial expressions to interact with your device: clicking on an app, interacting with it, playing a game, opening settings, navigating, and so on.

Camera Switch Access comes with almost every device, as most device cameras can recognize facial gestures. It lets users assign specific actions to specific gestures, which is valuable for people who may not want to interact with a device using their hands or voice. Facial gestures give people another means of using their devices.

If you can’t obtain a hardware switch, you can use the camera’s facial recognition to move around the device. A user can choose a gesture and the size of the gesture. For example, a small smile might be one gesture while a big smile is another.

The video contains a demo of how to set up the camera switch and use it to navigate and interact with an app. Switch access is multimodal, meaning you can interact with the phone through more than one method, such as visual gestures, vocal commands, and manual interactions. It’s also possible to mix switches and controls.
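On the developer side, switch access can only step through app controls that the platform knows about. Below is a minimal Kotlin sketch, assuming a standard Android custom view; the function name and label are illustrative, not from the talk.

```kotlin
import android.view.View
import androidx.core.view.ViewCompat
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat

// Illustrative helper: make a custom view reachable by Switch Access
// (and announceable by TalkBack).
fun makeSwitchNavigable(view: View, label: String) {
    // Switch Access can only step through views the framework reports
    // as focusable and important for accessibility.
    view.isFocusable = true
    view.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_YES

    // A label so the switch menu and screen readers announce the control.
    view.contentDescription = label

    // Relabel the click action so a switch user hears what activating
    // this control does, instead of a generic "Tap".
    ViewCompat.replaceAccessibilityAction(
        view,
        AccessibilityNodeInfoCompat.AccessibilityActionCompat.ACTION_CLICK,
        label,
        null // keep the view's existing click behavior
    )
}
```

The same labeling also helps every other input mode listed above, which is part of what makes switch access multimodal: one accessible control works with facial gestures, voice, and hardware switches alike.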

Another type of switch is a hardware switch. There are many different kinds, like the switches from AbleNet, which work with Android and iOS devices, tablets, computers, and anything with Bluetooth or USB. Hardware switches come in different forms, such as buttons, sip-and-puff devices, and joystick controls, and they can be configured and tailored to specific tasks or applications. The video contains a hardware switch demo.

Live Captions

Live Captions is a technology that generates captions for any source of audio on your device; it’s not tied to a particular video. For example, if a video doesn’t have captions, you can turn on Live Captions and the device will caption it. It also works with other sound sources on a device, such as audiobooks or a live conversation.

Live Captions works better in some situations than others. For example, it tends not to work well when someone is singing over music. It works best with high-quality sound that’s clear and free of background noise.

You can access Live Captions by selecting a button, which you can move around the screen. If you don’t want a button on the screen, you can add Live Captions to your Accessibility Shortcuts.
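For developers, Android also exposes the user’s system captioning preference through CaptioningManager. Here’s a minimal Kotlin sketch (the function name is my own, not from the talk) of how a media app could check that preference and default to showing its own captions:

```kotlin
import android.content.Context
import android.view.accessibility.CaptioningManager

// Returns true when the user has enabled captions in system settings,
// so a media app can show its own captions by default.
fun userPrefersCaptions(context: Context): Boolean {
    val captioningManager =
        context.getSystemService(Context.CAPTIONING_SERVICE) as CaptioningManager
    return captioningManager.isEnabled
}
```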

Accessibility Scanner for Android

Accessibility Scanner is an Android tool that’s like a color contrast checker, but more advanced: it checks for many different accessibility issues. The scanner scans an app and provides suggestions on how to improve the app’s accessibility, such as adding missing labels, enlarging small touch targets, and increasing color contrast.
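As an illustration, here’s a minimal Kotlin sketch of two fixes the scanner commonly suggests: adding a missing label and enlarging a small touch target. The function name and label text are hypothetical.

```kotlin
import android.util.TypedValue
import android.widget.ImageButton

// Illustrative fixes for two common Accessibility Scanner findings.
fun fixScannerFindings(button: ImageButton) {
    // "Item label" finding: give the image control a description
    // that screen readers and switch menus can announce.
    button.contentDescription = "Play"

    // "Touch target" finding: Scanner recommends at least 48dp x 48dp.
    val minPx = TypedValue.applyDimension(
        TypedValue.COMPLEX_UNIT_DIP,
        48f,
        button.resources.displayMetrics
    ).toInt()
    button.minimumWidth = minPx
    button.minimumHeight = minPx
}
```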

Be My Eyes

Vision is a dominant sense for many, as people’s eyes provide an endless stream of information. But that’s not the case for someone who is blind. Be My Eyes, powered by OpenAI’s GPT-4, offers a voice-first interface that tells the user what the artificial intelligence sees.

The user and the artificial intelligence can have a conversation. It can start with a simple “Tell me what you see.” The AI can describe everything about the environment and lets you ask questions about the scene. For example, if the AI mentions a bridge, the user can ask whether the bridge is currently up or down.

Accessibility switches and the other mobile tools covered here are useful to an enormous number of people, given how many of us rely on mobile devices today. And they’re becoming more and more relevant in everyone’s everyday life.

Resources

Video Highlights

Watch the Presentation

Bio

Shafik Quoraishee is a Games Engineer at The New York Times, who’s building out the games mobile platform, specifically in Android.

Before his involvement with Games, he worked in the media industry for several years, in almost all areas of application development, from full stack and data engineering to artificial intelligence. He has experience building features that are more accessible and usable for the user population. Connect with Shafik on Medium and X @SQuoraishee.

Equal Entry
Accessibility technology company that offers services including accessibility audits, training, and expert witness on cases related to digital accessibility.
