Accessible Multimodal Input in Augmented Reality Training Applications | Accessibility VR Meetup Recap

In this talk, Ashley Coffey shares research from PEAT (the Partnership on Employment & Accessible Technology) on making immersive hybrid workplaces more inclusive and accessible. She reflects on themes explored in an “Inclusive XR in the Workplace” white paper co-authored with the XR Association.

Tim Stutts shares his design experience with multimodal input and sensory feedback for augmented reality head-mounted displays. He touches briefly on his previous work on Magic Leap’s Lumin OS, including symbolic input and related accessibility efforts, before diving into the current Vuforia Work Instructions applications for HoloLens, RealWear, and, most recently, Magic Leap, marked by the debut of the Capture and Vantage applications on that platform.

We’ve pulled together the resources and highlights from this meetup. Explore them below, and watch the video.

Resources

Highlights
