Human-Centered AI
Autonomous is Not Enough: Designing Multisensory Mid-Air Gestures for Vehicle Interactions Among People with Visual Impairments

If fully autonomous vehicles (FAVs) are designed inclusively and accessibly, they will transform independence for millions of people worldwide who experience transportation-limiting disabilities. Although FAVs promise efficient transportation without driver intervention, a truly accessible experience must enable user input, for all people, in many driving scenarios (e.g., to alter a route or pull over during an emergency). Therefore, this paper explores desires for control in FAVs among people who are blind or visually impaired (n=23). Results indicate strong support for control across a battery of driving tasks, as well as the need for multimodal information. These findings inspired the design and evaluation of a novel multisensory interface leveraging mid-air gestures, audio, and haptics. All participants successfully navigated driving scenarios using our gestural-audio interface, reporting high ease of use. Contributions include the first inclusively designed gesture set for FAV control and insight regarding supplemental haptic and audio cues.