DOI: 10.1177/10711813251357895 ISSN: 1071-1813

Evaluating Multimodal Interfaces for Visually Impaired Users in Autonomous Ridesharing: A Usability Study

Gerui Xu, Mahwish Zaman, Cory Vogel, Yvonne Chen, Laura Weisz, Shan Bao

Transportation accessibility remains a critical challenge for visually impaired individuals, constraining their autonomy and societal participation. Although autonomous vehicles (AVs) hold transformative potential for enhancing mobility, prevailing human-machine interfaces (HMIs) frequently neglect the unique interaction requirements of this population. This study investigates the efficacy of a multimodal HMI explicitly designed to facilitate autonomous ridesharing interactions for visually impaired users. Employing a between-subjects experimental design (N = 24), we evaluated user trust and satisfaction across six core ridesharing functions under three distinct conditions: (1) visually impaired participants with multimodal (audio-visual) feedback, (2) non-visually impaired participants, and (3) visually impaired participants without audio feedback. Our findings demonstrate that audio-enhanced multimodal interfaces bridge the accessibility gap, enabling visually impaired users to attain trust and satisfaction levels statistically comparable to those of non-visually impaired users. Furthermore, the absence of audio feedback significantly degraded navigational confidence, vehicle identification accuracy, and overall user experience (p < .05). These results validate the importance of auditory cues in AV HMIs and empirically confirm design principles for universal accessibility. By providing actionable guidelines for inclusive interface design, this work advances equitable mobility solutions and underscores the imperative of user-centered autonomy in next-generation transportation systems.