The promise of augmented reality has tantalized researchers for over two decades. Now, the parallel emergence of mainstream wearable technologies, improved speech recognition, and auditory scene analysis creates new opportunities to employ augmented reality to address real-world accessibility challenges for people with disabilities. How can we, for example, support a user with a language impairment in finding difficult words when needed, sense and convey sound information to a user who is deaf or hard of hearing, or augment a blind person’s sense of touch to help interpret non-tactile information? In this talk, I will describe projects that address these questions as well as related issues, including basic concerns of what it means to make augmented reality accessible, the potential social implications of these technologies, and broader applications beyond accessibility.
Leah Findlater is an Assistant Professor in Human Centered Design & Engineering (HCDE) at the University of Washington (UW). She directs the Inclusive Design Lab, whose mission is to lower barriers to technology use and information access for users with a range of physical, sensory, and cognitive abilities. She has published over 60 papers in top-tier academic venues, nine of which have been recognized with Best Paper or Honorable Mention awards at ACM CHI. She holds an NSF CAREER Award, and her research is funded by NSF, the Department of Defense, Nokia, and Google. Before joining HCDE, she was on the faculty at the University of Maryland’s College of Information Studies and spent two years as an NSERC Postdoctoral Fellow at the UW Information School. She received her PhD in Computer Science from the University of British Columbia.