Presented at 8:00am in Colorado F on Wednesday, November 8, 2023.
#38098
Speaker(s)
- Guru Nanma Purushotam, University of Illinois Urbana-Champaign
- Vineeth Parashivamurthy
- Ann Fredricksen, Coordinator of Accessible Media Services, University of Illinois
- Lawrence Angrave, University of Illinois
Session Details
- Length of Session: 1-hr
- Format: Lecture
- Expertise Level: All Levels
- Type of session: General Conference
Summary
We present a wearable navigation aid, “Empowering Indoor Mobility.” By combining multichannel haptic feedback and advanced computer vision, users can navigate indoor spaces and avoid obstacles. Together with a formal presentation, there will be an opportunity to experience this technology. Further, we invite you to be part of a lively audience-wide discussion of the accessibility challenges of our educational physical spaces and how wearable accessible technology can empower individuals.
Abstract
Navigating indoor environments can be a significant challenge for people who are blind or have low vision. Navigating to a goal while avoiding obstacles is especially challenging when moving in unknown, changing, cluttered, or communal spaces, or around hazards at different heights (e.g., desk edges or low ceilings). While assistive technologies exist to identify and speak the names of visual items or to aid in navigation, they fall short of addressing a unique requirement of indoor mobility: avoiding obstacle collisions. In response to this issue, we present a novel wearable navigation aid designed specifically to address the challenges of navigating indoor settings.
Our proposed system, named Empowering Indoor Mobility, employs advanced computer vision techniques to identify obstacles and calculate an optimal walking path in real time. This information is then presented to the user through a lightweight wearable accessory that includes an array of haptic actuators. The accessory provides intuitive tactile feedback, allowing users to quickly sense and prioritize close versus more distant hazards and to understand and follow the suggested path, without the need for extensive training or adaptation. Vision and haptic processing are performed locally, i.e., the technology does not require internet connectivity.
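To make the vision-to-haptics pipeline concrete, here is a minimal sketch of how detected obstacles might be mapped to actuator intensities so that closer hazards produce stronger feedback on the actuator nearest their direction. This is a hypothetical illustration, not the authors' implementation; the actuator count, sensing range, and function names are assumptions.

```python
# Hypothetical sketch (not the presenters' actual code): map obstacles
# reported by a vision pipeline to per-actuator vibration intensities.

NUM_ACTUATORS = 5   # assumed left-to-right actuator array on the accessory
MAX_RANGE_M = 3.0   # assumed maximum sensing range, in meters

def actuator_intensities(obstacles):
    """obstacles: list of (bearing_deg, distance_m) pairs, with bearing
    in [-90, 90] relative to the wearer's heading.
    Returns a list of per-actuator intensities in [0, 1]."""
    intensities = [0.0] * NUM_ACTUATORS
    for bearing, distance in obstacles:
        if distance >= MAX_RANGE_M:
            continue  # too far away to signal
        # Map the bearing onto the nearest actuator index.
        idx = round((bearing + 90) / 180 * (NUM_ACTUATORS - 1))
        idx = max(0, min(NUM_ACTUATORS - 1, idx))
        # Closer obstacles produce stronger vibration; keep the strongest cue
        # per actuator so a near hazard is never masked by a far one.
        strength = 1.0 - distance / MAX_RANGE_M
        intensities[idx] = max(intensities[idx], strength)
    return intensities

# Example: an obstacle dead ahead at 1 m and one far to the left at 2.5 m.
print(actuator_intensities([(0, 1.0), (-80, 2.5)]))
```

In this sketch the center actuator vibrates strongly for the near obstacle ahead, while the leftmost actuator gives only a faint cue for the distant one, matching the "prioritize close versus more distant hazards" behavior described above.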
In this presentation, we discuss the development, implementation, and evaluation of the technology. We also look forward to a lively discussion and to hearing everyone’s ideas, comments, and feedback on how this and similar next-generation accessible digital technologies can be used to empower individuals in navigating indoor spaces.
Keypoints
- Empowering Indoor Mobility enhances navigation with advanced computer vision and real-time path optimization.
- The wearable accessory utilizes haptic feedback for intuitive hazard detection and path guidance.
- Local processing ensures independence from internet connectivity, promoting accessibility and reliability.
Disability Areas
Vision
Topic Areas
Assistive Technology, Research, Uncategorized
Speaker Bio(s)
Guru Nanma Purushotam
Nanma is currently pursuing a Master of Computer Science degree at the University of Illinois Urbana-Champaign, while also working as a graduate assistant at the Accessible Media Services Office in Disability Resources and Educational Services. Her recent projects involve enhancing transcription accuracy and developing a computer vision project to empower indoor mobility for individuals with visual impairments.
Vineeth Parashivamurthy
Vineeth holds a Bachelor of Engineering degree in Mechanical Engineering and is currently dedicated to developing an affordable wearable haptic navigation device, designed to assist visually impaired individuals in their daily navigation needs.
Ann Fredricksen
Ann Fredricksen has her BA in Physics from Carthage College and her MS/LIS degree from the University of Illinois Urbana-Champaign. She has been working for Disability Resources and Educational Services in the Accessible Media Services Office since 2008. She now serves as the Coordinator of Accessible Media Services, which provides accessible learning material for courses taught within the University’s system. In 2020 she was awarded the Lorine Y. Cowan Award for Excellence in Access and Accommodations from the Office of the Vice Chancellor for Diversity, Equity & Inclusion.
Ann has focused her career on media accessibility: she is not only the campus captioning expert but is also responsible for responding to inquiries and requests for information about audio description. She has created a captioning training course to serve as a resource for University of Illinois faculty and staff on how to meet accessibility standards with free or low-cost software already available to this population.
Lawrence Angrave
Lawrence Angrave is a Teaching Professor in the Computer Science department at the University of Illinois Urbana-Champaign (UIUC). His interests include digital accessibility and how students, especially underrepresented students, can succeed in on-campus and online learning environments. Through live captioning during lectures and online text-searchable videos, his ClassTranscribe project is helping all students become more effective learners.