How might we help visually impaired individuals navigate independently inside a building?
A mobile application that serves as an in-building navigation system. The app identifies a user's location within a building, takes user input in natural language, and provides turn-by-turn audio navigation instructions to their destination.
-Interviews, User Journey Mapping, Prototyping, User Testing
Sketch, Adobe Creative Suite, Zeplin, InVision
7 (2 Designers & 5 Engineers)
Buildings, cities, and maps are designed with sighted users in mind, which makes navigation a challenging task for visually impaired users. "All maps are visual and I can't see!" said one of our interviewees while talking about the challenges she faces in navigating inside a building.
What will the future campus look like, and more importantly, how accessible will it be? Beyster Bluepath is a step toward an accessible campus and is currently installed in the Bob and Betty Beyster Building at the University of Michigan, Ann Arbor. It was made possible by generous support from Vectorform.
A great deal of work has already been done on building products for people with visual impairments. So I started with secondary research, which helped me understand the different forms of visual impairment and the current products on the market. I went through app store reviews to get a better understanding of the pros and cons of these products. We then conducted 5 in-person interviews to understand what a typical day was like for someone with a visual impairment and to gain insight into their needs. Our interview protocols were designed to understand a day in the life of our users, their current navigation techniques, the obstacles they face, and how current systems may or may not address their needs. We also ran a focus group at a local low-vision support group to validate the needs we had gathered thus far. Apart from users, we also interviewed a family member and a mobility instructor to gain insight into the current support structure.
We then sorted the data from our interviews to identify user needs, fears and navigational cues. Analyzing this data led us to user personas that served as a holistic representation of the target user group.
USER JOURNEY / FLOW
As a UX designer, I was interested in the experience one has while walking through a space guided by someone else's instructions. So, I created a user journey map to discover opportunities and challenges. This led to the following design decisions:
1. Our system does not replace the cane; it complements it.
Our users place a great deal of trust in their cane to help them walk around a space. It helped them judge the surfaces they were walking on, detect obstacles, and sense turns in the hallway, and it served as a cue to others to give way. So, we are not looking to replace the current way of navigating. What our users needed were navigation instructions, a map. They usually had to depend on others to tell them where to go, or needed someone to walk with them. So, our system serves as the friend that gives them directions as they walk through a space.
2. Simple and understandable language for navigation instructions.
We do not want to overwhelm the user with complex instructions. So, the system uses these simple instructions: "Turn Left/Right", "Approaching Left Turn/Right Turn", and "Walk Straight/Continue Straight". The rationale behind "Approaching Left Turn/Right Turn" is that many users did not have an accurate perception of distance. They found more value in knowing that a turn was ahead, and they could then feel for the edge of the wall with their cane.
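To make the vocabulary above concrete, here is a minimal sketch (not the production code) of how those phrases might be chosen from the user's heading and distance to the next turn. The function name and the angle/distance thresholds are illustrative assumptions, not values from the actual system.

```python
def instruction(turn_angle_deg: float, distance_to_turn_m: float,
                announce_within_m: float = 3.0) -> str:
    """Map an upcoming heading change to one of the simple phrases.

    turn_angle_deg: signed angle between current heading and the next
    path segment (positive = left, negative = right), in degrees.
    distance_to_turn_m: how far ahead the turn is, in meters.
    """
    if abs(turn_angle_deg) < 20:            # effectively straight ahead
        return "Continue Straight"
    side = "Left" if turn_angle_deg > 0 else "Right"
    if distance_to_turn_m > announce_within_m:
        return "Continue Straight"          # too early to announce the turn
    if distance_to_turn_m > 0.5:
        return f"Approaching {side} Turn"   # cue the user to probe for the wall edge
    return f"Turn {side}"
```

The point of the two-stage phrasing is visible in the code: the "Approaching" cue fires well before the turn itself, so the user never needs to estimate distance.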
3. Clear feedback when the system does not understand user input.
Since the system takes input through voice instructions, errors are expected. The system repeats the destination back before it starts navigating, and asks the user to retry if it cannot parse a query.
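The confirm-and-retry flow described above can be sketched as follows. This is a hypothetical outline, not the app's code: `listen` and `speak` stand in for the real speech-recognition and text-to-speech APIs, and the phrases and retry count are assumptions.

```python
def get_destination(listen, speak, known_rooms, max_retries=3):
    """Ask for a destination by voice, confirming it back before navigating.

    listen: callable returning the transcribed user utterance (a string).
    speak: callable that reads a string aloud to the user.
    known_rooms: set of destination names the building map knows about.
    """
    for _ in range(max_retries):
        query = listen()
        dest = query.strip().lower() if query else ""
        if dest in known_rooms:
            speak(f"Navigating to {dest}.")   # repeat the destination back
            return dest
        speak("Sorry, I didn't catch that. Please try again.")
    speak("I couldn't understand the destination.")
    return None
```

Repeating the destination aloud before starting gives the user a chance to catch recognition errors early, rather than discovering them halfway down a hallway.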
4. Account for missed or incorrect turns.
The system continuously localizes the user as they walk through a space, so it knows when they are off track. In that case, we reroute the user along the shortest unambiguous path to the destination.
5. Detect obstacles
Hanging obstacles are a major cause of injury for visually impaired users because a cane cannot detect them. Our system uses depth tracking to sense obstacles within a threshold distance and alerts the user to them. To avoid overwhelming the user, this feedback is haptic (vibration) rather than audio.
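The obstacle check above reduces to a simple threshold test on the depth readings at head height. The sketch below is an assumption-laden illustration: the function name, the 1.2 m threshold, and the idea of pre-filtering the depth frame to head-height readings are all ours, not taken from the system's implementation.

```python
def should_vibrate(head_height_depths_m, threshold_m=1.2):
    """Return True if any head-height depth reading is within the threshold.

    head_height_depths_m: distances (in meters) to surfaces at head height,
    e.g. extracted from the rows of a depth frame above cane reach.
    Zero or negative readings are treated as invalid and ignored.
    """
    return any(0 < d < threshold_m for d in head_height_depths_m)
```

Keeping this channel haptic-only means the alert never competes with the spoken turn instructions for the user's attention.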
With a clear set of expectations for the system, our team experimented with different pieces of technology, such as Bluetooth beacons, RFID detectors, and depth sensors, to evaluate the advantages and disadvantages of each. We set these up in our homes to evaluate what would be suitable for the end user. We also visited the Detroit Institute of Arts to try their new augmented-reality mobile tour and get hands-on experience with Google Tango. We then compared the options on four criteria: accuracy, response rate, cost, and security. Weighing their pros and cons against user preferences, Google Tango emerged as the clear winner: it lets the user carry a single device that works without external infrastructure, whereas the alternatives would require institutions to install tags or beacons throughout the building. We decided to build a mobile application that elicits user input via natural language, locates the user inside a building, and guides them to their destination through voice commands.
Having narrowed down our choice of technology, we mapped the system components and their interactions. This helped the engineering team clearly understand the different modules of the system, how they interact, and how they would integrate into a single system.
Our primary user group will not use a visual interface; they interact with the application entirely through audio. We designed a simple UI for the convenience of users with low vision or mild visual impairments.
THE TEAM. THE TEAM. THE TEAM.
(From left to right) Asha Shenoy, David Cao, Minh Tran, Jonathan Hamermesh, Moran Guo, Aaron Tang