The existing tools that help Visually Impaired (VI) people in navigation are inefficient in communicating the relevant spatial information, especially to children. Without any aid, it is difficult for VI people to follow a certain direction or a straight path. Thus, VI children depend on other people and infrastructure to navigate.
S.A.M. is a System for Assistive Movement, designed around the needs of visually impaired people to help them during navigation. The solution uses 3D audio to provide directions to the user, building on the analogy of a ‘companion’, to give users a stress-free, independent journey experience.
User Research, Co-Workshop Design & Facilitation, Conceptualization, Testing
12 Months, Bachelor's Thesis Project 2018-19
Globally, the number of visually impaired people of all ages is estimated to be 285 million, of whom 39 million are blind. Yet only one in ten disabled people has assistive technology and products at their disposal. The project started with the aim of exploring the lives of visually impaired people and seeking easily accessible, effective solutions to improve their lives.
The process of the whole project is divided into three parts - Discover, Define and Develop. The Discover phase included initial research to find the opportunity areas. In the Define phase, the problem was clearly defined to seek possible solutions. Lastly, in the Develop phase, the solution was detailed, prototyped and tested.
I conducted primary research in the Guwahati Blind School in Assam, India. Furthermore, I did literature research and market analysis to better understand the overall scenario.
Literature research was done to understand specific problems faced by VI people and further find opportunity areas where intervention is needed. Several research papers and upcoming innovations in the field were studied around design for accessibility and visual impairment.
I conducted two telephonic interviews with VI adults and two in-person interviews with children during a full-day visit to Guwahati Blind School. During the visit, I observed their activities, the problems they face with those activities, and how they adapt to them.
The competitor analysis was done to understand the existing solutions and technologies available for VI people - the kinds of interactions and modes of communication involved. It also looked at the availability of these solutions in India from the point of view of access and cost.
Modes of Interaction
The technologies found were analyzed on the basis of the interaction involved in the three phases of communication - reception from source, analysis and interpretation, and retention or action. These were mainly devices based on the senses of touch and hearing.
After analyzing the findings of the initial secondary and primary research, indoor navigation emerged as a critical issue in VI people's daily routines. Some of the key pain points during navigation were:
There is no way to instantaneously read the information on signboards in the path.
Without references, VI people are forced to rely heavily on memory to follow certain routes.
The inability of technology to communicate spatial information effectively.
The key problem identified was that without any visual cues, people tend to walk in arcs or circular paths. The project brief was therefore to enable visually impaired people to walk in a given direction independently, for a seamless indoor navigation experience.
Existing Solutions for Navigation
The competitor analysis revealed a lack of empathy and efficiency in the existing solutions; communicating the right directions remained a persistent issue. Further, most solutions were not available in India.
To improve the creative process and remove the bias of sighted designers, the approach taken was to involve visually impaired people in the ideation process. Beyond solution-oriented observations, the co-design methodology surfaced insights about the challenges of co-designing with blind children.
Read the full paper about the co-design methodology and workshop here.
The key insights that emerged from the co-design session were:
VI people are highly aware of surrounding sounds and noises, whether listening to people talking or following a voice or signal for direction. During the activity, they were observed signaling each other by snapping their fingers, using the sound to guide one another's direction.
They always feel at ease when accompanied by a friend or person. They often formed chains by either holding hands or keeping their hands on the shoulders of the next person. In fact, most of them had their navigation partners to move around in the school.
They depended heavily on the memorized layout of places they are familiar with. Whenever they visit a new place or building, they face multiple issues in understanding its layout, since most places are not designed with accessibility considerations.
Ideation started with system-level ideas. Drawing on the research insights, a simple scenario was broken down into small actions, and for each point where information had to be exchanged between the system and the user, different interactions were explored. The kinds of interactions depended on the device being used. The core question was how seamlessly and naturally information about the environment could be delivered to VI people. Currently, a user in an unfamiliar space discovers things at random, by touch, rather than learning what is actually relevant. The goal was a solution that makes the user aware of only the relevant things, in a subtle way. From these explorations, sound emerged as a necessary component.
3D Spatial Sound
To convey directional information, 3D spatial sound was explored instead of traditional mono audio. Spatial sound not only makes the instructions feel natural to follow, but is also efficient in guiding a person along a straight path: the user simply walks toward the sound placed in front of them.
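To illustrate the principle, the perceived direction of a virtual sound source can be approximated with a constant-power stereo pan. This is a simplified sketch of the idea, not the project's implementation (real 3D audio engines use HRTFs and head tracking), and the function name is my own:

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan: map a source azimuth
    (-90 = hard left, 0 = straight ahead, +90 = hard right)
    to (left, right) channel gains."""
    # Clamp the azimuth and map it to a pan position in [0, 1].
    clamped = max(-90.0, min(90.0, azimuth_deg))
    pan = (clamped + 90.0) / 180.0
    # Constant-power pan law keeps perceived loudness steady
    # as the source moves from left to right.
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right
```

A source straight ahead yields equal gains in both ears, so the user hears it centred and can keep walking toward it; as the source pans left or right, the gain imbalance signals a turn.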
Two sets of information needed to be delivered to the user: one about their immediate surroundings (micro navigation) and another about their overall route (macro navigation). To differentiate these sets and avoid confusion, two different characters were ideated, each delivering one set of information.
The concept was detailed in three steps: defining the virtual personalities, the sound-design associations, and the physical component design.
From the research, it was observed that dogs are often used by people as navigation aids and companions. Hence, for the initial idea, a virtual dog was considered, along with a narrator-like persona to help the users with other instructions. The concept could be expanded in the future to incorporate other pets or personas of the user's choice. Some of the keywords considered while defining the characters were soulmate, friend, ally, buddy, chum, super you, your free spirit, pal, intimate. Below is a mood board created while detailing the personas.
A friendly companion and guide who is active, trustworthy, loyal, intelligent, cheerful and caring. He is always around, ready and excited for a walk. The dog helps the user in micro navigation through non-verbal communication. The meanings associated with Bruno's sounds - bark, woof, footsteps, sniff and so on - were explored in the next step.
An assistant or narrator who is helpful, amiable, reliable, considerate, affable, resourceful and always up for exploration. Sam can be played by either a male or female voice, as per user preference, and provides the user with verbal information. While Sam is crisp and to the point when giving instructions during macro navigation, he/she is friendly and funny during longer conversations. Blind people often get stressed when travelling alone, so the narrator keeps a light mood that can bring a smile to the user's face and put them at ease. In contrast to Bruno, who is always around, Sam appears when needed or called upon for help. Closely connected to the community, Sam can access data from other members for new destinations and routes, and can connect the user to family and friends.
Initially, a basic understanding was gathered of different types of dogs, the general activities they perform and the sounds associated with them. To understand what these sounds mean to humans, a survey was conducted with forty-seven sighted people aged 20-53. The survey briefly introduced the idea of a virtual dog guiding one during navigation; participants were then asked to listen to various dog sounds and associate each with an action to the best of their understanding. The responses indicated that the natural action associated with a bark was ‘stop/wait’, with a walk (footsteps) ‘keep going’, with a sniff ‘look around’ and with a woof ‘call/wait’.
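The survey results above amount to a small sound-to-action vocabulary, which could be represented in a system like this (a minimal sketch; the mapping values come from the survey, while the names and lookup function are illustrative, not from the project):

```python
# Sound-to-action associations from the survey of 47 sighted
# respondents (ages 20-53). Keys are Bruno's sounds; values are
# the actions most participants naturally associated with them.
SOUND_ACTIONS = {
    "bark": "stop/wait",
    "footsteps": "keep going",
    "sniff": "look around",
    "woof": "call/wait",
}

def action_for(sound: str) -> str:
    """Return the action a sound should signal, or 'unknown'."""
    return SOUND_ACTIONS.get(sound, "unknown")
```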
Physical Component Design
The need was for a wide-angle camera unit that can capture the user's surroundings and collect details about the location and path. This data is processed to recognise scenes, the user's location, obstacles, people and surroundings. In the future, the camera unit could be integrated into glasses with earpieces embedded in them as well.
3D audio output is provided through open-ear headphones that let in ambient sound as per user preference. Since 3D sound requires an earpiece in both ears, the user would be cut off from environmental sounds without an open-ear design. Secondly, the headphones include head tracking, so that the audio is independent of the user's head movement and sound sources stay fixed in space.
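The head-tracking behaviour boils down to a simple relation: the azimuth at which a sound is rendered is the source's world bearing minus the user's head yaw, so the source stays put when the head turns. A minimal sketch (function and parameter names are my own, not the project's):

```python
def relative_azimuth(source_bearing_deg: float, head_yaw_deg: float) -> float:
    """Angle of a world-fixed sound source relative to where the
    user's head is pointing, wrapped to (-180, 180].
    Negative = source is to the user's left, positive = right."""
    rel = (source_bearing_deg - head_yaw_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0
    return rel
```

For example, a guide sound due north stays due north: if the user turns their head 90° to the right, the sound is re-rendered 90° to their left, which is exactly what keeps a virtual companion "fixed" ahead on the path.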
To test the overall concept, user testing was done with five blindfolded sighted users. The working assumption was that blind people are more attuned to surrounding sounds and better able to detect directional sound than sighted people, so if blindfolded sighted people could follow the concept, it should work for visually impaired people as well.
Setup | The Wizard-of-Oz technique was used to create the effect of surround sound: a person held a phone producing the required sound in front of the user, panning it left or right according to the direction in which the user needed to be guided. A soundboard application on the phone was used to play the sounds.
• To test the hypothesis that VI people can walk in a straight path by following a source of sound in front of them
• To check the accuracy with which users are able to turn/rotate by following the sounds
• To understand how comfortable users are in navigating using the chosen sounds and with the meanings associated with the sound
• To find out how much time the activity takes and its overall efficiency