EchoLearn
WHAT IS IT?
EchoLearn is an experimental pedagogy and mobile VR simulator that gives the blind and visually impaired community equal access to echolocation distance education.
In the experience, users can directly perceive 3D objects and spatial settings in VR by emitting virtual mouth-clicks and interpreting the simulated, customizable sound reflections.
Through the VR exercises, they can learn to associate sound reflections with basic physical attributes, recognize daily objects, and navigate life-like scenarios through echolocation.
Context
Thesis Project at Parsons School of Design, Sept. 2020 - May 2021
Echolocation Consultant
Thomas Tajo
Instructors
John Roach, Barbara Morris, Anezka Sebek, Anna Harsanyi
Tag
Echolocation VR Exercises · Active Acoustic Simulation · Accessible VR Navigation
00:00-00:11 Design Brief · 00:12-00:55 Mockup Test · 00:56-01:24 How It Works · 01:25-01:42 User Scenarios
HOW DID I GET THERE?
1. CONTEXT
HUMAN ECHOLOCATION
Although it is best known as a navigational method used by animals like bats and whales, echolocation is also a direct perceptual and navigational technique used by a subset of blind individuals to improve their independence and mobility.
By interpreting the echoes of their mouth-clicks, echolocation users can perceive surrounding objects and understand features such as size, shape, location, distance, motion, and surface texture.
Human echolocation lets blind man 'see'
Video courtesy of CNN
ECHOLOCATION EDUCATION
Echolocation is advocated by international organizations like Vision Inclusive and Visioneers, which provide onsite echolocation training to blind and visually impaired individuals.
The dominant onsite training pedagogy progresses from trusting one's hearing, to recognizing everyday objects, to navigating environments through echolocation.
Echolocation Training 1
Video courtesy of Great Big Story
Echolocation Training 2
Video courtesy of Great Big Story
ACCESSIBLE INTERACTIVE MEDIA
Blind and visually impaired individuals can access most smartphone functionality through built-in screen readers like iOS VoiceOver and Android TalkBack, which describe the visual elements and interaction affordances on screen.
Auditory descriptions help sightless users navigate their screens using simple, standardized hand gestures.
Onsite observations suggest that sightless individuals tend to own high-quality headphones for richer listening. As equipping headphones with motion sensors is a current trend among hearables manufacturers, there is an opportunity to build immersive, head-tracked acoustic experiences for sightless audiences without requiring any extra device purchase.
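To illustrate how headphone head tracking can drive an immersive acoustic experience, here is a minimal sketch of equal-power stereo panning steered by head yaw. The function name and angle conventions are my own assumptions, not part of EchoLearn's implementation:

```python
import math

def stereo_gains(head_yaw_deg: float, source_azimuth_deg: float):
    """Equal-power stereo panning driven by headphone head tracking.

    As the listener turns their head, the source's azimuth relative
    to the head changes, and the left/right gains shift accordingly.
    Angles are in degrees; positive azimuth is to the listener's right.
    """
    # Azimuth of the source relative to the current head orientation,
    # wrapped into [-180, 180).
    rel = (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Map [-90, +90] degrees onto a pan position in [0, 1] (0 = hard left).
    pan = (max(-90.0, min(90.0, rel)) + 90.0) / 180.0
    # Equal-power law keeps perceived loudness constant across the arc.
    left = math.cos(pan * math.pi / 2.0)
    right = math.sin(pan * math.pi / 2.0)
    return left, right

# A source straight ahead is heard equally in both ears;
# turning the head 90 degrees left places it hard right.
print(stereo_gains(0.0, 0.0))
print(stereo_gains(-90.0, 0.0))
```

A real implementation would feed live orientation from the headphone's motion API into `head_yaw_deg` every audio frame; the sketch only shows the geometry.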
iOS Voiceover Control
However…
Although echolocation holds promise as an advanced navigational skill for sightless individuals' improved independence, echolocation education and advocacy remain constrained for two reasons:
Beginners' Untrained Hearing
unable to distinguish and interpret information from subtle sound reflections
Scarcity of Qualified Instructors
resulting in unattainable educational resources, a problem worsened by the global pandemic
Moreover…
“(Technology interventions) can be useful, but these are largely from the perspective of trying to give blind people sight back and forcing everybody to see the world (through vision) … I’m more supportive that sightless individuals should try to ‘see’ with the senses that work for themselves”
-- Thomas Tajo, a blind disability activist & echolocation instructor
So I Wonder…
How might interactive technology interventions enable equal access to echolocation educational resources and simplify echo/reverb perception for echolocation beginners?
2. SYNTHESIS & STRATEGY
Partnering with Thomas Tajo, a Belgium-based echolocation educational authority, I proposed EchoLearn as a distance education solution.
Strategically, EchoLearn simulates adjustable sound reflections in VR to simplify sound perception for beginners and to enable echolocation self-practice and remote training.

Thomas Tajo
Echolocation Instructor
Brussels, Belgium

Jerry Tan
Interaction Designer
New York, NY
Customizable Acoustic Simulation
EchoLearn reproduces the echolocation experience through real-time active acoustic simulation in VR. By double-tapping to emit virtual mouth-clicks and interpreting the simulated sound reflections, users can directly experience 3D objects and spatial settings. The sound-reflection simulations are customizable to match users' varying levels of hearing perception.
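The physics behind such a simulation can be sketched in a few lines: a click's echo from a surface returns after the round-trip travel time, attenuated with distance. The function below is a simplified illustration under a flat-surface, inverse-distance assumption; the `gain_scale` parameter is my stand-in for EchoLearn's customizable reflection level, which could let beginners hear exaggerated echoes:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def echo_of_click(distance_m: float, gain_scale: float = 1.0):
    """Delay and relative level of a mouth-click echo from a flat surface.

    The click travels to the surface and back, so the delay is 2d / c.
    Amplitude falls off with the round-trip distance (simple 1/r law);
    `gain_scale` is a hypothetical knob for boosting echoes for beginners.
    """
    round_trip = 2.0 * distance_m
    delay_s = round_trip / SPEED_OF_SOUND
    amplitude = gain_scale / round_trip       # 1/r spreading loss
    level_db = 20.0 * math.log10(amplitude)   # relative to the source click
    return delay_s, level_db

# A wall 2 m away echoes back after roughly 11.7 ms.
delay, level = echo_of_click(2.0)
print(f"delay = {delay * 1000:.1f} ms, level = {level:.1f} dB")
```

Raising `gain_scale` above 1.0 makes a distant object's echo as loud as a nearer one's, which is one plausible way to simplify perception for untrained hearing.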
Instructed Self-Practicing & Remote Training
EchoLearn provides exercises for beginners to associate sound reflections with basic physical attributes, recognize daily objects, and navigate life-like scenarios through echolocation. They can self-practice following in-app instructions or receive remote training by connecting with an echolocation instructor through the app.
It can also intervene early in vision deterioration or vision loss, serving as a transitional training method until a beginner is ready for advanced onsite training.
3. PEDAGOGY DESIGN
Collaborating with Thomas Tajo, I looked closely at the current onsite pedagogy, searching for its underlying patterns and for a possible marriage with interactive technologies.
We synthesized the current pedagogy into five core exercises and reproduced them in VR, making the training customizable, personalized, scalable, and remote.
4. TECH IMPLEMENT