    NOVA | Tech & Engineering

    Blind People Could Someday Scale Mountains—Using Their Tongue As a Guide

    By Allison Eck, NOVA Next

    Erik Weihenmayer is blind—but that hasn’t stopped him from climbing mountains.

    By tapping into BrainPort—a device that converts visual stimuli into delicate electrical vibrations on the tongue—he can perceive the size, distance, and depth of craggy landscapes ahead. His mouth serves as an alternate medium through which he can learn to “see.” It’s just one example of a trend known as “sensory substitution”—the act of substituting one sensory experience for another.

    Here’s Dana Smith, writing for The Crux:

    An array of 400 electrodes on an area a little larger than a postage stamp sits on the tongue and receives input from a video camera hooked up to a set of snazzy sunglasses, à la Google Glass. The visual signal is processed through a small computer connected to the device, with the camera pixels corresponding to different electrodes in the array. This visual information is thus translated and spit out as electrical pulses on the tongue, varying in intensity, duration, location, and number depending on the incoming signal. The researchers describe the pulses as feeling like bubbles or sparkling water on the tongue.
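    The core idea of the pixel-to-electrode mapping can be sketched in a few lines of code. BrainPort's actual signal processing is proprietary, so this is only a toy illustration of the principle: downsample a grayscale camera frame to a 20 × 20 grid (400 "electrodes") and map the average brightness of each region to a pulse intensity between 0 and 1.

    ```python
    # Hypothetical sketch, not BrainPort's real algorithm: map camera
    # pixels onto a 20x20 electrode array by averaging brightness.

    def frame_to_electrode_array(frame, grid=20):
        """Downsample a 2D list of pixel brightnesses (0-255) into a
        grid x grid array of pulse intensities in [0, 1]."""
        rows, cols = len(frame), len(frame[0])
        rh, cw = rows // grid, cols // grid
        array = []
        for gr in range(grid):
            row = []
            for gc in range(grid):
                block = [frame[r][c]
                         for r in range(gr * rh, (gr + 1) * rh)
                         for c in range(gc * cw, (gc + 1) * cw)]
                row.append(round(sum(block) / len(block) / 255, 3))
            array.append(row)
        return array

    # A 40x40 test frame: bright on the left half, dark on the right.
    frame = [[255] * 20 + [0] * 20 for _ in range(40)]
    electrodes = frame_to_electrode_array(frame)
    print(electrodes[0][0], electrodes[0][19])  # 1.0 0.0
    ```

    In the real device, each of those intensity values would modulate the pulse delivered by one electrode; the bright left edge of the scene would register as stronger "sparkling" on the left side of the tongue.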


    Sensory substitution devices employ touch, as well. The Robotics and Mechanisms Laboratory at Virginia Tech, led by Dr. Dennis Hong, has developed a “haptic” car for blind drivers based on capabilities derived from the self-driving car:

    Lasers in the front of the car serve as the automobile’s eyes, collecting information about obstacles and the boundaries of the road. Specialized gloves then relay this information to the driver to help them steer, gradually vibrating the fingertips on either hand in the direction the car should be turned. A vibrating chair provides guidance on optimal speed: the placement and intensity of the vibrations tell the driver to speed up, slow down or come to an emergency stop. Finally, air puffs coming out of a tablet-like device located next to the driver create a map on their palms and fingertips to help them navigate the road ahead. This allows the driver to make advanced decisions, giving them greater independence over the automated feedback from the car.

    The car has been taken out for a handful of successful test drives, and the National Federation of the Blind has shown its support for the project, as such tools empower visually impaired people and grant them more autonomy in their daily lives.

    Last October I reported on yet another example of a sensory substitution device that, if successful, could allow blind people to mimic bats’ state-of-the-art echolocation strategies. Anyone can echolocate on their own—the most basic definition of the term just refers to the act of making noise to help gauge one’s location in space. But this sensor would emulate a bat’s broadband chirps in immense detail. The user would learn to interpret the echoes in order to get around:

    Here’s how it works. Say a big brown bat is on the prowl for food. The higher and lower pitches of its chirps detect different objects. As it searches for small prey—like a nearby June bug—it listens for subtle changes in the higher pitches of the echoes. This helps it recognize an insect by its “acoustic texture,” a detailed bit of information that’s similar to what a human might see under a microscope. Then, like the flip of a lens in an optometrist’s office, it starts hearing lower frequencies, which help it to avoid bigger objects, like a tree or another bat. Thanks to the sound filtering provided by the speedy gap junctions, big brown bats can discriminate between frequencies and time delays in just a few millionths of a second. In one fell swoop, they have all the frequencies they need to swoop through a three-dimensional soundscape.
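    The distance cue at the heart of echolocation is simple enough to write down: a chirp travels out to an obstacle and its echo travels back, so distance is half the round-trip time multiplied by the speed of sound. This minimal sketch is not from the article; it just makes the arithmetic a device would perform concrete, assuming sound travels at roughly 343 m/s in air.

    ```python
    # Minimal echolocation arithmetic: round-trip echo delay to distance.

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

    def echo_distance(delay_s):
        """Distance to a reflector in meters. The sound travels out
        and back, so the round trip is divided by two."""
        return SPEED_OF_SOUND * delay_s / 2

    # An echo returning 20 milliseconds after the chirp:
    print(round(echo_distance(0.020), 2))  # 3.43 meters
    ```

    The microsecond-scale discrimination the article describes is what makes a bat so much better at this than a naive human: at 343 m/s, a few millionths of a second of timing precision corresponds to distance differences of under a millimeter.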

    An echolocation device for humans would attempt to replicate this process. Information received from echolocation devices and other sound-based technologies, like “the vOICe,” stimulates visual areas of the brain that would otherwise remain inactive. None of these devices are permanent solutions, though they are incredibly helpful to researchers and provide a glimpse at what might be possible in the future.

    Funding for NOVA Next is provided in part by the Eleanor and Howard Morgan Family Foundation.

    Major funding for NOVA is provided by the David H. Koch Fund for Science, the NOVA Science Trust, the Corporation for Public Broadcasting, and PBS viewers.