Robotics can help the blind to navigate
Technologies that help machines navigate are being adapted to help blind people find their way around.
Robots need help navigating their surroundings and use sophisticated location systems to keep track of their position. Now the same technologies are being adapted to help blind people navigate indoor and outdoor spaces independently.
One such system, being developed by Edwige Pissaloux and colleagues at the Institute of Intelligent Systems and Robotics at the Pierre and Marie Curie University in Paris, France, consists of a pair of glasses equipped with cameras and sensors like those used in robot exploration.
The system, unveiled at a talk at the Massachusetts Institute of Technology this spring, produces a 3D map of the wearer’s environment and their position within it that is constantly updated and displayed in a simplified form on a handheld electronic Braille device. It could eventually allow blind people to make their way, unaided, wherever they want to go, said Pissaloux.
“Navigation for me means not only being able to move around by avoiding nearby obstacles, but also to understand how the space is socially organized — for example, where you are in relation to the pharmacy, library or intersection,” she said.
3D tactile maps
Two cameras on either side of the glasses generate a 3D image of the scene. A processor analyses the image, picking out the edges of walls or objects, which it uses to create a 3D map.
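The article doesn't describe the edge-extraction algorithm itself, but the idea of picking out boundaries where depth changes sharply can be sketched simply. The following is a minimal illustration, not the Paris team's actual code: it assumes a depth map as a 2D list of distances (in metres) and a hypothetical `threshold` parameter for what counts as an edge.

```python
def edge_mask(depth, threshold=0.3):
    """Mark pixels where measured depth jumps sharply between
    neighbours -- a crude stand-in for detecting the edges of
    walls and objects in a stereo-derived depth map."""
    rows, cols = len(depth), len(depth[0])
    edges = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # compare each pixel with its right and lower neighbours
            if c + 1 < cols and abs(depth[r][c + 1] - depth[r][c]) > threshold:
                edges[r][c] = True
            if r + 1 < rows and abs(depth[r + 1][c] - depth[r][c]) > threshold:
                edges[r][c] = True
    return edges
```

A real system would use a calibrated stereo pipeline and more robust filtering, but the principle is the same: discontinuities in depth trace the outlines that go into the 3D map.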
The system’s collection of accelerometers and gyroscopes — like those used in robots to monitor their position — keeps track of the user’s location and speed. This information is combined with the 3D image to determine the user’s position in relation to other objects.
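Tracking position from accelerometers and gyroscopes is classic dead reckoning: integrate rotation rate to get heading, acceleration to get speed, and speed to get position. A toy single-step update, purely illustrative (the function name and simplified 2D model are my own, not from the system described):

```python
import math

def dead_reckon(x, y, heading, speed, accel, yaw_rate, dt):
    """One dead-reckoning step: the gyroscope supplies yaw rate
    (rad/s), the accelerometer supplies forward acceleration
    (m/s^2), and both are integrated over timestep dt (s)."""
    heading += yaw_rate * dt      # integrate rotation -> heading
    speed += accel * dt           # integrate acceleration -> speed
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading, speed
```

Because each step integrates noisy sensor readings, errors accumulate over time, which is why such estimates are normally combined with other information, here the 3D camera image, to pin the user's position to real objects.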
The system generates almost 10 maps per second, which are transmitted to the handheld Braille device to be displayed as a dynamic tactile map. The Braille pad consists of an 8-centimeter-square grid of 64 taxels — pins with a shape memory alloy spring in the middle. When heat is applied to the springs, they expand, raising the pins to represent boundaries.
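Rendering a detailed map on 64 pins means collapsing it to an 8x8 grid of raised-or-lowered states. One plausible reduction, sketched here as an assumption rather than the device's documented behaviour, raises a pin whenever any cell in its patch of the map contains a boundary:

```python
def to_taxel_grid(boundary_map, size=8):
    """Downsample a boolean boundary map to a size-by-size grid of
    taxel states: a pin is raised (True) if any cell in the patch
    it covers contains a detected boundary."""
    rows, cols = len(boundary_map), len(boundary_map[0])
    grid = [[False] * size for _ in range(size)]
    for r in range(rows):
        for c in range(cols):
            if boundary_map[r][c]:
                # map source cell (r, c) onto its taxel patch
                grid[r * size // rows][c * size // cols] = True
    return grid
```

At almost 10 maps per second, each such grid would be held for roughly a tenth of a second before the pins are refreshed.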
The Braille version of the map is updated fast enough for a visually impaired wearer to pass through an area at walking speed, said Pissaloux. Seth Teller, who develops assistive technologies at MIT, called the work exciting and ambitious.
This is not the only robotics technology being repurposed. Software that predicts how far a robot has travelled based on information from its on-board sensors is being modified to track a person's movements based on their stride length.
The low-cost system, being developed by Eelke Folmer and Kostas Bekris at the University of Nevada, Reno, would help blind people navigate around buildings using just a smartphone.
The new system uses freely available 2D digital indoor maps and the smartphone’s built-in accelerometer and compass. Directions are provided using synthetic speech.
To help the smartphone calibrate and adjust to a user’s individual stride length, the user must initially use touch to detect the landmarks in their environment, such as corridor intersections, doors and elevators.
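The calibration idea is straightforward: if the user confirms by touch that they have walked between two landmarks a known distance apart, the known distance divided by the steps counted gives their stride length, and step counting thereafter yields distance travelled. A minimal sketch of that arithmetic, with hypothetical function names not taken from Folmer and Bekris's software:

```python
def calibrate_stride(landmark_distance_m, step_count):
    """Estimate stride length from a walk between two landmarks
    whose separation is known from the indoor map."""
    return landmark_distance_m / step_count

def distance_walked(steps_taken, stride_m):
    """Dead-reckon distance along a corridor from counted steps."""
    return steps_taken * stride_m
```

In practice the smartphone's accelerometer detects each step and the compass supplies heading, so the same step count can be turned into a position on the 2D map rather than just a distance.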
Virtual assistants
A virtual assistant can help blind people explore their surroundings. Developed by Suranga Nanayakkara at the MIT Media Lab, EyeRing consists of a ring equipped with a camera, and a set of headphones. The user points the ring at an object they are holding and uses voice commands to say what they need to know — the color of an item of clothing, say, or the denomination of paper money.
The ring takes a picture of the object, which is transmitted wirelessly to a cellphone, where software analyses the image. The required information is then read out by a synthesized voice. It was presented at the Conference on Human Factors in Computing Systems in Austin, Texas, in May.
— © 2012 New Scientist Magazine. All rights reserved. Distributed by Tribune Media Services, Inc.