Navigation and orientation technologies offer practical solutions for people with vision loss.
In Part 1, Wolynski and Matos described accessible GPS and wearable navigation technology. Apps such as Lazarillo, BlindSquare, and HapticNav make navigation accessible by pairing smartphone capabilities (location services, GPS, etc.) with navigation resources for people with vision loss, such as audible turn-by-turn directions and information on upcoming intersections and points of interest. External devices, such as the WeWALK Smart Cane, Ara, biped, and SideSight, can further enhance navigation with cameras, sensors, and haptic actuators that alert users to obstacles. Some work with the GPS apps previously mentioned, and others leverage artificial intelligence (AI) to communicate with the user. In Part 2, Wolynski and Matos discuss technology that helps people with vision loss navigate traffic and indoor environments, where precise GPS may be unavailable.
Knowing when it is safe to cross a busy street can pose a challenge for someone with vision loss. The Oko application by AYES aids with street crossing by combining the iPhone’s camera with AI software. With the iPhone camera properly oriented, a user receives, in real time, the status of pedestrian traffic signals via sound, haptic vibration, or a visual on the screen, indicating whether a signal reads “Don’t walk” or “Walk” or is about to change. The app is free to use and available only on the iPhone platform, and the company notes that it is not meant to replace orientation and mobility skills.1 However, Matos says it adds another layer of protection and confidence when crossing streets in a busy city. The company reports that GPS routes and mapping will soon be added.
Figure 1a. NaviLens code at the M66 bus stop in New York, New York, gives bus information, including time of next bus arrival. (Image courtesy of Bryan Wolynski, OD, FAAO)
Figure 1b. NaviLens app recognizing a NaviLens code used for indoor orientation, giving information about the Lighthouse Guild Technology Center in New York, New York. (Image courtesy of Bryan Wolynski, OD, FAAO)
Another approach to accessible travel is to create accessible signs. Based on this concept, NaviLens has developed an accessible code. Unlike a QR code, a NaviLens code is designed to be easily and instantly recognized by a smartphone’s camera from long distances and extreme angles. The square-shaped code has a black and white border surrounding a grid of smaller colored squares (Figure 1), and it can be used for outdoor or indoor signage. Through a free application on iOS or Android, a user scans the environment for a code with the smartphone camera. Once the code is recognized, auditory information preprogrammed into it is given instantly. Although not meant to give step-by-step navigation, the information can help orient an individual to their surroundings, give points of interest (POI), or direct someone toward the code with its magnet feature. NaviLens codes are currently used for indoor signage on building directories and museum exhibits and for outdoor orientation in public spaces. As part of a pilot project, the Metropolitan Transportation Authority in New York, New York, has NaviLens codes displayed at select train stations, bus stops, and routes, giving entrance and other physical orientation information, including next transport arrivals.2 Feedback can also be spoken in other languages, no matter which language the data were initially entered in, making the codes useful for everyone and especially helpful for tourists. NaviLens codes are also being adopted for accessible product packaging.3
Since the Americans with Disabilities Act was passed in 1990, braille has been included on indoor signage to make it accessible. However, relatively few people with vision impairment read braille, and even those who do can find it difficult to locate on signage. Other technology solutions for those with vision loss have been researched, focusing on sensor networks such as ultrawideband, Bluetooth low-energy (BLE) beacons, radio-frequency identification (RFID), near-field communication (NFC) tags, computer vision/camera systems, or a combination of these options.4-6
The BLE beacon is a small, lightweight, energy-efficient, short-range transmitter that periodically broadcasts a radio signal, which a nearby receiver, usually a smartphone, can detect. Although location accuracy can be off by as much as 7.81 m,7 the information provided, as with the other technologies mentioned previously, is best used for orientation rather than turn-by-turn navigation. Although BLE beacons have been studied for indoor and outdoor accessible navigation,4 commercial BLE beacons have mainly been used as an indoor solution.
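The meter-scale error cited above follows from how beacon distance is typically estimated: the receiver converts received signal strength (RSSI) into a distance using a log-distance path-loss model. The sketch below is a generic illustration, not any vendor's actual implementation; the calibrated 1-m power and the path-loss exponent are assumed values.

```python
import math

# Illustrative log-distance path-loss model for BLE beacon ranging.
# tx_power_dbm (the calibrated RSSI expected at 1 m) and the path-loss
# exponent are assumptions; real beacons advertise their own calibration,
# and the exponent varies with the indoor environment.

def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.5) -> float:
    """Estimate beacon distance in meters from a received signal strength."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# At the calibrated power the estimate is 1 m; a 25 dB weaker signal
# reads as 10 m under these assumed parameters.
print(estimate_distance_m(-59.0))  # 1.0
print(estimate_distance_m(-84.0))  # 10.0
```

Because the relationship is exponential, small RSSI fluctuations from reflections or body blocking translate into meter-scale distance errors, which is why such systems suit room-level orientation better than precise guidance.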
The company RightHear uses BLE beacons in its audible wayfinding system.
With RightHear, there is no need to search for signage or use the smartphone’s camera. When the system is in use, information is transmitted to the user’s smartphone, which speaks content such as business hours, emergency information, the location of restrooms, or floor directory signage. Information is also segmented into cardinal positions, and the application can tell the individual what lies ahead in the direction they are facing. Another function lets a user who is blind or visually impaired virtually explore a RightHear location in the app before traveling there. The RightHear app also supports an outdoor navigation option using GPS. Other companies using BLE beacons for indoor navigation include Lazarillo and BlindSquare.
In addition to BLE beacons, other radio-transmitting sensor technologies include RFID and NFC tags.4,8 Many of us are familiar with NFC tags that are used for contactless payment. The WayAround company is using this technology in a similar way. Through a free smartphone app, users can get helpful information transmitted to their smartphone by tapping on strategically placed WayAround NFC tags. Information can include office room numbers, location, and orienting content within a building. These tags can also be purchased for home use to digitally label products, clothing, and other items around the home.
Unlike NFC tags, which require close touch contact, RFID tags can be read by simply passing by an RFID reader, much like we do at continuing education events. Hearsee Mobility, a nonprofit company in Utah, has developed a white cane designed to receive RFID tag signals, allowing users to receive information about POI and location of offices or restrooms while navigating an indoor route.
Most technologies for indoor navigation systems give orientation, proximity location, POI, and audible signage information. As previously mentioned, although GPS can provide approximate turn-by-turn navigation outside, it does not work for indoor routes. GoodMaps addresses this gap by using LiDAR technology.
Setting up GoodMaps involves several steps.
Once operational, users choose a destination in the GoodMaps application, which is free on iOS and Android (Figure 3). Holding the smartphone with its camera facing forward, the user receives audible and visual step-by-step guidance. As with any technology, there is a learning curve, so individuals should continue to rely on their orientation and mobility skills. Currently, GoodMaps is available in airports, train stations, and retail businesses worldwide. A list of locations can be found in the app.9
Another approach to indoor navigation comes from Microsoft’s Seeing AI application. This free app, available on iOS and Android, is used by many people with blindness or visual impairment for reading text, describing scenes, and recognizing barcodes and currency, among other uses. The app’s features are organized into channels. One of them, the World Channel, includes navigation using virtual beacons, or breadcrumbs. Users can follow saved routes visually on the screen, walking toward a virtual beacon (a large transparent blue dot) or following spatial sounds, which require headphones. Here’s how it works:
1. Initial setup: Use the smartphone camera to scan the starting point until the app notifies you that 100% of the area has been scanned.
2. Save a route: Start walking toward your destination. The app will drop virtual beacons, or breadcrumbs, as you walk so you can save and name the route later.
3. Follow the route: To navigate your saved route, select the route in the World Channel. Follow the virtual beacons on the screen or use spatial sounds with headphones to guide you from one beacon to the next until you complete the route.
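The beacon-to-beacon guidance loop in the steps above can be modeled in general terms. The sketch below is an illustrative approximation only, not Seeing AI’s actual implementation: the class name, the planar coordinates, and the arrival radius are all assumptions made for the example.

```python
import math

ARRIVAL_RADIUS_M = 1.5  # assumed: a beacon counts as "reached" within this range

class BreadcrumbRoute:
    """Illustrative route follower: guide toward each saved beacon in turn."""

    def __init__(self, breadcrumbs):
        self.breadcrumbs = list(breadcrumbs)  # saved [(x, y), ...] positions in meters
        self.next_index = 0                   # which beacon we are guiding toward

    def update(self, x, y):
        """Given the user's current position, return (distance_m, bearing_deg)
        to the next beacon, silently advancing past any beacon already within
        ARRIVAL_RADIUS_M. Returns None when the route is complete."""
        while self.next_index < len(self.breadcrumbs):
            bx, by = self.breadcrumbs[self.next_index]
            dx, dy = bx - x, by - y
            distance = math.hypot(dx, dy)
            if distance > ARRIVAL_RADIUS_M:
                # Bearing measured clockwise from +y ("north"), 0-360 degrees.
                bearing = math.degrees(math.atan2(dx, dy)) % 360
                return distance, bearing
            self.next_index += 1  # reached this beacon; target the next one
        return None  # route complete
```

In the real app the position and heading come from camera-based tracking of the scanned area, and the (distance, bearing) output would drive the on-screen beacon dot or the spatial audio cue rather than being read as numbers.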
Companies such as Be My Eyes and Aira provide live sighted assistance to individuals with blindness or visual impairment. Be My Eyes is a free service and app on iOS and Android that connects sighted volunteers with users with visual impairment who need assistance. Volunteers see through the user’s smartphone camera to help with tasks such as shopping, finding things, and navigating, with calls lasting anywhere from 1 to 3 minutes.10 Recently, Be My Eyes introduced a new feature called Be My AI, which uses ChatGPT to provide highly descriptive scene explanations, offering orientation information and the ability to ask follow-up questions about one’s surroundings.
Aira also connects users to sighted assistance; however, the service is subscription based and connects users to trained professional agents. A study of Aira’s service found that more than 10,000 calls over a 3-month period averaged 8 to 9 minutes and were used primarily for reading, navigation, and home management.11 Aira is also developing an AI virtual option, currently in beta testing.
Some other noteworthy options include Seeing AI, Envision AI, and Ray-Ban Meta smart glasses. All three offer AI scene-description functionality. Envision AI provides this through its free smartphone application or on its smart glasses, which are available for purchase. The Envision AI glasses also offer sighted assistance through their Call a Companion feature. Ray-Ban Meta smart glasses use WhatsApp, allowing the recipient of a call to see through the glasses’ camera and communicate directly with the user through the glasses.
Technology is constantly evolving, and the systems mentioned in this article continue to improve. Commercial options are growing, especially for indoor navigation, integrated as apps on our smartphones today and promising wearable form factors in the future. For these advancements to be effective, they need to be user-friendly, aesthetically acceptable in public, and customizable in how they deliver feedback. Recently, OpenAI showcased a future feature in which an AI assistant seamlessly provides real-time assistance, as demonstrated in a video of a blind person touring London in the United Kingdom and hailing a taxi. The future of AI integration and capabilities is promising, but we should remain cautiously optimistic.
The marketing and promotion of assistive and mainstream technologies do not always accurately represent what the technology can do. More data-driven research is needed as technology advances. Although technologies such as GPS can get a user close to a destination, they can fall short in the last few feet of a journey. Consequently, individuals with blindness or visual impairment still need to rely on their orientation and mobility skills. What’s more, they need those skills to learn how to use and integrate technology into their lives. Optometrists should consider more orientation and mobility referrals, particularly for older patients with low vision, who are at greater risk of falling and of becoming detached from their community because of fears about travel.
Nevertheless, technology is revolutionizing all our lives. Fortunately, new solutions that can help patients with low vision are emerging and are being brought to and tested at Lighthouse Guild in New York, New York. Incorporating technology into low-vision care can significantly enhance patients’ safety and functionality, enabling them to accomplish their daily activities and not only meet their goals but surpass them.