News

Accessibility and Inclusion through Technology

Helping the sensory-impaired overcome their impediments.

Significant improvements in technology in recent years have created more advanced assistive tools and services that allow people with sensory impairments to lead more independent and fulfilling lives. While there will always be a need for traditional assistive devices such as white canes, Braille signage for the visually impaired, or closed-captioning services for the hearing-impaired, these new devices, apps, and underlying technological approaches are helping to create a more inclusive world.

The number of people who benefit from technological improvements to assistive technologies is notable. According to the International Agency for the Prevention of Blindness, 43 million people around the world were living with blindness as of 2021. Meanwhile, hearing loss currently affects more than 1.5 billion people worldwide, of whom 430 million have moderate or higher levels of hearing loss in their better-hearing ear, according to the World Report on Hearing, published in 2021 by the World Health Organization. In addition, the Wheelchair Foundation has identified more than 131 million people around the world who require a wheelchair.

Assistive technologies are generally designed to allow sensory-impaired people to communicate with others more easily, acquire information via alternative modalities, or navigate physical spaces more easily and independently. The most effective solutions are fully inclusive, meeting the needs of both sensory-impaired people and the general population.

Communicating Seamlessly in Privacy

A key underlying principle of today’s assistive technology is to allow for greater independence without requiring active intervention by humans or other third-party technologies that create friction and inefficiency. One example is Nagish, a mobile application that converts text to speech and speech to text in real time, so one side of a phone call can type and read while the other side hears and speaks, without requiring a human translator on either side of the conversation. The service uses a combination of natural language processing and internally developed captioning engines to provide real-time conversions, with an accuracy rate of more than 96%.
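
The core of such a service is a pair of concurrent conversion loops, one per direction of the call. Below is a minimal Python sketch of that structure; transcribe_chunk and synthesize_speech are hypothetical stand-ins, since Nagish’s captioning engines are proprietary.

```python
# Minimal sketch of a bidirectional text<->speech phone relay.
# transcribe_chunk() and synthesize_speech() are hypothetical
# placeholders for real-time STT/TTS engines.
import queue
import threading

def transcribe_chunk(audio_chunk: bytes) -> str:
    """Hypothetical stand-in for a real-time speech-to-text engine."""
    raise NotImplementedError

def synthesize_speech(text: str) -> bytes:
    """Hypothetical stand-in for a text-to-speech engine."""
    raise NotImplementedError

def caption_loop(audio_in: queue.Queue, captions_out: queue.Queue) -> None:
    # The hearing side speaks; the deaf user reads captions as they arrive.
    while True:
        text = transcribe_chunk(audio_in.get())
        if text:
            captions_out.put(text)

def speak_loop(text_in: queue.Queue, audio_out: queue.Queue) -> None:
    # The deaf user types; the hearing side hears synthesized speech.
    while True:
        audio_out.put(synthesize_speech(text_in.get()))

mic_q = queue.Queue()       # audio chunks arriving from the hearing caller
captions_q = queue.Queue()  # captions displayed to the deaf user
typed_q = queue.Queue()     # text typed by the deaf user
speaker_q = queue.Queue()   # synthesized audio played to the hearing caller

threading.Thread(target=caption_loop, args=(mic_q, captions_q), daemon=True).start()
threading.Thread(target=speak_loop, args=(typed_q, speaker_q), daemon=True).start()
```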

Figure. A boy who has suffered total vision loss uses the OrCam MyEye 2 to interpret written text and read it aloud to him.

“If you’re deaf, you basically have three options, but all of them come down to a person in the middle that translates between you and a hearing person,” says Tomer Aharoni, co-founder and CEO of Nagish. “So whether it’s through text to speech or speech to text, to these manually typed [services], or through video-relay services, which are sign language interpreters converting in between sign language and speech, you don’t have the ability to have a private conversation. And we wanted to change that.”

To maximize accuracy, Nagish also incorporates some external captioning technology and provides users with the ability to highlight words or phrases on a transcript that are incorrectly transcribed. Says Aharoni: “We don’t access the captions or use call transcripts to train our systems, but we have a set of heuristics in place to make sure that [accuracy rates] are above a certain threshold.”
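
One plausible shape for such a heuristic is a confidence threshold with a fallback engine, sketched below; the function names and cutoff value are illustrative assumptions, not Nagish’s actual logic.

```python
CONFIDENCE_THRESHOLD = 0.96  # assumed cutoff, echoing the >96% accuracy figure

def internal_engine(chunk: bytes) -> tuple[str, float]:
    """Hypothetical in-house captioning engine: returns (text, confidence)."""
    raise NotImplementedError

def external_engine(chunk: bytes) -> str:
    """Hypothetical external captioning engine used as a fallback."""
    raise NotImplementedError

def best_caption(chunk: bytes) -> str:
    # Prefer the in-house engine; fall back when its confidence dips too low.
    text, confidence = internal_engine(chunk)
    return text if confidence >= CONFIDENCE_THRESHOLD else external_engine(chunk)
```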

For vision-impaired people, OrCam has developed MyEye, a voice-activated device that attaches to virtually any pair of eyeglasses and uses a camera mated to AI technology, including facial and object recognition and natural language understanding, to read text from a book, smartphone screen, or any other surface, as well as to recognize faces, identify objects, or otherwise convey visual information audibly, in real time and offline. This allows vision-impaired people to interact with their surroundings and acquire information in any location, even where Braille signage or audio cues are not in use.
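
OrCam’s on-device models are proprietary, but the read-aloud half of such a pipeline can be approximated with off-the-shelf components. Here is a minimal sketch that chains the open-source Tesseract OCR engine to an offline text-to-speech library; the frame path is illustrative.

```python
# Rough sketch of an offline read-aloud pipeline: OCR a camera frame,
# then speak the recognized text. Illustrates the general idea only;
# this is not OrCam's implementation.
# Requires: pip install pillow pytesseract pyttsx3 (plus the Tesseract binary).
from PIL import Image
import pytesseract
import pyttsx3

def read_aloud(frame_path: str) -> None:
    text = pytesseract.image_to_string(Image.open(frame_path))  # OCR the frame
    if text.strip():
        engine = pyttsx3.init()  # offline text-to-speech engine
        engine.say(text)
        engine.runAndWait()

read_aloud("camera_frame.png")  # e.g., a snapshot of a book page or a sign
```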

Enhanced Information through Smart Mobility Technology

Other technology developers are focused on improving the ability to help the sensory-impaired independently navigate in public spaces, either by adding advanced features to traditional white canes, or via the use of AI-aided smartphone applications.

WeWALK (wewalk.io/en) is a smart device that attaches to a traditional white cane and uses an ultrasonic sensor to detect objects above chest level. When paired with the WeWALK mobile application via Bluetooth, users can interact with mobile applications through the integrated WeWALK touchpad, without holding their phones. Currently, WeWALK is integrated with Google Maps and Amazon Alexa, and the company announced a partnership in 2021 with Moovit’s transit API (https://moovitapp.com/nycnj-121/poi/en), which provides the best routing information for each journey based on crowdsourced data, so visually impaired users can navigate safely to, on, and from public transit.


The SmartCane (https://assistech.iitd.ac.in/smartcane.php), an assistive product developed by Assistech, a laboratory founded in 2007 at the Indian Institute of Technology Delhi, is another obstacle-detection device that uses non-contact, ultrasonic sensing to alert users via vibration if they approach objects up to three meters away, compared with the typical one-meter distance limitation of a traditional white cane. SmartCane, in use by more than 100,000 people, is particularly useful in India, where it is common to find stray animals sharing space with humans, along with other typical stationary hazards, such as curbs, low-hanging branches, or parked bicycles or carts.
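
Neither device’s firmware is public, but the underlying sensing loop is straightforward. The sketch below shows ultrasonic obstacle detection with a vibration alert, using the gpiozero library on a Raspberry Pi; the pin assignments and the three-meter threshold are illustrative.

```python
# Ultrasonic obstacle detection with a vibration alert, in the spirit
# of SmartCane/WeWALK. Pin numbers are illustrative assumptions.
from signal import pause
from gpiozero import DistanceSensor, DigitalOutputDevice

sensor = DistanceSensor(echo=24, trigger=23, max_distance=4.0)  # ultrasonic ranger
motor = DigitalOutputDevice(18)  # drives a small vibration motor

sensor.threshold_distance = 3.0       # alert within three meters, as on SmartCane
sensor.when_in_range = motor.on       # obstacle detected: vibrate
sensor.when_out_of_range = motor.off  # path clear: stop vibrating

pause()  # keep the script alive; the callbacks do the work
```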

Other researchers have taken the approach of using robotics to supplant traditional guidance solutions, such as guide dogs. AlphaDog (http://www.weilan.com/en/robots.html) is a quadrupedal robot developed by Weilan, a China-based start-up founded in 2019, that incorporates artificial intelligence, IoT, 5G communications, virtual reality, autonomous driving, and swarm intelligence technologies. Its sensors identify and avoid obstacles in the environment, and it can be programmed to navigate familiar surroundings, though it may find navigation and maneuverability challenging in unfamiliar environments.

Meanwhile, Israeli company Seamless Vision has developed Buddy (https://vimeo.com/425528955), an autonomous robot that uses a variety of sensors to detect stationary and moving obstacles in urban environments. While Buddy can guide users to previously known or mapped locations in metropolitan areas, it is best suited to flat surfaces. It also can provide only basic information about its surroundings and has a limited ability to detect the user’s intentions.

Melding Navigation and Information

For sensory-impaired people, the ability to navigate from place to place safely and efficiently can be aided by smartphone applications, which can provide not only the user’s current location and safe pathways or routes (avoiding challenges such as stairs, ramps, or areas of high congestion), but also additional information on points of interest, such as the location of public restrooms, specific details on retail services, and the location of critical infrastructure, such as emergency exits.

NavCog (https://apps.apple.com/us/app/navcog/id1042163426) is an iPhone app for indoor navigation that uses Bluetooth beacons to provide location information, aimed specifically at helping people with visual impairments explore the world independently. By connecting the app to a back-end database containing a map of the facility, along with points of interest, stairways, escalators, and other features, NavCog can provide verbal navigation cues, text descriptions and icons, and adaptive, optimized routing information to ensure those with disabilities can safely navigate within a building or location. The app is also helpful for people who are unfamiliar with the layout of complex places, such as universities, airports, hospitals, or shopping districts. To date, NavCog has been tested in Japan in the Nihonbashi Muromachi area (COREDO Muromachi), Narita International Airport, and the Toyosu Civic Center, as well as at Allegheny General Hospital and The Andy Warhol Museum, both in Pittsburgh, PA.
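
A simplified way to see how beacon positioning works: each beacon’s received signal strength (RSSI) hints at its proximity, and a weighted average of surveyed beacon coordinates yields a rough position estimate. The sketch below is illustrative only; the beacon names and coordinates are invented, and NavCog’s actual localization is more sophisticated.

```python
# Toy beacon localization via a weighted centroid of surveyed positions.
# Beacon IDs and coordinates (meters) are invented for illustration.
BEACON_MAP = {
    "beacon-lobby": (0.0, 0.0),
    "beacon-stairs": (12.0, 3.0),
    "beacon-exit": (5.0, 10.0),
}

def estimate_position(rssi_readings: dict[str, float]) -> tuple[float, float]:
    """Weighted centroid: stronger (less negative) RSSI pulls the estimate closer."""
    total_w = wx = wy = 0.0
    for beacon_id, rssi in rssi_readings.items():
        x, y = BEACON_MAP[beacon_id]
        w = 1.0 / max(1.0, -rssi)  # crude weight: -50 dBm outweighs -90 dBm
        total_w += w
        wx += w * x
        wy += w * y
    return wx / total_w, wy / total_w

# Strongest signal from the lobby beacon pulls the estimate toward (0, 0):
print(estimate_position({"beacon-lobby": -50.0, "beacon-stairs": -80.0, "beacon-exit": -75.0}))
```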

Bluetooth beacons also power Evelity (https://www.evelity.com), a mobile app for iOS and Android devices that gives sensory-impaired people access to location information, points-of-interest details, and navigation support. The interface adapts to the needs of the user (audio prompts for visually impaired users; text directions and icons for deaf users), but the underlying functionality is the same: keeping the user informed.


The installation of an inclusive app such as Evelity can benefit all visitors to or users of a building who want access to a more personalized navigation experience, says Sylvain Denoncin, the company’s CEO. “For example, we made a pilot installation for the metro [transit system] of Lyon in France, and they said, ‘okay, this is great for visually-impaired people or wheelchair-bound people’,” Denoncin recalls. “But in terms of investment, it’s good because it’ll be helpful for tourists because the app can work in their own language,” creating a better overall visitor experience.

Meanwhile, U.K.-based Waymap Ltd. (https://www.waymap.com/en) offers a mobile app that eschews the use of beacons, instead relying on a proprietary device-sensor-based algorithm to provide inclusive location-based wayfinding, both indoors and outdoors, within its partner locations. Waymap relies on existing computer-aided design (CAD) drawings of a space and a physical, 360-degree walkthrough using a LiDAR (light detection and ranging) scan of the facility to identify and map key points of interest, pathways and corridors, and any other physical features that need to be captured, including grade changes, surface changes, and restricted areas. All features are then geolocated to specific coordinates, and new, inclusive maps are created that adhere to standards of the United Nations’ International Telecommunication Union (ITU) and the U.S. Consumer Technology Association (CTA), specifically ITU-T F.921 and CTA-2076, which cover audio-based wayfinding for visually impaired and low-vision people.

Then, leveraging the standard sensors embedded in smartphones (such as the internal compass, pedometer, and accelerometer), the app determines an individual’s position within the space. A proprietary algorithm measures where users are against their previous step, then calculates the probability of where their next step will be, allowing them to navigate within a space based on geolocation data within the environment (such as the presence of obstructions, or popular routes to points of interest).
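
In outline, this is step-based dead reckoning: each detected step advances the position estimate along the current compass heading. The sketch below shows the deterministic core of such an approach; Waymap’s probabilistic algorithm is proprietary, and the assumed step length is illustrative.

```python
# Step-based dead reckoning: advance one stride per detected step along
# the compass heading. The step length is an illustrative assumption;
# real systems calibrate it per user and add probabilistic map matching.
import math

STEP_LENGTH_M = 0.7  # assumed average stride

def dead_reckon(start: tuple[float, float],
                headings_deg: list[float]) -> tuple[float, float]:
    """Return the position after one step along each compass heading."""
    x, y = start
    for heading in headings_deg:  # one heading reading per detected step
        rad = math.radians(heading)
        x += STEP_LENGTH_M * math.sin(rad)  # east component
        y += STEP_LENGTH_M * math.cos(rad)  # north component
    return x, y

# Ten steps heading due east (90 degrees) from the entrance at (0, 0):
print(dead_reckon((0.0, 0.0), [90.0] * 10))  # -> approximately (7.0, 0.0)
```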

“Basically, our algorithm turns your phone into a precise navigation device,” says Tom Pey, Waymap’s CEO and founder. “It does that by measuring where you are against your previous step, rather than measuring where you are against a satellite or a Bluetooth beacon.” As more people use the app, says Pey, the algorithm will continue to learn the best paths, and get even better at predicting users’ likely paths and routes to specific points of interest.

Waymap, which has been deployed in the Washington, D.C. metro system and is in proof-of-concept testing in transit systems in Los Angeles; Singapore; Madrid, Spain; and Brisbane, Australia, can calculate a user’s location to within a single meter and the user’s heading to within 10 degrees. The tight integration of a fully mapped building with the Waymap app also allows planned or unplanned changes to be accounted for by the building’s or transit system’s operations staff.

Further Reading

How Beacon Wayfinding is Changing Indoor Navigation, https://bit.ly/3oCV3C7

Global Blindness Data, International Agency for the Prevention of Blindness, https://bit.ly/3N48Jaw

WeWALK: Revolutionary Smart Cane for the Visually Impaired, https://www.youtube.com/watch?v=Rr9RaisO11E

Analysis of Wheelchair Need, Wheelchair Foundation, https://bit.ly/43QKOKP

World Report on Hearing, World Health Organization, https://www.who.int/publications/i/item/9789240021570
