
Communications of the ACM

ACM TechNews

Augmented Reality App Created by NYU Students Can Translate Sign Language in Real Time


American Sign Language signs for A, S, L.

Credit: cuchimes.com

New York University (NYU) researchers have used computer vision and augmented reality to develop ARSL, an application that enables users to capture sign language with a smartphone camera and see a live translation into their native language.

The team says ARSL can also translate spoken language into sign language.

The prototype was developed by researchers working in the NYU Future Prototyping and Talent Development program, a partnership between the NYU Media Lab and Verizon.

"We make magic when we pair leading students with outstanding mentors in the Envrmnt team at our AR/VR lab," says Christian Egeler, director of XR product development for Envrmnt, Verizon's platform for extended reality solutions. "We discover the next generation of talent when we engage them in leading edge projects in real time, building the technologies of tomorrow."

From Next Reality

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA

