Snapchat has developed augmented reality lenses that can recognize and translate American Sign Language.
According to The Verge, the move is part of the company's efforts to mark the International Week of the Deaf, and the new augmented reality Lenses rely on artificial intelligence and computer vision technology.
The company developed the Lenses in partnership with SignAll, whose technology tracks hand movements to translate sign language into spoken language; SignAll launched its Ace ASL mobile application for iOS in April.
The technology works by having one Lens read fingerspelling, in which individual letters are formed with the fingers to spell out a word, while other Lenses guide users through fingerspelling their username and signing commonly used words such as "love," "smile," and "hug."
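To illustrate the general idea behind such a feature (a hypothetical sketch, not Snapchat's or SignAll's actual pipeline), a fingerspelling reader can be approximated in two stages: track hand landmarks in each camera frame, then classify the hand pose into a letter. The toy Python example below uses the open-source MediaPipe Hands library for landmark tracking and a crude rule-based classifier that distinguishes a closed fist (roughly the ASL letter "A") from a hand with four fingers extended (roughly the letter "B"); a production system would instead run a trained model over the full alphabet.

```python
# Toy fingerspelling sketch: hand-landmark tracking plus a crude
# rule-based letter classifier. Illustrative only -- not Snapchat's
# or SignAll's code. Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}  # fingertip landmarks
PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}  # middle-joint landmarks

def classify_letter(lm):
    """Very rough pose rules for an upright hand facing the camera.
    In normalized image coordinates y grows downward, so an extended
    finger's tip sits above (smaller y than) its PIP joint."""
    extended = [lm[TIPS[f]].y < lm[PIPS[f]].y for f in TIPS]
    if all(extended):
        return "B"  # four fingers up: roughly the ASL letter B
    if not any(extended):
        return "A"  # closed fist: roughly the ASL letter A
    return None     # any other pose: unknown

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)  # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB frames; OpenCV captures BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        letter = classify_letter(result.multi_hand_landmarks[0].landmark)
        if letter:
            cv2.putText(frame, letter, (30, 60),
                        cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
    cv2.imshow("fingerspelling sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
hands.close()
```

The two-stage split matters because landmark tracking is pose-independent and reusable, while the letter classifier can be retrained or extended without touching the vision front end.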
Jennica Pounds, a software engineer at the company and a key figure in the project, said she hopes the new additions to the platform will raise awareness and help more people learn a new way of communicating.
Pounds revealed that her biggest motivation for joining the project was her eldest son, who has struggled to learn American Sign Language.