Snap has new Lenses that show users how to fingerspell in American Sign Language

The new features were designed with feedback and guidance from deaf and hard-of-hearing employees at the company, and they use AI and computer vision technology from Hungarian startup SignAll, which focuses on technology for deaf people. SignAll has developed technology that tracks hand movements to translate sign language into spoken language. The company launched its fingerspelling mobile app, Ace ASL, for iOS in April and for Android earlier this month.

The new Snap features were also driven internally by Snap Lab software engineer Jennica Pounds, who is deaf. Pounds said her oldest son has had a hard time learning American Sign Language, which was a big motivation for her to work on the new tools. “I’m passionate about this technology because I truly believe it’s going to break so many applications wide open,” she said in a news release. “It’s tech like this that will help families like mine communicate and grow together.”

The AR Lenses will help Snap users learn to fingerspell their names, along with common words like “love,” “hug,” and “smile.” Users can capture these experiences and share them with others in chat. There’s also a new set of Bitmoji stickers depicting some commonly signed terms.