On April 15th, Hello Monday/DEPT® and the American Society for Deaf Children are launching GiveAHand.ai, the world’s largest open-source image library of hands, fully tagged with data to help build better hand models.
This was the challenge: Live audio transcription and translation tools fall short for the deaf and hard of hearing because sign language combines fast-paced hand gestures, facial expressions, and full-body movements. While machine learning models can handle facial expressions and body movements, detecting hand and finger movements remains a challenge. AI is democratising access to data and opening up new applications, but because most available tools are trained on pre-existing datasets and images, it is difficult to build useful machine learning models for domains where that data does not yet exist.
And the solution: Launched to celebrate American Sign Language Day (April 15th), GiveAHand.ai is using tech for good. One hundred percent crowdsourced, the images collected on the platform will form a diverse dataset of hands: diverse shapes, colours, backgrounds, and gestures. Now, anyone can put their hands to good use by contributing and uploading images, building an image library that will help unlock sign language. Researchers can then download and use these fully tagged images to improve their machine learning models, enabling detection and translation of the full spectrum of sign language.
GiveAHand.ai is the second collaboration between Hello Monday/DEPT® and the ASDC: in 2021, they launched Fingerspelling.xyz, a hand-tracking experience that uses machine learning to help people learn the sign language alphabet. To date, more than 5.1 million correct hand signs have been registered, and the American Society for Deaf Children now uses Fingerspelling.xyz as part of its own training materials.