Behind the Work in association with Scheme Engine

Building an Open-Source Sign Language Platform for Accessible Communication

08/04/2025
LBB's Tará McKerr meets with the team at Hello Monday/DEPT® to find out about their AI-powered platform transforming ASL education

Back in 2021, Hello Monday/DEPT® and the American Society for Deaf Children (ASDC) launched Fingerspelling.xyz, a platform that utilises machine learning to help people learn the sign language alphabet through hand tracking.

I’d highly recommend giving it a go yourself – even after a few minutes, it felt like an incredibly effective and intuitive way to learn.

Ever since then, the teams knew they wanted to create something that would move beyond fingerspelling and teach full signs. But expanding from static letters to full signs comes with a set of new challenges, Anders Jessen tells me. “Unlike the alphabet, which consists of relatively simple, single-hand handshapes, most signs involve movement, dynamic transitions between handshapes, and often the use of both hands,” he explains. They discovered, however, that the core learning structure from Fingerspelling worked well – an insight that carried through to the new platform.

Their new platform, named ‘Signs’, uses your webcam to analyse your hand and finger positions in real time through AI. Anders says this feedback loop allows users to instantly see how accurately they’re performing a sign and correct themselves on the spot.

Working again with ASDC, they also collaborated with technology company NVIDIA, which brought a shared passion for building purposeful technology.

“It’s a completely new way to learn ASL online – interactive, responsive, and rooted in immediate engagement. It removes the guesswork and makes learning much more intuitive and rewarding,” says Anders.
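The article doesn't detail how Signs scores a user's attempt, but the feedback loop it describes could be sketched roughly as follows: compare the hand keypoints coming off a webcam tracker against a reference pose for the sign, after normalising away where the hand sits on screen and how large it appears. Everything here (the function name, the 21-joint layout, the scoring formula) is an illustrative assumption, not the platform's actual implementation.

```python
import numpy as np

def sign_accuracy(user_pts, ref_pts):
    """Score how closely a user's hand landmarks match a reference pose.

    Both inputs are (21, 3) arrays of 3D hand keypoints (one row per
    joint, as produced by common hand-tracking models). Returns a score
    in [0, 1], where 1.0 means a perfect match after normalising for
    hand position and scale.
    """
    def normalise(pts):
        # Centre on the wrist (landmark 0) and scale by overall hand
        # size, so the comparison ignores where the hand is on screen
        # and how big it appears to the camera.
        pts = np.asarray(pts, dtype=float)
        centred = pts - pts[0]
        scale = np.linalg.norm(centred, axis=1).max()
        return centred / scale if scale > 0 else centred

    user, ref = normalise(user_pts), normalise(ref_pts)

    # Mean per-joint distance, mapped into a 0-1 accuracy score.
    error = np.linalg.norm(user - ref, axis=1).mean()
    return float(max(0.0, 1.0 - error))
```

Running this once per webcam frame would give the kind of instant, correct-yourself-on-the-spot feedback Anders describes.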

Speaking about their collaboration with NVIDIA, Anders notes, “Together, we explored not only how to teach ASL, but also how to invite the community to contribute to something bigger.” That’s how the platform’s dual structure was born: ‘Learn’ for receiving knowledge and ‘Contribute’ for giving back. “We believe that balance is key,” he says.

“The ‘Contribute’ section plays a vital role in building a large, diverse dataset of ASL videos. Our goal is to collect 400 videos per sign across 1,000 signs, which will be released as an open-source dataset. All videos are tagged with 3D keypoints, which makes them incredibly useful for future tools and research in ASL learning and recognition,” he explains.
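The article doesn't publish the dataset's actual schema, but a video record tagged with per-frame 3D keypoints might look something like this hypothetical sketch; every field name and value here is illustrative, not the real format.

```python
import json

# Hypothetical layout for one crowdsourced clip. Each video is tagged
# with 3D keypoints, so a per-frame track of 21-joint hand poses is a
# natural shape for a record in an open-source release.
record = {
    "sign": "HELLO",               # gloss label, one of the 1,000 target signs
    "video_id": "contrib-000042",  # made-up identifier
    "fps": 30,
    "frames": [
        {"t": 0, "right_hand": [[0.0, 0.0, 0.0]] * 21},   # 21 joints, (x, y, z)
        {"t": 1, "right_hand": [[0.01, 0.0, 0.0]] * 21},
    ],
}

# Records like this serialise directly to JSON for distribution.
serialised = json.dumps(record)
```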

“Crowdsourcing enables a rich variety of signing styles, body types, and perspectives, which, in turn, helps improve detection accuracy, avatar realism, and the overall inclusivity of the platform. Eventually, this data will even allow us to flip the current experience: instead of copying the avatar, users could sign first, and the system would guess what they’re signing,” Anders adds.
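Flipping the experience, so the system guesses what the user is signing, is essentially a classification problem over keypoint data. As a hedged illustration of the idea only (the platform's real approach isn't described), a crude nearest-neighbour matcher over the crowdsourced keypoints could look like this:

```python
import numpy as np

def recognise_sign(user_seq, reference_library):
    """Guess which sign a sequence of hand keypoints shows.

    `user_seq` is a (frames, 21, 3) array of 3D hand keypoints;
    `reference_library` maps sign names to arrays of the same shape.
    This sketch averages each sequence over time into a single mean
    pose and picks the nearest reference. A real system would need
    temporal alignment and a model trained on the full dataset.
    """
    user_mean = np.asarray(user_seq, dtype=float).mean(axis=0)
    best_sign, best_dist = None, float("inf")
    for sign, ref_seq in reference_library.items():
        ref_mean = np.asarray(ref_seq, dtype=float).mean(axis=0)
        dist = np.linalg.norm(user_mean - ref_mean)
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign
```

The richer and more varied the crowdsourced library, the better a matcher like this generalises across signing styles and body types, which is exactly why the 'Contribute' data matters.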

He tells me that at Hello Monday/DEPT®, they’ve consistently been utilising tech and creativity to serve greater causes. “With Undo the Firewall, we created a method to try and counter internet censorship by making every website owner a potential distributor of independent news – bypassing state propaganda and the digital firewalls that support it,” he explains. “And for ReflexAI, with support from a team of Google.org Fellows, we helped bring to life HomeTeam, an AI-powered training tool that helps veterans practice critical conversations about mental health and suicide.” It’s wonderful to see words so sturdily backed by action, and they’ve got it in bucketloads.

Their hope is that the open-source dataset will empower others to build more tools that support sign language learning and communication. “This is just the beginning,” says Anders. “We’re already thinking about how to expand the platform beyond ASL. With over 200 different sign languages around the world, there’s a huge opportunity to scale this concept and make it globally accessible.”

When asked about the accuracy of the platform, Anders explains that every sign is verified by two Deaf individuals and a certified interpreter, ensuring a high standard of accuracy and cultural relevance.

“At the moment, the avatar focuses primarily on hand and arm movements. Facial expressions and other non-manual signals – crucial in ASL – are represented in accompanying text descriptions for specific signs,” he adds.

The team say they are aware of many regional variations in ASL, and while their goal is to capture the first 1,000 signs, they are planning to expand to include more regional differences in the future.

The feedback so far has been overwhelmingly positive. But even more valuable to Anders has been the constructive feedback and ideas for new features. “Be it input on how we describe different hand shapes, how the camera is accessed, or how we can make the platform even better in the future.”

Some of those updates are already live, while others are in development. The team is committed to continuous improvement and deeply values the community’s voice in that process.

In terms of what’s next, Anders hopes to “inspire a new wave of innovation in accessible technology. And we’re excited to see how others will build on top of it.”
