
Supporting Embodied AI for British Sign Language with MANUS Gloves

January 30, 2026

Sign language is a fully embodied language where meaning is carried through handshape, motion, space, facial expression, and body posture. While AI and immersive technologies have advanced rapidly, sign language translation remains a deeply challenging problem.

The SIGNATURE-BSL project explores what is required to build AI-enabled services for British Sign Language detection, generation, and translation, delivered through naturalistic animated 3D virtual humans. The project is funded through the Higher Education Innovation Fund from UK Research and Innovation and is led by researchers at The Open University.

BSL presents a distinctive triple challenge. The language includes an estimated 20,000 to 100,000 signs that are combined productively through space and time, creating high linguistic complexity. Effective machine translation depends on large volumes of high-quality training data, which must be captured using high-fidelity motion capture and facial tracking. At the same time, virtual humans used to deliver sign language translations often suffer from the uncanny valley effect, reducing user trust and limiting real-world adoption.

High-fidelity capture of sign language movement

At the core of SIGNATURE-BSL is the need to capture expressive, precise movement from expert BSL signers. The project leverages The Open University’s new XR Studios in Milton Keynes to record hand, body, and facial motion at the level of detail required for both linguistic research and animation.

MANUS gloves play a central role in this process by capturing fine-grained finger motion that is essential for sign language. Subtle differences in finger position and movement can change meaning, and smooth transitions between signs are often as important as the signs themselves. High-resolution hand data allows researchers to preserve these distinctions and build a reliable inventory of recorded signs performed by BSL experts.
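To make the idea of a sign inventory concrete, such recordings might be organised as in the Python sketch below. The record layout, joint counts, and gloss labels here are illustrative assumptions for this article, not the MANUS data format or the project’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class HandFrame:
    """One sampled hand pose: joint angles in degrees at a capture timestamp."""
    timestamp_s: float
    # A fixed per-hand joint count is a simplification; the real rig and
    # joint layout depend on the capture setup (hypothetical here).
    left_joint_angles: list[float] = field(default_factory=list)
    right_joint_angles: list[float] = field(default_factory=list)

@dataclass
class RecordedSign:
    """A single BSL sign performed by an expert signer."""
    gloss: str                  # linguistic label, e.g. "THANK-YOU"
    signer_id: str              # anonymised performer identifier
    frames: list[HandFrame] = field(default_factory=list)

def build_inventory(recordings: list[RecordedSign]) -> dict[str, list[RecordedSign]]:
    """Group recordings by gloss so each sign keeps multiple exemplars."""
    inventory: dict[str, list[RecordedSign]] = {}
    for rec in recordings:
        inventory.setdefault(rec.gloss, []).append(rec)
    return inventory
```

Grouping recordings by gloss keeps multiple exemplars of each sign together, which matters when the same sign is performed by different signers or in different sentence contexts.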

This embodied data forms the foundation for both recognition and generation research, supporting future machine learning approaches that depend on consistent and anatomically meaningful motion capture.
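As a rough illustration of why consistency matters downstream, the PyTorch-style sketch below resamples variable-length recordings to a common length so a recognition model can batch them. The data shapes and normalisation choices are assumptions, not the project’s pipeline.

```python
import numpy as np
import torch
from torch.utils.data import Dataset

class SignSequenceDataset(Dataset):
    """(motion sequence, gloss label) pairs for sign recognition.

    Each sequence is a [T, J] array of joint angles, resampled in time to a
    fixed length so batches can be stacked; J is the number of tracked joints.
    """
    def __init__(self, sequences: list[np.ndarray], labels: list[int], target_len: int = 64):
        self.labels = labels
        self.sequences = [self._resample(s, target_len) for s in sequences]

    @staticmethod
    def _resample(seq: np.ndarray, target_len: int) -> np.ndarray:
        # Linear time-resampling keeps transitions smooth and lengths uniform.
        t_old = np.linspace(0.0, 1.0, len(seq))
        t_new = np.linspace(0.0, 1.0, target_len)
        return np.stack(
            [np.interp(t_new, t_old, seq[:, j]) for j in range(seq.shape[1])], axis=1
        )

    def __len__(self) -> int:
        return len(self.sequences)

    def __getitem__(self, i: int):
        return torch.from_numpy(self.sequences[i]).float(), self.labels[i]
```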

Animating trustworthy virtual humans

Capturing motion is only one part of the problem. SIGNATURE-BSL also investigates how recorded sign language data can be delivered programmatically to animate virtual humans in a way that feels natural and trustworthy. The project explores this through a web-based proof-of-concept application that demonstrates how captured movement can be replayed, controlled, and integrated within XR environments. MANUS glove data enables expressive finger animation that aligns with human anatomy, helping virtual signers avoid the stiffness and ambiguity that often undermine user trust.
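A replay layer of this kind typically has to bridge the capture rate and the render rate. The sketch below shows one common approach, interpolating between the two nearest recorded frames at each render tick; the angle-based representation is an assumption for illustration, not the project’s application.

```python
import numpy as np

def sample_pose(frames_t: np.ndarray, frames_angles: np.ndarray, t: float) -> np.ndarray:
    """Return interpolated joint angles at playback time t.

    frames_t:      [N] capture timestamps in seconds, ascending (N >= 2)
    frames_angles: [N, J] joint angles per captured frame

    Rendering usually runs at a different rate than capture, so the replay
    layer blends the two nearest recorded frames at every render tick.
    """
    t = float(np.clip(t, frames_t[0], frames_t[-1]))
    i = int(np.searchsorted(frames_t, t, side="right")) - 1
    i = min(i, len(frames_t) - 2)
    # Blend factor between frame i and frame i + 1.
    w = (t - frames_t[i]) / (frames_t[i + 1] - frames_t[i])
    return (1.0 - w) * frames_angles[i] + w * frames_angles[i + 1]
```

A production rig would usually represent joints as rotations and blend them with quaternion slerp; linearly blending angles is a simplification that holds up only for small inter-frame differences.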

For sign language users, unnatural hand motion is immediately noticeable. Accurate hand articulation is therefore not a cosmetic detail, but a requirement for acceptance and usability.

Exploring new representations for sign language

Alongside motion capture, the project uses volumetric video to create 4D recordings of BSL signers. These recordings are used to assess the feasibility of a mesh-based transformer architecture designed for spatio-temporal 3D data, offering an alternative path for automated recognition and generation.

By combining MANUS glove data, full-body motion capture, facial tracking, and volumetric video, SIGNATURE-BSL explores multiple representations of sign language movement. This allows the research team to compare approaches and identify methods that are both scalable and respectful of the language’s embodied nature.
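The article does not specify the architecture, but to give a sense of what a mesh-based transformer for spatio-temporal 3D data can mean, here is a deliberately minimal PyTorch sketch that treats each (time step, vertex) pair as a token. Every dimension, layer count, and the classification head are placeholders, not the design under evaluation in the project.

```python
import torch
import torch.nn as nn

class MeshSequenceTransformer(nn.Module):
    """Illustrative spatio-temporal transformer over mesh-vertex tokens.

    Input: [B, T, V, 3] vertex positions (T time steps, V mesh vertices).
    Each (time step, vertex) becomes a token; standard transformer layers
    then attend across space and time jointly.
    """
    def __init__(self, n_vertices: int, d_model: int = 128, n_classes: int = 100):
        super().__init__()
        self.embed = nn.Linear(3, d_model)  # per-vertex coordinate embedding
        # Learned spatial position encoding; a real model would also add a
        # temporal encoding, omitted here for brevity.
        self.pos = nn.Parameter(torch.zeros(1, 1, n_vertices, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)  # e.g. sign gloss classification

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, v, _ = x.shape
        tokens = self.embed(x) + self.pos        # [B, T, V, D]
        tokens = tokens.reshape(b, t * v, -1)    # flatten space-time into one sequence
        encoded = self.encoder(tokens)           # joint spatio-temporal attention
        return self.head(encoded.mean(dim=1))    # pool tokens, predict a label
```

For example, `MeshSequenceTransformer(n_vertices=512)(torch.randn(2, 8, 512, 3))` yields one logit vector per sequence. Flattening space and time multiplies the token count, so a practical model would need a more memory-efficient attention pattern than this full-attention sketch.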

Inclusion, access, and long-term impact

SIGNATURE-BSL runs under the umbrella of The Open University’s research programme on Immersive, Inclusive, and Embodied Learning, chaired by Prof Dr Fridolin Wild at the Institute of Educational Technology. The project brings together contributors Fridolin Wild, Kevin McLeod, Ítalo Lopes dos Santos, Richard Lovelock, Phil Downs, Gerard O’Malley, Ryan Hale, and Matthew Moran. As Principal Investigator, Wild explains:

“The long-term goal of AI machine translation for BSL with 3D virtual humans promises strong benefits for workforce inclusion, because deaf and hard-of-hearing individuals can participate more easily in a more diverse job market. It can also help widen access or lower costs in lower-stakes contexts. It can empower deaf communities and improve independence.”

By combining MANUS gloves with advanced XR infrastructure and AI research, SIGNATURE-BSL establishes a proof of concept for how high-fidelity embodied data can support more inclusive language technologies. This project represents an early but important step toward making sign language more accessible through trustworthy, human-centered AI.

Ready to access machine-grade motion data?
Join leading labs worldwide using MANUS for embodied AI.