Dexterous Teleoperation with MANUS and the Psyonic Ability Hand in MIT CSAIL's Actuated Neck Study

March 13, 2026
Robotics
Entertainment
Other Fields
XR/VR
Research

This use case is based on the research paper: Learning to Look Around: Enhancing Teleoperation and Learning with a Human-like Actuated Neck. The research results, methodologies, and performance metrics described are reported by the paper’s authors. For complete technical details, please refer to the original paper here.

Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) set out to solve one of teleoperation's hardest problems: the perception gap. When operators control robots remotely, fixed cameras restrict the field of view, create occlusions, and increase cognitive load, making complex manipulation tasks difficult or sometimes impossible.

Their solution was a robotic system equipped with a 5-DOF actuated neck that mirrors the operator's natural head movements, giving the operator a true first-person perspective from the robot's point of view. To validate the system, the team conducted seven complex whole-body manipulation tasks and trained autonomous policies on three additional tasks, collecting more than 360 teleoperated demonstrations in total.
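The head-mirroring idea can be illustrated with a minimal sketch: track the operator's head pose and command the neck joints to match it, clamped to each joint's range. The joint names, limits, and the assumption that two of the five DOFs are small translational "lean" axes are all illustrative, not taken from the paper.

```python
# Hypothetical sketch: map a tracked head pose to joint targets for a
# 5-DOF actuated neck. Joint names and limits are illustrative only and
# do not reflect the paper's actual neck kinematics.
NECK_LIMITS = {             # (min, max) per joint, radians (or meters for leans)
    "yaw":    (-1.57, 1.57),
    "pitch":  (-0.79, 0.79),
    "roll":   (-0.52, 0.52),
    "lean_x": (-0.10, 0.10),  # assumed small translational DOFs
    "lean_y": (-0.10, 0.10),
}

def clamp(value, lo, hi):
    """Limit a commanded value to a joint's mechanical range."""
    return max(lo, min(hi, value))

def head_pose_to_neck_targets(head_pose):
    """Mirror the operator's head pose, clamping each axis to its limit.
    Axes absent from the tracked pose default to the neutral position."""
    return {
        joint: clamp(head_pose.get(joint, 0.0), lo, hi)
        for joint, (lo, hi) in NECK_LIMITS.items()
    }

# An out-of-range yaw command is clamped rather than passed through,
# so fast operator head motion cannot drive a joint past its limit.
targets = head_pose_to_neck_targets({"yaw": 2.0, "pitch": 0.3})
```

Clamping at the retargeting layer, rather than trusting the tracker, is the usual safeguard when a human's unconstrained motion drives hardware with hard mechanical limits.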

Every component of the system needed to operate reliably without introducing variables that could affect the experiment. This is where MANUS came in.

Enabling Dexterous Control Where Other Methods Fell Short

The team selected MANUS gloves to control the robot's dexterous Psyonic Ability Hand, capturing hand articulation across 25 degrees of freedom. This enabled precise finger and hand control for manipulation tasks ranging from loading a dishwasher to transferring a cup from a bottom shelf to a box on a higher table.
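Driving a robot hand from glove data typically requires collapsing the glove's many tracked joints into the hand's smaller set of actuated DOFs. The sketch below shows one common pattern, averaging per-joint flexion into a single normalized close command per finger; the joint names, the per-joint full-close angle, and the command convention are all assumptions for illustration, not the Psyonic Ability Hand's actual API or the team's retargeting method.

```python
# Hypothetical sketch: reduce high-DOF glove joint angles (radians) to one
# normalized command per finger, 0.0 = fully open, 1.0 = fully closed.
# Joint names and the ~90-degree full-close assumption are illustrative.
FINGER_JOINTS = {
    "index":  ["index_mcp", "index_pip", "index_dip"],
    "middle": ["middle_mcp", "middle_pip", "middle_dip"],
    "ring":   ["ring_mcp", "ring_pip", "ring_dip"],
    "pinky":  ["pinky_mcp", "pinky_pip", "pinky_dip"],
    "thumb":  ["thumb_mcp", "thumb_ip"],
}

def retarget(glove_angles):
    """Collapse per-joint flexion into one close command per finger."""
    commands = {}
    for finger, joints in FINGER_JOINTS.items():
        flexion = sum(glove_angles.get(j, 0.0) for j in joints)
        full_close = 1.57 * len(joints)  # assume ~90 deg per joint when closed
        # Clamp so sensor noise cannot push the command out of range.
        commands[finger] = max(0.0, min(1.0, flexion / full_close))
    return commands
```

A real pipeline would add per-operator calibration and thumb abduction handling, but the core step, many tracked joints in, few actuator commands out, looks like this.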

The gloves were one of several tracking technologies evaluated in the study. During testing, the team encountered issues with VR headset-based hand tracking: when the operator's hands moved close to the robot's body, they left the headset's field of view, causing jitter and inconsistencies that disrupted task performance and raised safety concerns in close-proximity scenarios.

As a wearable system, MANUS does not depend on an external sensor's field of view. The researchers adopted the gloves without modifications or workarounds, and the paper reports no limitations associated with them.

Strong Performance Across Autonomous Tasks  

With MANUS handling dexterous hand control, the team was able to focus fully on their core research question. The system achieved strong results, including a 95 percent success rate for cup transfer from a bottom shelf, 90 percent for left-to-right pick and place, and 82 percent for close-range object manipulation.

A single autonomous policy trained on demonstrations collected with MANUS gloves learned to identify and complete three different tasks without explicit task labels. The policy generalized across varying object positions and workspace heights.

Reliable Where It Counts Most  

The MIT CSAIL team built a sophisticated teleoperation system and required every component to perform reliably under demanding research conditions. MANUS was chosen to handle the most dexterous and demanding part of the system, and it never became a variable the researchers had to troubleshoot.

Ready to access machine-grade motion data?
Join leading labs worldwide using MANUS for embodied AI.