
How to Teleoperate PSYONIC Ability Hand with MANUS

November 19, 2025

Teleoperation enables humans to control robotic hands with natural finger movements. Based on internal testing with the MANUS Metagloves Pro and the PSYONIC Ability Hand, this workflow outlines how to create a responsive, real-time ROS 2 teleoperation setup using MANUS high-precision finger tracking and PSYONIC’s adaptive robotic hand.

What You Need Before You Start

This workflow requires the MANUS Metagloves Pro, a PSYONIC Ability Hand, and a Linux workstation running ROS 2. Although MANUS Core is Windows-based, Linux systems can use the MANUS Integrated SDK, which streams glove data directly into the Linux environment. The MANUS ROS 2 nodes publish joint-angle data, while the PSYONIC ROS 2 node sends control commands to the Ability Hand.

Step-by-Step Setup Instructions

1. Connect the Hardware

Connect the Metagloves Pro dongle to your Linux workstation and use the MANUS Integrated SDK to stream finger data into ROS 2. Install ROS 2 and download both the MANUS ROS 2 packages and the PSYONIC ROS 2 Python wrapper.

The MANUS nodes publish raw and joint-angle data, and the PSYONIC package provides the command interface for controlling the Ability Hand. Once both are active, the system is ready for integration through the translation layer.

2. Create the Translation Layer

The translation layer bridges MANUS output to PSYONIC input.

Because the kinematic structure of the human hand includes more degrees of freedom (DOF) than the Ability Hand’s actuation model, MANUS joint angles need to be mapped into the robotic hand’s commands.

MANUS provides the MCP, PIP, and DIP angles of the operator’s hand in degrees; in this setup, the three angles are merged into a single normalized bend value per finger and mapped to the PSYONIC finger-position interface.
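As a concrete sketch, the per-finger merge can be as simple as summing the three flexion angles and normalizing. The 250° total-flexion range and the 0–100 position scale below are illustrative assumptions, not values from the PSYONIC API; calibrate them per finger.

```python
def normalized_bend(mcp_deg: float, pip_deg: float, dip_deg: float,
                    max_total_deg: float = 250.0) -> float:
    """Merge MCP/PIP/DIP flexion (degrees) into one 0..1 bend value.

    max_total_deg is an assumed full-flexion sum; tune it per finger
    during calibration.
    """
    total = mcp_deg + pip_deg + dip_deg
    return min(max(total / max_total_deg, 0.0), 1.0)


def to_finger_position(bend: float,
                       min_pos: float = 0.0, max_pos: float = 100.0) -> float:
    """Map a normalized bend onto an assumed finger-position range."""
    return min_pos + bend * (max_pos - min_pos)
```

With this mapping, an open hand (all angles near zero) lands at the minimum position and a full fist saturates at the maximum, which makes the gesture tests in the next step easy to interpret.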

3. Fine-Tune the Data

After the translation layer is running, test basic gestures to identify scaling issues or offsets:

  • Open hand
  • Full fist
  • Pinch
  • Tripod grip

Minor adjustments, such as scaling, zero-offset corrections, or light smoothing, are normal, especially when mapping continuous human motion to the Ability Hand’s reduced DOFs.

Common Issues and Troubleshooting

Most challenges observed during internal testing came from the thumb, whose motion must be mapped onto the Ability Hand’s thumb, which has fewer degrees of freedom. MANUS provides detailed thumb values, including flexion and palmar abduction, while the robot thumb exposes only simpler flexion and rotation inputs.

This challenge was solved through:

  • Thumb rotation → use ThumbMCPspread, scaled slightly
  • Thumb flexion → combine MCP/IP flexion like the other fingers, then apply a small scaling factor

This produces stable, predictable thumb behavior without unintended rotations.
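That two-bullet mapping can be sketched as a single function. All ranges and scale factors below are illustrative assumptions to calibrate, not PSYONIC or MANUS constants, and the input names mirror the MANUS thumb values described above.

```python
def map_thumb(mcp_flex_deg: float, ip_flex_deg: float,
              mcp_spread_deg: float,
              flex_scale: float = 0.9, spread_scale: float = 0.5,
              max_flex_deg: float = 160.0,
              max_spread_deg: float = 60.0) -> tuple[float, float]:
    """Collapse MANUS thumb angles into two normalized commands.

    Returns (flexion, rotation) in 0..1. Flexion combines MCP/IP
    flexion like the other fingers; rotation comes from the MCP
    spread value, scaled down to avoid unintended rotations.
    """
    flex = min(max((mcp_flex_deg + ip_flex_deg) / max_flex_deg, 0.0), 1.0)
    rot = min(max(mcp_spread_deg / max_spread_deg, 0.0), 1.0)
    return flex * flex_scale, rot * spread_scale
```

Scaling the spread term well below 1.0 is the key design choice: it keeps small abduction noise from turning into visible thumb rotation on the robot.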

Conclusion

Using the MANUS Metagloves Pro with the PSYONIC Ability Hand enables a clean, real-time teleoperation setup that closely mirrors natural finger motion. With the MANUS Integrated SDK, a simple translation layer, and light calibration, developers can achieve smooth and intuitive robotic-hand control suitable for research, development, and embodied AI experimentation.

Ready to access machine-grade motion data?
Join leading labs worldwide using MANUS for embodied AI.