Teleoperation enables humans to control robotic hands with natural finger movements. Based on internal testing with the MANUS Metagloves Pro and the PSYONIC Ability Hand, this workflow outlines how to create a responsive, real-time ROS 2 teleoperation setup using MANUS high-precision finger tracking and PSYONIC’s adaptive robotic hand.
This workflow requires the MANUS Metagloves Pro, a PSYONIC Ability Hand, and a Linux workstation running ROS 2. Although MANUS Core is Windows-based, Linux systems can use the MANUS Integrated SDK, which streams glove data directly into the Linux environment. The MANUS ROS 2 nodes publish joint-angle data, while the PSYONIC ROS 2 node sends control commands to the Ability Hand.
Connect the Metagloves Pro dongle to your Linux workstation and use the MANUS Integrated SDK to stream finger data into ROS 2. Install ROS 2 and download both the MANUS ROS 2 packages and the PSYONIC ROS 2 Python wrapper.
The MANUS nodes publish raw and joint-angle data, and the PSYONIC package provides the command interface for controlling the Ability Hand. Once both are active, the system is ready for integration through the translation layer.
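Before wiring the two sides together, it can help to confirm that both interfaces are visible on the ROS 2 graph. The snippet below is a minimal rclpy sketch; the topic names are assumptions, so replace them with whatever your MANUS and PSYONIC packages actually advertise (for example, as listed by `ros2 topic list`).

```python
# Minimal sketch: confirm the assumed MANUS and PSYONIC topics are present
# before starting the translation layer. Topic names below are assumptions.
import rclpy
from rclpy.node import Node

EXPECTED_TOPICS = [
    "/manus/joint_angles",   # assumed MANUS joint-angle output topic
    "/psyonic/finger_cmd",   # assumed PSYONIC finger-position command topic
]

def main():
    rclpy.init()
    node = Node("teleop_topic_check")
    # Give discovery a moment to populate the ROS 2 graph.
    rclpy.spin_once(node, timeout_sec=1.0)
    available = dict(node.get_topic_names_and_types())
    for topic in EXPECTED_TOPICS:
        status = "OK" if topic in available else "MISSING"
        node.get_logger().info(f"{topic}: {status}")
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```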
The translation layer bridges MANUS output to PSYONIC input.
Because the kinematic structure of the human hand includes more degrees of freedom (DOF) than the Ability Hand’s actuation model, MANUS joint angles need to be mapped into the robotic hand’s commands.
MANUS provides the MCP, PIP, and DIP angles of the operator’s fingers in degrees; in this setup, those angles were merged into a single normalized bend value per finger and mapped to the PSYONIC finger-position interface.
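The sketch below shows one way such a translation node could look. It assumes the MANUS node publishes a sensor_msgs/JointState with joints named like index_mcp, index_pip, and index_dip in degrees, and that the PSYONIC wrapper accepts a std_msgs/Float32MultiArray of normalized finger positions in [0, 1]; the topic names, joint naming, and ranges of motion are all assumptions to adapt to your actual packages.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState
from std_msgs.msg import Float32MultiArray

# Fingers driven by the merged bend value; the thumb is handled separately (see below).
FINGERS = ["index", "middle", "ring", "pinky"]
# Rough per-joint ranges of motion in degrees used for normalization (assumed values).
JOINT_RANGES = {"mcp": 90.0, "pip": 100.0, "dip": 80.0}


class ManusToPsyonic(Node):
    """Translate MANUS joint angles into normalized PSYONIC finger commands."""

    def __init__(self):
        super().__init__("manus_to_psyonic")
        # Topic names are assumptions; replace them with the ones your packages use.
        self.sub = self.create_subscription(
            JointState, "/manus/joint_angles", self.on_joints, 10)
        self.pub = self.create_publisher(Float32MultiArray, "/psyonic/finger_cmd", 10)

    def on_joints(self, msg: JointState):
        angles = dict(zip(msg.name, msg.position))
        cmd = Float32MultiArray()
        # Merge MCP, PIP, and DIP into a single normalized bend value per finger.
        cmd.data = [self.normalized_bend(finger, angles) for finger in FINGERS]
        self.pub.publish(cmd)

    @staticmethod
    def normalized_bend(finger: str, angles: dict) -> float:
        total = sum(angles.get(f"{finger}_{joint}", 0.0) for joint in JOINT_RANGES)
        full_range = sum(JOINT_RANGES.values())
        return min(max(total / full_range, 0.0), 1.0)


def main():
    rclpy.init()
    rclpy.spin(ManusToPsyonic())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```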
After the translation layer is running, test basic gestures to identify scaling issues or offsets.
Minor adjustments, such as scaling, zero-offset corrections, or light smoothing, are normal, especially when mapping continuous human motion to the Ability Hand’s reduced DOFs.
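One possible way to wrap these corrections is sketched below, assuming the normalized bend values produced by the translation layer above; the default scale, offset, and smoothing constants are illustrative starting points, not tuned values.

```python
# Per-finger correction applied to the normalized bend value before it is
# sent to the Ability Hand. Default constants are illustrative assumptions.
class FingerCalibration:
    def __init__(self, scale: float = 1.0, offset: float = 0.0, smoothing: float = 0.3):
        self.scale = scale          # gain correction, e.g. > 1.0 if the finger under-closes
        self.offset = offset        # zero-offset correction for the relaxed pose
        self.smoothing = smoothing  # 0 = no filtering, closer to 1 = heavier smoothing
        self._last = None

    def apply(self, bend: float) -> float:
        value = min(max(self.scale * bend + self.offset, 0.0), 1.0)
        # Light exponential smoothing to damp sensor jitter.
        if self._last is None:
            self._last = value
        else:
            self._last = (1.0 - self.smoothing) * value + self.smoothing * self._last
        return self._last
```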
Most of the challenges observed during internal testing came from the thumb, which must be mapped onto the Ability Hand’s thumb with fewer degrees of freedom. MANUS provides detailed thumb values, including flexion and palmar abduction, while the robot thumb only accepts basic flexion inputs.
This challenge was solved by driving the robot thumb from the ThumbMCP flexion value, with the spread (palmar abduction) value scaled slightly before being blended in. This produces stable, predictable thumb behavior without unintended rotations.
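As an illustration of this mapping, the sketch below derives a single normalized thumb command from the ThumbMCP flexion angle plus a lightly scaled spread term; the joint ranges and the spread gain are assumed values, not calibrated ones.

```python
# Sketch of the thumb mapping: ThumbMCP flexion is the main driver, with the
# spread (palmar abduction) angle scaled down and folded in. The ranges and
# the 0.2 spread gain are assumptions used for illustration only.
def map_thumb(thumb_mcp_deg: float, thumb_spread_deg: float,
              mcp_range: float = 60.0, spread_range: float = 50.0,
              spread_gain: float = 0.2) -> float:
    """Return a normalized thumb flexion command in [0, 1]."""
    flexion = thumb_mcp_deg / mcp_range
    spread = thumb_spread_deg / spread_range
    command = flexion + spread_gain * spread
    return min(max(command, 0.0), 1.0)
```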
Using the MANUS Metagloves Pro with the PSYONIC Ability Hand enables a clean, real-time teleoperation setup that closely mirrors natural finger motion. With the MANUS Integrated SDK, a simple translation layer, and light calibration, developers can achieve smooth and intuitive robotic-hand control suitable for research, development, and embodied AI experimentation.