In this article, you’ll see how MANUS takes EMF sensor data and transforms it into motion-ready output, with a step-by-step look at capture, skeletal modeling, and retargeting for reliable hand and finger tracking.
MANUS gloves use electromagnetic field (EMF) sensors to deliver low-latency, high-precision hand tracking. The workflow can be broken down into three core data layers: raw sensor data, skeleton and ergonomics data, and retargeted skeleton data.
These three data formats support a range of applications, from gesture recognition to character animation and robotics, translating real-time physical hand motion into digital data.
The raw sensor data describes each sensor's position relative to the magnetic coils inside the top casing on the back of the hand. This means that offsets between the sensors and the joints must be taken into account when interpreting the data.
The data is available for Metagloves Pro with the Core+SDK License.
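To make the offset handling concrete, here is a minimal sketch of how a joint position could be recovered from a coil-relative sensor pose by applying a fixed per-sensor offset. The sensor names, offset values, and function below are illustrative assumptions, not the MANUS SDK API.

```python
import numpy as np

# Hypothetical example: raw sensor poses are reported relative to the EMF coils
# in the back-of-hand casing; a joint position is recovered by applying a fixed
# offset expressed in the sensor's local frame. All names and values are
# illustrative only.

# Rotation (3x3) and translation (metres) from each sensor to the joint it sits on.
SENSOR_TO_JOINT_OFFSET = {
    "index_tip_sensor": (np.eye(3), np.array([0.0, 0.0, -0.008])),
    "thumb_tip_sensor": (np.eye(3), np.array([0.0, 0.0, -0.010])),
}

def joint_position(sensor_name, sensor_rotation, sensor_position):
    """Transform a coil-relative sensor pose into the corresponding joint position."""
    offset_rot, offset_trans = SENSOR_TO_JOINT_OFFSET[sensor_name]
    # The joint sits at a fixed offset expressed in the sensor's local frame,
    # so the offset is rotated into the coil frame before being added.
    return sensor_position + sensor_rotation @ (offset_rot @ offset_trans)

# Example: a sensor 5 cm in front of the coil, rotated 90 degrees about Z.
rot = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
print(joint_position("index_tip_sensor", rot, np.array([0.05, 0.0, 0.0])))
```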
The skeleton data is MANUS Core’s internal representation of the user’s hand. It is generated using the device’s sensor data, calibration values, and the Advanced Hand Solver. It consists of 25 nodes, representing the 25 joints of each hand, and serves as a digital clone of the wearer’s hand.
For this format, both skeletal data and ergonomics data are available. Ergonomics data captures hand shape as bend and stretch values for each joint in degrees (flexion/extension and splay/abduction), making it particularly valuable for movement research and rehabilitation.
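As a rough sketch of what these two representations contain, the data model below pairs a 25-node skeleton frame with per-joint ergonomics angles in degrees. The class and field names are assumptions made for illustration, not MANUS SDK types.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SkeletonNode:
    node_id: int                    # 0..24, one per hand joint
    parent_id: int                  # -1 for the wrist/root node
    position: Tuple[float, float, float]   # offset relative to the parent node
    rotation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)

@dataclass
class JointErgonomics:
    joint_name: str                 # e.g. "index_MCP" (illustrative name)
    flexion_extension_deg: float    # bend/stretch of the joint, in degrees
    splay_abduction_deg: float      # sideways spread of the joint, in degrees

@dataclass
class HandFrame:
    timestamp_s: float
    nodes: List[SkeletonNode] = field(default_factory=list)        # 25 entries
    ergonomics: List[JointErgonomics] = field(default_factory=list)
```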
You can visualize the ergonomics data in the MANUS Core Dashboard, which displays individual finger flex angles and CMC spread values. Skeleton and ergonomics data can also be exported as .CSV files for further analysis.
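If you work with the exported .CSV files, a few lines of Python are enough to start analyzing a recording. The file name and column names below are assumptions and will depend on your export.

```python
import pandas as pd

# Illustrative only: "ergonomics_export.csv" and the column names
# "index_MCP_flex_deg" / "thumb_CMC_spread_deg" are assumed, not fixed by MANUS.
frames = pd.read_csv("ergonomics_export.csv")

# Peak index-finger MCP flexion and mean thumb CMC spread over the recording.
print("max index MCP flexion:", frames["index_MCP_flex_deg"].max())
print("mean thumb CMC spread:", frames["thumb_CMC_spread_deg"].mean())
```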
Retargeted skeletons are the models streamed to the plugins that animate your characters.
With the MANUS Core Developer Tools (DevTools), you can prepare your character models for retargeting. Models can be sent directly from the Unreal and Unity plugins to DevTools, where you define the skeleton by labeling bones and creating chains. Once complete, you can either send the skeleton definition back to the plugin or export it as an .mskl file for manual integration.
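Conceptually, a skeleton definition records which bones in your rig belong to each finger chain. The sketch below illustrates that idea with example bone names from a typical humanoid rig; it does not reflect the actual DevTools workflow or the .mskl file format.

```python
# Example chain mapping: which rig bones make up each finger chain.
# Bone names are examples from a typical humanoid rig, not required values.
HAND_CHAINS = {
    "thumb":  ["thumb_01_l", "thumb_02_l", "thumb_03_l"],
    "index":  ["index_01_l", "index_02_l", "index_03_l"],
    "middle": ["middle_01_l", "middle_02_l", "middle_03_l"],
    "ring":   ["ring_01_l", "ring_02_l", "ring_03_l"],
    "pinky":  ["pinky_01_l", "pinky_02_l", "pinky_03_l"],
}

def validate_chains(rig_bones, chains=HAND_CHAINS):
    """Check that every bone referenced by a chain actually exists in the rig."""
    missing = [b for bones in chains.values() for b in bones if b not in rig_bones]
    if missing:
        raise ValueError(f"Rig is missing bones: {missing}")
    return True
```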
Retargeted data is not limited to human-proportioned hands. For example, our partner Het Nieuwe Kader retargeted MANUS hand data onto a rendered robotic hand (see image below).
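One simple way to picture this kind of retargeting is remapping a human joint angle into a robot joint's own range of motion, as in the sketch below. The joint names, ranges, and limits are invented for illustration and do not describe Het Nieuwe Kader's actual setup.

```python
# Illustrative only: linearly remap human finger flexion onto a robot joint
# whose range of motion differs from the human one.
ROBOT_JOINT_LIMITS_DEG = {
    "robot_index_flex": (0.0, 110.0),   # assumed robot joint range
}
HUMAN_JOINT_RANGE_DEG = {
    "index_MCP_flex": (0.0, 90.0),      # assumed human MCP flexion range
}

def retarget_angle(human_joint, robot_joint, human_angle_deg):
    """Linearly remap a human joint angle into the robot joint's range."""
    h_min, h_max = HUMAN_JOINT_RANGE_DEG[human_joint]
    r_min, r_max = ROBOT_JOINT_LIMITS_DEG[robot_joint]
    t = (human_angle_deg - h_min) / (h_max - h_min)
    t = max(0.0, min(1.0, t))           # clamp so the robot never exceeds its limits
    return r_min + t * (r_max - r_min)

print(retarget_angle("index_MCP_flex", "robot_index_flex", 45.0))  # -> 55.0
```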
From the first finger movement to animating a virtual hand or controlling a robot, the MANUS data layers ensure every detail is captured, interpreted, and applied with precision. By converting EMF sensor data into structured skeletons and adaptable retargeted motion, creators, engineers, and researchers gain a consistent and reliable foundation for their work.