
MIT.nano Studies Expert Piano Performance with MANUS Data Gloves

December 2, 2025
Robotics · Entertainment · Other Fields · XR/VR · Research

About the MIT.nano Immersion Lab Research Initiative

The MIT.nano Immersion Lab is MIT’s cross-disciplinary research environment focused on studying human performance through advanced sensing technologies. Researchers, engineers, performers, and scientists collaborate to understand how movement, physiology, and digital tools can reveal new insights into skilled human behavior. The lab uses motion capture, wearable sensors, imaging tools, and computational analysis to study everything from biomechanics to expressive performance.

The Challenge: Capturing the Nuance of Pianists’ Movement

Studying expert pianists requires measuring extremely subtle details of hand shape, finger articulation, upper-body posture, speed, accuracy, and expressive nuance. Traditional video or observational methods are not precise enough to analyze these micro-movements. For researcher Hannah Park-Kaufmann and principal investigator Praneeth Namburi, the goal was to record full-body movement along with physiological data and highly accurate finger motion, all while allowing the pianist to play naturally.

To achieve this, the team worked with pianist Harrison Lee, combining motion capture, EKG, ultrasound imaging, and wearable sensors throughout an extended recording session. The challenge was to collect detailed data at high fidelity without interfering with Harrison’s ability to perform technically demanding repertoire.
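A central engineering task in a session like this is getting every stream onto one clock. Below is a minimal sketch of that step, assuming each device reports timestamps against a shared host clock; the stream names, sampling rates, and the 200 Hz analysis grid are illustrative assumptions, not details of the lab's actual pipeline.

```python
import numpy as np

def align_streams(streams, rate_hz=200.0):
    """Resample heterogeneous sensor streams onto one shared time grid.

    streams: dict of name -> (timestamps_s, values), where timestamps
    are assumed to lie on a common reference clock (e.g., the host PC).
    Returns (aligned values per stream, the shared time grid).
    """
    t0 = max(t[0] for t, _ in streams.values())   # latest stream start
    t1 = min(t[-1] for t, _ in streams.values())  # earliest stream end
    grid = np.arange(t0, t1, 1.0 / rate_hz)
    aligned = {name: np.interp(grid, t, v) for name, (t, v) in streams.items()}
    return aligned, grid

# Illustrative usage with synthetic signals (all names hypothetical):
t_glove = np.arange(0.0, 10.0, 1 / 120)   # a 120 Hz finger-angle stream
t_ekg = np.arange(0.0, 10.0, 1 / 500)     # a 500 Hz EKG stream
aligned, grid = align_streams({
    "index_flexion_deg": (t_glove, 40 + 20 * np.sin(2 * np.pi * 2 * t_glove)),
    "ekg_mv": (t_ekg, np.sin(2 * np.pi * 1.2 * t_ekg)),
})
```

Once every channel shares one grid, a flexion peak in the glove data can be compared directly against posture or physiological activity at the same instant.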

The Solution: High-Fidelity Finger Tracking

To capture precise finger articulation during performance, the research team integrated MANUS data gloves into their multi-sensor setup. The gloves provided accurate, real-time tracking that allowed the researchers to analyze fast octave passages, fine finger independence, chord transitions, rapid hand adjustments, and timing at high speeds. Throughout a five-hour session involving complex repertoire and additional movement tests, the gloves maintained stable calibration and responsive tracking. This allowed Harrison to play naturally even with multiple sensors attached, producing a comprehensive dataset linking finger motion, body movement, and physiological activity. The information gathered would have been impossible to collect reliably with conventional video or mechanical sensing systems.
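As a rough illustration of the kind of analysis this data supports, the sketch below estimates keystroke onsets from a single finger-flexion trace by peak detection and reports the intervals between them. Exporting glove data as a plain angle array, the joint chosen, and the 50 ms / 5-degree thresholds are assumptions made for the example, not the team's published method or the MANUS SDK.

```python
import numpy as np
from scipy.signal import find_peaks

def keystroke_timing(flexion_deg, rate_hz):
    """Estimate key-press onsets from one finger's flexion-angle trace.

    flexion_deg: 1-D array of joint flexion angles (degrees); local
    flexion peaks are treated as key presses.
    Returns onset times (s) and inter-onset intervals (s).
    """
    # Require presses at least 50 ms apart and at least 5 deg prominent
    # (both thresholds are placeholder values for this sketch).
    peaks, _ = find_peaks(
        flexion_deg,
        distance=max(1, int(0.05 * rate_hz)),
        prominence=5.0,
    )
    onsets = peaks / rate_hz
    return onsets, np.diff(onsets)
```

On a fast octave passage, the spread of those inter-onset intervals gives a simple quantitative handle on evenness at speeds that ordinary video frame rates cannot resolve.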

Behind the Scenes: Motion Capture Meets Musical Performance

Inside the controlled environment of the MIT.nano Immersion Lab, Harrison performed a full repertoire set while wearing MANUS data gloves and a series of physiological and motion sensors. After the primary performances, he completed additional tests, such as rapid octave repetitions, to evaluate the system under technically extreme conditions. Even during these high-speed challenges, the MANUS data gloves delivered low-latency, smooth, and accurate tracking. This enabled the team to examine the biomechanics of virtuosic performance in scientific detail while allowing the musician to play naturally and expressively.
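One simple sanity check for tracking stability under tests like rapid octave repetitions is to scan the recorded stream for timestamp gaps, i.e. dropped or late frames. The sketch below assumes per-frame capture times are available; the nominal frame rate and tolerance are placeholders, not specifications of the glove hardware.

```python
import numpy as np

def frame_gaps(timestamps_s, nominal_hz=120.0, tol=1.5):
    """Flag dropped or delayed frames in a recorded tracking stream.

    timestamps_s: per-frame capture times (s). Any inter-frame interval
    longer than tol x the nominal frame period is reported as a gap.
    """
    dt = np.diff(timestamps_s)
    gap_idx = np.flatnonzero(dt > tol / nominal_hz)
    return gap_idx, dt[gap_idx]
```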

Ready to access machine-grade motion data?
Join leading labs worldwide using MANUS for embodied AI.