Techman Robot has presented its latest developments in humanoid robotics at Nvidia GTC 2026, including a motion capture-based training approach for its TM Xplore I platform.
The company also announced a collaboration with j-mex, integrating wearable motion capture technology into its robotics development workflow.
Motion capture used to train humanoid robots
The system combines Techman’s TM Xplore I humanoid robot with j-mex’s Moxi motion capture suit and VR interface, allowing operators to control the robot through real-time human movement.
According to the companies, users wearing a VR headset and motion capture suit can have the robot replicate detailed actions – such as dual-arm grasping and object sorting – in real time.
The approach is designed to generate high-quality training data by capturing human motion directly, which can then be used to improve robotic performance in real-world tasks.
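The companies have not published implementation details, but the workflow described above – an operator's captured poses logged frame by frame as demonstrations for later training – can be sketched roughly as follows. All names and the data schema here are illustrative assumptions, not Techman's or j-mex's actual interfaces.

```python
import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class MocapFrame:
    """One frame of operator motion (hypothetical schema)."""
    timestamp: float
    joint_angles: List[float]  # e.g. arm joint angles in radians
    gripper_closed: bool

@dataclass
class Demonstration:
    """A recorded teleoperation episode usable as training data."""
    task: str
    frames: List[MocapFrame] = field(default_factory=list)

def record_step(demo: Demonstration, joint_angles, gripper_closed):
    """Append the operator's current pose; a real pipeline would also
    log robot state and camera images for imitation learning."""
    demo.frames.append(
        MocapFrame(time.time(), list(joint_angles), gripper_closed)
    )

# Simulated capture of a short dual-arm grasp demonstration
demo = Demonstration(task="dual_arm_grasp")
for step in range(3):
    record_step(demo, joint_angles=[0.1 * step] * 6,
                gripper_closed=(step == 2))

print(len(demo.frames))  # 3 recorded frames
```

In practice, many such episodes collected across operators and tasks would form the "high-quality training data" the companies describe.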
Techman says this method supports faster development of “sim-to-real” capabilities, where robots trained in simulated or controlled environments can operate more effectively in dynamic physical settings.
Shift toward data-driven robot training
The use of motion capture reflects a broader shift in robotics toward data-driven training methods, particularly in the development of so-called “physical AI” systems.
By collecting large volumes of real-world motion data, developers can train robots to perform more complex and variable tasks without relying solely on traditional programming techniques.
“Future automation will no longer be limited to monotonous, repetitive tasks, but will fully advance toward deep, intelligent collaboration equipped with perception and thinking capabilities,” said Scott Huang, chief operating officer at Techman Robot.
Collaboration with motion capture specialist j-mex
j-mex says its inertial measurement unit (IMU)-based motion capture technology enables precise tracking of human movement, which can be translated into robotic actions.
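One common form this translation takes is retargeting: converting an IMU-derived human joint angle into a command within the robot's joint limits. The sketch below assumes a single joint with hypothetical limits and a simple scale-and-clamp mapping; it is a minimal illustration, not j-mex's or Techman's actual method.

```python
import math

# Hypothetical limits for one robot arm joint (radians)
ROBOT_JOINT_MIN = -2.0
ROBOT_JOINT_MAX = 2.0

def retarget_joint(human_angle_deg: float, scale: float = 1.0) -> float:
    """Map an IMU-derived human joint angle (degrees) to a robot
    joint command (radians), clamped to the robot's limits."""
    cmd = math.radians(human_angle_deg) * scale
    return max(ROBOT_JOINT_MIN, min(ROBOT_JOINT_MAX, cmd))

print(retarget_joint(90.0))   # within limits: ~1.571 rad
print(retarget_joint(200.0))  # exceeds limits: clamped to 2.0
```

A production system would apply this per joint across the whole body, typically with filtering and safety checks before commands reach the robot.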
“Through high-precision IMU motion capture technology, we can accurately convert subtle human movements into high-quality training data. We are delighted to work with Techman Robot to jointly accelerate the practical application of Physical AI,” said Tsang-Der Ni, CEO of j-mex.
Positioning within physical AI development
Techman Robot says it is focusing on integrating perception, decision-making, and motion control capabilities across both collaborative and humanoid robotics platforms.
The company describes its approach as combining vision, reasoning, and execution within a unified system, reflecting wider industry efforts to develop robots capable of operating in less structured environments.
The TM Xplore I demonstration at GTC highlights ongoing work to bridge the gap between human behavior and robotic execution, as companies explore new ways to train machines for real-world applications.