DEEP Robotics has unveiled new performance claims for its DR02 humanoid robot, positioning the system as an all-weather platform capable of operating reliably beyond laboratory environments.
According to the company, the DR02 is designed to address two longstanding challenges in humanoid robotics: limited environmental adaptability and rigid, unnatural movement.
The robot carries an IP66 rating, meaning its enclosure is dust-tight and protected against powerful water jets, and it incorporates motion-control updates intended to improve stability, coordination, and smoothness during dynamic tasks.
DEEP Robotics says the DR02 has undergone “systematic upgrades across three dimensions – whole-body coordinated movement, dynamic disturbance rejection stability, and smooth action transition”, with the goal of improving performance in complex real-world scenarios.
A central focus of the platform is whole-body coordination. Rather than relying on isolated joint control, the DR02 uses coordinated motion across multiple joints, allowing it to recover from falls and transition from lying down to standing without external assistance.
The company highlights active waist control as a key element, enabling the robot to adjust its center of gravity and redistribute momentum during rapid movements such as turns and kicks.
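DEEP Robotics has not published the control formulation behind this, but the basic reason waist motion matters for balance is straightforward: the whole-body center of mass is the mass-weighted average of the link positions, so pitching the torso shifts it. The sketch below illustrates that effect with a crude three-link model; the masses, heights, and hip pivot are illustrative assumptions, not DR02 specifications.

```python
# Minimal sketch: how a waist (torso pitch) adjustment shifts the whole-body
# center of mass (CoM). Link masses and positions are made-up illustrative
# values, not DR02 data.
import math

# (mass_kg, x_m, z_m) for a crude three-link model: legs, torso, arms+head
LINKS = {
    "legs":  (20.0, 0.00, 0.45),
    "torso": (25.0, 0.00, 1.05),
    "upper": (10.0, 0.00, 1.45),
}

def center_of_mass(links):
    """Mass-weighted average of link positions (x, z)."""
    total = sum(m for m, _, _ in links.values())
    x = sum(m * x for m, x, _ in links.values()) / total
    z = sum(m * z for m, _, z in links.values()) / total
    return x, z

def pitch_torso(links, angle_rad, hip_height=0.90):
    """Rotate the torso and upper body about an assumed hip pivot (sagittal plane)."""
    adjusted = dict(links)
    for name in ("torso", "upper"):
        m, x, z = links[name]
        dz = z - hip_height
        adjusted[name] = (m,
                          x + dz * math.sin(angle_rad),
                          hip_height + dz * math.cos(angle_rad))
    return adjusted

upright = center_of_mass(LINKS)
leaned = center_of_mass(pitch_torso(LINKS, math.radians(15)))
print(f"CoM x upright: {upright[0]:.3f} m, after 15 deg torso pitch: {leaned[0]:.3f} m")
```

With these example numbers, a 15-degree torso pitch moves the center of mass roughly 4 cm forward, which is the kind of adjustment a waist joint can contribute during fast turns or kicks.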
Stability under external disturbance is another claimed improvement. DEEP Robotics says the DR02 can maintain balance during high-speed actions and recover quickly from interference, including exposure to water splashes. Combined with its sealed enclosure, this is intended to support outdoor operation in rain, dust, and other challenging conditions.
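The company provides no technical detail on how this stability is achieved, but a standard way to reason about push recovery is the linear inverted pendulum's capture point: if the point x_com + v_com/ω (with ω = sqrt(g/h)) stays inside the foot support region, the robot can rebalance without stepping. The sketch below shows that check; the velocities, heights, and `support` bounds are assumptions for illustration, not DR02 parameters or DEEP Robotics' actual method.

```python
# Minimal sketch of a capture-point balance check (linear inverted pendulum
# model), a common way to reason about push recovery. Values are illustrative
# assumptions, not DR02 parameters.
import math

G = 9.81  # gravity, m/s^2

def capture_point(com_x, com_vx, com_height):
    """Instantaneous capture point under the linear inverted pendulum model."""
    omega = math.sqrt(G / com_height)
    return com_x + com_vx / omega

def can_rebalance_in_place(com_x, com_vx, com_height, support=(-0.10, 0.15)):
    """True if the capture point lies within the (assumed) foot support bounds,
    i.e. the push can be absorbed without taking a step."""
    cp = capture_point(com_x, com_vx, com_height)
    return support[0] <= cp <= support[1]

# A push imparting 0.3 m/s of forward CoM velocity at 0.8 m CoM height:
print(can_rebalance_in_place(0.0, 0.3, 0.8))   # True  (capture point ~0.09 m)
print(can_rebalance_in_place(0.0, 0.8, 0.8))   # False (~0.23 m, a step is needed)
```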
The company also points to smoother action transitions as evidence of progress toward more human-like motion. Demonstrations such as Tai Chi routines are used to illustrate reduced mechanical stuttering as the robot moves from one pose to the next.
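The source does not say how the DR02 blends its motions, but stutter-free transitions are commonly achieved by interpolating between target poses along a smooth profile rather than switching abruptly; a minimum-jerk curve is one textbook choice. The joint values below are arbitrary examples.

```python
# Minimal sketch: blending a joint angle between two targets with a
# minimum-jerk profile, a common way to avoid visible stuttering between
# motions. Values are illustrative, not DR02 trajectories.
def minimum_jerk(tau):
    """Smooth 0 -> 1 profile with zero velocity and acceleration at both ends."""
    return 10 * tau**3 - 15 * tau**4 + 6 * tau**5

def blend(start_deg, end_deg, steps=5):
    """Interpolate a joint angle from start to end along the smooth profile."""
    return [start_deg + (end_deg - start_deg) * minimum_jerk(i / (steps - 1))
            for i in range(steps)]

print(blend(0.0, 30.0))  # eases in and out instead of jumping between poses
```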
Looking ahead, DEEP Robotics positions the DR02 as a platform for scenario-based deployment, including logistics, industrial inspection, and emergency response.
The company says the robot integrates motion control, perception, and decision-making into a closed-loop system, supporting what it describes as a shift from “laboratory demonstrations” to “scenario-based practical work”.
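The architecture behind this closed loop has not been published, but the pattern the company describes, perception feeding decision-making feeding motion control at a fixed rate, looks roughly like the skeleton below. Every class and method name here is a hypothetical placeholder, not a DEEP Robotics API or the DR02's actual software stack.

```python
# Skeleton of a generic perception -> decision -> control loop of the kind the
# company describes. All names are hypothetical placeholders.
import time

class PerceptionStub:
    def sense(self):
        # Would fuse camera/LiDAR/IMU data into a world estimate.
        return {"obstacle_ahead": False, "body_tilt_deg": 1.2}

class PlannerStub:
    def decide(self, world):
        # Would select a task-level action from the perceived state.
        return "stop" if world["obstacle_ahead"] else "walk_forward"

class MotionControllerStub:
    def act(self, command, world):
        # Would convert the command into whole-body joint targets.
        print(f"executing {command} (tilt {world['body_tilt_deg']:.1f} deg)")

def control_loop(hz=100, cycles=3):
    perception, planner, controller = PerceptionStub(), PlannerStub(), MotionControllerStub()
    period = 1.0 / hz
    for _ in range(cycles):              # a real loop would run indefinitely
        world = perception.sense()       # perceive
        command = planner.decide(world)  # decide
        controller.act(command, world)   # act
        time.sleep(period)

if __name__ == "__main__":
    control_loop()
```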
