RE2 Robotics receives $2.5 million from US Navy to develop manipulation system
RE2 Robotics, a developer of human-like robotic manipulation systems, has received $2.5 million in funding from the Office of Naval Research to continue the development and commercialization of its technology under the Dexterous Maritime Manipulation System (DM2S) program.
RE2’s DM2S technology will give Navy personnel the ability to perform mine countermeasure (MCM) missions autonomously.
In this next phase of the program, RE2 will upgrade its dual-arm prototype, known as the Maritime Dexterous Manipulation System (MDMS), for deep-ocean use; apply computer vision and machine-learning algorithms to enable autonomous manipulation; and integrate the system with autonomous underwater vehicles.
Jorgen Pedersen, president and CEO of RE2 Robotics, says: “In the first phase of this project, we successfully developed a dexterous underwater robotic system that was capable of teleoperation in an ocean environment.
“This additional funding enables our team to further expand and upgrade the capabilities of our underwater robotic arms to perform MCM tasks in deeper water through the use of autonomy.
“In addition, this advanced technology will allow us to pursue commercial opportunities, such as underwater inspection and maintenance in the oil and gas industry.”
Unlike hydraulically driven underwater robotic systems, MDMS uses an energy-saving electromechanical drive.
This allows the system to perform longer-duration subsea inspection and intervention tasks while reducing maintenance and downtime.
Jack Reinhart, vice president of project management at RE2, says: “With the development of our first MDMS prototype, we created a compact, lightweight system with a sealed, neutrally buoyant design that was successfully tested in the Pacific Ocean.
“We’re now looking forward to improving upon that proven design by adding even greater functionality in deep water, including integration with new underwater vehicles and computer-vision-based autonomy.”