Berkeley spin-off Embodied Intelligence raises $7 million to build ‘teachable’ robots

Leading researchers in artificial intelligence, deep learning, and reinforcement learning from OpenAI and UC Berkeley assemble to launch an ambitious new project

Embodied Intelligence, a startup established by AI researchers from the University of California, Berkeley, and their partners at OpenAI, has raised $7 million in seed funding.

The company says it is “striving to democratize robotics” by enabling anyone to teach a robot new skills.

The funding round was led by Amplify Partners with participation from Lux Capital, SV Angel, FreeS, 11.2 Capital and A.Capital.

The capital will be used to develop AI software that makes it easy to teach robots new, complex skills.

Building on the founders’ pioneering research in deep imitation learning, deep reinforcement learning and meta-learning, Embodied Intelligence is developing AI software – or “robot brains” – that can be loaded onto any existing robots.

While traditional robot programming requires writing code, a time-consuming endeavor even for robotics experts, Embodied Intelligence's software will empower anyone to program a robot by simply donning a VR headset and guiding the robot through a task.

These human demonstrations train deep neural nets, which are further tuned through the use of reinforcement learning, resulting in robots that can be easily taught a wide range of skills in areas where existing solutions break down.
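To make that two-stage recipe concrete, here is a minimal, self-contained sketch in PyTorch: a policy network is first trained by behavioural cloning on (observation, action) pairs of the kind a VR teleoperation session would produce, then fine-tuned with a simple REINFORCE-style policy-gradient update. Everything in it (the dimensions, the synthetic "demonstrations", the toy reward) is a hypothetical stand-in for illustration, not Embodied Intelligence's actual software.

```python
# Sketch of "teach by demonstration, refine by reinforcement".
# All data here is synthetic stand-in data: a real system would record
# (observation, action) pairs from VR teleoperation of a physical robot.

import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 16, 4  # hypothetical observation / action sizes

# Toy stand-in for logged VR demonstrations: observations and the teleoperator's actions.
demo_obs = torch.randn(512, OBS_DIM)
demo_act = torch.tanh(demo_obs @ torch.randn(OBS_DIM, ACT_DIM))

policy = nn.Sequential(
    nn.Linear(OBS_DIM, 64), nn.ReLU(),
    nn.Linear(64, ACT_DIM), nn.Tanh(),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Stage 1: deep imitation learning (behavioural cloning) on the demonstrations.
for epoch in range(200):
    loss = nn.functional.mse_loss(policy(demo_obs), demo_act)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Stage 2: reinforcement-learning fine-tuning. Here a REINFORCE-style update on a
# toy reward; a real system would collect rollouts on the robot or in simulation.
def toy_reward(obs, act):
    # Hypothetical task reward: higher when the action matches a target direction.
    return -(act - torch.tanh(obs[..., :ACT_DIM])).pow(2).sum(-1)

log_std = torch.zeros(ACT_DIM, requires_grad=True)
rl_opt = torch.optim.Adam(list(policy.parameters()) + [log_std], lr=1e-4)

for step in range(100):
    obs = torch.randn(64, OBS_DIM)        # stand-in for fresh robot observations
    dist = torch.distributions.Normal(policy(obs), log_std.exp())
    act = dist.sample()
    reward = toy_reward(obs, act)
    advantage = reward - reward.mean()    # simple baseline
    pg_loss = -(dist.log_prob(act).sum(-1) * advantage.detach()).mean()
    rl_opt.zero_grad()
    pg_loss.backward()
    rl_opt.step()
```

The design point the sketch illustrates is that the demonstrations anchor the policy, so reinforcement learning only has to refine behaviour it has already roughly acquired rather than discover it from scratch.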

Complicated tasks that stand to benefit from Embodied Intelligence's work include:

  • manipulating deformable objects such as wires, fabrics, linens, apparel, fluid bags, and food;
  • picking parts and order items out of cluttered, unstructured bins; and
  • completing assemblies where hard automation struggles due to variability in parts, configurations, and individualization of orders.

Sunil Dhaliwal, general partner at Amplify Partners, says: “Recent breakthroughs in AI have enabled robots to learn locomotion, develop manipulation skills from trial and error, and to learn from VR demonstrations. However, all of these advances have been in simulation or laboratory environments.

“The Embodied Intelligence team that led much of this work will now bring these cutting-edge AI and robotics advances into the real world.”

Pieter Abbeel, president and chief scientist of Embodied Intelligence, says: “Traditional robot programming requires substantial time and expertise.

“What we will provide is an AI layer that can be added to any existing robot, enabling robots to learn new skills rather than requiring explicit programming.”

Peter Chen, CEO of Embodied Intelligence, says: “Commodity VR devices provide an easy way to control and teach physical robots.

“Since the robot simply mimics the hand motion that’s tracked by VR, a person without any special training can make the robot do the right thing right from the beginning.

“The robot will keep learning and after a while the robot says, ‘I got this, I can do this task on my own now’.”
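Chen's description amounts to a very simple control loop. The sketch below, which uses hypothetical stub functions in place of any real VR or robot SDK, shows the idea: at each tick the robot's end effector tracks the operator's hand pose, and every (observation, action) pair is logged as a demonstration for the learning stage described earlier.

```python
# Sketch of the teleoperation loop Chen describes: the robot end effector simply
# mimics the tracked VR controller pose, and each (observation, action) pair is
# logged for later imitation learning. The pose source and robot interface below
# are hypothetical stubs, not a real VR or robot API.

import time
import random

def read_vr_controller_pose():
    """Hypothetical stub: return a 6-DoF pose (x, y, z, roll, pitch, yaw)."""
    return [random.uniform(-0.1, 0.1) for _ in range(6)]

def read_robot_state():
    """Hypothetical stub: return the robot's current observation vector."""
    return [random.uniform(-1.0, 1.0) for _ in range(8)]

def send_end_effector_target(pose):
    """Hypothetical stub: command the robot's end effector toward the given pose."""
    pass

demonstration_log = []  # (observation, action) pairs for later training

for _ in range(100):                      # ticks of one teaching session
    obs = read_robot_state()
    target = read_vr_controller_pose()    # the human's tracked hand motion
    send_end_effector_target(target)      # robot mimics the motion directly
    demonstration_log.append((obs, target))
    time.sleep(0.01)                      # roughly a 100 Hz control loop
```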

Shahin Farshchi, partner at Lux Capital, says: “Despite remarkable advances, robots continue to require armies of PhDs to make them perform human-like tasks.

“Embodied Intelligence advances the state of the art in reinforcement and imitation learning, yielding robots that can be trained to do complicated tasks, quickly.”

Rocky Duan, CTO of Embodied Intelligence, says: “Advanced robotic capabilities have been confined to established players that can afford costly R&D efforts.

“Our teachable robots will empower businesses of any size to incorporate robotics into their manufacturing processes and keep up with the competition.”

The founding team behind Embodied Intelligence has a combined 30 years of experience in artificial intelligence, deep learning and robotics.

Abbeel’s lab at UC Berkeley has pioneered many recent breakthroughs in robot learning, including a robot that organizes laundry, robots that learn (simulated) locomotion, and robots that learn vision-based manipulation from their own trial and error and from human VR teleoperation.

Founders Pieter Abbeel, Peter Chen and Rocky Duan recently spent over a year at OpenAI, “pushing the frontier” in deep imitation learning, reinforcement learning, unsupervised learning, and learning-to-learn.
