The robot smiles back: Columbia scientists teach robot how to respond to human facial expressions

May 28, 2021 by Mark Allinson

Columbia Engineering researchers use AI to teach robots to make appropriate reactive human facial expressions, an ability that could build trust between humans and their robotic co-workers and caregivers.

While our facial expressions play a huge role in building trust, most robots still sport the blank and static visage of a professional poker player.

As robots are increasingly deployed in places where they and humans must work closely together, from nursing homes to warehouses and factories, the need for a more responsive, facially realistic robot is growing more urgent.

Long interested in the interactions between robots and humans, researchers in the Creative Machines Lab at Columbia Engineering have been working for five years to create EVA, a new autonomous robot with a soft and expressive face that responds to match the expressions of nearby humans.

The research will be presented at the ICRA conference on May 30, 2021, and the robot blueprints are open-sourced in HardwareX (April 2021).

“The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes,” said Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab.

Lipson observed a similar trend in the grocery store, where he encountered restocking robots wearing name badges, and in one case, decked out in a cozy, hand-knit cap.

“People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name,” he said. “This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?”

While this sounds simple, creating a convincing robotic face has been a formidable challenge for roboticists.

For decades, robotic body parts have been made of metal or hard plastic, materials that were too stiff to flow and move the way human tissue does. Robotic hardware has been similarly crude and difficult to work with—circuits, sensors, and motors are heavy, power-intensive, and bulky.

The first phase of the project began in Lipson’s lab several years ago when undergraduate student Zanwar Faraj led a team of students in building the robot’s physical “machinery”.

They constructed EVA as a disembodied bust that bears a strong resemblance to the silent but facially animated performers of the Blue Man Group.

EVA can express the six basic emotions of anger, disgust, fear, joy, sadness, and surprise, as well as an array of more nuanced emotions. It does this using artificial “muscles” (cables and motors) that pull on specific points of EVA’s face, mimicking the movements of the more than 42 tiny muscles attached at various points to the skin and bones of human faces.
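To give a rough feel for how a cable-driven face could compose expressions, here is a minimal Python sketch. The cable names, pull values, and the idea of blending two basic expressions into a more nuanced one are purely hypothetical illustrations, not EVA’s actual control scheme.

```python
# Purely illustrative: hypothetical cable names and normalized pull levels
# (0.0 = slack, 1.0 = fully pulled). Not EVA's actual control code.
BASIC_EXPRESSIONS = {
    "joy":      {"mouth_corner_left": 0.9, "mouth_corner_right": 0.9, "brow_raise": 0.3},
    "sadness":  {"mouth_corner_left": 0.1, "mouth_corner_right": 0.1, "brow_inner": 0.8},
    "surprise": {"brow_raise": 1.0, "jaw_open": 0.7},
}

def blend(expr_a: str, expr_b: str, t: float) -> dict:
    """Interpolate between two basic expressions to sketch a more nuanced one."""
    cables = set(BASIC_EXPRESSIONS[expr_a]) | set(BASIC_EXPRESSIONS[expr_b])
    return {
        c: round((1 - t) * BASIC_EXPRESSIONS[expr_a].get(c, 0.0)
                 + t * BASIC_EXPRESSIONS[expr_b].get(c, 0.0), 2)
        for c in cables
    }

print(blend("joy", "surprise", 0.5))  # e.g. a "pleasantly surprised" mixture
```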

“The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions,” Faraj noted.

To overcome this challenge, the team relied heavily on 3D printing to manufacture parts with complex shapes that integrated seamlessly and efficiently with EVA’s skull.

After weeks of tugging cables to make EVA smile, frown, or look upset, the team noticed that EVA’s blue, disembodied face could elicit emotional responses from their lab mates.

“I was minding my own business one day when EVA suddenly gave me a big, friendly smile,” Lipson recalled. “I knew it was purely mechanical, but I found myself reflexively smiling back.”

Once the team was satisfied with EVA’s “mechanics,” they began to address the project’s second major phase: programming the artificial intelligence that would guide EVA’s facial movements.

While lifelike animatronic robots have been in use at theme parks and in movie studios for years, Lipson’s team made two technological advances.

EVA uses deep learning artificial intelligence to “read” and then mirror the expressions on nearby human faces. And EVA’s ability to mimic a wide range of different human facial expressions is learned by trial and error from watching videos of itself.

The most difficult human activities to automate involve non-repetitive physical movements that take place in complicated social settings. Boyuan Chen, Lipson’s PhD student who led the software phase of the project, quickly realized that EVA’s facial movements were too complex a process to be governed by pre-defined sets of rules.

To tackle this challenge, Chen and a second team of students created EVA’s brain using several deep learning neural networks.

The robot’s brain needed to master two capabilities: First, to learn to use its own complex system of mechanical muscles to generate any particular facial expression, and, second, to know which faces to make by “reading” the faces of humans.
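As a loose illustration of those two capabilities, the sketch below defines two small PyTorch networks, one that maps an “expression code” to motor commands and one that reads a camera image into an expression code, then chains them so that a human face in produces motor commands out. The layer sizes, the 16-dimensional expression code, and the 12-motor output are assumptions made for illustration, not the architecture the researchers describe.

```python
# A hedged sketch of the two capabilities, using PyTorch. The layer sizes,
# the 16-dimensional "expression code", and the 12 cable motors are
# illustrative assumptions, not EVA's published architecture.
import torch
import torch.nn as nn

LATENT_DIM = 16   # assumed size of an "expression" code
N_MOTORS = 12     # assumed number of cable motors

# Capability 1: turn a desired expression code into motor commands.
expression_to_motors = nn.Sequential(
    nn.Linear(LATENT_DIM, 64), nn.ReLU(),
    nn.Linear(64, N_MOTORS), nn.Sigmoid(),   # normalized motor positions in [0, 1]
)

# Capability 2: "read" a face image into an expression code.
face_to_expression = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, LATENT_DIM),
)

# Mirroring = chain the two: human face in, motor commands out.
frame = torch.rand(1, 3, 128, 128)   # dummy camera frame
motors = expression_to_motors(face_to_expression(frame))
print(motors.shape)                  # torch.Size([1, 12])
```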

To teach EVA what its own face looked like, Chen and team filmed hours of footage of EVA making a series of random faces.

Then, like a human watching herself on Zoom, EVA’s internal neural networks learned to pair muscle motion with the video footage of its own face.
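In code, that kind of self-supervised “self-image” learning might look like the following sketch: a network is trained on recorded pairs of random motor commands and the face they produced, represented here as facial landmarks. The motor count, the landmark representation, and the stand-in data are all assumptions made for illustration.

```python
# A minimal sketch (with stand-in data) of self-supervised "self-image"
# learning: predict what the robot's own face will look like, as landmark
# positions, for a given motor command. Sizes are illustrative assumptions.
import torch
import torch.nn as nn

N_MOTORS, N_LANDMARKS = 12, 68               # assumed sizes

self_image_model = nn.Sequential(
    nn.Linear(N_MOTORS, 128), nn.ReLU(),
    nn.Linear(128, N_LANDMARKS * 2),         # predicted (x, y) for each landmark
)
optimizer = torch.optim.Adam(self_image_model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for hours of recorded "face babbling" footage:
motor_log = torch.rand(1000, N_MOTORS)            # random commands that were sent
landmark_log = torch.rand(1000, N_LANDMARKS * 2)  # landmarks extracted from the video

for epoch in range(5):
    pred = self_image_model(motor_log)
    loss = loss_fn(pred, landmark_log)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```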

Now that EVA had a primitive sense of how its own face worked (known as a “self-image”), it used a second network to match its own self-image with the image of a human face captured on its video camera.

After several refinements and iterations, EVA acquired the ability to read human face gestures from a camera, and to respond by mirroring that human’s facial expression.
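One plausible way to close that loop, offered here as an assumption rather than the team’s documented method, is to search through a trained self-image model for the motor commands whose predicted face best matches the landmarks tracked on the human’s face. The sketch below does this by gradient descent through an untrained stand-in model; in practice the model would be the trained network from the previous sketch, and the landmarks would come from a face-tracking pipeline.

```python
# Hypothetical sketch: pick motor commands whose predicted self-image best
# matches the human's facial landmarks, by gradient descent through a
# (here untrained, stand-in) self-image model.
import torch
import torch.nn as nn

N_MOTORS, N_LANDMARKS = 12, 68                      # assumed sizes
self_image_model = nn.Sequential(                   # stand-in for the trained model
    nn.Linear(N_MOTORS, 128), nn.ReLU(),
    nn.Linear(128, N_LANDMARKS * 2),
)

human_landmarks = torch.rand(1, N_LANDMARKS * 2)    # stand-in for face-tracker output
motors = torch.full((1, N_MOTORS), 0.5, requires_grad=True)
opt = torch.optim.Adam([motors], lr=0.05)

for _ in range(200):                                # refine the commands iteratively
    predicted_self = self_image_model(motors)
    loss = ((predicted_self - human_landmarks) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print(motors.detach().clamp(0, 1))                  # commands to send to the cable motors
```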

The researchers note that EVA is a laboratory experiment, and mimicry alone is still a far cry from the complex ways in which humans communicate using facial expressions. But such enabling technologies could someday have beneficial, real-world applications.

For example, robots capable of responding to a wide variety of human body language would be useful in workplaces, hospitals, schools, and homes.

“There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers,” said Lipson. “Our brains seem to respond well to robots that have some kind of recognizable physical presence.”

Added Chen, “Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important.”

Main image: Data collection process: EVA practices random facial expressions while recording itself with a front-facing camera. Credit: Creative Machines Lab/Columbia Engineering.
