Engineers anticipate a future in which robots are a normal part of everyday life, caring for children and aging parents and providing labor. For humans, facial expressions are an unspoken form of communication and play an important role in building trust between people. Scientists have therefore worked to build robots that can make the right facial expression at the right time.
Engineers in the Creative Machines Lab at Columbia Engineering have spent half a decade building Eva, a new autonomous robot with a soft, expressive face that can respond to the expressions of nearby humans. Hod Lipson, a researcher on the project, said the idea for Eva began to take shape several years ago.
Lipson, the James and Sally Scapa Professor of Innovation and director of the Creative Machines Lab, said he and his students began noticing that people working alongside robots stocking shelves and performing other real-world tasks were humanizing the machines, dressing them up with googly eyes and hand-knitted hats. The researchers reasoned that if accessories like hats and clothes make robots seem more human and relatable, then robots built with genuinely expressive faces would be even more so.
The team said the biggest challenge in creating the robot was designing a system compact enough to fit within the confines of a human skull, yet functional enough to produce a wide range of facial expressions. They used 3D printing to build parts with complex shapes that integrate smoothly into the machine's skull. Once the mechanics were sorted out, the team moved to the next phase: programming the AI that guides Eva's facial movements.
Eva uses machine learning to read, and then mirror, the expressions on nearby human faces. The AI learned to make new expressions by trial and error, watching videos of itself. The team acknowledged that Eva is still a laboratory experiment, far from matching the complex ways humans communicate through facial expression.
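To give a flavor of how learning by "watching videos of itself" can work, here is a minimal, purely illustrative sketch in Python. It is not the team's actual method or code: a toy linear model stands in for Eva's neural networks, the face is simulated as a matrix of landmark positions, and all names and dimensions are invented for the example. The idea shown is the same in spirit, though: the robot first issues random motor commands and records the expressions they produce (a "babbling" phase), then fits an inverse model from expression back to motor command, which it can later use to imitate an expression it sees.

```python
import numpy as np

# Toy self-supervised mimicry loop (hypothetical simplification of Eva's
# approach). Dimensions, dynamics, and names are illustrative only.
rng = np.random.default_rng(0)

N_MOTORS, N_LANDMARKS = 6, 12
# Unknown-to-the-robot face mechanics: motor commands -> landmark positions.
TRUE_MIX = rng.normal(size=(N_MOTORS, N_LANDMARKS))

def observe(motors):
    """Simulated camera: facial landmarks produced by a motor command."""
    return motors @ TRUE_MIX

# 1) Babbling: try random motor commands, record the resulting expressions
#    (this plays the role of Eva watching videos of its own face).
motor_samples = rng.normal(size=(500, N_MOTORS))
expressions = observe(motor_samples)

# 2) Fit an inverse model expression -> motors. Least squares stands in
#    for the deep network used in the real project.
W, *_ = np.linalg.lstsq(expressions, motor_samples, rcond=None)

# 3) Mimicry: given a target expression, predict the motors that recreate it.
target = observe(rng.normal(size=N_MOTORS))   # an expression to imitate
motors_hat = target @ W
error = np.linalg.norm(observe(motors_hat) - target)
print(f"reproduction error: {error:.6f}")
```

Because the toy face is linear, the inverse model reproduces the target almost exactly; the real problem is far harder, since facial mechanics are nonlinear and observed through a camera rather than known equations.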