With science progressing at a blistering pace, no technology can be written off with conviction anymore. Roboticists at the University of Pisa are working on FACE, a robot they claim can display emotions like a human. The team at the Interdepartmental Research Centre “E.Piaggio” has built its research around a single question: can a robot express emotions?
From neutral to amazed to disgusted
FACE, or Facial Automaton for Conveying Emotions, is a lifelike android that delivers emotional information through facial expressions, which the researchers are using to study the human-robot empathic link. According to the official website, “FACE is part of a complex Human Interaction Persuasive Observation Platform (HIPOP) able to collect synchronized information acquired from different sensors, i.e. physiological, psychological and behavioural data. Thanks to its modularity, HIPOP allows scientists to configure different experiments selecting the number and the type of available modules to follow protocol requirements.”
The FACE robot can also mimic facial expressions. According to the researchers, the 'complex facial states' it displays are built from the six basic expressions identified by Ekman: anger, disgust, fear, happiness, sadness and surprise. The video below offers an informative insight into what FACE is all about.
The head of the robot is essentially an artificial skull, covered by a special skin made of a material called Frubber. Developed by Hanson Robotics, Frubber is a silicone elastomer, containing 70 percent air by volume, the mechanical properties of which enable the robot to generate complex facial movements. “The control of the size and distribution of the open and closed air cells in the Frubber skin is what allows it to move much like human skin. The fact that it can be moved by small servos with little force makes it useful for humanoid robot faces,” adds Nicole Lazzeri, one of the members of the team.
As part of their study, the researchers used the Hybrid Engine for Facial Expressions Synthesis (HEFES), software built to render emotional expressions realistically. The system reportedly gives the robot realistic, acceptable expressions, overcoming the “uncanny valley” effect, in which a robot that looks almost, but not quite, human makes people uneasy.
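The HEFES engine itself is not public, but the underlying idea of deriving complex facial states from a handful of basic expressions can be sketched in a few lines. The sketch below is purely illustrative and makes assumptions not stated in the article: each basic expression is represented as a vector of normalized servo positions (the servo names and values are hypothetical), and a complex state is a weighted blend of those vectors.

```python
# Illustrative sketch only: the real HEFES software is not public.
# We assume (hypothetically) that each basic Ekman expression maps to a
# vector of normalized servo positions, and that complex facial states
# are formed by weighted averaging of those vectors.

BASIC_EXPRESSIONS = {
    # hypothetical 4-servo poses: [brow, eyelid, mouth_corner, jaw]
    "neutral":   [0.5, 0.5, 0.5, 0.5],
    "happiness": [0.6, 0.4, 0.9, 0.6],
    "surprise":  [0.9, 0.9, 0.5, 0.8],
}

def blend(weights):
    """Mix basic expressions into a complex facial state.

    weights: dict mapping expression name -> weight (weights sum to 1).
    Returns a servo-position vector obtained by weighted averaging.
    """
    n = len(next(iter(BASIC_EXPRESSIONS.values())))
    pose = [0.0] * n
    for name, w in weights.items():
        for i, value in enumerate(BASIC_EXPRESSIONS[name]):
            pose[i] += w * value
    return pose

# e.g. an "amazed delight" state: mostly surprise with some happiness
print(blend({"surprise": 0.7, "happiness": 0.3}))
```

Because every basic pose lives in the same servo space, any convex combination of them is itself a valid, smoothly reachable pose, which is what makes this kind of interpolation attractive for driving a physical face.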
To know more about FACE, visit FaceTeam.it
Publish date: July 16, 2012 6:01 pm| Modified date: December 18, 2013 10:48 pm