EYE GAZE EVALUATION BASED ON EYELID FOR EXPRESSING HUMANOID ROBOT EMOTION IN CONVERSATION WITH PEOPLE


Bibliographic Details
Main Author: Setyo Jatmiko, Arief
Format: Theses
Language: Indonesian
Online Access:https://digilib.itb.ac.id/gdl/view/52305
Institution: Institut Teknologi Bandung
Description
Summary: Robots can produce differences in perception at the social-emotional level; for example, robots that act as social partners can communicate non-verbally using gestures in the form of the basic emotional expressions recognized across cultures: happiness, sadness, surprise, fear, anger, and disgust. The level of emotion recognition in a robot is related to its facial features: the more complete the set of facial features used to express an emotion, the better the recognition. The question is which features have the greatest impact on emotional expression in robots. One prior study evaluating the impact of a single facial feature examined eyelid position on a 2D virtual agent in the form of a one-eyed avatar. Because interaction with a 2D virtual agent fundamentally differs from interaction with a physically embodied robot, this study evaluates the effect of a humanoid robot's physically embodied eyelids on human perception of the emotional meanings conveyed by changing the eyelid position. The method was an online questionnaire using a 5-point Likert scale, analyzed with a two-way repeated-measures ANOVA. Evaluation and analysis were carried out on 40 randomly selected participants (Min_age = 16, Max_age = 57, Range_age = 41, Median_age = 17.00, Mean_age = 24.5, SD_age = 12.412, Variance_age = 154.051). The results indicate a significant interaction between sample and emotion (F(10.812, 421.662) = 10.557, p = 5.63 × 10^-17 < .05, ηp² = .213), so it can be concluded that each sample conveys emotional meanings understood by the participants, although the meanings were mixed within each sample; for example, the "surprised" sample was perceived as happy (mean = 3.025), surprised (mean = 4.375), fearful (mean = 3.375), and angry (mean = 3.425).
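The kind of descriptive summary reported above (Min, Max, Range, Median, Mean, SD, Variance of participant age) can be computed with Python's standard library. This is a minimal sketch only: the `ages` list below is hypothetical illustrative data, since the thesis's raw participant data are not reproduced in this record.

```python
import statistics

# Hypothetical ages for illustration only -- not the thesis's actual data,
# which comprised N = 40 randomly selected participants.
ages = [16, 17, 17, 18, 24, 30, 35, 41, 50, 57]

summary = {
    "min": min(ages),
    "max": max(ages),
    "range": max(ages) - min(ages),
    "median": statistics.median(ages),
    "mean": statistics.mean(ages),
    "sd": statistics.stdev(ages),           # sample standard deviation
    "variance": statistics.variance(ages),  # sample variance (SD squared)
}
print(summary)
```

The inferential step (the two-way repeated-measures ANOVA) would typically be run in a statistics package; for instance, `AnovaRM` from `statsmodels` fits repeated-measures designs in Python, though the record does not state which software the thesis actually used.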
The results of this study can serve as a reference for robot makers: with the eyelid feature alone, the intended emotional meaning can be perceived well by humans, so social robots can be built more simply.