Expression tracking with OpenCV deep learning for a development of emotionally aware chatbots
Affective computing explores the development of systems and devices that can perceive, translate, process, and reproduce human emotion. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. An inspiration for the research is the ability to simulate empathy when communicating with computers or, in the future, robots. This paper explored the potential of facial expression tracking with deep learning to make chatbots more emotionally aware by developing a post-therapy-session survey chatbot that responds based on two inputs: the interactant's response and facial expression. The developed chatbot summarizes the emotional state of the user during the survey through percentages of the facial expressions tracked throughout the conversation. Facial expression tracking for happy, neutral, and hurt had 66.7%, 16.7%, and 56.7% tracking accuracy, respectively. Moreover, the developed program was tested on continuous expression tracking: it tracked 17 expressions with a stationary subject and 14 expressions with a non-stationary subject over a span of 30 seconds. © 2019 IEEE.
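For context, the sketch below illustrates the kind of per-frame expression tallying and percentage summary the abstract describes, using OpenCV's Haar face detector and DNN module. The model file, label set, input size, and frame count are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): detect a face per frame, classify its
# expression with an OpenCV DNN model, and summarize counts as percentages.
import cv2
import numpy as np

EXPRESSIONS = ["happy", "neutral", "hurt"]  # label set assumed from the abstract
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
net = cv2.dnn.readNetFromONNX("expression_model.onnx")  # hypothetical model file

counts = dict.fromkeys(EXPRESSIONS, 0)
cap = cv2.VideoCapture(0)
for _ in range(30):  # sample frames over the tracking window (30 s in the paper)
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))      # assumed input size
        blob = cv2.dnn.blobFromImage(roi, 1 / 255.0, (48, 48))  # normalize to [0, 1]
        net.setInput(blob)
        probs = net.forward().flatten()
        counts[EXPRESSIONS[int(np.argmax(probs))]] += 1
cap.release()

total = sum(counts.values()) or 1
summary = {label: 100.0 * n / total for label, n in counts.items()}
print(summary)  # e.g. {'happy': 66.7, 'neutral': 16.7, 'hurt': 16.6}
```

In the paper's setup, a summary of this kind would be combined with the interactant's typed responses to drive the chatbot's replies; the combination logic is not shown here.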
| Main Authors: | Carranza, Karmelo Antonio Lazaro R.; Manalili, Joshua; Bugtai, Nilo T.; Baldovino, Renann G. |
|---|---|
| Format: | text |
| Published: | Animo Repository, 2019 |
| Subjects: | Emotion recognition; Face perception; Robots—Programming; Computer Sciences |
| Online Access: | https://animorepository.dlsu.edu.ph/faculty_research/1462 https://animorepository.dlsu.edu.ph/context/faculty_research/article/2461/type/native/viewcontent |
| Institution: | De La Salle University |