Monitoring of group discussion using Android apps

Bibliographic Details
Main Author: Hamdi Hamzah
Other Authors: Justin Dauwels
Format: Final Year Project
Language: English
Published: 2014
Online Access:http://hdl.handle.net/10356/60424
Institution: Nanyang Technological University
Description
Summary: Human group interaction is complex and rich. A group can be monitored using non-verbal signals (facial expressions, body language, gestures, etc.) and/or verbal signals (voice tone, voice pitch, etc.). Despite impressive advances in audio-video signal processing, analyzing aspects of human interaction such as empathy, hostility, (dis-)agreement, flirting, dominance, superiority, and inferiority remains very challenging. Investigating these aspects is crucial for building intelligent robots that can participate in conversations in a natural way, or fulfill certain social responsibilities in our lives. As our ultimate goal is to improve current computer systems and social robots, which suffer from a lack of social skills, we explore social signals and social behaviors, including social interactions (such as turn taking), social attitudes (such as alliance), and social relations/roles, in order to build a more socially intelligent model for robots. More specifically, we have the following three aims:
- Analyze social signals (audio-video signals) collected simultaneously from different participants in a group interaction.
- Develop a real-time automated system to annotate the measured signals based on the features discussed above.
- Implement such a system as an Android app.
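The second and third aims suggest a natural starting point on Android: capture microphone audio on each participant's device and produce timestamped labels in real time. The Java sketch below is not taken from the project; it only illustrates, under assumed parameter values (sample rate, frame length, energy threshold) and with hypothetical names (SpeechActivityMonitor, onFrameAnnotated), how the standard Android AudioRecord API could compute per-frame energy and flag speech activity as a naive proxy for the turn-taking annotation described above.

// Minimal sketch, not the project's implementation: per-frame speech-activity
// labels from the device microphone. Requires the RECORD_AUDIO permission.
// Sample rate, frame length, and energy threshold are illustrative values.
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class SpeechActivityMonitor {

    private static final int SAMPLE_RATE = 16000;          // Hz
    private static final int FRAME_SIZE = 1600;            // 100 ms of audio
    private static final double ENERGY_THRESHOLD = 0.01;   // tune empirically

    private volatile boolean running = false;

    public void start() {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        final AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, Math.max(minBuf, 2 * FRAME_SIZE));

        running = true;
        new Thread(new Runnable() {
            @Override
            public void run() {
                short[] frame = new short[FRAME_SIZE];
                recorder.startRecording();
                while (running) {
                    int read = recorder.read(frame, 0, FRAME_SIZE);
                    if (read > 0) {
                        // Root-mean-square energy of the frame, normalized to [0, 1].
                        double sum = 0.0;
                        for (int i = 0; i < read; i++) {
                            double s = frame[i] / 32768.0;
                            sum += s * s;
                        }
                        double rms = Math.sqrt(sum / read);
                        onFrameAnnotated(System.currentTimeMillis(), rms,
                                rms > ENERGY_THRESHOLD);
                    }
                }
                recorder.stop();
                recorder.release();
            }
        }).start();
    }

    public void stop() {
        running = false;
    }

    // Hook for the annotation layer: timestamped speech/non-speech labels from
    // several devices could be aggregated to derive group turn-taking statistics.
    protected void onFrameAnnotated(long timestampMs, double energy, boolean speaking) {
        // e.g., log locally, update the UI, or send to a server for group-level analysis
    }
}

In practice the labels produced by such a thresholding step would only be a first-pass feature; richer cues such as pitch and tone, and video-based signals, would be handled by the analysis components described in the first aim.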