Monitoring of group discussion using Android apps
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: 2014
Subjects:
Online Access: http://hdl.handle.net/10356/60424
Institution: Nanyang Technological University
Abstract: Human group interaction is complex and rich. A group can be monitored using non-verbal (facial expressions, body language, gestures, etc.) and/or verbal (voice tone, voice pitch, etc.) signals. Despite impressive advances in audio-video signal processing, analyzing different aspects of human interaction such as empathy, hostility, (dis-)agreement, flirting, dominance, superiority, and inferiority remains very challenging. Investigating these aspects is crucial for building intelligent robots that can participate in conversations in a natural way or fulfill certain social responsibilities in our lives. As our ultimate goal is to improve current computer systems and social robots, which suffer from a lack of social skills, we will explore social signals and social behaviors, including social interactions (such as turn-taking), social attitudes (such as alliance), and social relations/roles, to build a more socially intelligent model for robots. More specifically, we have the following three aims:
- Analyze social signals (audio-video signals) collected simultaneously from different participants in a group interaction.
- Develop a real-time automated system to annotate the measured signals based on the features discussed above.
- Implement such a system as an Android app (a minimal audio-capture sketch follows this list).
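As a purely illustrative starting point for the third aim, the sketch below shows how an Android app might capture microphone audio and use short-term RMS energy as a rough voice-activity cue, one possible low-level feature for turn-taking analysis. This is a minimal sketch, not the project's actual system: the class name `SpeechActivityMonitor`, the 16 kHz sample rate, the 20 ms frame length, and the `energyThreshold` value are all assumptions, and the RECORD_AUDIO permission is assumed to have been granted already.

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import kotlin.math.sqrt

// Hypothetical sketch: capture microphone audio and report per-frame RMS energy
// as a crude voice-activity signal. Not the project's actual annotation system.
class SpeechActivityMonitor(
    private val sampleRate: Int = 16_000,
    private val energyThreshold: Double = 500.0  // arbitrary tuning value for this sketch
) {
    @Volatile private var running = false

    // onFrame is called once per ~20 ms frame with a speech/non-speech flag and the RMS value.
    fun start(onFrame: (isSpeech: Boolean, rms: Double) -> Unit) {
        val minBuf = AudioRecord.getMinBufferSize(
            sampleRate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT
        )
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC,
            sampleRate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT,
            minBuf
        )
        val frame = ShortArray(sampleRate / 50)  // 20 ms of mono 16-bit samples
        running = true
        recorder.startRecording()
        Thread {
            while (running) {
                val n = recorder.read(frame, 0, frame.size)
                if (n > 0) {
                    // Root-mean-square amplitude of the frame.
                    var sumSq = 0.0
                    for (i in 0 until n) {
                        val s = frame[i].toDouble()
                        sumSq += s * s
                    }
                    val rms = sqrt(sumSq / n)
                    onFrame(rms > energyThreshold, rms)
                }
            }
            recorder.stop()
            recorder.release()
        }.start()
    }

    fun stop() { running = false }
}
```

A real system along the lines described in the abstract would replace the simple energy threshold with richer audio-video features and run one such capture pipeline per participant, synchronizing the resulting annotations across devices.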