Automated real-time analytics for multi-party dialogues


Bibliographic Details
Main Author: See, Yihui
Other Authors: Justin Dauwels
Format: Final Year Project
Language: English
Published: 2015
Subjects:
Online Access:http://hdl.handle.net/10356/64184
Institution: Nanyang Technological University
Description
Summary: This project explores detecting high-level features, including human personality and emotion, from low-level prosodic cues displayed in a conversation. After listening to two-minute audio recordings, participants rated the high-level features displayed by each speaker during the conversation. The high-level features were selected because they are relatively easy for human listeners to identify in a conversation. Analysis of the participants' annotations showed that not all high-level features could be reliably identified. In two-party dialogues, politeness, confusion and hostility were the most easily identifiable features according to both the annotations and the classification results. The same analysis was extended to multiparty dialogues, represented in this project by groups of four speakers. The annotation analysis gave a slightly different outcome from the two-party case: judging from the annotations, interest, disagreement, likeability, politeness and respect were the more easily identifiable features. However, the classifiers built from the annotations detected likeability, friendliness, respect, confusion and hostility most accurately.