Gender-specific classifiers in phoneme recognition and academic emotion detection
Main Authors:
Format: text
Published: Animo Repository, 2016
Subjects:
Online Access: https://animorepository.dlsu.edu.ph/faculty_research/1280
Institution: De La Salle University
Summary: Gender-specific classifiers are shown to outperform general classifiers. In calibrated experiments designed to demonstrate this, two sets of data were used to build male-specific and female-specific classifiers. The first dataset is used to predict vowel phonemes based on speech signals, and the second dataset is used to predict negative emotions based on brainwave (EEG) signals. A Multi-Layer Perceptron (MLP) is first trained as a general classifier, where all data from both male and female users are combined. This general classifier recognizes vowel phonemes with a baseline accuracy of 91.09%, while the general classifier for the EEG signals has an average baseline accuracy of 58.70%. The experiments show that performance improves significantly when the classifiers are trained to be gender-specific, that is, when there is a separate classifier for male users and a separate classifier for female users. For the vowel phoneme recognition dataset, the average accuracy increases to 94.20% for male-only users and 95.60% for female-only users. For the EEG dataset, the accuracy increases to 65.33% for male-only users and to 70.50% for female-only users. Performance measured by recall and precision shows the same trend. A further probe using a Self-Organizing Map (SOM) visualizes the distribution of the sub-clusters among male and female users. © Springer International Publishing AG 2016.
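The sketch below is a rough illustration of the comparison described in the summary: one MLP is trained on the pooled data as a general classifier, and separate MLPs are then trained on the male-only and female-only subsets, with held-out accuracy reported for each. It uses scikit-learn on synthetic stand-in data; the feature matrix, gender labels, and network hyperparameters are assumptions for illustration, not the paper's actual datasets or configuration.

```python
# Minimal sketch of the general-vs-gender-specific comparison, assuming
# synthetic placeholder features and labels (not the authors' data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 1000, 12, 5           # e.g. acoustic-style features, 5 vowels
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_classes, size=n_samples)            # phoneme (or emotion) labels
gender = rng.choice(["male", "female"], size=n_samples)   # speaker/subject gender

def train_and_score(X, y):
    """Train one MLP and return its accuracy on a held-out split."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    return accuracy_score(y_te, clf.predict(X_te))

# General classifier: all data, both genders combined.
print("general :", train_and_score(X, y))

# Gender-specific classifiers: one model per gender subset.
for g in ("male", "female"):
    mask = gender == g
    print(f"{g:8}:", train_and_score(X[mask], y[mask]))
```

On real speech or EEG features the gender-specific models would be expected to show the accuracy gap reported in the summary; on this random placeholder data all three scores simply hover around chance.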