Automatic music mood classification

Traditional music resource management requires extensive manual work and is time-consuming, so a new technology for classifying music resources is needed to search for and manage music pieces more effectively. In recent years, many researchers have studied automatic classification based on music mood; such work is significant because musical content is a good representation of human emotion. The emotion plane proposed by Thayer, which defines emotion classes dimensionally in terms of arousal and valence, is commonly adopted to avoid ambiguity in describing musical emotion. Another investigation of music mood perception shows that humans perceive musical emotion in a way that most closely matches Hevner's categorization of music mood. Empirical studies also show that the main features influencing human perception of musical emotion are tempo, pitch, and articulation, so these three audio features are extracted to classify a piece of music into the correct emotional expression category. The proposed approach is implemented with MIRtoolbox, a MATLAB toolbox dedicated to extracting musical features such as tonality and rhythm from audio files. A database containing music pieces from all eight clusters of emotional expression was constructed according to Hevner's categorization; the training set was used for feature extraction, and the testing set for classification and accuracy evaluation. The results are satisfactory, as the categorization outcomes are reasonably accurate. However, owing to limited time, some other features related to emotional expression, such as tonality, were not extracted, which may reduce classification accuracy; further effort is needed to develop a better categorization approach in the future.
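To make the described pipeline concrete, below is a minimal MATLAB sketch using MIRtoolbox, the toolbox the project reports using. The file name song.wav, the placeholder feature statistics and centroids, and the use of attack time as the articulation proxy are illustrative assumptions, not details taken from the project; the record does not specify the actual decision rule.

```matlab
% Minimal sketch, assuming MIRtoolbox is on the MATLAB path. It extracts
% the three features named in the abstract (tempo, pitch, articulation)
% and assigns the piece to one of Hevner's eight clusters by nearest
% centroid. 'song.wav' and all numeric values are placeholders.

a  = miraudio('song.wav');            % load an audio excerpt
t  = mirtempo(a);                     % tempo estimate (BPM)
p  = mirpitch(a, 'Mono');             % one pitch estimate per frame (Hz)
at = mirattacktime(a);                % attack time per onset; short attacks
                                      % suggest staccato articulation

x = [mean(mirgetdata(t)), ...         % 1x3 feature vector
     mean(mirgetdata(p), 'omitnan'), ...
     mean(mirgetdata(at))];

% Placeholder statistics; in the project these would be computed from
% the training set (one centroid per Hevner cluster, features z-scored).
mu    = [100, 300, 0.05];             % assumed feature means
sigma = [ 30, 150, 0.03];             % assumed feature standard deviations
C     = randn(8, 3);                  % stand-in centroids, one row per cluster

xz = (x - mu) ./ sigma;                    % normalise the new piece
d  = sum(bsxfun(@minus, C, xz).^2, 2);     % squared distance to each centroid
[~, cluster] = min(d);                     % nearest-centroid assignment
fprintf('Predicted Hevner cluster: %d of 8\n', cluster);
```

A real run would replace the random stand-in centroids with per-cluster means estimated from the labelled training pieces, which is the role the abstract assigns to the training set.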

Bibliographic Details
Main Author: Cui, Min.
Other Authors: Wan Chunru
Format: Final Year Project
Language: English
Published: 2010
Physical Description: 76 p.
Degree: Bachelor of Engineering
School: School of Electrical and Electronic Engineering
Subjects: DRNTU::Engineering
Online Access: http://hdl.handle.net/10356/40345
Institution: Nanyang Technological University