Creating a visual music interface via emotion detection
In this technological age, the number of digital music files inside personal computers is expanding at an exponential rate. Furthermore, with the growing market of portable digital audio players, music is now easily accessible to the general public. However, there is one group of people that is the...
Saved in:
Main Author: | Lim, Clement Shi Hong. |
---|---|
Other Authors: | Pina Marziliano |
Format: | Final Year Project |
Language: | English |
Published: | 2010 |
Subjects: | DRNTU::Visual arts and music |
Online Access: | http://hdl.handle.net/10356/40235 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-40235 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-40235 2023-07-07T17:08:54Z Creating a visual music interface via emotion detection Lim, Clement Shi Hong. Pina Marziliano School of Electrical and Electronic Engineering DRNTU::Visual arts and music Bachelor of Engineering 2010-06-14T01:39:56Z 2010-06-14T01:39:56Z 2010 2010 Final Year Project (FYP) http://hdl.handle.net/10356/40235 en Nanyang Technological University 88 p. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
DRNTU::Visual arts and music |
description |
In this technological age, the number of digital music files stored on personal computers is growing at an exponential rate. Furthermore, with the expanding market for portable digital audio players, music is now easily accessible to the general public. However, one group of people, the deaf, remains unable to find a satisfying audio player that allows them to appreciate music in a meaningful way.
Music classification based on mood plays an important part in multimedia applications: it allows users to quickly find the songs they want within a large music library and to manage the music database effectively in today's society. By combining short-time analysis techniques with a Support Vector Machine classifier, it is possible to classify music based on emotion.
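A minimal sketch of this classification step is given below, assuming librosa for the short-time feature extraction and scikit-learn's SVC as the Support Vector Machine classifier; the particular features (short-time energy, zero-crossing rate, spectral centroid), the file names and the mood labels are illustrative assumptions, not the exact choices made in the project.

```python
# Illustrative sketch: short-time features + SVM mood classifier.
# Assumes librosa and scikit-learn; features, files and labels are hypothetical.
import numpy as np
import librosa
from sklearn.svm import SVC

def short_time_features(path, sr=22050):
    """Summarise a song by the mean and std of frame-level (short-time) features."""
    y, sr = librosa.load(path, sr=sr)
    rms = librosa.feature.rms(y=y)                       # short-time energy
    zcr = librosa.feature.zero_crossing_rate(y)          # zero-crossing rate
    cen = librosa.feature.spectral_centroid(y=y, sr=sr)  # spectral brightness
    frames = np.vstack([rms, zcr, cen])
    return np.hstack([frames.mean(axis=1), frames.std(axis=1)])

# Hypothetical training set: audio files with hand-labelled moods.
train_files = ["happy1.wav", "sad1.wav", "angry1.wav"]
train_moods = ["happy", "sad", "angry"]

X = np.array([short_time_features(f) for f in train_files])
clf = SVC(kernel="rbf").fit(X, train_moods)

print(clf.predict([short_time_features("new_song.wav")]))
```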
Besides classifying music by emotion, another concern is expressing the emotion of a song through colours and images. To achieve this, pictures of virtual humans are used to convey the emotion of each song by manipulating lighting parameters (colour and intensity) and filter parameters (hue, saturation and brightness).
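The sketch below illustrates one way such filter parameters could be applied to an image, assuming Pillow and NumPy; the mood-to-parameter table and the file names are hypothetical examples rather than the mapping used in the project.

```python
# Illustrative sketch: express a detected mood by shifting hue and scaling
# saturation and brightness of an image (e.g. a picture of a virtual human).
# Assumes Pillow and NumPy; the mood-to-parameter mapping is hypothetical.
from PIL import Image
import numpy as np

# (hue shift in degrees, saturation scale, brightness scale) per mood.
MOOD_FILTERS = {
    "happy": (20, 1.3, 1.2),   # warm, vivid, bright
    "sad":   (-40, 0.6, 0.8),  # cool, washed out, dim
    "angry": (0, 1.5, 1.0),    # strongly saturated
}

def apply_mood(image_path, mood, out_path):
    hue_shift, sat_scale, val_scale = MOOD_FILTERS[mood]
    hsv = np.array(Image.open(image_path).convert("RGB").convert("HSV"),
                   dtype=np.float32)
    hsv[..., 0] = (hsv[..., 0] + hue_shift * 255.0 / 360.0) % 255.0  # hue
    hsv[..., 1] = np.clip(hsv[..., 1] * sat_scale, 0, 255)           # saturation
    hsv[..., 2] = np.clip(hsv[..., 2] * val_scale, 0, 255)           # brightness
    Image.fromarray(hsv.astype(np.uint8), mode="HSV").convert("RGB").save(out_path)

apply_mood("virtual_human.png", "sad", "virtual_human_sad.png")
```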
In this project, these two methods are integrated to design a music appreciation system that gives the listener the opportunity to see the music visually in terms of lyrics, colours, images, and frequency- and time-domain plots of the song's signal. Through a more in-depth understanding of the content and context of the songs, the deaf will be able to appreciate music in an exciting new way in which the emotions of the songs are easily detected and expressed. |
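As an illustration of the frequency- and time-domain views mentioned above, the following sketch plots a song's waveform and magnitude spectrum; it assumes librosa for loading the audio and matplotlib for plotting, with a hypothetical input file name.

```python
# Illustrative sketch: time-domain waveform and frequency-domain magnitude
# spectrum of a song's signal. Assumes numpy, matplotlib and librosa.
import numpy as np
import matplotlib.pyplot as plt
import librosa

y, sr = librosa.load("new_song.wav", sr=None)   # hypothetical input file
t = np.arange(len(y)) / sr

spectrum = np.abs(np.fft.rfft(y))               # magnitude spectrum
freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)

fig, (ax_t, ax_f) = plt.subplots(2, 1, figsize=(8, 6))
ax_t.plot(t, y)
ax_t.set(title="Time domain", xlabel="Time (s)", ylabel="Amplitude")
ax_f.plot(freqs, spectrum)
ax_f.set(title="Frequency domain", xlabel="Frequency (Hz)", ylabel="Magnitude")
fig.tight_layout()
plt.show()
```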
author2 |
Pina Marziliano |
format |
Final Year Project |
author |
Lim, Clement Shi Hong. |
title |
Creating a visual music interface via emotion detection |
publishDate |
2010 |
url |
http://hdl.handle.net/10356/40235 |
_version_ |
1772828893761241088 |