Reproducing Musicality: Detecting Musical Objects and Emulating Musicality Through Partial Evolution

Bibliographic Details
Main Authors: Samson, Aran V, Coronel, Andrei D
Format: text
Published: Archīum Ateneo 2019
Online Access:
https://archium.ateneo.edu/discs-faculty-pubs/297
https://ieeexplore.ieee.org/document/8669033
Description
Summary: Musicology is a growing focus in computer science. Past research has had success in automatically generating music through learning-based agents [1] that make use of neural networks, and through model- and rule-based approaches [2]. These methods require a significant amount of information, either in the form of a large dataset for learning or a comprehensive set of rules based on musical concepts. This paper explores a model in which a minimal amount of musical information is needed to compose a desired style of music. It makes use of objectness, a concept derived directly from imagery and pattern recognition, to extract specific musical objects from a single musical piece. These objects then serve as the foundation for generating a new musical piece that is similar in style to the original. The overall piece is generated through a partial evolution. This method eliminates the need for a large amount of pre-provided data and composes music directly from a singular source piece.
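The paper itself does not publish code, but the summarized idea of evolving new material toward the character of a single source piece can be sketched as a toy genetic algorithm. Everything below is an illustrative assumption, not the authors' method: "musical objects" are approximated as pitch bigrams extracted from one motif, and fitness is the fraction of a candidate's bigrams that also occur in the source.

```python
import random

def ngrams(seq, n=2):
    """List the pitch n-grams (here bigrams) occurring in a sequence."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def similarity(candidate, source_grams):
    """Fraction of the candidate's bigrams that also appear in the source."""
    cand = ngrams(candidate)
    if not cand:
        return 0.0
    return sum(1 for g in cand if g in source_grams) / len(cand)

def evolve(source, length=16, pop_size=30, generations=200, seed=0):
    """Evolve a pitch sequence whose bigram content resembles the source motif."""
    rng = random.Random(seed)
    pitches = sorted(set(source))          # restrict to the source's pitch set
    source_grams = set(ngrams(source))
    pop = [[rng.choice(pitches) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: similarity(ind, source_grams), reverse=True)
        survivors = pop[:pop_size // 2]    # keep the fitter half
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]      # one-point crossover
            if rng.random() < 0.3:         # occasional point mutation
                child[rng.randrange(length)] = rng.choice(pitches)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda ind: similarity(ind, source_grams))

# Hypothetical source motif as MIDI pitch numbers (C major fragment).
motif = [60, 62, 64, 65, 64, 62, 60, 67]
piece = evolve(motif)
```

The single-source constraint shows up in two places: the candidate pitch alphabet and the fitness target are both derived from the one motif, so no external corpus or rule set is consulted. The actual paper extracts richer objects via an objectness measure; this sketch only mirrors the overall evolve-toward-a-source structure.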