Sequence prediction using recurrent neural network

Bibliographic Details
Main Author: Nguyen, Phan Huy
Other Authors: Goh Wooi Boon
Format: Final Year Project
Language: English
Published: 2017
Subjects:
Online Access: http://hdl.handle.net/10356/70503
Institution: Nanyang Technological University
Description
Summary: The project implemented a Gap-Filling Engine capable of filling gaps in sequences of various types with missing values. A strategy that looks forward into the data following a gap was introduced, improving the Engine's prediction accuracy by more than 30%. A Sequence Model based on a Long Short-Term Memory (LSTM) Recurrent Neural Network was used to learn patterns in several sequence types. An optimization technique based on caching the LSTM was implemented, shortening runtime by a factor of several hundred and allowing significantly more experiments to be executed. A performance evaluation strategy was designed and carried out to analyze the impact of various factors on the Gap-Filling Engine, providing a better understanding of the Engine as well as recommendations for users. The Gap-Filling Engine shows potential for application in several fields, especially in reconstructing structured sequential data with missing values.
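
The abstract gives no implementation details, but the approach it describes (an LSTM sequence model that proposes values for a gap, uses the data after the gap to refine the choice, and caches the LSTM state of the preceding context so it is encoded only once) can be illustrated with a minimal sketch. Everything below is an assumption made for illustration, not the project's actual code: the PyTorch framework, the SequenceModel and fill_gap names, and all hyperparameters are hypothetical.

    import torch
    import torch.nn as nn

    class SequenceModel(nn.Module):
        """Next-element predictor over integer-encoded sequences (illustrative)."""
        def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens, state=None):
            # tokens: (batch, seq_len) integer ids; returns next-element logits
            # at every position plus the final LSTM state, which can be cached.
            hidden, state = self.lstm(self.embed(tokens), state)
            return self.out(hidden), state

    def fill_gap(model, prefix, suffix, vocab_size):
        """Choose the value of one missing element by combining the usual
        forward prediction with a 'look-forward' score over the data that
        follows the gap. The prefix is encoded once and its LSTM state is
        cached, so it is not re-run for every candidate."""
        model.eval()
        with torch.no_grad():
            logits, cached_state = model(prefix.unsqueeze(0))
            prefix_logp = torch.log_softmax(logits[0, -1], dim=-1)

            best_token, best_score = None, float("-inf")
            for cand in range(vocab_size):
                # log P(cand | prefix) + log P(suffix | prefix, cand)
                score = prefix_logp[cand].item()
                seq = torch.cat([torch.tensor([cand]), suffix]).unsqueeze(0)
                logits, _ = model(seq, cached_state)
                logp = torch.log_softmax(logits[0, :-1], dim=-1)
                score += logp.gather(1, suffix.unsqueeze(1)).sum().item()
                if score > best_score:
                    best_token, best_score = cand, score
            return best_token

In this sketch, fill_gap(model, prefix, suffix, vocab_size) takes the token ids before and after a single-element gap as 1-D integer tensors and returns the candidate that best explains both sides; reusing the cached prefix state is one plausible reading of the runtime optimization mentioned in the summary.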