Sequence prediction using recurrent neural network
The project implemented a Gap-Filling Engine capable of filling in gaps of missing values in sequences of various types. A strategy of looking forward into subsequent data was introduced, enabling the Engine to improve prediction accuracy by more than 30%. A Sequence Model based on a Long Short-Term Memory (LSTM) Recurrent Neural Network was used to learn patterns in several sequence types. An optimization technique based on caching LSTM computations was implemented, shortening runtime several hundredfold and allowing significantly more experiments to be executed. A performance evaluation strategy was designed and carried out to analyze the impact of various factors on the Gap-Filling Engine, providing a better understanding of the Engine as well as recommendations for users. The Gap-Filling Engine shows potential for application in several fields, especially in reconstructing structured sequential data with missing values.
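The abstract mentions three ingredients: an LSTM Sequence Model, a look-ahead strategy that consults the data after a gap, and caching of LSTM computations. The sketch below is only an illustration of how such a gap filler could be put together in PyTorch; the class names, the candidate-scoring form of the look-ahead, and the cached-state trick are assumptions for exposition, not the project's actual implementation (the thesis itself is available at the handle listed below).

```python
# Minimal sketch (PyTorch), assuming a discrete symbol alphabet and a
# single-position gap. Everything here is illustrative, not the report's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SequenceModel(nn.Module):
    """LSTM over a discrete alphabet; predicts the next symbol at each step."""

    def __init__(self, vocab_size: int, hidden_size: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        out, state = self.lstm(self.embed(tokens), state)
        return self.head(out), state  # per-position logits, updated LSTM state


def fill_gap(model, left, right, vocab_size):
    """Choose the symbol for a gap between `left` and `right` context.

    Look-ahead idea: score each candidate not only by P(candidate | left) but
    also by how likely the observed right-hand context becomes once the
    candidate is assumed. Caching idea: the LSTM state after `left` is computed
    once and reused, instead of re-encoding the left context per candidate.
    """
    model.eval()
    with torch.no_grad():
        # Encode the left context once and cache the resulting LSTM state.
        left_logits, cached_state = model(left.unsqueeze(0))
        left_logp = F.log_softmax(left_logits[0, -1], dim=-1)  # P(next | left)

        best_score, best_sym = -float("inf"), 0
        for sym in range(vocab_size):
            # Continue from the cached state with this candidate followed by
            # the observed right-hand context; accumulate its log-likelihood.
            cont = torch.tensor([[sym] + right.tolist()])
            logits, _ = model(cont, cached_state)
            logp = F.log_softmax(logits, dim=-1)
            score = left_logp[sym] + sum(
                logp[0, t, right[t]] for t in range(len(right))
            )
            if score > best_score:
                best_score, best_sym = score.item(), sym
        return best_sym


if __name__ == "__main__":
    vocab = 10
    model = SequenceModel(vocab)        # untrained; for illustration only
    left = torch.tensor([1, 2, 3, 4])   # symbols observed before the gap
    right = torch.tensor([6, 7, 8])     # symbols observed after the gap
    print("predicted fill:", fill_gap(model, left, right, vocab))
```

In this reading, the speed-up from caching comes from encoding the left context once per gap rather than once per candidate; how the original Engine cached the LSTM is not stated in the abstract.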
Main Author: Nguyen, Phan Huy
Other Authors: Goh Wooi Boon; School of Computer Science and Engineering
Format: Final Year Project (FYP)
Language: English
Published: 2017
Degree: Bachelor of Engineering (Computer Science)
Extent: 77 p. (application/pdf)
Subjects: DRNTU::Engineering::Computer science and engineering
Online Access: http://hdl.handle.net/10356/70503
Institution: Nanyang Technological University
Collection: DR-NTU (NTU Library)