Compression methods for approximate stochastic modelling
Understanding of the natural world can be attributed to effective information processing. Natural processes, however, are often riddled with enormous amounts of information and strong non-linearity, so simplifying such problems becomes paramount for practical purposes. Computational mechanics offers a successful model, known as the $\epsilon$-machine, that carries out statistically accurate prediction with a minimal representation. Reducing statistical complexity, the fundamental measure of the memory one must store to operate the $\epsilon$-machine, has been a subject of interest because it provides operational advantages. In this final year project, we explore the optimal construction of approximate $\epsilon$-machines that prioritize minimal complexity at the expense of predictive power. Two approaches are proposed: a brute-force method and the information bottleneck method. Although it requires preliminary assumptions about the approximate $\epsilon$-machine's structure, the brute-force method is fairly effective at finding the minimal model under a given constraint. A more robust alternative is the information bottleneck method, which uses principles of rate-distortion theory to find a set of optimal predictive states that represent the process. We propose two information bottleneck variations, the deterministic information bottleneck and agglomerative clustering, to construct optimal predictive states that are unifilar. Through artificial reconstruction, the predictive states' probability law can then be used to inform the construction of the corresponding approximate $\epsilon$-machine. The combination of these two techniques, the information bottleneck followed by the brute-force method, proves useful for optimal model compression.
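For orientation, the trade-off described in the abstract is usually posed as an information-bottleneck objective over the process's pasts and futures. The summary below uses standard notation from the computational-mechanics and information-bottleneck literature ($\overleftarrow{X}$ for past observations, $\overrightarrow{X}$ for future observations, $\mathcal{R}$ for the compressed predictive states, $\beta$ for the trade-off parameter); it is an illustrative sketch, not an equation quoted from the thesis.

$$
\min_{p(r \mid \overleftarrow{x})} \; I(\overleftarrow{X};\mathcal{R}) - \beta\, I(\mathcal{R};\overrightarrow{X}) \qquad \text{(information bottleneck)}
$$

$$
\min_{p(r \mid \overleftarrow{x})} \; H[\mathcal{R}] - \beta\, I(\mathcal{R};\overrightarrow{X}) \qquad \text{(deterministic information bottleneck)}
$$

Since the statistical complexity of an $\epsilon$-machine is the entropy of its predictive (causal) states, $C_\mu = H[\mathcal{S}]$, the deterministic variant penalizes exactly the memory cost that the project seeks to minimize, with $\beta$ controlling how much predictive power is sacrificed.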
Main Author: | Onggadinata, Kelvin
---|---
Other Authors: | Gu Mile (School of Physical and Mathematical Sciences)
Format: | Final Year Project (FYP)
Language: | English
Published: | Nanyang Technological University, 2020
Degree: | Bachelor of Science in Physics
Subjects: | Science::Mathematics::Applied mathematics::Information theory; Science::Physics
Online Access: | https://hdl.handle.net/10356/139314
Institution: | Nanyang Technological University
id | sg-ntu-dr.10356-139314
---|---
record_format | dspace
institution | Nanyang Technological University
building | NTU Library
continent | Asia
country | Singapore
content_provider | NTU Library
collection | DR-NTU
language | English
topic | Science::Mathematics::Applied mathematics::Information theory; Science::Physics
author | Onggadinata, Kelvin
author2 | Gu Mile
format | Final Year Project
title | Compression methods for approximate stochastic modelling
publisher | Nanyang Technological University
publishDate | 2020
url | https://hdl.handle.net/10356/139314