The fundamental thermodynamic bounds on finite models

Bibliographic Details
Main Author: Garner, Andrew J. P.
Other Authors: School of Physical and Mathematical Sciences
Format: Article
Language: English
Published: 2021
Subjects:
Online Access: https://hdl.handle.net/10356/153740
Institution: Nanyang Technological University
Description
Summary: The minimum heat cost of computation is subject to bounds arising from Landauer's principle. Here, I derive bounds on finite modeling, the production or anticipation of patterns (time-series data), by devices that model the pattern in a piecewise manner and are equipped with a finite amount of memory. When producing a pattern, I show that the minimum dissipation is proportional to the information in the model's memory about the pattern's history that never manifests in the device's future behavior and must be expunged from memory. I provide a general construction of a model that allows this dissipation to be reduced to zero. By also considering devices that consume or effect arbitrary changes on a pattern, I discuss how these finite models can form an information reservoir framework consistent with the second law of thermodynamics.
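As a rough illustration of the kind of bound described in the summary (a schematic sketch in generic notation, not the article's exact statement): Landauer's principle prices every bit of information erased at k_B T ln 2 of heat, and "information about the history that never manifests in future behavior" can be written as a conditional mutual information between the model's memory M, the pattern's past, and the pattern's future. The symbols M, the arrowed X variables, and the proportionality form below are assumptions introduced here for illustration.

% Landauer's principle: erasing H bits dissipates at least Q >= k_B T ln(2) * H.
% Schematic version of the summarized bound (illustrative notation only):
% M = memory state of the finite model,
% \overleftarrow{X} = pattern's past, \overrightarrow{X} = pattern's future.
\[
  Q_{\mathrm{diss}} \;\gtrsim\; k_B T \ln 2 \;\cdot\;
  I\!\left(M : \overleftarrow{X} \,\middle|\, \overrightarrow{X}\right),
\]
% where I(M : past | future) is the information the memory retains about the
% history that is never reflected in future outputs and must therefore be
% expunged; the construction mentioned in the summary drives this term to zero.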