Quantum information entropy
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: 2019
Subjects:
Online Access: http://hdl.handle.net/10356/78657
Institution: Nanyang Technological University
Summary: Information entropy, the expected amount of information produced by a random data source, has been a topic of interest since Shannon introduced the concept in 1948. The concept was later generalised to the Rényi and Tsallis entropies for different research applications in classical information theory. Information entropy becomes increasingly important when dealing with data compression and cryptography in this data-driven era, which requires efficient management of big data and effective extraction of meaningful information. Hence, in this final year project report, the author introduces a new representation of von Neumann entropy that requires less computational effort, together with new representations of the Rényi and Tsallis entropies in the quantum information domain. The author also investigates mathematically the feasibility of the proposed representations of the von Neumann, Rényi and Tsallis entropies. Subsequently, the new representations are validated through MATLAB and case studies.
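
The report's new representations are not reproduced in this record. For reference, the sketch below computes the standard textbook forms of the three quantum entropies of a density matrix ρ: the von Neumann entropy S(ρ) = -Tr(ρ ln ρ), the quantum Rényi entropy S_α(ρ) = ln(Tr ρ^α)/(1 - α), and the quantum Tsallis entropy S_q(ρ) = (1 - Tr ρ^q)/(q - 1). The project itself uses MATLAB; this is a Python/NumPy sketch, and the example state `rho`, the eigenvalue cutoff `1e-12`, and the parameter choices `alpha = 2`, `q = 2` are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # drop zero eigenvalues: 0 ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def renyi_entropy(rho, alpha):
    """Quantum Rényi entropy S_alpha(rho) = ln(Tr rho^alpha) / (1 - alpha), alpha != 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.log(np.sum(evals ** alpha)) / (1.0 - alpha))

def tsallis_entropy(rho, q):
    """Quantum Tsallis entropy S_q(rho) = (1 - Tr rho^q) / (q - 1), q != 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float((1.0 - np.sum(evals ** q)) / (q - 1.0))

# Illustrative qubit state: equal mixture of |0><0| and |+><+|.
zero = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
rho = 0.5 * zero + 0.5 * plus

print(von_neumann_entropy(rho))     # ~0.4165 nats
print(renyi_entropy(rho, alpha=2))  # collision entropy, -ln(Tr rho^2)
print(tsallis_entropy(rho, q=2))    # linear-entropy variant, 1 - Tr rho^2
```

All three quantities reduce to functions of the eigenvalues of ρ, which is why the sketch diagonalises once per call; both the Rényi and Tsallis entropies recover the von Neumann entropy in the limit α → 1 and q → 1, respectively.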