Information Loss, Dynamical Systems, and Landauer's Principle

Project Type: Master/Diploma Thesis
Student: Gebhard Wallinger

Short Description

Recently, we were able to show that the information loss induced by a static (i.e., memoryless) function g is given by [1]

H(X|Y) = h(X) - h(Y) + E{log|g'(X)|}

where X and Y are the continuous input and output random variables, respectively, H(X|Y) is the information loss, and h( ) denotes the differential entropy. This result is tightly connected to the entropy production of a dynamical system, i.e., the entropy (e.g., in the form of heat) that the system ejects into its environment [2]. Moreover, the last term in the formula above can be shown to be related to the Kolmogorov-Sinai entropy rate, a quantity that describes the rate of information generation in chaotic systems.
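The formula above can be checked numerically. The following sketch (an illustration, not part of the thesis) takes g(x) = x^2 with a standard Gaussian input, where squaring destroys exactly the sign of X, so the information loss should come out as 1 bit = ln 2 nats. The term E{log|g'(X)|} is estimated by Monte Carlo, while h(X) and h(Y) are taken from the known closed forms for the Gaussian and the chi-squared distribution with one degree of freedom:

```python
import math
import random

random.seed(0)

# Example: Y = g(X) = X^2 with X ~ N(0, 1).  Squaring destroys the sign
# of X, so the information loss should be exactly 1 bit = ln(2) nats.
N = 1_000_000
samples = (random.gauss(0.0, 1.0) for _ in range(N))

# Monte Carlo estimate of E{log|g'(X)|} with g'(x) = 2x.
e_log_deriv = sum(math.log(abs(2.0 * x)) for x in samples) / N

# Closed-form differential entropies (in nats):
#   h(X) for a standard Gaussian,
#   h(Y) for Y = X^2, i.e., a chi-squared distribution with 1 degree of freedom.
h_x = 0.5 * math.log(2.0 * math.pi * math.e)
gamma = 0.5772156649015329                   # Euler-Mascheroni constant
psi_half = -gamma - 2.0 * math.log(2.0)      # digamma(1/2)
h_y = 0.5 + math.log(2.0 * math.sqrt(math.pi)) + 0.5 * psi_half

info_loss = h_x - h_y + e_log_deriv
print(f"H(X|Y) ~= {info_loss:.4f} nats (expected ln 2 = {math.log(2):.4f})")
```

With 10^6 samples the estimate agrees with ln 2 to about two decimal places; any other non-injective g can be tried the same way, provided h(Y) is available in closed form or estimated separately.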

Finally, in 1961 Landauer [3] argued that every logically irreversible computation (i.e., one with positive information loss) leads to the dissipation of at least kT ln 2 of heat per bit of information lost, suggesting a deep connection between information theory and statistical physics.
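To put a number on Landauer's bound: at room temperature, erasing a single bit costs at least k_B T ln 2 of heat. A minimal sketch (using the exact SI value of the Boltzmann constant):

```python
import math

# Landauer bound: erasing one bit of information at temperature T
# dissipates at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
T = 300.0                 # room temperature, K

q_min = k_B * T * math.log(2.0)
print(f"Minimum heat per erased bit at {T:.0f} K: {q_min:.3e} J")
```

This comes out on the order of 3e-21 J per bit, many orders of magnitude below the switching energies of present-day hardware, which is why the bound is of fundamental rather than practical interest.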

In this project, the connections between dynamical systems theory, information loss in static systems, and Landauer's principle shall be investigated. In particular, the relation between information generation, information loss, and (energetic) entropy production in dynamical systems shall be analyzed, leading to a principle of conservation of information.
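The information-generation side of this picture can also be illustrated numerically. For a one-dimensional chaotic map, the Kolmogorov-Sinai entropy rate equals the (positive) Lyapunov exponent, which is the time average of log|g'(x)| along a typical trajectory. The sketch below (an illustration under the stated assumptions, not taken from the thesis) uses the fully chaotic logistic map g(x) = 4x(1 - x), whose exponent is known analytically to be ln 2; finite-precision degeneracies of the iteration are ignored:

```python
import math

# Fully chaotic logistic map g(x) = 4 x (1 - x).  Its Lyapunov exponent
# (= Kolmogorov-Sinai entropy rate for this one-dimensional map) is the
# time average of log|g'(x)| along a trajectory, analytically ln 2.
def ks_entropy_logistic(x0=0.2, n_burn=1_000, n_iter=100_000):
    x = x0
    for _ in range(n_burn):            # discard the transient
        x = 4.0 * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        acc += math.log(abs(4.0 * (1.0 - 2.0 * x)))   # log|g'(x)|
        x = 4.0 * x * (1.0 - x)
    return acc / n_iter

lam = ks_entropy_logistic()
print(f"Estimated KS entropy rate: {lam:.4f} nats/step (ln 2 = {math.log(2):.4f})")
```

The estimate matches ln 2 to a few hundredths of a nat, mirroring the role of the E{log|g'(X)|} term in the information-loss formula above.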

Results

Gebhard has already finished this thesis, which can be downloaded here.

References

[1] B. Geiger and G. Kubin, "On the information loss in memoryless systems: The multivariate case," in Proc. Int. Zurich Seminar on Communications (IZS), Zurich, Feb. 2012, preprint available: arXiv:1109.4856 [cs.IT].
[2] D. Ruelle, "Positivity of entropy production in nonequilibrium statistical mechanics," J. Stat. Phys., vol. 85, pp. 1-23, 1996.
[3] R. Landauer, "Irreversibility and heat generation in the computing process," IBM Journal of Research and Development, vol. 5, pp. 183-191, Jul. 1961.