Project Type: Student Project
Student: Johann Steiner
Mentor: Bernhard Geiger
Mutual information is, like correlation, a measure of the statistical dependence between random variables. Unlike correlation, however, mutual information also captures non-linear dependencies. Applications of this measure lie mainly in the field of information theory: concepts such as channel capacity and the rate-distortion function are closely related to mutual information.
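For discrete random variables, mutual information can be computed directly from the joint probability mass function. The following MATLAB sketch illustrates this for a small, purely illustrative joint pmf (the matrix P is a made-up example, not part of the thesis specification):

    % Mutual information of a discrete joint distribution, in bits.
    % Rows of P index outcomes of X, columns index outcomes of Y.
    P  = [0.4 0.1; 0.1 0.4];   % hypothetical joint pmf
    px = sum(P, 2);            % marginal pmf of X (column vector)
    py = sum(P, 1);            % marginal pmf of Y (row vector)
    Pind = px * py;            % product of marginals (outer product)
    mask = P > 0;              % skip zero entries: 0*log(0) contributes 0
    I = sum(P(mask) .* log2(P(mask) ./ Pind(mask)));

For this P (a binary symmetric channel with crossover probability 0.2 and uniform input), I evaluates to about 0.278 bits; it is zero exactly when X and Y are independent.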
Many algorithms exist for computing mutual information. One of the most sophisticated was introduced by Fraser and Swinney [1] and relies on an iterative partitioning of the joint probability space of the constituent random variables (i.e., iterative, multi-dimensional histogram binning). In [2], a similar but computationally more efficient algorithm was proposed.
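As a rough illustration of the histogram-binning idea, the MATLAB sketch below estimates mutual information from paired samples using fixed equal-width bins. This is a simplified stand-in for the adaptive partitioning of [1] and the fast algorithm of [2], not an implementation of either; the function name mi_hist and the bin-count parameter are placeholders:

    function I = mi_hist(x, y, nbins)
    % Histogram-based estimate of I(X;Y) in bits from paired samples.
    % Uses fixed equal-width bins, unlike the adaptive scheme of [1].
        N  = histcounts2(x(:), y(:), [nbins nbins]);  % joint bin counts
        P  = N / sum(N(:));        % empirical joint pmf
        px = sum(P, 2);            % empirical marginal of X
        py = sum(P, 1);            % empirical marginal of Y
        Pind = px * py;            % product of marginals
        mask = P > 0;
        I = sum(P(mask) .* log2(P(mask) ./ Pind(mask)));
    end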
The objective of this thesis is to implement various algorithms for computing mutual information in MATLAB. The functionality of the implementation should be verified by comparing simulation results with analytic ones. Finally, the implemented algorithms should be compared with respect to accuracy and computational complexity.
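One possible verification of this kind: for jointly Gaussian X and Y with correlation coefficient rho, the mutual information is known in closed form, I(X;Y) = -(1/2) log2(1 - rho^2) bits. The sketch below, assuming the hypothetical mi_hist function from above is on the MATLAB path, compares a histogram estimate against this analytic value; the two will differ somewhat due to binning and finite-sample bias:

    % Sanity check against a closed-form result for Gaussian pairs.
    rho = 0.8;
    n   = 1e5;
    x   = randn(n, 1);
    y   = rho * x + sqrt(1 - rho^2) * randn(n, 1);  % correlated Gaussians
    I_analytic = -0.5 * log2(1 - rho^2);            % about 0.737 bits
    I_estimate = mi_hist(x, y, 32);                 % 32 bins per dimension
    fprintf('analytic: %.3f bits, histogram estimate: %.3f bits\n', ...
            I_analytic, I_estimate);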
[1] A. M. Fraser and H. L. Swinney, “Independent coordinates for strange attractors from mutual information,” Physical Review A, vol. 33, no. 2, pp. 1134–1140, February 1986.
[2] H.-P. Bernhard and G. Kubin, “A fast mutual information calculation algorithm,” Signal Processing VII: Theories and Applications, vol. 1, pp. 50–53, September 1994.