Signal Processing and Speech Communication Laboratory

Implementation of Mutual Information Calculation Algorithms

Status
In progress
Type
Master Project
Announcement date
12 Jan 2011
Student
Johann Steiner
Mentors
  • Bernhard Geiger
Research Areas

Short Description

Mutual information, like correlation, is a measure of the statistical dependence between random variables. Unlike correlation, however, mutual information also captures non-linear dependencies between random variables. Its applications lie mainly in the field of information theory: concepts such as channel capacity and the rate-distortion function are closely related to mutual information.
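
For two discrete random variables X and Y with joint distribution p_XY, the standard definition is

    I(X;Y) = \sum_{x,y} p_{XY}(x,y) \log \frac{p_{XY}(x,y)}{p_X(x)\, p_Y(y)}

Mutual information vanishes if and only if X and Y are independent, whereas the correlation coefficient can be zero for dependent variables (for example, Y = X^2 with X symmetric about zero).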


Many different algorithms for computing the mutual information exist. One of the most sophisticated was introduced by Fraser and Swinney [1]; it relies on an iterative partitioning of the joint probability space of the constituent random variables (i.e., iterative, multi-dimensional histogram binning). In [2], a similar but computationally more efficient algorithm was proposed.
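
To illustrate the underlying idea, the following minimal MATLAB sketch estimates the mutual information from samples using a fixed histogram grid; note that this is not the adaptive partitioning of [1] or [2], and the function name mi_hist and the default of 16 bins are assumptions made here for illustration only.

    function I = mi_hist(x, y, nbins)
    %MI_HIST Histogram-based estimate of the mutual information I(X;Y) in bits.
    %   Minimal fixed-grid sketch; the algorithms in [1] and [2] replace the
    %   fixed grid with adaptive partitions of the joint space.
    if nargin < 3, nbins = 16; end
    [~, ~, bx] = histcounts(x, nbins);   % bin index of each sample of X
    [~, ~, by] = histcounts(y, nbins);   % bin index of each sample of Y
    pxy = accumarray([bx(:), by(:)], 1, [nbins, nbins]) / numel(x);  % joint pmf
    px = sum(pxy, 2);                    % marginal pmf of X (column vector)
    py = sum(pxy, 1);                    % marginal pmf of Y (row vector)
    nz = pxy > 0;                        % skip empty cells (0*log 0 := 0)
    pp = px * py;                        % product of marginals (outer product)
    I = sum(pxy(nz) .* log2(pxy(nz) ./ pp(nz)));
    end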

The objective of this thesis is to implement various algorithms for computing the mutual information in MATLAB. The implementation should be verified by comparing simulation results with analytic results. Finally, the implemented algorithms should be compared with respect to accuracy and computational complexity.
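
One possible verification case (a sketch, not prescribed by this announcement) is the jointly Gaussian pair, for which the mutual information has the closed form I(X;Y) = -1/2 * log2(1 - rho^2) bits. The script below assumes the hypothetical mi_hist helper sketched above.

    rho = 0.8;                                   % assumed correlation coefficient
    n = 1e5;                                     % assumed sample size
    x = randn(n, 1);
    y = rho * x + sqrt(1 - rho^2) * randn(n, 1); % jointly Gaussian pair
    I_est = mi_hist(x, y, 32);                   % histogram-based estimate
    I_ana = -0.5 * log2(1 - rho^2);              % analytic value, approx. 0.737 bits
    fprintf('estimated: %.3f bits, analytic: %.3f bits\n', I_est, I_ana);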

Your Tasks

  • Implementation of the algorithms in [1] and [2]
  • Verification of the implementation in a simple application
  • Literature survey for different algorithms
  • Implementation of different algorithms (thesis only)
  • Comparison of implemented algorithms with respect to accuracy and complexity (thesis only)

Your Profile/Requirements

  • Good knowledge of MATLAB programming
  • Basic knowledge of probability and stochastic processes
  • Basic knowledge of information theory would be beneficial

References

[1] A. M. Fraser and H. L. Swinney, “Independent coordinates for strange attractors from mutual information,” Physical Review A, vol. 33, no. 2, pp. 1134–1140, February 1986.
[2] H.-P. Bernhard and G. Kubin, “A fast mutual information calculation algorithm,” Signal Processing VII: Theories and Applications, vol. 1, pp. 50–53, September 1994.