Source Coding Theory

Instructor: 

This lecture course treats the principles underlying the encoding of speech, audio, video, and images at low bit rates. Source coding techniques such as scalar and vector quantization, orthogonal transforms, and linear prediction are introduced and analyzed theoretically, and the theoretical bounds on the performance of source coders are discussed.

Contents

  • Information theory. Information theory of discrete and continuous variables: entropy, Kraft inequality, relative entropy, entropy rate, redundancy rate, mutual information. Estimation of probability density and probability mass functions. Gaussian mixture models (GMMs) and the expectation-maximization (EM) algorithm. Maximum entropy principle.
  • Lossless coding. Nonadaptive codes: Shannon, Huffman, arithmetic codes. Universal and adaptive codes. Ziv-Lempel codes.
  • Rate-distortion theory. The rate-distortion function, Shannon lower bound, rate distribution over independent variables, reverse waterfilling, Blahut algorithm.
  • High-rate quantization. Constrained-resolution and constrained-entropy quantization. Vector versus scalar quantization. Practical high-rate-theory-based quantizers: mixture and lattice quantizers, companding.
  • Low-rate quantization. Lloyd training (k-means) algorithm for constrained-resolution and constrained-entropy cases. Structured vector quantization (tree-structured, multi-stage, gain-shape, split). Fast search methods.
  • Transforms and filter banks. Bases and frames. Transforms and filter banks. Fixed transforms: DFT, DCT, MLT, Gabor frames, Balian-Low theorem. A-priori adaptation: Karhunen-Loève transform, a-priori energy concentration. A-posteriori adaptation: a-posteriori energy concentration, best-basis search, matching pursuit.
  • Linear prediction. Closed-loop prediction, noise-shaping, analysis-by-synthesis, spectral flatness, Kolmogorov's formula, redundancy.
  • Non-linear prediction/estimation. GMM-based MinMSE estimation.
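To make the Lossless coding topic above concrete, the following is a minimal sketch of Huffman code construction (the course's own material and notation may differ; Python is used here for illustration, and the example string is hypothetical):

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary prefix code from symbol frequencies (Huffman's algorithm)."""
    # Each heap entry is (weight, tiebreak, {symbol: codeword}); the tiebreak
    # index keeps tuple comparison well-defined when weights are equal.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing '0' and '1'
        # to the codewords of their symbols.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + cw for s, cw in c1.items()}
        merged.update({s: "1" + cw for s, cw in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
```

Frequent symbols receive short codewords ('a' gets a 1-bit code here), and the resulting code is prefix-free, so the coded bit stream is uniquely decodable without separators.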

Homework Assignments

No.  Assigned  Due     PDF         Additional Material
1    9.3.      16.3.   hw1_11.pdf
2    16.3.     25.3.   hw2_11.pdf
3    25.3.     1.4.    hw3_11.pdf  density_model_quality_measure.m
4    7.4.      13.4.   hw4_11.pdf
5    13.4.     11.5.   hw5_11.pdf
6    11.5.     18.5.*  hw6_11.pdf  reverse_water_filling.m

* or before the exam, if your exam is earlier

Grading

6 homework assignments: 10% each = 60% (solving bonus problems gives additional points)
1 oral exam:                         40%
Total:                              100% (without bonus problems)

Score            Grade
> 90%              1
> 80%, <= 90%      2
> 65%, <= 80%      3
> 50%, <= 65%      4
<= 50%             5

References

  • W.B. Kleijn: A Basis for Source Coding, KTH Stockholm, 2008 (will be used as the lecture notes; a copy will be available)
  • T.M. Cover and J.A. Thomas: Elements of Information Theory, 2nd Edition, John Wiley & Sons, NJ, 2006
  • R.M. Gray: Source Coding Theory, Springer, 1989
  • T. Berger: Rate-Distortion Theory: A Mathematical Basis for Data Compression, Prentice-Hall, NJ, 1971
  • T. Berger and J.D. Gibson: Lossy Source Coding, IEEE Transactions on Information Theory, Vol. 44, No. 6, Oct. 1998
  • C.E. Shannon: A Mathematical Theory of Communication, The Bell System Technical Journal, Vol. 27, Jul., Oct. 1948
  • A. Gersho: Asymptotically optimal block quantization, IEEE Transactions on Information Theory, Vol. 25, No. 4, Jul. 1979

 

Term: Summer
Education Level: Master Level
Attachment                       Size
hw1_11.pdf                       15.34 KB
hw2_11.pdf                       22.52 KB
hw3_11.pdf                       18.23 KB
density_model_quality_measure.m  763 bytes
hw4_11.pdf                       24.42 KB
hw5_11.pdf                       29.09 KB
hw6_11.pdf                       17.06 KB
reverse_water_filling.m          1.34 KB