Information theory

Fall 2018 at ENS Lyon

General information

This course is an introduction to information theory and error-correcting codes from the computer science point of view.


For some online resources on probability (for computer science), you can look at these notes or these notes (for the second set, starting at Lecture 10).


The grading scheme will take into account


Date Topic References
Wed Sep 12th 13:30-15:30 Admin. Overview of the fundamental problem of noisy channel coding. Definition of entropy.
Ch 1 of [Mac], Ch 2 of [CT], Shannon's paper
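As a quick numerical companion to the definition of entropy, here is a minimal Python sketch (the function name is mine) computing H(X) = -sum_x p(x) log2 p(x):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))    # a fair coin: exactly 1 bit
print(entropy([0.9, 0.1]))    # a biased coin: less than 1 bit
print(entropy([1.0]))         # a deterministic source: 0 bits
```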
Tue Sep 18th 10:15-12:15 Conditional entropy, mutual information, relative entropy. Data compression, variable-length lossless compressor.
Ch 8.1 of [PW], Ch 4 of [Mac], Ch 2,3,5 of [CT]
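The identities relating these quantities can be checked numerically; the following sketch uses a made-up joint distribution and verifies the chain rule and I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def H(dist):
    """Entropy of a probability vector, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A hypothetical joint distribution p(x, y) over two binary variables.
pxy = [[0.4, 0.1],
       [0.1, 0.4]]
px = [sum(row) for row in pxy]           # marginal of X
py = [sum(col) for col in zip(*pxy)]     # marginal of Y
joint = [p for row in pxy for p in row]

I = H(px) + H(py) - H(joint)     # mutual information I(X;Y)
H_Y_given_X = H(joint) - H(px)   # chain rule: H(Y|X) = H(X,Y) - H(X)
print(I, H_Y_given_X)
```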
Tue Sep 18th 13:30-15:30 Continuing variable-length coding. Prefix codes, Kraft's inequality, H(X) <= L(C) < H(X)+1.
Fixed-length compression, general bound on the size of the smallest set with probability >= 1 - delta.
Ch 8.2 of [PW], Ch 5,6 of [Mac], Ch 5 of [CT], Ch 9.1 of [PW]
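A small numerical check of Kraft's inequality and the bound H(X) <= L(C) < H(X)+1, on a toy dyadic source of my choosing (for which an optimal prefix code achieves L = H exactly):

```python
import math

# A hypothetical source and a prefix code for it (e.g. 0, 10, 110, 111).
p       = [0.5, 0.25, 0.125, 0.125]   # symbol probabilities
lengths = [1, 2, 3, 3]                # codeword lengths

kraft = sum(2.0 ** -l for l in lengths)         # Kraft sum: <= 1 for prefix codes
H = -sum(pi * math.log2(pi) for pi in p)        # source entropy
L = sum(pi * li for pi, li in zip(p, lengths))  # expected code length
print(kraft, H, L)
```

Because every probability here is a power of two, the code is optimal and L equals H; in general only H(X) <= L(C) < H(X)+1 holds.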
Wed Oct 3rd Shannon's source coding theorem
Universal compression: arithmetic coding.
Ch 16.1 of [PW], Ch 6 of [Mac]
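The interval-narrowing step at the heart of arithmetic coding can be sketched in a few lines; this toy encoder (source and message are my own example) only tracks the final interval, whose width is the product of the symbol probabilities:

```python
import math

def arithmetic_interval(msg, probs):
    """Return the final [lo, hi) interval after encoding msg symbol by symbol."""
    lo, hi = 0.0, 1.0
    symbols = sorted(probs)   # a fixed symbol order shared with the decoder
    for s in msg:
        width = hi - lo
        c = 0.0
        for t in symbols:
            if t == s:
                lo, hi = lo + c * width, lo + (c + probs[t]) * width
                break
            c += probs[t]
    return lo, hi

probs = {'a': 0.5, 'b': 0.25, 'c': 0.25}   # assumed toy source
lo, hi = arithmetic_interval("aab", probs)
# The interval width equals the product of the symbol probabilities, so
# roughly -log2(width) bits identify a point inside it.
print(lo, hi, -math.log2(hi - lo))
```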
Wed Oct 10th Channel coding: information capacity of a channel.
Additivity of information capacity for independent channels. Converse bounds.
Ch 16.3 of [PW], Ch 7 of [CT].
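For a concrete instance, the capacity of the binary symmetric channel with crossover probability p is C = 1 - h2(p), where h2 is the binary entropy function; a short sketch (function names are mine):

```python
import math

def h2(p):
    """Binary entropy function h2(p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))    # noiseless: 1 bit per use
print(bsc_capacity(0.5))    # pure noise: 0 bits per use
print(bsc_capacity(0.11))
```

Additivity for independent channels then says, e.g., that two independent uses of BSC(p) have capacity 2(1 - h2(p)).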
Fri Oct 17th Achievability bounds. Shannon's noisy channel coding theorem.
Ch 16.4 of [PW], Ch 7 of [CT]
Oct 24th Information theory and combinatorics; very brief introduction to Kolmogorov complexity.
Entropy and counting
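A standard example of entropy in counting is the bound sum_{i<=k} C(n,i) <= 2^{n h2(k/n)} for k <= n/2; a numerical check with parameters of my choosing:

```python
import math

def h2(p):
    """Binary entropy function h2(p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy bound on binomial sums: for k <= n/2,
#   sum_{i=0}^{k} C(n, i) <= 2^(n * h2(k/n)).
n, k = 20, 5
lhs = sum(math.comb(n, i) for i in range(k + 1))
rhs = 2 ** (n * h2(k / n))
print(lhs, rhs)
```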
Nov 8th Midterm
Nov 22nd Error-correcting codes. Minimum distance. Linear codes.
Ch 1 and 4 of [GRS]
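For a linear code, the minimum distance equals the minimum Hamming weight over nonzero codewords, which a brute-force sketch can check on a small generator matrix (the [4,2] binary code below is my own toy example):

```python
from itertools import product

# Generator matrix of a toy [4,2] binary linear code.
G = [[1, 0, 1, 1],
     [0, 1, 0, 1]]

def codewords(G):
    """Enumerate all 2^k codewords m * G over GF(2)."""
    k = len(G)
    for m in product([0, 1], repeat=k):
        yield tuple(sum(mi * gi for mi, gi in zip(m, col)) % 2
                    for col in zip(*G))

# Minimum distance = minimum weight of a nonzero codeword (by linearity).
d = min(sum(c) for c in codewords(G) if any(c))
print(d)
```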
Dec 5th Linear error correcting codes and their properties. Hamming and Hadamard codes.
Ch 2 of [GRS]
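The classical [7,4] Hamming code has minimum distance 3, so it corrects any single bit flip; a self-contained sketch using the usual layout with parity bits at positions 1, 2, 4 (function names are mine):

```python
def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits into a 7-bit Hamming codeword."""
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Repair at most one flipped bit using the syndrome."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # syndrome = 1-based position of the flip
    if pos:
        c[pos - 1] ^= 1
    return c

cw = hamming74_encode(1, 0, 1, 1)
noisy = list(cw)
noisy[4] ^= 1                          # flip one bit
print(hamming74_correct(noisy) == cw)  # True: the flip is repaired
```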
Dec 12th Reed-Solomon codes: definition, properties and efficient decoding algorithm.
Ch 5 and 13 of [GRS]
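The definition by polynomial evaluation is easy to demonstrate over a small prime field (parameters below are my own toy choices, and this is only the encoder, not the efficient decoder covered in the references):

```python
# Reed-Solomon over a prime field GF(p): a message of k field elements is
# read as the coefficients of a polynomial of degree < k, and the codeword
# is its evaluations at n distinct points.
p = 13           # assumed field size (any prime >= n works)
n, k = 8, 3      # code length and dimension

def rs_encode(msg):
    assert len(msg) == k
    return [sum(m * pow(x, i, p) for i, m in enumerate(msg)) % p
            for x in range(n)]

# Two distinct polynomials of degree < k agree on at most k - 1 points,
# so distinct codewords differ in at least n - k + 1 positions (MDS).
c1, c2 = rs_encode([1, 2, 3]), rs_encode([1, 2, 4])
dist = sum(a != b for a, b in zip(c1, c2))
print(dist)   # at least n - k + 1 = 6
```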
Dec 19th Code concatenation. Application of ECC to communication complexity.
Ch 9 and 11 of [GRS]
Communication complexity