Information theory

Fall 2018 at ENS Lyon

General information

This course is an introduction to information theory and error-correcting codes from a computer science point of view.

Resources

For some online resources on probability (for computer science), you can look at these notes or these notes (starting at Lecture 10).

Grading

The grading scheme will take into account

Schedule

Date Topic References
Wed Sep 12th 13:30-15:30 Admin. Overview of the fundamental problem of noisy channel coding. Definition of entropy.
Ch 1 of [Mac], Ch 2 of [CT], Shannon's paper
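The definition of entropy from this first lecture is easy to check numerically. A minimal sketch (the function name is my own, not from the course materials) computing H(X) in bits for a finite distribution:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution p given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
```

Terms with probability zero are skipped, matching the convention 0 log 0 = 0.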
Tue Sep 18th 10:15-12:15 Conditional entropy, mutual information, relative entropy. Data compression, variable-length lossless compressor.
Ch 8.1 of [PW], Ch 4 of [Mac], Ch 2,3,5 of [CT]
Tue Sep 18th 13:30-15:30 Continuing variable-length coding. Prefix codes, Kraft's inequality, H(X) <= L(C) < H(X)+1.
Fixed-length compression, general bound on the size of the smallest set with prob >= 1-delta.
Ch 8.2 of [PW], Ch 5,6 of [Mac], Ch 5 of [CT], Ch 9.1 of [PW]
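The two results of this lecture can be illustrated together: a Shannon code (codeword lengths ceil(-log2 p)) always satisfies Kraft's inequality, and its expected length L(C) obeys H(X) <= L(C) < H(X)+1. A sketch under these standard definitions (function names are my own):

```python
import math

def kraft_sum(lengths):
    """Kraft sum of a set of codeword lengths; a binary prefix code
    with these lengths exists iff the sum is at most 1."""
    return sum(2 ** -l for l in lengths)

# Shannon code for a source distribution p.
p = [0.45, 0.25, 0.2, 0.1]
lengths = [math.ceil(-math.log2(pi)) for pi in p]

H = -sum(pi * math.log2(pi) for pi in p)          # entropy in bits
L = sum(pi * l for pi, l in zip(p, lengths))      # expected codeword length

assert kraft_sum(lengths) <= 1    # a prefix code with these lengths exists
assert H <= L < H + 1             # the expected-length bound from the lecture
```

For a dyadic distribution (all probabilities powers of 1/2) the bound is tight: L(C) = H(X).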
Wed Oct 3rd Shannon's source coding theorem
Universal compression: arithmetic coding. See here for more details on compression.
Ch 16.1 of [PW], Ch 6 of [Mac]
Wed Oct 10th Channel coding: Information capacity of a channel.
Additivity of information capacity for independent channels. Converse bounds.
Ch 16.3 of [PW], Ch 7 of [CT]
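The standard worked example for this lecture is the binary symmetric channel, whose capacity is C = 1 - h2(p), where h2 is the binary entropy function. A sketch (function names are my own):

```python
import math

def h2(p):
    """Binary entropy function in bits, with h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless bit per channel use
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
print(bsc_capacity(0.11))  # roughly 0.5
```

Additivity for independent channels then says, e.g., that two independent uses of BSCs with crossover p1 and p2 have joint capacity bsc_capacity(p1) + bsc_capacity(p2).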
Fri Oct 17th Achievability bounds. Shannon noisy coding theorem.
Ch 16.4 of [PW], Ch 7 of [CT]
Oct 24th Information theory and combinatorics; a very brief introduction to Kolmogorov complexity
Entropy and counting
Nov 8th Midterm
Nov 22nd Error-correcting codes. Minimum distance. Linear codes.
Ch 1 and 4 of [GRS]
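A key fact from this lecture: for a linear code, the minimum distance equals the minimum Hamming weight over the nonzero codewords, so it can be found by enumerating the code. A brute-force sketch on the [7,4] Hamming code, used here purely as a small illustration (the generator matrix is the standard one from [GRS]-style presentations, not taken from the course notes):

```python
from itertools import product

# Generator matrix of the [7,4] binary Hamming code (4 message bits, 3 parity bits).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def min_distance(G):
    """Minimum distance of the linear code generated by G over GF(2),
    computed as the minimum Hamming weight of a nonzero codeword."""
    n = len(G[0])
    best = n
    for msg in product([0, 1], repeat=len(G)):
        if not any(msg):
            continue  # skip the zero codeword
        cw = [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]
        best = min(best, sum(cw))
    return best

print(min_distance(G))  # 3
```

Distance 3 means the code can correct any single-bit error, since 3 >= 2*1 + 1. This enumeration takes 2^k steps, so it is only feasible for small dimension k.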

Homeworks