Date  Topic  References 
Wed Sep 12th 13:30–15:30  Admin. Overview of the fundamental problem of noisy channel coding. Definition of entropy. 
Ch 1 of [Mac], Ch 2 of [CT], Shannon's paper 
Tue Sep 18th 10:15–12:15  Conditional entropy, mutual information, relative entropy. Data compression, variable-length lossless compressor. 
Ch 8.1 of [PW], Ch 4 of [Mac], Ch 2,3,5 of [CT] 
Tue Sep 18th 13:30–15:30  Continuing variable-length coding. Prefix codes, Kraft's inequality, H(X) ≤ L(C) < H(X)+1. Fixed-length compression, general bound on the size of the smallest set with probability ≥ 1−δ. 
Ch 8.2 of [PW], Ch 5,6 of [Mac], Ch 5 of [CT], Ch 9.1 of [PW] 
Wed Oct 3rd  Shannon's source coding theorem. Universal compression: arithmetic coding. 
Ch 16.1 of [PW], Ch 6 of [Mac] 
Wed Oct 10th  Channel coding: Information capacity of a channel. Additivity of information capacity for independent channels. Converse bounds. 
Ch 16.3 of [PW], Ch 7 of [CT] 
Fri Oct 17th  Achievability bounds. Shannon's noisy channel coding theorem. 
Ch 16.4 of [PW], Ch 7 of [CT] 
Oct 24th  Information theory and combinatorics, and a very brief introduction to Kolmogorov complexity 
Entropy and counting 
Nov 8th  Midterm 

Nov 22nd  Error correcting codes. Minimum distance. Linear codes. 
Ch 1 and 4 of [GRS] 
Dec 5th  Linear error correcting codes and their properties. Hamming and Hadamard codes. 
Ch 2 of [GRS] 
Dec 12th  Reed–Solomon codes: definition, properties, and efficient decoding algorithm. 
Ch 5 and 13 of [GRS] 
Dec 19th  Code concatenation. Application of ECC to communication complexity. 
Ch 9 and 11 of [GRS]; Communication complexity 
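As a quick illustration of the source-coding bound H(X) ≤ L(C) < H(X)+1 from the Sep 18th lecture, the sketch below computes the entropy of an example distribution (the distribution itself is illustrative, not from the course) and the Shannon code lengths l_i = ⌈−log₂ p_i⌉, then checks Kraft's inequality and the bound:

```python
import math

# A minimal sketch, assuming a small example source distribution (dyadic,
# so the bound is tight); not part of the course materials.
p = [0.5, 0.25, 0.125, 0.125]

# Entropy H(X) in bits.
H = -sum(q * math.log2(q) for q in p)

# Shannon code lengths l_i = ceil(-log2 p_i).
lengths = [math.ceil(-math.log2(q)) for q in p]

# Kraft sum: sum_i 2^{-l_i} <= 1 guarantees a prefix code with these lengths.
kraft = sum(2 ** -l for l in lengths)
assert kraft <= 1

# Expected code length L(C) satisfies H(X) <= L(C) < H(X) + 1.
L = sum(q * l for q, l in zip(p, lengths))
assert H <= L < H + 1

print(H, L, kraft)  # -> 1.75 1.75 1.0
```

For a dyadic distribution like this one, the Shannon lengths are exact and L(C) = H(X); for general distributions the expected length sits strictly inside the unit-width window.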