|Tue Sep 12th 10:15-12:15
||Admin. Overview of the fundamental problem of noisy channel coding. Definition of entropy.
|| Ch 1 of [Mac], Ch 2 of [CT], Shannon's paper
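As a small illustration of the entropy definition covered in this lecture (not part of the course materials), the following sketch computes the Shannon entropy of a finite distribution; the function name and the example distributions are my own:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a pmf given as a list of probabilities."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ≈ 0.469
```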
|Wed Sep 13th 15:45-17:45
||Conditional entropy, mutual information, relative entropy. Data compression, variable-length lossless compressor.
|| Ch 8.1 of [PW], Ch 4 of [Mac], Ch 2,3,5 of [CT]
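The quantities from this lecture can be checked numerically. Below is a minimal sketch (my own, not from the readings) of relative entropy and of mutual information computed as I(X;Y) = D(p_XY || p_X p_Y); function names and the toy joint distributions are illustrative assumptions:

```python
from math import log2

def relative_entropy(p, q):
    """D(p||q) in bits; assumes supp(p) is contained in supp(q)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(pxy):
    """I(X;Y) = D(p_XY || p_X p_Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return sum(
        pxy[i][j] * log2(pxy[i][j] / (px[i] * py[j]))
        for i in range(len(px)) for j in range(len(py))
        if pxy[i][j] > 0
    )

# Independent bits share no information; perfectly correlated bits share 1 bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```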
|Fri Sep 22nd 9:00-12:00
||Continuing variable-length coding. Prefix codes, Kraft's inequality, H(X) <= L(C) < H(X)+1. Fixed-length compression: general bound on the size of the smallest set with probability >= 1-delta. Special case of memoryless sources: Shannon's source coding theorem (only achievability proved).
|| Ch 8.2 of [PW], Ch 5,6 of [Mac], Ch 5 of [CT], Ch 9.1 of [PW]
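The bound H(X) <= L(C) < H(X)+1 from this lecture can be verified on a small example. The sketch below (my own illustration, using the Shannon code lengths l_i = ceil(log2(1/p_i)), which satisfy Kraft's inequality) checks both Kraft's inequality and the entropy bound for a non-dyadic source:

```python
from math import log2, ceil

p = [0.6, 0.3, 0.1]                              # a non-dyadic source
H = -sum(pi * log2(pi) for pi in p)              # entropy in bits

# Shannon code lengths l_i = ceil(log2(1/p_i)): a valid prefix code exists
# for these lengths iff Kraft's inequality sum 2^(-l_i) <= 1 holds.
lengths = [ceil(-log2(pi)) for pi in p]
kraft = sum(2 ** -l for l in lengths)
L = sum(pi * l for pi, l in zip(p, lengths))     # expected codeword length

print(kraft)          # 0.8125, so Kraft's inequality holds
print(H <= L < H + 1) # True
```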
|Fri Oct 13th 10:15-12:15
||Universal compression: arithmetic coding. Channel coding: information capacity of a channel.
|| Ch 16.1 of [PW], Ch 6 of [Mac]
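For the information capacity topic of this lecture, the binary symmetric channel gives the standard closed-form example C = 1 - h2(p). A small sketch of my own (function names are illustrative):

```python
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Information capacity C = 1 - h2(p) of the binary symmetric channel."""
    return 1 - h2(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless channel)
print(bsc_capacity(0.5))   # 0.0  (output independent of input)
print(bsc_capacity(0.11))  # ≈ 0.5
```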
|Mon Oct 16th 10:15-12:10
||Additivity of information capacity for independent channels. Converse bounds.
|| Ch 16.3 of [PW], Ch 7 of [CT]
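Additivity of information capacity for independent channels can be observed numerically. The sketch below (my own, not from the course materials) uses the Blahut-Arimoto iteration to approximate C = max_p I(X;Y) and checks that the capacity of two channels used independently is the sum of their capacities:

```python
from math import log2

def blahut_arimoto(W, iters=200):
    """Approximate the information capacity C = max_p I(X;Y) (in bits) of a
    discrete memoryless channel given as a row-stochastic matrix W[x][y]."""
    n, m = len(W), len(W[0])
    p = [1.0 / n] * n
    for _ in range(iters):
        q = [sum(p[x] * W[x][y] for x in range(n)) for y in range(m)]
        # c[x] = 2^{D(W(.|x) || q)}; scaling p by c and renormalizing
        # converges to a capacity-achieving input distribution.
        c = [2 ** sum(W[x][y] * log2(W[x][y] / q[y]) for y in range(m) if W[x][y] > 0)
             for x in range(n)]
        z = sum(p[x] * c[x] for x in range(n))
        p = [p[x] * c[x] / z for x in range(n)]
    return log2(z)

def product_channel(W1, W2):
    """Independent use of W1 and W2: W((y1,y2)|(x1,x2)) = W1(y1|x1) * W2(y2|x2)."""
    return [[a * b for a in r1 for b in r2] for r1 in W1 for r2 in W2]

BSC = [[0.9, 0.1], [0.1, 0.9]]            # binary symmetric channel, p = 0.1
BEC = [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5]]  # binary erasure channel, e = 0.5

C1, C2 = blahut_arimoto(BSC), blahut_arimoto(BEC)
C12 = blahut_arimoto(product_channel(BSC, BEC))
print(C1, C2, C12)  # C12 ≈ C1 + C2
```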
|Fri Oct 27th 10:15-12:15
|| Achievability bounds. Shannon noisy coding theorem.
|| Ch 16.4 [PW], Ch 7 of [CT]
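The random-coding idea behind the achievability part of Shannon's noisy coding theorem can be seen in a toy Monte-Carlo experiment. The sketch below (my own illustration; blocklength, rate, and trial count are arbitrary choices) draws a random codebook for a BSC at a rate below capacity, decodes by minimum Hamming distance, and observes a small empirical error probability:

```python
import random

random.seed(0)
n, R, p = 20, 0.25, 0.05           # blocklength, rate, crossover probability
M = 2 ** round(n * R)              # number of messages, rate R < C ≈ 0.71
book = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]

def transmit(cw):
    """Pass a codeword through a BSC(p): flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in cw]

def decode(y):
    """Minimum Hamming distance decoding over the random codebook."""
    return min(range(M), key=lambda m: sum(a != b for a, b in zip(book[m], y)))

trials = 200
errors = sum(decode(transmit(book[m])) != m
             for m in (random.randrange(M) for _ in range(trials)))
print(errors / trials)  # small: the random code works well below capacity
```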
|Fri Nov 10th 10:15-12:15
|| Midterm exam