- Lectures
  - Omar Fawzi
  - Wednesdays, 13:30-15:30
  - Amphi B

- Tutorials
  - Ievgeniia Oshurko and Johanna Seif
  - Fridays, 8:00-10:00 (might change)

- Definition and properties of entropy functions
- Data compression
- Shannon's noisy coding theorem
- Applications of information theory to combinatorics, cryptography, and complexity theory
- Error-correcting codes: definitions, properties, and examples
- Applications of error-correcting codes

- [CT] Elements of Information Theory
- [Mac] Information Theory, Inference and Learning Algorithms
- [PW] Lecture notes on Information Theory
- [GRS] Essential Coding Theory
- [Sen] Introduction à la théorie de l'information

- A final exam
- One midterm exam
- Homework (every one or two weeks)
- Bonus: a small project, such as writing a Wikipedia article

| Date | Topic | References |
| --- | --- | --- |
| Wed Sep 12th, 13:30-15:30 | Admin. Overview of the fundamental problem of noisy channel coding. Definition of entropy. | Ch 1 of [Mac], Ch 2 of [CT], Shannon's paper |
| Tue Sep 18th, 10:15-12:15 | Conditional entropy, mutual information, relative entropy. Data compression, variable-length lossless compressors. | Ch 8.1 of [PW], Ch 4 of [Mac], Ch 2, 3, 5 of [CT] |
| Tue Sep 18th, 13:30-15:30 | Variable-length coding continued. Prefix codes, Kraft's inequality, H(X) ≤ L(C) < H(X) + 1. Fixed-length compression; general bound on the size of the smallest set with probability ≥ 1 - δ. | Ch 8.2 of [PW], Ch 5, 6 of [Mac], Ch 5 of [CT], Ch 9.1 of [PW] |
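As a quick illustration of the results in the Sep 18th lecture, the sketch below (not part of the course materials; the distribution `p` is an arbitrary example) computes the entropy H(X), assigns the Shannon code lengths ⌈log₂(1/pᵢ)⌉, and checks both Kraft's inequality and the bound H(X) ≤ L(C) < H(X) + 1 on the expected codeword length:

```python
import math

# Example distribution (dyadic, so the bound is tight: H(X) = L(C)).
p = [0.5, 0.25, 0.125, 0.125]

# Entropy H(X) = -sum p_i log2 p_i.
H = -sum(pi * math.log2(pi) for pi in p)

# Shannon code lengths: l_i = ceil(log2(1/p_i)).
lengths = [math.ceil(math.log2(1 / pi)) for pi in p]

# Kraft's inequality: sum 2^{-l_i} <= 1, so a prefix code with
# these lengths exists.
kraft = sum(2 ** -l for l in lengths)
assert kraft <= 1

# Expected codeword length L(C) = sum p_i * l_i, and the bound
# H(X) <= L(C) < H(X) + 1.
L = sum(pi * li for pi, li in zip(p, lengths))
assert H <= L < H + 1

print(H, L, kraft)
```

Replacing `p` with a non-dyadic distribution (e.g. `[0.4, 0.3, 0.3]`) makes the inequalities strict while the same assertions still hold.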