Information Theory, M1 course, Fall 2018-2019

Exercise sheets

  • TD1
    • Weighing problem
    • A realistic find query
    • Axiomatic approach to the Shannon entropy
    • Rényi entropy
  • TD2
    • HW1 corrected
    • Axiomatic approach to the Shannon entropy
    • Data processing inequality for mutual information
    • Code for unknown distribution
    • Entropy of Markov chains
  • TD3
    • HW2 corrected
    • Code for unknown distribution
    • Entropy of Markov chains
    • Fixed-length almost lossless compressor: source coding theorem
  • TD4
    • Fixed-length almost lossless compressor: source coding theorem
    • Typical sets
    • Channel capacity
  • TD5
    • Lempel-Ziv compression
    • Channel capacity
  • TD6
    • Channel capacity
    • Binary Erasure Channel
    • Expurgation
    • Fun with Fano
  • TD7
    • HW3 corrected
    • Fun with Fano
    • Entropy in combinatorics
    • Algorithmic approach to the channel coding problem
  • TD8
    • Midterm preparation
  • TD9
    • q-ary Entropy and Volume of Hamming Balls
    • Hamming riddle
    • Finite fields
  • TD10
    • Finite fields
    • Error-correcting vs. error-detecting codes
    • Generalized Hamming bound
    • Parity check matrix
    • Almost-universal hash functions: link between almost-universal hash functions and codes with good distance
  • TD11
    • HW4 corrected
    • Parity check matrix
    • Singleton Bound
    • Weights of Codewords
    • Codes Achieving the Gilbert-Varshamov Bound
  • TD12
    • Codes Achieving the Gilbert-Varshamov Bound
    • Reed-Solomon codes
    • Secret Sharing
  • TD13
    • HW5 corrected
    • Secret Sharing
    • Algorithmic approach to the channel coding problem
  • TD14
    • Revision for Final