Introduction to information theory
Introduction
This course provides an introduction to information theory, whose foundations were laid by Claude Shannon. Information theory deals with the fundamental limits of compressing a signal, storing data, and communicating information reliably over a noisy channel. It turns out that all these limits can be expressed in terms of a single quantity: the entropy.
During this course we will cover Shannon's major results. Since then, information theory has found applications in many areas; we will discuss some of them, such as statistics and cryptography.
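For reference, the entropy of a discrete random variable $X$ with distribution $p$ is

$$H(X) = -\sum_{x} p(x) \log_2 p(x),$$

measured in bits when the logarithm is taken in base 2; a fair coin toss, for instance, has entropy exactly 1 bit.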
Acknowledgment
I am particularly grateful to Nicolas Sendrier and Jean-Pierre Tillich for sharing their materials with me to build this course (as well as providing much helpful advice).
Lectures
- Introduction to information theory: Lecture 1 and Exercise Session 1
1) Discrete probabilities
2) Overview of information theory
3) Entropy, what else?
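As a small illustration of item 3 above, here is a minimal Python sketch (illustrative only, not part of the course material) computing the entropy of a distribution:

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a distribution given as a list of probabilities."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally unpredictable
print(entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin is more predictable
```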
- Source coding theorem and first efficient compression algorithms: Lecture 2 and Exercise Session 2
1) Source coding theorem
2) Compression with symbol codes
3) Optimal source coding: Huffman coding
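To give a taste of item 3 above, here is a minimal Python sketch of Huffman's algorithm (the alphabet and probabilities are made up for the example): it repeatedly merges the two least probable nodes.

```python
import heapq

def huffman_code(probs):
    """Build a prefix code from {symbol: probability} by repeatedly
    merging the two least probable nodes."""
    # Heap entries: (probability, tie-breaker, {symbol: codeword built so far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, i, c2 = heapq.heappop(heap)
        # Prefix a bit distinguishing the two merged subtrees.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}: expected length 1.75 bits = the entropy
```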
- Typical sequences and Asymptotic Equipartition Property (AEP): Lecture 3 and Exercise Session 3
1) Some reminders
2) Entropy and stochastic processes
3) Asymptotic equipartition property
4) Memoryless sources satisfy the AEP
5) Markov chains: general sources satisfying the AEP
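In its simplest form (item 4, i.i.d. sources), the AEP states that

$$-\frac{1}{n} \log_2 p(X_1, \ldots, X_n) \longrightarrow H(X) \quad \text{in probability as } n \to \infty,$$

so that roughly $2^{nH(X)}$ "typical" sequences, each of probability about $2^{-nH(X)}$, carry almost all of the probability mass.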
- Compression (arithmetic coding): Lecture 4
1) Some reminders: Huffman coding and Shannon source coding theorem
2) Coding with intervals: Shannon-Fano-Elias and Shannon
3) Arithmetic coding
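Here is a minimal Python sketch of the encoder of item 3 (exact arithmetic with fractions and no renormalization, purely illustrative):

```python
from fractions import Fraction

def arithmetic_encode(message, probs):
    """Narrow the interval [low, low + width) according to each symbol's
    slice; any number in the final interval identifies the message."""
    # Cumulative probabilities: where each symbol's slice starts.
    cum, acc = {}, Fraction(0)
    for s, p in probs.items():
        cum[s] = acc
        acc += Fraction(p)
    low, width = Fraction(0), Fraction(1)
    for s in message:
        low += width * cum[s]
        width *= Fraction(probs[s])
    return low, low + width

lo, hi = arithmetic_encode("abca", {"a": "1/2", "b": "1/4", "c": "1/4"})
print(lo, hi)  # interval of length 1/64, the product of the symbol probabilities,
               # so about -log2(1/64) = 6 bits suffice to single out a point in it
```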
- Method of types and applications: Lecture 5 and Exercise Session 5
1) Method of types
2) Alternative law of large numbers
3) Universal coding
4) Large deviation theory
5) Chernoff's bound
6) Sanov's theorem
7) Some applications of Sanov's theorem
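To give the flavor of items 6 and 7: if $\hat{P}_n$ is the empirical distribution (type) of $n$ i.i.d. samples drawn from $Q$, then for a set $E$ of distributions over the alphabet $\mathcal{X}$, Sanov's theorem gives

$$\Pr\{\hat{P}_n \in E\} \le (n+1)^{|\mathcal{X}|}\, 2^{-n D(P^* \| Q)}, \qquad P^* = \arg\min_{P \in E} D(P \| Q),$$

so atypical empirical distributions are exponentially unlikely, at a rate set by the element of $E$ closest to $Q$ in Kullback-Leibler divergence.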
- Communication over a noisy channel: Lecture 6 and Exercise Session 6
1) Noisy channels and capacity
2) Noisy-channel coding theorem
3) Proof of achievability
4) The converse: what is impossible
5) Conclusion
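As a numerical illustration of item 1, the capacity of the binary symmetric channel with crossover probability $p$ is $C = 1 - H_2(p)$ bits per channel use; a short sketch:

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(p, bsc_capacity(p))
# p = 0.5 gives capacity 0: the output is then independent of the input.
```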
- Introduction to linear codes: Lecture 7 and Exercise Session 7
1) Basics on linear codes
2) Dual representation of linear codes
3) Hamming distance/weight
4) Bounds on minimum distance
5) Reed-Solomon codes and their decoding algorithm
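To make the definitions concrete, here is a small sketch built on the classic [7,4] Hamming code (one common choice of matrices, not necessarily the parametrization used in the lecture): encoding with a generator matrix and single-error correction by syndrome decoding.

```python
import numpy as np

# Systematic generator and parity-check matrices of the [7,4] Hamming code.
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(msg):
    return (msg @ G) % 2  # codeword = message times generator matrix, mod 2

def decode(word):
    s = (H @ word) % 2  # the syndrome depends only on the error pattern
    if s.any():  # nonzero syndrome: flip the bit whose column of H matches it
        for i in range(7):
            if np.array_equal(H[:, i], s):
                word = word.copy()
                word[i] ^= 1
                break
    return word[:4]  # G is systematic: the first 4 coordinates are the message

cw = encode(np.array([1, 0, 1, 1]))
cw[5] ^= 1  # flip one bit
print(decode(cw))  # [1 0 1 1]: the single error has been corrected
```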
- Random linear codes and Shannon's theorem: Lecture 8 and Exercise Session 8
1) Shannon's theorem for linear codes
2) About random codes
3) Proof of Shannon's theorem for linear codes
4) Random codes: a powerful tool
5) A little bit of cryptography
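Finally, a toy experiment in the spirit of item 2 (the parameters are chosen only for illustration): draw a random binary generator matrix and compute the minimum distance of its code by brute force; already at small lengths, random linear codes tend to be reasonably good.

```python
import itertools, random

def min_distance(G, k, n):
    """Minimum Hamming weight over the nonzero codewords generated by G
    (0 if G happens to be rank-deficient)."""
    best = n
    for m in itertools.product([0, 1], repeat=k):
        if any(m):
            cw = [sum(m[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
            best = min(best, sum(cw))
    return best

random.seed(1)
k, n = 4, 12
G = [[random.randint(0, 1) for _ in range(n)] for _ in range(k)]
print(min_distance(G, k, n))  # the minimum distance of one random [12, 4] binary code
```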
Final exam
Presentation of one of the topics below.
- Kolmogorov complexity, in Elements of Information Theory, Chapter 14, by Thomas M. Cover and Joy A. Thomas.
- Code-based cryptography and Alekhnovich's cryptosystems, in the lecture notes by Gilles Zémor.
- A tutorial on an important family of error-correcting codes: LDPC codes.
- The Lempel-Ziv compression algorithm, in Elements of Information Theory, Chapter 13, by Thomas M. Cover and Joy A. Thomas.
- Quantum information theory, in Quantum Computation and Quantum Information, Chapter 11 up to Section 12.3, by Michael A. Nielsen and Isaac L. Chuang.
- The programming project described at the end of Lecture 3.
- Programming project as described at the end of Lecture 3.
Bibliography (books and lecture notes used for this course)