| Ses # | Topics | Key dates |
| --- | --- | --- |
| 1 | Introduction, entropy | |
| 2 | Jensen's inequality, data processing theorem, Fano's inequality | |
| 3 | Different types of convergence, asymptotic equipartition property (AEP), typical set, joint typicality | |
| 4 | Entropies of stochastic processes | Problem set 1 due |
| 5 | Data compression, Kraft inequality, optimal codes | Problem set 2 due |
| 6 | Huffman codes | Problem set 3 due |
| 7 | Shannon-Fano-Elias codes, Slepian-Wolf | |
| 8 | Channel capacity, binary symmetric and erasure channels | |
| 9 | Maximizing capacity, Blahut-Arimoto | Problem set 4 due |
| 10 | The channel coding theorem | |
| 11 | Strong coding theorem, types of errors | Problem set 5 due |
| 12 | Strong coding theorem, error exponents | |
| | In-class midterm | |
| 13 | Fano's inequality and the converse to the coding theorem | Problem set 6 due |
| 14 | Feedback capacity | |
| 15 | Joint source channel coding | Problem set 7 due |
| 16 | Differential entropy, maximizing entropy | |
| 17 | Additive Gaussian noise channel | Problem set 8 due |
| 18 | Gaussian channels: parallel, colored noise, inter-symbol interference | |
| 19 | Gaussian channels with feedback | Problem set 9 due |
| 20 | Multiple access channels | |
| 21 | Broadcast channels | Problem set 10 due |
| | In-class presentations (2 sessions) | |
| 22 | Finite state Markov channels | |
| 23 | Channel side information, wide-band channels | |
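Several of the early topics above (entropy from lecture 1, the Kraft inequality from lecture 5, Huffman codes from lecture 6) fit in a short illustrative sketch. The `entropy` and `huffman_code` functions below are hypothetical helpers written for this illustration, not course-provided code, and the dyadic example distribution is chosen so the Huffman code exactly meets the entropy bound.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits (lecture 1)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(weights):
    """Build a binary Huffman code for a {symbol: weight} dict (lecture 6).

    Repeatedly merges the two lowest-weight subtrees, prefixing '0' to the
    codewords in one and '1' to the other.
    """
    # Heap entries: (weight, tiebreak, {symbol: codeword-so-far}).
    # The tiebreak keeps comparisons from ever reaching the dict.
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(weights.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
# For this dyadic source, expected codeword length equals H(X) = 1.75 bits,
# and the codeword lengths satisfy the Kraft inequality sum 2^{-l_i} <= 1.
```

Because every probability here is a power of two, the Huffman code is exactly optimal with zero redundancy; for general sources the expected length lies between H(X) and H(X) + 1.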