Lectures

  • Lecture 1 Introduction

  • Lecture 2 Entropy and mutual information

  • Lecture 3 Chain rules and inequalities

  • Lecture 4 Data processing, Fano's inequality

  • Lecture 5 Asymptotic equipartition property

  • Lecture 6 Entropy rate

  • Lecture 7 Source coding and Kraft inequality

  • Lecture 8 Optimal code length and Shannon code

  • Lecture 9 Huffman codes (see the code sketch after this list)

  • Lecture 10 Shannon-Fano-Elias and arithmetic codes

  • Lecture 11 Maximum entropy

  • Lecture 12 Channel capacity

  • Lecture 13 Channel coding theorem, joint typicality

  • Lecture 14 Proof of channel coding theorem (Notes)

  • Lecture 15 Hamming codes and Viterbi algorithm

  • Lecture 16 Feedback channel, source-channel separation theorem (Notes)

  • Lecture 17 Differential entropy

  • Lecture 18 Gaussian channel

  • Lecture 19 Parallel Gaussian channel and water-filling

  • Lecture 20 Quantization and rate-distortion

  • Lecture 21 Rate-distortion theorem (Notes on calculating R(D))

  • Lecture 22 Final review and future topics
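
A minimal sketch of the Huffman procedure covered in Lecture 9, not code from the course: repeatedly merge the two least-probable subtrees and prefix their codewords with 0 and 1. The function name `huffman_code` and the example source are illustrative assumptions.

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code from a dict {symbol: probability}."""
    # Heap entries are (weight, tie_breaker, {symbol: codeword_so_far});
    # the integer tie_breaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees, prefixing their
        # codewords with 0 and 1 respectively.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# A dyadic source, so Huffman meets the entropy bound exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # e.g. a -> 0, b -> 10, c -> 110, d -> 111
print(sum(p * len(code[s]) for s, p in probs.items()))  # 1.75 bits = H(X)
```

For this dyadic source the expected codeword length equals the entropy, the lower end of the bound H(X) <= L < H(X) + 1 from Lecture 8.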

Supplemental Materials

  • Sum of random variables

  • Independence

  • The Markov chain Monte Carlo revolution, by Persi Diaconis

  • Reference for arithmetic codes: Arithmetic coding for data compression, by I. H. Witten, R. M. Neal, and J. G. Cleary, Communications of the ACM, 1987

  • Midterm 1 review

  • Midterm 2 review

  • Midterm 1 review by TA

  • Midterm 2 review by TA

  • The Viterbi Algorithm, by G. David Forney, Jr., Proc. of the IEEE, Vol. 61, No. 3, March 1973

  • Polar codes demo, by Erdal Arikan at Bilkent University

  • You and your research, advice on research by R. Hamming, 1986.

  • Fisher information and surface area, proof in Sec. 4, by M. Costa and T. Cover, 1983.

  • Water-filling solution, a derivation given by Stephen Boyd and Lieven Vandenberghe in Convex Optimization (see the numerical sketch after this list).

  • Quantization and compression, introductory lecture notes by Robert Gray, Stanford University, 2007.
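
As a companion to the water-filling references above (Lecture 19 and the Boyd and Vandenberghe derivation), here is a small numerical sketch assuming the standard formulation: maximize sum_i log(1 + P_i/N_i) subject to sum_i P_i = P and P_i >= 0, whose solution is P_i = max(0, nu - N_i) with the water level nu set by the power budget. The function name `water_filling` and the example numbers are hypothetical, not from the course.

```python
def water_filling(noise, total_power, tol=1e-9):
    """Allocate power P_i = max(0, nu - N_i) across parallel Gaussian
    channels, choosing the water level nu by bisection so that
    sum(P_i) equals total_power."""
    lo, hi = min(noise), max(noise) + total_power
    while hi - lo > tol:
        nu = (lo + hi) / 2.0
        used = sum(max(0.0, nu - n) for n in noise)
        if used < total_power:
            lo = nu   # water level too low: budget not spent
        else:
            hi = nu   # water level too high: budget exceeded
    return [max(0.0, (lo + hi) / 2.0 - n) for n in noise]

# Three subchannels with noise powers 1, 2, 4 and total power 3:
print(water_filling([1.0, 2.0, 4.0], 3.0))  # -> about [2.0, 1.0, 0.0]
```

Bisection works because the total allocated power is nondecreasing in the water level nu; any subchannel whose noise power exceeds nu receives no power at all.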