This page was last updated 01/06/2015 06:30:18
Lecture 1: Introduction to information theory; Shannon entropy. Reading: Cover & Thomas, "Elements of Information Theory", or MacKay, "Information Theory, Inference, and Learning Algorithms".
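A minimal sketch (not part of the lecture material) of computing the Shannon entropy H(p) = -Σ_i p_i log p_i of a discrete distribution; the function name shannon_entropy and the example distributions are my own choices for illustration.

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log(p_i), in bits by default."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # by convention, 0 log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

# A fair coin carries 1 bit; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```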
Lecture 2: Axiomatic characterization of Shannon entropy (given in an appendix of Shannon's original paper). Joint, conditional, and relative entropies; mutual information. Cover & Thomas or MacKay.
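As a sketch (again not from the lectures), mutual information can be computed from a joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the joint pmfs below are invented examples.

```python
import numpy as np

def mutual_information(p_xy, base=2):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)            # marginal over Y
    p_y = p_xy.sum(axis=0)            # marginal over X

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p)) / np.log(base)

    return H(p_x) + H(p_y) - H(p_xy.ravel())

# Perfectly correlated bits share 1 bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))       # 1.0
# Independent bits share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
```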
Lecture 3: Laughlin's work on the retina; the weak-noise limit for estimating information transmission. Original paper by S. Laughlin.
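Laughlin's weak-noise prediction is that the information-maximizing response function equals the cumulative distribution of the stimulus, so that all response levels are used equally often ("histogram equalization"). A sketch of that prediction, assuming for illustration a Gaussian contrast distribution (the samples and parameters are hypothetical, not Laughlin's data):

```python
import numpy as np

rng = np.random.default_rng(0)
contrasts = rng.normal(0.0, 0.3, size=10_000)   # hypothetical contrast samples

def optimal_response(s, samples):
    """Empirical CDF of the stimulus evaluated at s: the predicted
    (normalized) response curve in the weak-noise limit."""
    samples = np.sort(samples)
    return np.searchsorted(samples, s) / samples.size

for s in (-0.3, 0.0, 0.3):
    print(f"stimulus {s:+.1f} -> response {optimal_response(s, contrasts):.2f}")
```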
Lecture 4: Kelly's horse race and bet hedging. The asymptotic equipartition property. Cover & Thomas or MacKay.
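In the Kelly horse race (Cover & Thomas, ch. 6), betting fractions b_i of wealth at odds o_i gives doubling rate W(b) = Σ_i p_i log2(b_i o_i), which is maximized by proportional betting b_i = p_i regardless of the odds. A small numerical sketch; the probabilities and odds below are invented for illustration.

```python
import numpy as np

def doubling_rate(b, p, odds):
    """Growth rate W(b) = sum_i p_i log2(b_i * o_i) of wealth per race."""
    b, p, odds = map(np.asarray, (b, p, odds))
    return np.sum(p * np.log2(b * odds))

p    = np.array([0.5, 0.3, 0.2])   # hypothetical win probabilities
odds = np.array([2.0, 4.0, 8.0])   # o_i-for-1 payoffs

# Kelly / proportional betting: b_i = p_i maximizes W for any odds.
print(doubling_rate(p, p, odds))          # ~0.214 bits per race
print(doubling_rate([1/3] * 3, p, odds))  # uniform betting does worse, ~0.115
```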