Physics 273 | Fall 2017

Lecture Notes:

This page was last updated 01/06/2015 06:30:18


Lecture 1: Introduction to information theory; Shannon entropy. Reading: Cover & Thomas, "Elements of Information Theory", or MacKay, "Information Theory, Inference, and Learning Algorithms".
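As a quick companion to the lecture, here is a minimal sketch of the Shannon entropy of a discrete distribution, computed in bits; the function name and example distribution are illustrative, not from the course materials.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits.

    Terms with p_i = 0 are skipped, using the convention 0 log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty;
# a uniform distribution over 4 outcomes carries 2 bits.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(shannon_entropy([0.25] * 4))          # 2.0
```

Note that entropy is maximized by the uniform distribution and vanishes for a deterministic outcome, two facts the axiomatic treatment in Lecture 2 makes precise.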

Lecture 2: Axiomatic derivation of Shannon entropy (given in an appendix of Shannon's original paper). Joint, conditional, and relative entropies. Reading: Cover & Thomas or MacKay.
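The chain-rule identity H(Y|X) = H(X,Y) - H(X) and the relative entropy D(p||q) can be sketched in a few lines; the joint distribution below is a made-up example, not one from the lectures.

```python
import math

def H(p):
    """Shannon entropy in bits, with the convention 0 log 0 = 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
pxy = [[0.25, 0.25],
       [0.50, 0.00]]

joint = H([p for row in pxy for p in row])   # H(X, Y)
px = [sum(row) for row in pxy]               # marginal p(x)
cond = joint - H(px)                         # chain rule: H(Y|X) = H(X,Y) - H(X)
print(cond)                                  # 0.5

def kl(p, q):
    """Relative entropy D(p || q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(kl([0.5, 0.5], [0.5, 0.5]))            # 0.0: D(p||p) vanishes
```

Relative entropy is nonnegative and zero only when p = q, which is the workhorse inequality behind most of the bounds in the course.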

Lecture 3: Mutual information; Lagrange multipliers; Laughlin's work on the retina. Reading: Cover & Thomas or MacKay, together with the original paper by S. Laughlin.
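Mutual information can be computed from entropies via I(X;Y) = H(X) + H(Y) - H(X,Y); the sketch below checks this on a noiseless binary channel, an illustrative example of my own choosing.

```python
import math

def H(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), in bits, from a joint table p(x, y)."""
    px = [sum(row) for row in pxy]           # marginal over x (rows)
    py = [sum(col) for col in zip(*pxy)]     # marginal over y (columns)
    hxy = H([p for row in pxy for p in row])
    return H(px) + H(py) - hxy

# Noiseless binary channel: Y = X, so I(X;Y) = H(X) = 1 bit.
pxy = [[0.5, 0.0],
       [0.0, 0.5]]
print(mutual_information(pxy))               # 1.0
```

For independent X and Y the joint table factorizes and the same function returns 0, consistent with mutual information measuring statistical dependence.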

Lecture 4: The weak-noise limit for estimating information transmission. The Kelly horse race and bet hedging. Reading: Cover & Thomas.
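The Kelly horse race can be sketched numerically: with win probabilities p_i and odds o_i (paid o_i-for-1), a gambler betting fractions b_i of her wealth grows it at doubling rate W(b,p) = sum_i p_i log2(b_i o_i), and the log-optimal ("proportional gambling") choice is b* = p. The probabilities and odds below are made-up numbers for illustration.

```python
import math

def growth_rate(p, odds, b):
    """Doubling rate W(b, p) = sum_i p_i log2(b_i o_i), in bits per race,
    for win probabilities p, payoff odds o (o_i-for-1), and bet fractions b."""
    return sum(pi * math.log2(bi * oi) for pi, oi, bi in zip(p, odds, b))

p = [0.6, 0.4]                               # assumed win probabilities
odds = [2.0, 2.0]                            # even 2-for-1 odds on each horse

kelly = p                                    # Kelly: bet in proportion to beliefs
print(growth_rate(p, odds, kelly))           # positive optimal rate
print(growth_rate(p, odds, [0.5, 0.5]))      # naive equal split: 0 bits/race
```

The gap between the two rates equals the relative entropy D(p || b) of the beliefs from the bets, which is the information-theoretic content of Kelly's result and the link to bet hedging discussed in lecture.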