Applications of information theory

Material covered: axiomatic definition of entropy, joint and conditional entropy, chain rule, convexity, sub-additivity, Fano’s inequality, binomial tail estimates; perfect matchings: Bregman’s theorem [Radhakrishnan]; the Lovász local lemma and its probabilistic algorithm [Moser]; Cauchy-Schwarz and generalizations using entropy [Friedgut]; mutual information, sub-modularity; applications to additive combinatorics: entropy versions of the Ruzsa distance and the Plünnecke-Ruzsa inequality [Tao, Madiman]; relative entropy: connections to entropy and information, Pinsker’s inequality; applications to harmonic functions on groups; Shearer’s inequality and simple applications to geometry; the second law of thermodynamics (on Markov chains); the principle of maximum entropy; prefix codes; the asymptotic equipartition property; Kolmogorov complexity; continuous-time Markov chains: convergence in variance and in divergence, and the Poincaré and log-Sobolev inequalities; isoperimetric and log-Sobolev inequalities for the discrete cube.
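
For orientation, two of the basic facts from the first part of the list can be written out explicitly; these are the standard statements (with entropy and divergence measured in nats), not quotations from the course notes:

    \[
    H(X,Y) \;=\; H(X) + H(Y \mid X),
    \qquad
    D(P \,\|\, Q) \;\ge\; \tfrac{1}{2}\,\lVert P - Q \rVert_1^2 .
    \]

The first identity is the chain rule; the second is Pinsker’s inequality, which bounds the total variation distance between two distributions by their relative entropy.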
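
The probabilistic algorithm cited as [Moser] is Moser’s resampling algorithm for the local lemma; the following is a minimal Python sketch for k-SAT, under the assumption that the local-lemma condition holds (each clause shares variables with few others), which is what guarantees fast termination via the entropy-compression argument. The function name and clause encoding are illustrative, not taken from the course.

    import random

    def moser_resample(clauses, n, seed=0):
        """Moser's resampling algorithm for k-SAT (sketch).

        clauses: list of clauses; a clause is a list of literals (var, sign),
                 and a literal is satisfied iff assignment[var] == sign.
        n: number of Boolean variables.
        """
        rng = random.Random(seed)
        assignment = [rng.random() < 0.5 for _ in range(n)]  # uniform start

        def violated(clause):
            # A clause is violated iff every one of its literals is false.
            return all(assignment[var] != sign for var, sign in clause)

        while True:
            bad = next((c for c in clauses if violated(c)), None)
            if bad is None:
                return assignment  # all clauses satisfied
            # Resample all variables of one violated clause uniformly at random.
            for var, _ in bad:
                assignment[var] = rng.random() < 0.5

    # Example: (x0 or x1) and (not x0 or x2).
    print(moser_resample([[(0, True), (1, True)], [(0, False), (2, True)]], 3))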
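
Shearer’s inequality, in one common form: if every coordinate i in {1, ..., n} belongs to at least k sets of a family F of subsets of {1, ..., n}, then

    \[
    H(X_1,\dots,X_n) \;\le\; \frac{1}{k} \sum_{S \in \mathcal{F}} H(X_S).
    \]

A simple geometric consequence (presumably the kind of application to geometry meant above) is the Loomis-Whitney inequality: taking X uniform on a finite set A in Z^3 and F the three coordinate pairs (so k = 2) gives |A|^2 <= |A_{xy}| |A_{yz}| |A_{xz}|, where the right-hand sides are the sizes of the three coordinate-plane projections of A.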

Assignments: one, two, and three.

Other relevant courses with scribes:
Mark Braverman’s (Princeton)
Anup Rao’s (University of Washington)