Applications of information theory

Material covered:

  • axiomatic definition of entropy, joint and conditional entropy, chain rule, convexity, sub-additivity, binomial tail estimates (the chain rule and sub-additivity are written out after this list)
  • Bregman’s theorem on perfect matchings [Radhakrishnan]
  • the Lovász local lemma and Moser’s probabilistic algorithm [Moser] (a sketch of the resampling loop appears after this list)
  • Shearer’s inequality and simple applications to geometry (one example below)
  • Cauchy-Schwarz and generalizations using entropy [Friedgut]
  • mutual information, sub-modularity; applications to additive combinatorics: entropy versions of the Ruzsa distance and the Plünnecke-Ruzsa inequality [Tao, Madiman] (the entropy Ruzsa distance is recalled below)
  • Pinsker’s inequality (stated below); applications to harmonic functions on groups
  • second law of thermodynamics (on Markov chains; stated below)
  • Principle of maximum entropy. Prefix codes (Kraft’s inequality below). Asymptotic equipartition property
  • Markov chains: convergence in variance and in divergence, and Poincaré and log-Sobolev inequalities; isoperimetric inequalities and log-Sobolev inequalities for the discrete cube
  • information in learning theory
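
For reference, the chain rule and sub-additivity from the first item, for discrete random variables X and Y:

    \[ H(X,Y) = H(X) + H(Y \mid X), \qquad H(X,Y) \le H(X) + H(Y), \]

with equality on the right if and only if X and Y are independent.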
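
The resampling loop behind the algorithmic local lemma fits in a few lines. Below is a minimal sketch in the Moser-Tardos style for k-SAT, assuming clauses are encoded as lists of (variable, sign) pairs; the name moser_fix and the encoding are illustrative choices, not from the lecture notes. The entropy-compression argument is what bounds the expected number of resamplings.

    import random

    def moser_fix(num_vars, clauses, rng=random.Random(0)):
        """Moser/Tardos-style resampling for k-SAT (illustrative sketch).

        Each clause is a list of literals (var_index, is_positive).
        Repeatedly pick a violated clause and resample its variables,
        until every clause is satisfied.
        """
        # Start from a uniformly random assignment.
        assignment = [rng.random() < 0.5 for _ in range(num_vars)]

        def violated(clause):
            # A clause is violated iff every one of its literals is false.
            return all(assignment[v] != pos for v, pos in clause)

        while True:
            bad = [c for c in clauses if violated(c)]
            if not bad:
                return assignment
            # Resample only the variables of one violated clause.
            for v, _ in rng.choice(bad):
                assignment[v] = rng.random() < 0.5

    # Example: (x0 or not x1) and (x1 or x2) and (not x2 or x3).
    print(moser_fix(4, [[(0, True), (1, False)],
                        [(1, True), (2, True)],
                        [(2, False), (3, True)]]))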
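
A standard simple geometric application of Shearer’s inequality (given here as a representative example) is the discrete Loomis-Whitney inequality: for a finite set A in Z^3 with coordinate projections A_xy, A_yz, A_xz,

    \[ |A|^2 \le |A_{xy}| \, |A_{yz}| \, |A_{xz}|. \]

Take (X,Y,Z) uniform on A; Shearer gives 2 H(X,Y,Z) <= H(X,Y) + H(Y,Z) + H(X,Z), and each term on the right is at most the log of the corresponding projection size.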
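
The entropy version of the Ruzsa distance, in the formulation of [Tao]: for random variables X and Y taking values in an abelian group,

    \[ d(X;Y) := H(X' - Y') - \tfrac{1}{2} H(X) - \tfrac{1}{2} H(Y), \]

where X' and Y' are independent copies of X and Y. This quantity is symmetric, non-negative, and satisfies the Ruzsa triangle inequality d(X;Z) <= d(X;Y) + d(Y;Z).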
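
Pinsker’s inequality, as it enters the application to harmonic functions: for probability distributions P and Q on a common space (with divergence measured in nats),

    \[ \|P - Q\|_{\mathrm{TV}} \le \sqrt{\tfrac{1}{2} D(P \| Q)}, \qquad \|P - Q\|_{\mathrm{TV}} := \sup_A |P(A) - Q(A)|. \]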
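
The “second law” on Markov chains is the monotonicity of divergence under a common transition kernel: if the distributions mu_n and nu_n both evolve by the same Markov kernel, then

    \[ D(\mu_{n+1} \| \nu_{n+1}) \le D(\mu_n \| \nu_n), \]

and taking nu_0 to be a stationary distribution pi shows that D(mu_n || pi) is non-increasing in n.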
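
For the prefix-codes item, Kraft’s inequality is the basic fact linking codeword lengths to entropy: a binary prefix code with codeword lengths l_1, ..., l_n exists if and only if

    \[ \sum_{i=1}^{n} 2^{-\ell_i} \le 1, \]

which is what places the expected length of an optimal prefix code between H(X) and H(X) + 1 (entropy in bits).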

Assignments: one, two and three

Other relevant courses with scribes:
Mark Braverman’s (Princeton)
Anup Rao’s (University of Washington)

Gromov on entropy: here