Convergence of MDL and Bayesian Methods

This video was recorded at the Notions of Complexity: Information-theoretic, Computational and Statistical Approaches Workshop, Eindhoven, 2004. We introduce a complexity measure which we call KL-complexity. Based on this concept, we present a general information exponential inequality that measures the statistical complexity of certain deterministic and randomized estimators. We show that simple and clean finite-sample convergence bounds can be obtained from this approach. In particular, we are able to improve some classical results concerning the convergence of MDL density estimation and Bayesian posterior distributions.
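The abstract does not state the inequality itself, but bounds of this kind are typically driven by the Donsker-Varadhan change-of-measure inequality, in which the KL divergence between a data-dependent posterior and a fixed prior serves as the complexity term. A minimal sketch of that mechanism follows; the loss \(\ell\), rate \(\lambda\), and sample \(Z_1,\dots,Z_n\) are illustrative assumptions, not the talk's own notation. For any prior \(\pi\) on a parameter space \(\Theta\), any (possibly data-dependent) distribution \(\hat\pi \ll \pi\), and any measurable \(h\) with \(\mathbb{E}_{\theta\sim\pi}[e^{h(\theta)}] < \infty\),

\[
\mathbb{E}_{\theta \sim \hat\pi}\bigl[h(\theta)\bigr]
\;\le\;
D_{\mathrm{KL}}\bigl(\hat\pi \,\|\, \pi\bigr)
+ \ln \mathbb{E}_{\theta \sim \pi}\bigl[e^{h(\theta)}\bigr].
\]

Choosing \(h(\theta) = -\lambda \sum_{i=1}^{n} \ell(\theta, Z_i)\) and controlling the exponential moment on the right-hand side yields finite-sample bounds in which the complexity of a randomized estimator \(\hat\pi\) enters only through \(D_{\mathrm{KL}}(\hat\pi \,\|\, \pi)\), which appears to be the role played by the KL-complexity measure described in the abstract.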
