Online Learning and Bregman Divergences

This video was recorded at the Machine Learning Summer School (MLSS), Taipei 2006. The course consists of three lectures:

Lecture 1: Introduction to Online Learning
  • Predicting as well as the best expert
  • Predicting as well as the best linear combination of experts
  • Additive versus multiplicative families of updates

Lecture 2: Bregman Divergences and Loss Bounds
  • Introduction to Bregman divergences (see the sketch after this outline)
  • Relative loss bounds for the linear case
  • Nonlinear case and matching losses
  • Duality and relation to exponential families

Lecture 3: Extensions, Interpretations, Applications
  • Online-to-batch conversions
  • Prior information on the weight vector
  • Some applications
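As background for Lecture 2, here is a minimal sketch of the central definition, written in standard notation rather than the notation used in the lecture slides. For a strictly convex, differentiable function $F$, the Bregman divergence induced by $F$ is

$$
\Delta_F(\boldsymbol{w}, \boldsymbol{u}) \;=\; F(\boldsymbol{w}) - F(\boldsymbol{u}) - \nabla F(\boldsymbol{u})^{\top} (\boldsymbol{w} - \boldsymbol{u}),
$$

i.e. the gap between $F(\boldsymbol{w})$ and its first-order Taylor approximation around $\boldsymbol{u}$. For example, choosing $F(\boldsymbol{w}) = \tfrac{1}{2}\lVert\boldsymbol{w}\rVert_2^2$ yields the squared Euclidean distance associated with the additive (gradient-descent-style) updates, while choosing $F(\boldsymbol{w}) = \sum_i w_i \ln w_i$ on the probability simplex yields the relative entropy associated with the multiplicative (exponentiated-gradient-style) updates mentioned in Lecture 1.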
