Material Detail

Exploiting feature covariance in high-dimensional online learning

This video was recorded at the 13th International Conference on Artificial Intelligence and Statistics (AISTATS), Sardinia, 2010.

Some online algorithms for linear classification model the uncertainty in their weights over the course of learning. Modeling the full covariance structure of the weights can provide a significant advantage for classification. However, for high-dimensional, large-scale data, even though there may be many second-order feature interactions, it is computationally infeasible to maintain the full covariance structure. To extend second-order methods to high-dimensional data, we develop low-rank approximations of the covariance structure. We evaluate our approach on both synthetic and real-world data sets using the confidence-weighted online learning framework, and show improvements over diagonal covariance matrices for both low- and high-dimensional data.
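To make the idea of a compressed weight covariance concrete, here is a minimal sketch of a second-order online learner whose covariance is repeatedly approximated by a diagonal-plus-rank-k matrix. This is not the authors' exact algorithm: the AROW-style update rule, the function names (`arow_lowrank`, `lowrank_plus_diag`), the hyperparameters `r` and `k`, and the eigendecomposition-based compression step are all illustrative assumptions.

```python
# A minimal sketch (not the paper's exact method) of a second-order online
# learner that compresses its weight covariance to diagonal-plus-rank-k form.
import numpy as np


def lowrank_plus_diag(Sigma, k):
    """Approximate Sigma by a rank-k factor U @ U.T with Sigma's exact diagonal."""
    diag = np.diag(Sigma).copy()
    vals, vecs = np.linalg.eigh(Sigma)
    top = np.argsort(vals)[::-1][:k]                  # indices of the k largest eigenvalues
    U = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
    approx = U @ U.T                                  # low-rank off-diagonal structure
    np.fill_diagonal(approx, diag)                    # keep per-feature confidences exact
    return approx


def arow_lowrank(X, y, k=5, r=1.0):
    """Online AROW-style updates with the covariance compressed after each step."""
    n, d = X.shape
    mu = np.zeros(d)       # mean weight vector
    Sigma = np.eye(d)      # weight covariance (kept dense here for clarity)
    for x, label in zip(X, y):
        margin = label * (mu @ x)
        if margin < 1.0:                              # suffered hinge-style loss
            Sx = Sigma @ x
            beta = 1.0 / (x @ Sx + r)
            alpha = (1.0 - margin) * beta
            mu = mu + alpha * label * Sx              # second-order mean update
            Sigma = Sigma - beta * np.outer(Sx, Sx)   # rank-one covariance downdate
            Sigma = lowrank_plus_diag(Sigma, k)       # compress to diag + rank-k
    return mu, Sigma


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=20)
    X = rng.normal(size=(500, 20))
    y = np.sign(X @ w_true)
    mu, _ = arow_lowrank(X, y, k=5)
    print(f"training accuracy: {np.mean(np.sign(X @ mu) == y):.3f}")
```

The sketch keeps a dense d x d covariance for readability; to realize the computational benefit described in the abstract, one would instead maintain the diagonal and the rank-k factors directly, so that each update costs O(dk) rather than O(d^2).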
