Efficient Projections onto the L1-Ball for Learning in High Dimensions

This video was recorded at the 25th International Conference on Machine Learning (ICML), Helsinki, 2008. We describe efficient algorithms for projecting a vector onto the L1-ball. We present two methods for projection. The first performs exact projection in O(n) time, where n is the dimension of the space. The second works on vectors in which k elements are perturbed outside the L1-ball, projecting in O(k log(n)) time. This setting is especially useful for online learning in sparse feature spaces such as text categorization applications. We demonstrate the merits and effectiveness of our algorithms in numerous batch and online learning tasks. We show that variants of stochastic gradient projection methods augmented with our efficient projection procedures outperform state-of-the-art optimization techniques such as interior point methods. We also show that in online settings, gradient updates with L1 projections outperform the EG algorithm while obtaining models with high degrees of sparsity. Disclaimer: VideoLectures.NET notes that this talk was not recorded in its full duration but was published because of its content value.
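For orientation, the sketch below shows the standard sort-based projection onto the L1-ball, which runs in O(n log n) because of the sort; the exact O(n) expected-time method described in the talk avoids the full sort via randomized pivoting and is not reproduced here. The function name project_onto_l1_ball and the NumPy-based code are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def project_onto_l1_ball(v, z=1.0):
        # Sort-based projection onto {w : ||w||_1 <= z} (illustrative sketch,
        # not the talk's linear-time algorithm).
        if z <= 0:
            raise ValueError("radius z must be positive")
        a = np.abs(v)
        if a.sum() <= z:
            return v.copy()                      # already inside the ball
        u = np.sort(a)[::-1]                     # magnitudes in descending order
        cssv = np.cumsum(u)                      # cumulative sums of sorted magnitudes
        # largest index rho with u[rho] - (cssv[rho] - z) / (rho + 1) > 0
        rho = np.nonzero(u - (cssv - z) / np.arange(1, len(u) + 1) > 0)[0][-1]
        theta = (cssv[rho] - z) / (rho + 1.0)    # soft-thresholding level
        return np.sign(v) * np.maximum(a - theta, 0.0)

    # Example: project a random vector onto the unit L1-ball.
    rng = np.random.default_rng(0)
    w = project_onto_l1_ball(rng.normal(size=10), z=1.0)
    print(np.abs(w).sum())                       # approximately 1.0

In a projected (stochastic) gradient method of the kind the talk evaluates, a routine like this would be applied after each gradient step, i.e. w is replaced by the projection of w - eta * gradient, which keeps the iterates inside the L1-ball and encourages sparse solutions.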
