Lecture 3: Convergence Proof

This video was recorded at Stanford Engineering Everywhere, EE364B - Convex Optimization II. Okay, if there are no questions about last time, then I think we'll just jump in and start on subgradient methods. So for subgradient methods, we look at the – I mean, the subgradient method is embarrassingly simple, right, it's – you make a step in the negative, in a negative – I'll call it the negative, but the correct English would be "a negative" – subgradient direction. ... See the whole transcript at Convex Optimization II - Lecture 03
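
The update being described is x_{k+1} = x_k - alpha_k * g_k, where g_k is any subgradient of f at x_k (hence "a negative" rather than "the negative" subgradient direction, since the subgradient need not be unique). As a minimal sketch, not taken from the lecture itself: the Python below uses the standard diminishing step size alpha_k = 1/k, and the l1 objective, the random data, and the names subgradient_method and subgrad are illustrative assumptions. Because the subgradient method is not a descent method, the sketch tracks the best point found so far.

    import numpy as np

    def subgradient_method(f, subgrad, x0, steps=500):
        """Minimal subgradient method: x_{k+1} = x_k - alpha_k * g_k.

        Uses the diminishing step size alpha_k = 1/k, one standard choice
        for which the method converges. Since f(x_k) need not decrease at
        each step, the best iterate seen so far is returned.
        """
        x = x0.copy()
        f_best, x_best = f(x), x.copy()
        for k in range(1, steps + 1):
            g = subgrad(x)            # any subgradient of f at x
            x = x - (1.0 / k) * g     # step in a negative subgradient direction
            if f(x) < f_best:         # keep the best point found so far
                f_best, x_best = f(x), x.copy()
        return x_best, f_best

    # Illustrative nondifferentiable problem: minimize f(x) = ||A x - b||_1.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    f = lambda x: np.abs(A @ x - b).sum()
    subgrad = lambda x: A.T @ np.sign(A @ x - b)  # a subgradient of the l1 objective
    x_best, f_best = subgradient_method(f, subgrad, np.zeros(5))
    print(f_best)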
