Robustness and Generalizability for Markovian Samples
This video was recorded at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD), Bled 2009. We consider the robustness of learning algorithms and prove that, under a very general setup, robustness of an algorithm implies that it generalizes; consequently, a robust algorithm that asymptotically minimizes empirical risk is consistent. In particular, this relationship holds when both training and testing samples are generated by a Markov chain satisfying the Doeblin condition. We further provide conditions that ensure robustness, and hence generalizability and in some cases consistency, all under the Markovian setup. Two notable examples are support vector machines and Lasso.
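
The abstract does not spell out the formal definitions it relies on. The sketch below states a standard notion of algorithmic robustness and one common (minorization) form of the Doeblin condition, as background assumptions rather than the speakers' exact statements.

```latex
% Background sketch, not the talk's exact definitions.

% Robustness: an algorithm \mathcal{A} trained on a sample s is
% (K, \epsilon(s))-robust if the sample space \mathcal{Z} can be
% partitioned into K cells such that a training point and a test point
% falling in the same cell incur nearly the same loss:
\[
\mathcal{Z} = \bigcup_{i=1}^{K} C_i, \qquad
z \in s,\; z' \in \mathcal{Z},\; z, z' \in C_i
\;\Longrightarrow\;
\bigl|\,\ell(\mathcal{A}_s, z) - \ell(\mathcal{A}_s, z')\,\bigr| \le \epsilon(s).
\]

% Doeblin (minorization) condition: the transition kernel P of the Markov
% chain satisfies, for some integer m \ge 1, constant \delta > 0, and
% probability measure \nu,
\[
P^{m}(z, B) \;\ge\; \delta\, \nu(B)
\qquad \text{for all } z \in \mathcal{Z} \text{ and measurable } B \subseteq \mathcal{Z},
\]
% which implies the chain is uniformly ergodic, i.e. it mixes toward its
% stationary distribution at a geometric rate.
```

Under such a mixing assumption, dependent Markovian samples behave enough like i.i.d. samples for the robustness-implies-generalization argument to go through.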