
Kernel Representations and Kernel Density Estimation

This video was recorded at the Workshop on Sparsity and Inverse Problems in Statistical Theory and Econometrics, Berlin 2008. There has been a great deal of attention in recent times, particularly in machine learning, to the representation of multivariate data points x by K(x, ·), where K is positive and symmetric and thus induces a reproducing kernel Hilbert space. The idea is then to use the matrix K(Xi, Xj) as a substitute for the empirical covariance matrix of a sample X1, ..., Xn for PCA and other inference (see, for instance, Jordan and Fukumizu (2006)). Nadler et al. (2006) connected this approach to one based on random walks and diffusion limits and indicated a connection to kernel density estimation. By making at least a formal connection to a multiplication operator on a function space, we make a further connection and show how the clustering results of Beylkin, Shih and Yu (2008), which apparently differ from those of Nadler et al., can be explained.
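As a rough illustration of the kernel-matrix substitution described in the abstract, the sketch below (not taken from the talk; the Gaussian kernel, the bandwidth h, and the simulated sample are placeholder assumptions) forms K(Xi, Xj) for a sample, uses its centred eigendecomposition as the kernel-PCA analogue of covariance-based PCA, and reads a kernel density estimate off the same matrix.

```python
import numpy as np

# Hypothetical sketch: build the Gaussian kernel matrix K(X_i, X_j) for a
# sample X_1, ..., X_n, use its leading eigenvectors in place of the
# empirical covariance matrix (kernel PCA), and note that a kernel density
# estimate is just a normalised row average of the same matrix.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))   # simulated sample X_1, ..., X_n in R^2
h = 0.5                         # bandwidth (assumed; not specified in the abstract)

# Gaussian kernel matrix K_ij = exp(-||X_i - X_j||^2 / (2 h^2))
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / (2 * h ** 2))

# Double-centre K in feature space and take its top eigenvectors:
# the kernel-PCA substitute for diagonalising the empirical covariance.
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J
eigvals, eigvecs = np.linalg.eigh(Kc)          # ascending order
components = eigvecs[:, ::-1][:, :2]           # leading two kernel directions

# Kernel density estimate at the sample points: the mean of each kernel row,
# normalised by the Gaussian constant (2*pi*h^2)^(d/2).
d = X.shape[1]
kde = K.mean(axis=1) / ((2 * np.pi * h ** 2) ** (d / 2))

print(components.shape)   # (200, 2)
print(kde[:5])            # density estimates at the first five points
```

The point of the sketch is only that the single matrix K(Xi, Xj) feeds both computations: its spectrum drives the PCA-style inference, while its row averages give the kernel density estimate, which is the connection attributed to Nadler et al. in the abstract.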

