Matrix Perturbation and Manifold-based Dimension Reduction.

Applied and Computational Mathematics Seminar
Monday, November 23, 2009 - 13:00
1 hour (actually 50 minutes)
Skiles 255
Georgia Tech (School of ISyE)
Many algorithms have been proposed in the past ten years that exploit manifold structure for dimension reduction. Interestingly, many of them end up computing eigen-subspaces. Applying theorems from matrix perturbation theory, we study the consistency and rate of convergence of some manifold-based learning algorithms. In particular, we study local tangent space alignment (Zhang & Zha, 2004) and give a worst-case upper bound on its performance. Some conjectures on the rate of convergence are made. This is joint work with a former student, Andrew Smith.
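
Below is a minimal sketch of local tangent space alignment (LTSA), the algorithm the abstract refers to, written to illustrate how the method reduces to an eigen-subspace computation. The function name, parameters, and implementation details are illustrative assumptions, not the speaker's code or analysis.

```python
# Illustrative sketch of LTSA (Zhang & Zha 2004): local tangent spaces from
# per-neighborhood SVDs, a global alignment matrix, and a final eigen-subspace.
import numpy as np
from scipy.spatial import cKDTree
from scipy.linalg import eigh


def ltsa(X, n_neighbors=10, d=2):
    """Embed the rows of X (n points in R^D) into R^d via LTSA (sketch)."""
    n, k = X.shape[0], n_neighbors
    # k-nearest neighbors of every point (each point is its own first neighbor)
    _, idx = cKDTree(X).query(X, k=k)

    # Alignment matrix accumulated over all local neighborhoods
    B = np.zeros((n, n))
    for i in range(n):
        Ni = idx[i]                          # indices of the i-th neighborhood
        Xi = X[Ni] - X[Ni].mean(axis=0)      # center the neighborhood
        # Local tangent coordinates: top-d left singular vectors of Xi
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        G = np.hstack([np.ones((k, 1)) / np.sqrt(k), U[:, :d]])  # k x (d+1)
        # Accumulate the local misalignment term I - G G^T
        B[np.ix_(Ni, Ni)] += np.eye(k) - G @ G.T

    # Global embedding: eigenvectors of B for the d smallest nonzero eigenvalues
    # (the eigenvalue ~ 0 corresponds to the constant vector and is skipped)
    _, vecs = eigh(B)
    return vecs[:, 1:d + 1]
```

The final step, extracting the bottom eigen-subspace of the alignment matrix, is the part amenable to matrix-perturbation arguments. An established implementation is available as sklearn.manifold.LocallyLinearEmbedding with method='ltsa'.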