Selectable Reduced Rank Regression and Principal Component Analysis

Series
Stochastics Seminar
Time
Tuesday, October 9, 2012 - 3:05pm (50 minutes)
Location
Skiles 005
Speaker
Yiyuan She – Florida State University
Organizer
Karim Lounici
Abstract
Rank reduction, as an effective technique for dimension reduction, is widely used in statistical modeling and machine learning. Modern statistical applications entail high-dimensional data analysis, where a large number of nuisance variables may exist. Plain rank reduction, however, cannot discern relevant or important variables. This talk discusses joint variable and rank selection for predictive learning. We propose to apply sparsity and reduced-rank techniques to attain simultaneous feature selection and feature extraction in a vector regression setup. A class of estimators is introduced based on novel penalties that impose both row and rank restrictions on the coefficient matrix. Selectable principal component analysis is proposed and studied from a self-regression standpoint, which extends sparse principal component analysis. We show that these estimators adapt to the unknown matrix sparsity and achieve fast rates of convergence in comparison with the LASSO and reduced rank regression. Efficient computational algorithms are developed and applied to real-world problems.
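For readers who want a concrete picture of "row and rank restrictions on the coefficient matrix," below is a minimal NumPy sketch of one way such a joint restriction can be enforced: a gradient step on the squared loss, followed by row-wise soft-thresholding (group-lasso-style variable selection) and singular-value truncation (reduced-rank feature extraction). This is an illustrative sketch, not the speaker's estimator or algorithm; the function name `sparse_rrr`, the tuning values `lam` and `rank`, and the simulated data are all hypothetical.

```python
import numpy as np

def sparse_rrr(Y, X, rank=2, lam=1.0, n_iter=200):
    """Toy alternating scheme for a row-sparse, rank-restricted
    coefficient matrix C in the model Y ~ X @ C.

    Each iteration takes a gradient step on 0.5 * ||Y - X C||_F^2,
    soft-thresholds whole rows of C (variable selection), then
    truncates C to its leading singular values (feature extraction).
    """
    p, q = X.shape[1], Y.shape[1]
    C = np.zeros((p, q))
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / sigma_max(X)^2
    for _ in range(n_iter):
        C = C + step * X.T @ (Y - X @ C)     # gradient step
        # Row-wise soft-thresholding: shrink rows, zeroing weak predictors.
        row_norms = np.linalg.norm(C, axis=1, keepdims=True)
        C = C * np.maximum(1 - lam * step / np.maximum(row_norms, 1e-12), 0)
        # Project onto matrices of rank <= `rank` via truncated SVD.
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        C = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return C

# Hypothetical example: 30 predictors, only the first 5 relevant,
# true coefficient matrix of rank 2.
rng = np.random.default_rng(0)
n, p, q, r = 100, 30, 10, 2
C_true = np.zeros((p, q))
C_true[:5] = rng.standard_normal((5, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ C_true + 0.1 * rng.standard_normal((n, q))
C_hat = sparse_rrr(Y, X, rank=2, lam=2.0)
print("selected rows:", np.flatnonzero(np.linalg.norm(C_hat, axis=1) > 1e-8))
```

In this construction the two restrictions act on complementary structure: the row thresholding performs variable selection while the SVD truncation performs feature extraction, which is the simultaneity the abstract refers to.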