Lower bounds for the estimation of principal components

Series
Stochastics Seminar
Time
Thursday, February 4, 2021 - 3:30pm for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Martin Wahl – Humboldt University in Berlin – martin.wahl@math.hu-berlin.de – https://www.mathematik.hu-berlin.de/de/forschung/forschungsgebiete/stochastik/stoch-employees/martin-wahl/
Organizer
Vladimir Koltchinskii

This talk will be concerned with nonasymptotic lower bounds for the estimation of principal subspaces. I will start by reviewing some previous methods, including the local asymptotic minimax theorem and the Grassmann approach. Then I will present a new approach based on a van Trees inequality (i.e. a Bayesian version of the Cramér-Rao inequality) tailored for invariant statistical models. As applications, I will provide nonasymptotic lower bounds for principal component analysis and the matrix denoising problem, two examples that are invariant with respect to the orthogonal group. These lower bounds are characterized by doubly substochastic matrices whose entries are bounded by the different Fisher information directions, confirming recent upper bounds in the context of the empirical covariance operator.
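For orientation, here is the classical one-dimensional van Trees inequality in its standard form; the talk develops an invariant, multivariate version tailored to models such as PCA, so the statement used there may differ. For an estimator \hat\theta(X) of a real parameter \theta with model density p(x \mid \theta), Fisher information I(\theta), and a smooth prior density \pi on an interval [a, b] with \pi(a) = \pi(b) = 0,

\[
\int_a^b \mathbb{E}_\theta\big[(\hat\theta(X) - \theta)^2\big]\, \pi(\theta)\, d\theta \;\ge\; \frac{1}{\int_a^b I(\theta)\, \pi(\theta)\, d\theta + \mathcal{I}(\pi)},
\qquad
\mathcal{I}(\pi) = \int_a^b \frac{\pi'(\theta)^2}{\pi(\theta)}\, d\theta,
\]

where \mathcal{I}(\pi) is the Fisher information of the prior. Unlike the Cramér-Rao bound, no unbiasedness assumption on \hat\theta is required, which is what makes Bayesian versions of this type useful for minimax lower bounds.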

Seminar link: https://bluejeans.com/129119189