Special topics course offered in Spring 2018 by Henry Matzinger on "Big matrices and real life machine learning applied problems"
-PCA and SVD in high dimension revisited, applied to text classification, and compared to other dimensionality reduction techniques
-computing eigenvectors for PCA in high dimension via distributed approximation algorithms
-sparse PCA in high dimension (that is, finding rotations which lead to sparse factors)
-our method for recovering the true spectrum of the covariance matrix from the sample covariance matrix, with many applications in machine learning, and a comparison to free-probability-based methods
-deep learning, AdaBoost, kernelization, and other methods revisited
-a short introduction to the theory of VC-dimension, with a view toward deep learning and the search for other function classes which might also be useful
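As a minimal sketch of the first topic, PCA via SVD for dimensionality reduction: assuming a centered document-term matrix (synthetic data here, standing in for e.g. TF-IDF features), the right singular vectors give the principal directions and the squared singular values give the covariance eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical document-term matrix: 6 documents, 4 terms (synthetic counts)
X = rng.poisson(3.0, size=(6, 4)).astype(float)

# Center the columns, then take the SVD; principal directions are rows of Vt
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-2 principal components (dimensionality reduction)
Z = Xc @ Vt[:2].T

# Squared singular values over (n - 1) are the eigenvalues of the
# sample covariance matrix, i.e. the variances along each component
eigvals = s**2 / (X.shape[0] - 1)
```

For text classification one would feed the reduced representation Z to a classifier instead of the raw high-dimensional matrix X.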
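To illustrate why recovering the true spectrum from the sample covariance matrix is non-trivial in high dimension: when the dimension p is comparable to the sample size n, the sample eigenvalues of identity-covariance data do not concentrate at 1 but spread over roughly the Marchenko-Pastur interval [(1 - sqrt(p/n))^2, (1 + sqrt(p/n))^2]. A minimal illustration with synthetic Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 1000, 500                 # sample size and dimension, ratio p/n = 0.5
X = rng.standard_normal((n, p))  # true covariance is exactly the identity

S = X.T @ X / n                  # sample covariance matrix
eigs = np.linalg.eigvalsh(S)

# All true eigenvalues equal 1, yet the sample spectrum spreads over
# approximately the Marchenko-Pastur support computed below
gamma = p / n
lower, upper = (1 - gamma**0.5) ** 2, (1 + gamma**0.5) ** 2
```

Here with p/n = 0.5 the sample eigenvalues range from near 0.09 up to near 2.9, which is the distortion that spectrum-recovery methods aim to undo.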