Exploiting low-dimensional data structures in deep learning

Series
School of Mathematics Colloquium
Time
Thursday, November 2, 2023 - 11:00am (50 minutes)
Location
Skiles 005
Speaker
Wenjing Liao – Georgia Tech – wliao60@gatech.edu – https://wliao60.math.gatech.edu/
Organizer
Gong Chen

In the past decade, deep learning has made remarkable breakthroughs in a wide range of real-world applications. It is widely believed that deep neural networks are good at learning geometric structures hidden in data sets, such as rich local regularities, global symmetries, or repetitive patterns. A central question in deep learning theory is why deep neural networks are successful and how they utilize low-dimensional data structures. In this talk, I will present statistical learning theory for deep neural networks when data exhibit low-dimensional structures, such as lying on a low-dimensional manifold. The learning tasks include regression, classification, feature representation, and operator learning. When data are sampled on a low-dimensional manifold, the sample complexity crucially depends on the intrinsic dimension of the manifold rather than the ambient dimension of the data. These results demonstrate that deep neural networks are adaptive to low-dimensional geometric structures in data sets.
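
To make the sample-complexity claim concrete, here is a minimal LaTeX sketch of the kind of rate comparison typically proved in this line of work; the smoothness exponent s, intrinsic dimension d, ambient dimension D, and sample size n are illustrative placeholders not taken from the abstract, and the exact assumptions and log factors vary from paper to paper.

```latex
% Illustrative only (not stated in the abstract): the flavor of rate results
% in this line of work, for estimating an s-Holder function f from n samples.
% Requires amsmath and amssymb.
\[
  \underbrace{\;\mathbb{E}\,\|\hat{f}_n - f\|_{L^2}^2 \;\asymp\; n^{-\frac{2s}{2s+D}}\;}_{\text{data filling the ambient space } \mathbb{R}^D}
  \qquad \text{vs.} \qquad
  \underbrace{\;\mathbb{E}\,\|\hat{f}_n - f\|_{L^2}^2 \;\asymp\; n^{-\frac{2s}{2s+d}}\;}_{\text{data on a $d$-dimensional manifold } \mathcal{M} \subset \mathbb{R}^D}
\]
% Since d is much smaller than D for many real data sets, the exponent
% 2s/(2s+d) gives a far better rate: the number of samples needed to reach a
% target error scales with the intrinsic dimension d, not the ambient
% dimension D (often up to logarithmic factors).
```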