I am a PhD candidate in the School of Mathematics at the Georgia Institute of Technology, advised by Prof. Molei Tao. I am on the job market this year.
CV and Google Scholar
My research lies at the intersection of machine learning, optimization, sampling, (stochastic) dynamics, and computational mathematics. More precisely, I focus on the mathematical foundations of machine learning and sampling, established through convergence analysis of the underlying dynamics.
Good regularity creates large learning rate implicit biases: edge of stability, balancing, and catapult
Yuqing Wang, Zhenghao Xu, Tuo Zhao, Molei Tao
Preprint (short version accepted at the M3L workshop, NeurIPS 2023)
pdf
Markov Chain Monte Carlo for Gaussian: A Linear Control Perspective
Bo Yuan, Jiaojiao Fan, Yuqing Wang, Molei Tao, Yongxin Chen
IEEE Control Systems Letters 2023
pdf
Momentum Stiefel Optimizer, with Applications to Suitably-Orthogonal Attention, and Optimal Transport
Lingkai Kong, Yuqing Wang, Molei Tao
ICLR 2023
pdf
Large Learning Rate Tames Homogeneity: Convergence and Balancing Effect
Yuqing Wang, Minshuo Chen, Tuo Zhao, Molei Tao
ICLR 2022
pdf
Why Do Deep Residual Networks Generalize Better than Deep Feedforward Networks? -- A Neural Tangent Kernel Perspective
Kaixuan Huang*, Yuqing Wang*, Molei Tao, Tuo Zhao (*Equal contribution)
NeurIPS 2020
pdf