Generative modeling through time reversal and reflection of diffusion processes

Series
Applied and Computational Mathematics Seminar
Time
Monday, April 29, 2024 - 2:00pm for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Nicole Yang – Emory University – https://nicoletyang.github.io/
Organizer
Molei Tao

Please Note: Speaker will present in person.

In this talk, we discuss generative modeling algorithms motivated by the time reversal and reflection properties of diffusion processes. Score-based diffusion models (SBDMs) have recently emerged as state-of-the-art approaches for image generation. We develop SBDMs in the infinite-dimensional setting, that is, we model the training data as functions supported on a rectangular domain. Besides the quest for generating images at ever higher resolution, our primary motivation is to create a well-posed infinite-dimensional learning problem that can be discretized consistently at multiple resolution levels. We show how to overcome two shortcomings of current SBDM approaches in the infinite-dimensional setting by ensuring the well-posedness of the forward and reverse processes, and we derive the convergence of the multilevel training approximation. We also illustrate that approximating the score function with an operator network is beneficial for multilevel training.
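For orientation, a standard finite-dimensional sketch of the time reversal underlying score-based diffusion models (not the infinite-dimensional formulation of the talk) pairs a forward noising SDE with a reverse-time SDE driven by the score $\nabla_x \log p_t$:

\[
dX_t = f(X_t, t)\,dt + g(t)\,dW_t, \qquad X_0 \sim p_{\mathrm{data}},
\]
\[
dX_t = \bigl[f(X_t, t) - g(t)^2\,\nabla_x \log p_t(X_t)\bigr]\,dt + g(t)\,d\bar{W}_t, \qquad X_T \sim p_T,
\]

where the second equation runs backward in time and $\bar{W}_t$ is a reverse-time Brownian motion; learning an approximation of the score function turns this reversal into a generative sampler.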

In the second part of the talk, we propose the Reflected Schrödinger Bridge algorithm: an entropy-regularized optimal transport approach tailored to generating data on diverse bounded domains. We derive reflected forward-backward stochastic differential equations with Neumann and Robin boundary conditions, extend divergence-based likelihood training to bounded domains, and demonstrate the scalability of the approach for constrained generative modeling.
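As a rough sketch (not taken from the abstract), a reflected diffusion on a bounded domain $\Omega$ can be written via the Skorokhod problem as

\[
dX_t = f(X_t, t)\,dt + g(t)\,dW_t + n(X_t)\,dL_t, \qquad X_t \in \bar{\Omega},
\]

where $n$ is the inward unit normal on $\partial\Omega$ and $L_t$ is the boundary local time, nondecreasing and increasing only when $X_t \in \partial\Omega$. The reflection term confines the process to $\bar{\Omega}$ and corresponds to a no-flux (Neumann-type) boundary condition for the density of $X_t$, which is the mechanism exploited for generation on constrained domains.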