Seminars and Colloquia by Series

Efficient hybrid spatial-temporal operator learning

Series
SIAM Student Seminar
Time
Friday, March 29, 2024 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Francesco Brarda, Emory University

Recent advancements in operator-type neural networks, such as the Fourier Neural Operator (FNO) and the Deep Operator Network (DeepONet), have shown promising results in approximating the solutions of spatial-temporal partial differential equations (PDEs). However, these networks often entail considerable training expense and may not always achieve the accuracy required in many scientific and engineering disciplines. In this work, we propose a new operator learning framework to address these issues. The proposed paradigm leverages traditional wisdom from numerical PDE theory and techniques to refine the pipeline of existing operator networks. Specifically, the architecture trains the operator network under consideration for only one or a few epochs and then freezes the model parameters. The frozen model is then fed into an error-correction scheme: a single parametrized linear spectral layer trained with a convex loss function defined through a reliable functional-type a posteriori error estimator. This design allows the operator network to tackle low-frequency errors effectively, while the added linear layer addresses high-frequency errors. Numerical experiments on a commonly used benchmark of 2D Navier-Stokes equations demonstrate improvements in both computational time and accuracy compared to existing FNO variants and traditional numerical approaches.
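To make the idea of a "single parametrized linear spectral layer" concrete, here is a minimal sketch, not the authors' implementation: a linear layer that multiplies retained Fourier modes of a 2D field by per-mode complex weights. In the framework above these weights would be trained against the a posteriori error estimator; here they are simply set to the identity, and the function name and shapes are illustrative assumptions.

```python
import numpy as np

def spectral_correction(u, weights):
    """Apply a single linear spectral layer: multiply each retained
    Fourier mode of the 2D field u by a (learned) complex weight."""
    u_hat = np.fft.rfft2(u)
    k = weights.shape[0]
    u_hat[:k, :k] *= weights          # per-mode multipliers (trained in practice)
    return np.fft.irfft2(u_hat, s=u.shape)

# Toy usage: identity weights leave the field unchanged.
u = np.random.rand(32, 32)
w = np.ones((8, 8), dtype=complex)
v = spectral_correction(u, w)
assert np.allclose(u, v)
```

Because the layer is linear in its weights, fitting it with a convex loss is a small, cheap optimization compared to retraining the full operator network.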

Optimization in Data Science: Enhancing Autoencoders and Accelerating Federated Learning

Series
SIAM Student Seminar
Time
Monday, January 22, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Xue Feng, UC Davis

In this presentation, I will discuss my research in data science, specifically in two areas: improving autoencoder interpolations and accelerating federated learning algorithms. My work combines advanced mathematical concepts with practical machine learning applications, contributing to both the theoretical and applied aspects of data science. The first part of my talk focuses on image sequence interpolation using autoencoders, which are essential tools in generative modeling, with emphasis on the regime of limited training data. By introducing a novel regularization term based on dynamic optimal transport into the loss function of the autoencoder, my method generates more robust and semantically coherent interpolation results. Additionally, the trained autoencoder can be used to generate barycenters. However, computational efficiency remains a bottleneck of our method, and we are working on improving it. The second part of my presentation focuses on accelerating federated learning (FL) through the application of Anderson acceleration. By reweighting the local points and their gradients, our method achieves the same level of convergence performance as state-of-the-art second-order methods such as GIANT. However, it requires only first-order information, making it a more practical and efficient choice for large-scale and complex training problems. Furthermore, our method is theoretically guaranteed to converge to the global minimizer at a linear rate.
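For readers unfamiliar with Anderson acceleration, the following is a generic textbook sketch on a scalar fixed-point problem, not the federated-learning variant described in the talk; the function name and the memory parameter m are illustrative.

```python
import numpy as np

def anderson(g, x0, m=5, iters=50):
    """Anderson acceleration for the fixed-point iteration x = g(x).
    Keeps the last m residuals and combines iterates via a least-squares fit."""
    x = x0.copy()
    X, F = [], []                      # histories of g-values and residuals
    for _ in range(iters):
        gx = g(x)
        f = gx - x                     # fixed-point residual
        X.append(gx); F.append(f)
        if len(F) > m + 1:
            X.pop(0); F.pop(0)
        if len(F) == 1:
            x = gx                     # plain fixed-point step to start
        else:
            dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            dX = np.column_stack([X[i + 1] - X[i] for i in range(len(X) - 1)])
            x = gx - dX @ gamma        # extrapolated iterate
    return x

# Usage: accelerate x = cos(x), whose fixed point is about 0.739085.
x_star = anderson(lambda v: np.cos(v), np.array([1.0]))
```

The key point echoed in the abstract is that only first-order quantities (iterates and residuals) enter the extrapolation, yet the method can match the convergence behavior of more expensive second-order schemes.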

Controlled SPDEs: Peng’s Maximum Principle and Numerical Methods

Series
SIAM Student Seminar
Time
Friday, November 17, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Lukas Wessels, Georgia Tech

In this talk, we consider a finite-horizon optimal control problem for stochastic reaction-diffusion equations. First, we apply the spike variation method, which relies on introducing the first and second order adjoint states. We give a novel characterization of the second order adjoint state as the solution to a backward SPDE. Using this representation, we prove the maximum principle for controlled SPDEs.

In the second part, we present a numerical algorithm that allows the efficient approximation of optimal controls in the case of stochastic reaction-diffusion equations with additive noise by first reducing the problem to controls of feedback form and then approximating the feedback function using finitely based approximations. Numerical experiments using artificial neural networks as well as radial basis function networks illustrate the performance of our algorithm.

This talk is based on joint work with Wilhelm Stannat and Alexander Vogler. The talk will also be streamed: https://gatech.zoom.us/j/93808617657?pwd=ME44NWUxbk1NRkhUMzRsK3c0ZGtvQT09

Neural-ODE for PDE Solution Operators

Series
SIAM Student Seminar
Time
Friday, September 29, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Nathan Gaby, Georgia State University

We consider a numerical method to approximate the solution operator for evolution partial differential equations (PDEs). By employing a general reduced-order model, such as a deep neural network, we connect the evolution of the model's parameters with trajectories in a corresponding function space. Using the Neural Ordinary Differential Equations (NODE) technique, we learn a vector field over the parameter space such that, from any starting point, the resulting trajectory solves the evolution PDE. Numerical results are presented for a number of high-dimensional problems where traditional methods fail due to the curse of dimensionality.
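The correspondence between parameter trajectories and PDE solutions can be illustrated with a toy example where the vector field over parameter space is known in closed form rather than learned (in the NODE setting it would be learned). This is an assumption-laden sketch: the reduced-order model is a two-parameter Gaussian u(x; a, s) = a·exp(-x²/(2s)), which solves the heat equation u_t = ν u_xx exactly when the parameters obey ds/dt = 2ν and da/dt = -νa/s.

```python
import numpy as np

nu = 0.1  # diffusion coefficient

def vector_field(theta):
    """Hand-derived parameter dynamics; in the NODE framework this
    vector field over parameter space would be learned from data."""
    a, s = theta
    return np.array([-nu * a / s, 2 * nu])

def model(x, theta):
    """Reduced-order model u(x; a, s) = a * exp(-x^2 / (2 s))."""
    a, s = theta
    return a * np.exp(-x**2 / (2 * s))

# Forward-Euler integration of the parameter ODE from theta(0) = (1, 0.5).
theta = np.array([1.0, 0.5])
dt, steps = 1e-3, 1000
for _ in range(steps):
    theta = theta + dt * vector_field(theta)

# Exact parameters at t = 1: s grows linearly, a scales like s^(-1/2).
s_exact = 0.5 + 2 * nu * 1.0
a_exact = np.sqrt(0.5 / s_exact)
assert abs(theta[1] - s_exact) < 1e-6
assert abs(theta[0] - a_exact) < 1e-3
```

Note that integrating a low-dimensional ODE in parameter space sidesteps any spatial grid, which is the feature that lets this viewpoint scale to high-dimensional PDEs.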

Geometric Equations for Matroid Varieties

Series
SIAM Student Seminar
Time
Tuesday, November 15, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Ashley K. Wheeler, School of Mathematics

Each point x in Gr(r, n) corresponds to an r × n matrix Ax which gives rise to a matroid Mx on its columns. Gel’fand, Goresky, MacPherson, and Serganova showed that the sets {y ∈ Gr(r, n) : My = Mx} form a stratification of Gr(r, n) with many beautiful properties. However, results of Mnëv and Sturmfels show that these strata can be quite complicated, and in particular may have arbitrary singularities. We study the ideals Ix of matroid varieties, the Zariski closures of these strata. We construct several classes of examples based on theorems from projective geometry and describe how the Grassmann-Cayley algebra may be used to derive non-trivial elements of Ix geometrically when the combinatorics of the matroid is sufficiently rich.

Sparse Quadratic Optimization via Polynomial Roots

Series
SIAM Student Seminar
Time
Tuesday, October 25, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Kevin Shu, School of Mathematics

We'll talk about problems of optimizing a quadratic function subject to quadratic constraints, in addition to a sparsity constraint that requires that solutions have only a few nonzero entries. Such problems include sparse versions of linear regression and principal components analysis. We'll see that this problem can be formulated as a convex conical optimization problem over a sparse version of the positive semidefinite cone, and then see how we can approximate such problems using ideas arising from the study of hyperbolic polynomials. We'll also describe a fast algorithm for such problems, which performs well in practical situations.
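For a sense of what the underlying problem looks like, here is a brute-force baseline, emphatically not the convex-conic or hyperbolic-polynomial approach of the talk: exact k-sparse PCA by enumerating supports, which is tractable only for small dimensions. The function name is illustrative.

```python
import itertools
import numpy as np

def sparse_pca_brute(A, k):
    """Exact k-sparse PCA: maximize x^T A x over unit vectors with at most
    k nonzeros, by checking the top eigenvalue of every k x k principal
    submatrix. Exponential in n, so only a small-scale reference method."""
    n = A.shape[0]
    best_val, best_x = -np.inf, None
    for S in itertools.combinations(range(n), k):
        w, V = np.linalg.eigh(A[np.ix_(S, S)])
        if w[-1] > best_val:
            best_val = w[-1]
            best_x = np.zeros(n)
            best_x[list(S)] = V[:, -1]   # embed the sub-eigenvector
    return best_val, best_x

# Toy usage: for a diagonal matrix the best 2-sparse value is the
# largest diagonal entry.
A = np.diag([1.0, 2.0, 3.0, 4.0])
val, x = sparse_pca_brute(A, 2)
assert np.isclose(val, 4.0)
```

The combinatorial blow-up of this enumeration is exactly what motivates the convex relaxations over a sparse version of the positive semidefinite cone discussed in the abstract.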

Ergodic theory: a statistical description of chaotic dynamical systems

Series
SIAM Student Seminar
Time
Friday, December 3, 2021 - 14:30 for 1 hour (actually 50 minutes)
Location
Skiles 169
Speaker
Alex Blumenthal, Georgia Tech

Dynamical systems model the way that real-world systems evolve in time. While the time-asymptotic behavior of many systems can be characterized by “simple” dynamical features such as equilibria and periodic orbits, some systems evolve in a chaotic, seemingly random way. For such systems it is no longer meaningful to track one trajectory at a time; instead, a natural approach is to treat the initial condition as random and observe how its probabilistic law evolves in time. This is the core idea of ergodic theory, the topic of this talk. I will not assume much beyond some basics of probability theory, e.g., random variables.

About Coalescence of Eigenvalues for Matrices Depending on Several Parameters

Series
SIAM Student Seminar
Time
Friday, November 12, 2021 - 14:30 for 1 hour (actually 50 minutes)
Location
Skiles 169
Speaker
Luca Dieci, Georgia Institute of Technology

We review some theoretical and computational results on locating coalescence of eigenvalues for matrices depending smoothly on parameters. The focus is on the symmetric two-parameter case and the Hermitian three-parameter case. Both full and banded matrices are of interest.

Mathematical Approaches to Imaging and Data

Series
SIAM Student Seminar
Time
Friday, September 24, 2021 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 169
Speaker
Sung-Ha Kang, School of Math, Georgia Tech

 

I will give an introduction to mathematical image processing and cover how numerical PDEs can be used in data understanding. This talk will present some variational and PDE-based methods for image processing, such as denoising, inpainting, and colorization. If time permits, I will introduce identification of differential equations from noisy data.
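As a minimal example of PDE-based denoising, not necessarily one of the methods in the talk, the following sketch evolves an image under linear diffusion (the heat equation) with a 5-point Laplacian stencil and periodic boundaries; the step size and iteration count are illustrative.

```python
import numpy as np

def diffuse(img, steps=20, dt=0.2):
    """Linear-diffusion denoising: u_t = Laplacian(u), discretized with a
    5-point stencil, periodic boundaries, and explicit Euler time stepping
    (stable for dt <= 0.25 on a unit grid)."""
    u = img.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        u += dt * lap
    return u

# Toy usage: diffusion damps high-frequency content, so the variance
# of a pure-noise image drops.
rng = np.random.default_rng(0)
noisy = rng.normal(size=(64, 64))
smooth = diffuse(noisy)
assert smooth.var() < noisy.var()
```

Linear diffusion blurs edges along with noise, which is what motivates the edge-preserving variational models (e.g., total-variation type energies) typically covered in this area.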

A Self-Limiting Hawkes Process

Series
SIAM Student Seminar
Time
Monday, November 16, 2020 - 12:30 for 1 hour (actually 50 minutes)
Location
ONLINE at https://bluejeans.com/703668715
Speaker
John Olinde, GT Math

Many real-life processes that we would like to model have a self-exciting property, i.e., the occurrence of one event causes a temporary spike in the probability of other events occurring nearby in space and time. Examples of processes with this property are earthquakes, crime in a neighborhood, and emails within a company. In 1971, Alan Hawkes first used what is now known as the Hawkes process to model such processes. Since then, much work has been done on estimating the parameters of a Hawkes process from a data set and on creating variants of the process for different applications.

 

In this talk, I will propose a new variant of the Hawkes process that takes into account the effect of police activity on the underlying crime rate, along with an algorithm for estimating its parameters from a crime data set.
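For reference, here is a sketch of the classical (unmodified) Hawkes process with an exponential kernel, simulated by Ogata's thinning algorithm; the talk's police-activity variant is not represented, and the parameter values are arbitrary.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Ogata thinning for a Hawkes process with exponential kernel:
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
    Between events the intensity decays, so its current value is a
    valid upper bound for rejection sampling."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < T:
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)   # candidate inter-arrival time
        if t >= T:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lambda(t)/lam_bar
            events.append(t)
    return events

# Toy usage: alpha/beta < 1 keeps the process stable (subcritical).
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=100.0)
```

Each accepted event raises the intensity by alpha, producing the clustered event times that make this model natural for earthquakes and crime data.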
