Seminars and Colloquia by Series

Wednesday, April 4, 2018 - 13:55 , Location: Skiles 005 , Benjamin Jaye , Clemson University , Organizer: Galyna Livshyts
Monday, April 2, 2018 - 14:00 , Location: Skiles 006 , Linh Truong , Columbia University , Organizer: Jennifer Hom
Monday, April 2, 2018 - 11:15 , Location: Skiles 005 , Manfred Heinz Denker , Penn State University , Organizer: Livia Corsi
Consider a probability measure $m$ preserved by a dynamical system $T:X\to X$. The occupation time of a measurable function $f$ is the sequence $\ell_n(A,x)$ ($A\subset \mathbb R$, $x\in X$) defined as the number of $j\le n$ for which the partial sums $S_jf(x)\in A$. The talk will discuss conditions which ensure that this sequence, properly normalized, converges weakly to some limit distribution. It turns out that this distribution is Mittag-Leffler; in particular, the result covers the case when $f\circ T^j$ is a fractional Gaussian noise with Hurst parameter $>3/4$.
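For concreteness (using the standard Birkhoff-sum convention, which the abstract leaves implicit), the objects above can be written as
$$S_j f(x) = \sum_{k=0}^{j-1} f(T^k x), \qquad \ell_n(A,x) = \#\{\, 1 \le j \le n : S_j f(x) \in A \,\},$$
so that $\ell_n(A,x)$ counts how many of the first $n$ Birkhoff partial sums of $f$ land in the set $A$.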
Friday, March 30, 2018 - 15:00 , Location: Skiles 006 , Chethan Pandarinath , GT BME , Organizer: Sung Ha Kang
Since its inception, neuroscience has largely focused on the neuron as the functional unit of the nervous system. However, recent evidence demonstrates that populations of neurons within a brain area collectively show emergent functional properties ("dynamics"), properties that are not apparent at the level of individual neurons. These emergent dynamics likely serve as the brain's fundamental computational mechanism. This shift compels neuroscientists to characterize emergent properties – that is, interactions between neurons – to understand computation in brain networks. Yet this introduces a daunting challenge – with millions of neurons in any given brain area, characterizing interactions within an area, and further, between brain areas, rapidly becomes intractable.

I will demonstrate a novel unsupervised tool, Latent Factor Analysis via Dynamical Systems ("LFADS"), that can accurately and succinctly capture the emergent dynamics of large neural populations from limited sampling. LFADS is based around deep learning architectures (variational sequential auto-encoders), and builds a model of an observed neural population's dynamics using a nonlinear dynamical system (a recurrent neural network). When applied to neuronal ensemble recordings (~200 neurons) from macaque primary motor cortex (M1), we find that modeling population dynamics yields accurate estimates of the state of M1, as well as accurate predictions of the animal's motor behavior, on millisecond timescales. I will also demonstrate how our approach allows us to infer perturbations to the dynamical system (i.e., unobserved inputs to the neural population), and further allows us to leverage population recordings across long timescales (months) to build more accurate models of M1's dynamics.

This approach demonstrates the power of deep learning tools to model nonlinear dynamical systems and infer accurate estimates of the states of large biological networks. In addition, we will discuss future directions, where we aim to pry open the "black box" of the trained recurrent neural networks, in order to understand the computations being performed by the modeled neural populations.

Pre-print available: []
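As a rough illustration of the sequential-autoencoder idea behind LFADS, the sketch below (PyTorch) encodes a trial of spike counts into a Gaussian over the initial state of a generator RNN, unrolls that RNN as an autonomous dynamical system, and reads out Poisson firing rates. It is not the authors' implementation: it omits the controller network for inferred inputs and all training refinements, and the layer sizes and names are illustrative assumptions.

# Minimal sketch of an LFADS-style sequential autoencoder (illustrative only;
# omits inferred inputs, the controller network, and many details of the method).
import torch
import torch.nn as nn

class TinyLFADS(nn.Module):
    def __init__(self, n_neurons, enc_dim=64, gen_dim=64, factor_dim=8):
        super().__init__()
        # Encoder: bidirectional RNN reads the whole spike-count sequence
        self.encoder = nn.GRU(n_neurons, enc_dim, bidirectional=True, batch_first=True)
        # Map the encoder summary to a Gaussian over the generator's initial state
        self.to_mean = nn.Linear(2 * enc_dim, gen_dim)
        self.to_logvar = nn.Linear(2 * enc_dim, gen_dim)
        # Generator: an autonomous RNN models the latent population dynamics
        self.generator = nn.GRUCell(1, gen_dim)             # driven by a dummy input
        self.to_factors = nn.Linear(gen_dim, factor_dim)    # low-dimensional "factors"
        self.to_lograte = nn.Linear(factor_dim, n_neurons)  # Poisson firing rates

    def forward(self, spikes):                  # spikes: (batch, time, neurons)
        _, h = self.encoder(spikes)             # h: (2, batch, enc_dim)
        summary = torch.cat([h[0], h[1]], dim=-1)
        mean, logvar = self.to_mean(summary), self.to_logvar(summary)
        g = mean + torch.exp(0.5 * logvar) * torch.randn_like(mean)  # reparameterize
        rates, dummy = [], spikes.new_zeros(spikes.shape[0], 1)
        for _ in range(spikes.shape[1]):        # unroll the generator in time
            g = self.generator(dummy, g)
            rates.append(torch.exp(self.to_lograte(self.to_factors(g))))
        rates = torch.stack(rates, dim=1)
        # ELBO: Poisson log-likelihood of the spikes minus KL(q(g0) || N(0, I))
        ll = torch.distributions.Poisson(rates).log_prob(spikes).sum()
        kl = -0.5 * torch.sum(1 + logvar - mean.pow(2) - logvar.exp())
        return -(ll - kl) / spikes.shape[0], rates

# Example: 200 neurons, 100 time bins, a batch of 16 trials of simulated counts
model = TinyLFADS(n_neurons=200)
spikes = torch.poisson(torch.full((16, 100, 200), 2.0))
loss, inferred_rates = model(spikes)
loss.backward()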
Friday, March 30, 2018 - 15:00 , Location: Skiles 202 , Rui Han , Institute for Advanced Study , Organizer: Michael Loss
This talk will be focused on the large deviation theory (LDT) for Schr\"odinger cocycles over a quasi-periodic or skew-shift base. We will discuss its connection to positivity and regularity of the Lyapunov exponent, as well as to localization, and present some open problems for the skew-shift model.
Wednesday, March 28, 2018 - 14:00 , Location: Atlanta , Justin Lanier , GaTech , Organizer: Anubhav Mukherjee
Wednesday, March 28, 2018 - 13:55 , Location: Skiles 005 , Laura Cladek , UCLA , Organizer: Michael Lacey
Wednesday, March 28, 2018 - 12:10 , Location: Skiles 006 , Wenjing Liao , Georgia Tech , Organizer:
Many data sets in image analysis and signal processing are in a high-dimensional space but exhibit a low-dimensional structure. We are interested in building efficient representations of these data for the purpose of compression and inference. In the setting where a data set in $R^D$ consists of samples from a probability measure concentrated on or near an unknown $d$-dimensional manifold with $d$ much smaller than $D$, we consider two sets of problems: low-dimensional geometric approximations to the manifold and regression of a function on the manifold. In the first case, we construct multiscale low-dimensional empirical approximations to the manifold and give finite-sample performance guarantees. In the second case, we exploit these empirical geometric approximations of the manifold and construct multiscale approximations to the function. We prove finite-sample guarantees showing that we attain the same learning rates as if the function was defined on a Euclidean domain of dimension $d$. In both cases our approximations can adapt to the regularity of the manifold or the function even when this varies at different scales or locations.
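The first set of problems can be caricatured by a toy multiscale local-PCA construction: recursively partition the samples, fit a rank-$d$ affine approximation on each cell, and project a point onto the plane of the cell containing it. The NumPy sketch below uses hypothetical helper names and a simple median split; it only illustrates the idea of an empirical geometric approximation and carries none of the finite-sample guarantees discussed in the talk.

# Toy multiscale local-PCA approximation of data near a d-dimensional manifold in R^D
# (illustrative only; not the construction analyzed in the talk).
import numpy as np

def local_pca(points, d):
    """Fit a rank-d affine approximation: center plus top-d principal directions."""
    center = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - center, full_matrices=False)
    return center, vt[:d]                     # basis: d x D orthonormal rows

def build_tree(points, d, max_depth, depth=0):
    """Dyadic partition: split along the top principal direction at the median."""
    center, basis = local_pca(points, d)
    node = {"center": center, "basis": basis, "children": None}
    if depth < max_depth and len(points) > 2 * d:
        proj = (points - center) @ basis[0]
        left = proj <= np.median(proj)
        node["children"] = [build_tree(points[left], d, max_depth, depth + 1),
                            build_tree(points[~left], d, max_depth, depth + 1)]
        node["split"] = (basis[0], center)
    return node

def project(node, x):
    """Descend to a leaf and project x onto that leaf's local affine plane."""
    while node["children"] is not None:
        direction, center = node["split"]
        node = node["children"][0 if (x - center) @ direction <= 0 else 1]
    coords = (x - node["center"]) @ node["basis"].T   # local d-dimensional coordinates
    return node["center"] + coords @ node["basis"], coords

# Example: noisy samples from a circle (d = 1) embedded in R^10
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 2000)
X = np.zeros((2000, 10))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
X += 0.01 * rng.standard_normal(X.shape)
tree = build_tree(X, d=1, max_depth=5)
approx, local_coords = project(tree, X[0])
print(np.linalg.norm(approx - X[0]))          # small approximation error

Regression of a function on the manifold would then proceed in the local coordinates returned by the projection, which is the spirit (though not the substance) of the second set of results.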
Monday, March 26, 2018 - 14:30 , Location: Room 304 , Bob Gompf and Sergei Gukov , UT Austin and Caltech , Organizer: Caitlin Leverson
For oriented manifolds of dimension at least 4 that are simply connected at infinity, it is known that end summing (the noncompact analogue of boundary summing) is a uniquely defined operation. Calcut and Haggerty showed that more complicated fundamental group behavior at infinity can lead to nonuniqueness. We will examine how and when uniqueness fails. There are examples in various categories (homotopy, TOP, PL and DIFF) of nonuniqueness that cannot be detected in a weaker category. In contrast, we will present a group-theoretic condition that guarantees uniqueness. As an application, the monoid of smooth manifolds homeomorphic to R^4 acts on the set of smoothings of any noncompact 4-manifold. (This work is joint with Jack Calcut.)