
Series: School of Mathematics Colloquium

The probability of outcomes of repeated fair coin tosses can be computed exactly using binomial coefficients. Performing asymptotics on these formulas uncovers the Gaussian distribution and the first instance of the central limit theorem. This talk will focus on a higher-dimensional version of this story. We will consider random motion subject to random forcing. By leveraging structures from representation theory and quantum integrable systems, we can compute the analogs of the binomial coefficients and extract asymptotic behaviors new and different from those of the Gaussian. This model and its analysis fall into the general theory of "integrable probability".
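
As a concrete illustration of the opening claim (my toy example, not part of the talk), the exact binomial probabilities for fair coin tosses can be compared with the Gaussian density predicted by the central limit theorem:

```python
import math

def binomial_prob(n, k):
    """Exact probability of k heads in n fair coin tosses: C(n, k) / 2^n."""
    return math.comb(n, k) / 2**n

def gaussian_density(n, k):
    """Central-limit approximation: N(n/2, n/4) density evaluated at k."""
    mu, var = n / 2, n / 4
    return math.exp(-((k - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

# For n = 1000 tosses the two numbers agree to several decimal places.
for k in (480, 500, 520):
    print(k, binomial_prob(1000, k), gaussian_density(1000, k))
```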

Series: School of Mathematics Colloquium

Associated to a planar cubic graph, there is a closed surface in R^5, as defined by Treumann and Zaslow. R^5 has a canonical geometry, called a contact structure, which is compatible with the surface. The data of how this surface interacts with the geometry recovers interesting information about the graph, notably its chromatic polynomial. This also connects with counts of pseudo-holomorphic curves with boundary on the surface, and by looking at the resulting differential graded algebra coming from symplectic field theory, we obtain new definitions of n-colorings which are strongly non-linear compared to other known definitions. There are also relationships with SL_2 gauge theory, mathematical physics, symplectic flexibility, and holomorphic contact geometry. During the talk we'll explain the basic ideas behind the various fields above, and why these concepts connect.
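
For readers meeting the chromatic polynomial for the first time: it counts proper colorings of a graph. A brute-force sketch (purely illustrative, unrelated to the symplectic construction in the talk):

```python
from itertools import product

def count_colorings(num_vertices, edges, k):
    """Count proper k-colorings: adjacent vertices must get distinct colors."""
    return sum(
        1
        for coloring in product(range(k), repeat=num_vertices)
        if all(coloring[u] != coloring[v] for u, v in edges)
    )

# Triangle graph: its chromatic polynomial is k*(k-1)*(k-2).
triangle = [(0, 1), (1, 2), (0, 2)]
for k in range(1, 6):
    print(k, count_colorings(3, triangle, k))  # matches k*(k-1)*(k-2)
```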

Series: School of Mathematics Colloquium

Traditional Erdős Magic (a.k.a. the Probabilistic Method) proves the existence of an object with certain properties
by showing that a random (appropriately defined) object will have those properties with positive probability. Modern Erdős Magic analyzes a random process, a randomized (CS take note!) algorithm. These, when successful, can find a "needle in an exponential haystack" in polynomial time.
We'll look at two particular examples, both involving a family of n-element sets under suitable side conditions. The Lovász Local Lemma finds a coloring with no set monochromatic. A result of this speaker finds a coloring with low discrepancy. In both cases the original proofs were not implementable, but Modern Erdős Magic finds the colorings in polynomial time.
The methods are varied. Basic probability and combinatorics. Brownian Motion. Semigroups. Martingales. Recursions ... and Tetris!
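
The "random process" point of view can be sketched for the first example. The following is a hedged toy version of Moser-Tardos style resampling (the sets and universe are arbitrary choices of mine, not an instance from the talk): start from a random 2-coloring and, while some set is monochromatic, recolor just that set.

```python
import random

random.seed(1)

def resample_2coloring(sets, universe):
    """Start from a uniformly random 2-coloring; while some set is
    monochromatic, recolor that set's elements at random and retry."""
    color = {x: random.randrange(2) for x in universe}
    while True:
        bad = next((S for S in sets if len({color[x] for x in S}) == 1), None)
        if bad is None:
            return color  # no set is monochromatic
        for x in bad:
            color[x] = random.randrange(2)

sets = [{0, 1, 2}, {2, 3, 4}, {4, 5, 6}, {6, 7, 8}, {1, 4, 7}]
coloring = resample_2coloring(sets, range(9))
print(all(len({coloring[x] for x in S}) == 2 for S in sets))  # True
```

The returned coloring is proper by construction: the loop only exits once no set is monochromatic; the content of the Moser-Tardos analysis is that this exit happens in expected polynomial time.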

Series: School of Mathematics Colloquium

The study of nonconventional sums $S_{N}=\sum_{n=1}^{N}F(X(n),X(2n),\dots,X(\ell n))$, where $X(n)=g \circ T^n$ for a measure preserving transformation $T$, has a 40-year history, beginning when Furstenberg showed that they are related to the ergodic theory proof of Szemerédi's theorem about arithmetic progressions in sets of integers of positive density. Recently, it turned out that various limit theorems of probability theory can be successfully studied for the sums $S_{N}$ when $X(n), n=1,2,\dots$ are weakly dependent random variables. I will talk about the more general situation of nonconventional arrays of the form $S_{N}=\sum_{n=1}^{N}F(X(p_{1}n+q_{1}N),X(p_{2}n+q_{2}N),\dots,X(p_{\ell}n+q_{\ell}N))$ and how this is related to an extended version of Szemerédi's theorem. I will also discuss ergodic and limit theorems for these and more general nonconventional arrays.
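
A numerical toy for the simplest nonconventional sum (my illustration, with $F(x,y)=xy$, $\ell=2$, and i.i.d. signs standing in for the dynamical sequence $X(n)=g \circ T^n$):

```python
import random

random.seed(2)

def nonconventional_sum(N):
    """S_N = sum_{n=1}^{N} X(n) * X(2n) for i.i.d. +/-1 random variables."""
    X = [random.choice((-1, 1)) for _ in range(2 * N + 1)]
    return sum(X[n] * X[2 * n] for n in range(1, N + 1))

# Under CLT-type normalization, S_N / sqrt(N) stays of order one
# even though the summands X(n)*X(2n) are not independent.
N = 10_000
print(nonconventional_sum(N) / N ** 0.5)
```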

Series: School of Mathematics Colloquium

Simulation of hyperelastic materials is widely adopted in the computer graphics community for applications that include virtual clothing, skin, muscle, fat, etc. Elastoplastic materials with a hyperelastic constitutive model combined with a notion of stress constraint (or feasible stress region) are also gaining applicability in the field. In these models, the elastic potential energy only increases with the elastic part of the deformation decomposition. The evolution of the plastic part is designed to satisfy the stress constraint. Perhaps the most common example of this phenomenon is the denting of an elastic shell. Other very powerful examples include frictional contact material interactions. I will discuss some of the mathematical aspects of these models and present some recent results and examples in computer graphics applications.
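
A minimal one-dimensional sketch of the stress-constraint idea (my toy, not a model from the talk): an elastic trial stress is projected back onto the feasible region |sigma| <= sigma_yield, and the plastic part of the strain absorbs the excess.

```python
def return_map_1d(total_strain, plastic_strain, E, sigma_yield):
    """One step of 1D small-strain elastic/perfectly-plastic return mapping.

    The elastic energy only sees the elastic part of the decomposition
    total_strain = elastic_strain + plastic_strain; the plastic part
    evolves so that the stress satisfies |sigma| <= sigma_yield.
    """
    sigma_trial = E * (total_strain - plastic_strain)  # elastic predictor
    if abs(sigma_trial) <= sigma_yield:
        return sigma_trial, plastic_strain             # purely elastic step
    sigma = sigma_yield if sigma_trial > 0 else -sigma_yield
    plastic_strain += (sigma_trial - sigma) / E        # plastic corrector
    return sigma, plastic_strain

sigma, ep = return_map_1d(0.01, 0.0, E=1000.0, sigma_yield=5.0)
print(sigma, ep)  # 5.0 0.005
```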

Series: School of Mathematics Colloquium

I will present a survey of the main results about first- and second-order models of swarming in which repulsion and attraction are modeled through pairwise potentials. We will mainly focus on the stability of the fascinating patterns obtained in random particle simulations, flocks and mills, and on their qualitative behavior. Qualitative properties of local minimizers of the interaction energies are crucial in order to understand these complex behaviors. Compactly supported global minimizers determine the flock patterns, whose existence is related to classical H-stability in statistical mechanics and to the classical obstacle problem for differential operators.
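
A hedged sketch of a first-order model of this type (the potential W(r) = r^2/2 - log(r), with quadratic attraction and Newtonian repulsion, is my illustrative choice, not necessarily one from the talk): each particle descends the pairwise interaction energy.

```python
import math

def step(positions, dt=0.01):
    """One explicit Euler step of dx_i/dt = -(1/N) sum_j grad W(x_i - x_j)
    for the attractive-repulsive potential W(r) = r^2/2 - log(r)."""
    N = len(positions)
    new = []
    for i, (xi, yi) in enumerate(positions):
        vx = vy = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            dx, dy = xi - xj, yi - yj
            r2 = dx * dx + dy * dy
            # -W'(r) * (d / r) = -(1 - 1/r^2) * d for d = x_i - x_j
            vx -= (1.0 - 1.0 / r2) * dx / N
            vy -= (1.0 - 1.0 / r2) * dy / N
        new.append((xi + dt * vx, yi + dt * vy))
    return new

# 20 particles started on a circle relax toward a steady pattern.
pts = [(0.5 * math.cos(2 * math.pi * k / 20), 0.5 * math.sin(2 * math.pi * k / 20))
       for k in range(20)]
for _ in range(500):
    pts = step(pts)
```

Plotting the positions over time shows the balance of attraction and repulsion settling into a stable ring, the simplest instance of the patterns discussed in the talk.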

Series: School of Mathematics Colloquium

Charles Stein brought the method that now bears his name to life in a 1972 Berkeley symposium paper that presented a new way to obtain information on the quality of the normal approximation, justified by the Central Limit Theorem asymptotic, by operating directly on random variables. At the heart of the method is the seemingly harmless characterization that a random variable $W$ has the standard normal ${\cal N}(0,1)$ distribution if and only if $E[Wf(W)]=E[f'(W)]$ for all functions $f$ for which these expressions exist. From its inception, it was clear that Stein's approach had the power to provide non-asymptotic bounds, and to handle various dependency structures. In the nearly half century since the appearance of this work for the normal, the "characterizing equation" approach driving Stein's method has been applied to roughly thirty additional distributions using variations of the basic techniques, coupling and distributional transformations among them. Further offshoots are connections to Malliavin calculus and the concentration of measure phenomenon, and applications to random graphs and permutations, statistics, stochastic integrals, molecular biology and physics.
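
The characterizing identity is easy to check numerically. A small Monte Carlo sketch of mine, with $f = \sin$, for which both sides of $E[Wf(W)]=E[f'(W)]$ equal $e^{-1/2}$:

```python
import math
import random

random.seed(0)

def stein_sides(f, f_prime, n=200_000):
    """Estimate E[W f(W)] and E[f'(W)] for W ~ N(0, 1) by Monte Carlo."""
    samples = [random.gauss(0.0, 1.0) for _ in range(n)]
    lhs = sum(w * f(w) for w in samples) / n
    rhs = sum(f_prime(w) for w in samples) / n
    return lhs, rhs

# For f = sin, both sides equal exp(-1/2) ~ 0.6065 under the standard normal.
lhs, rhs = stein_sides(math.sin, math.cos)
print(lhs, rhs)
```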

Series: School of Mathematics Colloquium

We present algorithms for performing sparse univariate polynomial interpolation with errors in the evaluations of the polynomial. Our interpolation algorithms use as a substep an algorithm originally due to R. Prony, dating from the French Revolution (Year III, 1795), for interpolating exponential sums, which was rediscovered in 1960 for decoding digital error-correcting BCH codes over finite fields. Since Prony's algorithm is quite simple, we will give a complete description of it, as an alternative to Lagrange/Newton interpolation for sparse polynomials. When very few errors in the evaluations are permitted, multiple sparse interpolants are possible over finite fields or the complex numbers, but not over the real numbers. The problem is then a simple example of list-decoding in the sense of Guruswami-Sudan. Finally, we present a connection to the Erdős-Turán Conjecture (Szemerédi's Theorem). This is joint work with Clément Pernet, Univ. Grenoble.
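
A hedged sketch of the Prony substep in its simplest case, a 2-sparse polynomial evaluated at powers of 2 (my toy parameters; the talk's algorithms handle general sparsity, evaluation errors, and finite fields):

```python
import math

def prony_2sparse(a, base=2):
    """Recover c1*x**e1 + c2*x**e2 from a[i] = f(base**i), i = 0..3.

    Prony's method: fit the linear recurrence satisfied by the
    evaluations, read off the term values base**e_j as roots of its
    characteristic polynomial, then solve for the coefficients.
    """
    # Step 1: fit a[i+2] = l1*a[i+1] + l0*a[i] (a 2x2 linear solve).
    det = a[1] * a[1] - a[0] * a[2]
    l1 = (a[2] * a[1] - a[0] * a[3]) / det
    l0 = (a[1] * a[3] - a[2] * a[2]) / det
    # Step 2: the roots of x^2 - l1*x - l0 are base**e1 and base**e2.
    disc = math.sqrt(l1 * l1 + 4 * l0)
    b1, b2 = (l1 + disc) / 2, (l1 - disc) / 2
    # Step 3: exponents from logarithms; coefficients from the system
    # c1 + c2 = a[0],  c1*b1 + c2*b2 = a[1].
    e1, e2 = round(math.log(b1, base)), round(math.log(b2, base))
    c2 = (a[1] - b1 * a[0]) / (b2 - b1)
    return [(a[0] - c2, e1), (c2, e2)]

# f(x) = 3*x**5 + 2*x**2, recovered from its four values at 1, 2, 4, 8.
evals = [3 * (2 ** i) ** 5 + 2 * (2 ** i) ** 2 for i in range(4)]
print(prony_2sparse(evals))  # [(3.0, 5), (2.0, 2)]
```

Only four evaluations suffice for two terms, regardless of the degree; this sparsity-proportional cost is what distinguishes Prony's approach from dense Lagrange/Newton interpolation.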