## Seminars & Colloquia

### Tuesday, September 23, 2014

#### The Gaussian Radon Transform for Infinite-Dimensional Banach Spaces

Tue, 09/23/2014 - 3:00pm, Skiles 170

Irina Holmes, School of Mathematics, Georgia Tech        Organizer: Karim Lounici

In this talk we construct an infinite-dimensional, stochastic version of the Radon transform. We work within the framework of abstract Wiener spaces, introduced by Leonard Gross. We present some basic properties of this transform, as well as compute some specific examples on the classical Wiener space.
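
For orientation, the classical Radon transform in $\mathbb{R}^n$ integrates a function over hyperplanes:

$$ (Rf)(\omega, p) = \int_{\{x \,:\, \langle x, \omega\rangle = p\}} f(x)\, dx, \qquad \omega \in S^{n-1},\ p \in \mathbb{R}. $$

Since an infinite-dimensional Banach space carries no Lebesgue measure, the Gaussian version schematically replaces integration against Lebesgue measure on the hyperplane by integration against a suitable Gaussian measure concentrated on it; this is a heuristic description for orientation, not the precise construction of the talk.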

### Tuesday, September 9, 2014

#### Variable Selection Consistency of Linear Programming Discriminant Estimator

Tue, 09/09/2014 - 3:00pm, Skiles 005

Dong Xia, School of Mathematics, Georgia Tech        Organizer: Karim Lounici

The linear programming discriminant (LPD) estimator is used in sparse linear discriminant analysis for high-dimensional classification problems. In this talk we will give a sufficient condition for the variable selection property of the LPD estimator; our result provides an optimal bound on the required sample size $n$ and on the magnitude of the components of the Bayes direction.
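
For context, one standard formulation of the LPD estimator (Cai and Liu, 2011) estimates the Bayes direction by the linear program

$$ \hat\beta = \arg\min_{\beta}\ \|\beta\|_1 \quad \text{subject to} \quad \big\|\hat\Sigma \beta - (\hat\mu_1 - \hat\mu_2)\big\|_\infty \le \lambda_n, $$

where $\hat\Sigma$ is the pooled sample covariance, $\hat\mu_1, \hat\mu_2$ are the class sample means, and $\lambda_n$ is a tuning parameter; variable selection consistency asks when the support of $\hat\beta$ recovers that of the Bayes direction. This is one common form of the estimator, given here only as background.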

### Tuesday, October 23, 2012

#### Rademacher Averages and Phase Transitions in Glivenko–Cantelli Classes

Tue, 10/23/2012 - 3:05pm, Skiles 005

Krishnakumar Balasubramanian, Georgia Institute of Technology        Organizer: Karim Lounici

I will be presenting the paper by S. Mendelson titled 'Rademacher Averages and Phase Transitions in Glivenko–Cantelli Classes'. The fat-shattering dimension and its use in characterizing GC classes will be introduced, followed by a new quantity based on the growth rate of the Rademacher averages. This parameter enables one to provide improved complexity estimates for the agnostic learning problem with respect to any norm. A phase transition phenomenon that appears in the sample complexity estimates, the covering number estimates, and the growth rate of the Rademacher averages will be discussed. Finally, (i) estimates on the covering numbers of a class when considered as a subset of an $L_p$ space and (ii) an estimate of the fat-shattering dimension of the convex hull of a given class will be discussed.
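
As a reminder, the Rademacher average of a class $F$ on a sample $X_1, \dots, X_n$ is

$$ R_n(F) = \mathbb{E} \sup_{f \in F} \Big| \frac{1}{n} \sum_{i=1}^n \varepsilon_i f(X_i) \Big|, $$

where $\varepsilon_1, \dots, \varepsilon_n$ are i.i.d. symmetric $\pm 1$ signs independent of the sample; the parameter discussed in the talk is based on how these averages grow with $n$.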

### Tuesday, October 2, 2012

#### Statistical Algorithms and a Lower Bound for Detecting a Planted Clique

Tue, 10/02/2012 - 3:05pm, Skiles 005

Santosh Vempala, Georgia Institute of Technology        Organizer: Karim Lounici

We present a framework for proving lower bounds on computational problems over distributions, including optimization and unsupervised learning. The framework is based on defining a restricted class of algorithms, called statistical algorithms, that instead of directly accessing samples from the input distribution can only obtain an estimate of the expectation of any given function on the input distribution. Our definition captures many natural algorithms used in theory and practice, e.g., moment-based methods, local search, linear programming, MCMC and simulated annealing. Our techniques are inspired by the statistical query model of Kearns from learning theory, which addresses the complexity of PAC-learning. For specific well-known problems over distributions, we obtain lower bounds on the complexity of any statistical algorithm. These include an exponential lower bound for moment maximization and a nearly optimal lower bound for detecting planted clique distributions when the planted clique has size $n^{1/2-\delta}$ for any constant $\delta > 0$. Variants of the latter problem have been assumed to be hard in order to prove hardness for other problems and for cryptographic applications. Our lower bounds provide concrete evidence of hardness. This is joint work with V. Feldman, E. Grigorescu, L. Reyzin and Y. Xiao.
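
As an informal illustration (not the paper's formal model), a statistical algorithm interacts with data only through an oracle that returns approximate expectations of query functions; the sketch below, with hypothetical names and a simplistic noise model, shows the idea in Python:

```python
import random

def sq_oracle(sample, query, tol, rng):
    """Illustrative statistical-query oracle: answers with the expectation
    of `query` over the data up to tolerance `tol`, never exposing
    individual samples to the caller."""
    est = sum(query(x) for x in sample) / len(sample)
    return est + rng.uniform(-tol, tol)  # the oracle may perturb within +/- tol

# A "statistical algorithm" estimating the mean of a Bernoulli(0.3)
# distribution through the oracle alone.
rng = random.Random(0)
data = [rng.random() < 0.3 for _ in range(10_000)]
p_hat = sq_oracle(data, lambda x: float(x), tol=0.01, rng=rng)
```

Lower bounds in this framework count how many such queries, at what tolerance, any algorithm must make; no algorithm in the class ever sees a raw sample.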

### Tuesday, September 25, 2012

#### Uniform entropy numbers for VC classes

Tue, 09/25/2012 - 3:05pm, Skiles 005

Nishant Mehta, Georgia Institute of Technology        Organizer: Karim Lounici

I'll be presenting bounds on the uniform entropy numbers for VC classes of sets, VC classes of functions, and symmetric convex hulls of VC classes of functions. The material is in Section 2.6 of Van der Vaart and Wellner.
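
One standard bound of this kind (cf. van der Vaart and Wellner, Theorem 2.6.4) states that for a VC class of sets $\mathcal{C}$ with VC index $V(\mathcal{C})$, any probability measure $Q$, any $r \ge 1$, and $0 < \varepsilon < 1$,

$$ N\big(\varepsilon, \mathcal{C}, L_r(Q)\big) \le K\, V(\mathcal{C})\, (4e)^{V(\mathcal{C})} \left(\frac{1}{\varepsilon}\right)^{r(V(\mathcal{C})-1)} $$

for a universal constant $K$; the key point is that the bound is uniform in $Q$. (Quoted from memory as background; consult Section 2.6 for the exact statement.)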

### Tuesday, September 4, 2012

#### Generic Chaining

Tue, 09/04/2012 - 3:05pm, Skiles 005

William Mantzel, School of Electrical and Computer Engineering, Georgia Tech        Organizer: Karim Lounici

We recap generic chaining from last time and discuss it further. We then cover the Dudley lower bound (Theorem 2.1.1), the Bernoulli upper bound (Theorem 4.1.2), and the statement of the Bernoulli conjecture (a lower bound), all from The Generic Chaining book.
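
For reference, the functional at the heart of generic chaining is

$$ \gamma_2(T, d) = \inf_{(T_n)} \sup_{t \in T} \sum_{n \ge 0} 2^{n/2}\, d(t, T_n), $$

the infimum over admissible sequences of subsets $T_n \subset T$ with $|T_0| = 1$ and $|T_n| \le 2^{2^n}$; for a centered Gaussian process $(X_t)_{t \in T}$ with canonical distance $d$, Talagrand's majorizing measure theorem gives $\mathbb{E} \sup_{t \in T} X_t \asymp \gamma_2(T, d)$.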

### Friday, July 6, 2012

#### A geometric analysis of subspace clustering with outliers

Fri, 07/06/2012 - 3:05pm, Skiles 005

Mahdi Soltanolkotabi, Stanford University        Organizer: Karim Lounici

One of the most fundamental steps in data analysis and dimensionality reduction consists of approximating a given dataset by a single low-dimensional subspace, which is classically achieved via Principal Component Analysis (PCA). However, in many applications, the data often lie near a union of low-dimensional subspaces, reflecting the multiple categories or classes a set of observations may belong to. In this talk we discuss the problem of clustering a collection of unlabeled data points assumed to lie near a union of lower dimensional planes. Simply stated, the task is to assign each data point to a cluster so as to recover all the hidden subspaces. As is common in computer vision or unsupervised learning applications, we do not know in advance how many subspaces there are nor do we have any information about their dimensions. We present a novel geometric analysis of an algorithm named sparse subspace clustering (SSC), which significantly broadens the range of problems where it is provably effective. For instance, we show that SSC can recover multiple subspaces, each of dimension comparable to the ambient dimension. We also show that SSC can correctly cluster data points even when the subspaces of interest intersect. Further, we develop an extension of SSC that succeeds when the dataset is corrupted with possibly overwhelmingly many outliers. Underlying our analysis are clear geometric insights, which may bear on other sparse recovery problems. We will also demonstrate the effectiveness of these methods by various numerical studies.
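
For context, the self-expressiveness step of SSC (Elhamifar and Vidal) represents each data point as a sparse combination of the others by solving, for each column $x_i$ of the data matrix $X$,

$$ \min_{c_i} \|c_i\|_1 \quad \text{subject to} \quad x_i = X c_i,\ c_{ii} = 0, $$

and then applies spectral clustering to the affinity $|C| + |C|^\top$ built from the coefficient matrix $C$. This is a schematic description of the basic algorithm rather than the exact variant analyzed in the talk.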

### Tuesday, February 21, 2012

#### Minimax Rates of Estimation for Sparse PCA in High Dimensions

Tue, 02/21/2012 - 4:00pm, Skiles 006

Karim Lounici, Georgia Institute of Technology, School of Mathematics        Organizer: Karim Lounici

This presentation is based on the papers by D. Paul and I. Johnstone (2007) and V. Q. Vu and J. Lei (2012). Here is the abstract of the second paper. We study sparse principal components analysis in the high-dimensional setting, where $p$ (the number of variables) can be much larger than $n$ (the number of observations). We prove optimal, non-asymptotic lower and upper bounds on the minimax estimation error for the leading eigenvector when it belongs to an $l_q$ ball for $q \in [0,1]$. Our bounds are sharp in $p$ and $n$ for all $q \in [0,1]$ over a wide class of distributions. The upper bound is obtained by analyzing the performance of $l_q$-constrained PCA. In particular, our results provide convergence rates for $l_1$-constrained PCA.
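
Schematically, $l_q$-constrained PCA estimates the leading eigenvector by

$$ \hat v = \arg\max_{b}\ b^\top \hat\Sigma b \quad \text{subject to} \quad \|b\|_2 = 1,\ \|b\|_q^q \le R_q, $$

where $\hat\Sigma$ is the sample covariance and $R_q$ is the radius of the $l_q$ ball assumed to contain the true eigenvector (for $q = 0$ the constraint counts nonzero entries).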

### Tuesday, February 14, 2012

#### Estimation of Low Rank Kernels on Graphs

Tue, 02/14/2012 - 4:00pm, Skiles 006

Vladimir Koltchinskii, Georgia Institute of Technology, School of Mathematics        Organizer: Karim Lounici

Let $(V, E)$ be a graph with vertex set $V$ and edge set $E$. Let $(X, X', Y) \in V \times V \times \{-1,1\}$ be a random triple, where $X, X'$ are independent uniformly distributed vertices and $Y$ is a label indicating whether $X, X'$ are "similar" or not. Our goal is to estimate the regression function $S_*(u, v) = \mathbb{E}(Y|X = u, X' = v)$, $u, v \in V$, based on $n$ i.i.d. copies $(X_1, X'_1, Y_1), \dots, (X_n, X'_n, Y_n)$ of $(X, X', Y)$. We are interested in this problem in the case when $S_*: V \times V \mapsto [-1,1]$ is a symmetric low rank kernel (or can be well approximated by low rank kernels). In addition, assume that $S_*$ is "smooth" on the graph. We study estimators based on a modified least squares method with complexity penalization involving both the nuclear norm and Sobolev-type norms of symmetric kernels on the graph (defined in terms of its Laplacian). We prove error bounds for such estimators that improve the existing bounds in low rank matrix recovery in the cases when the target kernel is not only low rank, but also sufficiently smooth. The talk is based in part on a joint project with Pedro Rangel.
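
Schematically, such an estimator trades off empirical fit against both low-rank and smoothness penalties:

$$ \hat S = \arg\min_{S} \left[ \frac{1}{n} \sum_{j=1}^n \big(Y_j - S(X_j, X'_j)\big)^2 + \varepsilon_1 \|S\|_1 + \varepsilon_2 \langle \Delta^{s} S, S \rangle \right], $$

where $\|S\|_1$ denotes the nuclear norm of the kernel, $\Delta$ is the graph Laplacian, and $\varepsilon_1, \varepsilon_2$ are regularization parameters; the exponent $s$ and the exact form of the Sobolev-type penalty are placeholders standing in for the precise penalization of the talk.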

### Tuesday, February 7, 2012

#### Estimation of Low Rank Kernels on Graphs -- Rescheduled

Tue, 02/07/2012 - 4:00pm, Skiles 006

Vladimir Koltchinskii, School of Mathematics, Georgia Tech        Organizer: Karim Lounici

Vladimir Koltchinskii's talk has been rescheduled for next Tuesday, February 14

### Tuesday, November 15, 2011

#### Randomized Approximation Schemes for Cuts and Flows in Capacitated Graphs

Tue, 11/15/2011 - 4:00pm, Skiles 006

Yingyu Liang, School of Computer Science, Georgia Tech        Organizer: Karim Lounici

We will talk about how to approximate an arbitrary graph by a sparse graph with respect to cuts and flows, using random sampling techniques. More specifically, we will describe a near-linear-time randomized combinatorial construction that transforms any graph on n vertices into an O(n log n)-edge graph on the same vertices whose cuts have approximately the same value as the original graph's. The new graph can be used to accelerate cut and flow algorithms, leading to approximate solutions on the original graph. The construction algorithms for the sparse graph are based on a general theorem analyzing the concentration of cut values near their expectations in random graphs.
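
The core sampling principle can be sketched in a few lines of Python. This is only an illustration with uniform sampling probability; the actual construction chooses a per-edge probability (based on quantities such as edge strength) to guarantee O(n log n) edges and uniform concentration over all cuts.

```python
import random

def sparsify(edges, p, rng):
    """Keep each weighted edge (u, v, w) independently with probability p
    and scale its weight by 1/p, so every cut's weight is preserved in
    expectation. Uniform p is an illustrative simplification of the
    strength-dependent probabilities used in the real construction."""
    return [(u, v, w / p) for (u, v, w) in edges if rng.random() < p]

# Toy example: a cycle on 100 vertices with unit weights.
rng = random.Random(1)
n = 100
cycle = [(i, (i + 1) % n, 1.0) for i in range(n)]
sparse = sparsify(cycle, p=0.5, rng=rng)
total = sum(w for _, _, w in sparse)  # expected total weight: 100.0
```

Concentration arguments then show that, with the right sampling probabilities, every cut's sampled weight stays close to its expectation simultaneously.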

### Tuesday, November 8, 2011

#### The Price of Uncertainty in Multiagent Systems with Potentials

Tue, 11/08/2011 - 4:00pm, Skiles 006

Steven Ehrlich, School of Computer Science, Georgia Tech        Organizer: Karim Lounici

Multi-agent systems have been studied extensively through the lens of game theory. However, most game-theoretic models make strong assumptions about the accuracy of agents' knowledge of their own utilities and of the interactions of other players. We will show some progress at relaxing this assumption. In particular, we look at adversarial noise in specific potential games and assess the effect of noise on the quality of outcomes. In some cases, very small noise can accumulate over the course of the dynamics and lead to much worse social welfare. We define the Price of Uncertainty to measure this effect, and we compute both upper and lower bounds on this quantity for particular games.

### Tuesday, October 4, 2011

#### Architectures for Sampling of Correlated Signals

Tue, 10/04/2011 - 4:00pm, Skiles 006

Ali Ahmed, School of Electrical and Computer Engineering, Georgia Tech        Organizer: Karim Lounici

### Tuesday, September 20, 2011

#### Low-rank/Sparse matrix decomposition II

Tue, 09/20/2011 - 4:00pm, Skiles 006

Karim Lounici, School of Mathematics, Georgia Institute of Technology        Organizer: Karim Lounici

### Tuesday, September 13, 2011

#### Low-rank/Sparse matrix decomposition

Tue, 09/13/2011 - 4:00pm, Skiles 006

Karim Lounici, School of Mathematics, Georgia Institute of Technology        Organizer: Karim Lounici