Series: Combinatorics Seminar

Stability methods are often used in extremal graph theory, Ramsey theory and similar areas, where an extremal problem is to be solved and

- we have a conjecture about the structure of the extremal configurations, which, according to our conjecture, have some given property \mathcal P;
- we can prove that all almost-extremal structures are, in some sense, close to having property \mathcal P;
- we can show that if a structure is close to property \mathcal P and is extremal, then it is already the conjectured structure.

Of course, stability methods can also be used in other cases, but we restrict ourselves to the two areas above.

In my lecture I will give an introduction to the applications of stability methods in extremal graph theory, and describe cases from extremal graph theory, extremal hypergraph theory, and the Erdos-Frankl-Rodl theory (a generalization of the Erdos-Kleitman-Rothschild theory), among others.

In the second part of my lecture I shall describe the application of this method to the Erdos-Sos conjecture. This is part of our work with Ajtai, Komlos and Szemeredi.

Series: Combinatorics Seminar

This is joint work with Dr. Yi Zhao.

Graph tiling problems can be summarized as follows: given a graph H, what conditions guarantee that a larger graph G has a spanning subgraph consisting entirely of vertex-disjoint copies of H? The most familiar example of a graph tiling problem is finding a perfect matching in a graph. With the Regularity Lemma and the Blow-up Lemma as our main tools, we prove a degree condition that guarantees that an arbitrary bipartite graph G can be tiled by an arbitrary bipartite graph H. We also prove that this degree condition is best possible up to a constant. This answers a question of Zhao and proves an asymptotic version of a result of Kuhn and Osthus for bipartite graphs.
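The simplest instance of tiling mentioned above, H = K_2, is maximum matching, which for bipartite graphs can be computed by augmenting paths. A minimal sketch (the graph, vertex names, and the helper `matching_size` are illustrative, not from the paper):

```python
def matching_size(left, right, adj):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm)."""
    match_r = {}  # right vertex -> matched left vertex

    def try_augment(u, seen):
        # Try to match u, possibly rerouting previously matched right vertices.
        for v in adj.get(u, []):
            if v in seen:
                continue
            seen.add(v)
            if v not in match_r or try_augment(match_r[v], seen):
                match_r[v] = u
                return True
        return False

    size = 0
    for u in left:
        if try_augment(u, set()):
            size += 1
    return size

# C_4: vertices a, b on one side, x, y on the other, edges forming a 4-cycle.
adj = {"a": ["x", "y"], "b": ["x", "y"]}
print(matching_size(["a", "b"], ["x", "y"], adj))  # 2
```

A K_2-tiling covering all of G exists exactly when the matching size equals half the number of vertices, as it does here.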

Series: Combinatorics Seminar

There has been substantial work on approximation algorithms for clustering
data under distance-based objective functions such as k-median, k-means, and
min-sum objectives. This work is fueled in part by the hope that
approximating these objectives well will indeed yield more accurate
solutions. That is, for problems such as clustering proteins by function, or
clustering images by subject, there is some unknown correct "target"
clustering and the implicit assumption is that clusterings that are
approximately optimal in terms of these distance-based measures are also
approximately correct in terms of error with respect to the target. In this
work we show that if we make this implicit assumption explicit -- that is, if
we assume that any c-approximation to the given clustering objective Phi is
epsilon-close to the target -- then we can produce clusterings that are
O(epsilon)-close to the target, even for values c for which obtaining a
c-approximation is NP-hard. In particular, for the k-median, k-means, and
min-sum objectives, we show that we can achieve this guarantee for any
constant c > 1.
Our results show how by explicitly considering the alignment between the
objective function used and the true underlying clustering goals, one can
bypass computational barriers and perform as if these objectives were
computationally substantially easier.
This talk is based on joint work with Avrim Blum and Anupam Gupta (SODA
2009), Mark Braverman (COLT 2009), and Heiko Roeglin and Shang-Hua Teng (ALT 2009).

Series: Combinatorics Seminar

In this talk I will discuss a new technique, discovered by Olof Sisask and myself, which produces many new insights in additive combinatorics, as well as new proofs of classical theorems previously proved only using harmonic analysis. Among these new proofs is one for Roth's theorem on three-term arithmetic progressions, which gives the best bounds so far achieved by any combinatorial method. Another is a new proof that positive-density subsets of the integers mod p contain very long arithmetic progressions, a result first proved by Bourgain and improved upon by Ben Green and Tom Sanders. If time permits, I will discuss how the method can be applied to the two-dimensional corners problem.
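As a toy illustration of the qualitative statement behind these results, one can verify by brute force that a dense subset of Z/pZ contains many three-term progressions; the particular set below is an arbitrary example chosen for the sketch, not one from the talk:

```python
def count_3aps_mod_p(A, p):
    """Count pairs (a, d) with d != 0 and a, a+d, a+2d all in A (mod p)."""
    S = set(A)
    return sum(1 for a in A for d in range(1, p)
               if (a + d) % p in S and (a + 2 * d) % p in S)

p = 31
A = list(range(0, p, 2))  # the even residues: density about 1/2
print(count_3aps_mod_p(A, p) > 0)  # True: dense sets mod p are rich in 3-APs
```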

Series: Combinatorics Seminar

It is known that, relative to any fixed vertex q of a finite graph, there exists a unique q-reduced divisor (G-parking function based at q) in each linear equivalence class of divisors. In this talk, I will give an efficient algorithm for finding such reduced divisors. Using this, I will give an explicit and efficient bijection between the Jacobian group and the set of spanning trees of the graph. Then I will outline some applications of the main results, including a new approach to the random spanning tree problem, efficient computation of the group law in the critical group (sandpile group), an efficient algorithm for the chip-firing game of Baker and Norine, and the relation to the Riemann-Roch theory on finite graphs.
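A standard ingredient in this circle of ideas (though not necessarily the algorithm of the talk) is Dhar's burning test for whether a given divisor is q-reduced. A minimal sketch for simple graphs, with an illustrative triangle:

```python
def is_q_reduced(graph, D, q):
    """Dhar's burning test: D is q-reduced iff (i) D(v) >= 0 for all v != q
    and (ii) a fire started at q consumes the whole graph, where an unburnt
    vertex v ignites once its number of edges to burnt vertices exceeds D[v]."""
    if any(D[v] < 0 for v in graph if v != q):
        return False
    burnt = {q}
    changed = True
    while changed:
        changed = False
        for v in graph:
            if v in burnt:
                continue
            if sum(1 for u in graph[v] if u in burnt) > D[v]:
                burnt.add(v)
                changed = True
    return len(burnt) == len(graph)

# Triangle graph with q = 0.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(is_q_reduced(triangle, {0: 0, 1: 0, 2: 1}, 0))  # True: everything burns
print(is_q_reduced(triangle, {0: 0, 1: 1, 2: 1}, 0))  # False: the fire stalls
```

If the fire stalls, the unburnt vertices can legally fire together, certifying that D is not reduced.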

Series: Combinatorics Seminar

Since the seminal work of Erdos and Renyi, the phase transition of the largest components in random graphs has become one of the central topics in random graph theory and discrete probability theory. Of particular interest in recent years are random graphs with constraints (e.g. degree distribution, forbidden substructures), including random planar graphs.

Let G(n,M) be a uniform random graph, i.e. a graph picked uniformly at random among all graphs on vertex set [n]={1,...,n} with M edges, and let P(n,M) be a uniform random planar graph, picked uniformly at random among all graphs on vertex set [n] with M edges that are embeddable in the plane. Erdos-Renyi, Bollobas, and Janson-Knuth-Luczak-Pittel, among others, studied the critical behaviour of the largest components in G(n,M) when M = n/2+o(n), with scaling window of size n^{2/3}. For example, when M = n/2+s with s = o(n) and s \gg n^{2/3}, a.a.s. (i.e. with probability tending to 1 as n approaches \infty) G(n,M) contains a unique largest component (the giant component) of size (4+o(1))s.

In contrast to G(n,M), one can observe two critical behaviours in P(n,M): when M = n/2+o(n), with scaling window of size n^{2/3}, and when M = n+o(n), with scaling window of size n^{3/5}. For example, when M = n/2+s with s = o(n) and s \gg n^{2/3}, a.a.s. the largest component in P(n,M) is of size (2+o(1))s, roughly half the size of the largest component in G(n,M), whereas when M = n+t with t = o(n) and t \gg n^{3/5}, a.a.s. the number of vertices outside the giant component is \Theta(n^{3/2}t^{-3/2}). (Joint work with Tomasz Luczak.)
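The supercritical behaviour of G(n,M) is easy to observe numerically. A simulation sketch using union-find (the parameters are illustrative; sampling the planar model P(n,M) would require a planarity-respecting sampler and is not attempted here):

```python
import random
from collections import Counter

def largest_component(n, M, seed=0):
    """Largest component size of a uniform random graph G(n, M), via union-find."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    edges = set()
    while len(edges) < M:  # sample M distinct edges uniformly
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            edges.add((min(u, v), max(u, v)))
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    return max(Counter(find(x) for x in range(n)).values())

n = 50_000
s = int(n ** 0.75)      # s >> n^(2/3): just beyond the critical window
print(largest_component(n, n // 2 + s))  # theory predicts (4 + o(1)) * s
```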

Series: Combinatorics Seminar

We consider the #P-complete problem of counting the number of independent
sets in a given graph. Our interest is in understanding the effectiveness of
the popular Belief Propagation (BP) heuristic. BP is a simple and iterative
algorithm that is known to have at least one fixed point. Each fixed point
corresponds to a stationary point of the Bethe free energy (introduced by
Yedidia, Freeman and Weiss (2004) in recognition of Hans Bethe's earlier
work (1935)). The evaluation of the Bethe Free Energy at such a stationary
point (or BP fixed point) leads to the Bethe approximation to the number of
independent sets of the given graph. In general, BP is not known to converge, nor is an efficient, convergent procedure for finding stationary points of the Bethe free energy known. Further, the effectiveness of the Bethe approximation is not well understood.
As the first result of this paper, we propose a BP-like algorithm that
always converges to a BP fixed point for any graph. Further, it finds an \epsilon
approximate fixed point in poly(n, 2^d, 1/\epsilon) iterations for a graph of n
nodes with max-degree d. As the next step, we study the quality of this
approximation. Using the recently developed 'loop series' approach by
Chertkov and Chernyak, we establish that for any graph of n nodes with
max-degree d and girth larger than 8d log n, the multiplicative error decays
as 1 + O(n^{-\gamma}) for some \gamma > 0. This provides a deterministic counting
algorithm that leads to strictly different results compared to a recent
result of Weitz (2006). Finally as a consequence of our results, we prove
that the Bethe approximation is exceedingly good for a random 3-regular
graph conditioned on the Shortest Cycle Cover Conjecture of Alon and Tarsi
(1985) being true.
(Joint work with Venkat Chandrasekaran, Michael Chertkov, David Gamarnik and
Devavrat Shah)
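As a sanity check on the one case where everything is known to work, BP is exact on trees: the message update for the hardcore model at fugacity 1 (independent sets) plus the Bethe free-energy formula recover the exact count. A minimal sketch (the path graph is an illustrative example, not one from the paper):

```python
import itertools
import math

def bethe_count_independent_sets(adj, iters=200):
    """Belief propagation for counting independent sets (hardcore model,
    fugacity 1). Returns the Bethe approximation to the count."""
    msgs = {(u, v): 0.5 for u in adj for v in adj[u]}
    for _ in range(iters):
        # m_{u->v} = P / (1 + P), where P = prod over w in N(u)\{v} of (1 - m_{w->u})
        msgs = {(u, v): (p := math.prod(1 - msgs[(w, u)] for w in adj[u] if w != v))
                / (1 + p) for (u, v) in msgs}
    logZ = sum(math.log(1 + math.prod(1 - msgs[(u, v)] for u in adj[v]))
               for v in adj)
    logZ -= sum(math.log(1 - msgs[(u, v)] * msgs[(v, u)])
                for u in adj for v in adj[u] if u < v)
    return math.exp(logZ)

# Path 0-1-2-3: a tree, so BP converges and the Bethe value is exact.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
exact = sum(1 for S in itertools.chain.from_iterable(
                itertools.combinations(range(4), k) for k in range(5))
            if all(not (u in S and v in S) for u in path for v in path[u]))
print(exact)                                          # 8
print(round(bethe_count_independent_sets(path), 6))   # 8.0 on a tree
```

On graphs with cycles the same iteration yields the Bethe approximation, whose convergence and quality are the subject of the abstract.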

Series: Combinatorics Seminar

The Lovasz Local Lemma (LLL) is a powerful result in probability theory: the probability that none of a set of bad events happens is nonzero if the probability of each event is small compared to the number of events that depend on it. It is often used in combination with the probabilistic method for non-constructive existence proofs. A prominent application of the LLL is to k-CNF formulas, where it implies that if every clause in the formula shares variables with at most d \le 2^k/e other clauses, then the formula has a satisfying assignment. Recently, a randomized algorithm to efficiently construct a satisfying assignment was given by Moser. Subsequently, Moser and Tardos gave a randomized algorithm to construct the structures guaranteed by the LLL in a very general algorithmic framework. We will address the main problem left open by Moser and Tardos: derandomizing their algorithm efficiently when the number of other events that any bad event depends on is possibly unbounded. An interesting special case of the open problem is the k-CNF problem with k = \omega(1), that is, when k grows with the size of the formula rather than being a constant.
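A minimal sketch of the Moser-Tardos resampling loop for k-CNF (the formula below is an arbitrary satisfiable instance invented for the example; the derandomization discussed in the talk is not attempted):

```python
import random

def moser_tardos_ksat(clauses, n_vars, seed=0):
    """Moser-Tardos resampling: start from a random assignment and, while
    some clause is violated, resample all variables of one violated clause.
    Under the LLL condition (each clause shares variables with at most
    2^k/e other clauses) this terminates quickly in expectation."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars)]

    def violated(clause):
        # literal l > 0 means variable l-1 is true; l < 0 means it is false
        return all(assign[abs(l) - 1] != (l > 0) for l in clause)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return assign
        for l in bad[0]:  # resample every variable of one violated clause
            assign[abs(l) - 1] = rng.random() < 0.5

# A small satisfiable 3-CNF, e.g. (x1 or x2 or x3) and (~x1 or x2 or ~x3) and ...
clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]
a = moser_tardos_ksat(clauses, 3)
print(all(any(a[abs(l) - 1] == (l > 0) for l in c) for c in clauses))  # True
```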

Series: Combinatorics Seminar

In this lecture, I will explain the greedy approximation algorithm for submodular function maximization due to Nemhauser, Wolsey, and Fisher. Then I will apply this algorithm to the problem of approximating a monotone submodular function by another submodular function with a succinct representation. The approximation method is based on the maximum-volume ellipsoid inscribed in a centrally symmetric convex body. This is joint work with Michel Goemans, Nick Harvey, and Vahab Mirrokni.
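For intuition, here is the greedy rule in the special case of maximum coverage, a canonical monotone submodular objective; the instance is illustrative:

```python
def greedy_max_coverage(sets, k):
    """Nemhauser-Wolsey-Fisher greedy: pick k sets, each time the one with the
    largest marginal coverage gain. For monotone submodular maximization under
    a cardinality constraint this achieves a (1 - 1/e) approximation."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(range(len(sets)),
                   key=lambda i: len(sets[i] - covered) if i not in chosen else -1)
        chosen.append(best)
        covered |= sets[best]
    return chosen, len(covered)

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
print(greedy_max_coverage(sets, 2))  # ([2, 0], 7): picks {4,5,6,7}, then {1,2,3}
```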

Series: Combinatorics Seminar

In this lecture, I will review combinatorial algorithms for minimizing submodular functions. In particular, I will present a new combinatorial algorithm obtained in my recent joint work with Jim Orlin.
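As background, a minimal sketch of what submodular function minimization computes, using the cut function of a small graph, the canonical submodular example. The brute force below is purely for illustration; the combinatorial algorithms in the talk achieve this in polynomial time:

```python
import itertools

def cut(edges, S):
    """Cut function d(S): number of edges with exactly one endpoint in S."""
    return sum(1 for u, v in edges if (u in S) != (v in S))

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # a 4-cycle plus a chord
V = set(range(4))
subsets = [set(S) for k in range(5) for S in itertools.combinations(V, k)]

# Verify the submodular inequality f(A) + f(B) >= f(A|B) + f(A&B) on all pairs.
ok = all(cut(edges, A) + cut(edges, B) >= cut(edges, A | B) + cut(edges, A & B)
         for A in subsets for B in subsets)
print(ok)  # True

# Brute-force minimization over nonempty proper subsets (i.e. global min cut).
best = min((S for S in subsets if S and S != V), key=lambda S: cut(edges, S))
print(cut(edges, best))  # 2
```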