Seminars and Colloquia by Series

Aeons Before the Big Bang?

Series
Other Talks
Time
Tuesday, March 24, 2009 - 17:30 for 2 hours
Location
LeCraw Auditorium, Room 100
Speaker
Roger Penrose, Mathematical Institute, University of Oxford
There is much impressive observational evidence, mainly from the cosmic microwave background (CMB), for an enormously hot and dense early stage of the universe --- referred to as the Big Bang. Observations of the CMB are now very detailed, but this very detail presents new puzzles of various kinds, one of the most blatant being an apparent paradox in relation to the second law of thermodynamics. The hypothesis of inflationary cosmology has long been argued to explain away some of these puzzles, but it does not resolve some key issues, including that raised by the second law. In this talk, I shall describe a quite different proposal, which posits a succession of universe aeons prior to our own. The expansion of the universe never reverses in this scheme, but the space-time geometry is nevertheless made consistent through a novel geometrical conception. Some very recent analysis of the CMB data obtained with the WMAP satellite will be described; this has a profound but tantalizing bearing on these issues.

Introduction to metric and comparison geometry

Series
Other Talks
Time
Friday, February 27, 2009 - 15:00 for 2 hours
Location
Skiles 269
Speaker
Igor Belegradek, School of Mathematics, Georgia Tech
Comparison geometry studies Riemannian manifolds with a given curvature bound. This minicourse is an introduction to volume comparison (as developed by Bishop and Gromov), which is fundamental in understanding manifolds with a lower bound on Ricci curvature. Prerequisites are very modest: we only need basics of Riemannian geometry, and fluency with fundamental groups and metric spaces. In the third (2 hour) lecture I shall prove volume and Laplacian comparison theorems.
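For orientation, the Bishop–Gromov inequality at the heart of this volume comparison (stated here informally, as a sketch of what the lectures will prove) reads:

```latex
% Bishop–Gromov volume comparison (informal statement).
% Let (M^n, g) be a complete Riemannian manifold with
% Ric \ge (n-1) k g, and let V_k(r) denote the volume of a ball of
% radius r in the simply connected space form of constant curvature k.
% Then for every p \in M:
\[
  r \;\longmapsto\; \frac{\operatorname{vol} B(p, r)}{V_k(r)}
  \quad \text{is non-increasing in } r,
  \qquad \text{and in particular} \quad
  \operatorname{vol} B(p, r) \le V_k(r).
\]
```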

Introduction to metric and comparison geometry

Series
Other Talks
Time
Friday, February 20, 2009 - 15:00 for 2 hours
Location
Skiles 269
Speaker
Igor Belegradek, School of Mathematics, Georgia Tech
Comparison geometry studies Riemannian manifolds with a given curvature bound. This minicourse is an introduction to volume comparison (as developed by Bishop and Gromov), which is fundamental in understanding manifolds with a lower bound on Ricci curvature. Prerequisites are very modest: we only need basics of Riemannian geometry, and fluency with fundamental groups and metric spaces. The second (2 hour) lecture is about Gromov-Hausdorff convergence, which provides a natural framework for studying degenerations of Riemannian metrics.

Introduction to the h-principle

Series
Other Talks
Time
Friday, January 30, 2009 - 15:00 for 2 hours
Location
Skiles 269
Speaker
Mohammad Ghomi, School of Mathematics, Georgia Tech

Please Note: This course runs from 3-5.

The h-principle consists of a powerful collection of tools developed by Gromov and others to solve underdetermined partial differential equations or relations which arise in differential geometry and topology. In these talks I will describe the holonomic approximation theorem of Eliashberg and Mishachev, and discuss some of its applications, including the sphere eversion theorem of Smale. Further, I will discuss the method of convex integration and its application to proving the C^1 isometric embedding theorem of Nash.

Introduction to the h-principle

Series
Other Talks
Time
Friday, January 23, 2009 - 15:00 for 2 hours
Location
Skiles 269
Speaker
Mohammad Ghomi, School of Mathematics, Georgia Tech
The h-principle consists of a powerful collection of tools developed by Gromov and others to solve underdetermined partial differential equations or relations which arise in differential geometry and topology. In these talks I will describe the holonomic approximation theorem of Eliashberg and Mishachev, and discuss some of its applications, including the sphere eversion theorem of Smale. Further, I will discuss the method of convex integration and its application to proving the C^1 isometric embedding theorem of Nash.

Learning with Teacher - Learning Using Hidden Information

Series
Other Talks
Time
Friday, January 16, 2009 - 14:00 for 1 hour (actually 50 minutes)
Location
Klaus 2447
Speaker
Vladimir Vapnik, NEC Laboratories, Columbia University, and Royal Holloway University of London

Please Note: You are cordially invited to attend a reception that will follow the seminar to chat informally with faculty and students. Refreshments will be provided.

The existing machine learning paradigm considers a simple scheme: given a set of training examples, find in a given collection of functions the one that best approximates the unknown decision rule. In such a paradigm a teacher does not play an important role. In human learning, however, the role of a teacher is very important: along with examples, a teacher provides students with explanations, comments, comparisons, and so on. In this talk I will introduce elements of human teaching in machine learning. I will consider an advanced learning paradigm called learning using hidden information (LUHI), where at the training stage a teacher gives some additional information x^* about training example x. This information will not be available at the test stage. I will consider the LUHI paradigm for support vector machine (SVM) type algorithms, demonstrate its superiority over the classical paradigm, and discuss general questions related to this paradigm. For details see FODAVA, Foundations of Data Analysis and Visual Analytics.

Model Complexity Optimization

Series
Other Talks
Time
Friday, January 16, 2009 - 13:00 for 1 hour (actually 50 minutes)
Location
Klaus 2447
Speaker
Alexey Chervonenkis, Russian Academy of Sciences and Royal Holloway University of London
It is shown (theoretically and empirically) that a reliable result can be gained only when there is a certain relation between the capacity of the class of models from which we choose and the size of the training set. There are different ways to measure the capacity of a class of models. In practice the size of a training set is always finite and limited. This leads to the idea of choosing a model from the most narrow class, or in other words of using the simplest model (Occam's razor). But if our class is narrow, it is possible that it contains no true model, nor even a model close to the true one. This means there will be a greater residual error, or a larger number of errors, even on the training set. So the problem of model complexity choice arises: to find a balance between errors due to the limited amount of training data and errors due to excessive model simplicity. I shall review different approaches to the problem.
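The balance described above can be illustrated with a toy experiment (a minimal sketch of complexity selection via a held-out set, not any specific method from the talk): polynomial models of increasing degree are fit to a small noisy sample, and while training error keeps falling as the class grows, error on fresh data eventually rises again.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a smooth target observed with noise, and a small training set.
def target(x):
    return np.sin(2 * np.pi * x)

n_train, n_val = 20, 200
x_tr = rng.uniform(0, 1, n_train)
y_tr = target(x_tr) + rng.normal(0, 0.2, n_train)
x_va = rng.uniform(0, 1, n_val)
y_va = target(x_va) + rng.normal(0, 0.2, n_val)

def fit_predict(deg, x_eval):
    # Least-squares polynomial fit of the given degree (model capacity).
    coeffs = np.polyfit(x_tr, y_tr, deg)
    return np.polyval(coeffs, x_eval)

train_err, val_err = {}, {}
for deg in range(1, 13):
    train_err[deg] = np.mean((fit_predict(deg, x_tr) - y_tr) ** 2)
    val_err[deg] = np.mean((fit_predict(deg, x_va) - y_va) ** 2)

# Training error only decreases with capacity; held-out error is U-shaped.
# Pick the complexity balancing the two error sources.
best = min(val_err, key=val_err.get)
```

High-degree fits drive training error toward zero yet generalize poorly, which is exactly the tension between limited data and excessive simplicity that the abstract describes.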

Southeast Geometry Seminar

Series
Other Talks
Time
Friday, December 12, 2008 - 09:00 for 8 hours (full day)
Location
Skiles 243
Speaker
Various Speakers, Various Universities
The Southeast Geometry Seminar (SGS) is a semiannual series of one-day events organized by Vladimir Oliker (Emory), Mohammad Ghomi and John McCuan (Georgia Tech), and Gilbert Weinstein (UAB). See http://www.math.uab.edu/sgs for details.

When Biology is Computation

Series
Other Talks
Time
Tuesday, October 21, 2008 - 11:00 for 1 hour (actually 50 minutes)
Location
Klaus Building, 1116E&W
Speaker
Leslie Valiant, Division of Engineering and Applied Sciences, Harvard University
We argue that computational models have an essential role in uncovering the principles behind a variety of biological phenomena that cannot be approached by other means. In this talk we shall focus on evolution. Living organisms function according to complex mechanisms that operate in different ways depending on conditions. Darwin's theory of evolution suggests that such mechanisms evolved through random variation guided by natural selection. However, there has existed no theory that would explain quantitatively which mechanisms can so evolve in realistic population sizes within realistic time periods, and which are too complex. Here we suggest such a theory. Evolution is treated as a form of computational learning from examples in which the course of learning depends only on the aggregate fitness of the current hypothesis on the examples, and not otherwise on individual examples. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. For example, we can show that monotone Boolean conjunctions and disjunctions are demonstrably evolvable over the uniform distribution, while Boolean parity functions are demonstrably not. We shall discuss some broader issues in evolution and intelligence that can be addressed via such an approach.
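The contrast between conjunctions and parities mentioned above can be made concrete with a small exact computation (a toy illustration of the underlying correlation phenomenon, not Valiant's formal evolvability model): under the uniform distribution, a monotone conjunction correlates with each of its own variables, while a parity of several variables is uncorrelated with every single variable, so aggregate-fitness feedback alone gives a search no direction toward it.

```python
from itertools import product
import numpy as np

n = 4
# Exact enumeration of the uniform distribution over {-1, +1}^n.
points = np.array(list(product([-1, 1], repeat=n)))

# Monotone conjunction x1 AND x2 (+1 iff both coordinates are +1).
conj = np.where((points[:, 0] == 1) & (points[:, 1] == 1), 1, -1)
# Parity of x1, x2, x3.
parity = points[:, 0] * points[:, 1] * points[:, 2]

# Correlation of each target with the single-variable hypotheses x_i.
conj_corr = [np.mean(conj * points[:, i]) for i in range(n)]
par_corr = [np.mean(parity * points[:, i]) for i in range(n)]
```

The conjunction has correlation 1/2 with each of its two variables, a signal that fitness-guided variation can follow; every single-variable correlation with the parity is exactly zero.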
