An introduction to mathematical learning theory

Series: SIAM Student Seminar
Friday, March 6, 2009 - 12:30
2 hours
Location: Skiles 269
School of Mathematics, Georgia Tech
In this talk, I will briefly introduce some basics of mathematical learning theory. Two basic methods, the perceptron algorithm and the support vector machine, will be explained for the separable classification case. Sub-Gaussian random variables and Hoeffding's inequality will also be introduced in order to provide an upper bound on the deviation of the empirical risk from the true risk. If time permits, Vapnik-Chervonenkis combinatorics will be used to obtain sharper bounds on this deviation.
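
For concreteness, the deviation bound alluded to above is, for a single fixed classifier $f$ under the 0-1 loss, a direct consequence of Hoeffding's inequality (a standard statement, stated here as background rather than as material from the talk):

\[
\Pr\bigl(\,|\hat R_n(f) - R(f)| \ge t\,\bigr) \le 2\exp(-2nt^2), \qquad t > 0,
\]

where $\hat R_n(f) = \frac{1}{n}\sum_{i=1}^{n}\mathbf{1}\{f(X_i)\neq Y_i\}$ is the empirical risk on an i.i.d. sample and $R(f) = \mathbb{E}\,\hat R_n(f)$ is the true risk.

The following is a minimal sketch of the perceptron algorithm for the separable case mentioned in the abstract; the toy data, function name, and epoch cap are illustrative assumptions, not material from the talk.

# Minimal perceptron sketch for linearly separable data (illustrative only).
import numpy as np

def perceptron(X, y, max_epochs=100):
    """X: (n, d) array of points; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:  # misclassified (or on the boundary)
                w += yi * xi                   # classical perceptron update
                b += yi
                mistakes += 1
        if mistakes == 0:                      # separable case: finitely many updates suffice
            return w, b
    return w, b

# Hypothetical toy example of separable data.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
print(perceptron(X, y))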