Towards Agnostically Learning Halfspaces (joint work with Adam Klivans, Yishay Mansour and Rocco Servedio)

A longstanding open problem in computational learning theory is that of learning halfspaces in the agnostic model of Kearns, Schapire and Sellie (a model that can equivalently be viewed as learning with adversarial noise). In this problem, there is an arbitrary and unknown joint distribution over n-dimensional vectors X and their 0-1 labels Y. The goal is to design an algorithm that provably learns to predict the labels of future examples with error at most opt + eps, where opt is the error of the best halfspace predictor for the distribution and eps > 0 is any desired accuracy, using running time and a number of samples from the distribution that are polynomial in n and 1/eps.
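Restating the goal above in symbols (in LaTeX notation), with D the unknown joint distribution over (X, Y):

    \mathrm{err}(h) = \Pr_{(X,Y) \sim D}\left[\, h(X) \neq Y \,\right], \qquad \mathrm{opt} = \min_{h^{*} \text{ a halfspace}} \mathrm{err}(h^{*}),

and the learner must output a hypothesis h satisfying \mathrm{err}(h) \le \mathrm{opt} + \varepsilon.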

We make progress on this problem by giving a simple learning algorithm that learns halfspaces in this model for a very general class of distributions on the n-dimensional vectors X (including any log-concave distribution) in time polynomial in the dimension n (though exponential in 1/eps). Note that our distributional assumptions are only on X, so the algorithm tolerates worst-case label noise. The simple randomized algorithm generalizes Fourier learning and is also related to “support vector machines with a polynomial kernel.”
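The abstract does not spell out the algorithm, but the flavor it alludes to (low-degree polynomial regression, as in Fourier learning and polynomial-kernel SVMs) can be sketched roughly as follows. This is a hypothetical illustration, not the authors' exact procedure: the choice of least-squares fitting, the function names, and the degree parameter d below are all assumptions made for the sake of a self-contained example.

    import numpy as np
    from itertools import combinations_with_replacement

    def poly_features(X, d):
        """All monomials of the coordinates of X up to total degree d.

        This expansion is what connects the approach to Fourier/low-degree
        learning and to a polynomial kernel; its size grows with d, which is
        one way a 1/eps-dependent degree leads to exp(1/eps) runtime.
        """
        n_samples, n = X.shape
        cols = [np.ones(n_samples)]  # degree-0 monomial
        for deg in range(1, d + 1):
            for idx in combinations_with_replacement(range(n), deg):
                cols.append(np.prod(X[:, list(idx)], axis=1))
        return np.column_stack(cols)

    def learn_halfspace_agnostic(X, y, d=2):
        """Hypothetical sketch: fit a degree-d polynomial p to the +/-1
        labels y, then predict sign(p(x)).

        Least squares is used here only to keep the sketch self-contained;
        it is an assumption, not the paper's stated regression method.
        """
        Phi = poly_features(X, d)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return lambda Xnew: np.sign(poly_features(Xnew, d) @ w)

    # Example usage on synthetic data: a halfspace with 10% flipped labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                    # Gaussian (log-concave) X
    y = np.sign(X @ np.array([1.0, -2.0, 0.5]))
    y[rng.random(500) < 0.1] *= -1                   # adversarial-style noise
    h = learn_halfspace_agnostic(X, y, d=3)
    print(np.mean(h(X) != y))                        # error near the noise rate

Note that the distributional assumption enters only through how X is generated; the labels y may be corrupted arbitrarily, matching the worst-case-noise guarantee described above.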

Speaker Details

Adam Tauman Kalai is an assistant professor at the Toyota Technological Institute in Chicago. Kalai did his graduate work at CMU under the supervision of Avrim Blum. Following this, he was a postdoc at MIT. His research interests include machine learning theory, randomized algorithms, and game theory.

Speakers:
Adam Tauman Kalai
Affiliation:
Toyota Technological Institute in Chicago