Trading Convexity for Scalability

Joint work with Ronan Collobert, Fabian Sinz, and Jason Weston.

Convex learning algorithms, such as Support Vector Machines (SVMs), are often seen as highly desirable because they guarantee a unique global optimum and are amenable to theoretical analysis. In this work, however, we show that non-convexity can provide scalability advantages over convexity. We show how concave-convex programming can be applied to produce (1) faster SVMs where training errors are no longer support vectors, and (2) much faster Transductive SVMs.
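To make the technique concrete, here is a minimal sketch of the concave-convex procedure (CCCP) on a toy one-dimensional objective. This is illustrative only and is not the SVM formulation from the talk: the objective, its decomposition, and the closed-form inner minimization are all assumptions chosen so the example stays self-contained.

```python
# CCCP sketch: decompose f(x) = x^4 - 2x^2 into a convex part (x^4)
# and a concave part (-2x^2), then repeatedly minimize the convex part
# plus a linear upper bound (tangent) of the concave part. Each such
# surrogate problem is convex, and the objective decreases monotonically.

def f(x):
    return x**4 - 2 * x**2

def cccp(x, iters=50):
    for _ in range(iters):
        # Gradient of the concave part -2x^2 at the current iterate.
        g = -4.0 * x
        # Inner convex problem: argmin_y  y^4 + g*y.
        # Setting the derivative 4y^3 + g to zero gives y = (-g/4)^(1/3).
        x = (-g / 4.0) ** (1.0 / 3.0)
    return x

x_star = cccp(0.5)  # converges to the local minimum of f at x = 1
```

Each iteration solves a convex surrogate, so off-the-shelf convex solvers (here, a closed form) can be reused inside a non-convex training loop; this is the mechanism the talk exploits for SVMs with non-convex losses.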

Speaker Details

Léon Bottou received a Diplôme from Ecole Polytechnique, Paris in 1987, a Magistère en Mathématiques Fondamentales et Appliquées et Informatiques from Ecole Normale Supérieure, Paris in 1988, and a PhD in Computer Science from Université de Paris-Sud in 1991. He was with AT&T Bell Labs from 1991 to 1992 and with AT&T Labs from 1995 to 2002. Between 1992 and 1995 he was chairman of Neuristique in Paris, a small company pioneering machine learning for data mining applications. He has been with NEC Labs America in Princeton since 2002. Léon’s primary research interest is machine learning. His contributions to this field address theory, algorithms, and large-scale applications. Léon’s secondary research interest is data compression and coding. His best known contribution in this field is the DjVu document compression technology (http://www.djvuzone.org). Léon has published over 60 papers and serves on the boards of the Journal of Machine Learning Research and of Pattern Recognition Letters. He also serves on the scientific advisory board of Kxen Inc (http://www.kxen.com). His personal website is http://leon.bottou.org.

Date:
Speakers:
Léon Bottou
Affiliation:
NEC Labs America