Provable Non-convex Projections for High-dimensional Learning Problems – Part 1


July 22, 2015


Prateek Jain




Typical high-dimensional learning problems such as sparse regression, low-rank matrix completion, and robust PCA can be solved using projections onto non-convex sets. However, providing theoretical guarantees for such methods is difficult due to the non-convexity of the projections. In this talk, we will discuss some of our recent results showing that methods based on non-convex projections can be used to solve several important problems in this area, such as: a) sparse regression, b) low-rank matrix completion, and c) robust PCA.
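As a concrete illustration of the flavor of these methods (not taken from the talk itself), the sparse-regression case can be attacked by iterative hard thresholding: projected gradient descent where the "projection" simply keeps the s largest-magnitude coordinates, a non-convex set. A minimal NumPy sketch, assuming a least-squares loss and a conservative 1/L step size (the function names and the demo dimensions are my own choices):

```python
import numpy as np

def hard_threshold(x, s):
    """Non-convex projection onto the set of s-sparse vectors:
    keep the s largest-magnitude entries, zero out the rest."""
    z = np.zeros_like(x)
    top = np.argsort(np.abs(x))[-s:]
    z[top] = x[top]
    return z

def iht(A, y, s, iters=200):
    """Iterative hard thresholding for min ||Ax - y||^2 s.t. x is s-sparse."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L step for the quadratic loss
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)           # gradient of 0.5 * ||Ax - y||^2
        x = hard_threshold(x - step * grad, s)
    return x

# Demo: recover a 3-sparse signal from 40 Gaussian measurements in 100 dims.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = iht(A, y, s=3)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Despite the non-convex projection, each iterate here still decreases the least-squares objective, because the hard-thresholded point minimizes the smoothness majorizer over all s-sparse vectors; the talk's results concern when such iterations provably recover the true sparse signal.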

In this talk, we will give an overview of the state of the art for these problems and discuss how simple non-convex techniques can significantly outperform state-of-the-art convex-relaxation-based techniques while also admitting solid theoretical guarantees. For example, for robust PCA we provide the first provable algorithm with time complexity O(n²r), which matches the time complexity of an ordinary SVD and is faster than the usual nuclear-norm + L1-regularization methods, which incur O(n³) time complexity. This talk is based on joint work with Ambuj Tewari, Purushottam Kar, Praneeth Netrapalli, Animashree Anandkumar, U N Niranjan, and Sujay Sanghavi.
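To make the robust PCA setting concrete: the goal is to split an observed matrix M into a rank-r part L plus a sparse corruption S. One non-convex approach, in the spirit of the alternating-projection methods in this line of work, alternates a rank-r SVD truncation with entrywise hard thresholding. The sketch below is my own simplification: the fixed threshold `zeta` is a hypothetical tuning parameter (provable variants shrink it across stages), and the full `np.linalg.svd` call costs O(n³) per iteration; replacing it with a rank-r truncated SVD is what yields the O(n²r) cost mentioned above.

```python
import numpy as np

def svd_truncate(M, r):
    """Non-convex projection onto rank-r matrices via truncated SVD."""
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * S[:r]) @ Vt[:r]

def threshold_sparse(M, zeta):
    """Non-convex projection onto sparse matrices: keep entries above zeta."""
    return M * (np.abs(M) > zeta)

def altproj_rpca(M, r, zeta, iters=50):
    """Alternating non-convex projections for M ~ L (rank r) + S (sparse)."""
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svd_truncate(M - S, r)       # fit low-rank part to M minus outliers
        S = threshold_sparse(M - L, zeta)  # absorb large residuals as outliers
    return L, S

# Demo: rank-1 matrix plus 50 large sparse corruptions (hypothetical sizes).
rng = np.random.default_rng(1)
n = 50
L0 = np.outer(rng.standard_normal(n), rng.standard_normal(n))
S0 = np.zeros((n, n))
idx = rng.choice(n * n, size=50, replace=False)
S0.flat[idx] = 10.0 * rng.choice([-1.0, 1.0], size=50)
M = L0 + S0
L_hat, S_hat = altproj_rpca(M, r=1, zeta=3.0)
print("low-rank error:", np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
```

By construction, `L_hat` has rank at most r and every entry of the unexplained residual M − L_hat − S_hat is at most zeta in magnitude; the provable results discussed in the talk give conditions under which the iterates actually converge to the true (L, S) pair.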


Prateek Jain

I am a member of the Machine Learning and Optimization group and the Algorithms and Modeling research group at Microsoft Research, Bangalore, India. My research interests are in machine learning, statistical learning theory, and optimization algorithms in general. I am also interested in applications of machine learning to privacy, computer vision, text mining, and natural language processing.

Earlier, I completed my PhD at the University of Texas at Austin under Prof. Inderjit S. Dhillon.