We work on algorithmic foundations of machine learning and data science. Our team has a track record of resolving long-standing open problems, including the Kadison-Singer conjecture (recognized with the Pólya Prize), primality testing (the AKS primality test), and estimating the volume of an arbitrary high-dimensional convex set (Ravindran Kannan's Knuth Prize). Currently, we are working on mathematical foundations to improve our understanding of ML models (such as deep neural networks) and of algorithms such as stochastic gradient descent and clustering, which are at the heart of machine learning. We are also working on establishing connections among lower bounds, reconstruction, and machine learning. Our applied algorithmic work is inspired by pragmatic, real-world issues in big data systems. For instance, our work on Topic Modelling is motivated by the need to cluster entities with no supervision signal (such as tail queries in advertisement systems), and our work on algorithms for Approximate Nearest Neighbour Search is motivated by the need to build and search vector indices containing trillions of vectors in order to enable semantic search.
In this work, we explore theoretical properties of simple non-convex optimization methods for problems that feature prominently in several important areas, such as recommendation systems, compressive sensing, and computer vision.
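As an illustration of the kind of simple non-convex method studied in this setting (a generic sketch, not the specific algorithm analyzed in this work), consider low-rank matrix completion, the core problem behind recommendation systems: run stochastic gradient descent directly on the non-convex factored objective, minimizing the squared error over observed entries of the matrix. All problem sizes and step sizes below are arbitrary choices for the demonstration.

```python
import numpy as np

# Sketch of SGD on the non-convex matrix-completion objective
#   min_{U,V}  sum over observed (i,j) of (M_ij - (U V^T)_ij)^2.
# Ground truth, sampling rate, rank, and learning rate are illustrative.
rng = np.random.default_rng(0)
n, r = 30, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-2 target
mask = rng.random((n, n)) < 0.5                                # observed entries

U = 0.1 * rng.standard_normal((n, r))   # small random initialization
V = 0.1 * rng.standard_normal((n, r))
obs = np.argwhere(mask)
lr = 0.02
for epoch in range(200):
    rng.shuffle(obs)                    # visit observed entries in random order
    for i, j in obs:
        err = U[i] @ V[j] - M[i, j]     # residual on one observed entry
        # Simultaneous gradient step on the two factors for this entry.
        U[i], V[j] = U[i] - lr * err * V[j], V[j] - lr * err * U[i]

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative recovery error: {rel_err:.3f}")
```

Despite the objective being non-convex in (U, V) jointly, this kind of simple local-search method typically recovers the underlying low-rank matrix from a fraction of its entries, which is precisely the phenomenon whose theoretical properties are under study here.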