Fast Algorithms via Convex Relaxation

  • Sam Wong | UC Berkeley

In theoretical computer science, many of the fastest algorithms are obtained by considering the Linear Programming (LP) relaxation of the problem. In this talk, I will survey several results on the effectiveness of the more general *convex* relaxation. We discovered that for a host of classical problems, the convex relaxation is often a more natural starting point for designing fast algorithms. Examples include submodular minimization, market equilibrium computation, and approximate Carathéodory. Our journey will involve some fun interplay of convex optimization with tools from discrete optimization and economics.
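As a toy illustration of the convex-optimization viewpoint mentioned above (this sketch is not code from the talk, and the specific algorithm is an assumption): in the approximate Carathéodory problem, one wants to express a point in the convex hull of a set of vectors as a convex combination of only a few of them. Running the Frank–Wolfe (conditional gradient) method on the convex objective "squared distance to the target" produces exactly such a sparse combination, since each iteration adds at most one new vertex.

```python
import numpy as np

def frank_wolfe_caratheodory(vertices, target, iters=200):
    """Approximate `target` (assumed to lie in conv(vertices)) by a sparse
    convex combination via the Frank-Wolfe / conditional-gradient method.
    After t iterations the iterate is a convex combination of at most
    t + 1 vertices -- an approximate-Caratheodory-style representation."""
    x = vertices[0].astype(float).copy()   # start at an arbitrary vertex
    for t in range(iters):
        grad = 2.0 * (x - target)             # gradient of ||x - target||^2
        i = int(np.argmin(vertices @ grad))   # linear minimization oracle over the hull
        gamma = 2.0 / (t + 2.0)               # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * vertices[i]
    return x

# Demo: the target is the uniform convex combination of four points in R^3.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
target = 0.25 * V.sum(axis=0)            # = (0.5, 0.5, 0.5), inside conv(V)
approx = frank_wolfe_caratheodory(V, target)
err = float(np.linalg.norm(approx - target))
```

The standard Frank–Wolfe analysis gives an error of O(D/√t) after t iterations (D the diameter of the hull), so a few hundred steps already place `approx` very close to `target` while touching only a handful of vertices.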

Speaker Details

I am a graduate student at UC Berkeley, advised by Christos Papadimitriou. Prior to Berkeley, I obtained my Bachelor's and Master's degrees from MIT under the supervision of Michel Goemans. My research includes work on algorithmic economics, convex optimization, combinatorial optimization, online algorithms, and approximation algorithms. I hold a broad interest in algorithms & complexity and their connections to related areas, and I am particularly interested in interdisciplinary research that requires a good understanding of multiple areas. During my time at Berkeley I have focused on applying convex optimization to discrete optimization and economics.

      Nikhil Devanur

      Senior Researcher