Candidate Talk: Approximating the optimum: Efficient algorithms and their limits
- Prasad Nagaraj Raghavendra | PhD Candidate at the University of Washington
Most combinatorial optimization problems of interest are NP-hard to solve exactly. To cope with this intractability, one settles for approximation algorithms with provable guarantees on the quality of the approximation. Despite great success in designing approximation algorithms, a vast majority of this work relies on the technique of linear programming or, more generally, semi-definite programming. This poses a natural question: How far can we push the technique of semi-definite programming? Are there techniques that yield better approximations than semi-definite programming?
In this work, we show that, under the Unique Games Conjecture, the simplest semi-definite programs yield the best approximation for large classes of optimization problems, ranging from constraint satisfaction problems (CSPs) and graph labeling problems to ordering CSPs and certain clustering problems. In a surprising convergence of algorithms and hardness results, the same work also leads to a generic algorithm that achieves the optimal approximation for every constraint satisfaction problem.
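A classic instance of the "simplest semi-definite programs" the abstract alludes to is the Goemans-Williamson algorithm for MAX-CUT: solve an SDP that assigns a unit vector to each vertex, then round the vectors to a cut with a random hyperplane. The sketch below shows only the rounding step; it assumes the per-vertex vectors are already available (here a hand-picked toy embedding stands in for an actual SDP solution), and all function names are illustrative, not from the speaker's work.

```python
import random

def hyperplane_round(vectors):
    """Round unit vectors (one per vertex) to a +/-1 assignment by the
    sign of their projection onto a random Gaussian direction."""
    dim = len(next(iter(vectors.values())))
    g = [random.gauss(0.0, 1.0) for _ in range(dim)]
    return {v: 1 if sum(x * y for x, y in zip(vec, g)) >= 0 else -1
            for v, vec in vectors.items()}

def cut_value(edges, assignment):
    """Number of edges crossing the cut."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

# Toy example: a 4-cycle. In the real algorithm the vectors come from
# solving the MAX-CUT SDP relaxation; here we hand-pick an embedding
# where adjacent vertices point in opposite directions (hypothetical data).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
vectors = {0: (1.0, 0.0), 1: (-1.0, 0.0), 2: (1.0, 0.0), 3: (-1.0, 0.0)}

random.seed(0)
best = max(cut_value(edges, hyperplane_round(vectors)) for _ in range(20))
print(best)  # with this embedding every hyperplane cuts all 4 edges
```

The thesis result generalizes this pattern: a single generic SDP relaxation plus a suitable rounding scheme achieves the optimal approximation ratio for every CSP, assuming the Unique Games Conjecture.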
Speaker Details
Prasad Raghavendra obtained his bachelor's degree at the Indian Institute of Technology Madras and is currently a PhD candidate at the University of Washington. His research spans many areas of theoretical computer science, including approximation algorithms and inapproximability results for NP-hard optimization problems, error-correcting codes, metric embeddings, computational learning, and complexity. He is the recipient of the Best Paper and Best Student Paper awards at STOC 2008.