About

I am a principal researcher at Microsoft Research, located in Redmond, WA. My current research interests include theory and algorithms for large-scale optimization, stochastic and online algorithms for machine learning, and parallel and distributed computing.

Projects

Foundations of Optimization

Optimization methods are the engine of machine learning algorithms. Examples abound, such as training neural networks with stochastic gradient descent, segmenting images with submodular optimization, or efficiently searching a game tree with bandit algorithms. We aim to advance the mathematical foundations of both discrete and continuous optimization and to leverage these advances to develop new algorithms with a broad set of AI applications. Some of the current directions pursued by our members include convex optimization,…
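As a concrete illustration of the first example in this paragraph, below is a minimal sketch of stochastic gradient descent on a least-squares objective. The data, step size, and epoch count are hypothetical and chosen only to show the sampled-gradient update rule, not any particular method studied by the group.

```python
import numpy as np

def sgd_least_squares(A, b, step_size=0.01, epochs=20, seed=0):
    """Minimize (1/2n) * ||A x - b||^2 by sampling one row per update."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Stochastic gradient of the i-th term: a_i (a_i^T x - b_i)
            g = A[i] * (A[i] @ x - b[i])
            x -= step_size * g
    return x

# Tiny synthetic check (hypothetical data).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(np.linalg.norm(sgd_least_squares(A, b) - x_true))
```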

PACORA

Established: February 17, 2016

PACORA (Performance-Aware Convex Optimization for Resource Allocation) is a resource allocation framework for general-purpose operating and cloud systems. It is designed to provide responsiveness guarantees to a simultaneous mix of high-throughput parallel, interactive, and real-time applications, allocating resources in an efficient and scalable manner without sacrificing responsiveness or performance.
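To illustrate the general idea of allocating a fixed resource budget across applications via convex optimization, here is a toy sketch. The penalty model (a weighted 1/x response-time penalty per application), the weights, and the solver are assumptions made for illustration only; they are not PACORA's actual models or algorithms.

```python
import numpy as np
from scipy.optimize import minimize

# Toy model (hypothetical, not PACORA's formulation): application i's
# response-time penalty is w[i] / x[i], a convex, decreasing function of its
# resource share x[i]; shares must be nonnegative and sum to the budget.
w = np.array([1.0, 4.0, 2.0])   # hypothetical per-application weights
budget = 10.0

def total_penalty(x):
    return np.sum(w / x)

result = minimize(
    total_penalty,
    x0=np.full(len(w), budget / len(w)),          # start from an even split
    bounds=[(1e-6, budget)] * len(w),             # keep shares strictly positive
    constraints=[{"type": "eq", "fun": lambda x: np.sum(x) - budget}],
    method="SLSQP",
)
print("optimal shares:", np.round(result.x, 3))   # proportional to sqrt(w)
```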

Publications


Downloads

Proximal-Gradient Homotopy Method for Sparse Least Squares

March 2012
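Below is a minimal sketch of the general proximal-gradient homotopy idea for sparse least squares: run proximal-gradient (soft-thresholding) iterations while decreasing the regularization parameter geometrically from its largest useful value to the target, warm-starting each stage. The schedule, iteration counts, and data are illustrative assumptions, not the parameters or stopping rules from the released code or paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_homotopy(A, b, lam_target, stages=10, iters_per_stage=100):
    """Sparse least squares: min 0.5*||A x - b||^2 + lam_target*||x||_1.

    Starts from a large regularization parameter and decreases it
    geometrically, warm-starting proximal-gradient (ISTA) iterations
    at each stage.  Schedule and iteration counts are illustrative.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L for the smooth part
    lam = np.max(np.abs(A.T @ b))                   # lam at which x = 0 is optimal
    decay = (lam_target / lam) ** (1.0 / stages)
    x = np.zeros(A.shape[1])
    for _ in range(stages):
        lam = max(lam * decay, lam_target)
        for _ in range(iters_per_stage):
            grad = A.T @ (A @ x - b)
            x = soft_threshold(x - step * grad, step * lam)
    return x

# Tiny synthetic check (hypothetical data): recover a sparse vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50); x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(np.linalg.norm(prox_grad_homotopy(A, b, lam_target=0.1) - x_true))
```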
