A Case for using Trend Filtering over Splines

  • Aaditya Ramdas | Carnegie Mellon University

This talk will be about fast optimization algorithms (specifically, specialized ADMM) for a common practical problem – estimating piecewise constant/linear/quadratic fits to time series data. I will first introduce Trend Filtering, a tool recently proposed for this problem by Kim, Koh, Boyd and Gorinevsky (2009), and compare it to the popular smoothing splines and locally adaptive regression splines. Tibshirani (2014) showed that trend filtering estimates converge at the minimax optimal rate if the true underlying function (or its derivatives) has bounded total variation. Hence, the only roadblock to using it in practice is having robust and efficient algorithms. We take a major step toward overcoming this problem by providing a more efficient and robust solution than the interior point methods currently in use. Furthermore, the proposed ADMM implementation is very simple and, importantly, flexible enough to extend to many interesting related problems, such as sparse trend filtering and isotonic trend filtering. Software for our method will be made freely available, written in C++, and also in R (see the trendfilter function in the R package genlasso).

Speaker Details

Aaditya Ramdas (http://www.cs.cmu.edu/~aramdas/) is a fifth-year PhD student in Machine Learning and Statistics at Carnegie Mellon University, advised by Larry Wasserman and Aarti Singh, working at the intersection of computation and statistics. His thesis work is more theoretical, focusing on high-dimensional statistics and optimization theory for regression, classification and hypothesis testing. In an earlier life, he completed his undergraduate degree in computer science at IIT Bombay and worked at a hedge fund, Tower Research.