In this talk, we present a fast and robust algorithmic framework, SSNAL, for solving large-scale lasso problems. SSNAL is a semismooth-Newton-based augmented Lagrangian algorithmic framework. We show that for lasso problems, both the primal and dual iteration sequences generated by SSNAL possess a remarkably fast linear convergence rate, which can even be asymptotically superlinear. We also conduct a variational analysis of the second-order sparsity structure of the underlying problems and propose efficient numerical techniques to exploit this structure in our algorithm. Numerical comparisons between our approach and state-of-the-art solvers on real data sets are presented to demonstrate the high efficiency and robustness of our proposed algorithm in solving difficult large-scale lasso problems. For example, for a problem with over 4 million features and 16,000 samples, SSNAL solves it in 20 seconds, while the best alternative solver takes 2,400 seconds. This talk is based on joint work with Professor Sun Defeng and Dr. Li Xudong.
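For readers unfamiliar with the problem class, the lasso minimizes a least-squares loss plus an l1 penalty, min_x 0.5*||Ax - b||^2 + lambda*||x||_1. The sketch below solves a small synthetic instance with plain proximal gradient descent (ISTA) in NumPy; this is only a first-order illustration of the problem being solved, not the second-order SSNAL method discussed in the talk, and all variable names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=500):
    # Minimize 0.5*||A x - b||^2 + lam*||x||_1 by proximal gradient (ISTA).
    # Step size 1/L, where L = ||A||_2^2 is a Lipschitz constant of the
    # gradient of the smooth part.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Small synthetic instance: sparse ground truth, noisy observations.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 3.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = lasso_ista(A, b, lam=1.0)
```

The l1 term drives most coordinates of `x_hat` exactly to zero; it is this induced sparsity (and the resulting second-order sparsity structure) that SSNAL exploits to scale to millions of features.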
Prof Toh is from the Department of Mathematics, in addition to his role as IORA Research Director. His research mainly focuses on the design, analysis, and implementation of robust and efficient algorithms for convex programming, particularly large-scale linear and convex quadratic semidefinite programming and their generalizations.
He developed the general-purpose software SDPT3 for solving medium-scale SDPs; SDPT3 is currently used as the computational engine in the convex optimization modeling language CVX. He has also worked on designing efficient preconditioned iterative methods for large-scale linear systems arising from finite element discretization of soil-structure interaction problems.
Please refer to one of his publications, "A Newton-CG Augmented Lagrangian Method for Semidefinite Programming".