Name of Speaker: Prof Toh Kim Chuan
Schedule: Friday, 15 January 2021, 10am
Link: https://nus-sg.zoom.us/j/83515146165?pwd=eUpLZm5NWSs0RUpxTU5jV3JTeFQ5UT09
ID: 835 1514 6165
Password: 700968
Title: An augmented Lagrangian method with constraint generations for shape-constrained convex regression problems
Abstract: The shape-constrained convex regression problem deals with fitting a convex function to observed data under additional constraints, such as component-wise monotonicity and uniform Lipschitz continuity. This talk presents a unified framework for computing the least squares estimator of a multivariate shape-constrained convex regression function in $\mathbb{R}^d$. We prove that the least squares estimator is computable by solving a constrained convex quadratic programming (QP) problem with $(d+1)n$ variables, $n(n-1)$ linear inequality constraints and $n$ possibly non-polyhedral inequality constraints, where $n$ is the number of data points. To efficiently solve the generally very large-scale convex QP, we design a proximal augmented Lagrangian method ({\tt pALM}) whose subproblems are solved by the semismooth Newton method ({\tt SSN}). To further accelerate the computation when $n$ is huge, we design a practical implementation of the constraint generation method such that each reduced problem is efficiently solved by our proposed {\tt pALM}. Comprehensive numerical experiments, including those in the pricing of basket options and the estimation of production functions in economics, demonstrate that our proposed {\tt pALM} outperforms the state-of-the-art algorithms, and the proposed acceleration technique further shortens the computation time by a large margin.   [This talk is based on joint work with Meixia Lin and Defeng Sun.]
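To make the QP in the abstract concrete, the following is a minimal sketch of the least squares convex regression formulation for $d = 1$: the decision variables are fitted values $\theta_i$ and subgradients $g_i$ at each data point, linked by the $n(n-1)$ convexity constraints $\theta_j \ge \theta_i + g_i(x_j - x_i)$. It is solved here with SciPy's general-purpose SLSQP solver on synthetic data, not with the {\tt pALM}/{\tt SSN} method of the talk; all names and the toy data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)
n = 12
x = np.sort(rng.uniform(-1.0, 1.0, n))
y = x**2 + 0.05 * rng.normal(size=n)   # noisy samples of a convex function

# Decision vector z = (theta_1..theta_n, g_1..g_n):
# theta_i = fitted value at x_i, g_i = subgradient at x_i.
# Convexity constraints theta_j >= theta_i + g_i*(x_j - x_i) for all i != j,
# written as A z <= 0.
rows = []
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        a = np.zeros(2 * n)
        a[i] += 1.0                 # +theta_i
        a[j] -= 1.0                 # -theta_j
        a[n + i] += x[j] - x[i]     # +g_i*(x_j - x_i)
        rows.append(a)
A = np.array(rows)
cons = LinearConstraint(A, -np.inf, np.zeros(A.shape[0]))

def obj(z):
    # Least squares objective 0.5 * ||theta - y||^2
    r = z[:n] - y
    return 0.5 * r @ r

def grad(z):
    g = np.zeros(2 * n)
    g[:n] = z[:n] - y
    return g

z0 = np.concatenate([y, np.gradient(y, x)])  # warm start at the data
res = minimize(obj, z0, jac=grad, constraints=[cons], method="SLSQP")
theta, g = res.x[:n], res.x[n:]
```

With the constraints satisfied, the recovered subgradients `g` are nondecreasing along the sorted `x`, which is exactly the convexity of the fitted function in one dimension. The talk's contribution is making this QP tractable when $n$ is large, where the $O(n^2)$ constraint count defeats off-the-shelf solvers like the one used here.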
About the Speaker: Kim-Chuan Toh is a Professor at the Department of Mathematics, National University of Singapore (NUS). He obtained his BSc degree in Mathematics from NUS and his PhD degree in Applied Mathematics from Cornell University.

His current research focuses on designing efficient algorithms and software for convex programming and its applications, particularly large-scale optimization problems arising from data science and machine learning, and large-scale matrix optimization problems such as linear semidefinite programming (SDP) and convex quadratic semidefinite programming (QSDP).

He is currently an Area Editor for Mathematical Programming Computation and an Associate Editor for the SIAM Journal on Optimization, Mathematical Programming Series B, and ACM Transactions on Mathematical Software. He received the Farkas Prize awarded by the INFORMS Optimization Society in 2017 and the triennial Beale-Orchard-Hays Prize awarded by the Mathematical Optimization Society in 2018. He was elected a Fellow of the Society for Industrial and Applied Mathematics in 2018.