BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IORA - Institute of Operations Research and Analytics - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IORA - Institute of Operations Research and Analytics
X-ORIGINAL-URL:https://iora.nus.edu.sg
X-WR-CALDESC:Events for IORA - Institute of Operations Research and Analytics
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Asia/Singapore
BEGIN:STANDARD
TZOFFSETFROM:+0800
TZOFFSETTO:+0800
TZNAME:+08
DTSTART:20200101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Asia/Singapore:20210115T100000
DTEND;TZID=Asia/Singapore:20210116T110000
DTSTAMP:20260417T173843Z
CREATED:20210112T090755Z
LAST-MODIFIED:20210303T030021Z
UID:5521-1610704800-1610794800@iora.nus.edu.sg
SUMMARY:IORA Seminar Series | 15 Jan | 10am
DESCRIPTION:Name of Speaker\n  Prof Toh Kim Chuan\n\n\nSchedule\n  Friday 15 January 2021\, 10am\n\n\nLink\nhttps://nus-sg.zoom.us/j/83515146165?pwd=eUpLZm5NWSs0RUpxTU5jV3JTeFQ5UT09\n\n\nID\n835 1514 6165\n\n\nPassword\n700968\n\n\nTitle\n  An augmented Lagrangian method with constraint generations for shape-constrained convex regression problems\n\n\nAbstract\nThe shape-constrained convex regression problem deals with fitting a convex function to observed data under additional constraints\, such as component-wise monotonicity and uniform Lipschitz continuity. This talk presents a unified framework for computing the least squares estimator of a multivariate shape-constrained convex regression function in R^d. We prove that the least squares estimator is computable by solving a constrained convex quadratic programming (QP) problem with (n+1)d variables\, n(n-1) linear inequality constraints and n possibly non-polyhedral inequality constraints\, where n is the number of data points. To efficiently solve this generally very large-scale convex QP\, we design a proximal augmented Lagrangian method (pALM) whose subproblems are solved by the semismooth Newton method (SSN). To further accelerate the computation when n is huge\, we design a practical implementation of the constraint generation method such that each reduced problem is efficiently solved by our proposed pALM. Comprehensive numerical experiments\, including the pricing of basket options and the estimation of production functions in economics\, demonstrate that our proposed pALM outperforms state-of-the-art algorithms\, and that the proposed acceleration technique further shortens the computation time by a large margin.\n[This talk is based on joint work with Meixia Lin and Defeng Sun.]\n\n\nAbout the Speaker\nKim-Chuan Toh is a Professor at the Department of Mathematics\, National University of Singapore (NUS). He obtained his BSc degree in Mathematics from NUS and his PhD degree in Applied Mathematics from Cornell University. His current research focuses on designing efficient algorithms and software for convex programming and its applications\, particularly large-scale optimization problems arising from data science and machine learning\, and large-scale matrix optimization problems such as linear semidefinite programming (SDP) and convex quadratic semidefinite programming (QSDP). He is currently an Area Editor for Mathematical Programming Computation and an Associate Editor for the SIAM Journal on Optimization\, Mathematical Programming Series B\, and ACM Transactions on Mathematical Software. He received the Farkas Prize awarded by the INFORMS Optimization Society in 2017 and the triennial Beale–Orchard-Hays Prize awarded by the Mathematical Optimization Society in 2018. He was elected a Fellow of the Society for Industrial and Applied Mathematics in 2018.
URL:https://iora.nus.edu.sg/events/iora-seminar-series-15-jan-10am/
CATEGORIES:IORA Seminar Series
ATTACH;FMTTYPE=image/jpeg:https://iora.nus.edu.sg/wp-content/uploads/2021/01/Toh-Kim-Chuan-320x320-1.jpg
END:VEVENT
END:VCALENDAR