Talk Title: An Efficient Semismooth Newton Based Algorithm for Convex Clustering
Speaker: Prof. Toh Kim Chuan (National University of Singapore)
Time: August 12, 2018, 8:30-11:00
Venue: Lecture Hall 229, Building 2, School of Mathematics and Computer Science
Abstract: Clustering is a fundamental problem in unsupervised learning. Popular methods such as K-means may suffer from instability, as they are prone to getting stuck at local minima. Recently, the sum-of-norms (SON) model (also known as the clustering path), a convex relaxation of the hierarchical clustering model, has been proposed by Hocking et al. and Lindsten et al. Although numerical algorithms such as ADMM (alternating direction method of multipliers) and AMA (alternating minimization algorithm) have been proposed to solve the convex clustering model, it remains very challenging to solve large-scale problems. In this talk, we present a semismooth Newton based augmented Lagrangian algorithm for large-scale convex clustering problems, in which the second-order sparsity of the generalized Hessians associated with the subproblem in each iteration is carefully exploited to achieve high computational efficiency. Extensive numerical experiments on both simulated and real data demonstrate that our algorithm is highly efficient and robust for solving large-scale problems. Moreover, the numerical results show the superior performance and scalability of our algorithm compared with existing first-order methods. [Based on joint work with Defeng Sun and Yancheng Yuan]
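For reference, the SON convex clustering model mentioned in the abstract is commonly written in the following form (the notation below is a standard rendering of the model of Hocking et al. and Lindsten et al., not taken verbatim from the talk): given data points $a_1,\dots,a_n \in \mathbb{R}^d$, one solves
\[
\min_{x_1,\dots,x_n \in \mathbb{R}^d} \; \frac{1}{2}\sum_{i=1}^{n} \|x_i - a_i\|^2 \;+\; \lambda \sum_{i<j} w_{ij}\,\|x_i - x_j\|_p ,
\]
where $\lambda > 0$ is a regularization parameter and $w_{ij} \ge 0$ are (possibly uniform) weights. As $\lambda$ increases, more of the representatives $x_i$ coincide, and points whose representatives coincide are assigned to the same cluster, which yields the clustering path.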
Speaker Bio:
Dr Toh is a Professor in the Department of Mathematics, National University of Singapore (NUS). He obtained his BSc in Mathematics from NUS in 1990 and his PhD in Applied Mathematics from Cornell University in 1996 under the direction of Nick Trefethen.
Prof Toh was awarded the 2017 Farkas Prize of the INFORMS Optimization Society. He was selected as a Fellow of the Society for Industrial and Applied Mathematics in 2018. Together with Defeng Sun and Liuqin Yang, he received the Beale Orchard-Hays Prize for Excellence in Computational Mathematical Programming awarded by the Mathematical Optimization Society in 2018.
He is currently an Area Editor for Mathematical Programming Computation, an Associate Editor of Mathematical Programming Series B, an Associate Editor of SIAM Journal on Optimization, as well as of a few other journals. He previously served as secretary of the SIAM Activity Group on Optimization. He has been invited to speak at numerous conferences and workshops, including the SIAM Annual Meeting in 2010 and the International Symposium on Mathematical Programming in 2006, and has served on the program committees of various major conferences.
His current research focuses on designing efficient algorithms and software for convex programming, particularly large-scale optimization problems arising from machine learning and data science, as well as large-scale matrix optimization problems such as linear semidefinite programming (SDP) and convex quadratic semidefinite programming (QSDP).