
Mathematics Department Seminar No. 2065: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Created: 2020/12/16 by Gong Huiying

    Mathematics Department Seminar No. 2065

Topic: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Speaker: Prof. Bian Wei (Harbin Institute of Technology)

Time: 9:30, Monday, December 21, 2020

Venue: G507

Host: Xu Zi

Organizer: Department of Mathematics, College of Sciences

Abstract: We first investigate a class of constrained sparse regression problems with a cardinality penalty, where the feasible set is a box constraint and the loss function is convex but not necessarily differentiable. We put forward a smoothing fast iterative hard thresholding (SFIHT) algorithm for solving such optimization problems, which combines smoothing approximations, extrapolation techniques, and iterative hard thresholding methods, with the extrapolation coefficients required to satisfy a suitable condition. We establish that any accumulation point of the iterative sequence is a local minimizer of the original cardinality penalty problem. We then consider the case where the loss function is differentiable, and propose a fast iterative hard thresholding (FIHT) algorithm to solve such problems. We prove that the iterates converge to a local minimizer of the problem with a lower bound property; in particular, we establish a convergence rate for the corresponding objective function value sequence. Finally, we present some numerical examples to illustrate the theoretical results.
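To give a flavor of the method described in the abstract, the sketch below shows a generic iterative hard thresholding step with extrapolation and box projection for min f(x) + lam*||x||_0 over a box, assuming f is smooth with gradient `grad_f`. This is only an illustration of the general technique, not the speaker's exact SFIHT/FIHT algorithm: the extrapolation coefficient `beta`, the step size, and the order of projection and thresholding are simplifying assumptions (in particular, applying the threshold after clipping is only valid when the box contains zero).

```python
import numpy as np

def hard_threshold(x, lam, step):
    # Prox of step*lam*||x||_0: zero out entries with magnitude below sqrt(2*lam*step)
    y = x.copy()
    y[np.abs(y) < np.sqrt(2.0 * lam * step)] = 0.0
    return y

def iht_with_extrapolation(grad_f, x0, lam, step,
                           lower=-1.0, upper=1.0, beta=0.5, iters=200):
    """Illustrative iterative hard thresholding with extrapolation (not the exact FIHT)."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)   # extrapolation using the previous two iterates
        z = y - step * grad_f(y)      # gradient step on the smooth loss
        z = np.clip(z, lower, upper)  # projection onto the box constraint
        x_prev, x = x, hard_threshold(z, lam, step)
    return x

# Tiny identity-design example: the gradient of 0.5*||x - b||^2 is x - b
b = np.array([1.0, 0.01, 0.0, 0.5, 0.02])
x_hat = iht_with_extrapolation(lambda x: x - b, np.zeros(5), lam=0.01, step=1.0)
# x_hat keeps the two large entries of b and zeros out the small ones
```

With lam=0.01 and step=1.0 the threshold is sqrt(0.02), so the entries 0.01, 0.0, and 0.02 are set exactly to zero while 1.0 and 0.5 survive, which is the sparsifying effect the cardinality penalty is meant to produce.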



Faculty and students are welcome to attend!

Previous: Mathematics Department Seminar No. 2066: Nonlinear optimization techniques in wireless communications

Next: Mathematics Department Seminar No. 2064: Linearized Proximal Algorithms for Convex Composite Optimization with Applications

