Gradient Methods with Approximately Optimal Stepsizes
Posted: 2018-10-10 15:20

Title: Gradient Methods with Approximately Optimal Stepsizes

Speaker: Dr. Zexian Liu, Xidian University

Host: School of Mathematics and Finance

Time: Friday, October 12, 2018, 15:00-16:00

Venue: Room C303, Zhijin Building

Abstract:

In this talk, we view the Barzilai-Borwein method from a new angle and introduce a new type of stepsize, the approximately optimal stepsize, for gradient methods. Remarkably, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes (GM_AOS). We develop an efficient gradient method with approximately optimal stepsize for strictly convex quadratic minimization and establish its convergence and R-linear convergence. We then extend the gradient method with approximately optimal stepsize to general unconstrained optimization, present an efficient gradient method with approximately optimal stepsizes based on a conic model (GM_AOS(cone)), and establish its convergence. Numerical results suggest that GM_AOS(cone) outperforms some well-known conjugate gradient software packages on the problem collection of Andrei and on the CUTEr collection. Owing to its simple search direction, the simple Armijo line search used, and its surprising numerical performance, we believe GM_AOS can become a strong candidate for large-scale unconstrained optimization and will find wide application in many fields.
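For readers unfamiliar with the starting point of the talk, the following is a minimal sketch of the classical Barzilai-Borwein gradient method on a strictly convex quadratic, the setting the abstract mentions first. This is only an illustration of the standard BB1 stepsize, not the speaker's GM_AOS method; the matrix, tolerances, and function names below are illustrative assumptions.

```python
import numpy as np

def bb_gradient_method(A, b, x0, tol=1e-8, max_iter=1000):
    """Classical Barzilai-Borwein method for the strictly convex
    quadratic f(x) = 0.5 x^T A x - b^T x, with gradient g = A x - b.
    Illustrative sketch only; GM_AOS refines the stepsize choice
    beyond the plain BB1 rule used here."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)  # crude initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize: least-squares fit of alpha^{-1} I to the Hessian,
        # i.e. alpha = (s^T s) / (s^T y); s^T y > 0 since A is SPD
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Small symmetric positive definite example (assumed data)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = bb_gradient_method(A, b, np.zeros(2))  # approaches A^{-1} b
```

Note the method is typically nonmonotone (the objective need not decrease every iteration), yet it converges on strictly convex quadratics; the talk's claimed contribution is reinterpreting such stepsizes as "approximately optimal" and extending the idea, via a conic model, to general unconstrained problems.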