
Journal of Convex Analysis 17 (2010), No. 3&4, 765-780
Copyright Heldermann Verlag 2010

On Some Curvature-Dependent Steplength for the Gradient Method

Bruno Baji
Dép. de Mathématiques, Université Montpellier, Place Eugène Bataillon, 34095 Montpellier 05, France
baji19@free.fr

Alexandre Cabot
Dép. de Mathématiques, Université Montpellier, Place Eugène Bataillon, 34095 Montpellier 05, France
acabot@math.univ-montp2.fr

Abstract: The aim of this paper is to demonstrate the benefit of taking curvature into account in gradient methods. More precisely, given a Hilbert space $H$ and a strictly convex function $\phi: H \to \mathbb{R}$ of class $\mathcal{C}^2$, we consider the following algorithm
$$x_{n+1} = x_n - \lambda_n\, \nabla\phi(x_n), \quad \mbox{with } \lambda_n = \frac{\|\nabla\phi(x_n)\|^2}{\langle \nabla^2\phi(x_n)\, \nabla\phi(x_n),\, \nabla\phi(x_n)\rangle}. \leqno (\star)$$
We obtain linear convergence results for the above algorithm, even without strong convexity. Some variants of $(\star)$ are also considered, with different expressions for the curvature-dependent steplength $\lambda_n$. A large part of the paper is devoted to the study of an implicit version of $(\star)$, which falls within the field of proximal point iterations. All these algorithms are closely related to the Barzilai-Borwein method, and numerical illustrations at the end of the paper allow a comparison of these different schemes.

Keywords: Unconstrained convex optimization, steepest descent, gradient method, proximal point algorithm, Barzilai-Borwein stepsize.

MSC: 65K10, 90C25, 49M25
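The explicit scheme $(\star)$ can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the function names, the tolerance, and the iteration cap are assumptions, and the steplength is computed exactly as the quotient $\|\nabla\phi(x_n)\|^2 / \langle \nabla^2\phi(x_n)\,\nabla\phi(x_n), \nabla\phi(x_n)\rangle$ given in the abstract.

```python
import numpy as np

def curvature_steplength_gradient_method(grad, hess, x0, tol=1e-10, max_iter=500):
    """Gradient method with the curvature-dependent steplength of (star):
    lambda_n = ||g||^2 / <H(x_n) g, g>, where g = grad phi(x_n).
    (Illustrative sketch; names and stopping rule are assumptions.)"""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop once the gradient is tiny
            break
        Hg = hess(x) @ g
        lam = (g @ g) / (g @ Hg)      # curvature-dependent steplength
        x = x - lam * g               # explicit gradient step
    return x

# Example: phi(x) = 0.5 x^T A x with A symmetric positive definite;
# the unique minimizer is the origin.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
grad = lambda x: A @ x
hess = lambda x: A
x_star = curvature_steplength_gradient_method(grad, hess, [1.0, -1.0])
```

For a quadratic $\phi(x) = \frac{1}{2}\langle Ax, x\rangle$ this steplength coincides with the exact line-search (Cauchy) step, so the iterates converge linearly to the minimizer, consistent with the linear convergence results announced above.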