
Journal of Convex Analysis 17 (2010), No. 3&4, 765--780
Copyright Heldermann Verlag 2010



On some Curvature-Dependent Steplength for the Gradient Method

Bruno Baji
Dép. de Mathématiques, Université Montpellier, Place Eugène Bataillon, 34095 Montpellier 05, France
baji19@free.fr

Alexandre Cabot
Dép. de Mathématiques, Université Montpellier, Place Eugène Bataillon, 34095 Montpellier 05, France
acabot@math.univ-montp2.fr




The aim of this paper is to demonstrate the benefit of taking curvature into account in gradient methods. More precisely, given a Hilbert space $H$ and a strictly convex function $\phi:H\to {\mathbb{R}}$ of class ${\mathcal C}^2$, we consider the following algorithm $$x_{n+1}=x_n-\lambda_n\, \nabla \phi(x_n),\quad \mbox{ with } \lambda_n = \frac{|\nabla \phi(x_n)|^2}{\langle\nabla^2\phi(x_n).\nabla\phi(x_n), \nabla\phi(x_n)\rangle}.\leqno (\star)$$ We establish linear convergence results for this algorithm, even in the absence of strong convexity. Some variants of $(\star)$ are also considered, with different expressions for the curvature-dependent steplength $\lambda_n$. A large part of the paper is devoted to the study of an implicit version of $(\star)$, which falls within the framework of the proximal point iteration. All of these algorithms are closely related to the Barzilai-Borwein method, and numerical illustrations at the end of the paper allow the different schemes to be compared.
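As an illustration of the iteration $(\star)$, the following sketch applies it to a strongly convex quadratic $\phi(x)=\frac{1}{2}x^{\top}Ax-b^{\top}x$ with $A$ symmetric positive definite, for which $\nabla\phi(x)=Ax-b$ and $\nabla^2\phi(x)=A$, so that $\lambda_n = |g_n|^2/\langle Ag_n,g_n\rangle$ (the classical Cauchy exact line-search step). This is only a minimal finite-dimensional sketch; the function names, the quadratic test problem, and the stopping rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def curvature_step_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Gradient method with the curvature-dependent steplength (*),
    specialized to phi(x) = 0.5 x^T A x - b^T x with A symmetric
    positive definite, so grad phi(x) = A x - b and Hess phi = A."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = A @ x - b                     # gradient at the current iterate
        gnorm2 = g @ g
        if gnorm2 < tol ** 2:             # stop when the gradient is small
            break
        lam = gnorm2 / (g @ (A @ g))      # lambda_n = |g|^2 / <Hess g, g>
        x = x - lam * g                   # explicit gradient step
    return x

# Small illustrative run: the minimizer of phi solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = curvature_step_gradient(A, b, x0=np.zeros(2))
```

For this quadratic the iterate converges linearly to the solution of $Ax=b$, in line with the linear convergence results stated above.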

Keywords: Unconstrained convex optimization, steepest descent, gradient method, proximal point algorithm, Barzilai-Borwein stepsize.

MSC: 65K10, 90C25, 49M25
