Minimax Theory and its Applications 06 (2021), No. 1, 001--024
Copyright Heldermann Verlag 2021



Fast Inertial Proximal ADMM Algorithms for Convex Structured Optimization with Linear Constraint

Hedy Attouch
IMAG, Université Montpellier, 34095 Montpellier, France
hedy.attouch@umontpellier.fr



In a Hilbert space setting, we analyze the convergence properties of a new class of proximal ADMM algorithms with inertial features. They aim to quickly solve convex structured minimization problems with a linear constraint. As a basic ingredient, we use the maximally monotone operator M associated with the Lagrangian formulation of the problem. We specialize to this operator the inertial proximal algorithm recently introduced by Attouch and Peypouquet [Convergence of inertial dynamics and proximal algorithms governed by maximal monotone operators, Math. Programming 174 (2019) 391--432] to solve general monotone inclusions. This yields an inertial proximal ADMM algorithm whose extrapolation step takes into account recent advances concerning Nesterov's accelerated gradient method. Based on an appropriate adjustment of the viscosity and proximal parameters, we analyze the fast convergence properties of the algorithm, as well as the convergence of the iterates to saddle points of the Lagrangian function. As a perspective, we outline a new direction of research, linked to the introduction of Hessian damping into the algorithms.
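The generic inertial proximal iteration underlying this approach, x_{k+1} = (I + lam M)^{-1}(x_k + alpha_k (x_k - x_{k-1})), can be sketched numerically. The following toy example is an illustration only, not the paper's ADMM scheme: it takes M as the gradient of a strongly convex quadratic, so the resolvent has a closed form, and uses a Nesterov-style extrapolation coefficient alpha_k = k/(k+3). The quadratic data, the constant proximal parameter lam, and the iteration count are all assumptions made for the demo.

```python
import numpy as np

def inertial_proximal(Q, c, iters=200, lam=1.0):
    """Illustrative inertial proximal iteration for M = grad f,
    f(x) = 0.5 x^T Q x - c^T x (Q symmetric positive definite).
    The unique zero of M, i.e. the minimizer, is x* = Q^{-1} c."""
    n = len(c)
    x_prev = np.zeros(n)
    x = np.zeros(n)
    # Resolvent of M: solving x + lam*(Q x - c) = y gives
    # x = (I + lam Q)^{-1} (y + lam c).
    R = np.linalg.inv(np.eye(n) + lam * Q)
    for k in range(1, iters + 1):
        alpha = k / (k + 3.0)            # Nesterov-style extrapolation coefficient
        y = x + alpha * (x - x_prev)     # inertial (extrapolation) step
        x_prev, x = x, R @ (y + lam * c) # proximal (resolvent) step
    return x

Q = np.array([[2.0, 0.0], [0.0, 5.0]])
c = np.array([1.0, 1.0])
x = inertial_proximal(Q, c)
# converges to x* = Q^{-1} c = [0.5, 0.2]
```

In the paper's setting the resolvent is taken with respect to the maximally monotone operator attached to the Lagrangian, so each proximal step splits into the primal and multiplier updates of ADMM rather than a single linear solve.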

Keywords: Convex structured optimization, linear constraint, Lagrange multipliers, maximally monotone operators, proximal ADMM, inertial methods, Nesterov accelerated method, Hessian damping.

MSC: 37N40, 46N10, 49M30, 65K05, 65K10, 90B50, 90C25.
