WF-01: ADMM, block variants and proximality
Stream: Alternating Direction Method of Multipliers and its Applications
Room: Fermat
Chair(s): Jacek Gondzio

A block symmetric Gauss-Seidel decomposition theorem for convex quadratic programming and its applications in multi-block ADMM
Kim-Chuan Toh
For a multi-block convex composite quadratic programming (CCQP) problem with an additional nonsmooth term in the first block, we present a block symmetric Gauss-Seidel (sGS) decomposition theorem, which states that each cycle of the block sGS method is equivalent to solving the CCQP with an additional sGS proximal term constructed from the quadratic function. This theorem has played a key role in various recently developed inexact proximal ADMMs for multi-block convex conic programming. We demonstrate how our sGS-based ADMM can be applied to solve doubly nonnegative SDPs.
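The decomposition can be checked numerically. A minimal sketch (not from the talk; block size 1 is assumed, so the block-diagonal part D and the strictly block-upper part U of Q are just the diagonal and the strict upper triangle): one backward-then-forward Gauss-Seidel sweep on the quadratic q(x) = (1/2)x'Qx - b'x from the point xk coincides with the proximal step argmin q(x) + (1/2)||x - xk||_T^2, where T = U D^{-1} U' is the sGS proximal term.

```python
import numpy as np

# Toy check of the sGS decomposition theorem with scalar blocks:
# Q = U' + D + U with D diagonal, U strictly upper triangular.
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
Q = A @ A.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
xk = rng.standard_normal(n)

D = np.diag(np.diag(Q))              # (block-)diagonal part
U = np.triu(Q, 1)                    # strictly upper part

# backward sweep: (D + U) xbar = b - U' xk
xbar = np.linalg.solve(D + U, b - U.T @ xk)
# forward sweep:  (D + U') xplus = b - U xbar
xplus = np.linalg.solve(D + U.T, b - U @ xbar)

# sGS theorem: the same point from a single proximal step with T = U D^{-1} U'
T = U @ np.linalg.inv(D) @ U.T
xprox = np.linalg.solve(Q + T, b + T @ xk)

print(np.allclose(xplus, xprox))     # True
```

The identity behind it is Q + T = (D + U) D^{-1} (D + U'), which factors the proximal step into the two triangular solves above.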

A purely numerical linear algebra view on ADMM
Stefano Cipolla, Jacek Gondzio
Because of its versatility and applicability in multiple fields, the n-block alternating direction method of multipliers (ADMM) for solving non-separable convex minimization problems has recently attracted the attention of many researchers. Although the connections between ADMM and Gauss-Seidel are well known, to the best of our knowledge, an analysis from the purely numerical linear algebra point of view is lacking in the literature. The aim of this talk is to present a series of very recent results obtained on this topic which shed further light on basic issues such as convergence.
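As a minimal illustration of this viewpoint (a toy sketch, not the authors' analysis): for quadratic objectives, one full ADMM sweep is an affine map of the iterates, so convergence can be read off the spectral radius of its iteration matrix, exactly as for classical stationary iterations such as Gauss-Seidel.

```python
import numpy as np

# 2-block ADMM for  min 1/2 x'P1 x + 1/2 z'P2 z  s.t.  x = z  (scaled form).
# With no linear term the sweep (z,u) -> (z+,u+) is linear; assemble its
# matrix M column by column and inspect the spectral radius.
rng = np.random.default_rng(1)
n, rho = 4, 1.0
B1 = rng.standard_normal((n, n)); P1 = B1 @ B1.T + np.eye(n)
B2 = rng.standard_normal((n, n)); P2 = B2 @ B2.T + np.eye(n)

F1 = np.linalg.inv(P1 + rho * np.eye(n))
F2 = np.linalg.inv(P2 + rho * np.eye(n))

def sweep(z, u):
    x = F1 @ (rho * (z - u))          # x-update
    z_new = F2 @ (rho * (x + u))      # z-update
    u_new = u + x - z_new             # scaled multiplier update
    return z_new, u_new

M = np.zeros((2 * n, 2 * n))
for j in range(2 * n):
    e = np.zeros(2 * n); e[j] = 1.0
    z_new, u_new = sweep(e[:n], e[n:])
    M[:, j] = np.concatenate([z_new, u_new])

rho_M = max(abs(np.linalg.eigvals(M)))
print(rho_M < 1)                      # True: the iteration contracts here
```

For this strongly convex instance the unique fixed point is the origin and the spectral radius is below one, so the linear-algebra view certifies linear convergence.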

Primal-dual proximal methods with Bregman distances
Xin Jiang, Lieven Vandenberghe
We discuss Bregman distance extensions of the primal-dual three-operator (PD3O) and Condat-Vu proximal algorithms. When used with standard proximal operators, these algorithms contain several important methods, including ADMM, as special cases. Extensions to generalized Bregman distances are attractive if the complexity per iteration can be reduced by matching the Bregman distance to the structure in the problem. For practical implementation we introduce line search techniques in the proposed methods. The results will be illustrated with applications in semidefinite optimization.
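A small illustration of matching the Bregman distance to problem structure (a generic mirror-descent-style example, not the proposed PD3O/Condat-Vu extensions): with the entropy kernel h(x) = sum_i x_i log x_i, the Bregman proximal step over the probability simplex, argmin_x <g,x> + (1/t) D_h(x, xk), has the closed form x_i proportional to xk_i * exp(-t g_i), whereas the Euclidean prox would require a simplex projection.

```python
import numpy as np

def bregman_entropy_step(xk, g, t):
    """Entropic Bregman proximal step over the simplex (multiplicative update)."""
    w = xk * np.exp(-t * g)
    return w / w.sum()               # normalization enforces sum(x) = 1

xk = np.ones(5) / 5                  # start at the simplex center
g = np.array([0.3, -1.0, 0.5, 0.0, 2.0])
x = bregman_entropy_step(xk, g, 0.5)
print(x)                             # positive weights summing to one
```

The per-iteration cost is a single elementwise exponential and a normalization, which is the kind of reduction the abstract refers to.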

Multi-Block ADMM – Managing Randomness and Data Privacy in Optimization Algorithm Design
Yinyu Ye
Randomization strategies have been widely used in designing optimization algorithms to achieve properties that deterministic algorithms cannot, as in SGD, BCD, reinforcement learning, etc. In this talk, we show how randomized techniques help in greatly improving the efficiency of the multi-block alternating direction method of multipliers. On the other hand, we illustrate that too much randomness may be harmful to the algorithm convergence. Therefore, randomness needs to be carefully managed to take advantage of the strategy. In addition, we discuss how to implement ADMM while preserving data privacy.
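The phenomenon behind this line of work can be sketched on the well-known 3-block counterexample of Chen, He, Ye and Yuan (a hedged illustration; beta = 1 and the matrix below are assumed from that example): cyclic 3-block ADMM applied to the feasibility problem min 0 s.t. A1 x1 + A2 x2 + A3 x3 = 0 is a linear fixed-point iteration whose matrix has spectral radius larger than one, i.e. it diverges, while randomly permuting the block order at every sweep restores convergence in expectation.

```python
import numpy as np

# Counterexample matrix (columns are the three scalar blocks).
A = np.array([[1., 1., 1.],
              [1., 1., 2.],
              [1., 2., 2.]])
beta = 1.0

def cyclic_sweep(x, lam):
    """One cyclic 3-block ADMM sweep (objective 0, constraint A x = 0)."""
    x = x.copy()
    for i in range(3):
        a = A[:, i]
        r = A @ x - a * x[i] + lam / beta   # residual excluding block i
        x[i] = -(a @ r) / (a @ a)           # exact block minimization
    lam = lam + beta * (A @ x)              # multiplier update
    return x, lam

# The sweep is linear in (x, lam); build its 6x6 matrix from basis vectors.
M = np.zeros((6, 6))
for j in range(6):
    e = np.zeros(6); e[j] = 1.0
    x, lam = cyclic_sweep(e[:3], e[3:])
    M[:, j] = np.concatenate([x, lam])

print(max(abs(np.linalg.eigvals(M))) > 1)   # spectral radius > 1: divergence
```

Replacing the fixed order `range(3)` with a fresh random permutation per sweep gives the randomly permuted variant whose expected iteration is convergent.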