Abstract
In this paper, we focus on the application of the Peaceman-Rachford splitting method (PRSM) to a convex minimization model with linear constraints and a separable objective function. Compared to the Douglas-Rachford splitting method (DRSM), another splitting method from which the alternating direction method of multipliers originates, PRSM requires more restrictive assumptions to ensure its convergence, while it is faster whenever it is convergent. We first illustrate that the reason for this difference is that the iterative sequence generated by DRSM is strictly contractive, while that generated by PRSM is only contractive with respect to the solution set of the model. With only the convexity assumption on the objective function of the model under consideration, the convergence of PRSM is not guaranteed. But for this case, we show that the first t iterations of PRSM still enable us to find an approximate solution with an accuracy of O(1/t). A worst-case O(1/t) convergence rate of PRSM in the ergodic sense is thus established under mild assumptions. After that, we suggest attaching an underdetermined relaxation factor to PRSM to guarantee the strict contraction of its iterative sequence and thus propose a strictly contractive PRSM. A worst-case O(1/t) convergence rate of this strictly contractive PRSM in a nonergodic sense is established. We show the numerical efficiency of the strictly contractive PRSM by some applications in statistical learning and image processing.
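To make the splitting scheme concrete, below is a minimal sketch (not the paper's implementation) of the strictly contractive PRSM iteration for the separable model min f(x) + g(y) subject to Ax + By = b, specialized to an illustrative instance with A = I, B = -I, b = 0, f(x) = tau*||x||_1 and g(y) = 0.5*||y - c||^2 so that both subproblems have closed forms. The test problem, the penalty beta, the relaxation factor alpha, and all variable names are assumptions chosen for illustration only.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (closed-form x-subproblem)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def strictly_contractive_prsm(c, tau, beta=1.0, alpha=0.9, iters=200):
    """Sketch of the strictly contractive PRSM for
        min_{x,y}  tau*||x||_1 + 0.5*||y - c||^2   s.t.  x - y = 0,
    i.e. A = I, B = -I, b = 0.  Here alpha in (0, 1) plays the role of the
    relaxation factor attached to the multiplier updates; alpha = 1 would
    correspond to the original PRSM.  All parameter values are illustrative."""
    x = np.zeros_like(c)
    y = np.zeros_like(c)
    lam = np.zeros_like(c)            # Lagrange multiplier for x - y = 0
    for _ in range(iters):
        # x-subproblem: prox of (tau/beta)*||.||_1 evaluated at y + lam/beta
        x = soft_threshold(y + lam / beta, tau / beta)
        # first (relaxed) multiplier update
        lam = lam - alpha * beta * (x - y)
        # y-subproblem: minimize 0.5*||y - c||^2 + lam^T y + (beta/2)*||x - y||^2
        y = (c - lam + beta * x) / (1.0 + beta)
        # second (relaxed) multiplier update
        lam = lam - alpha * beta * (x - y)
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.normal(size=50)
    x, y = strictly_contractive_prsm(c, tau=0.5)
    # for this toy instance the exact minimizer is soft_threshold(c, tau)
    print(np.max(np.abs(x - soft_threshold(c, 0.5))))
```

With alpha = 1 the two multiplier updates reduce to those of the original PRSM; taking alpha strictly between 0 and 1 is what the abstract refers to as the relaxation factor that enforces strict contraction of the iterates.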
Field | Value
---|---
Original language | English (US)
Pages (from-to) | 1011-1040
Number of pages | 30
Journal | SIAM Journal on Optimization
Volume | 24
Issue number | 3
DOIs | 
State | Published - 2014
All Science Journal Classification (ASJC) codes
- Software
- Theoretical Computer Science
- Applied Mathematics
Keywords
- Contraction
- Convergence rate
- Convex programming
- Peaceman-Rachford splitting method