Proximal or gradient steps for cocoercive operators
This paper provides a theoretical and numerical comparison of classical first-order splitting methods for solving smooth convex optimization problems and cocoercive equations. From a theoretical point of view, we compare convergence rates of the gradient descent, forward-backward, Peaceman-Rachford, and Douglas-Rachford algorithms for minimizing the sum of two smooth convex functions …
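As a point of reference for the methods named above, the following display recalls their standard update rules; since the abstract is truncated, the notation here (a step size $\gamma > 0$, a smooth term $f$, a second term $g$ accessed through its proximal operator $\mathrm{prox}_{\gamma g}$) is an assumed setup rather than the paper's own.

\[
\begin{aligned}
\text{Gradient descent:} \quad & x_{k+1} = x_k - \gamma \nabla (f+g)(x_k),\\
\text{Forward-backward:} \quad & x_{k+1} = \mathrm{prox}_{\gamma g}\bigl(x_k - \gamma \nabla f(x_k)\bigr),\\
\text{Peaceman-Rachford:} \quad & z_{k+1} = \bigl(2\,\mathrm{prox}_{\gamma g} - \mathrm{Id}\bigr)\bigl(2\,\mathrm{prox}_{\gamma f} - \mathrm{Id}\bigr)(z_k),\\
\text{Douglas-Rachford:} \quad & z_{k+1} = z_k + \mathrm{prox}_{\gamma g}\bigl(2\,\mathrm{prox}_{\gamma f}(z_k) - z_k\bigr) - \mathrm{prox}_{\gamma f}(z_k),
\end{aligned}
\]

where for the last two methods the candidate solution is recovered as $x_k = \mathrm{prox}_{\gamma f}(z_k)$.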