Error bounds and quantifiable convergence of proximal methods for nonsmooth/(non)convex optimization

Prof. Russell Luke

Universität Göttingen

Thursday, February 04, 2016, 17:00

Hörsaal II, Albertstr. 23b, 79104 Freiburg

For iterative methods in nonconvex optimization, a central question is when to stop. And when the decision has been made to stop, what is the relation, if any, between the point that the algorithm delivers and the desired solution(s) to the optimization problem? Quantification of the convergence of algorithms is the key to providing error bounds for stopping criteria, and at the heart of quantifiable convergence rates lies the theory of regularity, not only of the underlying functions and operators, but of the critical points of the optimization model. We survey progress over the last several years on sufficient conditions for local linear convergence of fundamental algorithms applied to nonconvex problems, and discuss challenges and prospects for further progress. The theory is local by nature and contains the convex case as an example in which the local neighborhood extends to the whole space. We demonstrate the use of the tools we have developed on applications to image processing and matrix completion.
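To make the role of a stopping criterion concrete, the following minimal Python sketch (not the speaker's method; all names, data, and parameter choices are hypothetical) runs a proximal gradient iteration and stops when the distance between successive iterates falls below a tolerance; under a local error bound at the critical point, that quantity also controls the distance to the solution set up to a constant.

```python
# Illustrative sketch only: proximal gradient descent for min_x f(x) + lam*||x||_1
# with f smooth, stopping on the length of the step ||x_{k+1} - x_k||.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, x0, step, lam, tol=1e-8, max_iter=10_000):
    """Iterate x_{k+1} = prox_{step*lam*||.||_1}(x_k - step*grad_f(x_k)).

    Stops when ||x_{k+1} - x_k|| <= tol, a practical stopping criterion whose
    link to the true error is exactly what error-bound theory quantifies.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_new = soft_threshold(x - step * grad_f(x), step * lam)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Hypothetical example: f(x) = 0.5*||Ax - b||^2 with random data.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of grad_f
x_hat, iters = proximal_gradient(grad_f, np.zeros(50), step, lam=0.1)
print(f"stopped after {iters} iterations")
```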


This talk is part of the series “Mathematisches Kolloquium”.

http://wochenprogramm.mathematik.uni-freiburg.de/kolloquium.de.html