
On the worst-case performance of the optimization method of Cauchy for smooth strongly convex functions.


If you have a question about this talk, please contact Sergey Sergeev.

We consider the Cauchy (or steepest descent) method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give a worst-case complexity bound for a noisy variant of the gradient descent method. Finally, we show that these results may be applied to study the worst-case performance of Newton’s method for the minimization of self-concordant functions.
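As a concrete illustration of the kind of worst-case instance involved (the abstract does not spell out the quadratic; the sketch below uses the classical two-dimensional zig-zag construction, with illustrative constants mu = 0.1 and L = 1): steepest descent with exact line search on f(x) = ½(mu·x₁² + L·x₂²), started at x₀ = (1/mu, 1/L), contracts the objective by exactly ((L−mu)/(L+mu))² at every iteration.

```python
import numpy as np

mu, L = 0.1, 1.0  # assumed strong-convexity / smoothness constants
A = np.diag([mu, L])               # f(x) = 0.5 * x^T A x, minimiser x* = 0
rho2 = ((L - mu) / (L + mu)) ** 2  # predicted per-step contraction of f

def f(x):
    return 0.5 * x @ A @ x

x = np.array([1.0 / mu, 1.0 / L])  # classical worst-case starting point

for k in range(5):
    g = A @ x                  # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)  # exact line search step along -g (closed form for quadratics)
    x_new = x - t * g
    print(f"step {k}: f(x+)/f(x) = {f(x_new) / f(x):.6f}, bound = {rho2:.6f}")
    x = x_new
```

Each iterate lands back on the worst-case ray (up to a sign flip), so the printed ratio matches the bound at every step rather than only asymptotically.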

The proofs are computer-assisted, and rely on solving semidefinite programming performance estimation problems, as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].
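To give a flavour of the performance estimation approach (a sketch under stated assumptions, not the authors’ actual code): the worst case of a single iteration can be posed as a small SDP whose variables are the function values and the Gram matrix of inner products among iterates and gradients, with the smooth strongly convex interpolation conditions of Taylor, Hendrickx and Glineur as constraints, and with the exact line search replaced by the standard orthogonality relaxation. A minimal CVXPY sketch, with assumed constants mu = 0.1 and L = 1:

```python
import numpy as np
import cvxpy as cp

mu, L = 0.1, 1.0  # assumed constants; any 0 < mu < L works

# Gram basis for the unknown vectors: [x0 - x*, g0, g1, x1 - x*].
e = np.eye(4)
X = {"s": np.zeros(4), 0: e[0], 1: e[3]}   # points (x* placed at the origin)
Gr = {"s": np.zeros(4), 0: e[1], 1: e[2]}  # gradients (g* = 0)

G = cp.Variable((4, 4), PSD=True)  # Gram matrix of the basis vectors
f = {"s": 0, 0: cp.Variable(), 1: cp.Variable()}  # function values, f* = 0

def ip(u, v):
    """Inner product <u, v> expressed through the Gram matrix."""
    return u @ G @ v

cons = [f[0] <= 1]  # normalisation: f(x0) - f* <= 1
pts = ["s", 0, 1]
for i in pts:
    for j in pts:
        if i == j:
            continue
        # Smooth strongly convex interpolation (Taylor-Hendrickx-Glineur).
        d = X[i] - X[j] - (Gr[i] - Gr[j]) / L
        cons.append(
            f[i] >= f[j] + ip(Gr[j], X[i] - X[j])
            + ip(Gr[i] - Gr[j], Gr[i] - Gr[j]) / (2 * L)
            + mu / (2 * (1 - mu / L)) * ip(d, d)
        )

# Relaxed exact line search: g1 orthogonal to the search direction g0
# and to the realised step x1 - x0.
cons += [ip(Gr[1], Gr[0]) == 0, ip(Gr[1], X[1] - X[0]) == 0]

prob = cp.Problem(cp.Maximize(f[1]), cons)  # worst-case f(x1) - f*
prob.solve()
print("SDP worst case   :", prob.value)
print("((L-mu)/(L+mu))^2:", ((L - mu) / (L + mu)) ** 2)
```

On this relaxation the optimal value should match the ((L−mu)/(L+mu))² ≈ 0.6694 contraction factor up to solver tolerance; handling N iterations, noisy gradients, or other methods enlarges the Gram basis and constraint set but keeps the same SDP structure.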

Joint work with F. Glineur and A.B. Taylor.

This talk is part of the Optimisation and Numerical Analysis Seminars series.
