
Statistical Learning by Stochastic Gradient Descent


If you have a question about this talk, please contact Hong Duong.

Stochastic gradient descent (SGD) has become the workhorse behind many machine learning problems. Optimization error and estimation error are two competing factors that together determine the prediction performance of SGD. In this talk, we report our generalization analysis of SGD, which considers the optimization and estimation errors simultaneously. We remove some restrictive assumptions in the literature and significantly improve the existing generalization bounds. Our results help to understand how to stop SGD early to achieve the best generalization performance.
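The early-stopping idea mentioned in the abstract can be sketched in a toy setting: run SGD on a training set while monitoring validation error, and keep the iterate with the lowest validation loss. This is only an illustration of the general principle, not the speaker's analysis; the linear model, learning rate, and patience rule below are assumptions made for the example.

```python
import random

random.seed(0)

# Toy data from a noisy linear model y = 2*x + noise (illustrative assumption).
def make_data(n):
    xs = [random.uniform(-1, 1) for _ in range(n)]
    ys = [2.0 * x + random.gauss(0, 0.3) for x in xs]
    return xs, ys

train_x, train_y = make_data(200)
val_x, val_y = make_data(50)

def val_loss(w):
    # Mean squared error on the held-out validation set.
    return sum((w * x - y) ** 2 for x, y in zip(val_x, val_y)) / len(val_x)

w = 0.0          # single parameter of the linear model
lr = 0.05        # illustrative learning rate
best_w, best_loss = w, val_loss(w)
patience, bad_epochs = 5, 0

for epoch in range(100):
    for x, y in zip(train_x, train_y):
        # SGD step on the squared loss: d/dw (w*x - y)^2 = 2*(w*x - y)*x
        w -= lr * 2 * (w * x - y) * x
    loss = val_loss(w)
    if loss < best_loss:
        best_w, best_loss = w, loss
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break  # stop early: validation error has stopped improving
```

Running longer keeps shrinking the training error (optimization error), but past some point the iterates start fitting noise and the validation error rises (estimation error); stopping at the validation minimum balances the two.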

This talk is part of the Data Science and Computational Statistics Seminar series.



Talks@bham, University of Birmingham. talks@bham is based on software from the University of Cambridge.