Computational & Applied Math Seminar

Restarting accelerated gradient methods with a rough strong convexity estimate

  • Speaker: Zheng Qu (瞿铮), The University of Hong Kong

  • Time: 2017-05-08, 15:00–16:00

  • Venue: Room 703, Science and Education Service Center

About the speaker:

Dr. Zheng Qu (瞿铮) is an Assistant Professor at The University of Hong Kong. She obtained her Ph.D. from École Polytechnique, France, in 2013.
Her research interests lie in numerical methods for optimization and optimal control problems. Dr. Qu investigates scalable modern optimization methods, including randomized coordinate descent methods (serial, parallel, distributed, and accelerated variants), stochastic gradient methods (mini-batch, variance reduction), and primal-dual methods. She is equally interested in exploring the power of randomization to attenuate the curse of dimensionality in the solution of optimal control problems.


Abstract:

We propose new restarting strategies for accelerated gradient and accelerated coordinate descent methods. Our main contribution is to show that the restarted method has a geometric rate of convergence for any restarting frequency, which allows us to benefit from restarting even when we do not know the strong convexity coefficient. The scheme can be combined with adaptive restarting, leading to the first provable convergence for adaptive restarting schemes with accelerated gradient methods. We illustrate the properties of the algorithm on a regularized logistic regression problem and on a Lasso problem.
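
To give a concrete feel for the kind of scheme discussed in the abstract, below is a minimal sketch (not the speaker's implementation) of fixed-frequency restarting for Nesterov's accelerated gradient method on a toy least-squares problem. The function name, the restart period, and the step size 1/L are illustrative assumptions; the talk's results concern the convergence rate of such restarted schemes for any restarting frequency.

```python
# Minimal sketch: accelerated gradient descent with fixed-frequency restarting.
# The restart period and problem instance below are illustrative choices only.
import numpy as np

def accelerated_gradient_restart(grad, L, x0, restart_period=50, n_iters=500):
    """Accelerated gradient method; momentum is reset every `restart_period` steps."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for k in range(n_iters):
        x_new = y - grad(y) / L                       # gradient step from extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum (extrapolation) step
        x, t = x_new, t_new
        if (k + 1) % restart_period == 0:             # fixed-frequency restart
            y = x.copy()
            t = 1.0
    return x

# Usage on a toy least-squares problem f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
L = np.linalg.norm(A, 2) ** 2                         # Lipschitz constant of the gradient
grad = lambda x: A.T @ (A @ x - b)
x_star = accelerated_gradient_restart(grad, L, np.zeros(20))
print("gradient norm at solution:", np.linalg.norm(grad(x_star)))
```

The restart discards the accumulated momentum and resets the extrapolation parameter; when the objective is strongly convex, this is what lets the method trade the sublinear accelerated rate for a geometric one, even with only a rough (or unknown) strong convexity estimate.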