Past Events

Optimization methods in applications I

INTENDED LEARNING OUTCOME

By the end of the course, students will understand modern algorithms in nonlinear optimization, know their applications in several fields, and be able to judge which method is suitable for which application.


CONTENT

The course consists of two parts: the first provides an overview of modern optimization methods, while the second shows selected applications. The first part is theoretical, with the emphasis on explaining how different optimization approaches work; their strengths and weaknesses will be analyzed in detail. The second part builds on this knowledge and shows how these methods can be applied to modern machine learning problems and to topology optimization.


PART ONE (THEORY OF OPTIMIZATION)
1- Introduction (how to formulate an optimization problem, different types of optimization problems, continuity, Lipschitz continuity, differentiability, the chain rule)
Time: Monday, Sept 30, 2019, 8:30-10:00
Venue: Room 415, Block 3, Hui Yuan
2- Convex optimization (linearity, convexity and its characterizations, importance for minimization, subgradients, connection to eigenvalues)
Time: Tuesday, Oct 1, 2019, 8:30-10:00
Venue: Room 415, Block 3, Hui Yuan
3- Nonconvex optimization (difference from convex optimization, optimality conditions, Lagrangian duality)
Time: Wednesday, Oct 2, 2019, 8:30-10:00
Venue: Room 415, Block 3, Hui Yuan
4- Basic optimization methods (bisection method, alternating minimization, gradient descent, projected gradient method, Newton's method in 1D and in higher dimensions, line search; see the first sketch after the schedule below)
Time: Thursday, Oct 3, 2019, 8:30-10:00
Venue: Room 518, Block 3, Hui Yuan
5- More involved optimization methods (penalization methods, interior point methods, proximal methods, ADMM, Newton's method, quasi-Newton methods; see the second sketch after the schedule below)
Time: Friday, Oct 4, 2019, 8:30-10:00
Venue: Room 415, Block 3, Hui Yuan
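
For a concrete picture before the lectures, here is a minimal Python sketch of two of the basic methods from lecture 4: fixed-step gradient descent and Newton's method in 1D. The objective function, starting points, step size, and iteration counts are illustrative assumptions, not part of the course material.

import numpy as np

# Example objective: a smooth convex quadratic (chosen only for illustration).
def f(x):
    return 0.5 * x[0]**2 + 2.0 * x[1]**2

def grad_f(x):
    return np.array([x[0], 4.0 * x[1]])

def gradient_descent(x, steps=100, alpha=0.2):
    # Fixed-step gradient descent: x <- x - alpha * grad f(x).
    for _ in range(steps):
        x = x - alpha * grad_f(x)
    return x

def newton_1d(g, dg, x, steps=20):
    # Newton's method in 1D for the root-finding problem g(x) = 0:
    # x <- x - g(x) / g'(x).
    for _ in range(steps):
        x = x - g(x) / dg(x)
    return x

x_min = gradient_descent(np.array([3.0, -2.0]))
print(x_min, f(x_min))                                           # close to the minimizer (0, 0)
print(newton_1d(lambda t: t**2 - 2.0, lambda t: 2.0 * t, 1.0))   # converges to sqrt(2)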
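Similarly, a minimal sketch of one proximal method from lecture 5: proximal gradient descent (ISTA) for the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. The data A and b, the regularization weight lam, and the iteration count are made-up example values.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, steps=500):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2)**2      # 1/L with L = ||A||_2^2
    for _ in range(steps):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
print(ista(A, b, lam=0.1))                    # approximately recovers x_true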


RECOMMENDED LITERATURE
- Nocedal, Jorge, and Stephen Wright. Numerical optimization. Springer Science & Business Media, 2006.
- Hastie, Trevor, Robert Tibshirani, and Jerome Friedman. The elements of statistical learning: data mining, inference, and prediction. Springer, 2009.
- Bendsøe, Martin P., and Ole Sigmund. Topology optimization: theory, methods, and applications. Springer US, 2009.


BIOGRAPHY

Dr. Lukas Adam received his Ph.D. in mathematics from Charles University in Prague, Czech Republic, in 2015, under the supervision of the internationally renowned optimization expert J. Outrata. He is currently a Research Assistant Professor in the Department of Computer Science at Southern University of Science and Technology. He has published a number of influential papers in major applied mathematics journals, including SIAM Journal on Applied Mathematics and Mathematical Programming. His research interests cover optimization theory, methods, and their applications in data science.