Optimization Theory (DATA 620020)
Spring 2023
Lecture Time and Venue:
Thursday, 18:30-21:05, H4206
Reading Materials:
- 刘浩洋, 户将, 李勇锋, 文再文. "Optimization: Modeling, Algorithms and Theory" (最优化:建模、算法与理论). Higher Education Press, 2020. [link]
- 林宙辰, 李欢, 方聪. "Accelerated First-Order Optimization Algorithms for Machine Learning" (机器学习中的加速一阶优化算法). China Machine Press, 2021. [link]
- Jorge Nocedal, Stephen J. Wright. "Numerical optimization". Springer, 2006. [link]
- Yurii Nesterov. "Lectures on convex optimization". Springer, 2018. [link]
- Ralph Tyrrell Rockafellar. "Convex Analysis". Princeton University Press, 1997. [link]
Courseware (subject to changes):
- Feb. 23: Course Overview, Optimization for Machine Learning, Topology. [pdf]
- Mar. 02: Convex Set, Convex Function, Subgradient, Subdifferential. [pdf]
- Mar. 09: Subdifferential Calculus, Lipschitz Continuity, Smoothness, Strong Convexity. [pdf]
- Mar. 16: Second-Order Characterization, Black Box Model, Gradient Descent Method. [pdf]
- Mar. 23: Gradient Descent Method, Polyak–Łojasiewicz Condition, Line Search Method. [pdf]
- Mar. 30: Polyak's Heavy Ball Method, Nesterov's Acceleration. [pdf]
- Apr. 06: Lower Complexity Bounds, Nonsmooth Convex Optimization. [pdf]
- Apr. 13: Subgradient Method, Smoothing, Proximal Gradient Descent. [pdf]
- Apr. 20: Newton's Method, Damped Newton Method. [pdf]
- Apr. 27: Self-Concordant Functions, Global Convergence Analysis. [pdf]
- May 04: (Limited-Memory) Classical Quasi-Newton Methods. [pdf]
- May 11: Greedy/Randomized Quasi-Newton Methods, Block Quasi-Newton Methods. [pdf]
- May 18: Stochastic Gradient Descent, Variance Reduction. [pdf]
- May 25: Stochastic Variance Reduced Gradient, Catalyst Acceleration, Katyusha. [pdf]
- Jun. 01: Stochastic Recursive Gradient, Zeroth-Order Optimization. [pdf]
- Jun. 08: Zeroth-Order Optimization, Distributed Optimization. [pdf]
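As a taste of the course's central algorithm, here is a minimal sketch (not part of the course materials) of the gradient descent method from the Mar. 16/23 lectures, applied to a strongly convex quadratic with the standard step size 1/L, where L is the smoothness constant (here, the largest eigenvalue of A):

```python
import numpy as np

def gradient_descent(A, b, x0, num_iters=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x by gradient descent.

    A is symmetric positive definite, so f is smooth and strongly
    convex; grad f(x) = A x - b, and the fixed step size 1/L with
    L = lambda_max(A) guarantees linear convergence to A^{-1} b.
    """
    L = np.linalg.eigvalsh(A).max()      # smoothness constant of f
    x = x0.astype(float)
    for _ in range(num_iters):
        x = x - (1.0 / L) * (A @ x - b)  # x_{k+1} = x_k - (1/L) grad f(x_k)
    return x

# Small worked instance (illustrative values, not from the lectures).
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)           # exact minimizer solves A x = b
x_hat = gradient_descent(A, b, np.zeros(2))
```

For this instance the iterates converge linearly at rate (1 - mu/L) per step, with mu the smallest eigenvalue of A, so `x_hat` matches `x_star` to machine precision after a few hundred iterations.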