Content
The course covers continuous optimization, with a focus on non-linear mathematical programming (constrained optimization).
Part 1 introduces efficient downhill algorithms in the unconstrained case:
* gradient descent, backtracking, Wolfe conditions, convergence properties
* covariant gradient, Newton, quasi-Newton methods, BFGS
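To give a flavor of the Part 1 material, here is a minimal sketch of gradient descent with a backtracking line search enforcing the Armijo sufficient-decrease condition (the first Wolfe condition). All function names, defaults, and the quadratic test problem are illustrative choices, not the course's specific notation.

```python
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                                  tol=1e-8, max_iter=1000):
    """Gradient descent with backtracking (Armijo condition); sketch only."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Shrink the step until the sufficient-decrease condition holds:
        # f(x - alpha*g) <= f(x) - c * alpha * ||g||^2
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= rho
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = backtracking_gradient_descent(f, grad, np.zeros(2))
```

Backtracking only checks sufficient decrease; the full Wolfe conditions additionally bound the directional derivative at the new point, which matters for quasi-Newton updates such as BFGS.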
Part 2 will introduce efficient algorithms for constrained optimization:
* Basics of the KKT (Karush-Kuhn-Tucker) conditions
* Log-barriers, Augmented Lagrangian, primal-dual Newton
* Differentiable Optimization
* Convex Programs, bound constraints, Phase I
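As a taste of Part 2, the following is a minimal sketch of the log-barrier idea for min f(x) s.t. g(x) <= 0: minimize the barrier function phi(x) = f(x) - mu·log(-g(x)) for a decreasing sequence of mu, here with a damped Newton inner loop in one dimension. All names, defaults, and the toy problem are hypothetical; the sketch assumes a single affine constraint (g'' = 0).

```python
def log_barrier_1d(f_grad, f_hess, g, g_grad, mu0=1.0, shrink=0.1, x0=0.0,
                   outer_iters=6, inner_iters=50):
    """Log-barrier method for min f(x) s.t. g(x) <= 0 (1-D sketch)."""
    x, mu = x0, mu0
    for _ in range(outer_iters):
        for _ in range(inner_iters):
            gx, dgx = g(x), g_grad(x)
            # gradient/Hessian of phi(x) = f(x) - mu*log(-g(x)),
            # assuming the constraint g is affine (g'' = 0)
            grad = f_grad(x) - mu * dgx / gx
            if abs(grad) < 1e-10:
                break
            hess = f_hess(x) + mu * (dgx / gx) ** 2
            step = grad / hess
            # damp the Newton step so the iterate stays strictly feasible
            while g(x - step) >= 0:
                step *= 0.5
            x -= step
        mu *= shrink  # tighten the barrier parameter
    return x

# Toy example: minimize (x - 2)^2 subject to x <= 1; the solution is x* = 1,
# and the barrier iterates approach it from the strictly feasible side
x_star = log_barrier_1d(f_grad=lambda x: 2 * (x - 2),
                        f_hess=lambda x: 2.0,
                        g=lambda x: x - 1.0,
                        g_grad=lambda x: 1.0)
```

The outer loop shrinking mu is what makes this an interior-point scheme: each inner solve stays strictly feasible, and the minimizers trace the central path toward the constrained optimum.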
Part 3 will cover extended topics that may vary each year, e.g.:
* Stochastic Gradient Descent for NN training
* No Free Lunch, Bayesian Optimization, global optimization
* Stochastic, black-box, & evolutionary algorithms
* Existing libraries, CERES, structured NLPs, solving constraint graphs
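To illustrate the first Part 3 topic, here is a minimal sketch of mini-batch Stochastic Gradient Descent on a least-squares objective (a linear model standing in for a neural network). The synthetic data and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = X w_true + small noise
n, d = 1000, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)
lr, batch = 0.05, 32  # illustrative learning rate and mini-batch size
for epoch in range(50):
    perm = rng.permutation(n)  # reshuffle samples each epoch
    for start in range(0, n, batch):
        idx = perm[start:start + batch]
        Xb, yb = X[idx], y[idx]
        # stochastic gradient of the mean squared error on this mini-batch
        grad = Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad
```

Each update uses only a mini-batch, so the gradient is a noisy but cheap estimate of the full gradient; with a suitable step size the iterates still converge close to `w_true`.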