(Topics are tentative. Code is updated on t-square.)
- 8/21, 8/23. Gradient descent. Notes (draft)
- to minimize $f$, take step along gradient of $-f$ (see the sketch after this topic).
- convex functions: to minimize a convex $f$, it is enough to find a local minimum (every local minimum is global).
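A minimal sketch of the gradient step, assuming a smooth objective and a fixed step size; the quadratic example and the step size 0.1 are made up for illustration.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Repeatedly step along the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: minimize f(x) = 1/2 ||x - c||^2, whose gradient is x - c.
c = np.array([1.0, -2.0])
x_min = gradient_descent(lambda x: x - c, x0=np.zeros(2))
print(x_min)  # approaches c
```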
- 8/28, 8/30. Faster gradient descent, Newton's method. Notes
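A sketch of the Newton step $x \leftarrow x - (\nabla^2 f(x))^{-1}\nabla f(x)$ on a made-up smooth function; the notes may add safeguards such as line search or damping.

```python
import numpy as np

def newton(grad, hess, x0, iters=20):
    """Newton's method: minimize the local quadratic model at each step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

# Example: f(x, y) = x^4 + y^2, with minimum at the origin.
grad = lambda v: np.array([4 * v[0] ** 3, 2 * v[1]])
hess = lambda v: np.array([[12 * v[0] ** 2, 0.0], [0.0, 2.0]])
print(newton(grad, hess, x0=[1.0, 1.0]))
```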
- 9/6, 9/11. Projected (proximal) gradient descent. Notes (draft)
- take step along gradient; project to nearest feasible point.
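A sketch of this idea, assuming the feasible set is the Euclidean unit ball, where the projection has a closed form; other feasible sets need their own projection routine.

```python
import numpy as np

def project_to_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_gradient_descent(grad, project, x0, step=0.1, iters=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project(x - step * grad(x))  # gradient step, then snap back to the feasible set
    return x

# Minimize 1/2 ||x - c||^2 over the unit ball; the target c lies outside the ball.
c = np.array([3.0, 4.0])
print(projected_gradient_descent(lambda x: x - c, project_to_ball, np.zeros(2)))
# converges to c / ||c|| = (0.6, 0.8)
```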
- 9/13, 9/18, 9/25, 9/27. Solving linear systems
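This line names no method, so as a hedged placeholder in the spirit of the gradient methods above: for a symmetric positive definite $A$, solving $Ax = b$ is the same as minimizing $\tfrac12 x^\top A x - b^\top x$, whose negative gradient is the residual $b - Ax$.

```python
import numpy as np

def solve_by_gradient_iteration(A, b, step, iters=500):
    """Richardson-type iteration: x <- x + step * (b - A x)."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = x + step * (b - A @ x)  # residual = negative gradient of 1/2 x^T A x - b^T x
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])
x = solve_by_gradient_iteration(A, b, step=0.2)
print(x, A @ x)  # A @ x is close to b
```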
- 10/2, 10/4, 10/11. Singular Value Decomposition: $A = U \Sigma V^\top$.
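A quick numerical check of the factorization $A = U \Sigma V^\top$ with NumPy (the matrix is made up); this only illustrates the decomposition, not any particular algorithm from the lectures.

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U diag(s) V^T
print(np.allclose(A, U @ np.diag(s) @ Vt))         # True
print(s)                                           # singular values, in decreasing order
```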
- 10/16, 10/18, 10/25, 10/30. Linear programming (duality, simplex)
- Duality. Notes (draft)
- Strong duality (Farkas' lemma). A vector is either in the cone generated by a set of vectors, or separable from it by a hyperplane; see the numerical check after this topic. Notes (draft)
- simplex: take greedy step to better basis. Notes (draft-part1), Notes (draft-part2)
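A numerical check of strong duality on a made-up primal/dual pair, using scipy.optimize.linprog as an off-the-shelf solver: the primal $\min\{c^\top x : Ax \ge b,\ x \ge 0\}$ and the dual $\max\{b^\top y : A^\top y \le c,\ y \ge 0\}$ attain the same optimal value.

```python
import numpy as np
from scipy.optimize import linprog

# Primal: minimize c^T x subject to A x >= b, x >= 0.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 6.0])
c = np.array([3.0, 5.0])

# linprog uses "<=" constraints, so negate A and b to encode A x >= b.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2, method="highs")

# Dual: maximize b^T y subject to A^T y <= c, y >= 0 (i.e. minimize -b^T y).
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2, method="highs")

print(primal.fun, -dual.fun)  # strong duality: the two optimal values coincide
```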
- Linear and convex programming (ellipsoid, cutting plane, interior point)
- Learning halfspaces: perceptron
- add violated (misclassified) example to the weight vector; repeat
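A sketch of the perceptron update on made-up separable data: while some example violates the current halfspace, add it (with its sign) to the weight vector.

```python
import numpy as np

def perceptron(X, y, max_passes=100):
    """Perceptron: repeatedly add a violated (misclassified) example to w."""
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:  # example violates the current halfspace
                w += yi * xi        # add it, signed, to the weight vector
                updated = True
        if not updated:             # no violations: the data is separated
            return w
    return w

# Toy separable data with labels in {-1, +1}; the last coordinate 1 acts as a bias term.
X = np.array([[2.0, 1.0, 1.0], [1.0, 3.0, 1.0], [-1.0, -1.0, 1.0], [-2.0, 1.0, 1.0]])
y = np.array([1, 1, -1, -1])
print(perceptron(X, y))
```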
- Matchings (maximum, weighted)
- find augmenting path; augment (sketch after this topic)
- use linear program
- use cutting plane method: add violated blossom inequality; repeat
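A sketch of the augmenting-path idea, restricted to bipartite graphs for brevity (general graphs need blossom handling, which the cutting-plane bullet above addresses on the LP side); the instance is made up.

```python
def max_bipartite_matching(adj, n_left, n_right):
    """adj[u] lists the right-vertices adjacent to left-vertex u."""
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v, or -1

    def try_augment(u, seen):
        # DFS for an augmenting path starting at left vertex u.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u  # flip the edges along the augmenting path
                return True
        return False

    matching_size = 0
    for u in range(n_left):
        if try_augment(u, set()):
            matching_size += 1
    return matching_size, match_right

# Toy instance: 3 left vertices, 3 right vertices.
adj = {0: [0, 1], 1: [0], 2: [1, 2]}
print(max_bipartite_matching(adj, 3, 3))  # maximum matching of size 3
```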
- Random walks, Brownian motion
- take random step from current point
- take infinitesimal random step
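A sketch of both bullets: a $\pm 1$ random walk on the line, and Gaussian steps of size $\sqrt{dt}$ as a discrete stand-in for Brownian motion; the step counts and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random walk: take a random +/-1 step from the current point.
steps = rng.choice([-1, 1], size=1000)
walk = np.cumsum(steps)

# Brownian motion (approximation): take a small Gaussian step of standard deviation sqrt(dt).
dt = 1e-3
increments = rng.normal(scale=np.sqrt(dt), size=1000)
brownian = np.cumsum(increments)

print(walk[-1], brownian[-1])
```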
- Online optimization
- add random perturbation to objective; use offline optimization (follow the perturbed leader; sketch after this topic)
- use gradient descent
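A sketch of follow the perturbed leader in the simplest experts setting, with made-up losses; the "offline optimization" here is just an argmin over the perturbed cumulative losses.

```python
import numpy as np

rng = np.random.default_rng(0)

def follow_the_perturbed_leader(losses, eta=1.0):
    """losses: T x n array of per-round losses for n experts."""
    T, n = losses.shape
    total = 0.0
    cumulative = np.zeros(n)
    for t in range(T):
        perturbation = rng.exponential(scale=eta, size=n)  # random perturbation of the objective
        choice = np.argmin(cumulative - perturbation)      # offline optimization on perturbed totals
        total += losses[t, choice]
        cumulative += losses[t]
    return total

losses = rng.random((500, 5))  # made-up loss sequence
print(follow_the_perturbed_leader(losses), losses.sum(axis=0).min())  # FTPL vs. best fixed expert
```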
- Stochastic gradient descent
- add random perturbation to objective (various options)
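A sketch of one of the options: randomize the objective by sampling a single term of a finite-sum least-squares problem each step, and follow that term's gradient; the data is synthetic and the step size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least squares: minimize (1/n) sum_i (a_i . x - b_i)^2.
n, d = 1000, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

x = np.zeros(d)
step = 0.01
for t in range(20000):
    i = rng.integers(n)               # sample one term of the objective at random
    g = 2 * (A[i] @ x - b[i]) * A[i]  # gradient of the sampled term
    x = x - step * g
print(np.linalg.norm(x - x_true))     # small error on this noiseless problem
```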