# Lectures

(Topics are tentative. Code is updated on t-square.)

• 8/21, 8/23. Gradient descent. Notes (draft)
• to minimize $f(x)$, take a step along the negative gradient of $f$.
• convex functions
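The gradient-descent update above can be sketched in a few lines of Python. A minimal 1-D example (the function $f(x) = (x-3)^2$ and the step size are hypothetical choices for illustration):

```python
def grad_descent(grad, x0, step=0.1, iters=100):
    """Minimize a convex function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)  # move in the negative gradient direction
    return x

# Example: f(x) = (x - 3)^2, so grad f(x) = 2(x - 3); the minimizer is x = 3.
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With a fixed step size of 0.1 the error shrinks by a constant factor each iteration, so 100 iterations suffice here.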
• 8/28, 8/30. Faster Gradient descent, Newton method. Notes
• Take a (near-)optimal step in the gradient direction (line search). Notes (draft)
• Newton-Raphson. Notes (draft)
• for root-finding: $x^{t+1} = x^t - \frac{f(x^t)}{f'(x^t)}$
• for optimization: $x^{t+1} = x^t - (\nabla^2 f(x^t))^{-1}\nabla f(x^t)$
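The Newton-Raphson root-finding update can be sketched directly from the formula above (the target function $f(x) = x^2 - 2$ is a hypothetical example, whose positive root is $\sqrt{2}$):

```python
def newton_root(f, fprime, x0, iters=20):
    """Find a root of f via the Newton-Raphson update x := x - f(x)/f'(x)."""
    x = x0
    for _ in range(iters):
        x -= f(x) / fprime(x)
    return x

# Example: f(x) = x^2 - 2 has root sqrt(2); f'(x) = 2x.
root = newton_root(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Convergence is quadratic near the root, so 20 iterations reach machine precision here.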
• 9/6, 9/11. Projected (proximal) gradient descent. Notes (draft)
• take a step along the negative gradient; project onto the nearest feasible point.
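The projected gradient step can be sketched as follows; here the feasible set is a hypothetical interval $[0, 1]$, so projection is just clipping:

```python
def proj_grad_descent(grad, project, x0, step=0.1, iters=100):
    """Gradient step followed by projection onto the feasible set."""
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))  # step, then snap back to feasibility
    return x

# Example: minimize (x - 3)^2 over the interval [0, 1].
# The unconstrained minimizer x = 3 is infeasible; the constrained one is x = 1.
x_star = proj_grad_descent(
    grad=lambda x: 2 * (x - 3),
    project=lambda x: min(max(x, 0.0), 1.0),  # projection = clip to [0, 1]
    x0=0.5,
)
```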
• 9/13, 9/18, 9/25, 9/27. Solving linear systems
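One classical direct method for these lectures is Gaussian elimination with partial pivoting; a minimal pure-Python sketch (the lectures may emphasize other methods):

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A is a list of n rows of n floats; b is a list of n floats.
    """
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Eliminate this column from all rows below.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Example: 2a + b = 3 and a + 3b = 5 give a = 0.8, b = 1.4.
x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
```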
• 10/2, 10/4, 10/11. Singular Value Decomposition: $A = UDV^T$.
• Regression: $x=VD^{-1}U^T b$ minimizes $\|Ax-b\|$ (uniquely, when $A$ has full column rank). Notes (draft)
• Best-fit subspace Notes (draft)
• power iteration: $x:= A^TAx; x:= x/\|x\|$. Notes (draft)
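The power-iteration bullet can be sketched in plain Python; the $2\times 2$ diagonal matrix below is a hypothetical example whose top right singular vector is $(1, 0)$:

```python
import math

def power_iteration(A, iters=100):
    """Approximate the top right singular vector of A via x := A^T A x; x := x/||x||."""
    m, n = len(A), len(A[0])
    x = [1.0] * n
    for _ in range(iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]       # A x
        x = [sum(A[i][j] * Ax[i] for i in range(m)) for j in range(n)]       # A^T (A x)
        norm = math.sqrt(sum(v * v for v in x))
        x = [v / norm for v in x]                                            # normalize
    return x

# Example: A = diag(3, 1); the top singular value is 3, with singular vector (1, 0).
v = power_iteration([[3.0, 0.0], [0.0, 1.0]])
```

Each iteration multiplies the component ratio by the squared singular-value ratio, so convergence is geometric.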
• 10/16, 10/18, 10/25, 10/30. Linear programming (duality, simplex)
• Linear and Convex programming (ellipsoid, cutting plane, interior point)
• maintain the minimum-volume ellipsoid containing the feasible half of the previous ellipsoid. Notes (draft)
• separation oracles; maintain a feasible polyhedron by adding a cutting plane through each infeasible point. Notes (draft)
• add a smooth convex barrier function to the objective; minimize using Newton steps. Notes (draft)
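The cutting-plane idea above is easiest to see in one dimension, where each cut halves an interval. A toy sketch with a hypothetical separation oracle (the feasible set $[0.7, 0.72]$ is made up for illustration):

```python
def cutting_plane_1d(oracle, lo=0.0, hi=1.0, iters=60):
    """Find a feasible point using only a separation oracle.

    oracle(x) returns None if x is feasible; otherwise a cut:
    +1 means the feasible set lies to the right of x, -1 to the left.
    """
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        cut = oracle(mid)
        if cut is None:
            return mid          # feasible point found
        if cut > 0:
            lo = mid            # cut away the left half
        else:
            hi = mid            # cut away the right half
    return (lo + hi) / 2.0

# Hypothetical feasible set: the interval [0.7, 0.72].
def oracle(x):
    if 0.7 <= x <= 0.72:
        return None
    return 1 if x < 0.7 else -1

x_feas = cutting_plane_1d(oracle)
```

In higher dimensions the interval becomes a polyhedron (or an ellipsoid), but the mechanism is the same: each oracle answer discards a constant fraction of the remaining volume.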
• Learning halfspaces: perceptron
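The perceptron's mistake-driven update can be sketched directly; the toy dataset below is hypothetical, separable by the halfspace $x_0 > x_1$ (a bias coordinate is folded in):

```python
def perceptron(points, labels, passes=100):
    """Learn weights w with sign(w . x) matching each label in {-1, +1}."""
    d = len(points[0])
    w = [0.0] * d
    for _ in range(passes):
        updated = False
        for x, y in zip(points, labels):
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:  # mistake
                w = [wi + y * xi for wi, xi in zip(w, x)]       # w := w + y x
                updated = True
        if not updated:
            break  # a full pass with no mistakes: all points classified
    return w

# Toy separable data: label is +1 when x0 > x1, with a constant bias coordinate.
pts = [(2.0, 1.0, 1.0), (1.0, 2.0, 1.0), (3.0, 0.0, 1.0), (0.0, 3.0, 1.0)]
ys = [1, -1, 1, -1]
w = perceptron(pts, ys)
```

For separable data with margin $\gamma$ and radius $R$, the number of updates is at most $(R/\gamma)^2$, so the loop terminates well within the pass budget here.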