# ECE 8823 Convex Optimization

## Instructor

Justin Romberg
Email: jrom@ece.gatech.edu
Office: Centergy 5227
Phone: 404-894-3930

## Description

This course covers the fundamentals of convex optimization. We will discuss the mathematical foundations, modeling (i.e., how to set up optimization problems arising in different applications), and algorithms.

## Outline

1. Introduction
2. Basic convex analysis
    1. Convex sets
        - basic properties
        - separating and supporting hyperplanes
        - closest point problem
        - the support functional: convex sets as collections of hyperplanes
        - the dual of the closest point problem, optimality conditions, and applications
    2. Convex functions
        - definition, examples
        - first- and second-order conditions for convexity
        - operations that preserve convexity
        - the Fenchel conjugate: convex functions as collections of supporting hyperplanes
        - Fenchel duality for convex programs, optimality conditions, and applications
3. Unconstrained minimization of smooth functions
    1. Gradient descent
        - definition, descent directions
        - convergence under strong convexity (exact line search)
        - convergence under the gradient-Lipschitz condition (fixed step size)
    2. Newton's method
        - quadratic convergence for strongly convex functions
        - quadratic convergence for self-concordant functions (overview)
        - log barriers for constrained problems (overview)
    3. Quasi-Newton methods
        - low-rank updates and the Woodbury formula
        - DFP and BFGS
4. Theory of constrained optimization
    1. existence and uniqueness of minimizers (unconstrained and constrained)
    2. optimality conditions: unconstrained case
    3. optimality conditions: constrained case (geometric)
        - examples: linear constraints, positivity constraints, single inequality
    4. KKT conditions: sufficiency
    5. KKT conditions: necessity (Slater's condition, etc.)
    6. Lagrange duality
    7. dual examples: max-flow/min-cut, SVM, water-filling
5. Algorithms for constrained optimization
    1. eliminating equality constraints
    2. barrier methods (overview)
    3. primal-dual interior point methods (overview)
    4. dual ascent, the augmented Lagrangian, and ADMM
6. Modeling
    1. LPs, QPs, SOCPs, and SDPs, with examples
    2. norm approximation, norm minimization
    3. statistical estimation: maximum likelihood, MAP
    4. robust least squares as an SDP
7. Convex relaxation
    1. MINCUT revisited
    2. MAXCUT
    3. relaxing nonconvex QPs, MIMO maximum likelihood, PhaseLift
    4. l1 minimization for recovering sparse vectors
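
To give a flavor of the unconstrained-minimization portion of the outline, here is a minimal sketch of gradient descent with the fixed step size 1/L that the gradient-Lipschitz analysis prescribes. The objective is an illustrative quadratic chosen for this note, not a problem from the course:

```python
# Fixed-step gradient descent on a smooth, strongly convex quadratic.
# Illustrative objective (not from the course):
#   f(x, y) = 2x^2 + y^2 - 2x,   grad f = (4x - 2, 2y).
# grad f is Lipschitz with constant L = 4, so the step size is 1/L = 0.25.

def grad(x, y):
    return 4.0 * x - 2.0, 2.0 * y

def gradient_descent(x, y, step=0.25, iters=200):
    for _ in range(iters):
        gx, gy = grad(x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

x_star, y_star = gradient_descent(5.0, -3.0)
# converges to the unique minimizer (0.5, 0)
```

With an exact line search the strong-convexity analysis in the outline gives a similar geometric rate; the fixed step 1/L is the simplest choice that the gradient-Lipschitz condition certifies.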
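
The quadratic convergence of Newton's method listed in the outline can be seen on a one-dimensional example. The function below, f(x) = x - log(x), is an illustrative choice (not course material); it is strongly convex near its minimizer x* = 1, with f'(x) = 1 - 1/x and f''(x) = 1/x²:

```python
# Newton's method on f(x) = x - log(x), minimized at x* = 1.
# f'(x) = 1 - 1/x, f''(x) = 1/x^2, so the Newton update is
#   x <- x - f'(x)/f''(x) = 2x - x^2,
# and the error satisfies (1 - x_{k+1}) = (1 - x_k)^2: it squares
# at every step, i.e. quadratic convergence.

def newton(x, iters=8):
    for _ in range(iters):
        fp = 1.0 - 1.0 / x      # f'(x)
        fpp = 1.0 / x ** 2      # f''(x)
        x = x - fp / fpp        # Newton step
    return x

x_star = newton(0.5)
# starting error 0.5 becomes 0.25, 0.0625, ... ; x_star ≈ 1 to machine precision
```

As the outline notes, this local quadratic rate is guaranteed for strongly convex functions, and self-concordance extends the analysis in an affine-invariant way.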
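
The water-filling problem mentioned under Lagrange duality also admits a short sketch. For the standard formulation (maximize Σ log(aᵢ + xᵢ) subject to Σ xᵢ = 1, xᵢ ≥ 0), the KKT conditions give xᵢ = max(0, 1/ν - aᵢ), with the "water level" 1/ν set so the allocations sum to the budget. The noise levels below are made-up illustrative data:

```python
# Water-filling via the KKT conditions: the optimal allocation is
#   x_i = max(0, level - a_i),
# where the common water level is chosen (here by bisection) so that
# the allocations sum to the total budget.

def water_fill(a, total=1.0, tol=1e-12):
    lo, hi = 0.0, max(a) + total              # bracket for the water level
    while hi - lo > tol:
        level = 0.5 * (lo + hi)
        used = sum(max(0.0, level - ai) for ai in a)
        if used > total:                      # level too high: pour less
            hi = level
        else:                                 # level too low: pour more
            lo = level
    return [max(0.0, lo - ai) for ai in a]

x = water_fill([0.1, 0.4, 0.8])
# channels with smaller a_i get more power; the worst channel may get none
```

Bisection on the water level is one simple way to satisfy the coupling constraint; it is exactly a one-dimensional search over the dual variable ν.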