Mathematical Foundations of Machine Learning, Fall 2018

Instructor
Justin Romberg
jrom@ece.gatech.edu
Office: Centergy 5227
Phone: (404) 894-3930

Description
The purpose of this course is to provide first-year PhD students in engineering and computing with a solid mathematical background for two of the pillars of modern data science: linear algebra and applied probability.


Outline

I. Vector space basics
a) linear vector spaces, linear independence
b) norms and inner products
c) basis expansions; examples: B-splines, cosines/Fourier, radial basis functions, etc.
d) linear approximation (see the sketch after this list)
e) reproducing kernel spaces
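
A minimal sketch of items c) and d), assuming NumPy: project a function onto a truncated orthonormal cosine basis and measure the approximation error. The target function, grid, and number of basis functions are illustrative choices, not part of the course materials.

    import numpy as np

    # Approximate f(t) = exp(t) on [0, 1] in the orthonormal cosine basis
    # psi_0(t) = 1, psi_k(t) = sqrt(2) cos(pi k t).
    t = np.linspace(0.0, 1.0, 1000)
    f = np.exp(t)
    K = 8  # number of basis functions kept (illustrative)
    Psi = np.vstack([np.ones_like(t)]
                    + [np.sqrt(2) * np.cos(np.pi * k * t) for k in range(1, K)])

    # Expansion coefficients are the inner products <f, psi_k>,
    # approximated here by a Riemann sum on the grid.
    coeffs = (Psi @ f) / t.size
    f_hat = coeffs @ Psi  # projection of f onto the span of the basis
    print("max pointwise error:", np.max(np.abs(f - f_hat)))

Because the basis is orthonormal, the coefficients are plain inner products, and the projection is the best approximation of f (in the L2 sense) within the span of the retained basis functions.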

II. Linear estimation
a) examples: classical regression/recovering a function from point samples, imaging, etc.
b) the Singular Value Decomposition (SVD)
c) least-squares solutions and the pseudoinverse (see the sketch after this list)
d) stable inversion and regularization
e) kernels, Mercer’s theorem, RKHS, representer theorem
f) computing least-squares solutions
i) matrix factorizations
ii) steepest descent and conjugate gradients
iii) low-rank updates for online least-squares
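
A minimal sketch of items b) through d), assuming NumPy: solve a noisy least-squares problem through the SVD (i.e., apply the pseudoinverse) and compare it with a Tikhonov-regularized solution that damps the small singular values. The problem sizes, noise level, and regularization parameter delta are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 50, 20
    A = rng.standard_normal((m, n))
    x_true = rng.standard_normal(n)
    b = A @ x_true + 0.01 * rng.standard_normal(m)

    # Least-squares solution via the SVD: x = V diag(1/s) U^T b,
    # which is exactly the pseudoinverse of A applied to b.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x_ls = Vt.T @ ((U.T @ b) / s)

    # Tikhonov (ridge) regularization replaces 1/s with s / (s^2 + delta),
    # leaving large singular values nearly untouched and damping small ones.
    delta = 1e-2  # regularization parameter (illustrative)
    x_reg = Vt.T @ ((s / (s**2 + delta)) * (U.T @ b))

    print("least-squares error:", np.linalg.norm(x_ls - x_true))
    print("regularized error:  ", np.linalg.norm(x_reg - x_true))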

III. Statistical estimation and classification
a) review of core concepts in probability
b) Gaussian estimation
c) maximum likelihood estimation
d) Bayesian estimation and classification
e) empirical risk minimization
f) computing estimates: unconstrained optimization, stochastic gradient descent (see the sketch after this list)
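
A minimal sketch of item f), assuming NumPy: minimize the empirical logistic-loss risk of a linear classifier with plain stochastic gradient descent, taking one noisy gradient step per training sample. The synthetic data model, step size, and epoch count are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n, d = 500, 5
    w_true = rng.standard_normal(d)
    X = rng.standard_normal((n, d))
    y = (X @ w_true > 0).astype(float)  # labels from a linear rule

    # Empirical risk minimization for the logistic loss via SGD.
    w = np.zeros(d)
    step = 0.5
    for epoch in range(20):
        for i in rng.permutation(n):
            z = np.clip(X[i] @ w, -30.0, 30.0)  # clamp to avoid overflow in exp
            p = 1.0 / (1.0 + np.exp(-z))        # model probability that y_i = 1
            w -= step * (p - y[i]) * X[i]       # gradient of the per-sample loss

    cosine = w @ w_true / (np.linalg.norm(w) * np.linalg.norm(w_true))
    print("alignment with the true direction:", cosine)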

IV. Modeling
a) geometric models
i) principal components analysis, low-rank approximation (Eckart-Young theorem; see the sketch at the end of this outline)
ii) sparsity, model selection
iii) structured matrix factorization (e.g., NNMF, dictionary learning)
iv) manifold models, nonlinear embeddings
b) probabilistic models
i) Gaussian graphical models
ii) Gaussian mixture models
iii) hidden Markov models
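
A minimal sketch of the Eckart-Young theorem from item IV a) i), assuming NumPy: form the best rank-k approximation of a matrix by truncating its SVD and check that the Frobenius error equals the energy in the discarded singular values. The matrix dimensions and rank are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((100, 30))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # Eckart-Young: truncating the SVD to the k largest singular values
    # gives the best rank-k approximation in spectral and Frobenius norms.
    k = 5
    A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]

    # The Frobenius error is the root energy of the discarded singular values.
    print(np.linalg.norm(A - A_k, "fro"))
    print(np.sqrt(np.sum(s[k:] ** 2)))  # the two printed values agree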