Mathematical Foundations of Machine Learning, Fall 2019

Instructor
Justin Romberg
Office: Coda S1109
Phone: (404) 894-3930

Description
The purpose of this course is to provide first-year PhD students in engineering and computing with a solid mathematical background for two of the pillars of modern data science: linear algebra and applied probability.

Outline

I. Vector space basics
a) linear vector spaces, linear independence
b) norms and inner products
c) linear approximation
d) basis expansions
e) reproducing kernel spaces
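
To make the approximation topics in Part I concrete, here is a minimal NumPy sketch (illustrative only, not course material) that computes the best approximation of a vector from the span of a non-orthogonal basis via the Gram matrix:

```python
import numpy as np

# Best approximation of a vector x from the span of a (non-orthogonal)
# basis {a_1, a_2} in R^3, via the normal equations with the Gram matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # columns a_1, a_2 span a 2D subspace
x = rng.standard_normal(3)

G = A.T @ A                       # Gram matrix of inner products <a_i, a_j>
b = A.T @ x                       # inner products <a_i, x>
coeffs = np.linalg.solve(G, b)    # basis expansion coefficients
x_hat = A @ coeffs                # closest point to x in span(A)

# The approximation error is orthogonal to the subspace.
print(np.round(A.T @ (x - x_hat), 10))   # ~ [0, 0]
```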

II. Regression using least-squares
a) regression as a linear inverse problem
b) the least-squares problem
c) ridge regression
d) regression in a Hilbert space, representer theorem
e) reproducing kernels, kernel regression, Mercer’s theorem
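
As a rough illustration of Parts II(b)-(e), the following sketch solves ridge regression in closed form and then its kernelized version with a Gaussian (RBF) kernel; the synthetic data, regularization weight, and kernel bandwidth are arbitrary choices, not values from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
lam = 0.1

# Ridge regression: minimize ||y - X w||^2 + lam ||w||^2, solved in
# closed form by w = (X^T X + lam I)^{-1} X^T y.
w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Kernel ridge regression: the representer theorem says the solution is
# a combination of kernel functions centered at the training points,
# f(x) = sum_i alpha_i k(x, x_i), with alpha = (K + lam I)^{-1} y.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)       # Gaussian kernel, unit bandwidth
alpha = np.linalg.solve(K + lam * np.eye(n), y)
```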

III. Solving and analyzing least-squares problems
a) the Singular Value Decomposition (SVD) and the pseudoinverse
b) stable inversion and regularization
c) matrix factorization
d) steepest descent and conjugate gradients
e) low-rank updates for online least-squares
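
A minimal sketch of III(a)-(b): using NumPy's SVD to form the pseudoinverse solution of a least-squares problem, then a truncated-SVD variant that stabilizes an ill-conditioned system (the matrix and truncation threshold below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
A[:, -1] = A[:, 0] + 1e-8 * rng.standard_normal(20)  # nearly dependent column
y = rng.standard_normal(20)

# Pseudoinverse via the SVD: A = U diag(s) V^T, so A^+ = V diag(1/s) U^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_pinv = Vt.T @ ((U.T @ y) / s)          # least-squares solution A^+ y

# Truncated-SVD regularization: discard directions with tiny singular
# values, which otherwise amplify noise in the measurements.
keep = s > 1e-6 * s[0]
x_trunc = Vt.T[:, keep] @ ((U.T[keep] @ y) / s[keep])
```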

IV. Statistical estimation and classification
a) review of core concepts in probability
b) Gaussian estimation
c) maximum likelihood estimation
d) Bayesian estimation
e) the Bayes classifier
f) logistic regression
g) empirical risk minimization
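
To illustrate IV(c), IV(f), and IV(g) together, here is a small sketch that fits logistic regression by gradient descent on the negative log-likelihood, i.e. empirical risk minimization with the log loss (the synthetic data, step size, and iteration count are illustrative):

```python
import numpy as np

# Logistic regression fit by gradient descent on the negative
# log-likelihood (an instance of empirical risk minimization).
rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.standard_normal((n, d))
w_true = np.array([2.0, -1.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

w = np.zeros(d)
step = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))   # P(y = 1 | x) under current w
    grad = X.T @ (p - y) / n           # gradient of the average log loss
    w -= step * grad

print(np.round(w, 2))  # should land in the vicinity of w_true
```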

V. Modeling
a) geometric models
i) principal components analysis, low-rank approximation (Eckart-Young theorem)
ii) sparsity, model selection
iii) structured matrix factorization (e.g. NNMF, dictionary learning)
iv) manifold models, nonlinear embeddings
b) probabilistic models
i) Gaussian mixture models
ii) hidden Markov models
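
Finally, a sketch connecting V(a)(i) to computation: the best rank-r approximation of a data matrix comes from its truncated SVD (the Eckart-Young theorem), and the same decomposition of the centered data gives the principal components (the synthetic data below are illustrative):

```python
import numpy as np

# Rank-r approximation via the SVD; by Eckart-Young this is the best
# rank-r approximation in both the spectral and Frobenius norms.
rng = np.random.default_rng(0)
B = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 40))
X = B + 0.01 * rng.standard_normal((100, 40))   # low rank plus noise

U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = 3
X_r = (U[:, :r] * s[:r]) @ Vt[:r]               # rank-3 approximation

# PCA: the principal components are the top right singular vectors of
# the centered data; projecting onto them gives the PCA scores.
Xc = X - X.mean(axis=0)
U2, s2, Vt2 = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt2[:r].T
```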