Mathematical Foundations of Machine Learning, Fall 2017, Notes

(I will try to post notes here right before each lecture.)

Notes 1, Introduction

I. Vector Spaces and Linear Representations
Notes 2, intro to bases for representing functions
Notes 3, linear vector spaces
Notes 4, norms and inner products
Notes 5, linear approximation
Notes 6, orthobases
Notes 7, Parseval's theorem, truncating orthoexpansions, and JPEG
Notes 8, non-orthogonal bases

II. Linear Estimation using Least Squares
Notes 9, linear regression and discretizing linear inverse problems
Notes 10, symmetric systems of equations
Notes 11, the SVD
Notes 12, the least-squares problem
Notes 13, stable least-squares estimation
Notes 14, kernel regression
Notes 15, matrix factorization
Notes 16, iterative methods: gradient descent and conjugate gradients
Notes 17, online least-squares

III. Statistical Estimation and Classification
Notes 18, best linear unbiased estimator
(Notes 18a, probability review)
Notes 19, MMSE estimation, Gaussian estimation
Notes 20, maximum likelihood estimation
Notes 21, consistency of the MLE, the Cramér-Rao lower bound
Notes 22, Bayesian estimation
Notes 23, the Bayes classifier, nearest neighbor classification
Notes 24, empirical risk minimization

IV. Modeling and Model Selection
Notes 25, PCA and non-negative matrix factorization
Notes 26, sparse regression and dictionary learning
Notes 27, Gaussian mixture models
Notes 28, hidden Markov models
(instead of notes, here is an excellent tutorial paper by Rabiner)