This concise review of linear algebra summarizes some of the background needed for the course.
See also some notes on basic matrix-vector manipulations.
Notes 01, introduction
I. Vector spaces and linear representations
Notes 02, first look at linear representations
Notes 03, linear vector spaces
Notes 04, norms and inner products
(see also the auxiliary primer on analysis)
Notes 05, linear approximation
Notes 06, orthobases (a short illustrative sketch follows at the end of this part)
Notes 07, non-orthogonal bases
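To make the ideas in Notes 05 and 06 concrete, here is a minimal MATLAB sketch (not part of the course notes; the dimensions and variable names are arbitrary) that expands a vector in an orthobasis and forms the best k-term approximation by keeping the largest coefficients:

    % Illustrative sketch: expansion in an orthobasis and best k-term
    % linear approximation (dimensions chosen arbitrarily).
    n = 64; k = 8;
    [Q, ~] = qr(randn(n));        % columns of Q form an orthobasis for R^n
    y = randn(n, 1);              % vector to approximate
    a = Q' * y;                   % expansion coefficients a(i) = <y, Q(:,i)>
    [~, idx] = sort(abs(a), 'descend');
    a_k = zeros(n, 1);
    a_k(idx(1:k)) = a(idx(1:k));  % keep the k largest coefficients
    y_k = Q * a_k;                % best k-term approximation in this basis
    fprintf('relative error: %.3f\n', norm(y - y_k) / norm(y));

By Parseval, keeping the k largest-magnitude coefficients minimizes the approximation error over all k-term expansions in this basis.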
II. Regression using least squares
Notes 08, regression, the least-squares problem, regularization
(see also the MATLAB script regression_examples.m and the short sketch at the end of this part)
Notes 09, least squares in Hilbert space
Notes 10, reproducing kernel Hilbert spaces
Notes 11, kernel models, Mercer’s theorem
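As a complement to regression_examples.m, the following minimal MATLAB sketch (not from the course materials; the problem sizes, noise level, and lambda are arbitrary) sets up a toy regression problem and compares the ordinary least-squares solution with a ridge-regularized one, as in Notes 08:

    % Illustrative sketch: ordinary vs. ridge-regularized least squares
    % on synthetic data y = A*x_true + noise.
    m = 100; n = 10;
    A = randn(m, n);
    x_true = randn(n, 1);
    y = A * x_true + 0.1 * randn(m, 1);

    x_ls = A \ y;                 % least squares: minimize ||A*x - y||_2
    lambda = 0.1;                 % regularization weight (arbitrary choice)
    x_ridge = (A'*A + lambda*eye(n)) \ (A'*y);  % minimize ||A*x - y||_2^2 + lambda*||x||_2^2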
III. Solving and analyzing least squares problems
Notes 12, symmetric systems and eigenvalue decompositions
Notes 13, the singular value decomposition
Notes 14, stable least squares (a short SVD-based sketch follows at the end of this part)
(optional) Notes 15, matrix factorization
Notes 16, gradient descent and conjugate gradients for least squares
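The sketch below (again illustrative only; the conditioning and truncation threshold are contrived) uses the SVD from Notes 13 to solve an ill-conditioned least squares problem stably, in the spirit of Notes 14, by truncating small singular values:

    % Illustrative sketch: truncated-SVD solution of an ill-conditioned
    % least squares problem.
    m = 100; n = 10;
    A = randn(m, n) * diag(logspace(0, -8, n));  % badly conditioned by design
    y = A * randn(n, 1) + 1e-6 * randn(m, 1);

    [U, S, V] = svd(A, 'econ');
    s = diag(S);
    r = sum(s > 1e-6 * s(1));     % discard singular values below a threshold
    x_tsvd = V(:, 1:r) * ((U(:, 1:r)' * y) ./ s(1:r));  % stable pseudoinverse solve

Truncating the small singular values trades a little bias for a large reduction in the noise amplification 1/s(i) that the smallest singular values would otherwise cause.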
IV. Statistical estimation and classification
Notes 17, review of central concepts in probability
Notes 18, Gaussian estimation
Notes 19, maximum likelihood estimation
Notes 20, consistency, efficiency, and the Cramér-Rao bound
Notes 21, Bayesian estimation
Notes 22, the Bayes classifier and nearest neighbor
Notes 23, empirical risk minimization
Notes 24, logistic regression
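To tie the estimation and classification threads together, here is a minimal MATLAB sketch (not from the course notes; the data model, step size, and iteration count are arbitrary) that fits a logistic regression model by maximum likelihood using gradient descent, combining the ideas of Notes 16, 19, and 24:

    % Illustrative sketch: maximum likelihood logistic regression
    % fit by gradient descent on synthetic data.
    m = 200; n = 2;
    A = randn(m, n);
    theta_true = [2; -1];
    y = double(rand(m, 1) < 1 ./ (1 + exp(-A * theta_true)));  % labels in {0,1}

    theta = zeros(n, 1);
    step = 0.5;                   % fixed step size (arbitrary choice)
    for iter = 1:1000
        g = A' * (1 ./ (1 + exp(-A * theta)) - y) / m;  % gradient of the average negative log-likelihood
        theta = theta - step * g;
    end
    disp(theta');                 % should be close to theta_true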