CS 274A: Background Notes and Reading
Some of the class notes below may be updated during the quarter; any updates will be announced on Ed.
Class Notes
Background/Review Material on Probability
- Topics covered: random variables, conditional and joint probabilities, Bayes' rule, the law of total probability, the chain rule, and factorization. Frequentist and Bayesian views of probability.
- Required Reading:
- Optional Additional References and Reading
- Excellent 15-minute video on multivariate Gaussian densities from Alex Ihler
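As a small illustration of Bayes' rule and the law of total probability together, the sketch below computes a posterior for a hypothetical two-hypothesis testing scenario; the probabilities are made up for illustration, not taken from the notes.

```python
# Hypothetical example: P(H | E) via Bayes' rule, with the evidence term
# P(E) expanded using the law of total probability. All numbers are
# illustrative only.

def posterior(prior, like_h, like_not_h):
    """Return P(H | E) given P(H), P(E | H), and P(E | not H)."""
    # Law of total probability: P(E) = P(E|H) P(H) + P(E|~H) P(~H)
    evidence = like_h * prior + like_not_h * (1.0 - prior)
    # Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
    return like_h * prior / evidence

# e.g. P(H) = 0.01, P(E|H) = 0.95, P(E|~H) = 0.05
p = posterior(0.01, 0.95, 0.05)
```

Even with a highly accurate "test," the posterior stays modest here because the prior is small, which is the standard base-rate observation.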
Conditional Independence and Graphical Models
- Topics covered: Sets of random variables. Conditional independence and graphical
models. Markov models.
- Required Reading:
- Optional Additional References and Reading
- Chapter from Chris Bishop's book on graphical models (the material on graphical models starts about 20 pages into the document)
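A first-order Markov model is the simplest example of how a conditional independence assumption turns the chain rule into a compact factorization, p(x_1..x_n) = p(x_1) ∏_t p(x_t | x_{t-1}). A minimal sketch, with made-up two-state weather probabilities:

```python
# Illustrative first-order Markov chain over two states; the initial and
# transition probabilities below are invented for this example.

init = {"sun": 0.6, "rain": 0.4}
trans = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def markov_joint(seq):
    """Joint probability of a state sequence under the Markov factorization."""
    p = init[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        # Markov property: each state depends only on its predecessor,
        # not on the full history.
        p *= trans[prev][cur]
    return p
```

Without the Markov assumption, each factor in the chain rule would condition on the entire history; here every factor needs only one previous state.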
Learning from Data using Maximum Likelihood
- Topics: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood parameter estimation. Using maximum likelihood methods to learn the parameters of Gaussian, binomial, multivariate, and other parametric models.
- Required Reading:
- Recommended Reading:
- Optional Additional References and Reading
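For the univariate Gaussian, the maximum likelihood estimates have a closed form: the sample mean and the average squared deviation. A minimal sketch:

```python
# ML estimation for a univariate Gaussian. Note the 1/n divisor on the
# variance: the ML estimate, not the unbiased 1/(n-1) sample variance.

def gaussian_mle(xs):
    """Return (mu_hat, sigma2_hat), the ML estimates for N(mu, sigma^2)."""
    n = len(xs)
    mu = sum(xs) / n                                  # mu_hat = sample mean
    sigma2 = sum((x - mu) ** 2 for x in xs) / n       # sigma2_hat, 1/n divisor
    return mu, sigma2
```

These are the values obtained by setting the gradient of the log-likelihood to zero, the same recipe used for the other parametric models above.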
Bayesian Learning
- Topics: General principles of Bayesian estimation: prior densities, posterior densities, MAP,
fully Bayesian approaches. Beta/binomial and Gaussian examples. Predictive densities, model selection, model averaging.
- Required Reading:
- Recommended Reading:
- Optional Additional References and Reading
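The beta/binomial example is fully conjugate: a Beta(a, b) prior on the success probability combined with k successes in n trials gives a Beta(a + k, b + n − k) posterior, so the posterior mean and MAP estimate can be written down directly. A sketch of those updates:

```python
# Beta/binomial conjugacy: prior Beta(a, b), data k successes out of n.

def beta_binomial_posterior(a, b, k, n):
    """Posterior parameters: Beta(a + k, b + n - k)."""
    return a + k, b + n - k

def posterior_mean(a, b, k, n):
    a_post, b_post = beta_binomial_posterior(a, b, k, n)
    return a_post / (a_post + b_post)

def map_estimate(a, b, k, n):
    # Posterior mode; valid when both posterior parameters exceed 1.
    a_post, b_post = beta_binomial_posterior(a, b, k, n)
    return (a_post - 1) / (a_post + b_post - 2)
```

With a uniform Beta(1, 1) prior, the MAP estimate reduces to the maximum likelihood estimate k/n, while the posterior mean is shrunk slightly toward 1/2.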
Optimization Methods for Machine Learning
- Topics: General principles of finding minima/maxima of multivariate functions, gradient and Hessian methods,
stochastic gradient methods.
- Recommended Reading:
- Optional Additional References and Reading
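The core idea of gradient methods is to repeatedly step against the gradient of the objective. A minimal sketch on a one-dimensional quadratic, f(w) = (w − 3)², whose gradient is 2(w − 3); the step size and iteration count here are arbitrary choices for illustration:

```python
# Plain (batch) gradient descent on a scalar function; stochastic gradient
# methods replace grad(w) with a noisy estimate from a data subsample.

def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a function given its gradient, starting from w0."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move against the gradient direction
    return w

# Minimizer of f(w) = (w - 3)^2 is w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```

For this quadratic each step multiplies the error by (1 − 2·lr), so the iterates converge geometrically to the minimizer for small enough step sizes.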
Regression Models
- Topics: Linear models. Systematic and stochastic components.
Parameter estimation methods for regression. Maximum likelihood and Bayesian
interpretations.
- Required Reading:
- Recommended Reading:
- Optional Additional References and Reading
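For simple linear regression, the least squares parameter estimates (which coincide with maximum likelihood under additive Gaussian noise) have a closed form in terms of sample means and covariances. A sketch with invented data:

```python
# Closed-form least squares fit of y ~ w * x + b.

def fit_line(xs, ys):
    """Return (w, b) minimizing sum_i (y_i - w x_i - b)^2."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance term
    sxx = sum((x - mx) ** 2 for x in xs)                    # variance term
    w = sxy / sxx
    b = my - w * mx   # line passes through the mean point (mx, my)
    return w, b
```

The same normal-equation structure carries over to the multivariate linear models covered in the notes, with the scalar ratio replaced by a matrix solve.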
Classification
- Topics: Bayes' rule, classification boundaries, discriminant functions, optimal decisions, and the Bayes error rate. Likelihood-based approaches and properties of objective functions. Logistic regression and neural network models.
- Required Reading:
- Optional Reading:
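Logistic regression ties several of these ideas together: a linear discriminant function g(x), a class probability given by the sigmoid of g(x), and a decision boundary where g(x) = 0 (i.e. probability 0.5). A sketch with hand-fixed weights (not learned; for illustration only):

```python
import math

# Logistic regression classifier with fixed, illustrative weights.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(w, b, x):
    """P(class 1 | x) = sigmoid of the linear discriminant g(x)."""
    g = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(g)

def classify(w, b, x):
    # Decision boundary at g(x) = 0, i.e. predicted probability 0.5.
    return 1 if predict_proba(w, b, x) >= 0.5 else 0
```

In practice the weights are fit by maximizing the conditional log-likelihood (cross-entropy), one of the likelihood-based objective functions mentioned above.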
The EM Algorithm, Mixture Models, and Probabilistic Clustering
- Topics: Mixtures of Gaussians and the associated EM algorithm.
K-means clustering. Underlying theory of the EM algorithm.
- Required Reading:
- Optional Reading:
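A single EM iteration for a two-component univariate Gaussian mixture makes the E-step/M-step structure concrete: compute responsibilities under the current parameters, then re-estimate parameters with responsibility-weighted averages. A sketch (initialization and data are illustrative only):

```python
import math

# One EM iteration for a 2-component univariate Gaussian mixture.

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_step(xs, pis, mus, vars_):
    """One E-step + M-step; returns updated (mixing weights, means, variances)."""
    # E-step: responsibility r[i][k] = P(component k | x_i, current params)
    r = []
    for x in xs:
        w = [pis[k] * normal_pdf(x, mus[k], vars_[k]) for k in range(2)]
        s = sum(w)
        r.append([wk / s for wk in w])
    # M-step: responsibility-weighted updates
    n = len(xs)
    nk = [sum(r[i][k] for i in range(n)) for k in range(2)]
    new_pis = [nk[k] / n for k in range(2)]
    new_mus = [sum(r[i][k] * xs[i] for i in range(n)) / nk[k] for k in range(2)]
    new_vars = [sum(r[i][k] * (xs[i] - new_mus[k]) ** 2 for i in range(n)) / nk[k]
                for k in range(2)]
    return new_pis, new_mus, new_vars
```

K-means can be viewed as the limiting case of this procedure with hard (0/1) responsibilities and fixed, shared variances, a connection developed in the notes.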
State-Space and Time-Series Models
- Topics: State-space models with discrete and continuous latent states. Hidden Markov models, Kalman filters.
- Optional Reading: