CS 274A: Syllabus and Schedule, Winter 2025
Note: dates and topics may change slightly during the quarter, but the overall syllabus should remain largely the same.
Week 1: January 6th
- Probability Review: random variables, conditional and joint probabilities, Bayes rule, law of total probability, factorization. Sets of random variables, the multivariate Gaussian model. Conditional independence and graphical models.
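For reference (using one common notation, not taken from the course materials), Bayes' rule with the evidence term expanded via the law of total probability:

```latex
% Bayes' rule; the evidence P(x) expanded by the law of total probability
P(\theta \mid x) \;=\; \frac{P(x \mid \theta)\,P(\theta)}{P(x)}
             \;=\; \frac{P(x \mid \theta)\,P(\theta)}{\sum_{\theta'} P(x \mid \theta')\,P(\theta')}
```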
Week 2: January 13th
- Learning from Data: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood.
- Maximum Likelihood Learning: Maximum likelihood for Gaussian, binomial, multivariate, and other parametric models.
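As a concrete instance of the maximum likelihood principle (a standard result; notation is mine): for i.i.d. samples $x_1, \ldots, x_N$ from a univariate Gaussian, the log-likelihood has closed-form maximizers:

```latex
% Gaussian log-likelihood and its maximum likelihood estimates
\log p(x_{1:N} \mid \mu, \sigma^2)
  = -\tfrac{N}{2}\log(2\pi\sigma^2) - \tfrac{1}{2\sigma^2}\sum_{i=1}^{N}(x_i - \mu)^2,
\qquad
\hat{\mu}_{\text{ML}} = \tfrac{1}{N}\sum_{i=1}^{N} x_i,
\quad
\hat{\sigma}^2_{\text{ML}} = \tfrac{1}{N}\sum_{i=1}^{N}\big(x_i - \hat{\mu}_{\text{ML}}\big)^2
```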
Week 3: January 20th
- No lecture on Monday (university holiday)
- Sequence Models: Learning from sequential data. Markov models and related approaches. Connections with language models.
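The factorization behind Week 3's sequence models, in one common form: a first-order Markov model decomposes the joint distribution of a sequence into local conditionals, the same chain-rule structure underlying n-gram language models:

```latex
% First-order Markov factorization of a length-T sequence
P(x_1, \ldots, x_T) \;=\; P(x_1)\,\prod_{t=2}^{T} P(x_t \mid x_{t-1})
```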
Week 4: January 27th
- Bayesian Learning: General principles of Bayesian estimation: prior densities, posterior densities, Beta-binomial examples.
Week 5: February 3rd
- Bayesian Learning: Comparing point estimates (ML, MAP, MPE) and fully Bayesian approaches. Bayesian analysis of multinomial models and Markov models. Bayesian approaches to multi-armed bandits (in homework).
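The Beta-binomial example from Week 4 is the canonical illustration of conjugacy (a standard result; the hyperparameter names $\alpha, \beta$ follow the usual convention):

```latex
% Beta prior + binomial likelihood => Beta posterior, in closed form
\theta \sim \mathrm{Beta}(\alpha, \beta), \quad
x \mid \theta \sim \mathrm{Binomial}(n, \theta)
\;\;\Longrightarrow\;\;
\theta \mid x \sim \mathrm{Beta}(\alpha + x,\; \beta + n - x)
```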
Week 6: February 10th
- Bayesian Learning: Bayesian analysis of Gaussian models. Predictive densities. Bayesian model selection. Approximate Bayesian inference: Laplace, variational, and Monte Carlo methods.
- Regression Learning: Linear and non-linear (e.g., neural network) models. Probabilistic perspectives on regression. Loss functions. Parameter estimation methods for regression.
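Week 6's predictive densities all share one generic form (a standard Bayesian identity; notation is mine): average the likelihood of a new observation over the parameter posterior given data $D$:

```latex
% Posterior predictive density: integrate out the parameters
p(x_{\text{new}} \mid D) \;=\; \int p(x_{\text{new}} \mid \theta)\, p(\theta \mid D)\, d\theta
```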
Week 7: February 17th
- No lecture on Monday (university holiday)
- Midterm Exam during Wednesday's class
Week 8: February 24th
- Regression Learning: Bayesian approaches to regression. The bias-variance trade-off for squared-error loss in regression (see the decomposition below).
- Classification Learning: Likelihood-based approaches and properties of objective functions. Connections between regression and classification. Logistic regression and neural network classifiers.
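The trade-off referenced above comes from the standard decomposition of expected squared error at a fixed input $x$, taken over random training sets $D$ (noise-free version shown; an irreducible noise term is added when targets are noisy):

```latex
% Bias-variance decomposition of expected squared error at a fixed x
\mathbb{E}_{D}\!\big[(\hat{f}_D(x) - f(x))^2\big]
  = \big(\mathbb{E}_D[\hat{f}_D(x)] - f(x)\big)^2
  + \mathbb{E}_D\!\big[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\big]
% first term: squared bias; second term: variance
```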
Week 9: March 3rd
- Classification Learning: Decision boundaries, discriminant functions, optimal decisions, Bayes error rate.
- Mixture Models and EM: Finite mixture models. The EM algorithm for learning Gaussian mixtures.
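In symbols (standard definitions; notation is mine), the two central Week 9 objects are the Bayes-optimal decision rule and the finite Gaussian mixture density that EM is used to fit:

```latex
% Bayes-optimal decision rule: choose the class with maximum posterior
\hat{y}(x) \;=\; \arg\max_{k}\; P(y = k \mid x)
           \;=\; \arg\max_{k}\; p(x \mid y = k)\, P(y = k)

% Gaussian mixture density with K components and mixing weights \pi_k
p(x) \;=\; \sum_{k=1}^{K} \pi_k\, \mathcal{N}(x \mid \mu_k, \Sigma_k),
\qquad \pi_k \ge 0, \;\; \textstyle\sum_{k=1}^{K} \pi_k = 1
```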
Week 10: March 10th
- Mixture Models and EM: Properties of the EM algorithm (update equations sketched below). Relation of K-means clustering to Gaussian mixture modeling. Mixtures for discrete and non-vector data.
- Additional Topics (time permitting): latent variable models, temporal models.
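For reference, the standard EM updates for a Gaussian mixture (a textbook result; symbols as in the Week 9 note above). K-means arises as a limiting case: hard assignments with shared isotropic covariances $\sigma^2 I$ as $\sigma^2 \to 0$.

```latex
% E-step: responsibility of component k for data point x_i
r_{ik} \;=\; \frac{\pi_k\, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}
                  {\sum_{j=1}^{K} \pi_j\, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}

% M-step: closed-form updates, with effective counts N_k = \sum_i r_{ik}
\pi_k = \frac{N_k}{N}, \qquad
\mu_k = \frac{1}{N_k}\sum_{i=1}^{N} r_{ik}\, x_i, \qquad
\Sigma_k = \frac{1}{N_k}\sum_{i=1}^{N} r_{ik}\,(x_i - \mu_k)(x_i - \mu_k)^{\!\top}
```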
Finals Week:
- Final exam: Wednesday, March 19th, 10:30am - 12:30pm