CS 274A: Syllabus and Schedule, Winter 2025
Note: dates and topics may change slightly during the quarter, but the overall syllabus should remain largely the same.
Week 1: January 6th
- Probability Review: random variables, conditional and joint probabilities, Bayes' rule, law of total probability, factorization. Sets of random variables, the multivariate Gaussian model. Conditional independence and graphical models.
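As a one-line preview of the identities reviewed here, Bayes' rule and the law of total probability (the notation $x$, $\theta$ is illustrative, not the course's official notation):

$$
P(\theta \mid x) = \frac{P(x \mid \theta)\,P(\theta)}{P(x)}, \qquad P(x) = \sum_{\theta} P(x \mid \theta)\,P(\theta)
$$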
Week 2: January 13th
- Learning from Data: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood.
- Maximum Likelihood Learning: Maximum likelihood for Gaussian, binomial, multivariate, and other parametric models.
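As a quick illustration of the maximum likelihood principle for a Gaussian model, a minimal numpy sketch (not course-provided code; names are illustrative). The MLEs are the sample mean and the 1/n-normalized sample variance:

```python
import numpy as np

def gaussian_mle(x):
    # MLEs for a univariate Gaussian: sample mean and
    # sample variance with a 1/n (not 1/(n-1)) normalizer.
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()
    sigma2_hat = ((x - mu_hat) ** 2).mean()
    return mu_hat, sigma2_hat

# The estimates approach the true parameters (2.0, 1.5**2) as n grows.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)
print(gaussian_mle(data))
```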
Week 3: January 20th
- No lecture on Monday (university holiday)
- Sequence Models: Learning from sequential data. Markov models and related approaches. Connections with language models.
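A minimal sketch of maximum likelihood estimation for a first-order Markov model, i.e., normalized bigram counts (function and variable names are illustrative, not from the course):

```python
import numpy as np

def estimate_transition_matrix(seq, n_states):
    # P[i, j] = estimated probability of moving from state i to state j,
    # obtained by counting transitions and normalizing each row.
    counts = np.zeros((n_states, n_states))
    for s, t in zip(seq[:-1], seq[1:]):
        counts[s, t] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return counts / np.maximum(row_sums, 1)  # avoid divide-by-zero

# Example on a toy state sequence over states {0, 1, 2}:
seq = [0, 1, 1, 2, 0, 1, 2, 2, 0]
print(estimate_transition_matrix(seq, n_states=3))
```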
Week 4: January 27th
- Bayesian Learning: General principles of Bayesian estimation, including prior densities, posterior densities, and Beta-binomial examples.
- Bayesian Learning: Comparing point estimates (ML, MAP, MPE) and fully Bayesian approaches. Bayesian analysis of multinomial models and Markov models. Bayesian approaches to multi-armed bandits (in homework).
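For reference, the conjugate Beta-binomial update that the week's examples build on: with a $\mathrm{Beta}(\alpha, \beta)$ prior and $k$ successes observed in $n$ trials,

$$
\theta \sim \mathrm{Beta}(\alpha, \beta) \;\Rightarrow\; \theta \mid k, n \sim \mathrm{Beta}(\alpha + k,\; \beta + n - k)
$$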
Week 5: February 3rd
- Bayesian Learning: Bayesian analysis of Gaussian models. Predictive densities. Bayesian model selection.
- Bayesian Learning: Predictive densities for Gaussian models. Approximate Bayesian inference: Laplace, variational, and Monte Carlo methods.
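A minimal sketch of the Monte Carlo idea from this week: approximate a posterior expectation by averaging over posterior samples. The Gaussian "posterior" below is a stand-in chosen purely so the exact answer is known:

```python
import numpy as np

# Monte Carlo: approximate E[f(theta) | data] by averaging f over
# samples drawn from the posterior.
rng = np.random.default_rng(1)
theta = rng.normal(loc=1.0, scale=0.5, size=100_000)  # stand-in posterior
print(np.mean(theta ** 2))   # Monte Carlo estimate of E[theta^2]
print(1.0**2 + 0.5**2)       # exact value: mean^2 + variance = 1.25
```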
Week 6: February 10th
- Midterm Exam during Monday's class
- Regression Learning: Linear and non-linear (e.g., neural network) models. Probabilistic perspectives on regression. Loss functions. Parameter estimation methods for regression.
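An illustrative numpy sketch (not course code) of the probabilistic view of least squares: minimizing squared error is equivalent to maximum likelihood under an additive Gaussian noise model:

```python
import numpy as np

def fit_linear_regression(X, y):
    # Ordinary least squares via np.linalg.lstsq; equivalent to ML
    # under y = X w + eps with eps ~ N(0, sigma^2).
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w  # [intercept, slope(s)]

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=200)
y = 3.0 + 2.0 * x + rng.normal(scale=1.0, size=200)
print(fit_linear_regression(x, y))  # roughly [3.0, 2.0]
```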
Week 7: February 17th
- No lecture on Monday (university holiday)
- Regression Learning: Bayesian approaches to regression. The bias-variance trade-off for squared-error loss in regression.
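For reference, the standard squared-error bias-variance decomposition discussed this week, for data $y = f(x) + \epsilon$ with noise variance $\sigma^2$ and expectations taken over training sets:

$$
\mathbb{E}\!\left[(y - \hat{f}(x))^2\right] = \big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2 + \mathrm{Var}\big(\hat{f}(x)\big) + \sigma^2
$$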
Week 8: February 24th
- Classification Learning: Likelihood-based approaches and properties of objective functions. Connections between regression and classification. Logistic regression and neural network classifiers.
- Classification Learning: Decision boundaries, discriminant functions, optimal decisions, Bayes error rate.
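A minimal sketch of logistic regression fit by gradient descent on the negative log-likelihood (all names illustrative, not course-provided):

```python
import numpy as np

def fit_logistic_regression(X, y, lr=0.1, n_iters=2000):
    # Gradient descent on the negative log-likelihood (cross-entropy);
    # labels y are 0/1, X is (n, d).
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # P(y = 1 | x, w)
        w -= lr * X.T @ (p - y) / len(y)        # NLL gradient step
    return w

# Example: fitted weights recover the direction of the true separator.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 2))
y = (X @ np.array([1.5, -2.0]) + 0.5 > 0).astype(float)
print(fit_logistic_regression(X, y))
```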
Week 9: March 3rd
- Mixture Models and EM: Finite mixture models. The EM algorithm for learning Gaussian mixtures.
- Mixture Models and EM: Properties of the EM algorithm. Relation of K-means clustering to Gaussian mixture modeling. Mixtures for discrete and non-vector data.
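A minimal numpy sketch of EM for a one-dimensional Gaussian mixture (illustrative only; initialization and convergence checks are simplified):

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iters=200, seed=0):
    # Returns mixing weights, means, and standard deviations.
    rng = np.random.default_rng(seed)
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)  # initialize means at data points
    sigma = np.full(k, x.std())
    for _ in range(n_iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted maximum likelihood updates per component
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

# Example: recover two well-separated components.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 700)])
print(em_gmm_1d(x))
```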
Week 10: March 10th
- Latent Variable Models: Bayesian learning approaches. MCMC methods.
- Temporal Models: Autoregressive models, recurrent neural networks.
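A minimal sketch of a random-walk Metropolis sampler, one instance of the MCMC methods mentioned above (illustrative, not course code):

```python
import numpy as np

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    # Random-walk Metropolis: propose x' = x + noise, accept with
    # probability min(1, target(x') / target(x)), done in log space.
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        x_new = x + step * rng.normal()
        lp_new = log_target(x_new)
        if np.log(rng.uniform()) < lp_new - lp:  # accept/reject
            x, lp = x_new, lp_new
        samples[i] = x
    return samples

# Example: sample from a standard normal target.
draws = metropolis(lambda t: -0.5 * t**2, x0=0.0, n_samples=50_000)
print(draws.mean(), draws.std())  # roughly 0.0 and 1.0
```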
Finals Week:
- Final exam: Wednesday, March 19th, 10:30am - 12:30pm