CS 274A: Syllabus and Schedule, Winter 2026
Note: dates and topics may change slightly during the quarter, but the overall syllabus should remain largely the same.
Week 1: January 5th
- Probability Review: random variables, conditional and joint probabilities, Bayes' rule, the law of total probability, factorization. Sets of random variables, the multivariate Gaussian model.
- Probability Review: Conditional independence and graphical models. (A short worked example of Bayes' rule follows this list.)
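To make the review concrete, here is a minimal worked example of Bayes' rule and the law of total probability in Python. The disease-testing numbers are invented purely for illustration and are not from the course materials.

```python
# Hypothetical numbers, chosen only to illustrate Bayes' rule:
# P(D) = 0.01, P(T|D) = 0.95, P(T|not D) = 0.05.
p_d = 0.01           # prior P(disease)
p_t_given_d = 0.95   # likelihood P(positive test | disease)
p_t_given_nd = 0.05  # P(positive test | no disease)

# Law of total probability: P(T) = P(T|D)P(D) + P(T|~D)P(~D)
p_t = p_t_given_d * p_d + p_t_given_nd * (1 - p_d)

# Bayes' rule: P(D|T) = P(T|D)P(D) / P(T)
p_d_given_t = p_t_given_d * p_d / p_t
print(f"P(disease | positive test) = {p_d_given_t:.3f}")  # ~0.161
```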
Week 2: January 12th
- Learning from Data: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood.
- Maximum Likelihood Learning: Maximum likelihood for Gaussian, binomial, multivariate, and other parametric models (see the sketch after this list).
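As an illustration of the maximum likelihood principle, the sketch below fits a univariate Gaussian to synthetic data; the ML estimates are simply the sample mean and the sample variance with a 1/N (not 1/(N-1)) normalizer. This is an illustrative sketch, not course code.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=1000)  # synthetic data

# ML estimates for a univariate Gaussian: the sample mean and the
# sample variance with a 1/N normalizer.
mu_ml = x.mean()
var_ml = np.mean((x - mu_ml) ** 2)
print(f"mu_ML = {mu_ml:.3f}, var_ML = {var_ml:.3f}")
```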
Week 3: January 19th
- No lecture on Monday (university holiday)
- Sequence Models: Learning from sequential data. Markov models and related approaches. Connections with large language models (a transition-matrix sketch follows).
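As a companion to the Markov model material, here is a minimal sketch of maximum likelihood estimation for a first-order Markov transition matrix. The state sequence is made up for illustration.

```python
import numpy as np

# Toy state sequence over K = 3 states (illustrative data).
seq = [0, 1, 1, 2, 0, 1, 2, 2, 2, 0, 1, 0]
K = 3

# ML estimate of a first-order transition matrix:
# A[i, j] = count(i -> j) / count(i -> anything).
counts = np.zeros((K, K))
for s, t in zip(seq[:-1], seq[1:]):
    counts[s, t] += 1
A = counts / counts.sum(axis=1, keepdims=True)
print(A)
```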
Week 4: January 26th
- Bayesian Learning: General principles of Bayesian estimation: prior densities, posterior densities, Beta-binomial examples.
- Bayesian Learning: Comparing point estimates (ML, MAP, MPE) and fully Bayesian approaches. Bayesian analysis of multinomial models and Markov models. Bayesian approaches to multi-armed bandits (in homework). (A Beta-binomial sketch follows this list.)
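The following sketch illustrates the conjugate Beta-binomial update and compares maximum likelihood, MAP, and posterior-mean point estimates. The prior pseudo-counts and coin-flip data are illustrative.

```python
# Beta-binomial conjugate update (illustrative prior and data).
alpha0, beta0 = 2.0, 2.0   # Beta prior pseudo-counts
heads, tails = 7, 3        # observed coin flips

# Posterior is Beta(alpha0 + heads, beta0 + tails).
alpha_n, beta_n = alpha0 + heads, beta0 + tails

theta_ml = heads / (heads + tails)                  # maximum likelihood
theta_map = (alpha_n - 1) / (alpha_n + beta_n - 2)  # posterior mode (MAP)
theta_mean = alpha_n / (alpha_n + beta_n)           # posterior mean
print(theta_ml, theta_map, theta_mean)  # 0.7, 0.667, 0.643
```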
Week 5: February 2nd
- Bayesian Learning: Bayesian analysis of Gaussian models. Predictive densities.
- Bayesian Learning: Predictive densities for Gaussian models. Bayesian model selection. Approximate Bayesian inference: Laplace, variational, and Monte Carlo methods (a Monte Carlo sketch follows).
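A minimal sketch of Monte Carlo approximation of a posterior expectation, using a Beta posterior so the exact answer is available for comparison; the numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Posterior from a Beta-binomial analysis (illustrative numbers).
alpha_n, beta_n = 9.0, 5.0

# Monte Carlo estimate of the posterior expectation E[theta^2],
# compared with the exact value a(a+1) / ((a+b)(a+b+1)).
samples = rng.beta(alpha_n, beta_n, size=100_000)
mc_estimate = np.mean(samples ** 2)
exact = alpha_n * (alpha_n + 1) / ((alpha_n + beta_n) * (alpha_n + beta_n + 1))
print(f"Monte Carlo: {mc_estimate:.4f}, exact: {exact:.4f}")
```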
Week 6: February 9th
- Midterm Exam during Monday's class
- Regression Learning: Linear and non-linear (e.g., neural network) models. Probabilistic perspectives on regression. Loss functions. Parameter estimation methods for regression (see the sketch below).
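As a sketch of the probabilistic perspective on regression: under a Gaussian noise model, maximum likelihood for a linear model reduces to least squares. The data are synthetic and illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + 1 + Gaussian noise (illustrative).
x = rng.uniform(-1, 1, size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=x.shape)

# Under Gaussian noise, ML for linear regression reduces to least
# squares; solve the normal equations via lstsq.
X = np.column_stack([x, np.ones_like(x)])  # design matrix with intercept
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"slope = {w[0]:.3f}, intercept = {w[1]:.3f}")
```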
Week 7: February 16th
- No lecture on Monday (university holiday)
- Regression Learning: Bayesian approaches to regression. The bias-variance trade-off for squared-error loss in regression (the decomposition is written out below).
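For reference, the standard squared-error bias-variance decomposition, with $y = f(x) + \varepsilon$, noise variance $\sigma^2$, and $\hat{f}(x; D)$ an estimator trained on a random data set $D$:

```latex
\mathbb{E}_{D,\varepsilon}\!\left[(y - \hat{f}(x; D))^2\right]
  = \underbrace{\left(f(x) - \mathbb{E}_D[\hat{f}(x; D)]\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\left(\hat{f}(x; D) - \mathbb{E}_D[\hat{f}(x; D)]\right)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{noise}}
```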
Week 8: February 23rd
- Classification Learning: Likelihood-based approaches and properties of objective functions. Connections between regression and classification. Logistic regression and neural network classifiers.
- Classification Learning: Decision boundaries, discriminant functions, optimal decisions, and the Bayes error rate. (A logistic regression sketch follows this list.)
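A minimal sketch of logistic regression trained by gradient descent on the negative log-likelihood, on synthetic 1-D data; the learning rate, iteration count, and data are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data in 1-D (illustrative).
x0 = rng.normal(-1.0, 1.0, size=100)
x1 = rng.normal(+1.0, 1.0, size=100)
x = np.concatenate([x0, x1])
y = np.concatenate([np.zeros(100), np.ones(100)])
X = np.column_stack([x, np.ones_like(x)])  # add intercept

# Logistic regression by gradient descent on the average negative
# log-likelihood.
w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))  # predicted P(y=1 | x)
    grad = X.T @ (p - y) / len(y)     # gradient of the average NLL
    w -= lr * grad

# The decision boundary P(y=1|x) = 0.5 is where w[0]*x + w[1] = 0.
print(f"weights = {w}, boundary at x = {-w[1] / w[0]:.3f}")
```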
Week 9: March 2nd
- Mixture Models and EM: Finite mixture models. The EM algorithm for learning Gaussian mixtures.
- Mixture Models and EM: Properties of the EM algorithm. The relationship of K-means clustering to Gaussian mixture modeling. Mixtures for discrete and non-vector data. (An EM sketch follows this list.)
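A compact EM sketch for a two-component univariate Gaussian mixture on synthetic data; the initialization and iteration count are arbitrary illustrative choices, not course code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data from two Gaussians (illustrative).
x = np.concatenate([rng.normal(-2, 0.8, 150), rng.normal(3, 1.2, 100)])

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# EM for a two-component Gaussian mixture.
pi, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(100):
    # E-step: membership probabilities (responsibilities).
    r = pi * gauss(x[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: weighted ML updates of the mixture parameters.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
print(pi, mu, var)
```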
Week 10: March 9th
- Latent Variable Models: Bayesian learning approaches. MCMC methods.
- Temporal Models: Autoregressive models, recurrent neural networks (an AR(1) sketch follows).
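A minimal autoregressive sketch: with Gaussian noise, conditional maximum likelihood for the AR(1) coefficient reduces to regressing x_t on x_{t-1}. The series is synthetic and illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic AR(1) series: x_t = 0.8 * x_{t-1} + noise (illustrative).
T = 1000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + rng.normal(scale=0.5)

# With Gaussian noise, conditional ML for the AR(1) coefficient is
# least-squares regression of x_t on x_{t-1}.
phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(f"estimated AR(1) coefficient: {phi:.3f}")  # close to 0.8
```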
Finals Week: March 15th
- Final exam: day and time TBD