Dr. Rina Dechter - University of California, Irvine
Revised on Feb. 15, 2024

CS 276 - Causal and Probabilistic Reasoning with Graphical Models
2023-24, Winter (Q2)


Course Reference

Course Description

One of the main challenges in building intelligent systems is the ability to perform causal inference under uncertainty. Graphical models, which include Bayesian networks and their extension to structural causal models, offer a powerful and successful methodology for operationalizing causal inference in a wide spectrum of applications. Intelligent systems based on Bayesian networks are used in a variety of real-world applications, including diagnosis, sensor fusion, on-line help systems, credit assessment, bioinformatics, and data mining. Causal reasoning is at the core of human reasoning and thus should play a central part in artificial intelligence and machine learning.

The objective of this course is to provide an in-depth exposition of causal reasoning under uncertainty using structural causal models and Bayesian networks. Both theoretical underpinnings and practical considerations will be covered, with special emphasis on reasoning across the three rungs of the ladder of causation: (1) association, (2) intervention, and (3) counterfactuals.

Prerequisites

  • Familiarity with basic concepts of probability theory.
  • Knowledge of basic computer science, algorithms and programming principles.
  • Previous exposure to AI is desirable but not required.

Target Students:

This course is intended for PhD students in AI, Machine Learning, and Statistics. It is also appropriate for students from other disciplines who have an interest in causality and the relevant AI/ML/Statistics background.

Course Topics

  1. Probabilistic Graphical Models, Structural Causal Models, The Causal Hierarchy.
  2. Bayesian and Markov Networks: Representing independencies by graphs; d-separation.
  3. Algorithms for probabilistic reasoning (Bucket elimination for summation and optimization; Join-trees; The induced-width).
  4. Sampling schemes for graphical models (MCMC and Importance Sampling).
  5. Structural Causal Models; Identification of Causal Effect; The problem of confounding.
  6. The Back-Door and Front-Door Criteria and the Do-Calculus.
  7. Linear Causal Models.
  8. Counterfactuals.
  9. Algorithms for identification using c-components. The ID algorithm.
  10. Learning Bayesian networks and Causal graphs (causal discovery).
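To give a flavor of topics 5 and 6, the sketch below works through a toy confounded model. All numbers and variable names are hypothetical, chosen only for illustration: a binary confounder Z influences both treatment X and outcome Y, and the back-door adjustment formula P(y | do(x)) = Σ_z P(y | x, z) P(z) is compared against the observational conditional P(y | x), which mixes the causal effect with confounding bias.

```python
# Toy SCM (hypothetical numbers): Z -> X, Z -> Y, X -> Y, all binary.
p_z = {0: 0.5, 1: 0.5}                   # P(Z = z)
p_x1_given_z = {0: 0.2, 1: 0.8}          # P(X = 1 | Z = z)
p_y1_given_xz = {(0, 0): 0.1, (0, 1): 0.5,
                 (1, 0): 0.3, (1, 1): 0.9}  # P(Y = 1 | X = x, Z = z)

# Back-door adjustment: P(Y=1 | do(X=1)) = sum_z P(Y=1 | X=1, z) P(z).
p_do = sum(p_y1_given_xz[(1, z)] * p_z[z] for z in (0, 1))

# Observational conditional: P(Y=1 | X=1) = sum_z P(Y=1 | X=1, z) P(z | X=1),
# where Z is weighted by P(z | X=1) instead of P(z) -- the source of bias.
p_x1 = sum(p_x1_given_z[z] * p_z[z] for z in (0, 1))
p_z_given_x1 = {z: p_x1_given_z[z] * p_z[z] / p_x1 for z in (0, 1)}
p_obs = sum(p_y1_given_xz[(1, z)] * p_z_given_x1[z] for z in (0, 1))

print(f"P(Y=1 | do(X=1)) = {p_do:.3f}")   # interventional (causal) quantity
print(f"P(Y=1 | X=1)     = {p_obs:.3f}")  # observational (confounded) quantity
```

In this toy model the observational quantity overstates the causal effect, because units with Z = 1 are both more likely to be treated and more likely to have Y = 1.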

Homework Assignments:

There will be four homework assignments, each with roughly one week to complete.

Course Project:

Each student will also undertake a project based on papers from the recent literature. The project involves learning about and presenting an assigned paper or body of literature in class, and writing a project report.

Grading Policy:

Homework: 70%. Course Project: 30% (15% presentation + 15% report).



Syllabus

Week 1
  • Dates: M 1/8, W 1/10
  • Topics: Probabilistic graphical models and causal Bayesian networks; Bayesian and Markov networks: representing independencies by graphs
  • Lectures: Lec 1, Lec 2
  • Slides: Slides 1, Slides 2
  • Reading: [Darwiche] Ch. 1, 3, 4; [Russell-Norvig] Ch. 13

Week 2
  • Dates: M 1/15 (No Class!!!), W 1/17
  • Topics: Bayesian and Markov networks: representing independencies by graphs; the notion of d-separation
  • Lectures: Lec 3
  • Slides: Slides 3
  • Homework: HW 1
  • Reading: [Darwiche] Ch. 4

Week 3
  • Dates: M 1/22, W 1/24
  • Topics: Variational algorithms for probabilistic graphical models (bucket elimination for summation and optimization); join-trees; belief propagation; the induced-width
  • Lectures: Lec 4, Lec 5
  • Slides: Slides 4, GenIE, Slides 5
  • Reading: [Dechter] Ch. 4, 5.1-5.2; [Darwiche] Ch. 6

Week 4
  • Dates: M 1/29, W 1/31
  • Topics: Structural causal models; identification of causal effects; the problem of confounding
  • Lectures: Lec 6, Lec 7
  • Slides: Slides 6, Slides 7
  • Homework: HW 2
  • Reading: [Dechter] Ch. 3.4; [Primer] Ch. 1, 2

Week 5
  • Dates: M 2/5, W 2/7
  • Topics: The back-door criterion
  • Lectures: Lec 8, Lec 9
  • Slides: Slides 8, Slides 9
  • Reading: [Primer] Ch. 3; [Causality] 1.3, 3.1-3.3

Week 6
  • Dates: M 2/12, W 2/14
  • Topics: The front-door criterion; the do-calculus
  • Lectures: Lec 10, Lec 11
  • Slides: Slides 10
  • Homework: HW 3
  • Reading: [Primer] Ch. 3.4-3.6; [Causality] 3.4

Week 7
  • Dates: M 2/19 (No Class!!!), W 2/21
  • Topics: Algorithms for identification using c-components; learning Bayesian networks from data (the EM algorithm)
  • Reading: [Primer] Ch. 4; [Causality] Ch. 7

Week 8
  • Dates: M 2/26, W 2/28
  • Topics: _

Week 9
  • Dates: M 3/4, W 3/6
  • Topics: _

Week 10
  • Dates: M 3/11, W 3/13
  • Topics: Student project presentations

Finals Week
  • Date: F 3/22, 8-10am
  • Project reports due