- Lecture 1: Intro and Probability
- Lab 1: Bayes' Theorem and Python Tech
- Lecture 2: Probability and LLN
- Lecture 3: From Monte Carlo to Frequentism
- Lab 2: Frequentism, Bootstrap, and MLE
- Lecture 4: MLE, Sampling, and Learning
- Lecture 5: Regression, AIC, and Information Theory
- Lab 3: Generating Regression Data, Fitting It, Training, and Testing
- Lecture 6: Risk and Information
- Lecture 7: From Entropy to Bayes
- Lab 4: Bayesian Quantities in the Globe Model
- Lecture 8: Bayes and Sampling
- Lecture 9: Bayes and Sampling
- Lab 5: Logistic Regression and Sundry Bayesian Topics
- Lecture 10: Sampling and Gradient Descent
- Lab 6: Sampling and PyTorch
- Lecture 11: Gradient Descent and Neural Networks
- Lecture 12: Non-Linear Approximation to Classification
- Lab 7: still to come
- Lecture 13: Classification, Mixtures, and EM
- Lecture 14: EM and Hierarchical Models
- Lab 8: EM and Hierarchicals
- Lecture 15: MCMC
- Lecture 16: MCMC and Gibbs
- Lab 9: Sampling and PyMC3
- Lecture 17: Data Augmentation, Gibbs, and HMC
- Lecture 18: HMC and Formal Tests
- Lab 10: Jacobians and Tumors
- Lecture 19: NUTS, Formal Tests, and Hierarchicals
- Lecture 20: Regression, GLMs, and Model Specification
- Lab 11: Gelman Schools Hierarchical and Prosocial Chimps GLM
- Lecture 21: From Hierarchical GLMs to Gaussian Processes
- Lecture 22: Decisions and Model Comparison
- Lecture 23: Cross-Validation, Priors, and Workflow
- Lab 12: GLM and Workflow
- Lecture 24: Variational Inference
- Lecture 25: Variational Inference and Mixtures
- Lecture 26: Wrapup