Bayesian Inference I

External Lecturer: Guido Sanguinetti
Course Type: PhD Course
Academic Year: 2021-2022
Period: First term
Duration: 36 h
Description:

Probabilistic models are an appealing way to reason about systems that exhibit intrinsic and/or observational uncertainty. An important question for such models is how observational data can be used to reduce and quantify that uncertainty, leading to improved predictions and scientific discovery. Bayesian inference provides a mathematically coherent framework for incorporating knowledge from observations into models, via algorithms that compute posterior distributions over unobserved model variables. In this course, we will introduce the concepts of Bayesian inference through the study of a number of probabilistic models and inference algorithms. We will focus particularly on models and methods that allow analytical computations, as these often provide insights that remain valuable for more complex models. Topics covered include probabilistic PCA, Gaussian processes, sampling, and variational inference algorithms.

Syllabus:

1. The multivariate Gaussian distribution: conditionals, marginals, and conjugate prior (and its problems)
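The conditioning identities for a partitioned Gaussian can be sketched in a few lines of NumPy (the function name and example numbers below are illustrative, not course material):

```python
import numpy as np

def gaussian_conditional(mu, Sigma, idx_a, idx_b, x_b):
    """Parameters of p(x_a | x_b = x_b) for a joint Gaussian N(mu, Sigma).

    Standard partitioned-Gaussian identities:
      mean = mu_a + S_ab S_bb^{-1} (x_b - mu_b)
      cov  = S_aa - S_ab S_bb^{-1} S_ba
    """
    S_aa = Sigma[np.ix_(idx_a, idx_a)]
    S_ab = Sigma[np.ix_(idx_a, idx_b)]
    S_bb = Sigma[np.ix_(idx_b, idx_b)]
    K = S_ab @ np.linalg.inv(S_bb)          # "regression" matrix
    mean_cond = mu[idx_a] + K @ (x_b - mu[idx_b])
    cov_cond = S_aa - K @ S_ab.T
    return mean_cond, cov_cond

# Example: 2D Gaussian, condition the first coordinate on the second.
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
m, C = gaussian_conditional(mu, Sigma, [0], [1], np.array([1.0]))
# m = 0.8 * 1.0 = 0.8; C = 1 - 0.8^2 = 0.36
```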

2. Laplace method and Fisher matrix
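The Laplace method fits a Gaussian around the posterior mode, with inverse variance given by the second derivative of the negative log posterior there (the observed information; the 1D analogue of the Fisher matrix). A toy sketch with a hypothetical `laplace_approx` helper and a deliberately simple gradient-descent mode finder:

```python
import numpy as np

def laplace_approx(neg_log_post, x0, h=1e-4, n_steps=200, lr=0.1):
    """1D Laplace approximation: gradient-descend to the mode using a
    numerical gradient, then take the numerical second derivative at the
    mode as the inverse variance of the approximating Gaussian."""
    grad = lambda x: (neg_log_post(x + h) - neg_log_post(x - h)) / (2 * h)
    x = x0
    for _ in range(n_steps):
        x -= lr * grad(x)
    # Observed information = curvature of -log posterior at the mode.
    hess = (neg_log_post(x + h) - 2 * neg_log_post(x) + neg_log_post(x - h)) / h**2
    return x, 1.0 / hess   # mode, variance

# Sanity check: for a Gaussian target N(3, 0.5^2) the approximation is exact.
nlp = lambda x: 0.5 * (x - 3.0)**2 / 0.25
mode, var = laplace_approx(nlp, x0=0.0)
```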

3. Linear/Gaussian models: probabilistic PCA and linear regression. Basis function regression.
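For Bayesian basis-function regression with prior w ~ N(0, alpha^-1 I) and noise precision beta, the weight posterior is Gaussian in closed form. A minimal sketch (function name, basis, and hyperparameter values are illustrative):

```python
import numpy as np

def blr_posterior(Phi, t, alpha=1.0, beta=25.0):
    """Posterior N(m_N, S_N) over weights:
      S_N^{-1} = alpha I + beta Phi^T Phi
      m_N      = beta S_N Phi^T t
    """
    d = Phi.shape[1]
    S_N = np.linalg.inv(alpha * np.eye(d) + beta * Phi.T @ Phi)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

# Synthetic data: t = 0.5 x - 0.3 plus Gaussian noise of std 0.2.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
t = 0.5 * x - 0.3 + rng.normal(scale=0.2, size=50)
Phi = np.column_stack([np.ones_like(x), x])   # basis functions: [1, x]
m_N, S_N = blr_posterior(Phi, t, alpha=2.0, beta=25.0)
# m_N should recover roughly [-0.3, 0.5]
```

Here beta = 25 matches the true noise precision (1/0.2^2); in practice it would be set by evidence maximisation or given a prior.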

4. Gaussian processes for regression and Bayesian Optimization.
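The GP posterior predictive for regression also comes in closed form; the sketch below uses a squared-exponential kernel and the standard Cholesky-based solve for numerical stability (all names and hyperparameter values are illustrative):

```python
import numpy as np

def rbf(x1, x2, ell=0.5):
    """Squared-exponential kernel on 1D inputs."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

def gp_predict(x_train, y_train, x_test, noise=0.1, ell=0.5):
    """GP posterior mean and (latent) variance at x_test, zero prior mean."""
    K = rbf(x_train, x_train, ell) + noise**2 * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    L = np.linalg.cholesky(K)                   # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf(x_test, x_test, ell)) - np.sum(v**2, axis=0)
    return mean, var

# Fit a noisy sine and predict at pi/2, where the truth is 1.
x = np.linspace(0, 2 * np.pi, 20)
y = np.sin(x)
m, v = gp_predict(x, y, np.array([np.pi / 2]), noise=0.05)
```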

5. Lab 1: linear regression and Gaussian Processes

6. Bayesian inference in non-conjugate models: Markov Chain Monte Carlo (MCMC), rejection and importance sampling, Metropolis-Hastings algorithm. Convergence diagnostics and rules of thumb.
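With a symmetric random-walk proposal, the Metropolis-Hastings acceptance ratio reduces to the ratio of target densities, so the sampler fits in a few lines (target, step size, and burn-in choice below are illustrative):

```python
import numpy as np

def metropolis(log_target, x0, n_steps=20000, step=2.0, seed=0):
    """Random-walk Metropolis with a symmetric Gaussian proposal:
    accept x' with probability min(1, p(x')/p(x))."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        x_prop = x + step * rng.normal()
        lp_prop = log_target(x_prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = x_prop, lp_prop
        samples[i] = x
    return samples

# Target: standard normal, via its log density up to a constant.
s = metropolis(lambda x: -0.5 * x**2, x0=0.0)
s = s[5000:]   # discard burn-in (a rule of thumb, not a guarantee)
```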

7. Generalised linear models (GLMs) and inference; Gaussian processes for classification.

8. Lab 2: Bayesian GLMs.

9. Graphical models and hierarchical Bayesian models. Gibbs sampling.
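Gibbs sampling cycles through the full conditionals instead of drawing jointly. For a bivariate standard normal with correlation rho, each conditional is a 1D Gaussian, which makes a compact illustration (parameter values are mine):

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.9, n_steps=20000, seed=0):
    """Gibbs sampler for (x1, x2) ~ N(0, [[1, rho], [rho, 1]]).
    Full conditionals: x1 | x2 ~ N(rho*x2, 1 - rho^2), and symmetrically."""
    rng = np.random.default_rng(seed)
    x1, x2 = 0.0, 0.0
    out = np.empty((n_steps, 2))
    s = np.sqrt(1 - rho**2)
    for i in range(n_steps):
        x1 = rho * x2 + s * rng.normal()   # draw x1 | x2
        x2 = rho * x1 + s * rng.normal()   # draw x2 | x1
        out[i] = (x1, x2)
    return out

samples = gibbs_bivariate_normal()[2000:]   # drop burn-in
# Empirical correlation should approach rho = 0.9
```

Note that the higher rho is, the more strongly successive sweeps are correlated, which is the usual argument for monitoring convergence diagnostics.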

10. Mixture models and topic models.

11. Variable augmentation: probit and logistic regression with auxiliary variables

12. Lab 3: Gibbs sampling for mixture models.

13. Variational inference: prelude, the EM algorithm

14. Mean-field variational inference
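In conjugate models, coordinate-ascent mean-field VI has closed-form updates. A sketch for the classic univariate Gaussian with unknown mean and precision under a Normal-Gamma prior, factorising q(mu, tau) = q(mu) q(tau) (function name and hyperparameter values are illustrative):

```python
import numpy as np

def mf_vi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, n_iter=50):
    """Coordinate-ascent mean-field VI for x_i ~ N(mu, 1/tau) with
    mu | tau ~ N(mu0, (lam0*tau)^-1) and tau ~ Gamma(a0, b0).
    Returns (E[mu], E[tau]) under the converged factors
    q(mu) = N(mu_N, 1/lam_N), q(tau) = Gamma(a_N, b_N)."""
    N, xbar = len(x), x.mean()
    # These two parameters have fixed-point values independent of q(tau).
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    a_N = a0 + (N + 1) / 2
    E_tau = a0 / b0
    for _ in range(n_iter):
        lam_N = (lam0 + N) * E_tau                 # update q(mu)
        E_mu2 = mu_N**2 + 1.0 / lam_N              # E_q[mu^2]
        b_N = b0 + 0.5 * (np.sum(x**2) - 2 * mu_N * x.sum() + N * E_mu2
                          + lam0 * (E_mu2 - 2 * mu_N * mu0 + mu0**2))
        E_tau = a_N / b_N                          # update q(tau)
    return mu_N, E_tau

rng = np.random.default_rng(1)
x = rng.normal(2.0, 0.5, size=200)   # true mu = 2, true tau = 1/0.25 = 4
E_mu, E_tau = mf_vi_gaussian(x)
```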

15. Variational inference for general models: black-box variational inference and variational autoencoders, Stein variational inference.

16. Lab 4: Variational mean field for mixture models.
