This module focuses principally on Bayesian and computational statistics. It introduces basic Bayesian statistical modelling and methods, including Bayes' theorem, prior and posterior distributions, and Markov chain Monte Carlo (MCMC) methods. Other Monte Carlo simulation methods, such as rejection sampling, importance sampling and coupling from the past, will also be covered.

On completion of the module, students should be able to (learning outcomes):


  • Understand Bayes' theorem and Bayesian statistical modelling

  • Understand how particular Bayesian inferences differ from their frequentist counterparts

  • Understand Markov chain Monte Carlo (MCMC) simulation

  • Understand rejection sampling, importance sampling and the slice sampler

  • Understand convergence diagnostics for MCMC

  • Develop Monte Carlo simulation algorithms for simple probability distributions




Syllabus

1. Bayesian statistical methods:
likelihood function, prior distribution, posterior distribution, predictive distribution, exchangeability, de Finetti's theorem (illustrative sketch below)
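
As a minimal illustration of the prior-to-posterior update and the predictive distribution, the following Python sketch works through a conjugate Beta-Binomial example; the prior parameters and data are hypothetical choices, not values prescribed by the module.

    from scipy import stats

    # Hypothetical data: y = 7 successes in n = 10 Bernoulli trials.
    n, y = 10, 7

    # Beta(a, b) prior on the success probability theta (illustrative choice).
    a, b = 2.0, 2.0

    # Conjugacy: the posterior is Beta(a + y, b + n - y).
    posterior = stats.beta(a + y, b + n - y)
    print("posterior mean of theta:", posterior.mean())

    # Posterior predictive probability that the next trial is a success
    # equals the posterior mean of theta.
    print("predictive P(success):", (a + y) / (a + b + n))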

2. Random variable generation and Monte Carlo integration (illustrative sketch after this list):


  • classical Monte Carlo integration

  • transformation methods

  • importance sampling
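
A minimal sketch comparing classical Monte Carlo integration with importance sampling for a normal tail probability; the integrand and the shifted proposal are illustrative choices, not part of the module specification.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Target quantity: P(X > 3) for X ~ N(0, 1).
    # Classical Monte Carlo: sample from N(0, 1), average the indicator.
    x = rng.standard_normal(N)
    mc_estimate = np.mean(x > 3)

    # Importance sampling: sample from a shifted proposal N(3, 1), which
    # places mass where the indicator is non-zero, then reweight by the
    # density ratio phi(z) / q(z) (the normalising constants cancel).
    z = rng.normal(loc=3.0, scale=1.0, size=N)
    weights = np.exp(-0.5 * z**2) / np.exp(-0.5 * (z - 3.0) ** 2)
    is_estimate = np.mean((z > 3) * weights)

    print("classical MC estimate:", mc_estimate)
    print("importance sampling estimate:", is_estimate)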



3. Other methods for random variable generation:

  • rejection sampling (sketched after this list)

  • the ratio-of-uniforms method
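
A short sketch of the basic accept/reject step, using a Beta(2, 2) target under a uniform envelope; the target density and the envelope constant M are illustrative choices only.

    import numpy as np

    rng = np.random.default_rng(1)

    def target(x):
        # Unnormalised Beta(2, 2) density on [0, 1]: f(x) = x * (1 - x).
        return x * (1.0 - x)

    # Uniform proposal on [0, 1] with envelope constant M >= sup f = 0.25.
    M = 0.25

    def rejection_sample(n):
        samples = []
        while len(samples) < n:
            x = rng.uniform()           # draw from the proposal
            u = rng.uniform()           # uniform for the accept/reject test
            if u * M <= target(x):      # accept with probability f(x) / M
                samples.append(x)
        return np.array(samples)

    draws = rejection_sample(10_000)
    print("sample mean (true value 0.5):", draws.mean())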


4. Adaptive rejection sampling:

  • envelope functions

  • log-concave densities


5. Simulation from the posterior distribution via Markov chain Monte Carlo:

  • Markov chains, stationary distributions

  • transition probabilities

  • general balance and detailed balance (written out after this list)

  • the MCMC principle
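
For reference, the balance conditions listed above can be written as follows for a target distribution \pi and a transition kernel P (this is the standard formulation, not necessarily the module's own notation):

    % General balance: \pi is stationary for the transition kernel P
    \pi(y) = \sum_{x} \pi(x)\, P(x, y) \qquad \text{for all } y

    % Detailed balance (reversibility), which implies general balance
    \pi(x)\, P(x, y) = \pi(y)\, P(y, x) \qquad \text{for all } x, y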


6. Metropolis-Hastings algorithm (illustrative sketch after this list):

  • convergence of the Metropolis-Hastings algorithm

  • the independent Metropolis-Hastings algorithm

  • random walk Metropolis-Hastings
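
A minimal sketch of random walk Metropolis-Hastings targeting a standard normal density; the target, step size and number of iterations are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)

    def log_target(x):
        # Unnormalised log-density of the target; a standard normal is used
        # purely as an example.
        return -0.5 * x**2

    def random_walk_metropolis(n_iter, step=1.0, x0=0.0):
        x = x0
        chain = np.empty(n_iter)
        for i in range(n_iter):
            proposal = x + step * rng.standard_normal()  # symmetric proposal
            log_alpha = log_target(proposal) - log_target(x)
            if np.log(rng.uniform()) < log_alpha:        # accept/reject step
                x = proposal
            chain[i] = x
        return chain

    chain = random_walk_metropolis(50_000)
    print("estimated mean (true value 0):", chain.mean())
    print("acceptance rate:", np.mean(np.diff(chain) != 0))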


7. Gibbs sampler (illustrative sketch after this list):

  • the Hammersley-Clifford theorem

  • mixtures of distributions
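
A minimal Gibbs sampler sketch for a zero-mean bivariate normal with correlation rho, where both full conditionals are available in closed form; the correlation value is an illustrative choice.

    import numpy as np

    rng = np.random.default_rng(3)

    def gibbs_bivariate_normal(n_iter, rho=0.8):
        # Full conditionals of a zero-mean, unit-variance bivariate normal:
        # x | y ~ N(rho * y, 1 - rho**2) and y | x ~ N(rho * x, 1 - rho**2).
        x, y = 0.0, 0.0
        chain = np.empty((n_iter, 2))
        sd = np.sqrt(1.0 - rho**2)
        for i in range(n_iter):
            x = rng.normal(rho * y, sd)   # update x given y
            y = rng.normal(rho * x, sd)   # update y given x
            chain[i] = (x, y)
        return chain

    chain = gibbs_bivariate_normal(20_000)
    print("empirical correlation (true value 0.8):", np.corrcoef(chain.T)[0, 1])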


8. Slice sampler
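
A short sketch of the slice sampler for a standard normal target, where the horizontal slice is available in closed form so no stepping-out procedure is needed; the target is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(4)

    def slice_sampler_normal(n_iter, x0=0.0):
        # Target: unnormalised standard normal density f(x) = exp(-x**2 / 2).
        # Given the vertical level u < f(x), the slice {x : f(x) > u} is the
        # interval (-sqrt(-2 log u), sqrt(-2 log u)).
        x = x0
        chain = np.empty(n_iter)
        for i in range(n_iter):
            u = rng.uniform(0.0, np.exp(-0.5 * x**2))   # vertical step
            half_width = np.sqrt(-2.0 * np.log(u))
            x = rng.uniform(-half_width, half_width)    # horizontal step
            chain[i] = x
        return chain

    chain = slice_sampler_normal(20_000)
    print("sample variance (true value 1):", chain.var())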

9. Diagnostics of MCMC convergence