Detailed information about the course

Title

Fundamentals of Bayesian Inference

Dates

20 May 2021

Activity coordinator

Bastien Chopard

Organizer(s)

Prof. Bastien Chopard, UNIGE

Speakers

Dr Ritabrata Dutta, Assistant Professor of Statistics, University of Warwick, UK

Description

Part I: Bayesian inference and Monte Carlo methods

In this talk we will introduce the fundamentals of Bayesian inference, a cornerstone of statistics and machine learning today. Bayesian inference uses Bayes' theorem to update the initial probability of a hypothesis (the prior distribution), as more evidence becomes available through data, into the posterior distribution of the hypothesis. In contrast to classical frequentist methods, Bayesian inference provides a probability distribution over the parameters that incorporates the uncertainty of the estimation, rather than just a point estimate. This part of the talk considers only the cases in which the likelihood function of the hypothesis or parameters given the dataset is known and can be computed. We will first give an overview of the inferential aspects of Bayesian analysis, with three illustrations: estimating the mean and variance of a normal distribution, estimating the parameters of a linear regression model, and estimating a logistic regression model. For these inference schemes, one of the most important requirements is the ability to sample from the posterior distribution. The main aspects of posterior sampling will first be illustrated with conjugate priors, for which the posterior distribution has an analytical form and is easy to sample from. Next we will handle the cases in which the posterior is known only up to a normalizing constant; to sample from such posteriors we will use Markov chain Monte Carlo (MCMC) techniques, which are crucial for performing Bayesian inference in practice. We will discuss different MCMC methods: the Metropolis-Hastings algorithm, Gibbs sampling, and Hamiltonian Monte Carlo (NUTS); a minimal Metropolis-Hastings sketch follows this description.

Part II: Likelihood-free Bayesian inference

In the second part of the talk, we will introduce methods for Bayesian inference when the likelihood function cannot be evaluated, either because no analytical form is available or because evaluation is computationally intractable, e.g. when the likelihood function is a 40-dimensional integral. We will first introduce approximate Bayesian computation (ABC) and the different algorithms used for it, e.g. rejection ABC, MCMC ABC, sequential Monte Carlo ABC, and simulated-annealing ABC (SABC), all of which will be illustrated by inferring the closure parameters of a numerical weather prediction (NWP) model, Lorenz95; a rejection-ABC sketch is also given below. Next we will discuss how to choose a discrepancy measure for ABC using linear regression, classification, or neural networks. We will also illustrate other likelihood-free inference methods in which an approximation of the likelihood function (e.g. a synthetic likelihood, an approximation based on the Fourier transform, or probabilistic classification that estimates a ratio of likelihoods) is computed and used within the MCMC methods introduced in Part I, as in pseudo-marginal likelihood approaches. Finally, we will show how these likelihood-free inference schemes can be run on high-performance computing frameworks (MPI, Spark on a cluster, Amazon AWS, or a supercomputer) through the Python package ABCpy.
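
To make the MCMC part of the description concrete, here is a minimal, illustrative Metropolis-Hastings sketch in Python (not the course's tutorial code): it samples the posterior of the mean of a normal distribution with known variance. The prior, toy data, proposal scale, and chain length are all assumptions chosen for the example.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: 50 draws from N(2, 1); the variance is assumed known.
data = rng.normal(loc=2.0, scale=1.0, size=50)

def log_posterior(mu):
    # Unnormalized log posterior: N(0, 10^2) prior plus N(mu, 1) log likelihood.
    log_prior = -0.5 * (mu / 10.0) ** 2
    log_lik = -0.5 * np.sum((data - mu) ** 2)
    return log_prior + log_lik

mu = 0.0                  # initial state of the chain
samples = []
for _ in range(5000):
    proposal = mu + rng.normal(scale=0.5)         # random-walk proposal
    log_alpha = log_posterior(proposal) - log_posterior(mu)
    if np.log(rng.uniform()) < log_alpha:         # Metropolis accept/reject step
        mu = proposal
    samples.append(mu)

print("posterior mean estimate:", np.mean(samples[1000:]))  # discard burn-in

Note that the normalizing constant of the posterior cancels in the acceptance ratio, which is exactly why Metropolis-Hastings only needs the posterior up to a constant, as stated above.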
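
Likewise, a minimal rejection-ABC sketch (illustrative plain NumPy, not the ABCpy API): parameters are drawn from the prior, a forward simulator stands in for the unavailable likelihood, and draws are kept when a summary-statistic discrepancy falls below a tolerance. The simulator, summaries, prior range, and tolerance are assumptions for the example.

import numpy as np

rng = np.random.default_rng(1)

# Observed data and its summary statistics (mean and standard deviation).
observed = rng.normal(loc=2.0, scale=1.0, size=50)
obs_summary = np.array([observed.mean(), observed.std()])

def simulate(mu):
    # Forward simulator replacing the (pretend-intractable) likelihood.
    sim = rng.normal(loc=mu, scale=1.0, size=50)
    return np.array([sim.mean(), sim.std()])

tolerance = 0.2
accepted = []
for _ in range(20000):
    mu = rng.uniform(-10.0, 10.0)                     # draw from a flat prior
    discrepancy = np.linalg.norm(simulate(mu) - obs_summary)
    if discrepancy < tolerance:                       # keep draws close to the data
        accepted.append(mu)

print(len(accepted), "accepted; approximate posterior mean:", np.mean(accepted))

The choice of summaries and of the discrepancy in this sketch is precisely the design question raised above, where regression-, classification-, and neural-network-based discrepancies are discussed.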

Program

9h-12h:

  • "Fundamentals of Bayesian inference" (1hr lecture), 
  • "Monte Carlo Methods for Bayesian inference" (1 hr lecture), 
  • Tutorial of MCMC" (1 hr tutorial using Python)

 

 

 

14h-17h:

  • "Likelihood-free Bayesian inference (LFBI)" (1 hr lecture), 
  • "Neural networks for LFBI" (1 hr lecture), 
  • "Tutorial of LFBI" (1 hr tutorial using Python)
Location

Online

Places

30

Deadline for registration

19.05.2021