Pseudo-Marginal MCMC


Markov chain Monte Carlo (MCMC) techniques are a family of methods in Bayesian statistics used when the prior distribution is not conjugate to the likelihood. In that case it is often not possible to evaluate the posterior density analytically, and so numerical methods must be employed. MCMC methods provide algorithms which simulate realisations from the posterior distribution. The best known MCMC techniques are the Gibbs sampler and Metropolis-Hastings sampling.

The Gibbs sampler requires you to initialise the value of each parameter, \bf{\theta}^{(0)} = (\theta_1^{(0)},...,\theta_p^{(0)}). It then simulates each parameter in turn from its full conditional distribution, given the most recent values of the others: \theta_1^{(j)} \sim \pi(\theta_1|\theta_2^{(j-1)},...,\theta_p^{(j-1)},\bf{x}), ... , \theta_p^{(j)} \sim \pi(\theta_p|\theta_1^{(j)},...,\theta_{p-1}^{(j)},\bf{x}). After a burn-in period, the Gibbs sampler converges to the posterior distribution. The Gibbs sampler can only be used when the conditional distributions are standard distributions and so are easy to simulate from.
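As a minimal sketch of the update scheme above, consider a toy bivariate normal target with correlation rho (a hypothetical example, not one from this post): each full conditional is itself a normal distribution, so both updates are easy to simulate.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                            # correlation of the toy target
n_iters, burn_in = 5000, 500

theta = np.zeros(2)                  # initialise theta^(0)
samples = []
for j in range(n_iters):
    # theta_1 | theta_2 ~ N(rho * theta_2, 1 - rho^2)
    theta[0] = rng.normal(rho * theta[1], np.sqrt(1 - rho**2))
    # theta_2 | theta_1 ~ N(rho * theta_1, 1 - rho^2)
    theta[1] = rng.normal(rho * theta[0], np.sqrt(1 - rho**2))
    if j >= burn_in:                 # discard the burn-in period
        samples.append(theta.copy())

samples = np.array(samples)
```

After burn-in, the retained draws should show the target's correlation structure.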

When the conditional distributions are not standard, Metropolis-Hastings can be used. The Metropolis-Hastings algorithm also requires the chain to be initialised at \bf{\theta}^{(0)}. At each step it generates a proposed value \theta^* from the proposal distribution q(\theta^*|\theta^{(j-1)}), and calculates an acceptance probability \alpha(\theta^{(j-1)},\theta^*) for the proposal, given by \alpha(\theta,\theta^*) = \min\left\{1, \frac{\pi(\theta^*|\bf{x})q(\theta|\theta^*)}{\pi(\theta|\bf{x})q(\theta^*|\theta)}\right\}. It then either accepts the proposed value, setting \theta^{(j)}=\theta^*, with probability \alpha(\theta^{(j-1)},\theta^*), or otherwise sets \theta^{(j)}=\theta^{(j-1)}.
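A short sketch of this accept/reject loop, using a hypothetical standard-normal stand-in for the unnormalised posterior \pi(\theta|\bf{x}) and a symmetric random-walk proposal (for which the q terms in \alpha cancel):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(theta):
    # log of the unnormalised posterior (toy example: standard normal)
    return -0.5 * theta**2

n_iters = 10000
theta = 0.0                                    # initialise theta^(0)
chain = np.empty(n_iters)
for j in range(n_iters):
    theta_star = rng.normal(theta, 1.0)        # propose from q(.|theta)
    # symmetric proposal: alpha = min(1, pi(theta*)/pi(theta))
    log_alpha = log_target(theta_star) - log_target(theta)
    if np.log(rng.uniform()) < log_alpha:      # accept with probability alpha
        theta = theta_star                     # theta^(j) = theta^*
    chain[j] = theta                           # else theta^(j) = theta^(j-1)
```

Working on the log scale, as here, avoids numerical underflow when the density values are tiny.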

While Metropolis-Hastings does not require you to know the normalising constant of the posterior, it does require that the posterior can be evaluated up to that constant. When this cannot be done, but an approximation can be produced, pseudo-marginal Metropolis-Hastings can be used. The posterior distribution is \pi(\theta|\bf{x}) \propto \pi(\theta)f(\bf{x}|\theta). When the likelihood, f(\bf{x}|\theta), cannot be evaluated, neither can the posterior. However, if f(\bf{x}|\theta) can be estimated by \hat{f}(\bf{x}|\theta,u), where u denotes auxiliary random variables, and this estimator is unbiased and non-negative, then pseudo-marginal MCMC can use the estimate in place of the true likelihood and still produce a chain with the correct stationary distribution.

The algorithm keeps the current value \theta together with its estimate \hat{\pi}(\theta,u) of the (unnormalised) posterior. At each timestep,

  1. Propose \theta^* from q(\theta^*|\theta).
  2. Sample u^* from \tilde{q}(u^*|\theta^*) to construct \hat{\pi}(\theta^*,u^*).
  3. Accept the proposal with Metropolis-Hastings acceptance probability \alpha(\theta,\theta^*) = \min\left(1, \frac{\hat{\pi}(\theta^*,u^*)q(\theta|\theta^*)}{\hat{\pi}(\theta,u)q(\theta^*|\theta)}\right).
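The three steps above can be sketched as follows. As a hypothetical stand-in for \hat{\pi}(\theta,u), the true (standard normal) target is multiplied by a noisy, unbiased, non-negative weight built from the auxiliary draws u; the essential detail is that the stored estimate at the current \theta is reused in the denominator, never recomputed.

```python
import numpy as np

rng = np.random.default_rng(2)
noise_sd = 0.5                           # std dev of the log-weight noise

def log_pi_hat(theta):
    # noisy unbiased estimate: log pi(theta) + log W, where W is
    # log-normal with E[W] = 1, playing the role of hat{pi}(theta, u)
    log_w = rng.normal(-0.5 * noise_sd**2, noise_sd)
    return -0.5 * theta**2 + log_w

n_iters = 20000
theta = 0.0
log_est = log_pi_hat(theta)              # stored estimate hat{pi}(theta, u)
chain = np.empty(n_iters)
for j in range(n_iters):
    theta_star = rng.normal(theta, 1.0)      # 1. propose theta^*
    log_est_star = log_pi_hat(theta_star)    # 2. fresh estimate at theta^*
    # 3. MH accept using the stored estimate for the current theta
    if np.log(rng.uniform()) < log_est_star - log_est:
        theta, log_est = theta_star, log_est_star
    chain[j] = theta
```

Reusing the stored estimate is what makes the chain exact: the pair (\theta, u) is a valid MH chain on the joint space, and unbiasedness of the estimator means its \theta-marginal is the true posterior.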


