The Metropolis–Hastings algorithm is a member of the MCMC family; as such, it is designed to generate samples $\textbf{x}$ (eventually) according to target probabilities $\pi(\textbf{x})$. (Typically we are interested in sampling from $\pi(\textbf{x}) = P(\textbf{x}\mid\textbf{e})$.) Like simulated annealing, Metropolis–Hastings operates in two stages. First, it samples a new state $\textbf{x}'$ from a proposal distribution $q(\textbf{x}'\mid\textbf{x})$, given the current state $\textbf{x}$. Then, it probabilistically accepts or rejects $\textbf{x}'$ according to the acceptance probability \(\alpha(\textbf{x}'\mid\textbf{x}) = \min\left(1,\ \frac{\pi(\textbf{x}')\,q(\textbf{x}\mid\textbf{x}')}{\pi(\textbf{x})\,q(\textbf{x}'\mid\textbf{x})}\right).\) If the proposal is rejected, the state remains at $\textbf{x}$.
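As a concrete illustration of the two-stage loop, here is a minimal sketch in Python; the one-dimensional target, the Gaussian random-walk proposal, the step size, and all function names are assumptions made for this sketch rather than part of the text:

```python
# A minimal Metropolis-Hastings sketch (illustrative assumptions: a 1-D target
# known only up to a normalizing constant, and a Gaussian random-walk proposal).
import math
import random

def target_unnormalized(x):
    # Hypothetical target: an unnormalized mixture of two Gaussian bumps.
    return math.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * math.exp(-0.5 * (x + 2.0) ** 2)

def metropolis_hastings(n_samples, x0=0.0, step=1.0):
    samples = []
    x = x0
    for _ in range(n_samples):
        # Stage 1: propose x' from q(x' | x); the random walk is symmetric,
        # so the q terms cancel in the acceptance ratio.
        x_prop = x + random.gauss(0.0, step)
        # Stage 2: accept with probability min(1, pi(x') / pi(x)).
        alpha = min(1.0, target_unnormalized(x_prop) / target_unnormalized(x))
        if random.random() < alpha:
            x = x_prop  # accept the proposal
        # Otherwise the chain stays at x; the current state is recorded again.
        samples.append(x)
    return samples

chain = metropolis_hastings(10_000)
print(sum(chain) / len(chain))  # rough estimate of the target's mean
```

Because the random-walk proposal is symmetric, $q(\textbf{x}\mid\textbf{x}') = q(\textbf{x}'\mid\textbf{x})$ and the acceptance probability reduces to $\min(1, \pi(\textbf{x}')/\pi(\textbf{x}))$; note that a rejected proposal still contributes a sample, namely the repeated current state.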
- Consider an ordinary Gibbs sampling step for a specific variable $X_i$. Show that this step, considered as a proposal, is guaranteed to be accepted by Metropolis–Hastings. (Hence, Gibbs sampling is a special case of Metropolis–Hastings.) A small numerical illustration appears after this list.
- Show that the two-step process above, viewed as a transition probability distribution, is in detailed balance with $\pi$. A numerical check of this property appears after this list.
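For the Gibbs question, a small numerical illustration may help; the toy joint distribution, the variable domains, and the helper `q_gibbs_x1` are all hypothetical choices for this sketch. Treating a Gibbs update of $X_1$ as a Metropolis–Hastings proposal, the acceptance ratio evaluates to exactly 1 for every pair of states that agree on $X_2$:

```python
# Numerical illustration (toy distribution is an assumption, not from the text):
# a Gibbs proposal for X1 always yields an MH acceptance ratio of exactly 1.
import itertools

# Hypothetical joint pi(x1, x2) over small finite domains.
vals1, vals2 = [0, 1, 2], [0, 1]
weights = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 0.5, (2, 0): 1.5, (2, 1): 2.5}
Z = sum(weights.values())
pi = {s: w / Z for s, w in weights.items()}

def q_gibbs_x1(x_new, x_old):
    """Gibbs proposal for X1: q(x' | x) = pi(x1' | x2), with x2 left unchanged."""
    if x_new[1] != x_old[1]:
        return 0.0
    denom = sum(pi[(v1, x_old[1])] for v1 in vals1)
    return pi[(x_new[0], x_old[1])] / denom

for x in itertools.product(vals1, vals2):
    for x1_new in vals1:
        x_new = (x1_new, x[1])
        # MH ratio pi(x') q(x | x') / (pi(x) q(x' | x)) for the Gibbs proposal.
        ratio = (pi[x_new] * q_gibbs_x1(x, x_new)) / (pi[x] * q_gibbs_x1(x_new, x))
        assert abs(ratio - 1.0) < 1e-12
print("Gibbs proposals are always accepted (ratio = 1).")
```

The check makes visible why this happens: with $q(\textbf{x}'\mid\textbf{x}) = \pi(x_i'\mid\textbf{x}_{-i})$ and $\textbf{x}'_{-i} = \textbf{x}_{-i}$, the factors in the numerator and denominator of the acceptance ratio cancel.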
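For the detailed-balance question, a similar numerical check can be made; the four-state target and the proposal matrix below are arbitrary assumptions. The full transition kernel, including the probability of staying at $\textbf{x}$ after a rejection, satisfies $\pi(\textbf{x})\,P(\textbf{x}\to\textbf{x}') = \pi(\textbf{x}')\,P(\textbf{x}'\to\textbf{x})$ for every pair of states:

```python
# Numerical check of detailed balance (target and proposal are assumptions):
# build the full MH kernel, with rejection mass kept on the diagonal.
pi = [0.1, 0.2, 0.3, 0.4]                      # target over 4 states
q = [[0.25, 0.25, 0.25, 0.25],                 # rows are proposal distributions
     [0.10, 0.40, 0.30, 0.20],
     [0.30, 0.30, 0.20, 0.20],
     [0.25, 0.25, 0.40, 0.10]]
n = len(pi)

P = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j:
            alpha = min(1.0, (pi[j] * q[j][i]) / (pi[i] * q[i][j]))
            P[i][j] = q[i][j] * alpha
    # Self-proposals and rejected moves both leave the chain at state i.
    P[i][i] = 1.0 - sum(P[i][j] for j in range(n) if j != i)

for i in range(n):
    for j in range(n):
        assert abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < 1e-12
print("Detailed balance holds for every pair of states.")
```

The check passes because, for $\textbf{x} \ne \textbf{x}'$, the flow $\pi(\textbf{x})\,q(\textbf{x}'\mid\textbf{x})\,\alpha(\textbf{x}'\mid\textbf{x})$ equals $\min\!\big(\pi(\textbf{x})\,q(\textbf{x}'\mid\textbf{x}),\ \pi(\textbf{x}')\,q(\textbf{x}\mid\textbf{x}')\big)$, which is symmetric in the two states.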