Structured Markov chain Monte Carlo

Daniel J. Sargent, James S. Hodges, Bradley P. Carlin

Research output: Contribution to journal › Article

29 Citations (Scopus)

Abstract

This article introduces a general method for Bayesian computing in richly parameterized models, structured Markov chain Monte Carlo (SMCMC), that is based on a blocked hybrid of the Gibbs sampling and Metropolis-Hastings algorithms. SMCMC speeds algorithm convergence by using the structure that is present in the problem to suggest an appropriate Metropolis-Hastings candidate distribution. Although the approach is easiest to describe for hierarchical normal linear models, we show that its extension to both nonnormal and nonlinear cases is straightforward. After describing the method in detail we compare its performance (in terms of run time and autocorrelation in the samples) to other existing methods, including the single-site updating Gibbs sampler available in the popular BUGS software package. Our results suggest significant improvements in convergence for many problems using SMCMC, as well as broad applicability of the method, including previously intractable hierarchical nonlinear model settings.
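The abstract describes SMCMC only at a high level. As a rough illustration of the general idea (a blocked Metropolis-Hastings update whose candidate distribution is suggested by the model's hierarchical structure, combined with Gibbs updates for the remaining parameters), the Python sketch below applies that recipe to a toy two-level normal model. It is not the authors' SMCMC algorithm; the model, hyperparameters, plug-in estimates, and helper names (block_candidate, log_target) are illustrative assumptions.

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Toy two-level normal model (illustrative only):
#   y[i, j] ~ N(theta[i], sigma2),  theta[i] ~ N(mu, tau2),
#   flat prior on mu, inverse-gamma priors on sigma2 and tau2.
m, n = 10, 5
true_theta = rng.normal(2.0, 1.0, m)
y = true_theta[:, None] + rng.normal(0.0, 0.5, (m, n))
a_s, b_s = 2.0, 1.0   # hypothetical inverse-gamma hyperparameters for sigma2
a_t, b_t = 2.0, 1.0   # hypothetical inverse-gamma hyperparameters for tau2

def block_candidate(sigma2_0, tau2_0):
    """Normal candidate for the full block beta = (theta_1..theta_m, mu),
    built from the hierarchical structure with plug-in variance values."""
    Q = np.zeros((m + 1, m + 1))   # precision matrix implied by the model
    b = np.zeros(m + 1)
    Q[np.arange(m), np.arange(m)] = n / sigma2_0 + 1.0 / tau2_0
    Q[np.arange(m), m] = Q[m, np.arange(m)] = -1.0 / tau2_0
    Q[m, m] = m / tau2_0
    b[:m] = y.sum(axis=1) / sigma2_0
    cov = np.linalg.inv(Q)
    cov = 0.5 * (cov + cov.T)      # enforce symmetry numerically
    return multivariate_normal(cov @ b, cov)

def log_target(beta, sigma2, tau2):
    """log p(beta | sigma2, tau2, y) up to an additive constant."""
    theta, mu = beta[:m], beta[m]
    ll = -0.5 * np.sum((y - theta[:, None]) ** 2) / sigma2
    lp = -0.5 * np.sum((theta - mu) ** 2) / tau2
    return ll + lp

# Build the candidate once, from crude plug-in variance estimates.
sigma2_hat = np.mean(np.var(y, axis=1, ddof=1))
tau2_hat = max(np.var(y.mean(axis=1), ddof=1) - sigma2_hat / n, 0.1)
cand = block_candidate(sigma2_hat, tau2_hat)

beta = cand.rvs(random_state=rng)
sigma2, tau2 = sigma2_hat, tau2_hat
n_iter, accepted = 5000, 0
mu_draws = np.empty(n_iter)

for t in range(n_iter):
    # 1. Blocked Metropolis-Hastings step: propose the whole mean-parameter
    #    block at once from the structure-suggested (independence) candidate.
    prop = cand.rvs(random_state=rng)
    log_r = (log_target(prop, sigma2, tau2) - log_target(beta, sigma2, tau2)
             + cand.logpdf(beta) - cand.logpdf(prop))
    if np.log(rng.uniform()) < log_r:
        beta, accepted = prop, accepted + 1

    # 2. Gibbs steps: variance components from inverse-gamma full conditionals.
    theta, mu = beta[:m], beta[m]
    sigma2 = 1.0 / rng.gamma(a_s + 0.5 * m * n,
                             1.0 / (b_s + 0.5 * np.sum((y - theta[:, None]) ** 2)))
    tau2 = 1.0 / rng.gamma(a_t + 0.5 * m,
                           1.0 / (b_t + 0.5 * np.sum((theta - mu) ** 2)))
    mu_draws[t] = mu

print(f"MH acceptance rate: {accepted / n_iter:.2f}")
print(f"posterior mean of mu: {mu_draws.mean():.2f}")

Drawing the entire mean-parameter block from a single structure-based candidate moves all of its components together, which is the kind of convergence improvement over single-site updating that the abstract refers to.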

Original language: English (US)
Pages (from-to): 217-234
Number of pages: 18
Journal: Journal of Computational and Graphical Statistics
Volume: 9
Issue number: 2
ISSN: 1061-8600
Publisher: American Statistical Association
State: Published - 2000

Keywords

  • Blocking
  • Convergence acceleration
  • Gibbs sampling
  • Hierarchical model
  • Metropolis-Hastings algorithm

ASJC Scopus subject areas

  • Discrete Mathematics and Combinatorics
  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Sargent, D. J., Hodges, J. S., & Carlin, B. P. (2000). Structured Markov chain Monte Carlo. Journal of Computational and Graphical Statistics, 9(2), 217-234.
