Markov chain Monte Carlo importance samplers for Bayesian models with intractable likelihoods
Markov chain Monte Carlo (MCMC) is an approach to parameter inference in Bayesian models that is based on computing ergodic averages formed from a Markov chain targeting the Bayesian posterior probability. We consider the efficient use of an approximation within the Markov chain, with subsequent importance sampling (IS) correction of the Markov chain's inexact output, leading to asymptotically exact inference. We detail convergence and central limit theorems for the resulting MCMC-IS estimators. We also consider the case where the approximate Markov chain is pseudo-marginal, requiring unbiased estimators for its approximate marginal target. Convergence results with asymptotic variance formulae are shown for this case, and for the case where the IS weights based on unbiased estimators are calculated only for the distinct output samples of the so-called 'jump' chain, which, with a suitable reweighting, allows for improved efficiency. As the IS-type weights may take negative values, extended classes of unbiased estimators may be used for the IS-type correction, such as those obtained from randomised multilevel Monte Carlo. Using Euler approximations and couplings of particle filters, we apply the resulting estimator with randomised weights to the problem of parameter inference for partially observed Itô diffusions. Convergence of the estimator is verified under regularity assumptions which do not require that the diffusion can be simulated exactly. In the context of approximate Bayesian computation (ABC), we suggest an adaptive MCMC approach for selecting a suitably large tolerance, with IS post-correction to a finer tolerance, and with approximate confidence intervals provided. A prominent question is the efficiency of MCMC-IS compared to standard direct MCMC, such as pseudo-marginal, delayed-acceptance, and ABC-MCMC chains. We provide a comparison criterion which generalises the covariance ordering to the IS setting. We give an asymptotic variance bound relating MCMC-IS to the latter chains, which holds whenever the ratio of the true likelihood to the approximate likelihood is bounded. We also perform experiments in the state space model and ABC contexts, which confirm the validity and competitiveness of the suggested MCMC-IS estimators in practice.
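The basic MCMC-IS idea described above can be illustrated with a minimal sketch: run a Metropolis chain targeting an approximate density, then reweight its output by the target-to-approximation ratio and form a self-normalised estimator. The toy Gaussian target and approximation, the step size, and the chain length below are illustrative assumptions, not taken from the thesis; real applications would replace the densities with (estimated) likelihoods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting (hypothetical): the exact target is N(0, 1), while the
# chain targets a mismatched approximation N(0.3, 1.2^2).
def log_target(x):   # exact log-density (unnormalised)
    return -0.5 * x**2

def log_approx(x):   # approximate log-density the chain actually targets
    return -0.5 * ((x - 0.3) / 1.2) ** 2

# Random-walk Metropolis on the *approximate* target.
n, x = 50_000, 0.0
chain = np.empty(n)
for i in range(n):
    prop = x + rng.normal(scale=1.0)
    if np.log(rng.uniform()) < log_approx(prop) - log_approx(x):
        x = prop
    chain[i] = x

# IS-type correction: weight each state by the target/approximation
# ratio, then form the self-normalised estimator of E[f(X)] under the
# exact target; here f(x) = x^2, whose true value under N(0, 1) is 1.
w = np.exp(log_target(chain) - log_approx(chain))
estimate = np.sum(w * chain**2) / np.sum(w)
print(estimate)
```

In the pseudo-marginal and jump-chain variants discussed above, the exact density ratio is replaced by (possibly negative-valued) unbiased weight estimators computed only at the distinct states of the chain.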
...
ISBN 978-951-39-7738-2

Contains publications
 Article I: Vihola, Matti; Helske, Jouni; Franks, Jordan (2020). Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo. Scandinavian Journal of Statistics, Early online. DOI: 10.1111/sjos.12492
 Article II: Franks, Jordan; Vihola, Matti (2020). Importance sampling correction versus standard averages of reversible MCMCs in terms of the asymptotic variance. Stochastic Processes and Their Applications, 130 (10), 6157–6183. DOI: 10.1016/j.spa.2020.05.006
 Article III: Franks, J.; Jasra, A.; Law, K. J. H.; Vihola, M. (2018). Unbiased inference for discretely observed hidden Markov model diffusions. Preprint. arXiv:1807.10259v4
 Article IV: Vihola, Matti; Franks, Jordan (2020). On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction. Biometrika, 107 (2), 381–395. DOI: 10.1093/biomet/asz078
Related items
Showing items with similar title or keywords.

Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo
Vihola, Matti; Helske, Jouni; Franks, Jordan (Wiley-Blackwell, 2020) We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the ...
Importance sampling correction versus standard averages of reversible MCMCs in terms of the asymptotic variance
Franks, Jordan; Vihola, Matti (Elsevier, 2020) We establish an ordering criterion for the asymptotic variances of two consistent Markov chain Monte Carlo (MCMC) estimators: an importance sampling (IS) estimator, based on an approximate reversible chain and subsequent ...
On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction
Vihola, Matti; Franks, Jordan (Oxford University Press, 2020) Approximate Bayesian computation enables inference for complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation ...
Conditional particle filters with diffuse initial distributions
Karppinen, Santeri; Vihola, Matti (Springer, 2021) Conditional particle filters (CPFs) are powerful smoothing algorithms for general nonlinear/non-Gaussian hidden Markov models. However, CPFs can be inefficient or difficult to apply with diffuse initial distributions, which ...
Theoretical and methodological aspects of MCMC computations with noisy likelihoods
Andrieu, Christophe; Lee, Anthony; Vihola, Matti (Chapman and Hall/CRC, 2018) Approximate Bayesian computation (ABC) [11, 42] is a popular method for Bayesian inference involving an intractable, or expensive to evaluate, likelihood function but where simulation from the model is easy. The method ...