Bayesian Statistical Methods
Published in Simon Washington, Matthew Karlaftis, Fred Mannering, Panagiotis Anastasopoulos, Statistical and Econometric Methods for Transportation Data Analysis, 2020
Simon Washington, Matthew Karlaftis, Fred Mannering, Panagiotis Anastasopoulos
MCMC is a simulation process that enables repeated sampling from known distributions on a finite state space, producing a Markov chain, or series of numbers. A degenerate example of a Markov chain is a random sample of normal variates obtained from a random number generator (each draw is independent of the last). A Markov chain operates in discrete time to produce a sequence of evolving random variables, with the probability of transition (evolution) depending only on the current state. Chains are generated from a transition kernel, which is a conditional probability density function. The resulting chains have desirable properties. Specifically, under the right conditions, a stationary probability distribution exists by construction of the Markov chain, and convergence to this limiting or stationary distribution occurs almost surely (Robert and Casella, 1999).
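As a minimal illustration of these properties (not from the chapter), the sketch below simulates a hypothetical three-state Markov chain from its transition kernel and checks that the long-run occupancy of each state matches the stationary distribution obtained as the kernel's leading left eigenvector:

```python
import numpy as np

# Hypothetical 3-state transition kernel; each row is a conditional
# distribution over the next state and sums to 1.
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

rng = np.random.default_rng(0)

# Simulate a long chain starting from state 0 and count state visits.
n_steps = 100_000
state = 0
counts = np.zeros(3)
for _ in range(n_steps):
    state = rng.choice(3, p=P[state])
    counts[state] += 1
empirical = counts / n_steps

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

print(empirical)  # long-run visit frequencies
print(pi)         # stationary distribution; the two should nearly agree
```

The agreement between the empirical frequencies and the eigenvector is exactly the convergence-to-stationarity property the text describes.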
Determinants of health complaints of Bodetabek commuter workers using Bayesian multilevel logistic regression
Published in Yuli Rahmawati, Peter Charles Taylor, Empowering Science and Mathematics for Global Competitiveness, 2019
When the posterior distribution was difficult to derive mathematically, it was approximated using Markov Chain Monte Carlo (MCMC) (Hox, 2010). MCMC is a simulation technique that can generate random samples from a complex posterior distribution. From a large number of simulated random samples, it is possible to calculate the posterior mean, standard deviation, density plot, and quantiles of this distribution (Browne, 2017). In the Bayesian MCMC approach, to test the model fit (goodness of fit), we can compare the Deviance Information Criterion (DIC) from each model: DIC = D̄ + p_D, where D̄ is the posterior mean of the deviance and p_D is the effective number of parameters.
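The DIC computation above can be sketched on a toy one-parameter normal model; the "posterior draws" below are a stand-in for real MCMC output, and the data and sample sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 50 observations, modeled as N(mu, 1) with sigma known.
y = rng.normal(0.3, 1.0, size=50)

# Stand-in for MCMC posterior draws of mu (here drawn analytically,
# since the conjugate posterior is known for this toy model).
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=5000)

def deviance(mu):
    # D(mu) = -2 * log-likelihood of y under N(mu, 1)
    return -2 * np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (y - mu) ** 2)

D_bar = np.mean([deviance(m) for m in mu_draws])  # posterior mean deviance
D_hat = deviance(mu_draws.mean())                 # deviance at the posterior mean
p_D = D_bar - D_hat                               # effective number of parameters
DIC = D_bar + p_D

print(DIC, p_D)  # p_D should be close to 1 for this one-parameter model
```

With real MCMC output, `mu_draws` would simply be the retained post-burn-in chain; models with lower DIC are preferred.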
Mixture of Experts Models
Published in Sylvia Frühwirth-Schnatter, Gilles Celeux, Christian P. Robert, Handbook of Mixture Analysis, 2019
Isobel Claire Gormley, Sylvia Frühwirth-Schnatter
Sampling the component weight parameters in step 4 through an MH algorithm introduces practical issues, such as choosing suitable proposal distributions q(γg*|γg,γ−g) and tuning parameters, which can make fitting ME models troublesome. Gormley & Murphy (2010b) detail an approach to deriving proposal distributions with attractive properties, within the context of an ME model for network data. Villani et al. (2009, 2012) introduce a highly efficient MCMC scheme based on a Metropolis-within-Gibbs sampler that exploits a few Newton iterations to construct suitable proposal distributions.
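The Metropolis-within-Gibbs idea can be sketched on a toy target; here a correlated bivariate normal stands in for the component-weight posterior, and the proposal step size is a hypothetical tuning choice of the kind the text warns about:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy target: bivariate normal with correlation rho (log-density up to a constant).
rho = 0.8
def log_target(x):
    return -(x[0] ** 2 - 2 * rho * x[0] * x[1] + x[1] ** 2) / (2 * (1 - rho ** 2))

n_iter = 20_000
x = np.zeros(2)
samples = np.empty((n_iter, 2))
step = 1.0  # proposal scale -- a tuning parameter, as noted in the text

for t in range(n_iter):
    # One MH update per coordinate, conditioning on the other (Metropolis-within-Gibbs).
    for j in range(2):
        prop = x.copy()
        prop[j] += step * rng.normal()
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
    samples[t] = x

# After burn-in, the sample correlation should recover rho.
print(np.corrcoef(samples[5000:].T)[0, 1])
```

The Villani et al. refinement replaces the blind random-walk proposal here with one built from a few Newton iterations toward the conditional mode, which is what makes their scheme efficient.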
Bayesian updating for predictions of delayed strains of large concrete structures: influence of prior distribution
Published in European Journal of Environmental and Civil Engineering, 2023
D. Rossat, J. Baroth, M. Briffaut, F. Dufour, A. Monteil, B. Masson, S. Michel-Ponnelle
Markov Chain Monte Carlo (MCMC) sampling techniques constitute the most widely used class of Bayesian computational approaches (Robert & Casella, 2004). The main idea of such techniques is to generate Markov chains that asymptotically behave like draws from the posterior distribution. This makes it possible to draw samples from the posterior distribution and then estimate posterior QoIs from those samples, as well as the posterior PDF using kernel smoothing techniques (Parzen, 1962). The Metropolis-Hastings algorithm (Hastings, 1970; Metropolis et al., 1953) is the cornerstone of MCMC algorithms. Its particular interest lies in the fact that its use does not require evaluation of the model evidence (11). Nevertheless, this algorithm suffers from two major issues. First, it requires tuning a set of parameters that play a crucial role in sampling the posterior distribution, so that sampling results depend strongly on tuning quality. Second, the generated samples can exhibit strong autocorrelation, which increases the variability of estimates of posterior QoIs. Consequently, many algorithms have been proposed to alleviate these problems, such as adaptive algorithms (Andrieu & Thoms, 2008), gradient-based algorithms such as Hamiltonian Monte Carlo (HMC) (Neal, 2011), or affine-invariant ensemble samplers (AIES) (Goodman & Weare, 2010).
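The tuning sensitivity and autocorrelation issues described above can be seen directly in a toy experiment (not from the article): a random-walk Metropolis sampler targeting a standard normal, run with three hypothetical proposal scales. Too small a scale accepts nearly everything but barely moves; too large a scale rejects nearly everything; an intermediate scale mixes best:

```python
import numpy as np

rng = np.random.default_rng(3)

def rw_metropolis(log_target, scale, n_iter=20_000, x0=0.0):
    """Random-walk Metropolis; returns the chain and the acceptance rate."""
    x = x0
    samples = np.empty(n_iter)
    accepted = 0
    for t in range(n_iter):
        prop = x + scale * rng.normal()
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
            accepted += 1
        samples[t] = x
    return samples, accepted / n_iter

log_std_normal = lambda x: -0.5 * x * x  # standard normal, up to a constant

def lag1_autocorr(s):
    s = s - s.mean()
    return np.dot(s[:-1], s[1:]) / np.dot(s, s)

results = {}
for scale in (0.1, 2.4, 50.0):  # too small, near-optimal, too large
    chain, acc = rw_metropolis(log_std_normal, scale)
    results[scale] = (acc, lag1_autocorr(chain))
    print(scale, results[scale])
```

The intermediate scale yields the lowest lag-1 autocorrelation, which is exactly why adaptive algorithms, HMC, and ensemble samplers were developed: they automate or sidestep this tuning.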
Accelerated Bayesian inference-based history matching of petroleum reservoirs using polynomial chaos expansions
Published in Inverse Problems in Science and Engineering, 2021
Sufia Khatoon, Jyoti Phirani, Supreet Singh Bahga
The inverse problem of history matching deals with the calculation of the probability density of the model parameters θ given the observed data. To this end, we employ Bayes' rule and use the Metropolis-Hastings algorithm [37] to sample the posterior density π(θ). The Metropolis-Hastings algorithm is a Markov chain Monte Carlo (MCMC) method based on the principle that the sampling is done from a Markov chain constructed in such a way that the sampled distribution converges to the target distribution at equilibrium. In the MCMC method, the sampling is initiated with an initial value θ0. Then, the next value θ* is sampled using a proper proposal distribution q(θ*|θ), which is a conditional distribution on the current state. Thereafter, the ratio of the probability of the new proposed sample to the probability of the current state is calculated as α = min(1, [π(θ*) q(θ|θ*)] / [π(θ) q(θ*|θ)]). If the new sample θ* proposed is more likely than the existing sample θ, the sample is accepted; else the sample is accepted with probability α. This procedure is repeated until a stationary state is achieved, and the Markov chain approaches the target distribution after a sufficient burn-in period.
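The steps above can be sketched on a toy Bayesian inference problem (a 1-D normal-mean model standing in for the reservoir model; the data, prior, and proposal scale are all hypothetical). With a symmetric Gaussian proposal the q terms cancel, so the ratio reduces to a posterior-density ratio:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy "observed data": 30 measurements, modeled as N(theta, 0.5^2).
data = rng.normal(1.5, 0.5, size=30)

def log_prior(theta):          # weak N(0, 10^2) prior on theta
    return -0.5 * (theta / 10.0) ** 2

def log_likelihood(theta):
    return -0.5 * np.sum(((data - theta) / 0.5) ** 2)

def log_posterior(theta):      # unnormalized: the evidence is never needed
    return log_prior(theta) + log_likelihood(theta)

n_iter, burn_in = 30_000, 5_000
theta = 0.0                    # initial value theta_0
chain = np.empty(n_iter)
for t in range(n_iter):
    prop = theta + 0.3 * rng.normal()        # symmetric proposal: q cancels
    ratio = np.exp(log_posterior(prop) - log_posterior(theta))
    if rng.uniform() < min(1.0, ratio):      # accept with probability alpha
        theta = prop
    chain[t] = theta

# After burn-in, the chain mean estimates the posterior mean of theta,
# which is close to the data mean under this weak prior.
print(chain[burn_in:].mean())
```

Note that only ratios of the unnormalized posterior appear, which is why, as in the article, the evidence term of Bayes' rule never has to be computed.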
Validation and Uncertainty Quantification for Wall Boiling Closure Relations in Multiphase-CFD Solver
Published in Nuclear Science and Engineering, 2019
Directly inferring the posterior distribution of parameters through Bayes' formula is extremely difficult for a complex model with multiple parameters. The MCMC method is an alternative for Bayesian inference and has demonstrated its applicability in thermal-hydraulic problems.34 The general idea of MCMC is to construct Markov chains that converge to the posterior parameter distributions: for a given parameter, it can be shown that the stationary distribution of the Markov chain is the posterior density. There are multiple algorithms for MCMC sampling; in this work, the Delayed Rejection Adaptive Metropolis (DRAM) algorithm35 is chosen. DRAM has two features. One is delayed rejection: if a candidate is rejected in the sampling process, an alternate candidate is constructed to induce greater mixing. The other is adaption: the covariance matrix of the proposal is continuously updated using the accepted candidates.
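The adaption feature can be sketched in isolation (delayed rejection omitted for brevity; this is a generic adaptive-Metropolis sketch on a hypothetical 2-D target, not the DRAM implementation used in the article). The proposal covariance is periodically re-estimated from the chain history, so the sampler learns the shape of the posterior as it runs:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy target: correlated 2-D normal standing in for a two-parameter posterior.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(cov)

def log_target(x):
    return -0.5 * x @ prec @ x

d, n_iter, adapt_start = 2, 30_000, 1_000
s_d = 2.4 ** 2 / d                 # classic adaptive-Metropolis scaling factor
x = np.zeros(d)
chain = np.empty((n_iter, d))
prop_cov = 0.1 * np.eye(d)         # fixed proposal covariance during warm-up

for t in range(n_iter):
    # Adaption: periodically re-estimate the proposal covariance from the
    # chain so far (small jitter keeps it positive definite).
    if t >= adapt_start and t % 500 == 0:
        prop_cov = s_d * np.cov(chain[:t].T) + 1e-6 * np.eye(d)
    prop = rng.multivariate_normal(x, prop_cov)
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    chain[t] = x

est = np.cov(chain[10_000:].T)
print(est)  # should be close to the target covariance
```

Full DRAM additionally tries a second, scaled-down proposal whenever the first is rejected, which is the delayed-rejection half of the algorithm.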