Bayesian Learning Approach
Published in Mark Chang, Artificial Intelligence for Drug Development, Precision Medicine, and Healthcare, 2020
The term Bayesian refers to the 18th-century theologian and mathematician Thomas Bayes. Bayes conceived of and applied broadly a method of estimating an unknown probability on the basis of other, related known probabilities. Human beings ordinarily acquire knowledge through a sequence of learning events and experiences. We hold perceptions and understandings of certain things based on our prior experiences or prior knowledge. When new facts are observed, we update our perception or knowledge accordingly. No matter whether the newly observed facts are multiple or solitary, it is this progressive, incremental learning mechanism that is the central idea of the Bayesian approach. Bayes’ rule (or theorem) therefore enunciates important and fundamental relationships among prior knowledge, new evidence, and updated knowledge (posterior probability). It simply reflects the ordinary human learning mechanism, and is part of everyone’s personal and professional life (Chang and Boral, 2008).
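This progressive, incremental learning mechanism can be sketched with Bayes' rule applied sequentially, each posterior becoming the prior for the next observation. The numbers below (an initial belief of 0.5 and hypothetical likelihoods of 0.9 and 0.2) are purely illustrative:

```python
def bayes_update(prior, likelihood_given_h, likelihood_given_not_h):
    """Return P(H | E) via Bayes' rule: posterior = prior * likelihood / evidence."""
    evidence = prior * likelihood_given_h + (1 - prior) * likelihood_given_not_h
    return prior * likelihood_given_h / evidence

# Sequential learning: each posterior serves as the prior for the next evidence.
belief = 0.5  # initial prior (no opinion either way)
for _ in range(3):  # three pieces of confirming evidence
    belief = bayes_update(belief, 0.9, 0.2)
# belief now reflects knowledge updated by all three observations
```

Whether the evidence arrives all at once or one piece at a time, the final posterior is the same; this order-independence is what makes the incremental view equivalent to a single batch update.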
Risk analysis of ships & offshore wind turbines collision: Risk evaluation and case study
Published in C. Guedes Soares, T.A. Santos, Progress in Maritime Technology and Engineering, 2018
Qing Yu, Xuri Xin, Kezhong Liu, Jinfen Zhang
In BBN modeling, the interaction edges between nodes are as important as the selection of the relevant parameters (Zhang et al., 2016). The BBN is constructed through many iterative processes, and the result indicates the potential relations from the evidence nodes to the result nodes (Mazaheri et al., 2016). Using data from accident reports or previous research, some parameters can be quantified in advance to obtain the prior probability. This quantified data represents the conditional probability distribution of a parameter before evidence is observed, based on historical data, records, or experts’ beliefs. However, during an investigation, relevant evidence or information about an unknown event can be collected from individual cases. This evidence is conditional and variable. After taking it into account, the probability of the unknown event is the posterior probability. In this paper, all the nodes are selected from many parameters; this procedure has three stages:
Preliminaries
Published in Stephen Marsland, Machine Learning, 2014
The reason why Bayes’ rule is so important is that it lets us obtain the posterior probability—which is what we actually want—by calculating things that are much easier to compute. We can estimate the prior probabilities by looking at how often each class appears in our training set, and we can get the class-conditional probabilities from the histogram of the values of the feature for the training set. We can use the posterior probability (Figure 2.12) to assign each new observation to one of the classes by picking the class Ci where: P(Ci|x) > P(Cj|x) ∀ j ≠ i,
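A minimal sketch of this histogram-based classifier follows, using synthetic one-dimensional data and arbitrary bin choices (not the book's example): priors come from class frequencies, class-conditionals from normalized histograms, and each observation is assigned to the class with the largest posterior.

```python
import numpy as np

# Hypothetical 1-D training data for two classes
rng = np.random.default_rng(0)
x_c0 = rng.normal(0.0, 1.0, 500)    # class 0, less frequent
x_c1 = rng.normal(3.0, 1.0, 1500)   # class 1, more frequent

bins = np.linspace(-4, 7, 23)  # 22 bins of width 0.5

# Priors P(C_i) estimated from class frequencies in the training set
prior = np.array([len(x_c0), len(x_c1)], dtype=float)
prior /= prior.sum()

# Class-conditional likelihoods P(x | C_i) from normalized histograms
hist0, _ = np.histogram(x_c0, bins=bins, density=True)
hist1, _ = np.histogram(x_c1, bins=bins, density=True)

def classify(x):
    """Assign x to the class C_i maximizing P(C_i | x) ∝ P(x | C_i) P(C_i)."""
    b = int(np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2))
    post = prior * np.array([hist0[b], hist1[b]])  # unnormalized posteriors
    return int(np.argmax(post))
```

Note that the evidence term P(x) is the same for every class, so it can be dropped when we only need the argmax.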
ABAFT: an adaptive weight-based fusion technique for travel time estimation using multi-source data with different confidence and spatial coverage
Published in Journal of Intelligent Transportation Systems, 2023
Sara Respati, Edward Chung, Zuduo Zheng, Ashish Bhaskar
Drawing on probability theory, Choi and Chung (2002) proposed an estimation model using the Bayesian polling technique to fuse travel time data from loop detectors and GPS. Westerman et al. (1996) developed a model using a Bayesian estimator to estimate link mean speed by combining data from probe vehicles and loop detectors in the California Partners for Advanced Transit and Highways (PATH) project. Liu et al. (2016) proposed an improvement of the Bayesian method, an iterative Bayesian, to improve the fusion of loop detector and GPS data. Additionally, Mil and Piantanakulchai (2018) developed a Bayesian-based model that incorporates two factors: the different traffic conditions classified by a Gaussian Mixture (GM) model and the bias in the individual sensor estimation, introduced as a non-zero mean Gaussian distribution. The non-zero mean distribution represents noise with the bias of the sensor’s estimation. Bayesian inference, a probability-based method, relies on the probability/density function to express data uncertainty. The method requires prior knowledge about an event to calculate the posterior probability of the hypothesis, and it updates beliefs as new evidence becomes available (Box, 1992).
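The simplest Gaussian instance of such Bayesian fusion is a precision-weighted average: treating each source's estimate as an independent Gaussian measurement of the same travel time, the posterior (under a flat prior) has precision equal to the sum of the precisions. This is a generic building block, not any of the cited models, and the numbers below are hypothetical:

```python
def gaussian_fuse(mu1, var1, mu2, var2):
    """Fuse two independent Gaussian estimates of the same quantity.

    With a flat prior, the posterior is Gaussian with
    precision = 1/var1 + 1/var2 and mean = precision-weighted average.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

# Loop detector: 120 s with high variance; GPS probes: 100 s, more reliable.
mu, var = gaussian_fuse(120.0, 400.0, 100.0, 100.0)
# The fused estimate lands closer to the more confident source,
# and its variance is smaller than either input's.
```

This weighting by confidence is the same intuition that adaptive fusion schemes generalize when the sensors' reliabilities vary by traffic condition or spatial coverage.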
Bayesian updating methodology for probabilistic model of bridge traffic loads using in-service data of traffic environment
Published in Structure and Infrastructure Engineering, 2022
Thus, this study aims to develop a Bayesian inference methodology to directly update the parameters of the probabilistic model of bridge traffic loads based on new observations regarding the traffic environment. Bayesian inference, a method of statistical inference to quantify the uncertainty of parameters, obtains the posterior probability distribution by combining the existing information represented by a prior model with newly measured data (Ang & Tang, 2007; Box & Tiao, 2011). Bayesian-based methods have been widely used to solve a variety of problems in structural engineering (Beck, 2010; Green, Cross, & Worden, 2015; Huang, Beck, & Li, 2017; Leahy et al., 2015; Lee & Song, 2016; Yu & Cai, 2019; Yu et al., 2019). This method enables us to continuously update traffic load effects on bridges by estimating the parameters of the posterior distribution affected by changes in the surroundings based on WIM data. Moreover, it is often impossible to obtain sufficient WIM data for estimating traffic loads on bridges because of challenges in continuous WIM data acquisition. In such cases, by using indirect information about the traffic around the bridge and the generic probabilistic model of bridge traffic loads as a prior model, Bayesian updating can estimate the traffic load effects on that particular bridge more reasonably and accurately.
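The "prior model plus new observations" updating scheme can be sketched with the simplest conjugate case: a Normal prior on a load-model parameter, updated with a few site-specific measurements of known noise variance. The numbers are illustrative, not the paper's model:

```python
def update_normal_mean(prior_mu, prior_var, data, noise_var):
    """Conjugate Bayesian update of a Normal mean with known noise variance.

    Posterior precision = prior precision + n / noise_var;
    posterior mean is the precision-weighted blend of prior mean and data.
    """
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + sum(data) / noise_var)
    return post_mu, post_var

# Generic prior for a load parameter (hypothetical units), refined with
# a handful of site-specific observations.
mu, var = update_normal_mean(50.0, 25.0, [58.0, 62.0, 60.0], 9.0)
# The posterior mean moves toward the data; the variance shrinks,
# reflecting reduced uncertainty after observing the site.
```

Even three observations pull the estimate strongly toward the measured values here because the measurement noise (variance 9) is small relative to the prior uncertainty (variance 25); a sparser or noisier data set would leave the prior more influential, which is exactly the behavior wanted when WIM data are scarce.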
An adaptive defense mechanism to prevent advanced persistent threats
Published in Connection Science, 2021
Yi-xi Xie, Li-xin Ji, Ling-shu Li, Zehua Guo, Thar Baker
Bayes theorem, the basis of the inference algorithm, expresses the relationship between the prior probability and the posterior probability. It indicates the relationship between observable states and unobservable states. The risk reasoning process on DBAG combines the vulnerability exploitation model and defenders’ partially observable detection results. In the course of the inference, uncertain states can be represented by conditional probability tables. The joint probability of the attack through multiple time slices is formulated under the dynamic Bayesian network as follows: P(X) = ∏_t ∏_i P(X_t^i | Pa(X_t^i)), in which X_t^i represents the state of node i in time slice t, and Pa(X_t^i) represents the parent nodes of X_t^i.
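The dynamic-Bayesian-network factorization of the joint probability can be sketched for a toy two-slice network; the node names and conditional probability tables below are hypothetical, not the paper's DBAG model:

```python
from itertools import product

# Two binary nodes per slice: A (unobservable attack step) and B (observable
# alert, with parent A in the same slice); A at time t depends on A at t-1.
p_a0 = {0: 0.9, 1: 0.1}                   # P(A_0)
p_a_trans = {0: {0: 0.8, 1: 0.2},         # P(A_t | A_{t-1})
             1: {0: 0.1, 1: 0.9}}
p_b = {0: {0: 0.95, 1: 0.05},             # P(B_t | A_t)
       1: {0: 0.3, 1: 0.7}}

def joint(a_seq, b_seq):
    """P(A_0..T, B_0..T) as the product over slices of P(node | parents)."""
    p = p_a0[a_seq[0]] * p_b[a_seq[0]][b_seq[0]]
    for t in range(1, len(a_seq)):
        p *= p_a_trans[a_seq[t - 1]][a_seq[t]] * p_b[a_seq[t]][b_seq[t]]
    return p

# Sanity check: probabilities over all length-2 trajectories sum to 1.
total = sum(joint(a, b)
            for a in product((0, 1), repeat=2)
            for b in product((0, 1), repeat=2))
```

Inference on such a network then conditions this joint on the observed B values (the partially observable detection results) to obtain the posterior over the unobservable attack states A.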