Random Variables
Published in Janos J. Gertler, Fault Detection and Diagnosis in Engineering Systems, 2017
Intuition suggests seeking the parameter estimates as the parameter values with which the distribution assigns the highest local probability to the actual observations. This concept is known as the maximum likelihood principle of parameter estimation. The parameter estimates are thus obtained by maximizing the likelihood function under the given observations, that is,

θ̂ = arg max_θ L_x(ξ, θ)
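As a minimal numerical illustration of the arg-max above (a sketch, not from the chapter; the data, grid, and fixed σ are assumptions): maximizing a Gaussian log-likelihood over a grid of candidate means recovers the sample mean, which is the closed-form ML estimate in this case.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic observations from N(5, 2^2); the mean is the unknown parameter.
x = rng.normal(5.0, 2.0, 1000)

# Log-likelihood of the data for each candidate mean (sigma fixed at 2;
# additive constants dropped, since they do not affect the arg-max).
mus = np.linspace(0.0, 10.0, 1001)
loglik = np.array([-0.5 * (((x - mu) / 2.0) ** 2).sum() for mu in mus])

# theta_hat = arg max_theta L(theta): pick the grid point of highest likelihood.
mu_hat = mus[np.argmax(loglik)]
print(mu_hat)  # close to the sample mean x.mean()
```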
What Is Data Analytics?
Published in Rakesh M. Verma, David J. Marchette, Cybersecurity Analytics, 2019
Rakesh M. Verma, David J. Marchette
The problem of choosing m is a bit trickier. The idea is as follows. Given a probability density function (such as the mixture model of Equation (2.5)), the likelihood is defined as the product of the density evaluated at each observation. The maximum likelihood principle is that the parameters that maximize the likelihood are the “best”. So, naively, we could fit the mixture model for m = 1 and compute this likelihood, then do the same for m = 2, m = 3, and so on, and select the value of m for which the likelihood is maximal.
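To make the naive procedure concrete, here is a hedged sketch (not the book's code; the 1-D data and the basic EM fitting loop are illustrative assumptions) that fits a Gaussian mixture for m = 1, 2, 3 and reports the log-likelihood, i.e. the log of the product of densities:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two well-separated Gaussians (illustrative).
data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def fit_mixture(x, m, n_iter=200):
    """Fit an m-component 1-D Gaussian mixture with a basic EM loop;
    return the maximized log-likelihood."""
    # Crude initialization: spread the means over the data range.
    mu = np.linspace(x.min(), x.max(), m)
    sigma = np.full(m, x.std())
    w = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation.
        dens = w * normal_pdf(x[:, None], mu, sigma)        # shape (n, m)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        sigma = np.maximum(sigma, 1e-6)   # guard against component collapse
    # Log-likelihood: sum of log mixture densities (log of the product).
    return np.log((w * normal_pdf(x[:, None], mu, sigma)).sum(axis=1)).sum()

for m in (1, 2, 3):
    print(m, round(fit_mixture(data, m), 2))
```

For this bimodal data the jump from m = 1 to m = 2 is large; as the excerpt goes on to discuss, the likelihood alone keeps creeping up with m, which is why the naive selection needs care.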
Select Methods for Estimating Probabilistic Models
Published in Craig Friedman, Sven Sandow, Utility-Based Learning from Data, 2016
Maximum-likelihood inference is the logical consequence of using the likelihood as a model performance measure. In Sections 6.2, 8.1, and 8.2, we have discussed in some detail why the likelihood is a reasonable model performance measure; the reasons are the following.

- The likelihood is, by definition, the probability of the data under the model measure (see Definition 6.1).
- The likelihood principle is equivalent to the conjunction of the conditionality principle and the sufficiency principle (see Birnbaum (1962), or Section 6.2.2).
- The likelihood ratio provides a decision criterion for model selection that is optimal in the sense of the Neyman-Pearson lemma (see Neyman and Pearson (1933), or Section 6.2.3).
- The likelihood principle is consistent with Bayesian logic (see Jaynes (2003), or Section 6.2.2).
- An investor who bets in a horse race so as to optimize his wealth growth rate measures relative model performance by means of the likelihood ratio (see Cover and Thomas (1991), or Section 6.2.4).
- An expected-utility-maximizing investor with a utility function of the form U(W) = α log(W − γB) + β who bets in a horse race measures relative model performance by means of the likelihood ratio (Theorem 8.2).
Maximum likelihood-based recursive least-squares estimation for multivariable systems using the data filtering technique
Published in International Journal of Systems Science, 2019
Huafeng Xia, Yongqing Yang, Feng Ding, Ahmed Alsaedi, Tasawar Hayat
For stochastic systems with coloured noises, Ding, Wang, Dai, Li, and Chen (2017) presented a data filtering-based recursive least-squares algorithm for a class of output nonlinear systems with an autoregressive (AR) noise. Zhang, Xu, Ding, and Hayat (2018) proposed a bilinear state observer-based multi-innovation extended stochastic gradient method for bilinear systems with a moving average (MA) noise (Zhang, Ding, Xu, & Yang, 2018). This paper considers the maximum likelihood-based recursive least-squares method for multivariable systems with an ARMA noise. The objective of the maximum likelihood principle is to construct a likelihood function of the observed data and the unknown parameters, and to obtain the parameter estimates by maximising the likelihood function (Gu, Liu, Chou, & Ji, 2019; Gu, Liu, Li, Chou, & Ji, 2019; Soderstrom & Soverini, 2017; Young, 2015). Recently, there has been extensive research on maximum likelihood methods (Schuler & Rose, 2017; Tolic, Milicevic, Suvak, & Biondic, 2018). For example, Wang, Zhang, and Yuan (2017) studied the parameter estimation problems for a dual-rate sampled Hammerstein controlled ARMA system and presented a maximum likelihood estimation algorithm by using the polynomial transformation technique. Chen, Ding, Alsaedi, and Hayat (2017) developed a maximum likelihood-based multi-innovation extended gradient estimation algorithm for controlled autoregressive ARMA systems by using the data filtering technique.
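The paper's full ML-based multivariable algorithm is beyond this excerpt, but the recursive least-squares core it builds on can be sketched. This is the standard single-output RLS recursion, not the paper's filtered, multivariable variant; the system, regressors, and noise level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
theta_true = np.array([1.5, -0.7, 2.0])   # unknown parameters (illustrative)

# Standard recursive least-squares: refine the estimate one sample at a time.
theta = np.zeros(3)          # running parameter estimate
P = np.eye(3) * 1e6          # covariance of the estimate (large = uninformative)
for _ in range(2000):
    phi = rng.normal(size=3)                     # regressor (information) vector
    y = phi @ theta_true + 0.1 * rng.normal()    # noisy scalar observation
    K = P @ phi / (1.0 + phi @ P @ phi)          # gain vector
    theta = theta + K * (y - phi @ theta)        # correct by the innovation
    P = P - np.outer(K, phi) @ P                 # covariance update

print(theta)  # converges toward theta_true
```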
A reliable decision support system for fresh food supply chain management
Published in International Journal of Production Research, 2018
Gabriella Dellino, Teresa Laudadio, Renato Mari, Nicola Mastronardi, Carlo Meloni
In the grid search approach for ARIMA and ARIMAX, we consider all possible combinations of the parameters p, d, q, P, D and Q ranging in predefined intervals, while the seasonality s is set based on results from the pre-processing phase. For each tuple (p, d, q, P, D, Q), the maximum likelihood principle is adopted for estimating the model parameters. Then the analysis of the forecasting model is conducted by means of two kinds of statistical indicators, in-sample and out-of-sample, that are used to determine the best model. Such indicators have been described in Section 2.3.1. For TF, the grid search is implemented as follows: parameters v and r range in predefined intervals, while parameter b is estimated according to the maximum likelihood principle. In order to choose the best value of the unknown parameters, for each pair (v, r), the simulated time series is defined as
Multi-damage accelerated life test based on the Birnbaum-Saunders reliability evaluation model
Published in Journal of Asian Architecture and Building Engineering, 2023
Chenggong Lu, Zhiqiang Wei, Hongxia Qiao, Theogene Hakuzweyezu, Guobin Qiao, Kan Li, Bingrong Zhu
The maximum likelihood estimation method is the most classic parameter estimation method in statistics. Based on the maximum likelihood principle, it combines the density function of the assumed distribution with the sample information, choosing the parameter values that maximize the probability of the observed sample. In recent years, with the growth of computing power, it has developed into a practical parameter estimation tool. In this paper, by taking partial derivatives of the likelihood function and simplifying, the maximum likelihood parameter estimation equations are obtained as shown in equation 11:
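Equation 11 itself is not reproduced in this excerpt. As a hedged numerical illustration of the same principle, the Birnbaum-Saunders log-likelihood can be maximized directly over a parameter grid instead of solving the score equations; the sample, the shape α, and the scale β below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha_true, beta_true = 0.5, 2.0   # shape and scale (illustrative values)

# Sample from Birnbaum-Saunders: if Z ~ N(0, 1), then
# T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2 follows BS(alpha, beta).
z = rng.normal(size=2000)
t = beta_true * (alpha_true * z / 2 + np.sqrt((alpha_true * z / 2) ** 2 + 1)) ** 2

def loglik(t, a, b):
    """Birnbaum-Saunders log-likelihood, summed over the sample."""
    xi = np.sqrt(t / b) - np.sqrt(b / t)                      # equals a * Z
    jac = (np.sqrt(b / t) + (b / t) ** 1.5) / (2 * a * b)     # |dz/dt| factor
    return np.sum(-0.5 * (xi / a) ** 2 + np.log(jac)) - len(t) * 0.5 * np.log(2 * np.pi)

# Brute-force maximization of the likelihood over a coarse grid.
alphas = np.linspace(0.3, 0.8, 51)
betas = np.linspace(1.5, 2.5, 51)
ll = np.array([[loglik(t, a, b) for b in betas] for a in alphas])
i, j = np.unravel_index(np.argmax(ll), ll.shape)
alpha_hat, beta_hat = alphas[i], betas[j]
print(alpha_hat, beta_hat)  # should land near (0.5, 2.0)
```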