Fundamental Principles of Parametric Estimation
Published in Anastasia Veloni, Nikolaos I. Miridakis, Erysso Boukouvala, Digital and Statistical Signal Processing, 2018
We begin with the basic mathematical formulation of estimation and then, specializing to the case of scalar parameters, study two different cases: random parameters and non-random parameters. For random parameters, one can assess the mean accuracy of the estimator more easily and define procedures for deriving optimal estimators, called Bayes estimators, that provide the highest possible accuracy. More specifically, three different optimization criteria are defined: the mean square error (MSE), the mean absolute error (MAE), and the mean uniform error, also called the error probability (Pe). Afterwards, deterministic scalar parameters are studied, focusing on the bias of the estimator and on its variance, which measures the estimation accuracy. This leads to the concepts of Fisher information and the Cramér-Rao lower bound on the variance of unbiased estimators. Finally, we generalize this analysis to the case of multiple (vector) parameters.
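As a concrete illustration of the Cramér-Rao bound mentioned above, the following minimal Python sketch (assuming i.i.d. Gaussian observations with known variance, an example not taken from the chapter) checks numerically that the sample mean, an unbiased estimator, attains the bound σ²/N:

```python
import numpy as np

# Minimal sketch: the sample mean of N i.i.d. Gaussian observations is an
# unbiased estimator of the mean, and its variance attains the Cramér-Rao
# lower bound sigma^2 / N (Fisher information per sample is 1 / sigma^2).
rng = np.random.default_rng(0)
theta, sigma, N, trials = 2.0, 1.5, 50, 20000

samples = rng.normal(theta, sigma, size=(trials, N))
estimates = samples.mean(axis=1)   # unbiased estimator of theta

empirical_var = estimates.var()
crlb = sigma**2 / N                # 1 / (N * Fisher information)
print(f"empirical variance: {empirical_var:.5f}, CRLB: {crlb:.5f}")
```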
F
Published in Philip A. Laplante, Comprehensive Dictionary of Electrical Engineering, 2018
first order system  a system that can be described by a linear first-order difference equation. The output y(n) of a first-order system is a linear combination of the past output value y(n - 1) and the input value x(n), i.e., y(n) = a y(n - 1) + b x(n).

first-fit memory allocation  a memory allocation algorithm used for variable-size units (e.g., segments). The "hole" selected is the first one that will fit the unit to be loaded. This hole is then broken up into two pieces: one for the process and one for the unused memory; only in the unlikely case of an exact fit is there no unused memory.

first-in-first-out (FIFO)  a queuing discipline whereby the entries in a queue are removed in the same order as that in which they joined the queue.

first-in-last-out (FILO)  a queuing rule whereby the first entries are removed in the opposite order to that in which they joined the queue. This is typical of stack structures and equivalent to last-in-first-out (LIFO).

first-swing stability  a criterion to determine transient stability by use of the swing equation. The rotor angle immediately following a severe disturbance usually increases. The criterion states that if the rotor angle swings back and decreases a short time after the disturbance, then the system is first-swing stable.

Fisher information  a quantitative measurement of the ability to estimate a specific set of parameters. The Fisher information J(θ) is defined by

$J(\theta) = E_\theta\!\left[\left(\frac{\partial \ln f_\theta(y)}{\partial \theta}\right)^{2}\right] = -E_\theta\!\left[\frac{\partial^{2} \ln f_\theta(y)}{\partial \theta^{2}}\right]$
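For the first-order system entry above, a minimal Python sketch shows the recurrence in action; the coefficients a and b and the step input are illustrative assumptions, not part of the dictionary definition:

```python
import numpy as np

# Minimal sketch of a first-order system y(n) = a*y(n-1) + b*x(n),
# with illustrative coefficients a, b (assumed, not from the entry).
a, b = 0.9, 0.1
x = np.ones(50)            # step input
y = np.zeros_like(x)
for n in range(len(x)):
    y[n] = a * (y[n - 1] if n > 0 else 0.0) + b * x[n]

print(y[-1])  # approaches the steady-state value b / (1 - a) = 1.0
```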
F
Published in Phillip A. Laplante, Dictionary of Computer Science, Engineering, and Technology, 2017
Fisher information  a quantitative measurement of the ability to estimate a specific set of parameters. The Fisher information J(θ) is defined by

$J(\theta) = E_\theta\!\left[\left(\frac{\partial \ln f_\theta(y)}{\partial \theta}\right)^{2}\right] = -E_\theta\!\left[\frac{\partial^{2} \ln f_\theta(y)}{\partial \theta^{2}}\right]$
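The two expectations in this definition agree under standard regularity conditions. As an illustration (a Gaussian location family, an example not given in the dictionary), the following Python sketch checks numerically that both forms give J(θ) = 1/σ²:

```python
import numpy as np

# Minimal sketch: Monte Carlo check that the two forms of J(theta) agree
# for y ~ N(theta, sigma^2), where ln f_theta(y) = -(y - theta)^2 / (2 sigma^2) + const.
# Analytically, both expectations equal 1 / sigma^2.
rng = np.random.default_rng(1)
theta, sigma = 0.5, 2.0
y = rng.normal(theta, sigma, size=1_000_000)

score = (y - theta) / sigma**2   # d/dtheta of ln f_theta(y)
curvature = -1.0 / sigma**2      # d^2/dtheta^2 of ln f_theta(y) (constant here)

print(np.mean(score**2), -curvature, 1 / sigma**2)  # all approximately equal
```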
A novel kinetic model for a cocoa waste fermentation to ethanol reaction and its experimental validation
Published in Preparative Biochemistry & Biotechnology, 2023
Eduardo Alvarado-Santos, Ricardo Aguilar-López, M. Isabel Neria-González, Teresa Romero-Cortés, Víctor José Robles-Olvera, Pablo A. López-Pérez
The term "information matrix" is used to indicate that a larger FIM, in the positive semi-definite ordering of matrices, is associated with a smaller covariance matrix (that is, more information), whereas a smaller FIM is associated with a larger covariance matrix (that is, less information). This matrix quantifies the amount of information contained in the experimental data. We call it "information" because Fisher information measures how much the data tell us about the parameters; in this sense, Fisher information is the amount of information that flows from the data to the parameters. The FIM can be calculated by linearizing the output signals of the studied system around the optimal values of the parameters [64]. The linearization of the outputs with respect to each parameter can be expressed as in (13). Mathematically, the sensitivity coefficients are the first-order derivatives of the model outputs with respect to the model parameters; for more information, see [64].
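To make the sensitivity-based construction concrete, here is a minimal Python sketch; the exponential-decay model, nominal parameter values, and noise level are illustrative assumptions, not the fermentation model or equation (13) of the paper. It approximates the sensitivity coefficients by central finite differences and assembles the FIM as SᵀS/σ²:

```python
import numpy as np

# Minimal sketch: FIM from output sensitivities, for a hypothetical model
# y(t; k, y0) = y0 * exp(-k * t) (assumed for illustration only).
def model(theta, t):
    k, y0 = theta
    return y0 * np.exp(-k * t)

def sensitivities(theta, t, h=1e-6):
    # Central finite-difference approximation of dy/dtheta_j at each time point.
    S = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        dp = np.zeros(len(theta))
        dp[j] = h
        S[:, j] = (model(theta + dp, t) - model(theta - dp, t)) / (2 * h)
    return S

t = np.linspace(0.0, 10.0, 25)
theta_hat = np.array([0.3, 5.0])   # nominal parameter values (assumed)
sigma = 0.1                        # measurement noise std (assumed)

S = sensitivities(theta_hat, t)
FIM = S.T @ S / sigma**2           # larger FIM -> smaller covariance bound
print(FIM)
```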
Optimizing Thermoacoustic Characterization Experiments for Identifiability Improves Both Parameter Estimation Accuracy and Closed-Loop Controller Robustness Guarantees
Published in Combustion Science and Technology, 2022
Xiaoling Chen, Jacqueline O’Connor, Hosam Fathy
Fisher information analysis provides a minimum covariance matrix bound for the estimated parameters via the Cramér-Rao inequality (Forman et al. 2012; Mendoza et al. 2016; Pronzato 2008). In this work, based on the nominal values of the estimated parameters, we apply Fisher information analysis to assess the local identifiability of a model's parameters around those nominal values. Previous work by the authors shows, in simulation, the potential benefits of optimizing a Rijke tube experiment for Fisher identifiability (Chen et al. 2019). The current work provides experimental validation of the applicability of Fisher information analysis to combustion stability experiments. It also shows that optimizing a Rijke tube experiment for Fisher identifiability furnishes tighter parameter estimates (i.e., smaller estimation uncertainties) than a benchmark experiment in which the Rijke tube is excited using a broadband input signal. This broadband excitation is similar to traditional flame transfer function measurement methods, where flames are subjected to a large range of individual frequencies and their response is measured (Freitag et al. 2006; Kim et al. 2010; Palies et al. 2010); these methods are time consuming and information-heavy, making them cumbersome to use in control settings.
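The Cramér-Rao inequality referenced here, cov(θ̂) ⪰ FIM⁻¹, can be illustrated with a short Python sketch; the two FIM values below are invented placeholders standing in for a benchmark and an identifiability-optimized experiment, not data from the paper:

```python
import numpy as np

# Minimal sketch of the Cramér-Rao inequality: cov(theta_hat) >= FIM^{-1}.
# Both FIMs below are illustrative placeholders (assumed values).
FIM_benchmark = np.array([[40.0, 12.0], [12.0, 10.0]])   # broadband excitation
FIM_optimized = np.array([[90.0, 15.0], [15.0, 30.0]])   # optimized excitation

for name, fim in [("benchmark", FIM_benchmark), ("optimized", FIM_optimized)]:
    bound = np.linalg.inv(fim)               # minimum covariance matrix bound
    print(name, np.sqrt(np.diag(bound)))     # per-parameter std-dev lower bounds
```

A larger FIM (in the positive semi-definite sense) yields a smaller inverse, i.e., tighter uncertainty bounds, which is the sense in which the optimized experiment furnishes tighter parameter estimates.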
An Information Geometry Approach to Robustness Analysis for the Uncertainty Quantification of Computer Codes
Published in Technometrics, 2022
Clement Gauchy, Jerome Stenger, Roman Sueur, Bertrand Iooss
Consider the family of parametric densities $\{f_\theta : \theta \in \Theta\}$. We recall that every input variable represents a physical parameter with a known domain of validity; therefore, for all θ in Θ, the support of $f_\theta$ is assumed to be a compact set. The metric associated with the coordinate function θ, called the Fisher (or Fisher-Rao) metric, is defined as follows:

$ds^{2} = d\theta^{T} I(\theta_0)\, d\theta,$

where $I(\theta_0)$ is the Fisher information matrix evaluated at $\theta_0$ for this statistical model. The Fisher information, well known in the fields of optimal design, Bayesian statistics, and machine learning, is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of the distribution of X. The Fisher information matrix defines the following local inner product: $\langle u, v \rangle_{\theta_0} = u^{T} I(\theta_0)\, v$ for $u, v \in \mathbb{R}^{p}$.
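As a small illustration of this local inner product (using the Gaussian family N(μ, σ²), an example not drawn from the article), the following Python sketch evaluates $\langle u, v \rangle_{\theta_0} = u^{T} I(\theta_0)\, v$ with the closed-form FIM:

```python
import numpy as np

# Minimal sketch: Fisher information matrix of the Gaussian family
# N(mu, sigma^2) with theta = (mu, sigma), and the local inner product
# <u, v>_theta = u^T I(theta) v it defines (illustrative values assumed).
def gaussian_fim(mu, sigma):
    # Standard closed form for theta = (mu, sigma): diag(1/sigma^2, 2/sigma^2).
    return np.diag([1.0 / sigma**2, 2.0 / sigma**2])

theta0 = (0.0, 1.5)
I0 = gaussian_fim(*theta0)

u = np.array([1.0, 0.0])   # perturbation of the mean
v = np.array([0.0, 1.0])   # perturbation of the standard deviation
print(u @ I0 @ u, v @ I0 @ v, u @ I0 @ v)  # squared lengths; cross term is zero
```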