Data generation, collection, analysis, and preprocessing
Published in Chemometric Monitoring: Product Quality Assessment, Process Fault Detection, and Applications, 2017
Madhusree Kundu, Palash Kumar Kundu, Seshu Kumar Damarla
Regression was introduced by Francis Galton in the 1880s, and his ideas were later formalized mathematically by Karl Pearson. Unlike correlation, which treats variables symmetrically, regression analysis is a statistical process for estimating how a dependent variable changes with one or more independent variables. A regression analysis involves independent variables, dependent variables, and parameters. The goal is to derive a function, called the regression function, that expresses the dependent variable in terms of the independent variables. Regression is broadly classified into two categories: parametric and nonparametric. Linear regression and ordinary least squares are parametric methods, in which the regression function is defined by a finite number of parameters estimated from the data of interest. The earliest form of regression was the method of least squares, published by Legendre in 1805 and by Gauss in 1809. Nonparametric regression refers to techniques in which the structure of the regression function is not predetermined but is instead drawn from a set of functions determined by the data; the sample size required is considerably larger than for parametric regression. This book utilizes regression in the following ways: for data smoothing (local regression smoothing using the MATLAB® Curve Fitting Toolbox) and for classification using the multivariate linear regression model partial least squares (PLS), discussed in Chapter 2.
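To make the parametric/nonparametric distinction concrete, the sketch below fits the same noisy data both ways: an ordinary least squares line (a fixed functional form with two parameters) and a Nadaraya-Watson kernel smoother (no assumed form, locally weighted averaging). The data, kernel, and bandwidth are illustrative assumptions, not taken from the chapter.

```python
# Minimal sketch: parametric (OLS line) vs. nonparametric (kernel smoother)
# regression on the same noisy data. Bandwidth h is an assumed value.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

# Parametric: the regression function is fixed up to two parameters (a, b).
A = np.column_stack([np.ones_like(x), x])
a, b = np.linalg.lstsq(A, y, rcond=None)[0]
y_ols = a + b * x

# Nonparametric: the estimate at each point is a locally weighted average;
# no functional form is imposed on the regression function.
def kernel_smooth(x0, x, y, h=0.05):
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

y_np = np.array([kernel_smooth(x0, x, y) for x0 in x])
```

The linear fit badly misses the sinusoid, while the kernel smoother tracks it at the cost of choosing a bandwidth from the data, which is the usual trade-off between the two families.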
Introduction
Published in An Introduction to the Bootstrap, 1994
Bradley Efron, R.J. Tibshirani
This book describes the bootstrap and other methods for assessing statistical accuracy. The bootstrap does not work in isolation but rather is applied to a wide variety of statistical procedures. Part of the objective of this book is to expose the reader to many exciting and useful statistical techniques through real-data examples. Some of the techniques described include nonparametric regression, density estimation, classification trees, and least median of squares regression.
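As a concrete illustration of what "assessing statistical accuracy" means here, the sketch below estimates the standard error of a sample median with the nonparametric bootstrap: resample the data with replacement, recompute the statistic each time, and take the spread of the resampled values. The data and the number of resamples are illustrative assumptions, not an example from the book.

```python
# Minimal sketch of the nonparametric bootstrap standard error of a median.
# The sample and B = 2000 resamples are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=50)    # any observed sample

B = 2000
boot_medians = np.empty(B)
for b in range(B):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_medians[b] = np.median(resample)

se_hat = boot_medians.std(ddof=1)             # bootstrap standard error
print(f"median = {np.median(data):.3f}, bootstrap SE = {se_hat:.3f}")
```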
Jump regression, image processing, and quality control
Published in Quality Engineering, 2018
Nonparametric regression analysis provides statistical tools for estimating regression curves or surfaces from noisy data. Conventional nonparametric regression procedures, such as local kernel smoothers and splines, are appropriate only for estimating continuous regression functions (Fan and Gijbels 1996; Wahba 1990). In practice, however, the underlying regression function can have jumps or other singularities. For instance, the small diamonds in Figure 2 denote the sea-level pressures observed by a Bombay weather station in India during 1921–1992. A jump is visible around the year 1960, which was confirmed in Qiu and Yandell (1998). If a conventional local kernel smoother is used to estimate the underlying regression function, the estimate looks like the dashed curve in the plot. The jump, which could be an important data structure revealing interesting atmospheric phenomena, is clearly blurred by the smoother. Jump regression analysis (JRA) is designed specifically to handle jumps and other singularities in the underlying regression function (Qiu 2005). The regression function estimated by a JRA method is shown by the solid curve in Figure 2; the jump structure is well preserved by this method.
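One way to see why jump-preserving estimation differs from ordinary smoothing is a simple one-sided averaging rule: at each point, fit the left and right halves of the neighborhood separately and keep the side that fits its own data better, so the estimate never averages across a jump. The sketch below conveys this basic idea only; it is not the JRA procedure of Qiu (2005), and the data, bandwidth, and selection rule are assumptions.

```python
# Minimal sketch of a jump-preserving smoother via one-sided local averages.
# A two-sided kernel average would blur the step at x = 0.6; choosing the
# better-fitting side keeps it sharp. All settings here are assumed.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 300)
f = np.where(x < 0.6, 1.0, 2.0)              # true curve with a jump at 0.6
y = f + rng.normal(scale=0.15, size=x.size)

def one_sided_fit(x0, side, h=0.05):
    mask = (x <= x0) if side == "left" else (x >= x0)
    mask &= np.abs(x - x0) <= h
    m = y[mask].mean()                       # local constant fit on one side
    rss = np.mean((y[mask] - m) ** 2)        # how well that side fits itself
    return m, rss

est = np.empty_like(x)
for i, x0 in enumerate(x):
    (mL, rL), (mR, rR) = one_sided_fit(x0, "left"), one_sided_fit(x0, "right")
    est[i] = mL if rL < rR else mR           # keep the better-fitting side
```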
Lifetime sensitivity analysis of FPSO operating parameters on energy consumption and overall oil production in a pre-salt oil field
Published in Chemical Engineering Communications, 2020
Ali Allahyarzadeh-Bidgoli, Daniel Jonas Dezan, Leandro Oliveira Salviano, Silvio de Oliveira Junior, Jurandir Itizo Yanagihara
The analysis of variance is performed using the Smoothing Spline ANOVA (SS-ANOVA) model, a nonparametric regression method suitable for both univariate and multivariate statistical modeling. The method is based on a function decomposition similar to the classical analysis of variance (ANOVA) decomposition and is associated with the notions of main and interaction effects. For this reason, the interpretability of SS-ANOVA results is an additional benefit over standard parametric models. In addition, nonparametric regression normally provides better goodness of fit than models built on parametric assumptions (Hidalgo et al., 2018).
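The functional decomposition behind SS-ANOVA can be sketched with a simple additive model: a constant plus centered smooth main effects, estimated here by backfitting with univariate smoothing splines. This illustrates the decomposition only, under assumed data and smoothing parameters; it is not the SS-ANOVA estimator used in the paper, which also handles interaction terms and selects smoothing penalties from the data.

```python
# Minimal sketch of a functional ANOVA decomposition y ~ mu + f1(x1) + f2(x2),
# fitted by backfitting with smoothing splines. Data and the smoothing
# factor s are assumed for illustration.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
n = 400
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
y = 2.0 + np.sin(2 * np.pi * x1) + (x2 - 0.5) ** 2 + rng.normal(0, 0.1, n)

mu = y.mean()                                # constant term of the decomposition
f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(20):                          # backfitting iterations
    r1 = y - mu - f2                         # partial residual for x1
    s1 = UnivariateSpline(np.sort(x1), r1[np.argsort(x1)], s=n * 0.01)
    f1 = s1(x1) - s1(x1).mean()              # center so effects are identifiable
    r2 = y - mu - f1                         # partial residual for x2
    s2 = UnivariateSpline(np.sort(x2), r2[np.argsort(x2)], s=n * 0.01)
    f2 = s2(x2) - s2(x2).mean()

# The variance of each centered component measures its main effect, which is
# the kind of per-parameter sensitivity information SS-ANOVA makes readable.
print(f"var(f1) = {f1.var():.3f}, var(f2) = {f2.var():.3f}")
```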
Effects of a priori parameter selection in minimum relative entropy method on inverse electrocardiography problem
Published in Inverse Problems in Science and Engineering, 2018
Onder Nazim Onak, Yesim Serinagaoglu Dogrusoz, Gerhard Wilhelm Weber
In recent years, methods such as Multivariate Adaptive Regression Splines (MARS), Conic MARS (CMARS), and its robust version RCMARS [56,57] have been successfully applied in many areas of science and technology, including satellite data reconstruction [58,59], image processing in meteorology [60], and ground motion prediction [61]. These methods are examples of nonparametric regression modelling techniques that make no assumption about the underlying functional relationship between the dependent and independent variables. We plan to use (C)MARS techniques to solve the inverse ECG problem and thereby overcome the sensitivity of the MRE method to the model parameters.
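The building block of MARS is the hinge (truncated linear) basis. The sketch below fits a least-squares model on hinge functions placed at a few fixed candidate knots; real MARS instead chooses knots and terms by greedy forward selection followed by pruning, so this is a simplified illustration under assumed data and knot locations.

```python
# Minimal sketch of the MARS hinge basis: fit y on max(0, x - t) and
# max(0, t - x) at candidate knots by least squares. Knots are assumed
# (quantiles of x) rather than selected greedily as in real MARS.
import numpy as np

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 10, 150))
y = np.abs(x - 4.0) + rng.normal(scale=0.3, size=x.size)

knots = np.quantile(x, [0.25, 0.5, 0.75])    # candidate knot locations
basis = [np.ones_like(x)]
for t in knots:
    basis.append(np.maximum(0.0, x - t))     # right hinge
    basis.append(np.maximum(0.0, t - x))     # left hinge
B = np.column_stack(basis)

coef, *_ = np.linalg.lstsq(B, y, rcond=None) # least-squares fit on the basis
y_hat = B @ coef
```

Because the basis adapts to the data through the knots, no global functional form is ever assumed, which is the property the excerpt highlights for (C)MARS.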