Supervised Models
Published in Chandrasekar Vuppalapati, Democratization of Artificial Intelligence for the Future of Humanity, 2021
In statistics, linear regression is an approach for modeling the relationship between a scalar dependent variable y and one or more explanatory variables denoted X. The case of one explanatory variable is called simple linear regression. When the outcome, or class, is numeric, and all the attributes are numeric, linear regression is a natural technique to consider. This is a staple method in statistics. The idea is to express the class as a linear combination of the attributes, with predetermined weights: x = w0 + w1a1 + w2a2 + ⋯ + wkak, where x is the class; a1, a2, …, ak are the attribute values; and w0, w1, …, wk are weights.
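As an illustration (not part of the excerpt), the weights w0, …, wk in the linear combination above can be estimated by ordinary least squares; the attribute values and true weights below are made up for the sketch:

```python
import numpy as np

# Toy data: 5 instances, 2 numeric attributes a1, a2 (made up).
A = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])

# Class values generated from known weights w0=1, w1=2, w2=3.
x = 1.0 + 2.0 * A[:, 0] + 3.0 * A[:, 1]

# Prepend a column of ones so the constant weight w0 is learned too.
A1 = np.column_stack([np.ones(len(A)), A])
w, *_ = np.linalg.lstsq(A1, x, rcond=None)

print(np.round(w, 6))  # recovers [1. 2. 3.]
```

Because the toy class values were generated exactly from the weights, least squares recovers them; with real data the fit would minimize the squared prediction error instead.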
Edge Analytics
Published in Chandrasekar Vuppalapati, Building Enterprise IoT Applications, 2019
In statistics, linear regression is an approach for modeling the relationship between a scalar dependent variable y and one or more explanatory variables denoted X. The case of one explanatory variable is called simple linear regression. When the outcome, or class, is numeric, and all the attributes are numeric, linear regression is a natural technique to consider. This is a staple method in statistics. The idea is to express the class as a linear combination of the attributes, with predetermined weights: x = w0 + w1a1 + w2a2 + … + wkak.
Learning deterministic models
Published in Richard E. Neapolitan, Xia Jiang, Artificial Intelligence, 2018
Richard E. Neapolitan, Xia Jiang
Multiple linear regression is just like simple linear regression except that there is more than one independent variable. That is, we have m independent variables X1, X2, …, Xm and a dependent variable Y such that y = b0 + b1x1 + b2x2 + … + bmxm + ε_{x1,x2,…,xm},
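A minimal sketch of this model, with invented coefficients and a small Gaussian error term standing in for ε, shows how the b's can be recovered by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# m = 2 independent variables; true coefficients b0=1, b1=0.5, b2=-2 (made up).
n = 200
X = rng.normal(size=(n, 2))
eps = rng.normal(scale=0.1, size=n)            # the error term
y = 1.0 + 0.5 * X[:, 0] - 2.0 * X[:, 1] + eps

# Ordinary least squares: solve for [b0, b1, b2] jointly.
X1 = np.column_stack([np.ones(n), X])
b, *_ = np.linalg.lstsq(X1, y, rcond=None)

print(b)  # close to [1.0, 0.5, -2.0]
```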
Spatio-temporal modelling of the influence of climatic variables and seasonal variation on PM10 in Malaysia using multivariate regression (MVR) and GIS
Published in Geomatics, Natural Hazards and Risk, 2021
Abdulwaheed Tella, Abdul-Lateef Balogun, Ibrahima Faye
Although simple linear regression has been implemented in air pollution studies, it has a low predictive accuracy for particulate matter (Wang and Ogawa 2015). In contrast, multiple linear regression has a higher probability of achieving a better model fit than simple linear regression because it includes more than one explanatory variable in predicting the dependent variable (Gupta 2019). Thus, the capability and strength of the explanatory variables in predicting the dependent variable using MVR will be determined. Moreover, with the application of MVR, the significance of the independent variables (e.g. climatic factors) in predicting the dependent variable (PM10) will be determined. Understanding the multicollinearity between the independent variables is fundamental to determining the statistical significance of the independent variables (Allen 1997; Yoo et al. 2014; Daoud 2017) in predicting the dependent variable. Thus, pairwise correlation analysis was performed to detect multicollinearity among the independent variables. The following statistical indices (Table 1) were used to evaluate the model’s performance.
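The pairwise correlation check described above can be sketched as follows; the climatic variables here are synthetic stand-ins (not the study's data), with one pair made deliberately collinear:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical climatic predictors (synthetic, for illustration only).
temp = rng.normal(25, 3, n)
humidity = 0.9 * temp + rng.normal(0, 1, n)   # deliberately collinear with temp
wind = rng.normal(4, 1, n)                    # independent of the others

X = np.column_stack([temp, humidity, wind])
corr = np.corrcoef(X, rowvar=False)           # pairwise correlation matrix
print(np.round(corr, 2))
```

A high absolute pairwise correlation (commonly |r| > 0.7 or 0.8) between two predictors flags potential multicollinearity, suggesting one of the pair be dropped or the coefficients interpreted with caution.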
Reliable corridor level travel time estimation using probe vehicle data
Published in Transportation Letters, 2020
Rahul Sakhare, Lelitha Vanajakshi
Linear regression is a linear approach to explaining the relationship between a dependent variable and one or more explanatory variables. The dependent variable is the outcome variable that is determined by the independent variables. If only one independent variable is used to model the outcome, it is termed simple linear regression, whereas the use of two or more independent variables is termed multivariate linear regression. Residuals are the differences between the data points and the regressed line, and linear regression models typically minimize the sum of squared residuals; this is known as the least squares approach. It is the most common approach for fitting linear regression models, though alternatives exist, such as minimizing the lack of fit or minimizing a penalized version of the least squares cost function. For the present study, the least squares approach was used to fit the linear regression model using MATLAB. The following section discusses the functional form of the model and the different explanatory variables that were used to build the model.
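The study used MATLAB; as a language-neutral sketch of the same least squares idea, the snippet below fits a line to made-up data and reports the quantity being minimized, the sum of squared residuals:

```python
import numpy as np

# Toy data with one explanatory variable (made up for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least squares fit: chooses b0, b1 to minimize sum((y - (b0 + b1*x))**2).
b1, b0 = np.polyfit(x, y, 1)

residuals = y - (b0 + b1 * x)
print(round(b1, 3), round(b0, 3))                  # fitted slope and intercept
print(round(float(np.sum(residuals**2)), 4))       # minimized sum of squares
```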
Regression: multiple linear
Published in International Journal of Injury Control and Safety Promotion, 2018
The simple linear model is described by the mathematical straight line Y = β0 + β1X, where β0 is called the intercept and β1 is called the slope. The focus in such models is on the slope since, if it is equal to zero, there is no relationship between X and Y. Geometrically, any given straight line is determined by two points (x1, y1) and (x2, y2) that lie on the two-dimensional X–Y plane, so that the simple linear regression slope coefficient can be written as β1 = (y2 − y1)/(x2 − x1), providing us the interpretation of the slope as the change in Y relative to a change in X.
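The two-point characterization above can be made concrete with a short sketch (the points are invented): given any two points on a line, the slope is the change in Y per unit change in X, and the intercept follows from either point.

```python
# Any straight line is fixed by two points lying on it (points made up).
x1, y1 = 1.0, 3.0
x2, y2 = 4.0, 9.0

slope = (y2 - y1) / (x2 - x1)     # change in Y relative to a change in X
intercept = y1 - slope * x1       # solve y1 = intercept + slope * x1

print(slope, intercept)  # 2.0 1.0
```

A slope of zero would make Y constant regardless of X, which is exactly why the slope carries the relationship between the two variables.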