Regression
Published in Benny Raphael, Construction and Building Automation, 2023
Autoregression is frequently used in predictions with time series. Sensors and other sources produce data at regular intervals. Frequently, the output at the next time step is correlated with the data at previous time steps. This typically happens when the value of the output variable is determined by differential equations involving time. Such differential equations can be converted to finite difference forms in which the output at time t can be written in terms of the values of variables at previous time steps (t − 1), (t − 2), etc. Thus, the data from previous time steps can be used to predict the value at the next time step. This can easily be converted into a regression problem. The regression equation is as follows:

y(t) = w1·y(t − 1) + w2·y(t − 2) + … + b

The coefficients w1, w2, etc. are determined through regression. All that needs to be done is to arrange the data in a form such that the columns representing the input variables contain the data at previous time steps.
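As a minimal sketch of this idea (with made-up coefficients, noise, and series length, not data from the book), the series can be arranged into lagged columns and the weights recovered with ordinary least squares:

```python
import numpy as np

# Hypothetical AR(2) series: y(t) = 0.5*y(t-1) + 0.2*y(t-2) + 1.0 + noise
rng = np.random.default_rng(0)
n = 2000
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + 1.0 + 0.1 * rng.standard_normal()

# Arrange the data so that the input columns hold the previous time steps
# (t-1 and t-2) plus a constant column for the bias b.
X = np.column_stack([y[1:-1], y[:-2], np.ones(n - 2)])
w1, w2, b = np.linalg.lstsq(X, y[2:], rcond=None)[0]
print(w1, w2, b)  # close to the true 0.5, 0.2, 1.0
```

With enough data the regression recovers the generating weights; the same column arrangement works for any number of lags.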
Time Series Forecasting
Published in Harry G. Perros, An Introduction to IoT Analytics, 2021
Let X_t be a random variable indicating the value of a time series at time t. The autoregressive model predicts the next value X_t for time t using the expression:

X_t = δ + a_1·X_{t−1} + a_2·X_{t−2} + … + a_p·X_{t−p} + ε_t

where X_{t−1}, X_{t−2}, …, X_{t−p} are the random variables indicating the previous p observed values of the time series, δ is a constant, a_i, i = 1, 2, …, p, are weights, and ε_t is normally distributed white noise with 0 mean and variance σ². The autoregressive model is referred to as AR(p), where p is the order of the model. We note that the autoregressive model is a linear regression of the next value of the time series against one or more prior values of the time series.
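A quick numerical sketch of the AR(p) recursion (with hypothetical AR(2) parameters, not from the book): simulating the model and checking that the stationary mean equals δ/(1 − a_1 − … − a_p), which follows from taking expectations on both sides of the expression above.

```python
import numpy as np

# X_t = delta + a1*X_{t-1} + a2*X_{t-2} + eps_t, eps_t ~ N(0, sigma^2)
delta, a1, a2, sigma = 3.0, 0.5, 0.2, 1.0
rng = np.random.default_rng(42)

n = 50_000
x = np.zeros(n)
for t in range(2, n):
    x[t] = delta + a1 * x[t - 1] + a2 * x[t - 2] + sigma * rng.standard_normal()

# Taking expectations: mu = delta + a1*mu + a2*mu, so mu = delta/(1 - a1 - a2)
theoretical_mean = delta / (1 - a1 - a2)
sample_mean = x[1000:].mean()   # discard burn-in before the process settles
print(theoretical_mean, sample_mean)
```

The sample mean of the simulated series converges to the theoretical stationary mean (10.0 for these parameters).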
A Single Time Series Model
Published in Yu Ding, Data Science for Wind Energy, 2019
The partial autocorrelation function is useful in identifying the model order of an autoregressive process. If the original process is autoregressive of order k, then for p > k we should have ϕ_pp = 0. This can again be checked in a partial autocorrelation function plot by inspecting up to which order the PACF becomes zero or near zero. By setting type = c("partial") in one of its arguments, the acf function computes PACF values and draws a PACF plot. Alternatively, the pacf function in the stats package can do the same. The dashed line on a PACF plot has the same value as the corresponding line on an ACF plot.
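The same cut-off behaviour can be sketched outside R (a minimal NumPy version, using the regression characterization of the PACF: ϕ_kk is the coefficient of the furthest lag in an order-k autoregression; parameters are made up for illustration):

```python
import numpy as np

def pacf_ols(x, max_lag):
    """phi_kk for k = 1..max_lag: regress x_t on its k previous values
    and keep the coefficient of the k-th (furthest) lag."""
    x = np.asarray(x, float) - np.mean(x)
    phis = []
    for k in range(1, max_lag + 1):
        # Columns are the lagged values x_{t-1}, ..., x_{t-k}
        X = np.column_stack([x[k - j:len(x) - j] for j in range(1, k + 1)])
        coef = np.linalg.lstsq(X, x[k:], rcond=None)[0]
        phis.append(coef[-1])
    return np.array(phis)

# Hypothetical AR(2) process: the PACF should cut off after lag 2.
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.standard_normal()

vals = pacf_ols(x, 5)
print(np.round(vals, 2))  # lags 1-2 clearly nonzero, lags 3-5 near zero
```

For an AR(2) process, ϕ_22 equals the second autoregressive coefficient, and the higher-order values fall inside the near-zero band, which is exactly what the PACF plot is inspected for.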
Expressway rear-end crash risk evolution mechanism analysis under different traffic states
Published in Transportmetrica B: Transport Dynamics, 2023
Ling Wang, Lingjie Zou, Mohamed Abdel-Aty, Wanjing Ma
Univariate time series analysis can be extended to multivariate time series by building vector autoregressive models. The vector autoregressive (VAR) model is usually used to describe the relationship between the changes in multivariate time series. If there are m-dimensional series, an m-dimensional p-order vector autoregressive model can be established, and the formula is as follows:

Y_t = A_1·Y_{t−1} + A_2·Y_{t−2} + … + A_p·Y_{t−p} + ε_t

where Y_t is the crash risk value of the target segment at time t, and Y_{t−1}, …, Y_{t−p} are the crash risk values of the two upstream segments, the two downstream segments, and the target segment at the p time steps before time t, i.e. Y_t is a 5-dimensional vector, A_i is a coefficient matrix, and ε_t is a random perturbation term.
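A toy sketch of the VAR mechanics (a hypothetical 2-dimensional VAR(1) with a made-up coefficient matrix, not the paper's 5-dimensional crash-risk model): the coefficient matrix can be recovered by regressing Y_t on Y_{t−1}.

```python
import numpy as np

# Y_t = A1 @ Y_{t-1} + eps_t with a hypothetical stable 2x2 matrix A1
A1 = np.array([[0.5, 0.1],
               [0.2, 0.4]])
rng = np.random.default_rng(7)

n = 5000
Y = np.zeros((n, 2))
for t in range(1, n):
    Y[t] = A1 @ Y[t - 1] + 0.1 * rng.standard_normal(2)

# Least-squares fit: solve Y[:-1] @ B = Y[1:], so B is A1 transposed
A1_hat = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)[0].T
print(np.round(A1_hat, 2))
```

Each row of the estimated matrix describes how one series depends on the lagged values of all series, which is what lets the VAR capture interactions between segments.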
Optimisation of pavement maintenance and rehabilitation activities, timing and work zones for short survey sections and multiple distress types
Published in International Journal of Pavement Engineering, 2020
Valentin Donev, Markus Hoffmann
So far the spatial arrangement of the road sections has not been taken into account. The assumption of independent condition developments on adjacent sections is unrealistic, as they often exhibit quite similar condition. This study employs a time-series model to account for a positive correlation between adjacent road sections (spatial, serial or autocorrelation). The focus, however, is not on a sequence of observations in time, as is usual for time series, but on a series of ordered measurements at intervals of equal length (survey sections). Thus, a second-order autoregressive process was fitted to real-world data using conditional maximum likelihood estimation. The basic formulas are shown in Figure 3(d); for more details on time series, the reader is referred to the literature (e.g. Hamilton 1994, Montgomery et al. 2015). Condition data of regional roads from two Austrian states were used for this analysis, with models being fitted to individual roads. While the mean condition may vary between roads, the estimated autoregressive parameters are relatively consistent, with the average values shown in Figure 3(d). The order (i.e. number of lags) of the autoregressive models is determined based on the partial autocorrelogram (Figure 3(c)). For the vast majority of sections, the correlation is significant over a length of one to four sections (50 m length), with two lags being the most common. Furthermore, the results showed that the number of lags and the degree of correlation depend on the distress type and the length of the survey sections (25 m and 50 m were examined). Figure 3(a,b) illustrates the difference between ordered and random sequences of service lives for rutting.
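The ordered-versus-random contrast illustrated in Figure 3(a,b) can be sketched with a toy example (hypothetical AR(2) parameters, not the fitted Austrian data; note that under Gaussian errors, conditional maximum likelihood for an AR(2) reduces to least squares on the two lagged values): shuffling an autocorrelated sequence of section values destroys its lag-1 correlation while leaving the values themselves unchanged.

```python
import numpy as np

# Hypothetical AR(2) sequence standing in for ordered section conditions
rng = np.random.default_rng(1)
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] + 0.2 * x[t - 2] + rng.standard_normal()

def lag1_corr(v):
    """Correlation between each value and its immediate neighbour."""
    return np.corrcoef(v[:-1], v[1:])[0, 1]

shuffled = rng.permutation(x)          # same values, random order
c_ordered = lag1_corr(x)
c_shuffled = lag1_corr(shuffled)
print(round(c_ordered, 2), round(c_shuffled, 2))
```

The ordered sequence shows strong neighbour correlation (≈0.75 for these parameters), the shuffled one essentially none, which is why treating adjacent sections as independent is unrealistic.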
The application of deep generative models in urban form generation based on topology: a review
Published in Architectural Science Review, 2023
Bo Lin, Wassim Jabi, Padraig Corcoran, Simon Lannon
Autoregressive models use a linear combination of past values of variables to forecast the target variables, and they are very flexible in dealing with different kinds of time series (Kingma and Welling 2014). In the case of images, autoregressive models handle images pixel by pixel rather than as whole images (Hyndman 2018). Masked Autoencoder for Distribution Estimation (MADE), an autoregressive model that modifies an autoencoder network, uses the autoregressive property to forecast the distribution from a set of samples (Turhan and Bilge 2018). PixelCNN Decoder, an autoregressive model based on Convolutional Neural Networks (CNN), can generate images conditionally (Uria et al. 2016). PixelRNN uses the dependency between nearby pixels to generate images sequentially based on Long Short-Term Memory (LSTM) (Oord et al. 2016). Recurrent Neural Networks (RNN) are a class of neural networks modelling information in sequential order, widely used for time series and natural language (Guo and Zhao 2023). However, RNNs perform well only on short-term dependencies and have not proven useful for long-term dependencies. LSTM, a special type of RNN, can seamlessly store and repeatedly utilize long-term information (Oussidi and Elhassouny 2018; Tensorflow n.d.). PixelVAE is a VAE model with an autoregressive model based on PixelCNN for natural image modelling (Oussidi and Elhassouny 2018). Variational Lossy Autoencoder learns the global representation of 2D images by combining VAE with neural autoregressive models, such as RNN, MADE, PixelCNN, and PixelRNN (Gulrajani et al. 2017). Graphgen, GraphRNN, and DeepGMG utilize autoregressive models to generate graphs (Goyal, Jain, and Ranu 2020; Li et al. 2018; You et al. 2018b).

Generative Adversarial Networks (GAN) based models