Multivariate Analysis and Techniques
Published in N.C. Basantia, Leo M.L. Nollet, Mohammed Kamruzzaman, Hyperspectral Imaging Analysis and Applications for Food Quality, 2018
When the spectral data and target attributes are not linearly related, as a result of physical sample properties or instrumental effects, non-linear methods such as ANN and SVM regression are well suited for the analysis. The most widely used ANN is the multilayer feed-forward neural network, in which the neurons are arranged in three layers: an input layer, a hidden layer and an output layer. The spectral value at every wavelength is fed to the input layer, while the output layer delivers the prediction of the attribute. Feed-forward neural networks usually have one or more hidden layers, which enable the network to model nonlinear and complex correlations. In SVM regression, the input is first mapped into a high-dimensional feature space using a nonlinear mapping, and a linear regression is then constructed in this feature space. The solution of the SVM becomes more complex, and the speed decreases, as the sample size grows. To address these problems, an optimized version of the SVM called the least squares support vector machine (LS-SVM) can be used (Suykens & Vandewalle, 1999).
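The three-layer architecture described above can be sketched as a forward pass in a few lines. The layer sizes, random weights and synthetic spectrum below are illustrative assumptions, not values from the chapter:

```python
import numpy as np

# Minimal sketch of a three-layer feed-forward network for spectral regression.
rng = np.random.default_rng(0)

n_wavelengths = 5      # input layer: one neuron per wavelength
n_hidden = 4           # hidden layer size (illustrative)

W1 = rng.normal(size=(n_hidden, n_wavelengths))  # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(1, n_hidden))              # hidden -> output weights
b2 = np.zeros(1)

def predict(spectrum):
    """Forward pass: spectral values in, attribute prediction out."""
    hidden = np.tanh(W1 @ spectrum + b1)   # nonlinear hidden layer
    return (W2 @ hidden + b2)[0]           # single output neuron

y_hat = predict(rng.normal(size=n_wavelengths))
print(y_hat)  # one scalar attribute prediction
```

In practice the weights would of course be learned from calibration samples (e.g. by back-propagation) rather than drawn at random.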
Theoretical basis – TEI@I methodology
Published in Yafei Zheng, Kin Keung Lai, Shouyang Wang, Forecasting Air Travel Demand, 2018
Yafei Zheng, Kin Keung Lai, Shouyang Wang
First proposed by Vapnik (1995), the support vector machine (SVM) is a successful realization of statistical learning theory (SLT). It is a computational intelligence model based on SLT's Vapnik–Chervonenkis theory and the principle of structural risk minimization. An SVM model seeks a balance between training precision and forecasting accuracy given the limited sample information at hand, in order to obtain better generalization ability. An SVM model is called a support vector regression (SVR) model if it is trained for regression analysis, or a support vector classification (SVC) model if it is trained for classification. The basic idea of an SVM is to map a nonlinear problem in a low-dimensional feature space to a linear problem in a high-dimensional space, thereby simplifying the problem. SVM models have been shown to possess excellent forecasting capabilities even for small samples; their training, however, is time-consuming when facing high-dimensional data. To deal with this problem, Suykens and Vandewalle (1999) proposed a new version of the SVM algorithm, the least squares support vector machine (LSSVM). According to the modeling purpose, LSSVM models can likewise be divided into two main groups, i.e., least squares support vector regression (LSSVR) and least squares support vector classification (LSSVC), for regression and classification respectively. The LSSVR models in particular are widely applied in the forecasting literature. In this section, we give a brief description of the SVR and LSSVR models.
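The nonlinear-to-linear mapping idea can be illustrated with a toy example (not from the chapter): y = x² is nonlinear in x, but becomes exactly linear once x is mapped to the explicit feature vector φ(x) = (x, x²), a stand-in for the implicit kernel map:

```python
import numpy as np

# Toy illustration: y = x^2 is nonlinear in the input x,
# but exactly linear in the feature space phi(x) = (x, x^2).
x = np.linspace(-2.0, 2.0, 21)
y = x ** 2

Phi = np.column_stack([x, x ** 2])           # explicit feature map
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear regression in feature space

print(np.round(w, 6))  # ~ [0, 1]: a purely linear model recovers y
```

Kernel methods such as SVR perform this mapping implicitly, so the high-dimensional features never need to be computed explicitly.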
An analysis of the influence of affordable housing system on price
Published in Dawei Zheng, Industrial, Mechanical and Manufacturing Science, 2015
The least squares support vector machine (LS-SVM) is an extension of the support vector machine that converts the quadratic programming problem into the solution of a set of linear equations. It solves faster and is widely used in regression analysis, pattern recognition and many other areas (An et al. 2011a, Zhou et al. 2011). The regression principle of LS-SVM is as follows:
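The excerpt's central point — that LS-SVM training reduces to one linear system — can be sketched as follows. The RBF kernel choice, the parameter values and the sine-curve data are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, sigma2=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma2))

def lssvm_fit(X, y, gamma=100.0, sigma2=0.5):
    """Solve the LS-SVM dual system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma2) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))  # one linear solve, no QP
    return sol[0], sol[1:]                                # bias b, multipliers alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma2=0.5):
    """f(x) = sum_i alpha_i k(x_i, x) + b."""
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b

# Fit a noiseless sine curve and predict at its peak.
X = np.linspace(0, np.pi, 25)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[np.pi / 2]])))  # close to 1.0
```

The 1/γ term on the kernel diagonal plays the role of the regularization, and every training sample contributes a multiplier αᵢ — unlike the classical SVM, whose solution is sparse in support vectors.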
Modeling vaporization enthalpy of pure hydrocarbons and petroleum fractions using LSSVM approach
Published in Energy Sources, Part A: Recovery, Utilization, and Environmental Effects, 2020
Hani Ahmadi, Haleh Ahmadi, Alireza Baghban
The least squares support vector machine (LSSVM) is the least squares version of the SVM. Its suitability for data analysis and pattern recognition has made it popular in practical problems, and it can be used for both classification and regression analysis with satisfactory outcomes. The LSSVM approach solves a set of linear equations instead of the complex quadratic programming (QP) problem used in the classical SVM. More details on its theory can be found in the literature (Cortes and Vapnik 1995; Smola and Vapnik 1997; Suykens and Vandewalle 1999; Wang 2005). The input parameters of the proposed LSSVM model are the boiling temperature (Tb), specific gravity (SG), and molecular weight (Mw). Matlab version 2016 was used to code the LSSVM approach. Two critical parameters in the LSSVM method, the regularization parameter (γ) and the kernel parameter (σ2), must be determined optimally using optimization techniques. The kernel function used in the LSSVM model is the radial basis function (RBF), chosen for its wide applicability. This study uses a particle swarm optimization (PSO) algorithm, an evolutionary optimization algorithm, to find the best values of these tuning parameters. A typical structure of the proposed LSSVM model is shown in Figure 1.
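The tuning step can be sketched as selecting (γ, σ²) by validation error. A plain grid search stands in here for the paper's PSO algorithm, and the synthetic three-feature data stand in for the actual Tb/SG/Mw inputs:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_predict(Xtr, ytr, Xte, gamma, sigma2):
    """Train an RBF-kernel LS-SVM on (Xtr, ytr) and predict at Xte."""
    d2 = ((Xtr[:, None] - Xtr[None, :]) ** 2).sum(-1)
    n = len(ytr)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), np.exp(-d2 / (2 * sigma2)) + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate([[0.0], ytr]))
    b, alpha = sol[0], sol[1:]
    d2te = ((Xte[:, None] - Xtr[None, :]) ** 2).sum(-1)
    return np.exp(-d2te / (2 * sigma2)) @ alpha + b

# Synthetic stand-in data: 3 inputs, smooth nonlinear target plus noise.
X = rng.uniform(-1, 1, size=(60, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=60)
Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

best = None
for gamma in [1.0, 10.0, 100.0]:
    for sigma2 in [0.1, 1.0, 10.0]:
        mse = np.mean((fit_predict(Xtr, ytr, Xva, gamma, sigma2) - yva) ** 2)
        if best is None or mse < best[0]:
            best = (mse, gamma, sigma2)

print(best)  # validation MSE and the winning (gamma, sigma^2) pair
```

A PSO would explore the same (γ, σ²) objective surface continuously instead of over a fixed grid, which matters when the surface is sharply peaked.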
Automatic Generation Control (AGC) of Wind Power System: An Least Squares-Support Vector Machine (LS-SVM) Radial Basis Function (RBF) Kernel Approach
Published in Electric Power Components and Systems, 2018
Gulshan Sharma, Ibraheem Nasiruddin, K. R. Niazi, R. C. Bansal
In recent years the application of artificial neural networks (ANN) to the problems of real power system operation and control has been widely explored [11–17]. ANN overcomes the drawback of insufficient data availability in real time and improves the dynamic performance of the power system over a wide range of operating conditions. The success of ANN-based AGC depends on the structure chosen, the number of hidden layers, the number of neurons per layer and, most importantly, the choice of learning algorithm and reliable training data, all of which greatly affect the performance of the ANN. These problems are alleviated to a great extent by an advanced machine learning technique, the support vector machine (SVM). The SVM is another class of machine learning techniques with excellent function approximation and classification capabilities [18], and it has better generalization capability than conventional neural networks (NNs). Basically, this network uses a kernel to map the data in the input space to a high-dimensional feature space in which the problem becomes linearly separable. Many classes of kernels are available, such as the linear, Radial Basis Function (RBF) and polynomial kernels. The least squares support vector machine (LS-SVM) is an advance over the traditional SVM, with higher precision and faster convergence [19]. In LS-SVM, the objective function of the optimization problem includes an additional sum-squared-error term, which reduces the computation time of the convex optimization problem and hence improves the performance of the LS-SVM network.
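The sum-squared-error term mentioned above enters the LS-SVM primal problem as follows (this is the standard formulation of Suykens and Vandewalle, 1999, in the usual notation, not an equation taken from this article):

```latex
\min_{w,\, b,\, e} \;\; \frac{1}{2}\|w\|^{2} + \frac{\gamma}{2}\sum_{i=1}^{N} e_i^{2}
\qquad \text{subject to} \qquad
y_i = w^{\top}\varphi(x_i) + b + e_i, \quad i = 1, \dots, N .
```

Replacing the ε-insensitive loss and inequality constraints of the classical SVM with equality constraints and squared error terms eᵢ is exactly what turns the convex QP into a linear (KKT) system, which is the source of the reduced computation time.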
Optical property inversion of biological materials using Fourier series expansion and LS-SVM for hyperspectral imaging
Published in Inverse Problems in Science and Engineering, 2018
Wei Wang, Min Huang, Qibing Zhu, Jianwei Qin
Recently, machine learning methods such as neural networks (NN) and support vector machines (SVM) have been studied for establishing a non-linear inversion regression model, trained on pairs of spatially resolved diffuse reflectance profiles and the corresponding sets of optical absorption and reduced scattering coefficients [17,18]. Such a model can then quickly recover the optical properties from a measured spatially resolved diffuse reflectance profile. The prediction accuracy of an NN model depends on a large amount of sample data and on the model parameters, and its predictions are unstable for small-sample systems. The least squares support vector machine (LS-SVM) is a regression and classification method that evolved from the SVM: by changing the loss function, it replaces the quadratic programming problem with the solution of linear equations. The LS-SVM algorithm reduces the computational complexity, improves speed, and better describes the complex non-linear relationship between spectral features and optical properties, and it has therefore obtained good results in regression modelling [19,20]. Barman et al. [21] used diffuse reflectance data, coupled with LS-SVM, to develop a prediction model for the optical properties; their results demonstrated that determination of the optical properties using the LS-SVM algorithm is feasible. However, these studies used the diffuse reflectance profile directly as the input of the LS-SVM model, which can result in a large input dimension and carries a higher risk of over-fitting. Although the diffuse reflectance profile at the surface of the material contains abundant information related to the absorption and reduced scattering coefficients, how to extract useful feature information to improve the prediction accuracy of the optical parameters remains an open problem. Extraction of meaningful information can also reduce the input dimension, simplify the model structure and enhance model robustness.
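The dimension-reduction idea can be sketched by compressing a profile into a handful of low-frequency coefficients before it reaches the LS-SVM. Discrete Fourier coefficients stand in here for the paper's Fourier series expansion, and the decaying "reflectance profile" is simulated for illustration only:

```python
import numpy as np

# Compress a spatially resolved profile into a few Fourier coefficients,
# so the LS-SVM input is low-dimensional instead of the raw 200-point profile.
r = np.linspace(0.1, 5.0, 200)          # source-detector distances (illustrative)
profile = np.exp(-1.2 * r) / r          # toy diffuse reflectance profile

n_coeffs = 6
coeffs = np.fft.rfft(profile)[:n_coeffs]        # keep the lowest frequencies
features = np.concatenate([coeffs.real, coeffs.imag])

print(len(profile), "->", len(features))        # 200 -> 12 model inputs
```

Feeding these 12 features, rather than 200 raw samples, to the LS-SVM shrinks the input dimension and with it the over-fitting risk discussed above.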