Prediction of Compressive Strength of Self-Compacting Concrete Containing Silica's Using Soft Computing Techniques
Published in Sakshi Gupta, Parveen Sihag, Mohindra Singh Thakur, Utku Kose, Applications of Computational Intelligence in Concrete Technology, 2022
Pranjal Kumar Pandey, Paratibha Aggarwal, Yogesh Aggarwal, Sejal Aggarwal
Model development in SVM is based on a trial-and-error process, and a large number of trials were conducted to arrive at the optimum model. In this study, model development is based on the polynomial kernel function, which was observed to perform better than the radial basis function (RBF) kernel. The precision of the developed models is examined with the help of agreement plots between predicted and actual strength, as shown in Figure 10.4. The performance evaluation parameters show that the bagged SVM-based model, with R2 = 0.9204, MAE = 4.6161 and RMSE = 6.2214, achieved better results than the SVM model, with R2 = 0.9181, MAE = 4.71274 and RMSE = 6.2917.
Performance Evaluation of Best Feature Subsets for Crop Yield Prediction Using Machine Learning Algorithms
Published in Applied Artificial Intelligence, 2019
SVR is commonly used in crop yield prediction (Gu et al. 2016; Ying-Xue, Huan, and Li-Jiao 2017). One advantage of this method is that mathematical analysis is relatively easy, because nonlinear problems in the input space are mapped to linear problems in a high-dimensional feature space (Hearst et al. 1998). In SVR, the radial basis function (RBF) kernel is commonly chosen to achieve better predictive performance (Nanda et al. 2018; Zhang and Huihua 2013). The tuning parameters of cost (C) and kernel width (γ) need to be set for the RBF kernel to obtain an accurate prediction, and their values may vary from one feature subset to another. To obtain the optimal values of γ and C for each best feature subset, SVR is tuned: C is searched over the range 10 to 100 and γ over the range 0 to 3, each in steps of 0.1. This procedure is applied to all distinct feature subsets. The γ and C values for the different feature subsets are tabulated in Table 6.
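The tuning procedure described above amounts to an exhaustive grid search over C and γ for an RBF-kernel SVR, repeated per feature subset. A minimal sketch, assuming synthetic data and a coarsened grid (the paper steps both parameters by 0.1; note that γ must be strictly positive in practice):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(42)
X = rng.uniform(size=(200, 4))                        # stand-in feature subset
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

param_grid = {
    "C": np.arange(10, 101, 10),                      # paper: 10..100, step 0.1
    "gamma": np.round(np.arange(0.1, 3.01, 0.5), 2),  # paper: 0..3, step 0.1
}
search = GridSearchCV(SVR(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print("best C:", search.best_params_["C"],
      "best gamma:", search.best_params_["gamma"])
```

Running the same search on each best feature subset yields the per-subset (C, γ) pairs of the kind tabulated in Table 6.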
Hierarchical age estimation mechanism with adaBoost-based deep instance weighted fusion
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2021
Yongming Li, Fan Li, Yuanlin Zheng, Pin Wang, Mingfeng Jiang, Xinke Li
The effect of different kernel functions is also tested. SVR is used as the age estimation model, and the linear kernel and the RBF kernel are compared here. To ensure a fair comparison, all other relevant parameters of SVR are set to default values. The results are shown in Table 2, where ‘Linear’ indicates a linear kernel function and ‘RBF’ indicates a radial basis function kernel function.
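The kernel comparison can be sketched as below: two SVR models differing only in their kernel, everything else left at defaults, scored with cross-validated MAE (a common age-estimation metric). The synthetic "features to age" data is an assumption standing in for the paper's face features.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))                 # stand-in for face features
age = (35 + 10 * np.tanh(X[:, 0]) + 5 * X[:, 1] ** 2
       + rng.normal(scale=1.0, size=300))

maes = {}
for kernel in ("linear", "rbf"):
    # Only the kernel changes; all other SVR parameters stay at defaults.
    scores = cross_val_score(SVR(kernel=kernel), X, age,
                             scoring="neg_mean_absolute_error", cv=5)
    maes[kernel] = -scores.mean()
    print(f"{kernel}: MAE = {maes[kernel]:.2f}")
```

Holding every other hyperparameter fixed is what makes the comparison fair: any difference in MAE is attributable to the kernel alone.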
A feature selection using improved dragonfly algorithm with support vector machine for breast cancer prediction
Published in Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 2023
S. Roselin Mary, R. Murali Prasad, R. Suguna
Support vector machine classification makes use of a radial basis function (Gaussian) kernel. Since the external summations are random with respect to the test vector, the classification of a test vector can make use of the outcomes of disconnected external summations.
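The RBF (Gaussian) kernel mentioned above can be written out directly as k(x, z) = exp(−γ‖x − z‖²); it equals 1 for identical vectors and decays toward 0 as the vectors move apart. A minimal sketch (the γ value is an illustrative choice, not one taken from the paper):

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian/RBF kernel: exp(-gamma * squared Euclidean distance)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([1.0, 2.0])
print(rbf_kernel(x, x))                       # identical vectors give 1.0
print(rbf_kernel(x, np.array([3.0, 0.0])))    # distant vectors give a value near 0
```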