Basic Approaches of Artificial Intelligence and Machine Learning in Thermal Image Processing
Published in U. Snekhalatha, K. Palani Thanaraj, Kurt Ammer, Artificial Intelligence-Based Infrared Thermal Image Processing and Its Applications, 2023
The LogitBoost algorithm was formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. As the name suggests, it is a boosting technique for classification that aims to attain high accuracy by boosting a base classifier. LogitBoost and AdaBoost are comparable in that both fit an additive logistic regression model; the difference is that LogitBoost minimizes the logistic loss, whereas AdaBoost minimizes the exponential loss. LogitBoost constructs the logistic model by iterative refinement, gradually adding variables as new base models are introduced. When decision trees serve as base learners, the iterative fitting process recursively partitions the data, so a tree structure is built automatically, with branches corresponding to subsets of the data (Friedman et al., 2000).
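The iterative refinement can be sketched concretely: at each round, two-class LogitBoost takes a Newton step on the logistic loss by fitting a weighted least-squares regressor to a working response and adding half of its prediction to the additive model. The sketch below assumes regression stumps from scikit-learn as base learners; the function names and the clipping threshold z_max are illustrative choices, not prescribed by the source.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=50, max_depth=1, z_max=4.0):
    """Minimal two-class LogitBoost sketch (after Friedman et al., 2000).

    y must take values in {0, 1}. Returns the fitted base regressors.
    """
    F = np.zeros(len(y))       # additive model F(x), initialised to 0
    p = np.full(len(y), 0.5)   # P(y = 1 | x), initialised to 1/2
    learners = []
    for _ in range(n_rounds):
        w = np.clip(p * (1.0 - p), 1e-10, None)   # Newton weights
        z = np.clip((y - p) / w, -z_max, z_max)   # working response, clipped
        h = DecisionTreeRegressor(max_depth=max_depth)
        h.fit(X, z, sample_weight=w)              # weighted least-squares fit
        learners.append(h)
        F += 0.5 * h.predict(X)                   # half Newton step
        p = 1.0 / (1.0 + np.exp(-2.0 * F))        # updated class probabilities
    return learners

def logitboost_predict_proba(learners, X):
    """Probability of class 1 under the fitted additive model."""
    F = 0.5 * sum(h.predict(X) for h in learners)
    return 1.0 / (1.0 + np.exp(-2.0 * F))
```

Each round improves the log-likelihood rather than the exponential criterion, which is the formal sense in which LogitBoost "boosts" the classifier toward the logistic-regression solution.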
Correlation-based Oversampling aided Cost Sensitive Ensemble learning technique for Treatment of Class Imbalance
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2022
Debashree Devi, Saroj K. Biswas, Biswajit Purkayastha
From Figure 8 and Table 18, it is observed that LogitBoost performed significantly better than the other boosting variants on the Blood transfusion, Diabetes, Heart, subcl5 and yeast3 datasets. The proposed CorrOV-CSEn scored slightly lower AUC-ROC values than LogitBoost on these five datasets (Table 18). Minimising the logistic loss makes LogitBoost less sensitive to outliers, whereas AdaBoost, in minimising the exponential loss, becomes highly sensitive to outliers and noisy data. Hence, LogitBoost obtains higher AUC-ROC scores on the yeast3 and subcl5 datasets. Because LogitBoost takes the individual characteristics of the data points into account, the imbalanced class distribution becomes a less critical factor during training, which allows it to achieve higher AUC-ROC scores than the remaining boosting variants. CorrOV-CSEn, in turn, incorporates the roles of the features and the misclassification statistics of the instances during training, and thus performs significantly better than the other boosting variants.
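The outlier-sensitivity contrast between the two losses is easy to verify numerically: for a badly misclassified point (large negative margin), the exponential loss grows exponentially while the logistic loss grows only roughly linearly. The snippet below is a minimal illustration of this point, not code from the cited paper.

```python
import numpy as np

def exponential_loss(margin):
    """AdaBoost's criterion: exp(-y * F(x))."""
    return np.exp(-margin)

def logistic_loss(margin):
    """LogitBoost's (binomial) criterion: log(1 + exp(-y * F(x)))."""
    return np.log1p(np.exp(-margin))

# Negative margins correspond to misclassified points; the more negative,
# the more outlier-like the point is for the current model.
for m in (-4.0, -2.0, 0.0, 2.0):
    print(f"margin {m:+.1f}: exponential {exponential_loss(m):8.3f}, "
          f"logistic {logistic_loss(m):6.3f}")
```

At a margin of -4 the exponential loss is about 54.6 while the logistic loss is about 4.02, so AdaBoost's reweighting concentrates far more heavily on such noisy points, which is consistent with LogitBoost's stronger AUC-ROC scores on the noisier yeast3 and subcl5 datasets.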