Swarm Intelligence in Cybersecurity
Published in Andrew Schumann, Swarm Intelligence, 2020
Cong Truong Thanh, Quoc Bao Diep, Ivan Zelinka
Alongside ACO, PSO has also been applied to intrusion detection systems (IDS) with satisfactory results. In [1], Aburomman et al. applied ensemble techniques to the intrusion detection problem, using support vector machines (SVM) and k-nearest neighbors (kNN) as base classifiers on the KDD’99 dataset. Three new ensembles were then created using three weighting techniques: standard PSO; a PSO variant whose parameters were optimized by local unimodal sampling (LUS); and the Weighted Majority Algorithm (WMA). Their experiments showed that the ensemble techniques improved on the results obtained from the base classifiers alone, and that the ensembles generated by the PSO techniques achieved better results than the traditional WMA.
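As a rough illustration of how PSO can set ensemble weights (a minimal sketch, not the authors' actual setup), the code below searches for a single mixing weight w that blends the positive-class probabilities of two hypothetical base classifiers; the validation outputs `p_svm`, `p_knn`, and `y_true` are made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical validation-set outputs of two base classifiers (e.g. an SVM
# and a kNN): predicted probability of the positive class per sample.
p_svm = np.array([0.9, 0.2, 0.4, 0.6, 0.8, 0.3])
p_knn = np.array([0.6, 0.7, 0.8, 0.2, 0.4, 0.1])
y_true = np.array([1, 0, 1, 0, 1, 0])

def ensemble_error(w):
    """Error rate of the blended vote w*p_svm + (1-w)*p_knn at threshold 0.5."""
    p = w * p_svm + (1 - w) * p_knn
    return float(np.mean((p >= 0.5).astype(int) != y_true))

# Minimal PSO over the single mixing weight w in [0, 1].
n_particles, n_iters = 10, 30
pos = rng.uniform(0.0, 1.0, n_particles)   # particle positions (candidate w)
vel = np.zeros(n_particles)                # particle velocities
pbest = pos.copy()                         # personal best positions
pbest_err = np.array([ensemble_error(w) for w in pos])
gbest = pbest[pbest_err.argmin()]          # global best position

for _ in range(n_iters):
    r1 = rng.uniform(size=n_particles)
    r2 = rng.uniform(size=n_particles)
    # Standard velocity update: inertia + cognitive pull + social pull.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    err = np.array([ensemble_error(w) for w in pos])
    improved = err < pbest_err
    pbest[improved] = pos[improved]
    pbest_err[improved] = err[improved]
    gbest = pbest[pbest_err.argmin()]
```

On this toy data, each base classifier alone misclassifies two of the six samples; the swarm converges to a blend that is at least as accurate as either base classifier, which is the effect the excerpt reports.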
Data Stream Mining for Big Data
Published in Himansu Das, Jitendra Kumar Rout, Suresh Chandra Moharana, Nilanjan Dey, Applied Intelligent Decision Making in Machine Learning, 2020
In the adaptation method, the learner is adapted to track changes in the evolving data either in an informed way or blindly. In the former case, the learner updates itself only when a change has actually occurred, whereas in the latter case, the learner updates itself regardless of whether a change has occurred. In decision model management, the technique is characterized by the number of decision models kept in memory. It assumes that the data come from several distributions and tries to learn a new model whenever a change is detected. The dynamic weighted majority algorithm (Kolter and Maloof, 2007) is an example of decision model management.
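To make the decision-model-management idea concrete, here is a heavily simplified sketch in the spirit of the dynamic weighted majority algorithm; the `ConstantExpert` base learner, the drifting toy stream, and all parameter values are illustrative assumptions, not taken from Kolter and Maloof's implementation:

```python
class ConstantExpert:
    """Toy base learner: always predicts the label it was created on."""
    def __init__(self, label):
        self.label = label
    def predict(self, x):
        return self.label
    def partial_fit(self, x, y):
        pass  # a real base learner would update incrementally here

def dwm(stream, beta=0.5, theta=0.01, period=1):
    """Simplified dynamic-weighted-majority loop: keep a pool of
    (expert, weight) pairs, demote wrong experts by beta, prune experts
    whose normalized weight falls below theta, and create a new expert
    whenever the weighted global vote errs -- so the pool tracks drift."""
    experts = [(ConstantExpert(stream[0][1]), 1.0)]
    n_correct = 0
    for t, (x, y) in enumerate(stream):
        # Weighted vote over the current pool of decision models.
        votes = {}
        for e, w in experts:
            p = e.predict(x)
            votes[p] = votes.get(p, 0.0) + w
        y_hat = max(votes, key=votes.get)
        n_correct += (y_hat == y)
        # Demote every expert that erred on this example.
        experts = [(e, w * beta if e.predict(x) != y else w)
                   for e, w in experts]
        if t % period == 0:
            # Normalize, prune weak experts, add one if the vote was wrong.
            wmax = max(w for _, w in experts)
            experts = [(e, w / wmax) for e, w in experts if w / wmax >= theta]
            if y_hat != y:
                experts.append((ConstantExpert(y), 1.0))
        for e, _ in experts:
            e.partial_fit(x, y)
    return n_correct / len(stream)

# Toy stream with a concept drift: the label flips from 0 to 1 halfway.
stream = [((i,), 0) for i in range(50)] + [((i,), 1) for i in range(50)]
acc = dwm(stream)
```

After the drift at the midpoint, the old expert is demoted and eventually pruned while a new expert trained on the changed concept takes over, so overall accuracy on the stream stays high despite the change.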
Mobile Phone based ensemble classification of Deep Learned Feature for Medical Image Analysis
Published in IETE Technical Review, 2020
Tamarafinide V. Dittimi, Ching Y. Suen
Stacking uses an ensemble learner to combine the results of multiple base classifiers on a single dataset. The individual classifiers are first trained on the whole training dataset; a linear regression classifier is then fitted, using the accuracy and error of the base techniques in the ensemble, to combine their outputs. The Weighted Majority Algorithm (WMA) constructs a compound algorithm from a pool of prediction algorithms. It addresses a binary decision problem: each base classifier is assigned a positive weight, the weighted votes of all the base models in the pool are computed, and the sample is assigned to the prediction with the highest vote. The algorithm assumes no prior knowledge of the detection rates of the base algorithms in the pool, only that one or more of the base classifiers will perform well [27]. Lastly, Bagging trains the base classifier separately on different subsets of the data. It generates multiple bootstrapped training sets by repeatedly (n times) selecting one of the n samples at random, with each sample having an equal probability of being selected, and calls the base model learning algorithm on each of them to yield a set of base models; some training examples may not be selected at all, while others may be chosen multiple times. Voting is employed to obtain the classification results, giving the ensemble a balance between variability and similarity.
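The weight-update rule behind the WMA described above can be sketched in a few lines; the predictor pool, the round-by-round predictions, and the penalty factor beta = 0.5 below are illustrative assumptions, not values from the cited work:

```python
def weighted_majority(predictions, labels, beta=0.5):
    """Classic Weighted Majority Algorithm for a binary decision problem:
    combine a pool of predictors by weighted vote, multiplicatively
    penalizing (by beta) every predictor that errs on a round."""
    n = len(predictions[0])   # size of the predictor pool
    w = [1.0] * n             # every predictor starts with weight 1
    mistakes = 0
    for preds, y in zip(predictions, labels):
        # Weighted vote of the pool for each of the two outcomes.
        vote_1 = sum(wi for wi, p in zip(w, preds) if p == 1)
        vote_0 = sum(wi for wi, p in zip(w, preds) if p == 0)
        y_hat = 1 if vote_1 >= vote_0 else 0
        mistakes += (y_hat != y)
        # Halve the weight of each predictor that was wrong this round.
        w = [wi * beta if p != y else wi for wi, p in zip(w, preds)]
    return w, mistakes

# Toy pool of three predictors; predictor 0 happens to be always correct.
preds = [(1, 0, 0), (0, 0, 1), (1, 1, 0), (0, 1, 1), (1, 0, 1)]
labels = [1, 0, 1, 0, 1]
w, mistakes = weighted_majority(preds, labels)
```

After the run, the always-correct predictor keeps weight 1.0 while the weaker predictors' weights have been repeatedly halved, so the compound vote converges toward the best member of the pool without any prior knowledge of which member that is, exactly as the excerpt describes.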