Rule-Based Classifiers
Published in Mohssen Mohammed, Muhammad Badruddin Khan, Eihab Bashier Mohammed Bashier, Machine Learning, 2016
Ripper (repeated incremental pruning to produce error reduction) is one of the popular sequential covering algorithms. It is based on a technique called reduced error pruning (REP), which is also used in decision tree algorithms. Rule-based algorithms use REP by splitting their training data into a growing set and a pruning set [2]. The initial ruleset is formed from the growing set, and the ruleset is then simplified by repeatedly applying the pruning operation that yields the greatest reduction of error on the pruning set. A pruning operation can be the deletion of any single condition or any single rule. Pruning stops when further application of the pruning operator would increase the error on the pruning set. To explain how Ripper works, we will use JRip, an implementation of Ripper in Weka.
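The grow/prune idea above can be sketched in a few lines. This is a minimal illustration of REP applied to a single rule, under the assumption that a rule is a list of (feature, value) equality conditions predicting the positive class; the data, rule, and helper names are invented for illustration, not taken from the text:

```python
def rule_error(rule, data):
    """Fraction of examples the rule misclassifies.

    A rule is a list of (feature, value) conditions; it predicts the
    positive class when all of its conditions hold on the example.
    """
    wrong = sum(
        all(x.get(f) == v for f, v in rule) != y
        for x, y in data
    )
    return wrong / len(data)


def reduced_error_prune(rule, prune_set):
    """Greedily delete single conditions as long as the error on the
    pruning set does not increase (REP applied to one rule)."""
    best = list(rule)
    best_err = rule_error(best, prune_set)
    changed = True
    while changed and best:
        changed = False
        for i in range(len(best)):
            candidate = best[:i] + best[i + 1:]
            err = rule_error(candidate, prune_set)
            if err <= best_err:  # simpler rule, no worse on pruning set
                best, best_err = candidate, err
                changed = True
                break
    return best


# An overfit rule grown on the growing set ...
grown_rule = [("color", "red"), ("size", "big")]
# ... evaluated against a held-out pruning set:
prune_set = [
    ({"color": "red", "size": "big"}, True),
    ({"color": "red", "size": "small"}, True),
    ({"color": "blue", "size": "big"}, False),
]
pruned = reduced_error_prune(grown_rule, prune_set)
# The "size" condition hurts on the pruning set and is deleted.
```

A full implementation would also consider deleting whole rules from the ruleset, as described above; the sketch covers only the per-rule condition-deletion operator.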
Machine Learning
Published in Pedro Larrañaga, David Atienza, Javier Diaz-Rozo, Alberto Ogbechie, Carlos Puerto-Santana, Concha Bielza, Industrial Applications of Machine Learning, 2019
Repeated incremental pruning to produce error reduction (RIPPERk) (Cohen, 1995) is one of the most popular rule induction models. RIPPERk is based on the incremental reduced error pruning (IREP) algorithm introduced by Fürnkranz and Widmer (1994).
Signature Generation Algorithms for Polymorphic Worms
Published in Mohssen Mohammed, Al-Sakib Khan Pathan, Automatic Defense Against Zero-day Polymorphic Worms in Communication Networks, 2016
RIPPER is a well-known rule-based algorithm [29]. RIPPER stands for repeated incremental pruning to produce error reduction. The algorithm was designed by Cohen in 1995. RIPPER is particularly efficient on large, noisy datasets. The RIPPER algorithm contains two loops: an outer loop and an inner loop. The outer loop adds one rule at a time to the rule base, while the inner loop adds one condition at a time to the current rule, choosing each condition so as to maximize an information gain measure. This process continues until the rule covers no negative examples.
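The inner loop described above can be sketched as follows. This is a minimal, illustrative version of rule growing with a FOIL-style information gain; the `grow_rule` helper, the gain formula's exact form, and the toy data are assumptions for illustration, not the authors' implementation:

```python
import math


def covered(conditions, data):
    """Examples on which every (feature, value) condition holds."""
    return [(x, y) for x, y in data
            if all(x.get(f) == v for f, v in conditions)]


def foil_gain(p0, n0, p1, n1):
    """FOIL-style information gain for adding a condition: the number
    of positives still covered (p1), weighted by the improvement in
    the rule's precision."""
    if p1 == 0:
        return float("-inf")
    return p1 * (math.log2(p1 / (p1 + n1)) - math.log2(p0 / (p0 + n0)))


def grow_rule(data):
    """Inner loop: add the condition with maximal information gain,
    one at a time, until the rule covers no negative example."""
    rule = []
    remaining = list(data)
    while any(not y for _, y in remaining):
        p0 = sum(1 for _, y in remaining if y)
        n0 = sum(1 for _, y in remaining if not y)
        candidates = {(f, v) for x, _ in remaining for f, v in x.items()}
        best_cond, best_gain = None, float("-inf")
        for cond in candidates:
            sub = covered([cond], remaining)
            p1 = sum(1 for _, y in sub if y)
            n1 = sum(1 for _, y in sub if not y)
            gain = foil_gain(p0, n0, p1, n1)
            if gain > best_gain:
                best_cond, best_gain = cond, gain
        if best_cond is None or best_gain == float("-inf"):
            break  # no remaining condition covers a positive example
        rule.append(best_cond)
        remaining = covered([best_cond], remaining)
    return rule


# Toy data: the positive class is exactly color=red AND size=big.
data = [
    ({"color": "red", "size": "big"}, True),
    ({"color": "red", "size": "small"}, False),
    ({"color": "blue", "size": "big"}, False),
    ({"color": "blue", "size": "small"}, False),
]
rule = grow_rule(data)
```

The outer loop would then add `rule` to the rule base, remove the examples it covers, and repeat; RIPPER additionally prunes each grown rule before accepting it.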
Data analytics in quality 4.0: literature review and future research directions
Published in International Journal of Computer Integrated Manufacturing, 2023
Alexandros Bousdekis, Katerina Lepenioti, Dimitris Apostolou, Gregoris Mentzas
Kim et al. (2012) compared seven novelty detection methods and three different dimensionality reduction methods for detecting faulty wafers in semiconductor manufacturing. Sun, Yang, and Wang (2017) proposed a method based on particle swarm optimization and the kernel extreme learning machine in resistance spot welding, targeting accurate and fast joint quality identification. Teti (2015) applied multi-sensor signal processing for the extraction and selection of signal features for pattern recognition. Lieber et al. (2013) implemented data pre-processing and feature extraction and combined supervised and unsupervised learning methods to identify operational patterns and quality-related features. Oliff and Liu (2017) proposed a methodology incorporating the rule-based learning algorithms C4.5 and RIPPER (Repeated Incremental Pruning to Produce Error Reduction).
Developing a knowledge-based system for diagnosis and treatment recommendation of neonatal diseases
Published in Cogent Engineering, 2023
Desalegn Wendimu, Kindie Biredagn
JRip rule classifiers: JRip is an implementation of Repeated Incremental Pruning to Produce Error Reduction (RIPPER). It is an inference and rule-based learner that predicts class labels using propositional rules. JRip is a fast classification algorithm for learning “IF-THEN” rules, and it has the advantage of a high-level, symbolic knowledge representation that contributes to the discoverability of knowledge (Lehr et al., 2011).
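The “IF-THEN” rules such a learner produces form an ordered decision list, which can be applied as sketched below. The rules, feature names, and class labels here are hypothetical placeholders for illustration, not output of JRip or examples from the paper:

```python
def predict(ruleset, default, example):
    """Apply an ordered list of IF-THEN propositional rules: the first
    rule whose conditions all hold decides the class; if no rule fires,
    the default class is returned (as in a JRip-style decision list)."""
    for conditions, label in ruleset:
        if all(example.get(f) == v for f, v in conditions):
            return label
    return default


# Hypothetical rules of the shape a RIPPER-style learner induces
# (the features and labels are invented, not taken from the paper).
rules = [
    ([("temp", "high"), ("rash", "yes")], "infection"),
    ([("temp", "high")], "fever"),
]
```

Because the list is ordered, the more specific rule is checked first; an example with a high temperature and a rash is classified as "infection" rather than falling through to the broader "fever" rule.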