Machine learning classifier for fault classification in photovoltaic system
Published in Rajesh Singh, Anita Gehlot, Intelligent Circuits and Systems, 2021
V.S. Bharath Kurukuru, Mohammed Ali Khan, Ahteshamul Haque, Arun Kumar Tripathi
Machine learning (Hale 1981) is a data-analysis procedure that automates analytical model building using techniques that learn iteratively from data. These techniques include decision trees, k-nearest neighbours, the support vector machine (SVM), and the naïve Bayes (NB) classifier, which relies on probability theory. In general, these learning methods fall into two categories: eager learning and lazy learning. In eager learning, a classification model is constructed from labelled training data before any query arrives; the support vector machine is commonly described as an eager learner. In lazy learning, by contrast, model development is deferred, and classification is performed by comparing each test instance against the most similar training instances; k-nearest neighbour is therefore commonly described as a lazy learner. In this paper, SVM classifiers (Chapelle 2007), being eager learners, are used to automate the fault classification process: the model is computed during training, so classifying new instances at run time is fast.
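The eager paradigm described above can be sketched with a minimal linear SVM trained by stochastic sub-gradient descent on the hinge loss. This is a simplified stand-in for the classifiers used in the chapter, and the toy 2-D "fault signature" data and all parameter values are hypothetical:

```python
def train_linear_svm(X, y, lam=0.001, eta=0.1, epochs=200):
    """Eager learning: all fitting happens up front, before any test
    query arrives (stochastic sub-gradient descent on the hinge loss)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:
                # Inside the margin: hinge-loss sub-gradient step
                w = [wj + eta * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += eta * yi
            else:
                # Correctly classified with margin: regularisation shrinkage only
                w = [wj * (1 - eta * lam) for wj in w]
    return w, b

def predict(model, x):
    w, b = model
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy 2-D "fault signature" data (hypothetical, not from the chapter)
X = [[-2.0, -1.5], [-1.8, -2.2], [-2.5, -1.0],
     [2.0, 1.5], [1.7, 2.3], [2.4, 1.1]]
y = [-1, -1, -1, 1, 1, 1]

model = train_linear_svm(X, y)           # all computation happens here
print([predict(model, xi) for xi in X])  # classification itself is cheap
```

The point of the sketch is the division of labour: `train_linear_svm` does all the work once, and `predict` is a single dot product per query, which is why eager learners suit pre-calculated, run-time-critical classification.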
Signature Generation Algorithms for Polymorphic Worms
Published in Mohssen Mohammed, Al-Sakib Khan Pathan, Automatic Defense Against Zero-day Polymorphic Worms in Communication Networks, 2016
Mohssen Mohammed, Al-Sakib Khan Pathan
Instance-based learning is considered another category under the header of statistical methods. Instance-based learning algorithms are lazy-learning algorithms as they delay the induction or generalization process until classification is performed. Lazy-learning algorithms require less computation time during the training phase than eager-learning algorithms (such as decision trees, neural and Bayes nets) but more computation time during the classification process. One of the most straightforward instance-based learning algorithms is the nearest neighbor algorithm.
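The cost asymmetry described above can be seen in a minimal nearest neighbor sketch (the class name and the toy "signature feature" data are illustrative, not from the book): `fit` merely stores the instances, while all real computation is deferred to `predict`.

```python
import math

class NearestNeighbor:
    """Lazy learning: 'training' just stores the data; induction is
    deferred until a classification is actually requested."""
    def fit(self, X, y):
        self.X, self.y = X, y          # no model is induced here
        return self

    def predict(self, x):
        # Generalisation happens now: scan every stored instance
        dists = [math.dist(x, xi) for xi in self.X]
        return self.y[dists.index(min(dists))]

clf = NearestNeighbor().fit([[0, 0], [0, 1], [5, 5], [6, 5]],
                            ["benign", "benign", "worm", "worm"])
print(clf.predict([5.4, 4.8]))   # → worm
```

This is exactly the trade-off the excerpt notes: the training phase is almost free, but each classification scans the whole training set.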
A Study and Analysis of an Emotion Classification and State-Transition System in Brain Computer Interfacing
Published in Qurban A. Memon, Shakeel Ahmed Khoja, Data Science, 2019
Subhadip Pal, Shailesh Shaw, Tarun Saurabh, Yashwant Kumar, Sanjay Chakraborty
KNN is also known as a lazy learning classifier. Decision tree and rule-based classifiers learn a model mapping the input attributes to the class labels as soon as the training data becomes available, and are therefore known as eager learning classifiers. Unlike eager learning classifiers, KNN does not construct a classification model from the data; it classifies a test instance by matching it against K training examples and assigns the class based on similarity to these K nearest neighbours. Briefly, the approach is stated below:
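The matching-and-voting procedure just described can be sketched as follows (the 2-D feature vectors and the two emotional-state labels are hypothetical, chosen only to echo the chapter's setting):

```python
import math
from collections import Counter

def knn_classify(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k most similar training
    examples; no model is built before the test instance arrives."""
    neighbours = sorted(zip(train_X, train_y),
                        key=lambda pair: math.dist(pair[0], x))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors for two emotional states
train_X = [[0.2, 0.1], [0.3, 0.2], [0.1, 0.3],
           [0.9, 0.8], [0.8, 0.9], [1.0, 0.7]]
train_y = ["calm", "calm", "calm", "excited", "excited", "excited"]

print(knn_classify(train_X, train_y, [0.85, 0.80]))  # → excited
```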
A data-driven predictive system using Case-Based Reasoning for the configuration of device-assisted back pain therapy
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2021
Juan A. Recio-García, Belén Díaz-Agudo, Alireza Kazemi, Jose Luis Jorro
CBR is particularly applicable to problems where earlier cases are available, even when the domain is not understood well enough for a deep domain model. CBR is a lazy learning method: this term refers to any machine learning process that defers the majority of computation to consultation time. Lazy learning stands in contrast to traditional eager learning, in which the majority of computation occurs at training time, before any queries are received. The main advantage of employing a lazy learning method such as case-based reasoning is that the target function is approximated locally for each query to the system, which allows it to cope with dynamic changes in the problem domain; i.e., CBR reacts to the dynamic incorporation of new cases while the system is operating. The lazy learning approach used by CBR is well suited to this domain, and better than traditional machine learning algorithms that require a dedicated training phase whenever information is extracted (knowledge generalisation), which makes on-line adaptation difficult. In fact, learning in dynamic environments is not possible for eager learning methods.

Another advantage of CBR over eager machine learning methods is that machine learning techniques are, in general, data oriented: they model the relationships contained in the training data set. This means they cannot handle the so-called cold-start situation, where no representative training data is available, or where we lack a representative selection from part of the problem domain; the resulting model may be biased. Bias is a source of error that causes a model to over-generalise and underfit the data.
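A minimal CBR loop can illustrate both properties claimed above: the answer is approximated locally from the most similar stored case, and a newly retained case takes effect immediately, with no retraining. The case features (pain level, patient age) and therapy labels are hypothetical, not taken from the paper:

```python
import math

class CaseBase:
    """Minimal case-based reasoning sketch: retrieve the most similar
    past case, reuse its solution, and retain new cases at run time."""
    def __init__(self):
        self.cases = []                     # (problem features, solution)

    def retain(self, problem, solution):
        self.cases.append((problem, solution))

    def retrieve_and_reuse(self, query):
        # Local approximation: the answer depends only on the nearest case
        best = min(self.cases, key=lambda c: math.dist(c[0], query))
        return best[1]

cb = CaseBase()
cb.retain([3, 60], "low-intensity therapy")     # (pain level, patient age)
cb.retain([8, 45], "high-intensity therapy")
print(cb.retrieve_and_reuse([7, 50]))           # → high-intensity therapy

# Dynamic incorporation: a new case is usable immediately, no retraining
cb.retain([7, 52], "medium-intensity therapy")
print(cb.retrieve_and_reuse([7, 50]))           # → medium-intensity therapy
```

The second query shows why lazy methods suit dynamic domains: retaining one case changes the system's answer at the very next consultation, whereas an eager learner would need to rebuild its model.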