Machine Learning for Solving a Plethora of Internet of Things Problems
Published in Kamal Kumar Sharma, Akhil Gupta, Bandana Sharma, Suman Lata Tripathi, Intelligent Communication and Automation Systems, 2021
Sparsh Sharma, Abrar Ahmed, Mohd Naseem, Surbhi Sharma
Naïve Bayes (NB) is a classification algorithm based on Bayes’ theorem [2]. The Bayesian network, introduced by Pearl in 1988, is a high-level representation of a probability distribution over a set of variables [34]. Naïve Bayes is a probabilistic classification algorithm in that it uses probabilities to make predictions for classification [35]. It does not require a large amount of training data to estimate the parameters needed for classification [36]. It assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature; this is the “naïve” assumption underlying the classifier [37]. It is simple and fast at predicting the class of a dataset, supports multiclass prediction, and handles both real-valued and discrete data [35]. Naïve Bayes is a fast, easy-to-implement and very effective algorithm in ML. NB is usually beneficial for high-dimensional data, where the probability of every feature is estimated independently [34]. It is one of the best and most important algorithms in data mining, as described by Wu et al. [38].
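To make the probabilistic, independently-estimated-feature behaviour concrete, the following minimal sketch (assuming scikit-learn and a small synthetic dataset, both illustrative choices not taken from the chapter) trains a Gaussian Naïve Bayes model and reads out class probabilities:

```python
# Minimal sketch (assumes scikit-learn is available); the toy data below is
# purely illustrative and not taken from the chapter excerpt above.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Two real-valued features, two classes; NB estimates P(feature | class)
# for each feature independently.
X = np.array([[1.0, 2.1], [0.9, 1.8], [3.2, 4.0], [3.0, 4.2]])
y = np.array([0, 0, 1, 1])

model = GaussianNB()
model.fit(X, y)

# Probabilistic output: P(class = 0 | x) and P(class = 1 | x) for a new sample,
# followed by the hard class prediction.
print(model.predict_proba([[1.1, 2.0]]))
print(model.predict([[1.1, 2.0]]))
```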
Intelligent System for Real-time Stability Prediction
Published in Yan Xu, Yuchen Zhang, Zhao Yang Dong, Rui Zhang, Intelligent Systems for Stability Assessment and Control of Smart Power Grids, 2020
Yan Xu, Yuchen Zhang, Zhao Yang Dong, Rui Zhang
For a classification task, a successful probabilistic classification model should be able to describe numerically the uncertainties inherent in the problem and estimate the probability of each class. For voltage instability detection, the output of such a model is the probability of the post-disturbance voltage being stable or unstable. To achieve this, the probabilistic classification model is constructed based on the framework in (C. Wan et al., 2014). Its working principle is shown in Fig. 6.21.
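As a generic illustration of this idea (not the specific framework of C. Wan et al., 2014), a probabilistic classifier for a binary stability label can be sketched as follows, assuming scikit-learn and hypothetical post-disturbance feature vectors:

```python
# Generic sketch of probabilistic stability classification; the features and
# labels are hypothetical and do not reproduce the cited framework.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: features extracted from a post-disturbance trajectory;
# label 1 = voltage stable, 0 = voltage unstable.
X_train = np.array([[0.95, 0.02], [0.97, 0.01], [0.70, 0.15], [0.65, 0.20]])
y_train = np.array([1, 1, 0, 0])

clf = LogisticRegression()
clf.fit(X_train, y_train)

# The model returns P(unstable | x) and P(stable | x) rather than a hard label,
# so downstream logic can act on the estimated uncertainty.
x_new = np.array([[0.88, 0.05]])
p_unstable, p_stable = clf.predict_proba(x_new)[0]
print(f"P(stable) = {p_stable:.2f}, P(unstable) = {p_unstable:.2f}")
```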
A Survey on Data Level Techniques – A Customer Churn Prediction Case Study
Published in Durgesh Kumar Mishra, Nilanjan Dey, Bharat Singh Deora, Amit Joshi, ICT for Competitive Strategies, 2020
Gillala Rekha, Shaveta Malik, Amit Kumar Tyagi, V Krishna Reddy
Naive Bayes: A Bayes classifier is a probabilistic classification algorithm based on Bayes’ theorem. It is an independent-feature model that works with prior and posterior probability estimates. A Naive Bayes (NB) classifier assumes that the presence (or absence) of a particular attribute of a class (i.e., customer churn) is unrelated to the presence (or absence) of any other attribute. The NB classifier has achieved good results on the churn prediction problem in the wireless telecommunications industry.
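Spelled out, the class posterior used by an NB classifier under this independence assumption takes the standard form below (the churn notation is illustrative rather than taken from the excerpt):

```latex
% Naive Bayes posterior under the conditional-independence assumption;
% y = churn / no-churn, x_1,...,x_n = customer attributes (illustrative notation).
P(y \mid x_1, \ldots, x_n)
  \;=\; \frac{P(y)\,\prod_{i=1}^{n} P(x_i \mid y)}{P(x_1, \ldots, x_n)}
  \;\propto\; P(y)\,\prod_{i=1}^{n} P(x_i \mid y)
```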
A study on the evaluation of tokenizer performance in natural language processing
Published in Applied Artificial Intelligence, 2023
NB is a probabilistic classification algorithm based on Bayes’ theorem. Given a class, the algorithm assumes that all features are independent of one another, an assumption widely used to reduce the number of parameters. NB combines Bayes’ rule, the conditional independence assumption, and a classification rule for the input data. Compared to more complex graphical models, NB has the advantage that little data is needed to estimate the parameters required for classification. kNN is an algorithm that computes the distances between a test sample and the training samples in a data set and classifies the test sample according to its nearest neighbours. It has the advantages of straightforward interpretation and short computation time. SVM is a supervised learning model for pattern recognition and data analysis that is mainly used for classification problems (Kim et al. 2018a). The algorithm aims to find a hyperplane that maximizes the margin in the feature space into which the data is mapped.
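As a compact illustration of the kNN and SVM behaviour described above, the sketch below (assuming scikit-learn, with toy data chosen purely for illustration) fits both classifiers on the same samples:

```python
# Sketch comparing kNN (distance-based voting among neighbours) and SVM
# (margin-maximizing hyperplane); assumes scikit-learn, toy data only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
y = np.array([0, 0, 1, 1])

# kNN: label a new point by majority vote among its k nearest training samples.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)

# SVM: learn a separating hyperplane with maximal margin (linear kernel here).
svm = SVC(kernel="linear")
svm.fit(X, y)

x_new = np.array([[0.8, 0.7]])
print("kNN prediction:", knn.predict(x_new))
print("SVM prediction:", svm.predict(x_new))
```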
Determining the most accurate machine learning algorithms for medical diagnosis using the monk’ problems database and statistical measurements
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2023
The NB classifier is a simple probabilistic classification method based on Bayes’ theorem. For two random events that occur consecutively (A and B), Bayes’ theorem expresses the probability of the second event occurring given that the first one has occurred. Thanks to the commutative property, the multiplication rule can be written in two different ways, as in Equation 3;
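In standard notation, with A and B denoting the two events (the labels are illustrative; the excerpt does not reproduce Equation 3 itself), the multiplication rule written in two equivalent ways, and the Bayes’ theorem obtained by equating them, are:

```latex
% Multiplication rule written in two equivalent ways, and Bayes' theorem
% obtained by equating them (standard form; the notation A, B is illustrative).
P(A \cap B) = P(A)\,P(B \mid A) = P(B)\,P(A \mid B)
\quad\Longrightarrow\quad
P(A \mid B) = \frac{P(A)\,P(B \mid A)}{P(B)}
```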