Explore chapters and articles related to this topic
Bayes Theorem, Causality and Building Blocks for Bayesian Networks
Published in G. Unnikrishnan, Oil and Gas Processing Equipment, 2020
The conditional entropy $H(X \mid Y)$ is a measure of the uncertainty of X given an observation on Y, while the mutual information $I(X, Y)$ is a measure of the information shared by X and Y (Netica, 2017). If X is the variable of interest, then $I(X, Y)$ is a measure of the value of observing Y. The mutual information is computed as
$$I(X, Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = \sum_{Y} P(Y) \sum_{X} P(X \mid Y) \log \frac{P(X, Y)}{P(X)\,P(Y)}$$
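As a minimal sketch of this computation (the joint table `p_xy` and all names below are invented for illustration, not taken from the chapter), the following Python code evaluates $I(X, Y)$ via $H(X) - H(X \mid Y)$ and cross-checks it against the double-sum form:

```python
import math

# Hypothetical joint distribution P(X, Y) over two binary variables.
# Rows index X, columns index Y; entries sum to 1.
p_xy = [[0.30, 0.10],
        [0.20, 0.40]]

def entropy(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Marginals P(X) and P(Y).
p_x = [sum(row) for row in p_xy]
p_y = [sum(col) for col in zip(*p_xy)]

# Conditional entropy H(X | Y) = sum_y P(y) * H(X | Y = y).
h_x_given_y = 0.0
for j, py in enumerate(p_y):
    cond = [p_xy[i][j] / py for i in range(len(p_x))]
    h_x_given_y += py * entropy(cond)

# Mutual information via I(X, Y) = H(X) - H(X | Y).
mi = entropy(p_x) - h_x_given_y

# Cross-check with sum_{x,y} P(x,y) * log [P(x,y) / (P(x) P(y))].
mi2 = sum(p_xy[i][j] * math.log2(p_xy[i][j] / (p_x[i] * p_y[j]))
          for i in range(2) for j in range(2) if p_xy[i][j] > 0)
assert abs(mi - mi2) < 1e-9
print(f"I(X, Y) = {mi:.4f} bits")
```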
Feature Selection
Published in Jan Žižka, František Dařena, Arnošt Svoboda, Text Mining with Machine Learning, 2019
The joint entropy can also be related to the conditional entropy. The conditional entropy measures how much uncertainty remains in one variable once the other is known. When we know that variable X takes the value x, the conditional entropy $H(Y \mid X = x)$ can be calculated as follows:
$$H(Y \mid X = x) = -\sum_{y \in Y} p(y \mid x) \log p(y \mid x)$$
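A minimal sketch of this calculation, assuming $p(y \mid x)$ is estimated from co-occurrence counts; the `pairs` data and the function name are invented for illustration:

```python
import math
from collections import Counter

# Hypothetical paired observations of (x, y), e.g. (term presence, class).
pairs = [("yes", "spam"), ("yes", "spam"), ("yes", "ham"),
         ("no", "ham"), ("no", "ham"), ("no", "spam")]

def conditional_entropy_given_x(pairs, x):
    """H(Y | X = x) = -sum_y p(y|x) log2 p(y|x), with p(y|x) from counts."""
    y_counts = Counter(y for (xi, y) in pairs if xi == x)
    n = sum(y_counts.values())
    return -sum((c / n) * math.log2(c / n) for c in y_counts.values())

print(conditional_entropy_given_x(pairs, "yes"))  # H(Y | X = "yes") ~ 0.918 bits
```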
Shannon entropy and the basics of information theory
Published in Jürgen Bierbrauer, Introduction to Coding Theory, 2016
8.13 Definition. The conditional entropy $H(X \mid Y)$ is defined by
$$H(X \mid Y) = \sum_j q_j \cdot H(X \mid Y = j) = -\sum_{i,j} p(i,j) \cdot \log p(i \mid j).$$
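The two forms in the definition agree because $p(i, j) = q_j \cdot p(i \mid j)$; substituting the inner entropy makes this explicit:
$$\sum_j q_j \, H(X \mid Y = j) = -\sum_j q_j \sum_i p(i \mid j) \log p(i \mid j) = -\sum_{i,j} p(i,j) \log p(i \mid j).$$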
Deep CNN and geometric features-based gastrointestinal tract diseases detection and classification from wireless capsule endoscopy images
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2021
Muhammad Sharif, Muhammad Attique Khan, Muhammad Rashid, Mussarat Yasmin, Farhat Afza, Urcun John Tanik
The above formulation partitions the features into two groups: the distinct fused features produced by the parallel merging approach, and the features common to both vectors. Thereafter, Euclidean distance (ED) is computed over the common features and redundant ones are removed, so that only one feature of each repeated value is retained. The distinct feature vectors are then combined into one matrix by a serial approach. After that, the geometric features obtained in the 'Lesion detection' section are merged into the fused deep CNN vector. Finally, a conditional entropy-based approach is applied to the fused vector and the best features are selected. In the literature, conditional entropy is mostly used to control uncertainty and randomness in computer vision and machine learning, rather than for feature selection; Shannon entropy is typically used instead for feature selection (Khan, Gani, Wahab, & Singh, 2018).
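As a rough illustration of the general idea rather than the authors' exact pipeline, the sketch below scores discretized features by the conditional entropy of the class given each feature and keeps the lowest-scoring (most informative) ones; the toy data and all names are invented:

```python
import math
from collections import Counter

def cond_entropy_class_given_feature(feature_col, labels):
    """H(class | feature): expected class entropy after observing the feature."""
    n = len(labels)
    h = 0.0
    for v in set(feature_col):
        subset = [labels[i] for i in range(n) if feature_col[i] == v]
        counts = Counter(subset)
        h_v = -sum((c / len(subset)) * math.log2(c / len(subset))
                   for c in counts.values())
        h += (len(subset) / n) * h_v
    return h

# Toy fused feature matrix (rows = samples, columns = discretized features)
# and class labels; purely illustrative.
X = [[0, 1, 1], [0, 1, 0], [1, 0, 1], [1, 0, 0]]
y = ["lesion", "lesion", "normal", "normal"]

scores = [cond_entropy_class_given_feature([row[j] for row in X], y)
          for j in range(len(X[0]))]
# Lower H(class | feature) means a more informative feature; keep the best k.
best = sorted(range(len(scores)), key=lambda j: scores[j])[:2]
print("selected feature indices:", best)
```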
A feature selection method with feature ranking using genetic programming
Published in Connection Science, 2022
Guopeng Liu, Jianbin Ma, Tongle Hu, Xiaoying Gao
Lin et al. (2008) constructed a classifier using layered genetic programming, which performed both feature selection and feature extraction. Neshatian and Zhang (2012) combined information entropy and conditional entropy to evaluate the correlation between feature sets and classes for feature selection.
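Read this way, combining the two quantities amounts to an information-gain score, $H(C) - H(C \mid F)$, applied to a feature subset treated as one joint variable; a minimal sketch under that reading, with invented data (not Neshatian and Zhang's actual algorithm):

```python
import math
from collections import Counter

def entropy(labels):
    """H(C) in bits, estimated from a list of class labels."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def info_gain_of_feature_set(rows, labels, feature_idxs):
    """H(C) - H(C | subset), treating the subset as one joint variable."""
    n = len(labels)
    keys = [tuple(row[j] for j in feature_idxs) for row in rows]
    h_cond = 0.0
    for k in set(keys):
        sub = [labels[i] for i in range(n) if keys[i] == k]
        h_cond += (len(sub) / n) * entropy(sub)
    return entropy(labels) - h_cond

rows = [[0, 1], [0, 0], [1, 1], [1, 0]]
labels = ["a", "a", "b", "b"]
print(info_gain_of_feature_set(rows, labels, [0]))     # gain from feature 0 alone
print(info_gain_of_feature_set(rows, labels, [0, 1]))  # gain from the pair
```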