Literature review and proposed framework
Published in Juan Carlos Chacon-Hurtado, Optimisation of Dynamic Heterogeneous Rainfall Sensor Networks in the Context of Citizen Observatories, 2019
For multiple random variables, the information contained in the set is not necessarily independent. Joint entropy (JH) measures the amount of non-redundant information captured by the sensor network (set); therefore, the joint entropy indicates the diversity of measurements that the network is able to capture. For a set of n discrete variables, the joint entropy is calculated as (Equation 2.16):

JH(X₁, …, Xₙ) = −∑_{x₁} ⋯ ∑_{xₙ} p(x₁, …, xₙ) log p(x₁, …, xₙ)
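A minimal sketch (not from the thesis) of how this joint entropy could be estimated from co-occurring discrete samples, assuming each variable is a sequence of binned observations; the gauge series below are made up for illustration.

```python
# Empirical joint entropy JH(X1, ..., Xn) from equal-length sequences of discrete samples.
import numpy as np
from collections import Counter

def joint_entropy(*variables, base=2):
    """Estimate JH from co-occurring discrete samples (one sequence per variable)."""
    joint_samples = list(zip(*variables))           # one tuple (x1, ..., xn) per time step
    counts = np.array(list(Counter(joint_samples).values()), dtype=float)
    p = counts / counts.sum()                       # empirical joint probabilities p(x1, ..., xn)
    return -np.sum(p * np.log(p)) / np.log(base)    # -sum p log p, in the chosen base

# Hypothetical rain gauges reporting binned rainfall classes
g1 = [0, 1, 1, 0, 2, 1, 0, 2]
g2 = [0, 1, 1, 0, 2, 1, 0, 2]   # duplicates g1, so it adds no new information
g3 = [1, 0, 2, 1, 0, 2, 1, 0]
print(joint_entropy(g1, g2), joint_entropy(g1, g2, g3))
```

A redundant gauge (g2) leaves the joint entropy unchanged, while an independent one (g3) increases it, which is why joint entropy is used to score the non-redundant information of a sensor set.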
Entropy based multi-criteria evaluation for rainfall monitoring networks under seasonal effect
Published in Chongfu Huang, Zoe Nivolianitou, Risk Analysis Based on Data and Crisis Response Beyond Knowledge, 2019
Heshu Li, Dong Wang*, Yuankun Wang
The unit of entropy is the bit (when the logarithm is taken to base 2). To measure the total uncertainty, or information, contained in two or more variables, the joint entropy is defined. For the bivariate case it is formulated as:

H(X, Y) = −∑_{i=1}^{m} ∑_{j=1}^{n} p(xᵢ, yⱼ) log₂ p(xᵢ, yⱼ)
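As a simple illustration (not from the excerpt): for two independent fair coin flips X and Y, each of the four outcomes has probability p(xᵢ, yⱼ) = 1/4, so H(X, Y) = −4 × (1/4) log₂(1/4) = 2 bits; if instead Y always copies X, only two outcomes have probability 1/2 each and H(X, Y) = −2 × (1/2) log₂(1/2) = 1 bit, no more than the uncertainty of either variable alone.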
Evaluating Image Fusion Performance
Published in Hassen Fourati, Krzysztof Iniewski, Multisensor Data Fusion, 2016
where p(u, v) is the joint probability distribution function of U and V, and p(u) and p(v) are the marginal probability distribution functions of U and V, respectively. In fact, MI quantifies the distance between the joint distribution of U and V, that is, p(u, v), and the joint distribution that would hold if U and V were independent, that is, p(u)p(v). Mutual information can be equivalently expressed with the joint entropy H(U, V) and the marginal entropies H(U) and H(V) of the two variables U and V as:

MI(U, V) = H(U) + H(V) − H(U, V)
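A minimal sketch (not from the chapter) checking numerically that the entropy form MI(U, V) = H(U) + H(V) − H(U, V) agrees with the direct form based on p(u, v) and p(u)p(v); the 8-bit intensity data below are synthetic and assumed only for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as an array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
u = rng.integers(0, 256, size=10_000)                            # hypothetical image U
v = np.clip(u + rng.integers(-20, 21, size=u.size), 0, 255)      # V is a noisy copy of U

joint, _, _ = np.histogram2d(u, v, bins=256, range=[[0, 256], [0, 256]])
p_uv = joint / joint.sum()                       # joint distribution p(u, v)
p_u, p_v = p_uv.sum(axis=1), p_uv.sum(axis=0)    # marginals p(u), p(v)

# MI via marginal and joint entropies
mi_from_entropies = entropy(p_u) + entropy(p_v) - entropy(p_uv.ravel())

# MI directly as the divergence between p(u, v) and p(u)p(v)
mask = p_uv > 0
mi_direct = np.sum(p_uv[mask] * np.log2(p_uv[mask] / np.outer(p_u, p_v)[mask]))

print(mi_from_entropies, mi_direct)   # the two values coincide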
Coal free-swelling index modeling by an ordinal-based soft computing approach
Published in International Journal of Coal Preparation and Utilization, 2023
M. Pirizadeh, M. Manthouri, S. Chehreh Chelgani
Nevertheless, some newer and more capable methods that do not suffer from the enumerated shortcomings have received considerable attention from researchers in recent years. One of the most efficient of these for feature selection is the mutual information (MI) method, which can measure nonlinear as well as linear relationships between parameters and works well with categorical variables. Considering two random variables X and Y, the entropy of X, denoted H(X), quantifies its uncertainty, and H(X, Y) is the joint entropy of X and Y. The conditional entropy, computed as H(Y|X) = H(X, Y) − H(X), expresses the uncertainty of Y given that the variable X has been observed. MI between two random variables is a nonlinear measure of the amount of information gained about one variable when the other is observed. It is calculated as I(X, Y) = H(Y) − H(Y|X) and equals the reduction in the uncertainty of variable Y given the observation of variable X (Khodayar, Wang, and Manthouri 2018). As shown in Fig. 2, the MI value is non-negative; in its original version it is zero if the two random variables are independent, and greater values indicate stronger dependence. In the normalized version, MI values are scaled between zero (no MI) and one (full relationship).
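A minimal sketch (not the authors' code) of the quantities just listed for categorical data: H(X), H(X, Y), H(Y|X) = H(X, Y) − H(X), I(X, Y) = H(Y) − H(Y|X), and a normalized MI in [0, 1]. The feature X, the target Y, and the normalization variant are assumptions made for illustration; the paper does not specify which normalization it uses.

```python
import numpy as np
from collections import Counter

def H(*vars_, base=2):
    """Entropy of one variable, or joint entropy of several, from discrete samples."""
    counts = np.array(list(Counter(zip(*vars_)).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(base)

# Hypothetical categorical feature X and class target Y
X = ["a", "a", "b", "b", "c", "c", "a", "b"]
Y = [ 0,   0,   1,   1,   2,   2,   0,   1 ]

H_X, H_Y, H_XY = H(X), H(Y), H(X, Y)
H_Y_given_X = H_XY - H_X                                  # conditional entropy H(Y|X)
MI = H_Y - H_Y_given_X                                    # mutual information I(X, Y)
NMI = MI / max(H_X, H_Y) if max(H_X, H_Y) > 0 else 0.0    # one common normalization

print(H_Y_given_X, MI, NMI)   # here X fully determines Y, so H(Y|X) = 0 and NMI = 1
```

Ranking candidate features by NMI against the target then gives the feature-selection criterion described above.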
Are temperature time series measured at hydrometric stations representative of the river’s thermal regime?
Published in Canadian Water Resources Journal / Revue canadienne des ressources hydriques, 2023
Habiba Ferchichi, André St-Hilaire
The joint entropy is considered a measure of the uncertainty (heterogeneity) in two random variables taken together. Two stations with very similar temperature distributions have low joint entropy. It can also be described by the following equation:
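The equation in question is presumably the same bivariate joint entropy formula given in the excerpt above. As a rough sketch (not the authors' code) of the claim about similar stations, assuming temperatures binned into five classes:

```python
import numpy as np
from collections import Counter

def joint_entropy_bits(x, y):
    """Empirical joint entropy of two discrete series, in bits."""
    counts = np.array(list(Counter(zip(x, y)).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
a = rng.integers(0, 5, size=5_000)                        # station A, binned temperatures
b_similar = np.where(rng.random(a.size) < 0.95, a,
                     rng.integers(0, 5, size=a.size))     # station B mostly matches A
b_unrelated = rng.integers(0, 5, size=a.size)             # station B with no relation to A

print(joint_entropy_bits(a, b_similar), joint_entropy_bits(a, b_unrelated))
# roughly 2.6 bits vs 4.6 bits: the similar pair carries far less joint uncertainty
```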