Texture Feature Extraction
Published in R. Suganya, S. Rajaram, A. Sheik Abdullah, Big Data in Medical Image Processing, 2018
R. Suganya, S. Rajaram, A. Sheik Abdullah
The disorderliness of an image is given by entropy. Entropy is a measure of information content: it measures the randomness of the intensity distribution. When all the elements of the co-occurrence matrix are maximally random, entropy has its highest value, so a homogeneous image has lower entropy than an inhomogeneous one. In fact, when energy gets higher, entropy should get lower. A maximally random matrix corresponds to an image in which there are no preferred gray-level pairs for the distance vector d. $-\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} P(i,j)\,\log\big(P(i,j)\big)$
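As an illustration of this formula only (not code from the chapter), the sketch below builds a gray-level co-occurrence matrix for a distance vector d and computes its entropy; the function name, the NumPy-based implementation, and the test images are assumptions made for this example.

```python
import numpy as np

def glcm_entropy(image, d=(0, 1), levels=8):
    """Entropy of the gray-level co-occurrence matrix (GLCM) of `image`.

    image  : 2-D array of integer gray levels in [0, levels).
    d      : distance vector (row offset, column offset).
    levels : number of gray levels G.
    """
    dr, dc = d
    glcm = np.zeros((levels, levels), dtype=float)

    rows, cols = image.shape
    # Count co-occurring gray-level pairs separated by the vector d.
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                glcm[image[r, c], image[r2, c2]] += 1

    # Normalize counts to joint probabilities P(i, j).
    p = glcm / glcm.sum()

    # Entropy = -sum_i sum_j P(i, j) * log(P(i, j)); skip zero entries.
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

# A homogeneous image yields lower entropy than a maximally random one.
homogeneous = np.zeros((32, 32), dtype=int)
random_img = np.random.randint(0, 8, size=(32, 32))
print(glcm_entropy(homogeneous), glcm_entropy(random_img))
```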
The Media Industries: Segments, Structures, and Similarities
Published in Joan Van Tassel, Lisa Poe-Howfield, Managing Electronic Media, 2012
Joan Van Tassel, Lisa Poe-Howfield
For managers, the distinction is important, because each type of material comes with its own set of standards. Entertainment content must attract viewers and users; information content must inform them without alienating them. On the whole, the creation and distribution of entertainment products do not entail liability, although there may be criminal penalties for engagement with pornographic, treasonous, or other societally prohibited material.
Understanding Data Sources
Published in Praveen Kumar, Jay Alameda, Peter Bajcsy, Mike Folk, Momcilo Markus, Hydroinformatics: Data Integrative Approaches in Computation, Analysis, and Modeling, 2005
From a data analysis viewpoint, we are looking for maximum information content and minimum information loss in a data acquisition system. Information content represents primarily (1) spatial locations of measurements, (2) time of data acquisitions, and (3) calibrated values. The word calibrated refers to the value in physical units, for example, [−20°C, 50°C], rather than to the raw value coming out of an analog-to-digital converter (ADC). Spatial and temporal information about each measurement can be obtained by adding an appropriate sensor. For example, spatial locations of MICAs can be measured by attaching Global Positioning System (GPS) sensors in outdoor environments or by measuring a time delay between acoustic and electromagnetic waves (acoustic time-of-flight ranging) [46-48]. In order to save (1) the cost of a time sensor and (2) the energy consumed by time data acquisition, a relative time can be derived from a processor’s counter since there is no internal clock on a MICA board. Although one eliminates the need for a time sensor, the problem of synchronization of data acquisitions arises. Regardless of the chosen synchronization technique [49,50], for instance, broadcasting a RESET signal to all sensors and adding the offset due to distance [51,52], it is important to incorporate the uncertainty of spatial and temporal measurements into the data analysis. One should also be aware that measuring and transmitting spatial and temporal information consumes not only bandwidth but also energy. Lastly, conversion of raw values to calibrated values in engineering units is a critical step in providing information content. There are sensors that (1) are calibrated, (2) come with calibration equations, or (3) have to be calibrated with a calibration gauge. For instance, the raw values from the thermistor on MTS101CA [53] can be converted to degrees Kelvin with an accuracy of ±0.2 K by using the following approximation over the 0°C to 50°C temperature range: $1/T(K) = a + b\,\ln(R_{thr}) + c\,[\ln(R_{thr})]^{3}$
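A minimal sketch of this raw-to-calibrated conversion is given below. The Steinhart-Hart-style coefficients, the voltage-divider conversion from ADC counts to resistance, and all numeric values are placeholders, not the MTS101CA calibration constants, which are not given in this excerpt.

```python
import math

# Placeholder coefficients for 1/T = a + b*ln(R) + c*[ln(R)]^3; the actual
# MTS101CA values would come from the sensor's calibration data sheet.
A = 1.0e-3
B = 2.0e-4
C = 1.0e-7

def adc_to_resistance(raw, r_ref=10_000.0, adc_max=1023):
    """Convert a raw ADC reading to thermistor resistance (ohms), assuming a
    simple voltage divider with a reference resistor r_ref (an assumption)."""
    return r_ref * (adc_max - raw) / raw

def thermistor_kelvin(r_thr):
    """Calibrated temperature in kelvin from thermistor resistance."""
    ln_r = math.log(r_thr)
    return 1.0 / (A + B * ln_r + C * ln_r ** 3)

raw_value = 512                       # uncalibrated value from the ADC
r = adc_to_resistance(raw_value)      # intermediate physical quantity
t_kelvin = thermistor_kelvin(r)       # calibrated value in engineering units
print(f"{t_kelvin:.2f} K ({t_kelvin - 273.15:.2f} °C)")
```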
Design creativity and the semantic analysis of conversations in the design studio
Published in International Journal of Design Creativity and Innovation, 2021
Hernan Casakin, Georgi V. Georgiev
Information content is considered a fundamental phenomenon of human language and thinking. It is defined as the amount of information transmitted by a specific unit of language in a certain context (Georgiev & Georgiev, 2018); in this study, nouns are the unit of language. Information Content measures the degree of informativeness of a unit. Thus, units with higher Information Content have a lower probability of occurrence in more general contexts (Meymandpour & Davis, 2016). In design, sharp drops in Information Content have been found effective in quantifying design fixation while generating new ideas (Gero, 2011). Measuring entropy, i.e., lack of order or a gradual decline into disorder, based on Information Content in linkography has been useful in detecting high and low scores in creativity during design sessions (Kan & Gero, 2017). Moreover, dissimilar levels of Information Content demonstrate different degrees of usefulness of solutions for designers in the context of function-based models of design (Sen et al., 2010).
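The relationship between informativeness and probability of occurrence can be illustrated with the standard probabilistic formulation IC(x) = −log p(x). The sketch below applies that general definition to noun counts; it is not the specific ontology-based measure of Meymandpour and Davis (2016), and the corpus counts and function names are purely illustrative.

```python
import math
from collections import Counter

def information_content(counts: Counter, unit: str) -> float:
    """IC(x) = -log2 p(x): rarer units in the corpus carry more information."""
    total = sum(counts.values())
    p = counts[unit] / total
    return -math.log2(p)

# Illustrative noun counts from a hypothetical design-conversation corpus.
noun_counts = Counter({"design": 120, "idea": 80, "handle": 5, "hinge": 2})

for noun in noun_counts:
    print(f"{noun:>7}: IC = {information_content(noun_counts, noun):.2f} bits")
# Frequent nouns like "design" have low IC; rare, specific nouns like "hinge"
# have high IC, matching their lower probability of occurrence in general contexts.
```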
Informational analysis of the Canadian National Hydrometric program monitoring network
Published in Canadian Water Resources Journal / Revue canadienne des ressources hydriques, 2023
James M. Leach, Jongho Keum, Jeffrey Karn, Megan Garner, Paulin Coulibaly
Where C is a cost parameter (C > 0), L_ε is the ε-insensitivity function that adds a penalty when the residual is greater than ε (ε ≥ 0), and α_i and α_i* for i = 1, …, n are regression coefficients. K(·,·) is a kernel function (a radial basis function was used), x_i is a row vector sampled from the matrix X, x_d is row d of the matrix X, n is the number of time steps of the hydrometric data being used, and p is the number of hydrometric stations in the network being evaluated, not including station d. Stations that have unique data relative to other stations in the network will be more difficult to reproduce using these regression methods; these stations are further identified through the information theory analysis. The information theory analysis can be used to measure, or quantify, the information content of the network as entropy.
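As a rough illustration of quantifying a station's information content as entropy (not the authors' code), the sketch below estimates the Shannon entropy of a streamflow record by histogram binning; the bin count, the synthetic records, and the station names are assumptions made for the example.

```python
import numpy as np

def marginal_entropy(series, bins=20):
    """Shannon entropy (bits) of a hydrometric record, estimated by binning
    the observations and treating bin frequencies as probabilities."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Illustrative daily-flow records for three hypothetical stations (n time steps).
rng = np.random.default_rng(0)
n = 365
stations = {
    "station_a": rng.gamma(shape=2.0, scale=5.0, size=n),   # flashy regime
    "station_b": rng.gamma(shape=9.0, scale=1.0, size=n),   # damped regime
    "station_c": rng.normal(loc=10.0, scale=0.5, size=n),   # nearly constant
}

# Records with higher marginal entropy carry more information content;
# a nearly constant record contributes little to the network.
for name, flow in stations.items():
    print(f"{name}: H = {marginal_entropy(flow):.2f} bits")
```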