D
Published in Philip A. Laplante, Comprehensive Dictionary of Electrical Engineering, 2018
… arithmetic average of the two signals, that is, (v1 + v2)/2.

differential amplifier: an amplifier intended to respond only to the difference between its input voltages, while rejecting any signal common to both inputs. The differential amplifier is designed such that the difference between the two inputs is amplified (high differential gain), while a signal appearing at either individual input (referenced to ground potential) sees a very low gain (low common-mode gain, usually a loss). The differential amplifier is usually used as the first component at the receiving end of a communications link using twisted-pair cable (either shielded or unshielded) as the transmission medium. This provides a method to reject any common-mode noise induced onto the twisted-pair transmission line, including common-mode noise falling within the useful bandwidth of the communications link. The figure of merit for the differential amplifier is its common-mode rejection ratio (CMRR), computed by dividing the differential-mode gain by the common-mode gain.

differential coding: a coding scheme that codes the differences between samples. See predictive coding.

differential entropy: the entropy of a continuous random variable. For a random variable X with probability density function f(x) on the support set S, the differential entropy h(X) is defined as h(X) = −∫_S f(x) log f(x) dx.

differential gain: … two input signals. The differential gain may be expressed in percentage form by multiplying the above amplification factor by 100, or in decibels by multiplying the common logarithm of the differential gain by 20.

differential inclusion: a multivalued differential equation, ẋ ∈ F(t, x), where F(t, x) is a nonempty set of velocity vectors at x ∈ Rⁿ for each time t on some time interval. The set F(t, x) can be viewed as the set of all possible "velocities" ẋ(t) of a dynamical system modeled by the multivalued, or multifunction, differential equation. A solution x(t) is an absolutely continuous function on some time interval whose velocity vector ẋ lies in the set F(t, x) for almost all t. See also Filippov method.

differential kinematics equation: the equation v = J(q)q̇ can be interpreted as the differential kinematics mapping relating the n components of the joint velocity vector q̇ to the r ≤ m components of the velocity vector v of concern for the specific task. Here n denotes the number of degrees of mobility of the structure, m is the number of operational space variables, and r is the number of operational space variables necessary to specify a given task. See also geometric Jacobian.

differential length vector: the vector sum of the differential length changes in each of the three coordinate directions along a given curve.

differential mode gain: for a differential amplifier, the ratio of the output signal amplitude to the amplitude of the difference signal between the amplifier input terminals.

differential pair: a two-transistor BJT (FET) amplifier in which a differential input signal is applied to the base (gate) terminals of the two transistors, the output is taken differentially from the collector (drain) terminals, and the emitter (source) …
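To make the differential-entropy definition above concrete (this example is not part of the dictionary entry), the Python sketch below evaluates h(X) = −∫ f(x) log f(x) dx numerically for a Gaussian density and compares the result with the known closed form ½ ln(2πeσ²); the grid width and resolution are arbitrary choices.

import numpy as np

# Illustrative check (not from the dictionary): h(X) = -integral of f(x) ln f(x) dx
# for a Gaussian density with standard deviation sigma.
sigma = 2.0
x = np.linspace(-12 * sigma, 12 * sigma, 200001)   # wide grid, arbitrary resolution
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

h_numeric = np.sum(-f * np.log(f)) * dx            # simple Riemann-sum approximation
h_closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed_form)                    # both ~2.112 nats for sigma = 2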
The growth and size of orogenic gold systems: probability and dynamical behaviour
Published in Australian Journal of Earth Sciences, 2023
Each probability distribution is characterised by several parameters, as indicated in Figure 1 and Table 1. A convenient summary measure of a distribution is its differential entropy, which is a function of the probability distribution parameters (Table 1) and indicates the degree of order in the system. Equivalently, the entropy can be thought of as a measure of uncertainty in the system: systems with high entropy have low order and high uncertainty, whereas systems with low entropy have high order and low uncertainty. The differential entropy differs from the classical Shannon entropy and can be negative. In Figure 1, distributions that plot to the left in each diagram (and resemble power-law distributions) have low entropies, whereas those that plot to the right (and resemble normal distributions) have high entropies.
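As an aside not drawn from the paper, the dependence of differential entropy on the distribution parameters can be checked directly with scipy.stats, whose frozen continuous distributions provide an entropy() method returning the differential entropy in nats; the parameter values below are arbitrary.

from scipy import stats

# Differential entropy is a function of the distribution parameters:
# for a normal distribution it equals 0.5*ln(2*pi*e*sigma^2), so it grows with sigma.
for sigma in (0.5, 1.0, 2.0):
    print("normal, sigma =", sigma, "->", stats.norm(scale=sigma).entropy())

# Other families have their own parameter-dependent entropies, e.g. a lognormal
# with shape parameter s, or a Pareto with tail exponent b (values chosen arbitrarily).
print("lognormal, s = 1 ->", stats.lognorm(s=1.0).entropy())
print("pareto, b = 2 ->", stats.pareto(b=2.0).entropy())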
Design of combustion experiments using differential entropy
Published in Combustion Theory and Modelling, 2022
Éva Valkó, Máté Papp, Márton Kovács, Tamás Varga, István Gy. Zsély, Tibor Nagy, Tamás Turányi
Sheen and Manion [21] developed an algorithm to define a set of measurements that minimise the uncertainty of a given model output. They called this method ‘Experimental Design through Differential Information’ (EDDI). The method uses differential entropy as a tool to measure the predictive potential of experimental datasets. Like its discrete counterpart, differential entropy measures the average surprisal of a random variable, extended to continuous probability distributions. Sheen and Manion summarised the goal of experimental design as finding a set of measurements that minimise the uncertainty of some simulated values of the model. This corresponds to finding a set of measurements that minimise the Shannon information

H(X) = E[−log f(X)] = −∫ f(x) log f(x) dx,

where E is the expected value functional and f is the probability density function of the random variable X, which can be the simulation result of the optimised model, the value of an estimated kinetic parameter, etc. In the case of an arbitrary multivariate normal distribution with a given covariance matrix, it can be shown that the information entropy is a linear function of the logarithm of the determinant of the corresponding covariance matrix. If the covariance matrix is diagonal (and thus also in the single-variable case), the amount of information is a linear function of the logarithm of the product of the variances of the components of X.
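The log-determinant relationship mentioned above can be verified with a short calculation (an illustration under assumed covariance values, not taken from the paper): for a k-variate normal distribution with covariance matrix Σ, the differential entropy is h = ½ ln((2πe)^k det Σ), which is linear in ln det Σ and, for diagonal Σ, in the logarithm of the product of the variances.

import numpy as np
from scipy import stats

# h = 0.5 * (k*ln(2*pi*e) + ln(det(Sigma))) for a k-variate normal distribution.
cov = np.diag([0.04, 0.25, 1.0])     # hypothetical variances of three model outputs
k = cov.shape[0]

h_formula = 0.5 * (k * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(cov)))
h_scipy = stats.multivariate_normal(mean=np.zeros(k), cov=cov).entropy()
print(h_formula, h_scipy)            # agree; shrinking det(Sigma) lowers the entropy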
Kernel Estimation of Mathai-Haubold Entropy and Residual Mathai-Haubold Entropy Functions under α-Mixing Dependence Condition
Published in American Journal of Mathematical and Management Sciences, 2022
Shannon (1948) made his breakthrough in statistics by proposing a measure of uncertainty associated with a discrete random variable, as a generalization of the Boltzmann-Gibbs entropy of classical statistical mechanics. This measure later became known in the literature as the Shannon information measure or Shannon entropy. The direct extension of Shannon entropy from the discrete case to the continuous case is known in the literature as differential entropy. Let X be a non-negative random variable admitting an absolutely continuous distribution function F(x) with probability density function (pdf) f(x). Then the Shannon entropy associated with X is defined as

H(X) = −∫₀^∞ f(x) log f(x) dx.
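As a quick numerical aside (not part of the cited paper), the definition above can be checked for a simple non-negative random variable: an exponential distribution with rate λ has differential entropy 1 − ln λ, which scipy.stats reproduces; the rate value below is arbitrary.

from math import log
from scipy import stats

# Differential (Shannon) entropy of an Exp(lambda) random variable is 1 - ln(lambda).
lam = 2.0
h_closed_form = 1.0 - log(lam)
h_scipy = stats.expon(scale=1.0 / lam).entropy()   # scipy parameterises by scale = 1/lambda
print(h_closed_form, h_scipy)                      # both ~0.3069 nats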