Coding and Modulation for Free-Space Optical Communications
Published in Hamid Hemmati, Near-Earth Laser Communications, 2018
Let $h(p)$ be the binary entropy function, $h(p) = p\log_2(1/p) + (1-p)\log_2(1/(1-p))$. $C_{\mathrm{OOK}}(M, n_s, n_b)$ has a horizontal asymptote at $h(1/M)$, the limit imposed by restricting the input to duty cycle $1/M$. Let $C_{\mathrm{OOK}}(n_s, n_b) = \max_M C_{\mathrm{OOK}}(M, n_s, n_b)$.
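As a minimal sketch of the quantity behind this asymptote, the following computes $h(p)$ and evaluates $h(1/M)$ for a few duty cycles. Evaluating $C_{\mathrm{OOK}}$ itself would require the underlying channel model (the roles of $n_s$ and $n_b$), which this excerpt does not specify; only the asymptote is computed here.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits, with the convention h(0) = h(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

# Horizontal asymptote h(1/M) imposed by restricting the input to duty cycle 1/M:
for M in (2, 4, 8, 16, 32):
    print(f"M = {M:2d}: h(1/M) = {binary_entropy(1.0 / M):.4f} bits")
```

Since $h$ is increasing on $(0, 1/2)$ and $1/M \le 1/2$, the asymptote $h(1/M)$ decreases as $M$ grows, so the duty-cycle constraint caps the achievable rate more tightly at higher $M$.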
Shannon entropy and the basics of information theory
Published in Jürgen Bierbrauer, Introduction to Coding Theory, 2016
8.3 Definition. The binary entropy function is defined by $h(x) = -x\log(x) - (1-x)\log(1-x)$ (for $0 \le x \le 1$).
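As a short worked check of this definition (the base of the logarithm is left unspecified here; with $\log = \log_2$ it agrees with $h(p)$ above), differentiating locates the maximum. The two constant terms from the product rule cancel, leaving

$$h'(x) = \log(1-x) - \log(x) = \log\!\left(\frac{1-x}{x}\right),$$

so $h$ is increasing on $(0, 1/2)$, decreasing on $(1/2, 1)$, and attains its maximum at $x = 1/2$, with $h(1/2) = 1$ when $\log = \log_2$. The symmetry $h(x) = h(1-x)$ is immediate from the definition.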
Dynamical intricacy and average sample complexity
Published in Dynamical Systems, 2018
Karl Petersen, Benjamin Wilson
Note that $S$ intersects at most $\lfloor(1-\epsilon)(2n/k)\rfloor$ of the $\lceil 2n/k\rceil$ sets $K_i$. To create any such subset we choose $\lceil(2n/k)\epsilon\rceil$ intervals $K_i$ that $S$ does not intersect and then pick a subset (possibly empty) of the remaining $\lfloor(1-\epsilon)(2n/k)\rfloor$ intervals $K_i$ for $S$ to intersect. The same subset can be produced this way many times, so this count gives an upper bound on the number of such subsets. According to Stirling's approximation, there is a constant $c$ such that $\binom{m}{\lceil\epsilon m\rceil} \le c\,e^{mH(\epsilon)}$ for all $m$, where $H(\epsilon) = -\epsilon\log\epsilon - (1-\epsilon)\log(1-\epsilon)$ denotes the binary entropy function. We will establish Equation (3.9) by showing that for each $\epsilon > 0$ we can find a $k$ such that

$$2^{\epsilon}\,\epsilon^{(2/k)\epsilon}\,(1-\epsilon)^{(2/k)(1-\epsilon)} > 1.$$

Taking the logarithm of both sides, this is equivalent to

$$\epsilon\log 2 - \frac{2}{k}\,H(\epsilon) > 0. \qquad (3.10)$$

By basic calculus $H(\epsilon) \le \log 2$; thus, if $k > 2/\epsilon$, Equation (3.10) is satisfied and therefore Equation (3.9) is satisfied.
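A quick numeric sanity check of the key step; the exponent placement and the natural-logarithm convention for $H$ are assumptions, chosen to be consistent with the bound $H(\epsilon) \le \log 2$ and the threshold $k > 2/\epsilon$:

```python
import math

def H(eps: float) -> float:
    """Binary entropy function with natural logarithm."""
    return -eps * math.log(eps) - (1 - eps) * math.log(1 - eps)

def product(eps: float, k: float) -> float:
    """The quantity 2^eps * eps^((2/k)*eps) * (1-eps)^((2/k)*(1-eps))."""
    return 2**eps * eps**((2 / k) * eps) * (1 - eps)**((2 / k) * (1 - eps))

# For each eps, any k > 2/eps should make the product exceed 1,
# equivalently make eps*log(2) - (2/k)*H(eps) positive (Equation (3.10)).
for eps in (0.1, 0.25, 0.4):
    k = 2 / eps + 1  # a choice just past the threshold 2/eps
    assert product(eps, k) > 1
    assert eps * math.log(2) - (2 / k) * H(eps) > 0
```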
A note on circle maps driven by strongly expanding endomorphisms on
Published in Dynamical Systems, 2018
We now fix $n$ and let $k = [\delta n]$. It then follows from the above estimate that the desired bound holds; in the last inequality, we used the fact that $\binom{n}{k} \le e^{nH(k/n)}$, where $H(p)$ is the binary entropy function.
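A minimal numeric check of the binomial–entropy bound $\binom{n}{k} \le e^{nH(k/n)}$ invoked here; the natural-logarithm convention for $H$ is an assumption:

```python
import math

def H(p: float) -> float:
    """Binary entropy function with natural logarithm."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# Verify binom(n, k) <= exp(n * H(k/n)) for k = floor(delta * n).
delta = 0.3
for n in (10, 50, 200, 1000):
    k = int(delta * n)
    assert math.comb(n, k) <= math.exp(n * H(k / n)), (n, k)
```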