Entropy-Enthalpy Compensation and Exploratory Factor Analysis of Correlations: Are There Common Points?
Published in Evgeni Starikov, Bengt Nordén, Shigenori Tanaka, Entropy-Enthalpy Compensation, 2020
In deriving the classical-mechanical entropy definition, the authors adapt Schrödinger's approach to introduce the entropy definition for quantum mechanics. The resulting entropy formula is valid for all ensembles, being in complete agreement with the Gibbs entropy. Remarkably, such agreement could be achieved only in the continuum limit, for discretized equations like Eq. 1.46 do not deliver consistent definitions of entropy: the entropy value then depends on the "cell system" [46]. The cell system is simply the way of dividing the system's total phase volume into cells of arbitrary dimensions. Thus, if we deal with a system of N particles, the cell number i should physically correspond to the state of the i-th particle (mathematically, the combination of the particle's three Cartesian coordinates and three Cartesian momenta). The authors present a careful mathematical analysis of this picture and then draw their above-cited conclusion. To arrive at any general conclusion, we must in any case go over to the continuum representation of Eq. 1.46.
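The cell-system dependence described above can be illustrated numerically: the discrete entropy −∑ᵢ pᵢ ln pᵢ of a continuous distribution shifts when the cell width changes, so its absolute value is not well defined without fixing the discretization. The sketch below (an illustration assumed here, not taken from the chapter) bins a one-dimensional Gaussian into cells of two different widths and shows the entropies differ by roughly ln of the width ratio.

```python
import math

def discretized_entropy(sigma, cell_width):
    """Discrete entropy -sum p_i ln p_i for a 1-D Gaussian of standard
    deviation sigma, with probability mass binned into cells of the
    given width (midpoint approximation: p_i ~ density * width)."""
    n_cells = int(20 * sigma / cell_width)  # cover +/- 20 sigma
    entropy = 0.0
    for i in range(-n_cells, n_cells + 1):
        x = i * cell_width
        p = (math.exp(-x**2 / (2 * sigma**2))
             / (sigma * math.sqrt(2 * math.pi))) * cell_width
        if p > 0:
            entropy -= p * math.log(p)
    return entropy

# Shrinking the cells from 0.01 to 0.001 raises the entropy by ~ln(10):
s_coarse = discretized_entropy(1.0, 0.01)
s_fine = discretized_entropy(1.0, 0.001)
print(s_coarse, s_fine, s_fine - s_coarse)  # difference close to ln(10) ~ 2.30
```

This is why only the continuum limit, where such an additive offset is handled consistently, yields a cell-independent entropy definition.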
Information Entropy
Published in Mihai V. Putz, New Frontiers in Nanochemistry, 2020
Francisco Torrens, Gloria Castellano
Matter exploration in small regions and at short time intervals revealed the existence of reduced-size systems and new processes that happen at those scales (Rubí, 2012). Microsystems are not simply miniatures of macrosystems; they have their own structures and functions. Molecular motors, nanomotors, and active particles are autonomous microsystems that use energy-conversion mechanisms to carry out their work under environmental influence and in out-of-equilibrium situations. Statistical physics offers a method able to describe the processes that such systems perform: complexity and the nonadditive entropy Sq. The nonadditive entropy measure Sq was applied to certain complex natural, artificial, and social systems (Tsallis & Plastino, 2012). The probability distributions that maximize Sq under restrictions showed ubiquity and robustness. The mechanisms that explain the origin of these distributions imply a generalization (nonextensive statistical mechanics) of the Boltzmann–Gibbs entropy theory.
Nanothermodynamics: Fundamentals and Applications
Published in Klaus D. Sattler, 21st Century Nanoscience – A Handbook, 2020
Vladimir García-Morales, Javier Cervera, José A. Manzanares
Tsallis entropy includes the Boltzmann and Gibbs entropy equations as particular cases in the limit q → 1, since it can be transformed to
S_q = k \sum_j p_j \frac{(1/p_j)^{1-q} - 1}{1-q} = k \sum_j p_j \ln_q \frac{1}{p_j},
where ln_q denotes the q-logarithm.
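The q → 1 limit can be checked numerically. Using ∑_j p_j (1/p_j)^{1−q} = ∑_j p_j^q, the formula above is equivalent to S_q = k (1 − ∑_j p_j^q)/(q − 1). The sketch below (an illustrative example, not code from the handbook) evaluates S_q for q approaching 1 and compares it with the Gibbs entropy −k ∑_j p_j ln p_j:

```python
import math

def tsallis_entropy(probs, q, k=1.0):
    """Tsallis entropy S_q = k * (1 - sum_j p_j^q) / (q - 1),
    equivalent to k * sum_j p_j * [(1/p_j)^(1-q) - 1] / (1 - q)."""
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: the Boltzmann-Gibbs (Shannon) entropy
        return -k * sum(p * math.log(p) for p in probs if p > 0)
    return k * (1.0 - sum(p**q for p in probs)) / (q - 1.0)

probs = [0.5, 0.25, 0.25]
gibbs = -sum(p * math.log(p) for p in probs)
# S_q converges to the Gibbs value as q -> 1:
for q in (2.0, 1.5, 1.1, 1.001):
    print(q, tsallis_entropy(probs, q))
print("Gibbs:", gibbs)
```

For this distribution the Gibbs value is 1.5 ln 2 ≈ 1.0397, and S_q at q = 1.001 agrees with it to about three decimal places.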
The paradigm of complex probability and Ludwig Boltzmann's entropy
Published in Systems Science & Control Engineering, 2018
Moreover, a direct connection can be made between the two. If the probabilities in question are the thermodynamic probabilities, then the (reduced) Gibbs entropy can be seen as simply the amount of Shannon information needed to define the detailed microscopic state of the system, given its macroscopic description. Or, in the words of Gilbert Newton Lewis writing about chemical entropy in 1930, ‘Gain in entropy always means loss of information, and nothing more.’ To be more concrete, in the discrete case using base-two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.
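The yes–no-questions reading can be made concrete with a toy case (a sketch assumed here, not from the article): a macrostate compatible with W equally likely microstates has a base-two entropy of log₂ W bits, i.e., the number of binary questions needed to single out the microstate.

```python
import math

def gibbs_entropy_bits(probs):
    """Reduced Gibbs entropy in base 2 (bits): the mean number of
    yes/no questions needed to pin down the microstate."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A macrostate compatible with 8 equally likely microstates:
W = 8
probs = [1.0 / W] * W
print(gibbs_entropy_bits(probs))  # log2(8) = 3.0 questions
```

Three questions suffice because each binary answer halves the set of candidate microstates: 8 → 4 → 2 → 1.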