Sample records for classic maximum entropy

  1. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
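
    A minimal numpy sketch of the marginalization step described above, assuming a discretized joint distribution over apparent FRET efficiency E and photon count F; the toy joint used here is invented for illustration and is not the classic-MaxEnt reconstruction of the paper.

      import numpy as np

      E = np.linspace(0.0, 1.0, 51)            # apparent FRET efficiency grid
      F = np.arange(20, 121)                   # fluorescence photons per burst
      EE, FF = np.meshgrid(E, F, indexing="ij")

      # toy joint P(E, F): two FRET populations times a photon-count envelope
      joint = (np.exp(-0.5 * ((EE - 0.3) / 0.05) ** 2)
               + 0.6 * np.exp(-0.5 * ((EE - 0.7) / 0.05) ** 2)) * np.exp(-FF / 60.0)
      joint /= joint.sum()

      p_E = joint.sum(axis=1)                  # marginal FRET efficiency distribution
      p_F = joint.sum(axis=0)                  # marginal photon-count distribution
      print(p_E.sum(), p_F.sum())              # both marginals normalize to ~1.0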

  2. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  3. Reconstruction of calmodulin single-molecule FRET states, dye interactions, and CaMKII peptide binding by MultiNest and classic maximum entropy

    NASA Astrophysics Data System (ADS)

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2013-08-01

    We analyzed single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.
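
    For readers unfamiliar with the nested-sampling idea behind MultiNest, here is a deliberately simplified sketch (plain rejection sampling from the prior, not MultiNest's ellipsoidal scheme) that estimates the Bayesian evidence for a toy one-dimensional Gaussian likelihood; the likelihood, prior, and settings are illustrative only.

      import numpy as np

      rng = np.random.default_rng(0)
      def loglike(theta):                       # toy likelihood peaked at 0.5
          return -0.5 * ((theta - 0.5) / 0.05) ** 2

      n_live, n_iter = 200, 1000
      live = rng.uniform(size=n_live)           # live points drawn from a uniform prior on [0, 1]
      live_logl = loglike(live)
      log_z, log_x_prev = -np.inf, 0.0

      for i in range(1, n_iter + 1):
          worst = np.argmin(live_logl)
          log_x = -i / n_live                   # expected log prior volume remaining
          log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))
          log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
          log_x_prev = log_x
          while True:                           # replace worst point: rejection sampling from the prior
              cand = rng.uniform()
              if loglike(cand) > live_logl[worst]:
                  break
          live[worst], live_logl[worst] = cand, loglike(cand)

      # add the contribution of the remaining live points, then report the evidence
      log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl))) + log_x_prev)
      print("log evidence ~", log_z)            # analytic value ln(0.05*sqrt(2*pi)) is about -2.08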

  4. Reconstruction of Calmodulin Single-Molecule FRET States, Dye-Interactions, and CaMKII Peptide Binding by MultiNest and Classic Maximum Entropy

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2013-01-01

    We analyze single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data. PMID:24223465

  5. Reconstruction of Calmodulin Single-Molecule FRET States, Dye-Interactions, and CaMKII Peptide Binding by MultiNest and Classic Maximum Entropy.

    PubMed

    Devore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2013-08-30

    We analyze single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.

  6. Maximum Entropy Calculations on a Discrete Probability Space

    DTIC Science & Technology

    1986-01-01

    constraints acting besides normalization. Statement 3: "The aim of this paper is to show that the die experiment just spoken of has solutions by classical ... analysis. Statement 4: We shall solve this problem in a purely classical way, without the need for recourse to any exotic estimator, such as ME." Note ... The Maximum Entropy Principle: In a remarkable series of papers beginning in 1957, E. T. Jaynes (1957) began a revolution in inductive

  7. Elements of the cognitive universe

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2017-06-01

    "The least biased inference, taking available information into account, is the one with maximum entropy". So we are taught by Jaynes. The many followers from a broad spectrum of the natural and social sciences point to the wisdom of this principle, the maximum entropy principle, MaxEnt. But "entropy" need not be tied only to classical entropy and thus to probabilistic thinking. In fact, the arguments found in Jaynes' writings and elsewhere can, as we shall attempt to demonstrate, profitably be revisited, elaborated and transformed to apply in a much more general abstract setting. The approach is based on game theoretical thinking. Philosophical considerations dealing with notions of cognition - basically truth and belief - lie behind. Quantitative elements are introduced via a concept of description effort. An interpretation of Tsallis Entropy is indicated.

  8. Towards operational interpretations of generalized entropies

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2010-12-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  9. Convex foundations for generalized MaxEnt models

    NASA Astrophysics Data System (ADS)

    Frongillo, Rafael; Reid, Mark D.

    2014-12-01

    We present an approach to maximum entropy models that highlights the convex geometry and duality of generalized exponential families (GEFs) and their connection to Bregman divergences. Using our framework, we are able to resolve a puzzling aspect of the bijection of Banerjee and coauthors between classical exponential families and what they call regular Bregman divergences. Their regularity condition rules out all but Bregman divergences generated from log-convex generators. We recover their bijection and show that a much broader class of divergences correspond to GEFs via two key observations: 1) Like classical exponential families, GEFs have a "cumulant" C whose subdifferential contains the mean: E_{o∼p_θ}[φ(o)] ∈ ∂C(θ); 2) Generalized relative entropy is a C-Bregman divergence between parameters: D_F(p_θ, p_θ') = D_C(θ, θ'), where D_F becomes the KL divergence for F = -H. We also show that every incomplete market with cost function C can be expressed as a complete market, where the prices are constrained to be a GEF with cumulant C. This provides an entirely new interpretation of prediction markets, relating their design back to the principle of maximum entropy.
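
    As a concrete check of the F = -H statement quoted above, the following snippet evaluates the Bregman divergence generated by the negative Shannon entropy and confirms numerically that it coincides with the KL divergence; this is a standard textbook identity, not code from the paper.

      import numpy as np

      def bregman(F, gradF, p, q):
          # D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
          return F(p) - F(q) - np.dot(gradF(q), p - q)

      F = lambda p: np.sum(p * np.log(p))       # negative Shannon entropy, F = -H
      gradF = lambda p: np.log(p) + 1.0

      p = np.array([0.2, 0.5, 0.3])
      q = np.array([0.4, 0.4, 0.2])
      kl = np.sum(p * np.log(p / q))
      print(bregman(F, gradF, p, q), kl)        # the two numbers agree up to rounding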

  10. Nonequilibrium Entropy in a Shock

    DOE PAGES

    Margolin, Len G.

    2017-07-19

    In a classic paper, Morduchow and Libby use an analytic solution for the profile of a Navier–Stokes shock to show that the equilibrium thermodynamic entropy has a maximum inside the shock. There is no general nonequilibrium thermodynamic formulation of entropy; the extension of equilibrium theory to nonequililbrium processes is usually made through the assumption of local thermodynamic equilibrium (LTE). However, gas kinetic theory provides a perfectly general formulation of a nonequilibrium entropy in terms of the probability distribution function (PDF) solutions of the Boltzmann equation. In this paper I will evaluate the Boltzmann entropy for the PDF that underlies the Navier–Stokes equations and also for the PDF of the Mott–Smith shock solution. I will show that both monotonically increase in the shock. As a result, I will propose a new nonequilibrium thermodynamic entropy and show that it is also monotone and closely approximates the Boltzmann entropy.

  11. Nonequilibrium Entropy in a Shock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolin, Len G.

    In a classic paper, Morduchow and Libby use an analytic solution for the profile of a Navier–Stokes shock to show that the equilibrium thermodynamic entropy has a maximum inside the shock. There is no general nonequilibrium thermodynamic formulation of entropy; the extension of equilibrium theory to nonequililbrium processes is usually made through the assumption of local thermodynamic equilibrium (LTE). However, gas kinetic theory provides a perfectly general formulation of a nonequilibrium entropy in terms of the probability distribution function (PDF) solutions of the Boltzmann equation. In this paper I will evaluate the Boltzmann entropy for the PDF that underlies the Navier–Stokes equations and also for the PDF of the Mott–Smith shock solution. I will show that both monotonically increase in the shock. As a result, I will propose a new nonequilibrium thermodynamic entropy and show that it is also monotone and closely approximates the Boltzmann entropy.

  12. Applications of quantum entropy to statistics

    NASA Astrophysics Data System (ADS)

    Silver, R. N.; Martz, H. F.

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
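
    The von Neumann entropy underlying the abstract's quantum information divergences can be computed directly from the eigenvalues of a density matrix; the following small numpy sketch (with an arbitrary 2x2 example state) shows the basic quantity, not the paper's full penalty-function machinery.

      import numpy as np

      def von_neumann_entropy(rho):
          # S(rho) = -Tr(rho ln rho), evaluated via the eigenvalues of rho
          evals = np.linalg.eigvalsh(rho)
          evals = evals[evals > 1e-12]          # convention: 0 ln 0 = 0
          return float(-np.sum(evals * np.log(evals)))

      rho = np.array([[0.7, 0.2],               # Hermitian, unit trace, positive semidefinite
                      [0.2, 0.3]])
      print(von_neumann_entropy(rho))           # lies between 0 (pure) and ln 2 (maximally mixed)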

  13. Applications of the principle of maximum entropy: from physics to ecology.

    PubMed

    Banavar, Jayanth R; Maritan, Amos; Volkov, Igor

    2010-02-17

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.

  14. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  15. Generalized Maximum Entropy

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John

    2005-01-01

    A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
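
    A rough sketch of the idea described above: for the classic die example with a mean constraint, the MaxEnt solution is p_i ∝ exp(λ i); if the empirical mean is itself uncertain, that uncertainty can be pushed through the MaxEnt map, here by simple Monte Carlo rather than the paper's explicit density calculation. The Gaussian width and the mean value 4.5 are illustrative assumptions.

      import numpy as np
      from scipy.optimize import brentq

      faces = np.arange(1, 7)

      def maxent_probs(mean):
          # solve for the Lagrange multiplier reproducing the constrained mean
          f = lambda lam: np.sum(faces * np.exp(lam * faces)) / np.sum(np.exp(lam * faces)) - mean
          lam = brentq(f, -5.0, 5.0)
          p = np.exp(lam * faces)
          return p / p.sum()

      rng = np.random.default_rng(1)
      means = rng.normal(4.5, 0.1, size=2000)   # uncertain empirical constraint value
      samples = np.array([maxent_probs(m) for m in means])
      print("mean of MaxEnt probabilities:", samples.mean(axis=0))
      print("spread induced by the uncertain constraint:", samples.std(axis=0))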

  16. Third law of thermodynamics in the presence of a heat flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camacho, J.

    1995-01-01

    Following a maximum entropy formalism, we study a one-dimensional crystal under a heat flux. We obtain the phonon distribution function and evaluate the nonequilibrium temperature, the specific heat, and the entropy as functions of the internal energy and the heat flux, in both the quantum and the classical limits. Some analogies between the behavior of equilibrium systems at low absolute temperature and nonequilibrium steady states under high values of the heat flux are shown, which point to a possible generalization of the third law in nonequilibrium situations.

  17. Interuniversal entanglement in a cyclic multiverse

    NASA Astrophysics Data System (ADS)

    Robles-Pérez, Salvador; Balcerzak, Adam; Dąbrowski, Mariusz P.; Krämer, Manuel

    2017-04-01

    We study scenarios of parallel cyclic multiverses which allow for a different evolution of the physical constants, while having the same geometry. These universes are classically disconnected, but quantum-mechanically entangled. Applying the thermodynamics of entanglement, we calculate the temperature and the entropy of entanglement. It emerges that the entropy of entanglement is large at big bang and big crunch singularities of the parallel universes as well as at the maxima of the expansion of these universes. The latter seems to confirm earlier studies that quantum effects are strong at turning points of the evolution of the universe performed in the context of the timeless nature of the Wheeler-DeWitt equation and decoherence. On the other hand, the entropy of entanglement at big rip singularities is going to zero despite its presumably quantum nature. This may be an effect of total dissociation of the universe structures into infinitely separated patches violating the null energy condition. However, the temperature of entanglement is large/infinite at every classically singular point and at maximum expansion and seems to be a better measure of quantumness.

  18. Thermodynamics of an ideal generalized gas: II. Means of order alpha.

    PubMed

    Lavenda, B H

    2005-11-01

    The property that power means are monotonically increasing functions of their order is shown to be the basis of the second laws not only for processes involving heat conduction, but also for processes involving deformations. This generalizes earlier work involving only pure heat conduction and underlines the incomparability of the internal energy and adiabatic potentials when expressed as powers of the adiabatic variable. In an L-potential equilibration, the final state will be one of maximum entropy, whereas in an entropy equilibration, the final state will be one of minimum L. Unlike classical equilibrium thermodynamic phase space, which lacks an intrinsic metric structure insofar as distances and other geometrical concepts do not have an intrinsic thermodynamic significance in such spaces, a metric space can be constructed for the power means: the distance between means of different order is related to the Carnot efficiency. In the ideal classical gas limit, the average change in the entropy is shown to be proportional to the difference between the Shannon and Rényi entropies for nonextensive systems that are multifractal in nature. The L potential, like the internal energy, is a Schur convex function of the empirical temperature, which satisfies Jensen's inequality, and serves as a measure of the tendency to uniformity in processes involving pure thermal conduction.
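
    The monotonicity property the abstract builds on is easy to verify numerically: the weighted power mean M_α(x) = (Σ w_i x_i^α)^(1/α) is nondecreasing in its order α, with the geometric mean as the α → 0 limit. Data and weights below are arbitrary.

      import numpy as np

      x = np.array([1.0, 2.0, 5.0, 8.0])
      w = np.array([0.1, 0.4, 0.3, 0.2])        # weights summing to 1

      def power_mean(alpha):
          if abs(alpha) < 1e-12:                # geometric-mean limit at alpha = 0
              return float(np.exp(np.sum(w * np.log(x))))
          return float(np.sum(w * x ** alpha) ** (1.0 / alpha))

      orders = [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]
      means = [power_mean(a) for a in orders]
      print(means)                              # nondecreasing in the order
      assert all(a <= b + 1e-12 for a, b in zip(means, means[1:]))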

  19. Quantitative LC-MS of polymers: determining accurate molecular weight distributions by combined size exclusion chromatography and electrospray mass spectrometry with maximum entropy data processing.

    PubMed

    Gruendling, Till; Guilhaus, Michael; Barner-Kowollik, Christopher

    2008-09-15

    We report on the successful application of size exclusion chromatography (SEC) combined with electrospray ionization mass spectrometry (ESI-MS) and refractive index (RI) detection for the determination of accurate molecular weight distributions of synthetic polymers, corrected for chromatographic band broadening. The presented method makes use of the ability of ESI-MS to accurately depict the peak profiles and retention volumes of individual oligomers eluting from the SEC column, whereas quantitative information on the absolute concentration of oligomers is obtained from the RI-detector only. A sophisticated computational algorithm based on the maximum entropy principle is used to process the data gained by both detectors, yielding an accurate molecular weight distribution, corrected for chromatographic band broadening. Poly(methyl methacrylate) standards with molecular weights up to 10 kDa serve as model compounds. Molecular weight distributions (MWDs) obtained by the maximum entropy procedure are compared to MWDs, which were calculated by a conventional calibration of the SEC-retention time axis with peak retention data obtained from the mass spectrometer. Comparison showed that for the employed chromatographic system, distributions below 7 kDa were only weakly influenced by chromatographic band broadening. However, the maximum entropy algorithm could successfully correct the MWD of a 10 kDa standard for band broadening effects. Molecular weight averages were between 5 and 14% lower than the manufacturer stated data obtained by classical means of calibration. The presented method demonstrates a consistent approach for analyzing data obtained by coupling mass spectrometric detectors and concentration sensitive detectors to polymer liquid chromatography.

  20. Finding the quantum thermoelectric with maximal efficiency and minimal entropy production at given power output

    NASA Astrophysics Data System (ADS)

    Whitney, Robert S.

    2015-03-01

    We investigate the nonlinear scattering theory for quantum systems with strong Seebeck and Peltier effects, and consider their use as heat engines and refrigerators with finite power outputs. This paper gives detailed derivations of the results summarized in a previous paper [R. S. Whitney, Phys. Rev. Lett. 112, 130601 (2014), 10.1103/PhysRevLett.112.130601]. It shows how to use the scattering theory to find (i) the quantum thermoelectric with maximum possible power output, and (ii) the quantum thermoelectric with maximum efficiency at given power output. The latter corresponds to a minimal entropy production at that power output. These quantities are of quantum origin since they depend on system size over electronic wavelength, and so have no analog in classical thermodynamics. The maximal efficiency coincides with Carnot efficiency at zero power output, but decreases with increasing power output. This gives a fundamental lower bound on entropy production, which means that reversibility (in the thermodynamic sense) is impossible for finite power output. The suppression of efficiency by (nonlinear) phonon and photon effects is addressed in detail; when these effects are strong, maximum efficiency coincides with maximum power. Finally, we show in particular limits (typically without magnetic fields) that relaxation within the quantum system does not allow the system to exceed the bounds derived for relaxation-free systems, however, a general proof of this remains elusive.

  1. Maximum Relative Entropy of Coherence: An Operational Coherence Measure.

    PubMed

    Bu, Kaifeng; Singh, Uttam; Fei, Shao-Ming; Pati, Arun Kumar; Wu, Junde

    2017-10-13

    The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on maximum relative entropy. We prove that the maximum relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of the maximum relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the maximum relative entropy of coherence. By introducing a suitable smooth maximum relative entropy of coherence, we prove that the smooth maximum relative entropy of coherence provides a lower bound of one-shot coherence cost, and the maximum relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similar to the maximum relative entropy of coherence, the minimum relative entropy of coherence has also been investigated. We show that the minimum relative entropy of coherence provides an upper bound of one-shot coherence distillation, and in the asymptotic limit the minimum relative entropy of coherence is equivalent to the relative entropy of coherence.
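
    For orientation, the underlying quantity is the max-relative entropy D_max(ρ||σ) = log λ_max(σ^(-1/2) ρ σ^(-1/2)) for full-rank σ. The sketch below evaluates it against the dephased (diagonal, hence incoherent) state, which only upper-bounds the coherence quantifier of the abstract, since that quantifier minimizes D_max over all incoherent states; the example state is arbitrary.

      import numpy as np
      from scipy.linalg import fractional_matrix_power

      def d_max(rho, sigma):
          # D_max(rho || sigma) = log2 of the largest eigenvalue of sigma^(-1/2) rho sigma^(-1/2)
          s = fractional_matrix_power(sigma, -0.5)
          return float(np.log2(np.max(np.linalg.eigvalsh(s @ rho @ s))))

      rho = np.array([[0.6, 0.3],               # a coherent qubit state
                      [0.3, 0.4]])
      delta = np.diag(np.diag(rho))             # fully dephased, incoherent reference
      print(d_max(rho, delta))                  # an upper bound on the max-relative entropy of coherence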

  2. Markov Chain Analysis of Musical Dice Games

    NASA Astrophysics Data System (ADS)

    Volchenkov, D.; Dawin, J. R.

    2012-07-01

    A system for using dice to compose music randomly is known as the musical dice game. The discrete time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into the transition matrices and studied by Markov chains. Contrary to human languages, entropy dominates over redundancy in the musical dice games based on the compositions of classical music. The maximum complexity is achieved on the blocks consisting of just a few notes (8 notes, for the musical dice games generated over Bach's compositions). First passage times to notes can be used to resolve tonality and feature a composer.
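
    The entropy/redundancy comparison mentioned above can be reproduced on any transition matrix; the sketch below uses a toy 3-state chain in place of the MIDI-derived note-transition matrices and computes the entropy rate and its redundancy relative to log2 of the number of states.

      import numpy as np

      P = np.array([[0.1, 0.6, 0.3],            # toy row-stochastic transition matrix
                    [0.4, 0.2, 0.4],
                    [0.3, 0.3, 0.4]])

      # stationary distribution: left eigenvector of P for eigenvalue 1
      evals, evecs = np.linalg.eig(P.T)
      pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
      pi /= pi.sum()

      H = -np.sum(pi[:, None] * P * np.log2(P)) # entropy rate in bits per step
      redundancy = 1.0 - H / np.log2(len(P))
      print(H, redundancy)                      # entropy dominating means low redundancy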

  3. Musical Markov Chains

    NASA Astrophysics Data System (ADS)

    Volchenkov, Dima; Dawin, Jean René

    A system for using dice to compose music randomly is known as the musical dice game. The discrete time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into the transition matrices and studied by Markov chains. Contrary to human languages, entropy dominates over redundancy in the musical dice games based on the compositions of classical music. The maximum complexity is achieved on the blocks consisting of just a few notes (8 notes, for the musical dice games generated over Bach's compositions). First passage times to notes can be used to resolve tonality and feature a composer.

  4. Ciliates learn to diagnose and correct classical error syndromes in mating strategies

    PubMed Central

    Clark, Kevin B.

    2013-01-01

    Preconjugal ciliates learn classical repetition error-correction codes to safeguard mating messages and replies from corruption by “rivals” and local ambient noise. Because individual cells behave as memory channels with Szilárd engine attributes, these coding schemes also might be used to limit, diagnose, and correct mating-signal errors due to noisy intracellular information processing. The present study, therefore, assessed whether heterotrich ciliates effect fault-tolerant signal planning and execution by modifying engine performance, and consequently entropy content of codes, during mock cell–cell communication. Socially meaningful serial vibrations emitted from an ambiguous artificial source initiated ciliate behavioral signaling performances known to advertise mating fitness with varying courtship strategies. Microbes, employing calcium-dependent Hebbian-like decision making, learned to diagnose then correct error syndromes by recursively matching Boltzmann entropies between signal planning and execution stages via “power” or “refrigeration” cycles. All eight serial contraction and reversal strategies incurred errors in entropy magnitude by the execution stage of processing. Absolute errors, however, subtended expected threshold values for single bit-flip errors in three-bit replies, indicating coding schemes protected information content throughout signal production. Ciliate preparedness for vibrations selectively and significantly affected the magnitude and valence of Szilárd engine performance during modal and non-modal strategy corrective cycles. But entropy fidelity for all replies mainly improved across learning trials as refinements in engine efficiency. Fidelity neared maximum levels for only modal signals coded in resilient three-bit repetition error-correction sequences. Together, these findings demonstrate microbes can elevate survival/reproductive success by learning to implement classical fault-tolerant information processing in social contexts. PMID:23966987

  5. Quantum chaos: An entropy approach

    NASA Astrophysics Data System (ADS)

    Słomczyński, Wojciech; Życzkowski, Karol

    1994-11-01

    A new definition of the entropy of a given dynamical system and of an instrument describing the measurement process is proposed within the operational approach to quantum mechanics. It generalizes other definitions of entropy, in both the classical and quantum cases. The Kolmogorov-Sinai (KS) entropy is obtained for a classical system and the sharp measurement instrument. For a quantum system and a coherent states instrument, a new quantity, coherent states entropy, is defined. It may be used to measure chaos in quantum mechanics. The following correspondence principle is proved: the upper limit of the coherent states entropy of a quantum map as ℏ→0 is less than or equal to the KS-entropy of the corresponding classical map. "Chaos umpire sits, And by decision more imbroils the fray By which he reigns: next him high arbiter Chance governs all." John Milton, Paradise Lost, Book II

  6. Diffusive mixing and Tsallis entropy

    DOE PAGES

    O'Malley, Daniel; Vesselinov, Velimir V.; Cushman, John H.

    2015-04-29

    Brownian motion, the classical diffusive process, maximizes the Boltzmann-Gibbs entropy. The Tsallis q-entropy, which is non-additive, was developed as an alternative to the classical entropy for systems which are non-ergodic. A generalization of Brownian motion is provided that maximizes the Tsallis entropy rather than the Boltzmann-Gibbs entropy. This process is driven by a Brownian measure with a random diffusion coefficient. In addition, the distribution of this coefficient is derived as a function of q for 1 < q < 3. Applications to transport in porous media are considered.
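
    A minimal illustration of the Tsallis q-entropy referred to above, S_q(p) = (1 - Σ p_i^q)/(q - 1), which recovers the Boltzmann-Gibbs/Shannon entropy as q → 1; the probability vector is arbitrary.

      import numpy as np

      def tsallis_entropy(p, q):
          if abs(q - 1.0) < 1e-12:
              return float(-np.sum(p * np.log(p)))     # Shannon limit at q = 1
          return float((1.0 - np.sum(p ** q)) / (q - 1.0))

      p = np.array([0.1, 0.2, 0.3, 0.4])
      print(tsallis_entropy(p, 0.999), tsallis_entropy(p, 1.0), tsallis_entropy(p, 2.0))
      # values at q near 1 approach the Shannon entropy; q = 2 gives 1 - sum(p_i^2)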

  7. Linearized semiclassical initial value time correlation functions with maximum entropy analytic continuation.

    PubMed

    Liu, Jian; Miller, William H

    2008-09-28

    The maximum entropy analytic continuation (MEAC) method is used to extend the range of accuracy of the linearized semiclassical initial value representation (LSC-IVR)/classical Wigner approximation for real time correlation functions. LSC-IVR provides a very effective "prior" for the MEAC procedure since it is very good for short times, exact for all time and temperature for harmonic potentials (even for correlation functions of nonlinear operators), and becomes exact in the classical high temperature limit. This combined MEAC+LSC/IVR approach is applied here to two highly nonlinear dynamical systems, a pure quartic potential in one dimension and liquid para-hydrogen at two thermal state points (25 and 14 K under nearly zero external pressure). The former example shows the MEAC procedure to be a very significant enhancement of the LSC-IVR for correlation functions of both linear and nonlinear operators, and especially at low temperature where semiclassical approximations are least accurate. For liquid para-hydrogen, the LSC-IVR is seen already to be excellent at T=25 K, but the MEAC procedure produces a significant correction at the lower temperature (T=14 K). Comparisons are also made as to how the MEAC procedure is able to provide corrections for other trajectory-based dynamical approximations when used as priors.

  8. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    NASA Astrophysics Data System (ADS)

    Schaerer, Roman Pascal; Bansal, Pratyuksh; Torrilhon, Manuel

    2017-07-01

    We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015) [13], we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  9. Entropy of spatial network ensembles

    NASA Astrophysics Data System (ADS)

    Coon, Justin P.; Dettmann, Carl P.; Georgiou, Orestis

    2018-04-01

    We analyze complexity in spatial network ensembles through the lens of graph entropy. Mathematically, we model a spatial network as a soft random geometric graph, i.e., a graph with two sources of randomness, namely nodes located randomly in space and links formed independently between pairs of nodes with probability given by a specified function (the "pair connection function") of their mutual distance. We consider the general case where randomness arises in node positions as well as pairwise connections (i.e., for a given pair distance, the corresponding edge state is a random variable). Classical random geometric graph and exponential graph models can be recovered in certain limits. We derive a simple bound for the entropy of a spatial network ensemble and calculate the conditional entropy of an ensemble given the node location distribution for hard and soft (probabilistic) pair connection functions. Under this formalism, we derive the connection function that yields maximum entropy under general constraints. Finally, we apply our analytical framework to study two practical examples: ad hoc wireless networks and the US flight network. Through the study of these examples, we illustrate that both exhibit properties that are indicative of nearly maximally entropic ensembles.
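
    A small sketch of the conditional entropy described above: once node positions are fixed, edges are independent Bernoulli variables, so the ensemble entropy is the sum of binary entropies of the pair connection function over all pair distances. The Rayleigh-type connection function and its range parameter are illustrative choices, not the paper's fitted models.

      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(2)
      pts = rng.uniform(size=(100, 2))          # 100 nodes placed uniformly in the unit square
      r = pdist(pts)                            # all pairwise distances

      def connect_prob(r, r0=0.15):
          return np.exp(-(r / r0) ** 2)         # soft (probabilistic) pair connection function

      def binary_entropy(p):
          p = np.clip(p, 1e-12, 1 - 1e-12)
          return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

      H_conditional = binary_entropy(connect_prob(r)).sum()
      print(H_conditional, "bits for this particular node placement")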

  10. Nonequilibrium Thermodynamics in Biological Systems

    NASA Astrophysics Data System (ADS)

    Aoki, I.

    2005-12-01

    1. Respiration: Oxygen uptake by respiration in organisms decomposes macromolecules such as carbohydrates, proteins, and lipids and liberates chemical energy of high quality, which is then used in chemical reactions and motions of matter to support lively order in the structure and function of organisms. Finally, this chemical energy becomes heat energy of low quality and is discarded to the outside (dissipation function). Accompanying this heat energy, the entropy production that inevitably arises from irreversibility is also discarded to the outside. The dissipation function and entropy production are estimated from respiration data. 2. Human body: From observed respiration data (oxygen absorption), the entropy production in the human body can be estimated. Entropy production has been obtained for humans from 0 to 75 years old and extrapolated to the fertilized egg (the beginning of human life) and to 120 years old (the maximum human life span). Entropy production shows characteristic behavior over the human life span: an early rapid increase during the short growing phase and a later slow decrease during the long aging phase. It is proposed that this tendency is ubiquitous and constitutes a Principle of Organization in complex biotic systems. 3. Ecological communities: From respiration data for eighteen aquatic communities, specific (i.e., per biomass) entropy productions are obtained. They show a two-phase character with respect to trophic diversity: an early increase and a later decrease as trophic diversity increases. The trophic diversity in these aquatic ecosystems is shown to be positively correlated with the degree of eutrophication, and the degree of eutrophication is an "arrow of time" in the hierarchy of aquatic ecosystems. Hence specific entropy production has two phases: an early increase and a later decrease with time. 4. Entropy principle for living systems: The Second Law of Thermodynamics has been expressed as follows. 1) In isolated systems, entropy increases with time and approaches a maximum value. This is the well-known classical Clausius principle. 2) In open systems near equilibrium, entropy production always decreases with time, approaching a minimum stationary level. This is the minimum entropy production principle of Prigogine. These two principles are well established. However, living systems are neither isolated nor near equilibrium, so these two principles cannot be applied to them. What, then, is the entropy principle for living systems? Answer: entropy production in living systems proceeds in multiple stages over time: early increasing, later decreasing, and/or intermediate stages. This tendency is supported by various living systems.

  11. Classical many-particle systems with unique disordered ground states

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Stillinger, F. H.; Torquato, S.

    2017-10-01

    Classical ground states (global energy-minimizing configurations) of many-particle systems are typically unique crystalline structures, implying zero enumeration entropy of distinct patterns (aside from trivial symmetry operations). By contrast, the few previously known disordered classical ground states of many-particle systems are all high-entropy (highly degenerate) states. Here we show computationally that our recently proposed "perfect-glass" many-particle model [Sci. Rep. 6, 36963 (2016), 10.1038/srep36963] possesses disordered classical ground states with a zero entropy: a highly counterintuitive situation. For all of the system sizes, parameters, and space dimensions that we have numerically investigated, the disordered ground states are unique such that they can always be superposed onto each other or their mirror image. At low energies, the density of states obtained from simulations matches those calculated from the harmonic approximation near a single ground state, further confirming ground-state uniqueness. Our discovery provides singular examples in which entropy and disorder are at odds with one another. The zero-entropy ground states provide a unique perspective on the celebrated Kauzmann-entropy crisis in which the extrapolated entropy of a supercooled liquid drops below that of the crystal. We expect our disordered unique patterns to be of value in fields beyond glass physics, including applications in cryptography as pseudorandom functions with tunable computational complexity.

  12. Maximum Tsallis entropy with generalized Gini and Gini mean difference indices constraints

    NASA Astrophysics Data System (ADS)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2017-04-01

    Using the maximum entropy principle with Tsallis entropy, some distribution families for modeling income distribution are obtained. By considering income inequality measures, maximum Tsallis entropy distributions under the constraint on generalized Gini and Gini mean difference indices are derived. It is shown that the Tsallis entropy maximizers with the considered constraints belong to the generalized Pareto family.

  13. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.
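
    For small populations the pairwise maximum-entropy model referred to above can be written out exactly: P(s) ∝ exp(Σ_i h_i s_i + Σ_{i<j} J_ij s_i s_j) over binary spike words s. The sketch below enumerates all states for N = 5 with arbitrary fields and couplings; fitting h and J to match measured rates and correlations is not shown.

      import itertools
      import numpy as np

      N = 5
      rng = np.random.default_rng(3)
      h = rng.normal(0.0, 0.5, size=N)          # arbitrary fields (would be fitted in practice)
      J = np.triu(rng.normal(0.0, 0.3, size=(N, N)), k=1)   # arbitrary couplings, i < j only

      states = np.array(list(itertools.product([0, 1], repeat=N)))
      exponent = states @ h + np.einsum("ki,ij,kj->k", states, J, states)
      P = np.exp(exponent)
      P /= P.sum()                              # normalized pairwise MaxEnt distribution

      print("model mean firing rates:", states.T @ P)
      print("model pairwise moments:", (states.T * P) @ states)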

  14. Sharpening the second law of thermodynamics with the quantum Bayes theorem.

    PubMed

    Gharibyan, Hrant; Tegmark, Max

    2014-09-01

    We prove a generalization of the classic Groenewold-Lindblad entropy inequality, combining decoherence and the quantum Bayes theorem into a simple unified picture where decoherence increases entropy while observation decreases it. This provides a rigorous quantum-mechanical version of the second law of thermodynamics, governing how the entropy of a system (the entropy of its density matrix, partial-traced over the environment and conditioned on what is known) evolves under general decoherence and observation. The powerful tool of spectral majorization enables both simple alternative proofs of the classic Lindblad and Holevo inequalities without using strong subadditivity, and also novel inequalities for decoherence and observation that hold not only for von Neumann entropy, but also for arbitrary concave entropies.

  15. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
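
    The irreversibility measure discussed above can be illustrated on any stationary chain: the entropy production rate Σ_ij π_i P_ij log(π_i P_ij / (π_j P_ji)) vanishes exactly when detailed balance holds. The 3-state chain below is a toy example, not a model inferred from spike trains.

      import numpy as np

      P = np.array([[0.2, 0.5, 0.3],            # toy transition matrix without detailed balance
                    [0.3, 0.3, 0.4],
                    [0.5, 0.2, 0.3]])

      evals, evecs = np.linalg.eig(P.T)
      pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
      pi /= pi.sum()

      flux = pi[:, None] * P                    # stationary probability flux i -> j
      e_p = np.sum(flux * np.log(flux / flux.T))
      print("entropy production rate:", e_p)    # strictly positive here; zero under detailed balance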

  16. Approximation of the ruin probability using the scaled Laplace transform inversion

    PubMed Central

    Mnatsakanov, Robert M.; Sarkisian, Khachatur; Hakobyan, Artak

    2015-01-01

    The problem of recovering the ruin probability in the classical risk model based on the scaled Laplace transform inversion is studied. It is shown how to overcome the problem of evaluating the ruin probability at large values of an initial surplus process. Comparisons of proposed approximations with the ones based on the Laplace transform inversions using a fixed Talbot algorithm as well as on the ones using the Trefethen–Weideman–Schmelzer and maximum entropy methods are presented via a simulation study. PMID:26752796

  17. Topological order, entanglement, and quantum memory at finite temperature

    NASA Astrophysics Data System (ADS)

    Mazáč, Dalimil; Hamma, Alioscia

    2012-09-01

    We compute the topological entropy of the toric code models in arbitrary dimension at finite temperature. We find that the critical temperatures for the existence of full quantum (classical) topological entropy correspond to the confinement-deconfinement transitions in the corresponding Z2 gauge theories. This implies that the thermal stability of topological entropy corresponds to the stability of quantum (classical) memory. The implications for the understanding of ergodicity breaking in topological phases are discussed.

  18. Action and entanglement in gravity and field theory.

    PubMed

    Neiman, Yasha

    2013-12-27

    In nongravitational quantum field theory, the entanglement entropy across a surface depends on the short-distance regularization. Quantum gravity should not require such regularization, and it has been conjectured that the entanglement entropy there is always given by the black hole entropy formula evaluated on the entangling surface. We show that these statements have precise classical counterparts at the level of the action. Specifically, we point out that the action can have a nonadditive imaginary part. In gravity, the latter is fixed by the black hole entropy formula, while in nongravitating theories it is arbitrary. From these classical facts, the entanglement entropy conjecture follows by heuristically applying the relation between actions and wave functions.

  19. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    PubMed

    Haseli, Y

    2016-05-01

    The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov's engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, the maximum thermal efficiency, and the maximum power may become equivalent at the condition of fixed heat input.
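
    A worked numerical sketch under the standard Curzon-Ahlborn assumptions (endoreversible engine, linear heat conduction; reservoir temperatures and conductances below are arbitrary): sweeping the efficiency shows the familiar 1 - sqrt(Tc/Th) point of maximum power, while the entropy production rate keeps falling as the efficiency rises toward the Carnot limit, consistent with the abstract's conclusion.

      import numpy as np

      Th, Tc = 600.0, 300.0                     # hot and cold reservoir temperatures (K)
      Kh, Kc = 1.0, 1.0                         # thermal conductances (kW/K)
      eta = np.linspace(0.01, 1.0 - Tc / Th - 1e-4, 2000)

      # endoreversibility fixes the hot-side working temperature for each efficiency
      Thw = (Kh * (1.0 - eta) * Th + Kc * Tc) / ((Kh + Kc) * (1.0 - eta))
      Qh = Kh * (Th - Thw)                      # heat drawn from the hot reservoir
      power = eta * Qh
      sigma = Qh * ((1.0 - eta) / Tc - 1.0 / Th)  # total entropy production rate

      i = np.argmax(power)
      print("efficiency at maximum power:", eta[i], "vs 1 - sqrt(Tc/Th) =", 1.0 - np.sqrt(Tc / Th))
      print("entropy production decreases as efficiency increases:", bool(np.all(np.diff(sigma) < 0)))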

  20. In Vivo potassium-39 NMR spectra by the Burg maximum-entropy method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Takanori; Minamitani, Haruyuki

    The Burg maximum-entropy method was applied to estimate 39K NMR spectra of mung bean root tips. The maximum-entropy spectra have as good a linearity between peak areas and potassium concentrations as those obtained by fast Fourier transform and give a better estimation of intracellular potassium concentrations. Therefore potassium uptake and loss processes of mung bean root tips are shown to be more clearly traced by the maximum-entropy method.

  1. Device-Independent Tests of Entropy

    NASA Astrophysics Data System (ADS)

    Chaves, Rafael; Brask, Jonatan Bohr; Brunner, Nicolas

    2015-09-01

    We show that the entropy of a message can be tested in a device-independent way. Specifically, we consider a prepare-and-measure scenario with classical or quantum communication, and develop two different methods for placing lower bounds on the communication entropy, given observable data. The first method is based on the framework of causal inference networks. The second technique, based on convex optimization, shows that quantum communication provides an advantage over classical communication, in the sense of requiring a lower entropy to reproduce given data. These ideas may serve as a basis for novel applications in device-independent quantum information processing.

  2. Maximum work extraction and implementation costs for nonequilibrium Maxwell's demons.

    PubMed

    Sandberg, Henrik; Delvenne, Jean-Charles; Newton, Nigel J; Mitter, Sanjoy K

    2014-10-01

    We determine the maximum amount of work extractable in finite time by a demon performing continuous measurements on a quadratic Hamiltonian system subjected to thermal fluctuations, in terms of the information extracted from the system. The maximum work demon is found to apply a high-gain continuous feedback involving a Kalman-Bucy estimate of the system state and operates in nonequilibrium. A simple and concrete electrical implementation of the feedback protocol is proposed, which allows for analytic expressions of the flows of energy, entropy, and information inside the demon. This lets us show that any implementation of the demon must necessarily include an external power source, which we prove both from classical thermodynamics arguments and from a version of Landauer's memory erasure argument extended to nonequilibrium linear systems.

  3. Affine Isoperimetry and Information Theoretic Inequalities

    ERIC Educational Resources Information Center

    Lv, Songjun

    2012-01-01

    There are essential connections between the isoperimetric theory and information theoretic inequalities. In general, the Brunn-Minkowski inequality and the entropy power inequality, as well as the classical isoperimetric inequality and the classical entropy-moment inequality, turn out to be equivalent in a certain sense, respectively. Based on…

  4. Unification of field theory and maximum entropy methods for learning probability densities

    NASA Astrophysics Data System (ADS)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
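
    To make the maximum-entropy side of this comparison concrete: with only the first two moments constrained, the MaxEnt density on the real line is the Gaussian with the empirical mean and variance. The sketch below fits that estimate to a synthetic skewed sample; the Bayesian field-theory smoothing that the abstract unifies it with is not shown.

      import numpy as np

      rng = np.random.default_rng(4)
      data = rng.gamma(shape=3.0, scale=1.0, size=500)   # synthetic, deliberately skewed sample

      mu, var = data.mean(), data.var()
      grid = np.linspace(data.min(), data.max(), 200)
      maxent_density = np.exp(-0.5 * (grid - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

      # the two-moment MaxEnt estimate matches mean and variance but misses the skew,
      # the sort of mismatch a lower-entropy (field-theory) estimate can reveal
      print("empirical mean/variance:", mu, var, "peak of MaxEnt density:", maxent_density.max())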

  5. Unification of field theory and maximum entropy methods for learning probability densities.

    PubMed

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  6. Ab initio-informed maximum entropy modeling of rovibrational relaxation and state-specific dissociation with application to the O2 + O system

    NASA Astrophysics Data System (ADS)

    Kulakhmetov, Marat; Gallis, Michael; Alexeenko, Alina

    2016-05-01

    Quasi-classical trajectory (QCT) calculations are used to study state-specific ro-vibrational energy exchange and dissociation in the O2 + O system. Atom-diatom collisions with energy between 0.1 and 20 eV are calculated with a double many body expansion potential energy surface by Varandas and Pais [Mol. Phys. 65, 843 (1988)]. Inelastic collisions favor mono-quantum vibrational transitions at translational energies above 1.3 eV although multi-quantum transitions are also important. Post-collision vibrational favoring decreases first exponentially and then linearly as Δv increases. Vibrationally elastic collisions (Δv = 0) favor small ΔJ transitions while vibrationally inelastic collisions have equilibrium post-collision rotational distributions. Dissociation exhibits both vibrational and rotational favoring. New vibrational-translational (VT), vibrational-rotational-translational (VRT) energy exchange, and dissociation models are developed based on QCT observations and maximum entropy considerations. Full set of parameters for state-to-state modeling of oxygen is presented. The VT energy exchange model describes 22 000 state-to-state vibrational cross sections using 11 parameters and reproduces vibrational relaxation rates within 30% in the 2500-20 000 K temperature range. The VRT model captures 80 × 106 state-to-state ro-vibrational cross sections using 19 parameters and reproduces vibrational relaxation rates within 60% in the 5000-15 000 K temperature range. The developed dissociation model reproduces state-specific and equilibrium dissociation rates within 25% using just 48 parameters. The maximum entropy framework makes it feasible to upscale ab initio simulation to full nonequilibrium flow calculations.

  7. Proposed principles of maximum local entropy production.

    PubMed

    Ross, John; Corlan, Alexandru D; Müller, Stefan C

    2012-07-12

    Articles have appeared that rely on the application of some form of "maximum local entropy production principle" (MEPP). This is usually an optimization principle that is supposed to compensate for the lack of structural information and measurements about complex systems, even systems as complex and as little characterized as the whole biosphere or the atmosphere of the Earth or even of less known bodies in the solar system. We select a number of claims from a few well-known papers that advocate this principle and we show that they are in error with the help of simple examples of well-known chemical and physical systems. These erroneous interpretations can be attributed to ignoring well-established and verified theoretical results such as (1) entropy does not necessarily increase in nonisolated systems, such as "local" subsystems; (2) macroscopic systems, as described by classical physics, are in general intrinsically deterministic: there are no "choices" in their evolution to be selected by using supplementary principles; (3) macroscopic deterministic systems are predictable to the extent to which their state and structure are sufficiently well known; usually they are not sufficiently known, and probabilistic methods need to be employed for their prediction; and (4) there is no causal relationship between the thermodynamic constraints and the kinetics of reaction systems. In conclusion, any predictions based on MEPP-like principles should not be considered scientifically founded.

  8. Entropy and equilibrium via games of complexity

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
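
    For reference, the two generalized entropies mentioned here are commonly written (with k_B = 1) as

      \[
        S_q \;=\; \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
        S_\kappa \;=\; -\sum_i p_i \ln_\kappa p_i, \qquad
        \ln_\kappa x \;=\; \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},
      \]

    both of which recover the Shannon-Boltzmann-Gibbs form -∑_i p_i ln p_i in the limits q → 1 and κ → 0; these are the standard textbook definitions, and the complexity measures that generate them in the paper are not reproduced here.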

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zachos, C. K.; High Energy Physics

    Following ref [1], a classical upper bound for quantum entropy is identified and illustrated, 0 ≤ S_q ≤ ln(e σ²/2ℏ), involving the variance σ² in phase space of the classical limit distribution of a given system. A fortiori, this further bounds the corresponding information-theoretical generalizations of the quantum entropy proposed by Rényi.

  10. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y , i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to the relative-entropy conservation for arbitrary states. The general theory on the relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, a photon counting, a quantum counting, homodyne and heterodyne measurements. These examples except for the nondemolition and photon-counting measurements do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.

  11. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    NASA Astrophysics Data System (ADS)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Rényi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Rényi statistics.

  12. Principles of time evolution in classical physics

    NASA Astrophysics Data System (ADS)

    Güémez, J.; Fiolhais, M.

    2018-07-01

    We address principles of time evolution in classical mechanical/thermodynamical systems in translational and rotational motion, in three cases: when there is conservation of mechanical energy, when there is energy dissipation and when there is mechanical energy production. In the first case, the time derivative of the Hamiltonian vanishes. In the second one, when dissipative forces are present, the time evolution is governed by the minimum potential energy principle, or, equivalently, maximum increase of the entropy of the universe. Finally, in the third situation, when internal sources of work are available to the system, it evolves in time according to the principle of minimum Gibbs function. We apply the Lagrangian formulation to the systems, dealing with the non-conservative forces using restriction functions such as the Rayleigh dissipative function.
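
    When dissipation is handled through a Rayleigh function F, the Lagrangian treatment referred to here takes the standard textbook form

      \[
        \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i}
        \;=\; -\frac{\partial F}{\partial \dot{q}_i}, \qquad
        F \;=\; \tfrac{1}{2}\sum_i c_i\, \dot{q}_i^{\,2},
      \]

    with the quadratic F dissipating power 2F; the "restriction functions" used in the paper generalize this construction to the other non-conservative forces considered.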

  13. Classical and quantum entropy of parton distributions

    NASA Astrophysics Data System (ADS)

    Hagiwara, Yoshikazu; Hatta, Yoshitaka; Xiao, Bo-Wen; Yuan, Feng

    2018-05-01

    We introduce the semiclassical Wehrl entropy for the nucleon as a measure of complexity of the multiparton configuration in phase space. This gives a new perspective on the nucleon tomography. We evaluate the entropy in the small-x region and compare with the quantum von Neumann entropy. We also argue that the growth of entropy at small x is eventually slowed down due to the Pomeron loop effect.
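
    The single-mode Wehrl entropy underlying this construction is the Shannon entropy of the Husimi distribution,

      \[
        Q(\alpha) \;=\; \frac{1}{\pi}\,\langle \alpha|\rho|\alpha\rangle, \qquad
        S_W \;=\; -\int d^2\alpha\; Q(\alpha)\,\ln Q(\alpha) \;\ge\; 1,
      \]

    with the lower bound (Lieb) saturated by coherent states; the nucleon version used in the paper is a semiclassical phase-space generalization of this definition applied to the multiparton configuration.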

  14. Estimation of the magnetic entropy change by means of Landau theory and phenomenological model in La0.6Ca0.2 Sr0.2MnO3/Sb2O3 ceramic composites

    NASA Astrophysics Data System (ADS)

    Nasri, M.; Dhahri, E.; Hlil, E. K.

    2018-06-01

    In this paper, magnetocaloric properties of La0.6Ca0.2Sr0.2MnO3/Sb2O3 oxides have been investigated. The composite samples were prepared using the conventional solid-state reaction method. The second-order nature of the phase transition is confirmed by the positive slope of the Arrott plots. An excellent agreement has been found between the -ΔS_M values estimated by Landau theory and those obtained using the classical Maxwell relation. The field dependence of the magnetic entropy change shows a power-law behavior, |ΔS_M| ≈ H^n, with n(T_C) = 0.65. Moreover, the scaling analysis of the magnetic entropy change shows that the ΔS_M(T) curves collapse onto a single universal curve, indicating that the observed paramagnetic to ferromagnetic phase transition is an authentic second-order phase transition. The maximum value of the magnetic entropy change of the composites is found to decrease slightly with increasing Sb2O3 concentration. A phenomenological model was used to predict the magnetocaloric properties of La0.6Ca0.2Sr0.2MnO3/Sb2O3 composites. The theoretical calculations are compared with the available experimental data.
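
    The "classical Maxwell relation" used as the benchmark here is the standard magnetocaloric expression

      \[
        \Delta S_M(T, H) \;=\; \mu_0 \int_0^{H} \left(\frac{\partial M}{\partial T}\right)_{H'} dH' ,
      \]

    evaluated in practice from isothermal magnetization curves measured at closely spaced temperatures (the factor μ_0 depends on the unit convention adopted).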

  15. Beyond the classical theory of heat conduction: a perspective view of future from entropy

    PubMed Central

    Lai, Xiang; Zhu, Pingan

    2016-01-01

    Energy is conserved by the first law of thermodynamics; its quality degrades constantly due to entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation, both with regard to how its magnitude can be reduced and with regard to whether its limit as time tends to infinity is bounded. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also builds up the relation between the second law of thermodynamics and mathematical inequalities by developing the latter, of either new or classical nature. A concise review of entropy is also included, for the interest of performing the analysis in this work and similar analyses of other processes in the future. PMID:27843400
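
    For Fourier conduction the quantity being tracked is the local volumetric entropy generation rate,

      \[
        \dot{s}_{\mathrm{gen}} \;=\; \frac{k\,(\nabla T)^2}{T^2} \;\ge\; 0,
      \]

    which in one dimension reduces to k (∂T/∂x)²/T²; the question raised above is whether its integral over space and time remains bounded as t → ∞.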

  16. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  17. Maximum Entropy Methods as the Bridge Between Microscopic and Macroscopic Theory

    NASA Astrophysics Data System (ADS)

    Taylor, Jamie M.

    2016-09-01

    This paper is concerned with an investigation into a function of macroscopic variables known as the singular potential, building on previous work by Ball and Majumdar. The singular potential is a function of the admissible statistical averages of probability distributions on a state space, defined so that it corresponds to the maximum possible entropy given known observed statistical averages, although non-classical entropy-like objective functions will also be considered. First the set of admissible moments must be established, and under the conditions presented in this work the set is open, bounded and convex allowing a description in terms of supporting hyperplanes, which provides estimates on the development of singularities for related probability distributions. Under appropriate conditions it is shown that the singular potential is strictly convex, as differentiable as the microscopic entropy, and blows up uniformly as the macroscopic variable tends to the boundary of the set of admissible moments. Applications of the singular potential are then discussed, and particular consideration will be given to certain free-energy functionals typical in mean-field theory, demonstrating an equivalence between certain microscopic and macroscopic free-energy functionals. This allows statements about L^1-local minimisers of Onsager's free energy to be obtained which cannot be given by two-sided variations, and overcomes the need to ensure local minimisers are bounded away from zero and +∞ before taking L^∞ variations. The analysis also permits the definition of a dual order parameter for which Onsager's free energy allows an explicit representation. Also, the difficulties in approximating the singular potential by everywhere defined functions, in particular by polynomial functions, are addressed, with examples demonstrating the failure of the Taylor approximation to preserve relevant shape properties of the singular potential.
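
    A minimal numerical sketch of the object behind the singular potential is given below: the maximum-entropy density compatible with a few prescribed moments, obtained by minimizing the convex dual over the Lagrange multipliers. The grid, moment functions, and target averages are illustrative assumptions, not the setting of the paper.

      # Sketch only: maximum-entropy density on a bounded state space given two
      # prescribed moments, via the convex dual in the Lagrange multipliers.
      # Grid, moment functions and target averages are illustrative assumptions.
      import numpy as np
      from scipy.optimize import minimize

      x = np.linspace(-1.0, 1.0, 2001)          # bounded state space
      m = np.vstack([x, x**2])                  # moment functions m_k(x)
      mu_bar = np.array([0.1, 0.4])             # prescribed averages (assumed)

      def dual(lam):
          # psi(lam) = log Z(lam) - lam . mu_bar,  Z = int exp(lam . m(x)) dx
          return np.log(np.trapz(np.exp(lam @ m), x)) - lam @ mu_bar

      lam = minimize(dual, x0=np.zeros(2), method="BFGS").x
      p = np.exp(lam @ m)
      p /= np.trapz(p, x)                       # maximum-entropy density on the grid
      print("recovered moments:", np.trapz(m * p, x, axis=1))

    By Lagrangian duality, the minimized dual value equals the maximum entropy attainable under the prescribed moments; the singular potential studied in the paper is built from this quantity (up to sign conventions) and blows up as the moments approach the boundary of the admissible set.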

  18. Ab initio-informed maximum entropy modeling of rovibrational relaxation and state-specific dissociation with application to the O2 + O system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulakhmetov, Marat, E-mail: mkulakhm@purdue.edu; Alexeenko, Alina, E-mail: alexeenk@purdue.edu; Gallis, Michael, E-mail: magalli@sandia.gov

    Quasi-classical trajectory (QCT) calculations are used to study state-specific ro-vibrational energy exchange and dissociation in the O2 + O system. Atom-diatom collisions with energy between 0.1 and 20 eV are calculated with a double many body expansion potential energy surface by Varandas and Pais [Mol. Phys. 65, 843 (1988)]. Inelastic collisions favor mono-quantum vibrational transitions at translational energies above 1.3 eV although multi-quantum transitions are also important. Post-collision vibrational favoring decreases first exponentially and then linearly as Δv increases. Vibrationally elastic collisions (Δv = 0) favor small ΔJ transitions while vibrationally inelastic collisions have equilibrium post-collision rotational distributions. Dissociation exhibits both vibrational and rotational favoring. New vibrational-translational (VT), vibrational-rotational-translational (VRT) energy exchange, and dissociation models are developed based on QCT observations and maximum entropy considerations. A full set of parameters for state-to-state modeling of oxygen is presented. The VT energy exchange model describes 22 000 state-to-state vibrational cross sections using 11 parameters and reproduces vibrational relaxation rates within 30% in the 2500–20 000 K temperature range. The VRT model captures 80 × 10^6 state-to-state ro-vibrational cross sections using 19 parameters and reproduces vibrational relaxation rates within 60% in the 5000–15 000 K temperature range. The developed dissociation model reproduces state-specific and equilibrium dissociation rates within 25% using just 48 parameters. The maximum entropy framework makes it feasible to upscale ab initio simulations to full nonequilibrium flow calculations.

  19. Deterministic physical systems under uncertain initial conditions: the case of maximum entropy applied to projectile motion

    NASA Astrophysics Data System (ADS)

    Montecinos, Alejandra; Davis, Sergio; Peralta, Joaquín

    2018-07-01

    The kinematics and dynamics of deterministic physical systems have been a foundation of our understanding of the world since Galileo and Newton. For real systems, however, uncertainty is largely present via external forces such as friction or lack of precise knowledge about the initial conditions of the system. In this work we focus on the latter case and describe the use of inference methodologies in solving the statistical properties of classical systems subject to uncertain initial conditions. In particular we describe the application of the formalism of maximum entropy (MaxEnt) inference to the problem of projectile motion, given information about the average horizontal range over many realizations. By using MaxEnt we can invert the problem and use the provided information on the average range to reduce the original uncertainty in the initial conditions. Also, additional insight into the initial condition's probabilities, and the projectile path distribution itself, can be achieved based on the value of the average horizontal range. The wide applicability of this procedure, as well as its ease of use, reveals a useful tool with which to revisit a large number of physics problems, from classrooms to frontier research.
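
    Schematically, maximizing entropy subject only to the observed mean range gives an exponential-family distribution over the initial conditions,

      \[
        p(v_0, \theta) \;\propto\; p_0(v_0, \theta)\, e^{-\lambda R(v_0, \theta)}, \qquad
        R(v_0, \theta) \;=\; \frac{v_0^{2} \sin 2\theta}{g},
      \]

    where λ is fixed by the constraint ⟨R⟩ = R_obs and p_0 encodes the prior uncertainty in the initial conditions; the drag-free, level-ground range formula is the textbook case, and the constraint set actually used in the paper may differ in detail.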

  20. Maximum-entropy probability distributions under Lp-norm constraints

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
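
    For the unconstrained continuous case, the maximizer under a fixed pth absolute moment follows directly from the maximum-entropy Lagrangian and is the generalized Gaussian

      \[
        f(x) \;\propto\; e^{-\lambda |x|^{p}}, \qquad \lambda > 0 \ \text{set by}\ \mathbb{E}|X|^{p} = \|X\|_p^{p},
      \]

    whose differential entropy equals ln ||X||_p plus a constant depending only on p, which is the straight-line relationship mentioned above.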

  1. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel

    We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.
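
    The closure itself rests on the Levermore-type maximum-entropy ansatz for the velocity distribution,

      \[
        f_{\alpha}(v) \;=\; \exp\!\big(\alpha \cdot m(v)\big),
      \]

    where m(v) collects the 35 polynomial basis functions of the moment system and the multipliers α solve the dual problem of matching the known moments; evaluating the closing fluxes then requires exactly the numerical quadratures whose parallel acceleration is the subject of the paper.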

  2. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.

  3. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed, which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservation are considered as optimization constraints. The optimal enzyme rate constants computed in this way for a steady state also yield the most uniform probability distribution of the enzyme states. This corresponds to the maximal Shannon information entropy. By means of a stability analysis it is also demonstrated that maximal density of entropy production in the enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example in which the density of entropy production and the Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.

  4. Application of an improved minimum entropy deconvolution method for railway rolling element bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Cheng, Yao; Zhou, Ning; Zhang, Weihua; Wang, Zhiwei

    2018-07-01

    Minimum entropy deconvolution is a widely-used tool in machinery fault diagnosis, because it enhances the impulse component of the signal. The filter coefficients that greatly influence the performance of the minimum entropy deconvolution are calculated by an iterative procedure. This paper proposes an improved deconvolution method for the fault detection of rolling element bearings. The proposed method solves for the filter coefficients with the standard particle swarm optimization algorithm, assisted by a generalized spherical coordinate transformation. When optimizing the filter's performance for enhancing the impulses produced by faulty rolling element bearings, the proposed method outperformed the classical minimum entropy deconvolution method. The proposed method was validated on simulated and experimental signals from railway bearings. In both the simulation and the experimental studies, the proposed method delivered better deconvolution performance than the classical minimum entropy deconvolution method, especially in the case of low signal-to-noise ratio.

  5. Nonadditive entropy maximization is inconsistent with Bayesian updating

    NASA Astrophysics Data System (ADS)

    Pressé, Steve

    2014-11-01

    The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  6. Nonadditive entropy maximization is inconsistent with Bayesian updating.

    PubMed

    Pressé, Steve

    2014-11-01

    The maximum entropy method-used to infer probabilistic models from data-is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  7. DEM interpolation weight calculation modulus based on maximum entropy

    NASA Astrophysics Data System (ADS)

    Chen, Tian-wei; Yang, Xia

    2015-12-01

    Traditional interpolation of gridded DEMs can produce negative weights. In this article, the principle of maximum entropy is used to analyze the model system, which depends on the modulus of the spatial weights. The negative-weight problem of DEM interpolation is studied by building a maximum entropy model; by adding non-negativity and first- and second-order moment constraints, the negative-weight problem is solved. The correctness and accuracy of the method were validated with a genetic algorithm implemented in a MATLAB program. The method is compared with Yang Chizhong interpolation and quadratic programming. The comparison shows that the magnitude and scaling of the maximum entropy weights fit the spatial relations and that the accuracy is superior to the latter two methods.

  8. Monitoring of Time-Dependent System Profiles by Multiplex Gas Chromatography with Maximum Entropy Demodulation

    NASA Technical Reports Server (NTRS)

    Becker, Joseph F.; Valentin, Jose

    1996-01-01

    The maximum entropy technique was successfully applied to the deconvolution of overlapped chromatographic peaks. An algorithm was written in which the chromatogram was represented as a vector of sample concentrations multiplied by a peak shape matrix. Simulation results demonstrated that there is a trade-off between the detector noise and peak resolution, in the sense that an increase of the noise level reduced the peak separation that could be recovered by the maximum entropy method. Real data originating from a sample storage column were also deconvoluted using maximum entropy. Deconvolution is useful in this type of system because the conservation of time-dependent profiles depends on the band spreading processes in the chromatographic column, which might smooth out the finer details in the concentration profile. The method was also applied to the deconvolution of previously interpreted Pioneer Venus chromatograms. It was found in this case that the correct choice of peak shape function was critical to the sensitivity of maximum entropy in the reconstruction of these chromatograms.

  9. Generalized relative entropies in the classical limit

    NASA Astrophysics Data System (ADS)

    Kowalski, A. M.; Martin, M. T.; Plastino, A.

    2015-03-01

    Our protagonists are (i) the Cressie-Read family of divergences (characterized by the parameter γ), (ii) Tsallis' generalized relative entropies (characterized by the parameter q), and, as a particular instance of both, (iii) the Kullback-Leibler (KL) relative entropy. In their normalized versions, we ascertain the equivalence between (i) and (ii). Additionally, we employ these three entropic quantifiers in order to provide a statistical investigation of the classical limit of a semiclassical model whose properties are well known from a purely dynamic viewpoint. This places us in a good position to assess the appropriateness of our statistical quantifiers for describing involved systems. We compare the behaviour of (i), (ii), and (iii) as one proceeds towards the classical limit. We determine optimal ranges for γ and/or q. It is shown that the Tsallis quantifier is better than the KL one for 1.5 < q < 2.5.
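
    In their usual (unnormalized) forms, the two one-parameter families compared here can be written as

      \[
        I_\gamma(P\|Q) \;=\; \frac{1}{\gamma(\gamma+1)} \sum_i p_i\!\left[\left(\frac{p_i}{q_i}\right)^{\gamma} - 1\right],
        \qquad
        D_q(P\|Q) \;=\; \frac{1}{q-1} \sum_i p_i\!\left[\left(\frac{p_i}{q_i}\right)^{q-1} - 1\right],
      \]

    both of which reduce to the Kullback-Leibler relative entropy ∑_i p_i ln(p_i/q_i) in the limits γ → 0 and q → 1; the normalized versions whose equivalence is established in the paper are not reproduced here.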

  10. Investigating Friction as a Main Source of Entropy Generation in the Expansion of Confined Gas in a Piston-and-Cylinder Device

    ERIC Educational Resources Information Center

    Kang, Dun-Yen; Liou, Kai-Hsin; Chang, Wei-Lun

    2015-01-01

    The expansion or compression of gas confined in a piston-and-cylinder device is a classic working example used for illustrating the First and Second Laws of Thermodynamics. The balance of energy and entropy enables the estimation of a number of thermodynamic properties. The entropy generation (also called entropy production) resulting from this…

  11. Microcanonical entropy for classical systems

    NASA Astrophysics Data System (ADS)

    Franzosi, Roberto

    2018-03-01

    The entropy definition in the microcanonical ensemble is revisited. We propose a novel definition for the microcanonical entropy that resolves the debate on the correct definition of the microcanonical entropy. In particular, we show that this entropy definition fixes the problem inherent in the exact extensivity of the caloric equation. Furthermore, this entropy reproduces results which are in agreement with the ones predicted with the standard Boltzmann entropy when applied to macroscopic systems. By contrast, the predictions obtained with the standard Boltzmann entropy and with the entropy we propose differ for small system sizes. Thus, we conclude that the Boltzmann entropy provides a correct description for macroscopic systems, whereas extremely small systems should be better described with the entropy that we propose here.

  12. Convex Accelerated Maximum Entropy Reconstruction

    PubMed Central

    Worley, Bradley

    2016-01-01

    Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm – called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm – is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra. PMID:26894476

  13. Entropy-based goodness-of-fit test: Application to the Pareto distribution

    NASA Astrophysics Data System (ADS)

    Lequesne, Justine

    2013-08-01

    Goodness-of-fit tests based on entropy have been introduced in [13] for testing normality. The maximum entropy distribution in a class of probability distributions defined by linear constraints induces a Pythagorean equality between the Kullback-Leibler information and an entropy difference. This allows one to propose a goodness-of-fit test for maximum entropy parametric distributions which is based on the Kullback-Leibler information. We will focus on the application of the method to the Pareto distribution. The power of the proposed test is computed through Monte Carlo simulation.
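
    The Pythagorean identity exploited by the test can be stated as follows: if P* is the maximum-entropy distribution in the class defined by a set of linear constraints and P is any distribution satisfying the same constraints, then

      \[
        D_{\mathrm{KL}}(P \,\|\, P^{*}) \;=\; H(P^{*}) - H(P),
      \]

    so comparing a sample-based entropy estimate of H(P) with H(P*) directly estimates the Kullback-Leibler information on which the test statistic is built.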

  14. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.

  15. Roughness as classicality indicator of a quantum state

    NASA Astrophysics Data System (ADS)

    Lemos, Humberto C. F.; Almeida, Alexandre C. L.; Amaral, Barbara; Oliveira, Adélcio C.

    2018-03-01

    We define a new quantifier of classicality for a quantum state, the Roughness, which is given by the L²(R²) distance between the Wigner and Husimi functions. We show that the Roughness is bounded and therefore a useful tool for comparison between different quantum states of single bosonic systems. The state classification via the Roughness is not binary, but rather continuous in the interval [0, 1], the state being more classical as the Roughness approaches zero and more quantum as it approaches unity. The Roughness is maximum for Fock states when the number of photons is arbitrarily large, and also for squeezed states at the maximum compression limit. On the other hand, the Roughness approaches its minimum value for thermal states at infinite temperature and, more generally, for infinite entropy states. The Roughness of a coherent state is slightly below one half, so we may say that it is more a classical state than a quantum one. Another important result is that the Roughness performs well for discriminating both pure and mixed states. Since the Roughness measures the inherent quantumness of a state, we propose another function, the Dynamic Distance Measure (DDM), which is suitable for measuring how quantum a dynamics is. Using the DDM, we studied the quartic oscillator, and we observed that there is a certain complementarity between dynamics and state, i.e. when the dynamics becomes more quantum, the Roughness of the state decreases, while the Roughness grows as the dynamics becomes less quantum.

  16. Maximum entropy method applied to deblurring images on a MasPar MP-1 computer

    NASA Technical Reports Server (NTRS)

    Bonavito, N. L.; Dorband, John; Busse, Tim

    1991-01-01

    A statistical inference method based on the principle of maximum entropy is developed for the purpose of enhancing and restoring satellite images. The proposed maximum entropy image restoration method is shown to overcome the difficulties associated with image restoration and provide the smoothest and most appropriate solution consistent with the measured data. An implementation of the method on the MP-1 computer is described, and results of tests on simulated data are presented.

  17. The Gibbs paradox and the physical criteria for indistinguishability of identical particles

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, C. S.

    2016-08-01

    The Gibbs paradox in the context of statistical mechanics addresses the issue of the additivity of the entropy of mixing gases. The usual discussion attributes the paradoxical situation to the classical distinguishability of identical particles and credits quantum theory for enabling indistinguishability of identical particles to solve the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics, and this is clearly brought out when the problem is treated in the language of information and associated entropy. We pinpoint the physical criteria for indistinguishability that are crucial for the treatment of the Gibbs problem and the consistency of its solution with conventional thermodynamics. Quantum mechanics provides a quantitative criterion, not possible in the classical picture, for the degree of indistinguishability in terms of the visibility of quantum interference, or the overlap of the states as pointed out by von Neumann, thereby endowing the entropy expression with mathematical continuity and physical reasonableness.

  18. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools based on the Bayesian approach, entropy, information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and, finally, general linear inverse problems.

  19. Quantum Discord for d⊗2 Systems

    PubMed Central

    Ma, Zhihao; Chen, Zhihua; Fanchini, Felipe Fernandes; Fei, Shao-Ming

    2015-01-01

    We present an analytical solution for classical correlation, defined in terms of linear entropy, in an arbitrary system when the second subsystem is measured. We show that the optimal measurements used in the maximization of the classical correlation in terms of linear entropy, when used to calculate the quantum discord in terms of von Neumann entropy, result in a tight upper bound for arbitrary systems. This bound agrees with all known analytical results about quantum discord in terms of von Neumann entropy and, when comparing it with the numerical results for 10^6 two-qubit random density matrices, we obtain an average deviation of order 10^-4. Furthermore, our results give a way to calculate the quantum discord for arbitrary n-qubit GHZ and W states evolving under the action of the amplitude damping noisy channel. PMID:26036771

  20. Entropy, temperature and internal energy of trapped gravitons and corrections to the Black Hole entropy

    NASA Astrophysics Data System (ADS)

    Viaggiu, Stefano

    2017-12-01

    In this paper we study the proposal presented in Viaggiu (2017) concerning the statistical description of trapped gravitons, applied to derive the semi-classical black hole (BH) entropy S_BH. We study the possible configurations depending on physically reasonable expressions for the internal energy U. In particular, we show that expressions of the form U ∼ R^k, k ≥ 1, with R the radius of the confining spherical box, can have a semi-classical description, while behaviors with k < 1 derive from thermodynamic or quantum fluctuations. Then, by taking a suitable physically motivated expression for U(R), we obtain the well-known logarithmic corrections to the BH entropy, with the usual behaviors present in the literature on BH entropy. Moreover, a phase transition emerges with a positive specific heat C at Planckian lengths instead of the usual negative one at non-Planckian scales, in agreement with results present in the literature. Finally, we show that evaporation stops at a radius R of the order of the Planck length.

  1. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    PubMed

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  2. Removing the Mystery of Entropy and Thermodynamics--Part I

    ERIC Educational Resources Information Center

    Leff, Harvey S.

    2012-01-01

    Energy and entropy are centerpieces of physics. Energy is typically introduced in the study of classical mechanics. Although energy in this context can be challenging, its use in thermodynamics and its connection with entropy seem to take on a special air of mystery. In this five-part series, I pinpoint ways around key areas of difficulty to…

  3. Consistent maximum entropy representations of pipe flow networks

    NASA Astrophysics Data System (ADS)

    Waldrip, Steven H.; Niven, Robert K.; Abel, Markus; Schlegel, Michael

    2017-06-01

    The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016) Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters that the entropy is defined over ensures consistency between different representations of the same network. The performance of the proposed reduced parameter method is demonstrated with a one-loop network case study.

  4. Maximum entropy production: Can it be used to constrain conceptual hydrological models?

    Treesearch

    M.C. Westhoff; E. Zehe

    2013-01-01

    In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is the subject of this study. It states that a steady-state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in...

  5. A centroid molecular dynamics study of liquid para-hydrogen and ortho-deuterium.

    PubMed

    Hone, Tyler D; Voth, Gregory A

    2004-10-01

    Centroid molecular dynamics (CMD) is applied to the study of collective and single-particle dynamics in liquid para-hydrogen at two state points and liquid ortho-deuterium at one state point. The CMD results are compared with the results of classical molecular dynamics, quantum mode coupling theory, a maximum entropy analytic continuation approach, pair-product forward-backward semiclassical dynamics, and available experimental results. The self-diffusion constants are in excellent agreement with the experimental measurements for all systems studied. Furthermore, it is shown that the method is able to adequately describe both the single-particle and collective dynamics of quantum liquids. (c) 2004 American Institute of Physics

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koenig, Robert

    We propose a generalization of the quantum entropy power inequality involving conditional entropies. For the special case of Gaussian states, we give a proof based on perturbation theory for symplectic spectra. We discuss some implications for entanglement-assisted classical communication over additive bosonic noise channels.

  7. Logarithmic corrections to entropy of magnetically charged AdS4 black holes

    NASA Astrophysics Data System (ADS)

    Jeon, Imtak; Lal, Shailesh

    2017-11-01

    Logarithmic terms are quantum corrections to black hole entropy determined completely from classical data, thus providing a strong check for candidate theories of quantum gravity purely from physics in the infrared. We compute these terms in the entropy associated to the horizon of a magnetically charged extremal black hole in AdS4×S7 using the quantum entropy function and discuss the possibility of matching against recently derived microscopic expressions.

  8. On variational definition of quantum entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belavkin, Roman V.

    Entropy of a distribution P can be defined in at least three different ways: 1) as the expectation of the Kullback-Leibler (KL) divergence of P from elementary δ-measures (in this case, it is interpreted as expected surprise); 2) as a negative KL-divergence of some reference measure ν from the probability measure P; 3) as the supremum of Shannon's mutual information taken over all channels such that P is the output probability, in which case it is the dual of some transportation problem. In classical (i.e. commutative) probability, all three definitions lead to the same quantity, providing only different interpretations of entropy. In non-commutative (i.e. quantum) probability, however, these definitions are not equivalent. In particular, the third definition, where the supremum is taken over all entanglements of two quantum systems with P being the output state, leads to a quantity that can be twice the von Neumann entropy. It was proposed originally by V. Belavkin and Ohya [1] and called the proper quantum entropy, because it allows one to define a quantum conditional entropy that is always non-negative. Here we extend these ideas to define also a quantum counterpart of proper cross-entropy and cross-information. We also show an inequality for the values of classical and quantum information.

  9. On determining absolute entropy without quantum theory or the third law of thermodynamics

    NASA Astrophysics Data System (ADS)

    Steane, Andrew M.

    2016-04-01

    We employ classical thermodynamics to gain information about absolute entropy, without recourse to statistical methods, quantum mechanics or the third law of thermodynamics. The Gibbs-Duhem equation yields various simple methods to determine the absolute entropy of a fluid. We also study the entropy of an ideal gas and the ionization of a plasma in thermal equilibrium. A single measurement of the degree of ionization can be used to determine an unknown constant in the entropy equation, and thus determine the absolute entropy of a gas. It follows from all these examples that the value of entropy at absolute zero temperature does not need to be assigned by postulate, but can be deduced empirically.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giovannetti, Vittorio; Maccone, Lorenzo; Shapiro, Jeffrey H.

    The minimum Rényi and Wehrl output entropies are found for bosonic channels in which the signal photons are either randomly displaced by a Gaussian distribution (classical-noise channel), or coupled to a thermal environment through lossy propagation (thermal-noise channel). It is shown that the Rényi output entropies of integer orders z ≥ 2 and the Wehrl output entropy are minimized when the channel input is a coherent state.

  11. On Entropy Production in the Madelung Fluid and the Role of Bohm's Potential in Classical Diffusion

    NASA Astrophysics Data System (ADS)

    Heifetz, Eyal; Tsekov, Roumen; Cohen, Eliahu; Nussinov, Zohar

    2016-07-01

    The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow (or decrease) due to an expansion (or compression) of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon entropy due to expansion is common in diffusive processes. However, in the latter the process is irreversible while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the "force" accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing then the diffusion coefficient in terms of the Planck constant reveals the lower bound given by the Heisenberg uncertainty principle in terms of the product between the gas mean free path and the Brownian momentum.
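
    The Bohm (quantum) potential referred to here is, for a probability density ρ,

      \[
        Q \;=\; -\frac{\hbar^{2}}{2m}\, \frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}},
      \]

    and the observation above is that, for classical diffusion, the "force" accelerating the spreading of the distribution takes the form of the positive gradient of this same quantity, with the diffusion coefficient written in terms of ℏ to expose the Heisenberg lower bound.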

  12. Thermodynamics of finite systems: a key issues review

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2018-07-01

    A little over ten years ago, Campisi, and Dunkel and Hilbert, published papers claiming that the Gibbs (volume) entropy of a classical system was correct, and that the Boltzmann (surface) entropy was not. They claimed further that the quantum version of the Gibbs entropy was also correct, and that the phenomenon of negative temperatures was thermodynamically inconsistent. Their work began a vigorous debate of exactly how the entropy, both classical and quantum, should be defined. The debate has called into question the basis of thermodynamics, along with fundamental ideas such as whether heat always flows from hot to cold. The purpose of this paper is to sum up the present status—admittedly from my point of view. I will show that standard thermodynamics, with some minor generalizations, is correct, and the alternative thermodynamics suggested by Hilbert, Hänggi, and Dunkel is not. Heat does not flow from cold to hot. Negative temperatures are thermodynamically consistent. The small ‘errors’ in the Boltzmann entropy that started the whole debate are shown to be a consequence of the micro-canonical assumption of an energy distribution of zero width. Improved expressions for the entropy are found when this assumption is abandoned.
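
    The two competing definitions at the center of this debate are, for a classical system with Hamiltonian H(x) at energy E,

      \[
        S_{G}(E) \;=\; k_{B} \ln \Omega(E), \quad \Omega(E) = \int_{H(x) \le E} dx, \qquad
        S_{B}(E) \;=\; k_{B} \ln\!\big[\omega(E)\,\Delta E\big], \quad \omega(E) = \frac{\partial \Omega}{\partial E},
      \]

    i.e. the logarithms of the phase-space volume enclosed by the energy surface (Gibbs, "volume" entropy) and of the density of states times a conventional energy width (Boltzmann, "surface" entropy), respectively.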

  13. Cell-model prediction of the melting of a Lennard-Jones solid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holian, B.L.

    The classical free energy of the Lennard-Jones 6-12 solid is computed from a single-particle anharmonic cell model with a correction to the entropy given by the classical correlational entropy of quasiharmonic lattice dynamics. The free energy of the fluid is obtained from the Hansen-Ree analytic fit to Monte Carlo equation-of-state calculations. The resulting predictions of the solid-fluid coexistence curves by this corrected cell model of the solid are in excellent agreement with the computer experiments.

  14. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation

    PubMed Central

    Yuan, Xin; Martínez, José-Fernán; Eckert, Martina; López-Santidrián, Lourdes

    2016-01-01

    The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with other four TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions, than with those detected by the maximum entropy TSM. Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure, which is shown in simulated experiments. PMID:27455279
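
    For orientation, a minimal sketch of the classical Otsu threshold that serves as the baseline here (not the authors' improved variant) is shown below; it picks the gray level that maximizes the between-class variance of the image histogram, with 8-bit intensities assumed.

      # Sketch of the classical Otsu threshold (baseline only, not the improved TSM).
      # Assumes an integer image with intensities in 0..255.
      import numpy as np

      def otsu_threshold(img):
          hist = np.bincount(img.ravel(), minlength=256).astype(float)
          p = hist / hist.sum()                    # gray-level probabilities
          omega = np.cumsum(p)                     # class-0 probability up to level t
          mu = np.cumsum(p * np.arange(256))       # cumulative first moment
          mu_t = mu[-1]                            # global mean
          denom = omega * (1.0 - omega)            # guard against empty classes
          sigma_b2 = np.where(denom > 0, (mu_t * omega - mu) ** 2 / denom, 0.0)
          return int(np.argmax(sigma_b2))          # level maximizing between-class variance

      # usage (hypothetical sonar frame): mask = frame > otsu_threshold(frame)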

  15. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation.

    PubMed

    Yuan, Xin; Martínez, José-Fernán; Eckert, Martina; López-Santidrián, Lourdes

    2016-07-22

    The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with other four TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions, than with those detected by the maximum entropy TSM. Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure, which is shown in simulated experiments.

  16. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method

    PubMed Central

    Roux, Benoît; Weare, Jonathan

    2013-01-01

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
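
    In the maximum-entropy formulation being compared against, the experimentally informed ensemble takes the exponentially reweighted form

      \[
        p_{\lambda}(x) \;\propto\; p_{0}(x)\, \exp\!\Big(-\sum_{k} \lambda_{k} f_{k}(x)\Big),
      \]

    where p_0 is the unbiased (force-field) ensemble, the f_k are the observables constrained to their experimental averages, and the λ_k are the unknown coefficients that ordinarily have to be determined iteratively; the result above is that restrained-ensemble simulations produce a distribution formally consistent with this form.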

  17. On the location of the maximum homogeneous crystal nucleation temperature

    NASA Technical Reports Server (NTRS)

    Weinberg, Michael C.

    1986-01-01

    Detailed consideration is given to the location of the temperature of maximum homogeneous nucleation as predicted by classical nucleation theory. It is shown quite generally that this maximum temperature, T*, must occur above the Kauzmann temperature and that T* > T_m/3, where T_m is the melting temperature. Also, it is demonstrated that T* may be considered to be approximately dependent upon two parameters: gamma, the ratio of the difference in specific heat between the crystal and liquid to the entropy of fusion, and E, a reduced activation energy for viscous flow. The variation of T* with these parameters is described. The relationship of the relative location of T* to the glass transition temperature is also discussed. This discussion is couched within the framework of the strong and fragile liquid notion introduced by Angell (1981) and coworkers. Finally, the question of the ultimate limits to the undercooling of liquid metals is considered, together with its relationship to computations of the maximum nucleation temperature in such systems.
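
    The temperature in question is the maximum of the classical steady-state homogeneous nucleation rate, which for a glass-forming liquid is usually written in the schematic form

      \[
        I(T) \;\simeq\; \frac{A}{\eta(T)}\, \exp\!\left(-\frac{W^{*}}{k_{B}T}\right), \qquad
        W^{*} \;=\; \frac{16\pi\,\sigma^{3}}{3\,\Delta G_{v}^{2}},
      \]

    where η is the shear viscosity, σ the crystal-liquid interfacial energy and ΔG_v the free-energy difference per unit volume between liquid and crystal; T* then reflects the competition between the thermodynamic barrier, which diverges as ΔG_v → 0 near T_m, and the kinetic slowdown carried by η at low temperature.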

  18. Maximum and minimum entropy states yielding local continuity bounds

    NASA Astrophysics Data System (ADS)

    Hanson, Eric P.; Datta, Nilanjana

    2018-04-01

    Given an arbitrary quantum state σ, we obtain an explicit construction of a state ρ*_ε(σ) [respectively, ρ_{*,ε}(σ)] which has the maximum (respectively, minimum) entropy among all states which lie in a specified neighborhood (ε-ball) of σ. Computing the entropy of these states leads to a local strengthening of the continuity bound of the von Neumann entropy, i.e., the Audenaert-Fannes inequality. Our bound is local in the sense that it depends on the spectrum of σ. The states ρ*_ε(σ) and ρ_{*,ε}(σ) depend only on the geometry of the ε-ball and are in fact optimizers for a larger class of entropies. These include the Rényi entropy and the minimum- and maximum-entropies, providing explicit formulas for certain smoothed quantities. This allows us to obtain local continuity bounds for these quantities as well. In obtaining this bound, we first derive a more general result which may be of independent interest, namely, a necessary and sufficient condition under which a state maximizes a concave and Gâteaux-differentiable function in an ε-ball around a given state σ. Examples of such a function include the von Neumann entropy and the conditional entropy of bipartite states. Our proofs employ tools from the theory of convex optimization under non-differentiable constraints, in particular Fermat's rule, and majorization theory.

  19. Holographic calculation for large interval Rényi entropy at high temperature

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Wu, Jie-qiang

    2015-11-01

    In this paper, we study the holographic Rényi entropy of a large interval on a circle at high temperature for the two-dimensional conformal field theory (CFT) dual to pure AdS3 gravity. In the field theory, the Rényi entropy is encoded in the CFT partition function on an n-sheeted torus connected with each other by a large branch cut. As proposed by Chen and Wu [Large interval limit of Rényi entropy at high temperature, arXiv:1412.0763], the effective way to read the entropy in the large interval limit is to insert a complete set of state bases of the twist sector at the branch cut. The calculation then transforms into an expansion of four-point functions in the twist sector with respect to e^(-2πTR/n). By using the operator product expansion of the twist operators at the branch points, we read off the first few terms of the Rényi entropy, including the leading and next-to-leading contributions in the large central charge limit. Moreover, we show that the leading contribution is actually captured by the twist vacuum module. In this case, by the Ward identity the four-point functions can be derived from the correlation function of four twist operators, which is related to the double-interval entanglement entropy. Holographically, we apply the recipe in [T. Faulkner, The entanglement Rényi entropies of disjoint intervals in AdS/CFT, arXiv:1303.7221] and [T. Barrella et al., Holographic entanglement beyond classical gravity, J. High Energy Phys. 09 (2013) 109] to compute the classical Rényi entropy and its one-loop quantum correction, after imposing a new set of monodromy conditions. The holographic classical result matches exactly the leading contribution in the field theory up to e^(-4πTR) and l^6, while the holographic one-loop contribution is in exact agreement with the next-to-leading results in the field theory up to e^(-6πTR/n) and l^4.

  20. From Maximum Entropy Models to Non-Stationarity and Irreversibility

    NASA Astrophysics Data System (ADS)

    Cofre, Rodrigo; Cessac, Bruno; Maldonado, Cesar

    The maximum entropy distribution can be obtained from a variational principle. This is important as a matter of principle and for the purpose of finding approximate solutions. One can exploit this fact to obtain relevant information about the underlying stochastic process. We report here on recent progress in three aspects of this approach. (1) Biological systems are expected to show some degree of irreversibility in time. Based on the transfer matrix technique to find the spatio-temporal maximum entropy distribution, we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. (2) The maximum entropy solution is characterized by a functional called the Gibbs free energy (the solution of the variational principle). The Legendre transform of this functional is the rate function, which controls the speed of convergence of empirical averages to their ergodic mean. We show how the correct description of this functional is crucial for a more rigorous characterization of first- and higher-order phase transitions. (3) We assess the impact of a weak time-dependent external stimulus on the collective statistics of spiking neuronal networks. We show how to evaluate this impact on any higher-order spatio-temporal correlation. RC supported by ERC advanced Grant ``Bridges'', BC: KEOPS ANR-CONICYT, Renvision and CM: CONICYT-FONDECYT No. 3140572.

  1. Fast and Efficient Stochastic Optimization for Analytic Continuation

    DOE PAGES

    Bao, Feng; Zhang, Guannan; Webster, Clayton G; ...

    2016-09-28

    The analytic continuation of imaginary-time quantum Monte Carlo data to extract real-frequency spectra remains a key problem in connecting theory with experiment. Here we present a fast and efficient stochastic optimization method (FESOM) as a more accessible variant of the stochastic optimization method introduced by Mishchenko et al. [Phys. Rev. B 62, 6317 (2000)], and we benchmark the resulting spectra against those obtained by the standard maximum entropy method for three representative test cases, including data taken from studies of the two-dimensional Hubbard model. Generally, we find that our FESOM approach yields spectra similar to the maximum entropy results. In particular, while the maximum entropy method yields superior results when the quality of the data is high, we find that FESOM is able to resolve fine structure in more detail when the quality of the data is poor. In addition, because of its stochastic nature, the method provides detailed information on the frequency-dependent uncertainty of the resulting spectra, while the maximum entropy method does so only for the spectral weight integrated over a finite frequency region. Therefore, we believe that this variant of the stochastic optimization approach provides a viable alternative to the routinely used maximum entropy method, especially for data of poor quality.

  2. Measurement-induced randomness and state-merging

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Indranil; Deshpande, Abhishek; Chatterjee, Sourav

    In this work we introduce a notion of randomness that is truly quantum mechanical in nature, arising from the act of measurement. For a composite classical system, the joint entropy quantifies the randomness present in the total system, and it equals the sum of the entropy of one subsystem and the conditional entropy of the other subsystem given that we know the first. The same structure carries over to the quantum setting by replacing the Shannon entropy with the von Neumann entropy. However, if we replace the conditional von Neumann entropy by the average conditional entropy due to measurement, we find that it differs from the joint entropy of the system. We call this difference Measurement Induced Randomness (MIR) and argue that it is unique to quantum mechanical systems, with no classical counterpart. In other words, the joint von Neumann entropy gives only the total randomness that arises because of the heterogeneity of the mixture, and we show that this is not the total randomness that can be generated in the composite system. We generalize this quantity to N-qubit systems and show that it reduces to quantum discord for two-qubit systems. Further, we show that it is exactly equal to the change in the cost of quantum state merging that arises because of the measurement. We argue that for quantum information processing tasks like state merging, the change in the cost as a result of discarding prior information can also be viewed as a rise of randomness due to measurement.

  3. Crowd macro state detection using entropy model

    NASA Astrophysics Data System (ADS)

    Zhao, Ying; Yuan, Mengqi; Su, Guofeng; Chen, Tao

    2015-08-01

    In the crowd security research area a primary concern is to identify the macro state of crowd behaviors in order to prevent disasters and to supervise crowd behavior. In physics, entropy is used to describe the macro state of a self-organizing system, and a change in entropy indicates a change in the system's macro state. This paper provides a method to construct crowd behavior microstates and the corresponding probability distribution using the individuals' velocity information (magnitude and direction). An entropy model was then built to describe the crowd behavior macro state. Simulation experiments and video detection experiments were conducted. It was verified that in the disordered state the crowd behavior entropy is close to the theoretical maximum entropy, while in the ordered state the entropy is much lower than half of the theoretical maximum entropy. A sudden change of the crowd behavior macro state leads to a change in the entropy. The proposed entropy model is more applicable than the order parameter model in crowd behavior detection. By recognizing such abrupt entropy changes, it is possible to detect the crowd behavior macro state automatically using cameras. The results provide data support for crowd emergency prevention and for manual emergency intervention.
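
    A minimal sketch of this kind of entropy measure (illustrative only; the bin counts, speed cap and data are assumptions, not the authors' implementation): microstates are (direction, speed) bins of the individual velocities, and the Shannon entropy of their empirical distribution is compared with the theoretical maximum.

        import numpy as np

        def crowd_entropy(velocities, n_dir_bins=8, n_speed_bins=4, v_max=2.0):
            """Shannon entropy of the crowd's (direction, speed) microstate distribution."""
            vx, vy = velocities[:, 0], velocities[:, 1]
            speed = np.hypot(vx, vy)
            angle = np.arctan2(vy, vx)                                    # in (-pi, pi]
            d = np.floor((angle + np.pi) / (2 * np.pi) * n_dir_bins).astype(int) % n_dir_bins
            s = np.clip((speed / v_max * n_speed_bins).astype(int), 0, n_speed_bins - 1)
            counts = np.bincount(d * n_speed_bins + s, minlength=n_dir_bins * n_speed_bins)
            p = counts / counts.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        rng = np.random.default_rng(0)
        speeds, angles = rng.uniform(0, 2, 500), rng.uniform(-np.pi, np.pi, 500)
        disordered = np.column_stack([speeds * np.cos(angles), speeds * np.sin(angles)])
        ordered = np.tile([1.0, 0.0], (500, 1)) + 0.05 * rng.normal(size=(500, 2))
        h_max = np.log(8 * 4)                      # theoretical maximum entropy
        print(crowd_entropy(disordered) / h_max)   # close to 1: disordered macro state
        print(crowd_entropy(ordered) / h_max)      # well below 0.5: ordered macro state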

  4. Maximum-Entropy Inference with a Programmable Annealer

    PubMed Central

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-01-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition. PMID:26936311
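
    The contrast between the two decoders can be reproduced in a brute-force toy model (a sketch under stated assumptions, not the annealer experiment): bits generated by a small ferromagnetic Ising chain are sent through a binary symmetric channel, and ground-state (maximum-likelihood) decoding is compared with finite-temperature maximum-entropy decoding, i.e. taking the sign of the Boltzmann-averaged magnetization of each spin. All parameters are illustrative.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(1)
        n, J, p, trials = 10, 0.8, 0.2, 300          # chain length, coupling, flip prob., trials
        h0 = 0.5 * np.log((1 - p) / p)               # field strength matched to the channel
        states = np.array(list(product([-1, 1], repeat=n)))   # all 2^n spin configurations

        def energy(s, y):
            """Ising chain energy: ferromagnetic prior plus fields from the received bits y."""
            return -J * np.sum(s[:, :-1] * s[:, 1:], axis=1) - h0 * (s @ y)

        err_ml = err_me = 0
        for _ in range(trials):
            # Sample a correlated source message from the chain prior.
            s_true = np.empty(n)
            s_true[0] = rng.choice([-1, 1])
            keep = 1.0 / (1.0 + np.exp(-2 * J))      # P(s_i = s_{i-1}) under the prior
            for i in range(1, n):
                s_true[i] = s_true[i - 1] if rng.random() < keep else -s_true[i - 1]
            y = s_true * np.where(rng.random(n) < p, -1, 1)   # binary symmetric channel

            E = energy(states, y)
            s_ml = states[np.argmin(E)]                        # maximum-likelihood (ground state)
            w = np.exp(-(E - E.min()))                         # Boltzmann weights, matched temperature
            s_me = np.sign(w @ states / w.sum())               # sign of the thermal average per bit

            err_ml += np.sum(s_ml != s_true)
            err_me += np.sum(s_me != s_true)

        print("bit error rate, ground-state decoding     :", err_ml / (n * trials))
        print("bit error rate, finite-temperature decoding:", err_me / (n * trials))

    Per-bit marginal (finite-temperature) decoding is optimal for the bit error rate, so it typically edges out the ground-state decoder, mirroring the effect measured on the annealer.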

  5. Rényi entropy measure of noise-aided information transmission in a binary channel.

    PubMed

    Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès

    2010-05-01

    This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, in definite conditions, when seeking the Rényi information measures that best exploit stochastic resonance, then nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. A comparison of the quantitative information measures with visual perception is also proposed in an experiment of noise-aided binary image transmission.

  6. Block entropy and quantum phase transition in the anisotropic Kondo necklace model

    NASA Astrophysics Data System (ADS)

    Mendoza-Arenas, J. J.; Franco, R.; Silva-Valencia, J.

    2010-06-01

    We study the von Neumann block entropy in the Kondo necklace model for different anisotropies η in the XY interaction between conduction spins using the density matrix renormalization group method. It was found that the block entropy presents a maximum for each η considered, and, comparing it with the results of the quantum criticality of the model based on the behavior of the energy gap, we observe that the maximum block entropy occurs at the quantum critical point between an antiferromagnetic and a Kondo singlet state, so this measure of entanglement is useful for giving information about where a quantum phase transition occurs in this model. We observe that the block entropy also presents a maximum at the quantum critical points that are obtained when an anisotropy Δ is included in the Kondo exchange between localized and conduction spins; when Δ diminishes for a fixed value of η, the critical point increases, favoring the antiferromagnetic phase.

  7. Entropy of adsorption of mixed surfactants from solutions onto the air/water interface

    USGS Publications Warehouse

    Chen, L.-W.; Chen, J.-H.; Zhou, N.-F.

    1995-01-01

    The partial molar entropy change for mixed surfactant molecules adsorbed from solution at the air/water interface has been investigated by surface thermodynamics based upon the experimental surface tension isotherms at various temperatures. Results for different surfactant mixtures of sodium dodecyl sulfate and sodium tetradecyl sulfate, decylpyridinium chloride and sodium alkylsulfonates have shown that the partial molar entropy changes for adsorption of the mixed surfactants were generally negative and decreased with increasing adsorption to a minimum near the maximum adsorption and then increased abruptly. The entropy decrease can be explained by the adsorption-orientation of surfactant molecules in the adsorbed monolayer, and the abrupt entropy increase at the maximum adsorption is possibly due to the strong repulsion between the adsorbed molecules.

  8. Tsallis Entropy and the Transition to Scaling in Fragmentation

    NASA Astrophysics Data System (ADS)

    Sotolongo-Costa, Oscar; Rodriguez, Arezky H.; Rodgers, G. J.

    2000-12-01

    By using the maximum entropy principle with Tsallis entropy we obtain a fragment size distribution function which undergoes a transition to scaling. This distribution function reduces to those obtained by other authors using Shannon entropy. The treatment is easily generalisable to any process of fractioning with suitable constraints.
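
    For reference (a generic sketch, not the authors' specific constraints), the Tsallis maximum entropy distribution with a linear constraint is a q-exponential, which recovers the Shannon (exponential) result as q approaches 1 and develops the power-law tail associated with the transition to scaling:

        import numpy as np

        def q_exponential(x, q):
            """e_q(x) = [1 + (1 - q) x]_+^(1/(1-q)); reduces to exp(x) as q -> 1."""
            if abs(q - 1.0) < 1e-12:
                return np.exp(x)
            return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

        s = np.linspace(0.0, 5.0, 6)                 # dimensionless fragment size
        for q in (1.0, 1.2, 1.5):
            print(q, np.round(q_exponential(-s, q), 4))
        # q = 1 reproduces the exponential weight exp(-s) of the Shannon case;
        # q > 1 gives the heavier, power-law-like tails associated with scaling.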

  9. A pairwise maximum entropy model accurately describes resting-state human brain networks

    PubMed Central

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks. PMID:23340410
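
    A minimal sketch of such a pairwise maximum entropy fit (illustrative, not the authors' pipeline): for a handful of binarized regions, the model P(x) proportional to exp(h·x + ½ x·Jx) can be fitted by exact moment matching, which is the gradient of the log-likelihood. The toy data and all names are assumptions.

        import numpy as np
        from itertools import product

        def fit_pairwise_maxent(data, lr=0.2, n_iter=5000):
            """Fit P(x) ∝ exp(h·x + 0.5 x·J·x), x in {0,1}^n, by exact gradient ascent (small n)."""
            n_samples, n = data.shape
            states = np.array(list(product([0, 1], repeat=n)), dtype=float)
            m_data = data.mean(axis=0)                      # first moments  <x_i>
            c_data = data.T @ data / n_samples              # second moments <x_i x_j>
            h, J = np.zeros(n), np.zeros((n, n))
            for _ in range(n_iter):
                logp = states @ h + 0.5 * np.einsum("si,ij,sj->s", states, J, states)
                prob = np.exp(logp - logp.max())
                prob /= prob.sum()
                m_model = prob @ states
                c_model = states.T @ (prob[:, None] * states)
                h += lr * (m_data - m_model)                # moment matching = likelihood gradient
                dJ = c_data - c_model
                np.fill_diagonal(dJ, 0.0)                   # diagonal terms are absorbed in h
                J += lr * dJ
            return h, J

        # Toy "activity" of three correlated binary regions.
        rng = np.random.default_rng(0)
        z = rng.random((2000, 1))                           # shared driver induces correlations
        data = (rng.random((2000, 3)) < 0.3 + 0.4 * z).astype(float)
        h, J = fit_pairwise_maxent(data)
        print(np.round(h, 2))
        print(np.round(J, 2))                               # positive couplings reflect the correlations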

  10. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on a spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is built by weighting the importance of different positions of pixels with the same gray level, obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, the 1D maximum entropy method is used to segment the image. The novel method not only gives better segmentation results but also has a faster computation time than traditional 2D histogram-based segmentation methods.
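
    For comparison, the standard 1D maximum entropy (Kapur) threshold selection that such an enhanced histogram would be fed into can be sketched as follows (a minimal illustration on an ordinary gray-level histogram; the spatial coherence weighting itself is not reproduced here, and the synthetic data are assumptions):

        import numpy as np

        def max_entropy_threshold(hist):
            """Kapur's 1D maximum entropy threshold from a 256-bin gray-level histogram."""
            p = hist.astype(float) / hist.sum()
            P = np.cumsum(p)                         # cumulative background probability
            best_t, best_h = 0, -np.inf
            for t in range(1, 255):
                p0, p1 = P[t], 1.0 - P[t]
                if p0 <= 0 or p1 <= 0:
                    continue
                w0 = p[: t + 1] / p0                 # normalized background distribution
                w1 = p[t + 1 :] / p1                 # normalized foreground distribution
                h0 = -np.sum(w0[w0 > 0] * np.log(w0[w0 > 0]))
                h1 = -np.sum(w1[w1 > 0] * np.log(w1[w1 > 0]))
                if h0 + h1 > best_h:                 # maximize the summed class entropies
                    best_h, best_t = h0 + h1, t
            return best_t

        # Synthetic bimodal histogram: dark background plus a brighter target.
        rng = np.random.default_rng(0)
        pixels = np.concatenate([rng.normal(60, 10, 20000), rng.normal(180, 15, 3000)])
        hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))
        print(max_entropy_threshold(hist))           # lands between the two modes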

  11. Stationary properties of maximum-entropy random walks.

    PubMed

    Dixit, Purushottam D

    2015-10-01

    Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
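
    In the unconstrained special case (a sketch, not the state- and path-constrained process analyzed in the paper), the maximum path entropy random walk on a graph is determined by the Perron eigenpair (λ, ψ) of the adjacency matrix, with P_ij = A_ij ψ_j / (λ ψ_i) and stationary distribution π_i proportional to ψ_i²; the example graph is illustrative.

        import numpy as np

        def maxent_random_walk(A):
            """Transition matrix and stationary distribution of the (unconstrained)
            maximum path entropy random walk on an undirected graph."""
            vals, vecs = np.linalg.eigh(A)
            lam, psi = vals[-1], np.abs(vecs[:, -1])        # Perron eigenvalue / eigenvector
            P = A * psi[None, :] / (lam * psi[:, None])     # P_ij = A_ij psi_j / (lam psi_i)
            pi = psi ** 2 / np.sum(psi ** 2)                # stationary distribution
            return P, pi

        # Small 5-node graph: a path with one extra edge.
        A = np.zeros((5, 5))
        for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (1, 3)]:
            A[i, j] = A[j, i] = 1.0
        P, pi = maxent_random_walk(A)
        print(np.allclose(P.sum(axis=1), 1.0))              # rows sum to one
        print(np.round(pi, 3))                              # differs from the degree-proportional
                                                            # distribution of the ordinary walk

    This already illustrates how path multiplicity, rather than a Boltzmann weight, shapes the stationary distribution; the paper adds state- and path-dependent constraints on top of this construction.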

  12. Relative entropy of entanglement and restricted measurements.

    PubMed

    Piani, M

    2009-10-16

    We introduce variants of relative entropy of entanglement based on the optimal distinguishability from unentangled states by means of restricted measurements. In this way we are able to prove that the standard regularized entropy of entanglement is strictly positive for all multipartite entangled states. This implies that the asymptotic creation of a multipartite entangled state by means of local operations and classical communication always requires the consumption of a nonlocal resource at a strictly positive rate.

  13. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.

  14. Application of Bayesian Maximum Entropy Filter in parameter calibration of groundwater flow model in PingTung Plain

    NASA Astrophysics Data System (ADS)

    Cheung, Shao-Yong; Lee, Chieh-Han; Yu, Hwa-Lung

    2017-04-01

    Due to the limited hydrogeological observation data and the high levels of uncertainty within them, parameter estimation for groundwater models has been an important issue. There are many methods of parameter estimation; for example, the Kalman filter provides real-time calibration of parameters through measurements at groundwater monitoring wells, and related methods such as the Extended Kalman Filter and the Ensemble Kalman Filter are widely applied in groundwater research. However, the Kalman filter method is limited to linear systems. This study proposes a novel method, Bayesian Maximum Entropy Filtering, which can take the uncertainty of the data into account during parameter estimation. With these two methods, we can estimate parameters from both hard data (certain) and soft data (uncertain) at the same time. In this study, we use Python and QGIS with the groundwater model (MODFLOW) and implement the Extended Kalman Filter and Bayesian Maximum Entropy Filtering in Python for parameter estimation. This approach provides a conventional filtering method while also accounting for the uncertainty of the data. The study was conducted as a numerical model experiment combining the Bayesian maximum entropy filter with a hypothetical MODFLOW groundwater model. Virtual observation wells were used to observe the simulated groundwater system periodically. The results showed that, by taking data uncertainty into account, the Bayesian maximum entropy filter provides good real-time parameter estimates.
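
    As an orientation (a generic textbook sketch, not the study's MODFLOW coupling or its Bayesian Maximum Entropy extension), one Extended Kalman Filter measurement update for model parameters observed through simulated heads looks like this; all symbols and the toy sensitivity matrix are assumptions:

        import numpy as np

        def ekf_update(x, P, z, hx, H, R):
            """One EKF measurement update for a parameter vector x with covariance P.

            z  : observed heads at the monitoring wells
            hx : heads predicted by the groundwater model for the current parameters
            H  : Jacobian dh/dx (e.g. from finite differences on the model)
            R  : observation-noise covariance
            """
            S = H @ P @ H.T + R                       # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
            x_new = x + K @ (z - hx)                  # corrected parameters
            P_new = (np.eye(len(x)) - K @ H) @ P      # corrected covariance
            return x_new, P_new

        # Tiny linear toy: two parameters observed through a 2x2 sensitivity matrix.
        x, P = np.zeros(2), np.eye(2)
        H, R = np.array([[1.0, 0.5], [0.2, 1.0]]), 0.1 * np.eye(2)
        true = np.array([1.0, -0.5])
        z = H @ true + 0.05 * np.random.default_rng(0).normal(size=2)
        x, P = ekf_update(x, P, z, H @ x, H, R)
        print(np.round(x, 2))                         # moves toward the true parameters

    In the study, the Bayesian Maximum Entropy filter additionally assimilates soft (uncertain) data, which the plain Gaussian update above cannot represent.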

  15. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  16. Determining Dynamical Path Distributions using Maximum Relative Entropy

    DTIC Science & Technology

    2015-05-31

    …entropy to a one-dimensional continuum labeled by a parameter η. The resulting η-entropies are equivalent to those proposed by Rényi [12] or by Tsallis [13]…

  17. Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager's Reciprocal Relations

    NASA Astrophysics Data System (ADS)

    Benfenati, Francesco; Beretta, Gian Paolo

    2018-04-01

    We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).

  18. Exploiting the Maximum Entropy Principle to Increase Retrieval Effectiveness.

    ERIC Educational Resources Information Center

    Cooper, William S.

    1983-01-01

    Presents information retrieval design approach in which queries of computer-based system consist of sets of terms, either unweighted or weighted with subjective term precision estimates, and retrieval outputs ranked by probability of usefulness estimated by "maximum entropy principle." Boolean and weighted request systems are discussed.…

  19. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    …Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; its present stage of development embodies a… Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical…

  20. On the asymptotic behavior of a subcritical convection-diffusion equation with nonlocal diffusion

    NASA Astrophysics Data System (ADS)

    Cazacu, Cristian M.; Ignat, Liviu I.; Pazoto, Ademir F.

    2017-08-01

    In this paper we consider a subcritical model that involves nonlocal diffusion and a classical convective term. In spite of the nonlocal diffusion, we obtain an Oleinik type estimate similar to the case when the diffusion is local. First we prove that the entropy solution can be obtained by adding a small viscous term μu_xx and letting μ → 0. Then, by using uniform Oleinik estimates for the viscous approximation we are able to prove the well-posedness of the entropy solutions with L^1 initial data. Using a scaling argument and hyperbolic estimates given by Oleinik’s inequality, we obtain the first term in the asymptotic behavior of the nonnegative solutions. Finally, the large time behavior of changing sign solutions is proved using the classical flux-entropy method and estimates for the nonlocal operator.

  1. Maps on positive operators preserving Rényi type relative entropies and maximal f-divergences

    NASA Astrophysics Data System (ADS)

    Gaál, Marcell; Nagy, Gergő

    2018-02-01

    In this paper, we deal with two quantum relative entropy preserver problems on the cones of positive (either positive definite or positive semidefinite) operators. The first one is related to a quantum Rényi relative entropy like quantity which plays an important role in classical-quantum channel decoding. The second one is connected to the so-called maximal f-divergences introduced by D. Petz and M. B. Ruskai who considered this quantity as a generalization of the usual Belavkin-Staszewski relative entropy. We emphasize in advance that all the results are obtained for finite-dimensional Hilbert spaces.

  2. The locking-decoding frontier for generic dynamics.

    PubMed

    Dupuis, Frédéric; Florjanczyk, Jan; Hayden, Patrick; Leung, Debbie

    2013-11-08

    It is known that the maximum classical mutual information, which can be achieved between measurements on pairs of quantum systems, can drastically underestimate the quantum mutual information between them. In this article, we quantify this distinction between classical and quantum information by demonstrating that after removing a logarithmic-sized quantum system from one half of a pair of perfectly correlated bitstrings, even the most sensitive pair of measurements might yield only outcomes essentially independent of each other. This effect is a form of information locking but the definition we use is strictly stronger than those used previously. Moreover, we find that this property is generic, in the sense that it occurs when removing a random subsystem. As such, the effect might be relevant to statistical mechanics or black hole physics. While previous works had always assumed a uniform message, we assume only a min-entropy bound and also explore the effect of entanglement. We find that classical information is strongly locked almost until it can be completely decoded. Finally, we exhibit a quantum key distribution protocol that is 'secure' in the sense of accessible information but in which leakage of even a logarithmic number of bits compromises the secrecy of all others.

  3. The locking-decoding frontier for generic dynamics

    PubMed Central

    Dupuis, Frédéric; Florjanczyk, Jan; Hayden, Patrick; Leung, Debbie

    2013-01-01

    It is known that the maximum classical mutual information, which can be achieved between measurements on pairs of quantum systems, can drastically underestimate the quantum mutual information between them. In this article, we quantify this distinction between classical and quantum information by demonstrating that after removing a logarithmic-sized quantum system from one half of a pair of perfectly correlated bitstrings, even the most sensitive pair of measurements might yield only outcomes essentially independent of each other. This effect is a form of information locking but the definition we use is strictly stronger than those used previously. Moreover, we find that this property is generic, in the sense that it occurs when removing a random subsystem. As such, the effect might be relevant to statistical mechanics or black hole physics. While previous works had always assumed a uniform message, we assume only a min-entropy bound and also explore the effect of entanglement. We find that classical information is strongly locked almost until it can be completely decoded. Finally, we exhibit a quantum key distribution protocol that is ‘secure’ in the sense of accessible information but in which leakage of even a logarithmic number of bits compromises the secrecy of all others. PMID:24204183

  4. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

    A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause for the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  5. Chapman Enskog-maximum entropy method on time-dependent neutron transport equation

    NASA Astrophysics Data System (ADS)

    Abdou, M. A.

    2006-09-01

    The time-dependent neutron transport equation in semi-infinite and infinite media with linear anisotropic and Rayleigh scattering is proposed. The problem is solved by means of the flux-limited Chapman-Enskog maximum entropy approach to obtain the solution of the time-dependent neutron transport. The solution gives the neutron distribution density function, which is used to compute numerically the radiant energy density E(x,t), the net flux F(x,t) and the reflectivity Rf. The behaviour of the approximate flux-limited maximum entropy neutron density function is compared with that found by other theories. Numerical results for the radiant energy, net flux and reflectivity of the proposed medium are obtained at different times and positions.

  6. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.

    PubMed

    Shalymov, Dmitry S; Fradkov, Alexander L

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originated in the control theory. The maximum of the Rényi entropy principle is analysed for discrete and continuous cases, and both a discrete random variable and probability density function (PDF) are used. We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.

  7. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle

    PubMed Central

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originated in the control theory. The maximum of the Rényi entropy principle is analysed for discrete and continuous cases, and both a discrete random variable and probability density function (PDF) are used. We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined. PMID:26997886

  8. Maximum-entropy description of animal movement.

    PubMed

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.
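
    As one concrete member of this class (a minimal sketch with illustrative parameters, not the authors' estimation framework), the Ornstein-Uhlenbeck position process, often used as the stationary range-resident movement model, can be simulated with an Euler-Maruyama step:

        import numpy as np

        def simulate_ou(n_steps=10000, dt=0.01, tau=5.0, sigma=1.0, seed=0):
            """Euler-Maruyama simulation of a 2-D Ornstein-Uhlenbeck position process:
               dx = -(x / tau) dt + sigma * sqrt(2 / tau) dW."""
            rng = np.random.default_rng(seed)
            x = np.zeros((n_steps, 2))
            for t in range(1, n_steps):
                noise = sigma * np.sqrt(2.0 * dt / tau) * rng.normal(size=2)
                x[t] = x[t - 1] - x[t - 1] * dt / tau + noise
            return x

        track = simulate_ou()
        print(np.round(track.var(axis=0), 2))   # each coordinate's variance is near sigma**2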

  9. Exact computation of the maximum-entropy potential of spiking neural-network models.

    PubMed

    Cofré, R; Cessac, B

    2014-05-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.

  10. An alternative expression to the Sackur-Tetrode entropy formula for an ideal gas

    NASA Astrophysics Data System (ADS)

    Nagata, Shoichi

    2018-03-01

    An expression for the entropy of a monoatomic classical ideal gas is known as the Sackur-Tetrode equation. This pioneering investigation of about 100 years ago already incorporates quantum considerations. The purpose of this paper is to provide an alternative expression for the entropy in terms of the Heisenberg uncertainty relation. The analysis is made on the basis of fluctuation theory for a canonical system in thermal equilibrium at temperature T. The new formula manifestly indicates that the entropy of the macroscopic world can be regarded as a measure of uncertainty in the microscopic quantum world. The entropy in the Sackur-Tetrode equation can thus be re-interpreted from a different viewpoint. The emphasis is on the connection between the entropy and the uncertainty relation in the quantum setting.

  11. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms’ mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

  12. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

    In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic–prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic–syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and, 79.8% and 90.3% on the Boston Directions corpus. The phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083

  13. The Thermodynamics of Black Holes.

    PubMed

    Wald, Robert M

    2001-01-01

    We review the present status of black hole thermodynamics. Our review includes discussion of classical black hole thermodynamics, Hawking radiation from black holes, the generalized second law, and the issue of entropy bounds. A brief survey also is given of approaches to the calculation of black hole entropy. We conclude with a discussion of some unresolved open issues.

  14. Holographic equipartition and the maximization of entropy

    NASA Astrophysics Data System (ADS)

    Krishna, P. B.; Mathew, Titus K.

    2017-09-01

    The accelerated expansion of the Universe can be interpreted as a tendency to satisfy holographic equipartition. It can be expressed by a simple law, ΔV = Δt (N_surf − ε N_bulk), where V is the Hubble volume in Planck units, t is the cosmic time in Planck units, and N_surf/bulk is the number of degrees of freedom on the horizon/in the bulk of the Universe. We show that this holographic equipartition law effectively implies the maximization of entropy. In the cosmological context, a system that obeys the holographic equipartition law behaves as an ordinary macroscopic system that proceeds to an equilibrium state of maximum entropy. We consider the standard ΛCDM model of the Universe and show that it is consistent with the holographic equipartition law. Analyzing the entropy evolution, we find that it also proceeds to an equilibrium state of maximum entropy.

  15. Entropic Inference

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops, the Maximum Entropy and the Bayesian methods, into a single general inference scheme.

  16. Entanglement entropy of ABJM theory and entropy of topological black hole

    NASA Astrophysics Data System (ADS)

    Nian, Jun; Zhang, Xinyu

    2017-07-01

    In this paper we discuss the supersymmetric localization of the 4D N = 2 off-shell gauged supergravity on the background of the AdS4 neutral topological black hole, which is the gravity dual of the ABJM theory defined on the boundary S^1 × H^2. We compute the large-N expansion of the supergravity partition function. The result gives the black hole entropy with its logarithmic correction, which matches the previous result for the entanglement entropy of the ABJM theory up to some stringy effects. Our result is consistent with the previous on-shell one-loop computation of the logarithmic correction to black hole entropy. It provides an explicit example of the identification of the entanglement entropy of the boundary conformal field theory with the bulk black hole entropy beyond the leading order given by the classical Bekenstein-Hawking formula, and consequently tests the AdS/CFT correspondence at subleading order.

  17. Geometric entropy and edge modes of the electromagnetic field

    NASA Astrophysics Data System (ADS)

    Donnelly, William; Wall, Aron C.

    2016-11-01

    We calculate the vacuum entanglement entropy of Maxwell theory in a class of curved spacetimes by Kaluza-Klein reduction of the theory onto a two-dimensional base manifold. Using two-dimensional duality, we express the geometric entropy of the electromagnetic field as the entropy of a tower of scalar fields, constant electric and magnetic fluxes, and a contact term, whose leading-order divergence was discovered by Kabat. The complete contact term takes the form of one negative scalar degree of freedom confined to the entangling surface. We show that the geometric entropy agrees with a statistical definition of entanglement entropy that includes edge modes: classical solutions determined by their boundary values on the entangling surface. This resolves a long-standing puzzle about the statistical interpretation of the contact term in the entanglement entropy. We discuss the implications of this negative term for black hole thermodynamics and the renormalization of Newton's constant.

  18. Statistical mechanics in the context of special relativity. II.

    PubMed

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.

  19. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Treesearch

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...

  20. A unified approach to computational drug discovery.

    PubMed

    Tseng, Chih-Yuan; Tuszynski, Jack

    2015-11-01

    It has been reported that a slowdown in the development of new medical therapies is affecting clinical outcomes. The FDA has thus initiated the Critical Path Initiative project investigating better approaches. We review the current strategies in drug discovery and focus on the advantages of the maximum entropy method being introduced in this area. The maximum entropy principle is derived from statistical thermodynamics and has been demonstrated to be an inductive inference tool. We propose a unified method to drug discovery that hinges on robust information processing using entropic inductive inference. Increasingly, applications of maximum entropy in drug discovery employ this unified approach and demonstrate the usefulness of the concept in the area of pharmaceutical sciences. Copyright © 2015. Published by Elsevier Ltd.

  1. Monte Carlo simulation of a noisy quantum channel with memory.

    PubMed

    Akhalwaya, Ismail; Moodley, Mervlyn; Petruccione, Francesco

    2015-10-01

    The classical capacity of quantum channels is well understood for channels with uncorrelated noise. For the case of correlated noise, however, there are still open questions. We calculate the classical capacity of a forgetful channel constructed by Markov switching between two depolarizing channels. Techniques have previously been applied to approximate the output entropy of this channel and thus its capacity. In this paper, we use a Metropolis-Hastings Monte Carlo approach to numerically calculate the entropy. The algorithm is implemented in parallel and its performance is studied and optimized. The effects of memory on the capacity are explored and previous results are confirmed to higher precision.

  2. Thermalization dynamics in a quenched many-body state

    NASA Astrophysics Data System (ADS)

    Kaufman, Adam; Preiss, Philipp; Tai, Eric; Lukin, Alex; Rispoli, Matthew; Schittko, Robert; Greiner, Markus

    2016-05-01

    Quantum and classical many-body systems appear to have disparate behavior due to the different mechanisms that govern their evolution. The dynamics of a classical many-body system equilibrate to maximally entropic states and quickly re-thermalize when perturbed. The assumptions of ergodicity and unbiased configurations lead to a successful framework of describing classical systems by a sampling of thermal ensembles that are blind to the system's microscopic details. By contrast, an isolated quantum many-body system is governed by unitary evolution: the system retains memory of past dynamics and constant global entropy. However, even with differing characteristics, the long-term behavior for local observables in quenched, non-integrable quantum systems are often well described by the same thermal framework. We explore the onset of this convergence in a many-body system of bosonic atoms in an optical lattice. Our system's finite size allows us to verify full state purity and measure local observables. We observe rapid growth and saturation of the entanglement entropy with constant global purity. The combination of global purity and thermalized local observables agree with the Eigenstate Thermalization Hypothesis in the presence of a near-volume law in the entanglement entropy.

  3. Entropy-driven phase transitions of entanglement

    NASA Astrophysics Data System (ADS)

    Facchi, Paolo; Florio, Giuseppe; Parisi, Giorgio; Pascazio, Saverio; Yuasa, Kazuya

    2013-05-01

    We study the behavior of bipartite entanglement at fixed von Neumann entropy. We look at the distribution of the entanglement spectrum, that is, the eigenvalues of the reduced density matrix of a quantum system in a pure state. We report the presence of two continuous phase transitions, characterized by different entanglement spectra, which are deformations of classical eigenvalue distributions.

  4. A Formal Derivation of the Gibbs Entropy for Classical Systems Following the Schrodinger Quantum Mechanical Approach

    ERIC Educational Resources Information Center

    Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.

    2008-01-01

    In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenious, the subtleness of…

  5. Maximization of the Thermoelectric Cooling of a Graded Peltier Device by Analytical Heat-Equation Resolution

    NASA Astrophysics Data System (ADS)

    Thiébaut, E.; Goupil, C.; Pesty, F.; D'Angelo, Y.; Guegan, G.; Lecoeur, P.

    2017-12-01

    Increasing the maximum cooling effect of a Peltier cooler can be achieved through material and device design. The use of inhomogeneous, functionally graded materials may be adopted in order to increase maximum cooling without improvement of the ZT (figure of merit); however, these systems are usually based on the assumption that the local optimization of the ZT is the suitable criterion to increase thermoelectric performance. We solve the heat equation in a graded material and perform both analytical and numerical analysis of a graded Peltier cooler. We find a local criterion that we use to assess the possible improvement of graded materials for thermoelectric cooling. A fair improvement of the cooling effect (up to 36%) is predicted for semiconductor materials, and the best graded system for cooling is described. The influence of the equation of state of the electronic gas of the material is discussed, and the difference in terms of entropy production between the graded and the classical system is also described.

  6. Economics and Maximum Entropy Production

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.

    2003-04-01

    Price differentials, sales volume and profit can be seen as analogues of temperature difference, heat flow and work or entropy production in the climate system. One aspect in which economic systems exhibit more clarity than the climate is that the empirical and/or statistical mechanical tendency for systems to seek a maximum in production is very evident in economics, in that the profit motive is very clear. Noting the common link of 1/f noise, power laws and Self-Organized Criticality with Maximum Entropy Production, the power law fluctuations in security and commodity prices are not inconsistent with the analogy. There is an additional thermodynamic analogy, in that scarcity is valued. A commodity concentrated among a few traders is valued highly by the many who do not have it. The market therefore encourages via prices the spreading of those goods among a wider group, just as heat tends to diffuse, increasing entropy. I explore some empirical price-volume relationships of metals and meteorites in this context.

  7. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such state we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such reduced metabolic network metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state when the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
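
    The generic construction behind this statement can be sketched as follows (illustrative only, with hypothetical per-mode entropy production rates rather than the measured T. saccharolyticum fluxes): the maximum entropy distribution of elementary mode usage consistent with a fixed network-average rate has the Boltzmann form, with the Lagrange multiplier set by the constraint.

        import numpy as np
        from scipy.optimize import brentq

        # Hypothetical entropy production rates of five elementary modes (arbitrary units).
        sigma = np.array([0.2, 0.5, 1.0, 1.4, 2.0])
        target_mean = 1.2                                # constrained network-average rate

        def mean_rate(beta):
            """Average rate under the Boltzmann-form usage distribution p_k ∝ exp(beta * sigma_k)."""
            w = np.exp(beta * (sigma - sigma.max()))     # numerically stabilized weights
            return (w / w.sum()) @ sigma

        # Lagrange multiplier that reproduces the constrained average.
        beta = brentq(lambda b: mean_rate(b) - target_mean, -50.0, 50.0)
        w = np.exp(beta * (sigma - sigma.max()))
        p = w / w.sum()
        print(np.round(p, 3), round(beta, 3))
        # As beta grows, usage concentrates on the highest-rate modes and the network-average
        # entropy production approaches the maximum attainable value, mirroring the trend
        # reported over the course of the adaptive evolution experiment.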

  8. Achieving the classical Carnot efficiency in a strongly coupled quantum heat engine

    NASA Astrophysics Data System (ADS)

    Xu, Y. Y.; Chen, B.; Liu, J.

    2018-02-01

    Generally, the efficiency of a heat engine strongly coupled with a heat bath is less than the classical Carnot efficiency. Through a model-independent method, we show that the classical Carnot efficiency is achieved in a strongly coupled quantum heat engine. First, we present the first law of quantum thermodynamics in strong coupling. Then, we show how to achieve the Carnot cycle and the classical Carnot efficiency at strong coupling. We find that this classical Carnot efficiency stems from the fact that the heat released in a nonequilibrium process is balanced by the absorbed heat. We also analyze the restrictions in the achievement of the Carnot cycle. The first restriction is that there must be two corresponding intervals of the controllable parameter in which the corresponding entropies of the work substance at the hot and cold temperatures are equal, and the second is that the entropy of the initial and final states in a nonequilibrium process must be equal. Through these restrictions, we obtain the positive work conditions, including the usual one in which the hot temperature should be higher than the cold, and a new one in which there must be an entropy interval at the hot temperature overlapping that at the cold. We demonstrate our result through a paradigmatic model—a two-level system in which a work substance strongly interacts with a heat bath. In this model, we find that the efficiency may abruptly decrease to zero due to the first restriction, and that the second restriction results in the control scheme becoming complex.

  9. Achieving the classical Carnot efficiency in a strongly coupled quantum heat engine.

    PubMed

    Xu, Y Y; Chen, B; Liu, J

    2018-02-01

    Generally, the efficiency of a heat engine strongly coupled with a heat bath is less than the classical Carnot efficiency. Through a model-independent method, we show that the classical Carnot efficiency is achieved in a strongly coupled quantum heat engine. First, we present the first law of quantum thermodynamics in strong coupling. Then, we show how to achieve the Carnot cycle and the classical Carnot efficiency at strong coupling. We find that this classical Carnot efficiency stems from the fact that the heat released in a nonequilibrium process is balanced by the absorbed heat. We also analyze the restrictions in the achievement of the Carnot cycle. The first restriction is that there must be two corresponding intervals of the controllable parameter in which the corresponding entropies of the work substance at the hot and cold temperatures are equal, and the second is that the entropy of the initial and final states in a nonequilibrium process must be equal. Through these restrictions, we obtain the positive work conditions, including the usual one in which the hot temperature should be higher than the cold, and a new one in which there must be an entropy interval at the hot temperature overlapping that at the cold. We demonstrate our result through a paradigmatic model-a two-level system in which a work substance strongly interacts with a heat bath. In this model, we find that the efficiency may abruptly decrease to zero due to the first restriction, and that the second restriction results in the control scheme becoming complex.

  10. Stochastic thermodynamics, fluctuation theorems and molecular machines.

    PubMed

    Seifert, Udo

    2012-12-01

    Stochastic thermodynamics as reviewed here systematically provides a framework for extending the notions of classical thermodynamics such as work, heat and entropy production to the level of individual trajectories of well-defined non-equilibrium ensembles. It applies whenever a non-equilibrium process is still coupled to one (or several) heat bath(s) of constant temperature. Paradigmatic systems are single colloidal particles in time-dependent laser traps, polymers in external flow, enzymes and molecular motors in single molecule assays, small biochemical networks and thermoelectric devices involving single electron transport. For such systems, a first-law like energy balance can be identified along fluctuating trajectories. For a basic Markovian dynamics implemented either on the continuum level with Langevin equations or on a discrete set of states as a master equation, thermodynamic consistency imposes a local-detailed balance constraint on noise and rates, respectively. Various integral and detailed fluctuation theorems, which are derived here in a unifying approach from one master theorem, constrain the probability distributions for work, heat and entropy production depending on the nature of the system and the choice of non-equilibrium conditions. For non-equilibrium steady states, particularly strong results hold like a generalized fluctuation-dissipation theorem involving entropy production. Ramifications and applications of these concepts include optimal driving between specified states in finite time, the role of measurement-based feedback processes and the relation between dissipation and irreversibility. Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.

  11. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
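    As a rough illustration of the moment-constrained estimator discussed above, the sketch below maximizes entropy on a grid subject to a few moment constraints by minimizing the convex dual over the Lagrange multipliers; the function name, grid, and optimizer choice are illustrative assumptions, not the paper's implementation, and the Bayesian treatment of the multipliers is not reproduced.

        # Hypothetical sketch (not the paper's implementation): maximum-entropy density
        # estimation on a grid from a handful of moment constraints, via the convex dual.
        import numpy as np
        from scipy.optimize import minimize

        def maxent_density(x_grid, moments):
            """Density on x_grid maximizing entropy subject to E[x**k] = moments[k-1]."""
            K = len(moments)
            phi = np.vstack([x_grid ** (k + 1) for k in range(K)])   # moment features
            mu = np.asarray(moments, dtype=float)
            dx = x_grid[1] - x_grid[0]

            def dual(lam):
                # convex dual objective: log-partition function plus lambda . mu
                log_z = np.log(np.sum(np.exp(-lam @ phi)) * dx)
                return log_z + lam @ mu

            lam = minimize(dual, np.zeros(K), method="BFGS").x       # Lagrange multipliers
            p = np.exp(-lam @ phi)
            return p / (p.sum() * dx)

        # usage: constraining mean 0 and second moment 1 recovers a standard Gaussian
        x = np.linspace(-6.0, 6.0, 1201)
        p = maxent_density(x, moments=[0.0, 1.0])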

  12. Griffiths-like phase, critical behavior near the paramagnetic-ferromagnetic phase transition and magnetic entropy change of nanocrystalline La0.75Ca0.25MnO3

    NASA Astrophysics Data System (ADS)

    Phong, P. T.; Ngan, L. T. T.; Dang, N. V.; Nguyen, L. H.; Nam, P. H.; Thuy, D. M.; Tuan, N. D.; Bau, L. V.; Lee, I. J.

    2018-03-01

    In this work, we report the structural and magnetic properties of La0.75Ca0.25MnO3 nanoparticles synthesized by the sol-gel route. Rietveld refinement of X-ray powder diffraction confirms that our sample is single phase and crystallizes in an orthorhombic structure with the Pnma space group. The large effective magnetic moment and the deviation of the inverse susceptibility from the Curie-Weiss law indicate the presence of a Griffiths-like cluster phase. The critical exponents have been estimated using different techniques such as the modified Arrott plot, the Kouvel-Fisher plot and the critical isotherm technique. The critical exponent values of La0.75Ca0.25MnO3 are very close to those predicted by the mean-field model, which can be explained by the existence of long-range interactions between spins in this system. These results are in good agreement with those obtained from the critical exponents of the magnetic entropy change. The self-consistency and reliability of the critical exponents were verified by the Widom scaling law and the universal scaling hypothesis. Using the Harris criterion, we deduced that disorder is relevant in our case. The maximum magnetic entropy change (ΔSM) calculated from the M-H measurements is 3.47 J/kg K under an external field change of 5 T. The ΔSM-T curves collapse onto a single master curve regardless of composition and applied field, confirming that the magnetic ordering is of second-order nature. The result was compared with values calculated from the Arrott plot, and good concordance is observed. Moreover, the spontaneous magnetization obtained from the entropy change is in excellent agreement with that deduced by classical extrapolation of the Arrott curves. This confirms the validity of estimating the spontaneous magnetization from the magnetic entropy change.

  13. Time dependence of Hawking radiation entropy

    NASA Astrophysics Data System (ADS)

    Page, Don N.

    2013-09-01

    If a black hole starts in a pure quantum state and evaporates completely by a unitary process, the von Neumann entropy of the Hawking radiation initially increases and then decreases back to zero when the black hole has disappeared. Here numerical results are given for an approximation to the time dependence of the radiation entropy under an assumption of fast scrambling, for large nonrotating black holes that emit essentially only photons and gravitons. The maximum of the von Neumann entropy then occurs after about 53.81% of the evaporation time, when the black hole has lost about 40.25% of its original Bekenstein-Hawking (BH) entropy (an upper bound for its von Neumann entropy) and then has a BH entropy that equals the entropy in the radiation, which is about 59.75% of the original BH entropy 4πM_0^2, or about 7.509 M_0^2 ≈ 6.268 × 10^76 (M_0/M_solar)^2, using my 1976 calculations that the photon and graviton emission process into empty space gives about 1.4847 times the BH entropy loss of the black hole. Results are also given for black holes in initially impure states. If the black hole starts in a maximally mixed state, the von Neumann entropy of the Hawking radiation increases from zero up to a maximum of about 119.51% of the original BH entropy, or about 15.018 M_0^2 ≈ 1.254 × 10^77 (M_0/M_solar)^2, and then decreases back down to 4πM_0^2 = 1.049 × 10^77 (M_0/M_solar)^2.
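    As a quick consistency check of the quoted fractions, assuming only the 1.4847 radiation-to-black-hole entropy ratio stated in the abstract, the arithmetic below reproduces the 40.25%, 59.75% and 7.509 M_0^2 figures; it is a reader's back-of-the-envelope sketch, not the author's calculation.

        # Reader's back-of-the-envelope check, assuming only the 1.4847 ratio quoted above.
        import numpy as np
        r = 1.4847                      # radiation entropy gained per unit of BH entropy lost
        f = 1.0 / (1.0 + r)             # fraction lost when S_radiation equals the remaining S_BH
        print(f)                        # ~0.4025, i.e. about 40.25% of the original BH entropy lost
        print(1.0 - f)                  # ~0.5975, radiation entropy as a fraction of the original
        print((1.0 - f) * 4 * np.pi)    # ~7.51 M_0^2, matching the quoted 7.509 M_0^2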

  14. Using maximum entropy modeling to identify and prioritize red spruce forest habitat in West Virginia

    Treesearch

    Nathan R. Beane; James S. Rentch; Thomas M. Schuler

    2013-01-01

    Red spruce forests in West Virginia are found in island-like distributions at high elevations and provide essential habitat for the endangered Cheat Mountain salamander and the recently delisted Virginia northern flying squirrel. Therefore, it is important to identify restoration priorities of red spruce forests. Maximum entropy modeling was used to identify areas of...

  15. Maximum entropy PDF projection: A review

    NASA Astrophysics Data System (ADS)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T (x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.
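    A hedged illustration of the projection idea (the reader's own construction, not code from the review): with a standard-normal reference on R^2 and the feature z = ||x||, the induced feature density is Rayleigh, and the projected density and a sampler for it take the simple form below; the function names and the choice of reference are assumptions.

        # Own illustrative construction of PDF projection on R^2 with feature z = ||x||,
        # using a standard-normal reference (whose induced feature density is Rayleigh).
        import numpy as np

        def logp_projected(x, logp_z):
            """Log of the projected density p(x) = g(x) / g_z(T(x)) * p_z(T(x))."""
            z = np.linalg.norm(x, axis=1)
            log_g = -0.5 * np.sum(x**2, axis=1) - np.log(2 * np.pi)   # reference g(x) = N(0, I)
            log_gz = np.log(z) - 0.5 * z**2                           # induced Rayleigh density
            return log_g - log_gz + logp_z(z)

        def sample_projected(n, sample_pz, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            z = sample_pz(n)                               # draw the feature from the target p(z)
            theta = rng.uniform(0.0, 2 * np.pi, n)         # direction left maximally noncommittal
            return np.column_stack([z * np.cos(theta), z * np.sin(theta)])

        # usage: impose an exponential feature density on z = ||x||
        xs = sample_projected(1000, lambda n: np.random.exponential(1.0, n))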

  16. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    NASA Technical Reports Server (NTRS)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  17. Entanglement entropy of electromagnetic edge modes.

    PubMed

    Donnelly, William; Wall, Aron C

    2015-03-20

    The vacuum entanglement entropy of Maxwell theory, when evaluated by standard methods, contains an unexpected term with no known statistical interpretation. We resolve this two-decades-old puzzle by showing that this term is the entanglement entropy of edge modes: classical solutions determined by the electric field normal to the entangling surface. We explain how the heat kernel regularization applied to this term leads to the negative divergent expression found by Kabat. This calculation also resolves a recent puzzle concerning the logarithmic divergences of gauge fields in 3+1 dimensions.

  18. The stochastic thermodynamics of a rotating Brownian particle in a gradient flow

    PubMed Central

    Lan, Yueheng; Aurell, Erik

    2015-01-01

    We compute the entropy production engendered in the environment from a single Brownian particle which moves in a gradient flow, and show that it corresponds in expectation to classical near-equilibrium entropy production in the surrounding fluid with specific mesoscopic transport coefficients. With temperature gradient, extra terms are found which result from the nonlinear interaction between the particle and the non-equilibrated environment. The calculations are based on the fluctuation relations which relate entropy production to the probabilities of stochastic paths and carried out in a multi-time formalism. PMID:26194015

  19. n-Order and maximum fuzzy similarity entropy for discrimination of signals of different complexity: Application to fetal heart rate signals.

    PubMed

    Zaylaa, Amira; Oudjemia, Souad; Charara, Jamal; Girault, Jean-Marc

    2015-09-01

    This paper presents two new concepts for the discrimination of signals of different complexity. The first addresses the problem of setting entropy descriptors by varying the pattern size instead of the tolerance, which leads to a search for the optimal pattern size that maximizes the similarity entropy. The second is based on the n-order similarity entropy, which encompasses the 1-order similarity entropy. To improve statistical stability, an n-order fuzzy similarity entropy was proposed. Fractional Brownian motion was simulated to validate the proposed methods, and fetal heart rate signals were used to discriminate normal from abnormal fetuses. In all cases, it was possible to discriminate time series of different complexity, such as fractional Brownian motion and fetal heart rate signals. The best levels of performance in terms of sensitivity (90%) and specificity (90%) were obtained with the n-order fuzzy similarity entropy. However, it was shown that the optimal pattern size and the maximum similarity measurement are related to intrinsic features of the time series. Copyright © 2015 Elsevier Ltd. All rights reserved.
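    For orientation, the sketch below implements a generic fuzzy sample-entropy estimator with the usual exponential membership function exp(-d^n/r); the paper's n-order construction, its pattern-size optimization, and its parameter settings are not reproduced, and all names are illustrative.

        # Generic fuzzy sample-entropy sketch with the usual exponential membership function;
        # the paper's n-order variant and its pattern-size search are not reproduced here.
        import numpy as np

        def fuzzy_sample_entropy(x, m=2, r=0.2, n=2):
            x = np.asarray(x, dtype=float)
            tol = r * np.std(x)

            def phi(mm):
                # all length-mm templates, locally mean-removed as in fuzzy-entropy variants
                templ = np.array([x[i:i + mm] - np.mean(x[i:i + mm]) for i in range(len(x) - mm)])
                d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)  # Chebyshev distance
                sim = np.exp(-(d ** n) / tol)             # fuzzy membership of each template pair
                np.fill_diagonal(sim, 0.0)                # exclude self-matches
                return sim.sum() / (len(templ) * (len(templ) - 1))

            return -np.log(phi(m + 1) / phi(m))

        # usage: white noise should score higher than a smoother, more predictable series
        rng = np.random.default_rng(0)
        print(fuzzy_sample_entropy(rng.standard_normal(500)))
        print(fuzzy_sample_entropy(np.cumsum(rng.standard_normal(500))))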

  20. Mathematical and information-geometrical entropy for phenomenological Fourier and non-Fourier heat conduction

    NASA Astrophysics Data System (ADS)

    Li, Shu-Nan; Cao, Bing-Yang

    2017-09-01

    The second law of thermodynamics governs the direction of heat transport, which provides the foundational definition of thermodynamic Clausius entropy. The definitions of entropy are further generalized for the phenomenological heat transport models in the frameworks of classical irreversible thermodynamics and extended irreversible thermodynamics (EIT). In this work, entropic functions from mathematics are combined with phenomenological heat conduction models and connected to several information-geometrical conceptions. The long-time behaviors of these mathematical entropies exhibit a wide diversity and physical pictures in phenomenological heat conductions, including the tendency to thermal equilibrium, and exponential decay of nonequilibrium and asymptotics, which build a bridge between the macroscopic and microscopic modelings. In contrast with the EIT entropies, the mathematical entropies expressed in terms of the internal energy function can avoid singularity paired with nonpositive local absolute temperature caused by non-Fourier heat conduction models.

  1. Information theory lateral density distribution for Earth inferred from global gravity field

    NASA Technical Reports Server (NTRS)

    Rubincam, D. P.

    1981-01-01

    Information Theory Inference, better known as the Maximum Entropy Method, was used to infer the lateral density distribution inside the Earth. The approach assumed that the Earth consists of indistinguishable Maxwell-Boltzmann particles populating infinitesimal volume elements, and followed the standard methods of statistical mechanics (maximizing the entropy function). The GEM 10B spherical harmonic gravity field coefficients, complete to degree and order 36, were used as constraints on the lateral density distribution. The spherically symmetric part of the density distribution was assumed to be known. The lateral density variation was assumed to be small compared to the spherically symmetric part. The resulting information theory density distributions for the cases of no crust removed, 30 km of compensated crust removed, and 30 km of uncompensated crust removed all gave broad density anomalies extending deep into the mantle, but with the density contrasts being greatest towards the surface (typically ± 0.004 g/cm³ in the first two cases and ± 0.04 g/cm³ in the third). None of the density distributions resemble classical organized convection cells. The information theory approach may have use in choosing Standard Earth Models, but the inclusion of seismic data into the approach appears difficult.

  2. Horizon Entropy from Quantum Gravity Condensates.

    PubMed

    Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo

    2016-05-27

    We construct condensate states encoding the continuum spherically symmetric quantum geometry of a horizon in full quantum gravity, i.e., without any classical symmetry reduction, in the group field theory formalism. Tracing over the bulk degrees of freedom, we show how the resulting reduced density matrix manifestly exhibits a holographic behavior. We derive a complete orthonormal basis of eigenstates for the reduced density matrix of the horizon and use it to compute the horizon entanglement entropy. By imposing consistency with the horizon boundary conditions and semiclassical thermodynamical properties, we recover the Bekenstein-Hawking entropy formula for any value of the Immirzi parameter. Our analysis supports the equivalence between the von Neumann (entanglement) entropy interpretation and the Boltzmann (statistical) one.

  3. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

    DTIC Science & Technology

    2017-08-21

    distributions, and we discuss some applications for engineered and biological information transmission systems. Keywords: information theory; minimum...of its interpretation as a measure of the amount of information communicable by a neural system to groups of downstream neurons. Previous authors...of the maximum entropy approach. Our results also have relevance for engineered information transmission systems. We show that empirically measured

  4. Interatomic potentials in condensed matter via the maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Carlsson, A. E.

    1987-09-01

    A general method is described for the calculation of interatomic potentials in condensed-matter systems by use of a maximum-entropy Ansatz for the interatomic correlation functions. The interatomic potentials are given explicitly in terms of statistical correlation functions involving the potential energy and the structure factor of a "reference medium." Illustrations are given for Al-Cu alloys and a model transition metal.

  5. Stochastic characteristics of different duration annual maximum rainfall and its spatial difference in China based on information entropy

    NASA Astrophysics Data System (ADS)

    Li, X.; Sang, Y. F.

    2017-12-01

    Mountain torrents, urban floods and other disasters caused by extreme precipitation bring great losses to the ecological environment, social and economic development, and people's lives and property. The study of the spatial distribution of extreme precipitation is therefore of great significance for flood prevention and control. Based on annual maximum rainfall data for 60-min, 6-h and 24-h durations, the paper generates long sequences following the Pearson-III distribution and then uses an information entropy index to study the spatial distribution and its differences across durations. The results show that the information entropy of annual maximum rainfall in the southern region is greater than that in the northern region, indicating more pronounced stochastic characteristics of annual maximum rainfall in the latter. However, the spatial distribution of these stochastic characteristics differs among durations. For example, the stochastic characteristics of the 60-min annual maximum rainfall in Eastern Tibet are weaker than in the surrounding area, whereas those of the 6-h and 24-h annual maximum rainfall are stronger. In the Haihe and Huaihe River Basins, the stochastic characteristics of the 60-min annual maximum rainfall do not differ significantly from the surrounding area, while those of the 6-h and 24-h rainfall are weaker. We conclude that the spatial distribution of information entropy values of annual maximum rainfall for different durations reflects the spatial distribution of its stochastic characteristics, and the results can thus provide an important scientific basis for flood prevention and control, agriculture, socio-economic development, and urban waterlogging control.
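    A minimal sketch of the kind of workflow described (assumed, not the authors' code): fit a Pearson-III distribution to an annual-maximum series and report its differential entropy. SciPy's pearson3 is used as a stand-in and the sample data are synthetic.

        # Assumed workflow sketch (not the authors' code): fit a Pearson-III distribution
        # to an annual-maximum-rainfall series and report its information (differential) entropy.
        import numpy as np
        from scipy.stats import pearson3

        def rainfall_entropy(annual_maxima):
            skew, loc, scale = pearson3.fit(annual_maxima)
            return pearson3(skew, loc=loc, scale=scale).entropy()   # in nats

        # usage with synthetic data standing in for, e.g., a 60-min annual maximum series
        sample = pearson3.rvs(0.8, loc=30.0, scale=10.0, size=200, random_state=1)
        print(rainfall_entropy(sample))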

  6. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed-form solution is obtained for several wing configurations, including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.

  7. Bayesian view of single-qubit clocks, and an energy versus accuracy tradeoff

    NASA Astrophysics Data System (ADS)

    Gopalkrishnan, Manoj; Kandula, Varshith; Sriram, Praveen; Deshpande, Abhishek; Muralidharan, Bhaskaran

    2017-09-01

    We bring a Bayesian approach to the analysis of clocks. Using exponential distributions as priors for clocks, we analyze how well one can keep time with a single qubit freely precessing under a magnetic field. We find that, at least with a single qubit, quantum mechanics does not allow exact timekeeping, in contrast to classical mechanics, which does. We find the design of the single-qubit clock that leads to maximum accuracy. Further, we find an energy versus accuracy tradeoff—the energy cost is at least kBT times the improvement in accuracy as measured by the entropy reduction in going from the prior distribution to the posterior distribution. We propose a physical realization of the single-qubit clock using charge transport across a capacitively coupled quantum dot.

  8. Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies: a basis for q-exponential distributions.

    PubMed

    Abe, Sumiyoshi

    2002-10-01

    The q-exponential distributions, which are generalizations of the Zipf-Mandelbrot power-law distribution, are frequently encountered in complex systems at their stationary states. From the viewpoint of the principle of maximum entropy, they can apparently be derived from three different generalized entropies: the Rényi entropy, the Tsallis entropy, and the normalized Tsallis entropy. Accordingly, mere fittings of observed data by the q-exponential distributions do not lead to identification of the correct physical entropy. Here, stabilities of these entropies, i.e., their behaviors under arbitrary small deformation of a distribution, are examined. It is shown that, among the three, the Tsallis entropy is stable and can provide an entropic basis for the q-exponential distributions, whereas the others are unstable and cannot represent any experimentally observable quantities.
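    For reference, the three entropies compared in the abstract have the standard discrete forms sketched below (q ≠ 1); this is a generic implementation of the textbook definitions, not code from the paper.

        # Standard discrete definitions of the three entropies compared above (q != 1);
        # generic textbook forms, not code from the paper.
        import numpy as np

        def renyi(p, q):
            return np.log(np.sum(p ** q)) / (1.0 - q)

        def tsallis(p, q):
            return (1.0 - np.sum(p ** q)) / (q - 1.0)

        def normalized_tsallis(p, q):
            return tsallis(p, q) / np.sum(p ** q)

        # usage: a small distribution evaluated at two values of q
        p = np.array([0.5, 0.3, 0.2])
        for q in (0.5, 2.0):
            print(q, renyi(p, q), tsallis(p, q), normalized_tsallis(p, q))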

  9. Time dependence of Hawking radiation entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, Don N., E-mail: profdonpage@gmail.com

    2013-09-01

    If a black hole starts in a pure quantum state and evaporates completely by a unitary process, the von Neumann entropy of the Hawking radiation initially increases and then decreases back to zero when the black hole has disappeared. Here numerical results are given for an approximation to the time dependence of the radiation entropy under an assumption of fast scrambling, for large nonrotating black holes that emit essentially only photons and gravitons. The maximum of the von Neumann entropy then occurs after about 53.81% of the evaporation time, when the black hole has lost about 40.25% of its original Bekenstein-Hawking (BH) entropy (an upper bound for its von Neumann entropy) and then has a BH entropy that equals the entropy in the radiation, which is about 59.75% of the original BH entropy 4πM_0^2, or about 7.509 M_0^2 ≈ 6.268 × 10^76 (M_0/M_sun)^2, using my 1976 calculations that the photon and graviton emission process into empty space gives about 1.4847 times the BH entropy loss of the black hole. Results are also given for black holes in initially impure states. If the black hole starts in a maximally mixed state, the von Neumann entropy of the Hawking radiation increases from zero up to a maximum of about 119.51% of the original BH entropy, or about 15.018 M_0^2 ≈ 1.254 × 10^77 (M_0/M_sun)^2, and then decreases back down to 4πM_0^2 = 1.049 × 10^77 (M_0/M_sun)^2.

  10. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models

    PubMed Central

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-01-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition. PMID:28968396
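    To make the Glauber dynamics discussed above concrete, the sketch below samples a pairwise maximum-entropy (Ising-like) model of binary units and records the population-averaged activity whose bimodality is at issue; the fields, couplings and sizes are illustrative assumptions, and no Boltzmann learning is performed.

        # Illustrative Glauber sampler for a pairwise maximum-entropy (Ising-like) model of
        # binary units; fields h and couplings J are assumed given here, not learned from data.
        import numpy as np

        def glauber_sample(h, J, steps=20000, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            n = len(h)
            s = rng.integers(0, 2, size=n)                   # binary states in {0, 1}
            activity = np.empty(steps)
            for t in range(steps):
                i = rng.integers(n)
                field = h[i] + J[i] @ s - J[i, i] * s[i]     # local field from the other units
                p_on = 1.0 / (1.0 + np.exp(-field))          # conditional probability that s_i = 1
                s[i] = rng.random() < p_on
                activity[t] = s.mean()                        # population-averaged activity
            return activity

        # usage: weak random symmetric couplings and slightly negative fields
        rng = np.random.default_rng(0)
        n = 50
        J = rng.normal(0.0, 0.1 / np.sqrt(n), (n, n))
        J = (J + J.T) / 2.0
        np.fill_diagonal(J, 0.0)
        trace = glauber_sample(np.full(n, -1.0), J, rng=rng)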

  11. Combining Experiments and Simulations Using the Maximum Entropy Principle

    PubMed Central

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are at not in complete and quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges. PMID:24586124
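    A minimal sketch of the reweighting flavour of the maximum-entropy procedure (one observable, one Lagrange multiplier) is given below; the observable, target value and function names are placeholders, and the perturbed-potential formulations discussed in the cited papers are not reproduced.

        # Minimal sketch of maximum-entropy reweighting of simulation frames so that the
        # reweighted average of one observable matches an experimental value; names are generic.
        import numpy as np
        from scipy.optimize import brentq

        def maxent_reweight(obs, target):
            """Return frame weights w_i proportional to exp(-lambda * obs_i) with <obs>_w = target."""
            def gap(lam):
                w = np.exp(-lam * obs)
                w /= w.sum()
                return w @ obs - target
            lam = brentq(gap, -50.0, 50.0)      # one constraint, one Lagrange multiplier
            w = np.exp(-lam * obs)
            return w / w.sum()

        # usage: shift a simulated average from about 1.0 down to the "experimental" 0.8
        obs = np.random.default_rng(1).normal(1.0, 0.3, size=5000)
        w = maxent_reweight(obs, 0.8)
        print(w @ obs)   # ~0.8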

  12. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models.

    PubMed

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-10-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.

  13. [Maximum entropy model versus remote sensing-based methods for extracting Oncomelania hupensis snail habitats].

    PubMed

    Cong-Cong, Xia; Cheng-Fang, Lu; Si, Li; Tie-Jun, Zhang; Sui-Heng, Lin; Yi, Hu; Ying, Liu; Zhi-Jie, Zhang

    2016-12-02

    To explore the maximum entropy model as a technique for extracting Oncomelania hupensis snail habitats in the Poyang Lake zone. Information on snail habitats and related environmental factors collected in the Poyang Lake zone was integrated to set up the maximum entropy based species model and generate a snail habitat distribution map. Two Landsat 7 ETM+ remote sensing images of the wet and dry seasons in the Poyang Lake zone were obtained, and two indices, the modified normalized difference water index (MNDWI) and the normalized difference vegetation index (NDVI), were applied to extract snail habitats. The ROC curve, sensitivities and specificities were used to assess the results, and the importance of the variables for snail habitats was analyzed using a jackknife approach. The evaluation showed that the area under the receiver operating characteristic curve (AUC) for the testing data of the remote sensing-based method was only 0.56, with a sensitivity of 0.23 and a specificity of 0.89. By contrast, the corresponding values for the maximum entropy model were 0.876, 0.89 and 0.74, respectively. Snail habitats in the Poyang Lake zone were mainly concentrated in the northeastern part of Yongxiu County, the northwest of Yugan County, the southwest of Poyang County and the middle of Xinjian County; elevation was the most important environmental variable affecting the distribution of snails, followed by land surface temperature (LST). The maximum entropy model is more reliable and accurate than the remote sensing-based method for extracting snail habitats, which can guide relevant departments in implementing measures to prevent and control high-risk snail habitats.
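    For the remote-sensing side of the comparison, the two indices mentioned reduce to simple band arithmetic; the sketch below uses generic band arrays and illustrative thresholds, since the study's actual Landsat 7 ETM+ band handling and thresholds are not given here.

        # Generic band arithmetic for the two indices mentioned; band arrays and the
        # thresholds below are illustrative assumptions, not the study's processing chain.
        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red + 1e-9)

        def mndwi(green, swir):
            return (green - swir) / (green + swir + 1e-9)

        # usage on toy reflectance arrays: flag pixels that look both vegetated and wet
        green, red, nir, swir = np.random.default_rng(0).random((4, 100, 100))
        candidate = (ndvi(nir, red) > 0.2) & (mndwi(green, swir) > 0.0)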

  14. Identification of Langmuir wave turbulence-supercontinuum transition by application of von Neumann entropy

    NASA Astrophysics Data System (ADS)

    Kawamori, Eiichirou

    2017-09-01

    A transition from Langmuir wave turbulence (LWT) to coherent Langmuir wave supercontinuum (LWSC) is identified in one-dimensional particle-in-cell simulations as the emergence of a broad frequency band showing significant temporal coherence of a wave field accompanied by a decrease in the von Neumann entropy of classical wave fields. The concept of the von Neumann entropy is utilized for evaluation of the phase-randomizing degree of the classical wave fields, together with introduction of the density matrix of the wave fields. The transition from LWT to LWSC takes place when the energy per one plasmon (one wave quantum) exceeds a certain threshold. The coherent nature, which Langmuir wave systems acquire through the transition, is created by four wave mixings of the plasmons. The emergence of temporal coherence and the decrease in the phase randomization are considered as the development of long-range order and spontaneous symmetry breaking, respectively, indicating that the LWT-LWSC transition is a second order phase transition phenomenon.

  15. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.

  16. Cosmic equilibration: A holographic no-hair theorem from the generalized second law

    NASA Astrophysics Data System (ADS)

    Carroll, Sean M.; Chatwin-Davies, Aidan

    2018-02-01

    In a wide class of cosmological models, a positive cosmological constant drives cosmological evolution toward an asymptotically de Sitter phase. Here we connect this behavior to the increase of entropy over time, based on the idea that de Sitter spacetime is a maximum-entropy state. We prove a cosmic no-hair theorem for Robertson-Walker and Bianchi I spacetimes that admit a Q-screen ("quantum" holographic screen) with certain entropic properties: If generalized entropy, in the sense of the cosmological version of the generalized second law conjectured by Bousso and Engelhardt, increases up to a finite maximum value along the screen, then the spacetime is asymptotically de Sitter in the future. Moreover, the limiting value of generalized entropy coincides with the de Sitter horizon entropy. We do not use the Einstein field equations in our proof, nor do we assume the existence of a positive cosmological constant. As such, asymptotic relaxation to a de Sitter phase can, in a precise sense, be thought of as cosmological equilibration.

  17. Pareto versus lognormal: A maximum entropy test

    NASA Astrophysics Data System (ADS)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  18. Statistical mechanical theory of liquid entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, D.C.

    The multiparticle correlation expansion for the entropy of a classical monatomic liquid is presented. This entropy expresses the physical picture in which there is no free particle motion, but rather, each atom moves within a cage formed by its neighbors. The liquid expansion, including only pair correlations, gives an excellent account of the experimental entropy of most liquid metals, of liquid argon, and the hard sphere liquid. The pair correlation entropy is well approximated by a universal function of temperature. Higher order correlation entropy, due to n-particle irreducible correlations for n ≥ 3, is significant in only a few liquid metals, and its occurrence suggests the presence of n-body forces. When the liquid theory is applied to the study of melting, the author discovers the important classification of normal and anomalous melting, according to whether there is not or is a significant change in the electronic structure upon melting, and he discovers the universal disordering entropy for melting of a monatomic crystal. Interesting directions for future research are: extension to include orientational correlations of molecules, theoretical calculation of the entropy of water, application to the entropy of the amorphous state, and correlational entropy of compressed argon. The author clarifies the relation among different entropy expansions in the recent literature.

  19. Formal groups and Z-entropies

    PubMed Central

    2016-01-01

    We shall prove that the celebrated Rényi entropy is the first example of a new family of infinitely many multi-parametric entropies. We shall call them the Z-entropies. Each of them, under suitable hypotheses, generalizes the celebrated entropies of Boltzmann and Rényi. A crucial aspect is that every Z-entropy is composable (Tempesta 2016 Ann. Phys. 365, 180–197. (doi:10.1016/j.aop.2015.08.013)). This property means that the entropy of a system which is composed of two or more independent systems depends, in all the associated probability space, on the choice of the two systems only. Further properties are also required to describe the composition process in terms of a group law. The composability axiom, introduced as a generalization of the fourth Shannon–Khinchin axiom (postulating additivity), is a highly non-trivial requirement. Indeed, in the trace-form class, the Boltzmann entropy and Tsallis entropy are the only known composable cases. However, in the non-trace form class, the Z-entropies arise as new entropic functions possessing the mathematical properties necessary for information-theoretical applications, in both classical and quantum contexts. From a mathematical point of view, composability is intimately related to formal group theory of algebraic topology. The underlying group-theoretical structure determines crucially the statistical properties of the corresponding entropies. PMID:27956871

  20. Maximum Entropy for the International Division of Labor.

    PubMed

    Lei, Hongmei; Chen, Ying; Li, Ruiqi; He, Deli; Zhang, Jiang

    2015-01-01

    As a result of the international division of labor, the trade value distribution on different products substantiated by international trade flows can be regarded as one country's strategy for competition. According to the empirical data of trade flows, countries may spend a large fraction of export values on ubiquitous and competitive products. Meanwhile, countries may also diversify their exports share on different types of products to reduce the risk. In this paper, we report that the export share distribution curves can be derived by maximizing the entropy of shares on different products under the product's complexity constraint once the international market structure (the country-product bipartite network) is given. Therefore, a maximum entropy model provides a good fit to empirical data. The empirical data is consistent with maximum entropy subject to a constraint on the expected value of the product complexity for each country. One country's strategy is mainly determined by the types of products this country can export. In addition, our model is able to fit the empirical export share distribution curves of nearly every country very well by tuning only one parameter.
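    The entropy-maximizing share curve under a single linear constraint on expected product complexity has the familiar exponential form sketched below; beta stands in for the single tunable parameter mentioned in the abstract, and the complexity values are made up for illustration.

        # Entropy-maximizing share curve under one linear constraint on expected product
        # complexity; beta stands in for the paper's single tunable parameter, and the
        # complexity values are made up for illustration.
        import numpy as np

        def maxent_shares(complexity, beta):
            w = np.exp(-beta * np.asarray(complexity, dtype=float))
            return w / w.sum()

        # usage: a smaller beta spreads export shares more evenly across products
        c = np.linspace(0.0, 3.0, 20)
        print(maxent_shares(c, beta=0.5).round(3))
        print(maxent_shares(c, beta=3.0).round(3))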

  1. Maximum Entropy for the International Division of Labor

    PubMed Central

    Lei, Hongmei; Chen, Ying; Li, Ruiqi; He, Deli; Zhang, Jiang

    2015-01-01

    As a result of the international division of labor, the trade value distribution on different products substantiated by international trade flows can be regarded as one country’s strategy for competition. According to the empirical data of trade flows, countries may spend a large fraction of export values on ubiquitous and competitive products. Meanwhile, countries may also diversify their exports share on different types of products to reduce the risk. In this paper, we report that the export share distribution curves can be derived by maximizing the entropy of shares on different products under the product’s complexity constraint once the international market structure (the country-product bipartite network) is given. Therefore, a maximum entropy model provides a good fit to empirical data. The empirical data is consistent with maximum entropy subject to a constraint on the expected value of the product complexity for each country. One country’s strategy is mainly determined by the types of products this country can export. In addition, our model is able to fit the empirical export share distribution curves of nearly every country very well by tuning only one parameter. PMID:26172052

  2. Maximum entropy production in environmental and ecological systems.

    PubMed

    Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M

    2010-05-12

    The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex with numerous interactions and feedbacks, and they are irreversible in their nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest development of applications of MEP to the biosphere-atmosphere system including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.

  3. Propane spectral resolution enhancement by the maximum entropy method

    NASA Technical Reports Server (NTRS)

    Bonavito, N. L.; Stewart, K. P.; Hurley, E. J.; Yeh, K. C.; Inguva, R.

    1990-01-01

    The Burg algorithm for maximum entropy power spectral density estimation is applied to a time series of data obtained from a Michelson interferometer and compared with a standard FFT estimate for resolution capability. The propane transmittance spectrum was estimated by use of the FFT with a 2^18-sample interferogram, giving a maximum unapodized resolution of 0.06/cm. This estimate was then interpolated by zero filling an additional 2^18 points, and the final resolution was taken to be 0.06/cm. Comparison of the maximum entropy method (MEM) estimate with the FFT was made over a 45/cm region of the spectrum for several increasing record lengths of interferogram data beginning at 2^10. It is found that over this region the MEM estimate with 2^16 data samples is in close agreement with the FFT estimate using 2^18 samples.
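    A compact, textbook-style sketch of Burg's maximum-entropy (autoregressive) spectral estimator is given below for orientation; it is not the code used for the interferograms, and the AR order, FFT length and test signal are arbitrary choices.

        # Compact textbook-style Burg (maximum-entropy, autoregressive) spectral estimator;
        # not the code used for the interferograms, and all settings below are arbitrary.
        import numpy as np

        def burg_psd(x, order, nfft=4096):
            x = np.asarray(x, dtype=float) - np.mean(x)
            f, b = x[1:].copy(), x[:-1].copy()          # forward / backward prediction errors
            a = np.array([1.0])                         # AR polynomial coefficients
            e = np.dot(x, x) / len(x)                   # prediction-error power
            for _ in range(order):
                k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))   # reflection coefficient
                a_ext = np.concatenate([a, [0.0]])
                a = a_ext + k * a_ext[::-1]             # Levinson-style order update
                f, b = (f + k * b)[1:], (b + k * f)[:-1]
                e *= 1.0 - k * k
            A = np.fft.rfft(a, nfft)                    # AR polynomial evaluated on the unit circle
            return np.fft.rfftfreq(nfft), e / np.abs(A) ** 2   # MEM power spectral density

        # usage: two close sinusoids in noise, resolved with a modest AR order
        t = np.arange(1024)
        sig = np.sin(2 * np.pi * 0.20 * t) + np.sin(2 * np.pi * 0.22 * t)
        sig += 0.1 * np.random.default_rng(0).standard_normal(t.size)
        freqs, psd = burg_psd(sig, order=30)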

  4. Group entropies, correlation laws, and zeta functions.

    PubMed

    Tempesta, Piergiulio

    2011-08-01

    The notion of group entropy is proposed. It enables the unification and generalization of many different definitions of entropy known in the literature, such as those of Boltzmann-Gibbs, Tsallis, Abe, and Kaniadakis. Other entropic functionals are introduced, related to nontrivial correlation laws characterizing universality classes of systems out of equilibrium when the dynamics is weakly chaotic. The associated thermostatistics are discussed. The mathematical structure underlying our construction is that of formal group theory, which provides the general structure of the correlations among particles and dictates the associated entropic functionals. As an example of application, the role of group entropies in information theory is illustrated and generalizations of the Kullback-Leibler divergence are proposed. A new connection between statistical mechanics and zeta functions is established. In particular, Tsallis entropy is related to the classical Riemann zeta function.

  5. Thermodynamic resource theories, non-commutativity and maximum entropy principles

    NASA Astrophysics Data System (ADS)

    Lostaglio, Matteo; Jennings, David; Rudolph, Terry

    2017-04-01

    We discuss some features of thermodynamics in the presence of multiple conserved quantities. We prove a generalisation of the Landauer principle illustrating tradeoffs between the erasure costs paid in different ‘currencies'. We then show how the maximum entropy and complete passivity approaches give different answers in the presence of multiple observables. We discuss how this seems to prevent current resource theories from fully capturing thermodynamic aspects of non-commutativity.

  6. Using the Maximum Entropy Principle as a Unifying Theory Characterization and Sampling of Multi-Scaling Processes in Hydrometeorology

    DTIC Science & Technology

    2015-08-20

    evapotranspiration (ET) over oceans may be significantly lower than previously thought. The MEP model parameterized turbulent transfer coefficients...fluxes, ocean freshwater fluxes, regional crop yield among others. An on-going study suggests that the global annual evapotranspiration (ET) over...Bras, Jingfeng Wang. A model of evapotranspiration based on the theory of maximum entropy production, Water Resources Research, (03 2011): 0. doi

  7. Hydrodynamic equations for electrons in graphene obtained from the maximum entropy principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barletti, Luigi, E-mail: luigi.barletti@unifi.it

    2014-08-15

    The maximum entropy principle is applied to the formal derivation of isothermal, Euler-like equations for semiclassical fermions (electrons and holes) in graphene. After proving general mathematical properties of the equations so obtained, their asymptotic form corresponding to significant physical regimes is investigated. In particular, the diffusive regime, the Maxwell-Boltzmann regime (high temperature), the collimation regime and the degenerate gas limit (vanishing temperature) are considered.

  8. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  9. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  10. Moisture sorption isotherms and thermodynamic properties of bovine leather

    NASA Astrophysics Data System (ADS)

    Fakhfakh, Rihab; Mihoubi, Daoued; Kechaou, Nabil

    2018-04-01

    This study was aimed at determining the moisture sorption characteristics of bovine leather using a static gravimetric method at 30, 40, 50, 60 and 70 °C. The curves exhibit type II behaviour according to the BET classification. Fitting the sorption isotherms with seven equations shows that the GAB model is able to reproduce the evolution of equilibrium moisture content with water activity over a moisture range of 0.02 to 0.83 kg/kg d.b. (0.9898 < R² < 0.999). The sorption isotherms exhibit a hysteresis effect. Additionally, the sorption isotherm data were used to determine thermodynamic properties such as the isosteric heat of sorption, sorption entropy, spreading pressure, and net integral enthalpy and entropy. The net isosteric heat of sorption and the differential entropy were evaluated directly from the moisture isotherms by applying the Clausius-Clapeyron equation and used to investigate the enthalpy-entropy compensation theory. Both the sorption enthalpy and entropy for desorption increase to a maximum with increasing moisture content and then decrease sharply. The adsorption enthalpy decreases with increasing moisture content, whereas the adsorption entropy increases smoothly with increasing moisture content to a maximum of 6.29 J/K mol. The spreading pressure increases with rising water activity. The net integral enthalpy appears to decrease and then increase to become asymptotic, while the net integral entropy decreases with increasing moisture content.
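    As an illustration of the isotherm-fitting step, the sketch below fits the GAB equation X = Xm·C·K·aw / [(1 - K·aw)(1 - K·aw + C·K·aw)] to a small synthetic data set; the data values, initial guesses and bounds are assumptions, not the measured leather isotherms.

        # Illustrative GAB fit; Xm, C, K are the usual GAB parameters, and the data points,
        # initial guesses and bounds below are made up rather than the measured isotherms.
        import numpy as np
        from scipy.optimize import curve_fit

        def gab(aw, Xm, C, K):
            # X(aw) = Xm*C*K*aw / [(1 - K*aw) * (1 - K*aw + C*K*aw)]
            return Xm * C * K * aw / ((1.0 - K * aw) * (1.0 - K * aw + C * K * aw))

        aw = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])          # water activity
        X = np.array([0.03, 0.05, 0.07, 0.09, 0.12, 0.16, 0.22, 0.32])   # kg/kg d.b. (synthetic)
        popt, _ = curve_fit(gab, aw, X, p0=[0.05, 10.0, 0.8],
                            bounds=([0.0, 0.0, 0.0], [1.0, 200.0, 0.999]))
        print(dict(zip(["Xm", "C", "K"], np.round(popt, 3))))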

  11. The smooth entropy formalism for von Neumann algebras

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Furrer, Fabian; Scholz, Volkher B.

    2016-01-01

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.

  12. The smooth entropy formalism for von Neumann algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berta, Mario, E-mail: berta@caltech.edu; Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp; Scholz, Volkher B., E-mail: scholz@phys.ethz.ch

    2016-01-15

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asplund, Curtis T., E-mail: ca2621@columbia.edu; Berenstein, David, E-mail: dberens@physics.ucsb.edu

    We consider oscillators evolving subject to a periodic driving force that dynamically entangles them, and argue that this gives the linearized evolution around periodic orbits in a general chaotic Hamiltonian dynamical system. We show that the entanglement entropy, after tracing over half of the oscillators, generically asymptotes to linear growth at a rate given by the sum of the positive Lyapunov exponents of the system. These exponents give a classical entropy growth rate, in the sense of Kolmogorov, Sinai and Pesin. We also calculate the dependence of this entropy on linear mixtures of the oscillator Hilbert-space factors, to investigate the dependence of the entanglement entropy on the choice of coarse graining. We find that for almost all choices the asymptotic growth rate is the same.

  14. Modeling Loop Entropy

    PubMed Central

    Chirikjian, Gregory S.

    2011-01-01

    Proteins fold from a highly disordered state into a highly ordered one. Traditionally, the folding problem has been stated as one of predicting ‘the’ tertiary structure from sequential information. However, new evidence suggests that the ensemble of unfolded forms may not be as disordered as once believed, and that the native form of many proteins may not be described by a single conformation, but rather an ensemble of its own. Quantifying the relative disorder in the folded and unfolded ensembles as an entropy difference may therefore shed light on the folding process. One issue that clouds discussions of ‘entropy’ is that many different kinds of entropy can be defined: entropy associated with overall translational and rotational Brownian motion, configurational entropy, vibrational entropy, conformational entropy computed in internal or Cartesian coordinates (which can even be different from each other), conformational entropy computed on a lattice; each of the above with different solvation and solvent models; thermodynamic entropy measured experimentally, etc. The focus of this work is the conformational entropy of coil/loop regions in proteins. New mathematical modeling tools for the approximation of changes in conformational entropy during transition from unfolded to folded ensembles are introduced. In particular, models for computing lower and upper bounds on entropy for polymer models of polypeptide coils both with and without end constraints are presented. The methods reviewed here include kinematics (the mathematics of rigid-body motions), classical statistical mechanics and information theory. PMID:21187223

  15. Excess Entropy Production in Quantum System: Quantum Master Equation Approach

    NASA Astrophysics Data System (ADS)

    Nakajima, Satoshi; Tokura, Yasuhiro

    2017-12-01

    For open systems described by the quantum master equation (QME), we investigate the excess entropy production under quasistatic operations between nonequilibrium steady states. The average entropy production is composed of the time integral of the instantaneous steady entropy production rate and the excess entropy production. We propose to define the average entropy production rate using the average energy and particle currents, which are calculated by full counting statistics with the QME. The excess entropy production is given by a line integral in the control parameter space, and its integrand is called the Berry-Sinitsyn-Nemenman (BSN) vector. In the weakly nonequilibrium regime, we show that the BSN vector is described by ln ρ̆_0 and ρ_0, where ρ_0 is the instantaneous steady state of the QME and ρ̆_0 is that of the QME obtained by reversing the sign of the Lamb shift term. If the system Hamiltonian is non-degenerate or the Lamb shift term is negligible, the excess entropy production approximately reduces to the difference between the von Neumann entropies of the system. Additionally, we point out that the expression of the entropy production obtained in the classical Markov jump process is different from our result, and show that these are approximately equivalent only in the weakly nonequilibrium regime.
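    Since the abstract notes that the excess entropy production approximately reduces to a difference of von Neumann entropies, a minimal numerical sketch of that quantity is given below (assuming numpy; the two qubit steady states are invented and are not taken from the paper).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho ln rho], computed from the eigenvalues of a density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # discard numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Two illustrative steady states of a qubit (hypothetical numbers, not from the paper).
rho_initial = np.array([[0.9, 0.05], [0.05, 0.1]], dtype=complex)
rho_final = np.array([[0.6, 0.0], [0.0, 0.4]], dtype=complex)

delta_S = von_neumann_entropy(rho_final) - von_neumann_entropy(rho_initial)
print(f"von Neumann entropy difference: {delta_S:.4f} nats")
```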

  16. Coherence and entanglement measures based on Rényi relative entropies

    NASA Astrophysics Data System (ADS)

    Zhu, Huangjun; Hayashi, Masahito; Chen, Lin

    2017-11-01

    We study systematically resource measures of coherence and entanglement based on Rényi relative entropies, which include the logarithmic robustness of coherence, geometric coherence, and conventional relative entropy of coherence together with their entanglement analogues. First, we show that each Rényi relative entropy of coherence is equal to the corresponding Rényi relative entropy of entanglement for any maximally correlated state. By virtue of this observation, we establish a simple operational connection between entanglement measures and coherence measures based on Rényi relative entropies. We then prove that all these coherence measures, including the logarithmic robustness of coherence, are additive. Accordingly, all these entanglement measures are additive for maximally correlated states. In addition, we derive analytical formulas for Rényi relative entropies of entanglement of maximally correlated states and bipartite pure states, which reproduce a number of classic results on the relative entropy of entanglement and logarithmic robustness of entanglement in a unified framework. Several nontrivial bounds for Rényi relative entropies of coherence (entanglement) are further derived, which improve over results known previously. Moreover, we determine all states whose relative entropy of coherence is equal to the logarithmic robustness of coherence. As an application, we provide an upper bound for the exact coherence distillation rate, which is saturated for pure states.
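    As a small illustration of the α → 1 member of the Rényi family discussed above, the sketch below evaluates the conventional relative entropy of coherence C_r(ρ) = S(Δ(ρ)) − S(ρ), where Δ dephases in a fixed reference basis; the single-qubit state is an invented example, and the general Rényi-α measures of the paper are not implemented here.

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def relative_entropy_of_coherence(rho):
    """C_r(rho) = S(diag(rho)) - S(rho): the alpha -> 1 member of the Renyi family."""
    rho_diag = np.diag(np.diag(rho))      # full dephasing in the reference basis
    return von_neumann_entropy(rho_diag) - von_neumann_entropy(rho)

# A single-qubit state with off-diagonal coherence (illustrative numbers).
p, c = 0.6, 0.35
rho = np.array([[p, c], [np.conj(c), 1 - p]], dtype=complex)
print(f"Relative entropy of coherence: {relative_entropy_of_coherence(rho):.4f} bits")
```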

  17. Emergent Geometry from Entropy and Causality

    NASA Astrophysics Data System (ADS)

    Engelhardt, Netta

    In this thesis, we investigate the connections between the geometry of spacetime and aspects of quantum field theory such as entanglement entropy and causality. This work is motivated by the idea that spacetime geometry is an emergent phenomenon in quantum gravity, and that the physics responsible for this emergence is fundamental to quantum field theory. Part I of this thesis is focused on the interplay between spacetime and entropy, with a special emphasis on entropy due to entanglement. In general spacetimes, there exist locally-defined surfaces sensitive to the geometry that may act as local black hole boundaries or cosmological horizons; these surfaces, known as holographic screens, are argued to have a connection with the second law of thermodynamics. Holographic screens obey an area law, suggestive of an association with entropy; they are also distinguished surfaces from the perspective of the covariant entropy bound, a bound on the total entropy of a slice of the spacetime. This construction is shown to be quite general, and is formulated in both classical and perturbatively quantum theories of gravity. The remainder of Part I uses the Anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence to both expand and constrain the connection between entanglement entropy and geometry. The AdS/CFT correspondence posits an equivalence between string theory in the "bulk" with AdS boundary conditions and certain quantum field theories. In the limit where the string theory is simply classical General Relativity, the Ryu-Takayanagi and, more generally, the Hubeny-Rangamani-Takayanagi (HRT) formulae provide a way of relating the geometry of surfaces to entanglement entropy. A first-order bulk quantum correction to HRT was derived by Faulkner, Lewkowycz and Maldacena. This formula is generalized to include perturbative quantum corrections in the bulk at any (finite) order. Hurdles to spacetime emergence from entanglement entropy as described by HRT and its quantum generalizations are discussed, in both the classical and perturbatively quantum limits. In particular, several No Go Theorems are proven, indicative of a conclusion that supplementary approaches or information may be necessary to recover the full spacetime geometry. Part II of this thesis involves the relation between geometry and causality, the property that information cannot travel faster than light. Requiring this of any quantum field theory results in constraints on string theory setups that are dual to quantum field theories via the AdS/CFT correspondence. At the level of perturbative quantum gravity, it is shown that causality in the field theory constrains the causal structure in the bulk. At the level of nonperturbative quantum string theory, we find that constraints on causal signals restrict the possible ways in which curvature singularities can be resolved in string theory. Finally, a new program of research is proposed for the construction of bulk geometry from the divergences of correlation functions in the dual field theory. This divergence structure is linked to the causal structure of the bulk and of the field theory.

  18. Lyapounov variable: Entropy and measurement in quantum mechanics

    PubMed Central

    Misra, B.; Prigogine, I.; Courbage, M.

    1979-01-01

    We discuss the question of the dynamical meaning of the second law of thermodynamics in the framework of quantum mechanics. Previous discussion of the problem in the framework of classical dynamics has shown that the second law can be given a dynamical meaning in terms of the existence of so-called Lyapounov variables—i.e., dynamical variables varying monotonically in time without becoming contradictory. It has been found that such variables can exist in an extended framework of classical dynamics, provided that the dynamical motion is suitably unstable. In this paper we begin to extend these results to quantum mechanics. It is found that no dynamical variable with the characteristic properties of nonequilibrium entropy can be defined in the standard formulation of quantum mechanics. However, if the Hamiltonian has certain well-defined spectral properties, such variables can be defined but only as a nonfactorizable superoperator. Necessary nonfactorizability of such entropy operators M has the consequence that they cannot preserve the class of pure states. Physically, this means that the distinguishability between pure states and corresponding mixtures must be lost in the case of a quantal system for which the algebra of observables can be extended to include a new dynamical variable representing nonequilibrium entropy. We discuss how this result leads to a solution of the quantum measurement problem. It is also found that the question of existence of entropy of superoperators M is closely linked to the problem of defining an operator of time in quantum mechanics. PMID:16578757

  19. Entropy jump across an inviscid shock wave

    NASA Technical Reports Server (NTRS)

    Salas, Manuel D.; Iollo, Angelo

    1995-01-01

    The shock jump conditions for the Euler equations in their primitive form are derived by using generalized functions. The shock profiles for specific volume, speed, and pressure are shown to be the same; however, density has a different shock profile. Careful study of the equations that govern the entropy shows that the inviscid entropy profile has a local maximum within the shock layer. We demonstrate that because of this phenomenon, the entropy propagation equation cannot be used as a conservation law.
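    For orientation, the snippet below evaluates the textbook entropy jump across a normal shock in a calorically perfect gas from the Rankine-Hugoniot ratios; it is a standard sanity check rather than the generalized-function derivation of the paper, and the values of γ and R are assumed air-like constants.

```python
import numpy as np

def normal_shock_entropy_jump(M1, gamma=1.4, R=287.0):
    """Entropy jump (J/(kg K)) across a normal shock in a calorically perfect gas.

    Rankine-Hugoniot relations give the pressure and density ratios; then
    s2 - s1 = c_v * ln[(p2/p1) * (rho1/rho2)**gamma], with c_v = R/(gamma - 1).
    """
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M1**2 - 1.0)
    rho_ratio = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
    cv = R / (gamma - 1.0)
    return cv * np.log(p_ratio / rho_ratio**gamma)

for M1 in (1.1, 1.5, 2.0, 3.0):
    print(f"M1 = {M1:.1f}:  ds = {normal_shock_entropy_jump(M1):8.3f} J/(kg K)")
```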

  20. An improved wavelet neural network medical image segmentation algorithm with combined maximum entropy

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoqian; Tao, Jinxu; Ye, Zhongfu; Qiu, Bensheng; Xu, Jinzhang

    2018-05-01

    In order to solve the problem of medical image segmentation, a wavelet neural network segmentation algorithm based on a combined maximum entropy criterion is proposed. First, a bee colony algorithm is used to optimize the parameters of the wavelet neural network (network structure, initial weights, threshold values, and so on), so that training converges quickly to higher precision and avoids falling into local extrema; then the optimal number of iterations is obtained by calculating the maximum entropy of the segmented image, so as to achieve automatic and accurate segmentation. Medical image segmentation experiments show that the proposed algorithm reduces sample training time, improves convergence precision, and segments more accurately and effectively than a traditional BP neural network (back-propagation neural network: a multilayer feed-forward neural network trained according to the error back-propagation algorithm).
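    The paper's wavelet-neural-network scheme cannot be reproduced from the abstract alone; as a hedged stand-in that illustrates a maximum-entropy criterion applied to segmentation, the sketch below implements classic Kapur maximum-entropy thresholding on a synthetic image (all data and parameters are invented).

```python
import numpy as np

def kapur_threshold(image, bins=256):
    """Classic Kapur maximum-entropy threshold (not the paper's wavelet network).

    Chooses the gray level that maximizes the sum of the Shannon entropies of
    the foreground and background histograms.
    """
    hist, edges = np.histogram(image, bins=bins, range=(0.0, 1.0))
    p = hist.astype(float) / hist.sum()
    best_t, best_H = 0, -np.inf
    for t in range(1, bins):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 < 1e-12 or w1 < 1e-12:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        H = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0])) \
            - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if H > best_H:
            best_H, best_t = H, t
    return edges[best_t]

# Synthetic "medical" image: dark background plus a brighter region.
rng = np.random.default_rng(1)
img = np.clip(rng.normal(0.3, 0.05, (64, 64)), 0, 1)
img[20:40, 20:40] = np.clip(rng.normal(0.7, 0.05, (20, 20)), 0, 1)
print(f"Maximum-entropy threshold: {kapur_threshold(img):.3f}")
```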

  1. Maximum entropy deconvolution of the optical jet of 3C 273

    NASA Technical Reports Server (NTRS)

    Evans, I. N.; Ford, H. C.; Hui, X.

    1989-01-01

    The technique of maximum entropy image restoration is applied to the problem of deconvolving the point spread function from a deep, high-quality V band image of the optical jet of 3C 273. The resulting maximum entropy image has an approximate spatial resolution of 0.6 arcsec and has been used to study the morphology of the optical jet. Four regularly-spaced optical knots are clearly evident in the data, together with an optical 'extension' at each end of the optical jet. The jet oscillates around its center of gravity, and the spatial scale of the oscillations is very similar to the spacing between the optical knots. The jet is marginally resolved in the transverse direction and has an asymmetric profile perpendicular to the jet axis. The distribution of V band flux along the length of the jet, and accurate astrometry of the optical knot positions are presented.
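    A toy version of maximum-entropy restoration can clarify what the deconvolution optimizes. The sketch below maximizes Q(f) = S(f) − λχ²(f) by projected gradient ascent on a synthetic 1-D signal; it is only a pedagogical stand-in for the production maximum-entropy codes used on the 3C 273 image, and the PSF, noise level and tuning constants are invented.

```python
import numpy as np

def maxent_deconvolve(data, psf, lam=50.0, n_iter=2000, step=1e-3):
    """Toy maximum-entropy restoration by projected gradient ascent (a sketch only).

    Maximizes Q(f) = S(f) - lam * chi2(f), where S(f) = -sum f_i ln(f_i/m_i) is the
    image entropy relative to a flat model m and chi2(f) = sum ((psf*f) - data)^2.
    Production maximum-entropy codes use far more careful optimizers.
    """
    n = data.size
    m = np.full(n, max(data.sum(), 1e-6) / n)     # flat default model
    f = m.copy()
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        resid = np.convolve(f, psf, mode="same") - data
        grad = (-np.log(f / m) - 1.0) - lam * 2.0 * np.convolve(resid, psf_flip, mode="same")
        f = np.clip(f + step * grad, 1e-8, None)  # keep the image positive
    return f

# Synthetic test: two point sources blurred by a Gaussian PSF plus noise.
rng = np.random.default_rng(2)
truth = np.zeros(100); truth[30] = 1.0; truth[60] = 0.6
x = np.arange(-10, 11)
psf = np.exp(-0.5 * (x / 2.0)**2); psf /= psf.sum()
data = np.convolve(truth, psf, mode="same") + rng.normal(0, 0.01, 100)
restored = maxent_deconvolve(data, psf)
print("restored maximum near index", int(np.argmax(restored)))
```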

  2. Moments of the Wigner function and Renyi entropies at freeze-out

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.; Zalewski, K.

    2006-03-01

    The relation between Renyi entropies and moments of the Wigner function, representing the quantum mechanical description of the M-particle semi-inclusive distribution at freeze-out, is investigated. It is shown that in the limit of infinite volume of the system, the classical and quantum descriptions are equivalent. Finite volume corrections are derived and shown to be small for systems encountered in relativistic heavy ion collisions.

  3. The Role of the Total Entropy Production in the Dynamics of Open Quantum Systems in Detection of Non-Markovianity

    NASA Astrophysics Data System (ADS)

    Salimi, S.; Haseli, S.; Khorashad, A. S.; Adabi, F.

    2016-09-01

    The interaction between system and environment is a fundamental concept in the theory of open quantum systems. As a result of the interaction, an amount of correlation (both classical and quantum) emerges between the system and the environment. In this work, we recall a quantity that is very useful for describing the emergence of correlation between the system and the environment, namely the total entropy production. The appearance of total entropy production is due to the entanglement produced between the system and the environment. We discuss the role of the total entropy production in detecting non-Markovianity. By utilizing the relation between the total entropy production and the total correlation between subsystems, one can see that a temporary decrease of the total entropy production is a signature of non-Markovianity. We apply our criterion to the special case where the composite system has an initial correlation with the environment.

  4. Essays on inference in economics, competition, and the rate of profit

    NASA Astrophysics Data System (ADS)

    Scharfenaker, Ellis S.

    This dissertation comprises three papers that demonstrate the role of Bayesian methods of inference and Shannon's information theory in classical political economy. The first chapter explores the empirical distribution of profit rate data from North American firms from 1962-2012. This chapter addresses the fact that existing methods for sample selection from noisy profit rate data in the industrial organization field of economics tend to condition on a covariate's value, which risks discarding information. Conditioning sample selection instead on the profit rate data's structure by means of a two-component (signal and noise) Bayesian mixture model, we find the profit rate sample to be time-stationary Laplace distributed, corroborating earlier estimates of cross-section distributions. The second chapter compares alternative probabilistic approaches to discrete (quantal) choice analysis and examines the various ways in which they overlap. In particular, the work on individual choice behavior by Duncan Luce and the extension of this work to quantal response problems by game theoreticians is shown to be related both to the rational inattention work of Christopher Sims, through Shannon's information theory, and to the maximum entropy principle of inference proposed by physicist Edwin T. Jaynes. In the third chapter I propose a model of "classically" competitive firms facing informational entropy constraints in their decisions to potentially enter or exit markets based on profit rate differentials. The result is a three-parameter logit quantal response distribution for firm entry and exit decisions. Bayesian methods are used for inference into the distribution of entry and exit decisions conditional on profit rate deviations, and firm-level data from Compustat are used to test these predictions.

  5. Holographic entanglement entropy in Suzuki-Trotter decomposition of spin systems.

    PubMed

    Matsueda, Hiroaki

    2012-03-01

    In quantum spin chains at criticality, two types of scaling for the entanglement entropy exist: one comes from conformal field theory (CFT), and the other is for entanglement support of matrix product state (MPS) approximation. On the other hand, the quantum spin-chain models can be mapped onto two-dimensional (2D) classical ones by the Suzuki-Trotter decomposition. Motivated by the scaling and the mapping, we introduce information entropy for 2D classical spin configurations as well as a spectrum, and examine their basic properties in the Ising and the three-state Potts models on the square lattice. They are defined by the singular values of the reduced density matrix for a Monte Carlo snapshot. We find scaling relations of the entropy compatible with the CFT and the MPS results. Thus, we propose that the entropy is a kind of "holographic" entanglement entropy. At T(c), the spin configuration is fractal, and various sizes of ordered clusters coexist. Then, the singular values automatically decompose the original snapshot into a set of images with different length scales, respectively. This is the origin of the scaling. In contrast to the MPS scaling, long-range spin correlation can be described by only few singular values. Furthermore, the spectrum, which is a set of logarithms of the singular values, also seems to be a holographic entanglement spectrum. We find multiple gaps in the spectrum, and in contrast to the topological phases, the low-lying levels below the gap represent spontaneous symmetry breaking. These contrasts are strong evidence of the dual nature of the holography. Based on these observations, we discuss the amount of information contained in one snapshot.

  6. Quantum entropy and uncertainty for two-mode squeezed, coherent and intelligent spin states

    NASA Technical Reports Server (NTRS)

    Aragone, C.; Mundarain, D.

    1993-01-01

    We compute the quantum entropy for monomode and two-mode systems set in squeezed states. Thereafter, the quantum entropy is also calculated for angular momentum algebra when the system is either in a coherent or in an intelligent spin state. These values are compared with the corresponding values of the respective uncertainties. In general, quantum entropies and uncertainties have the same minimum and maximum points. However, for coherent and intelligent spin states, it is found that some minima for the quantum entropy turn out to be uncertainty maxima. We feel that the quantum entropy we use provides the right answer, since it is given in an essentially unique way.

  7. Entropy and climate. I - ERBE observations of the entropy production of the earth

    NASA Technical Reports Server (NTRS)

    Stephens, G. L.; O'Brien, D. M.

    1993-01-01

    An approximate method for estimating the global distributions of the entropy fluxes flowing through the upper boundary of the climate system is introduced, and an estimate of the entropy exchange between the earth and space and the entropy production of the planet is provided. Entropy fluxes calculated from the Earth Radiation Budget Experiment measurements show how the long-wave entropy flux densities dominate the total entropy fluxes at all latitudes compared with the entropy flux densities associated with reflected sunlight, although the short-wave flux densities are important in the context of clear sky-cloudy sky net entropy flux differences. It is suggested that the entropy production of the planet is both constant for the 36 months of data considered and very near its maximum possible value. The mean value of this production is 0.68 × 10^15 W/K, and the amplitude of the annual cycle is approximately 1 to 2 percent of this value.
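    A back-of-envelope version of the planetary entropy-production estimate reproduces the order of magnitude quoted above; the numbers below (240 W/m² absorbed flux, 255 K emission temperature, 5760 K solar temperature, and the 4/3 photon-entropy factor) are textbook assumptions rather than the ERBE-derived values used in the paper.

```python
# Back-of-envelope planetary entropy production (order-of-magnitude only; all
# input numbers are textbook assumptions, not ERBE values).
F_abs = 240.0                        # W/m^2 absorbed/emitted in the global mean
T_earth, T_sun = 255.0, 5760.0       # effective emission temperatures, K
area = 4 * 3.14159 * (6.371e6) ** 2  # Earth's surface area, m^2

sigma_dot = (4.0 / 3.0) * F_abs * (1.0 / T_earth - 1.0 / T_sun) * area
print(f"Estimated planetary entropy production: {sigma_dot:.2e} W/K")
```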

  8. Mixed memory, (non) Hurst effect, and maximum entropy of rainfall in the tropical Andes

    NASA Astrophysics Data System (ADS)

    Poveda, Germán

    2011-02-01

    Diverse linear and nonlinear statistical parameters of rainfall under aggregation in time and the kind of temporal memory are investigated. Data sets from the Andes of Colombia at different resolutions (15 min and 1 h) and record lengths (21 months and 8-40 years) are used. A mixture of two timescales is found in the autocorrelation and autoinformation functions, with short-term memory holding for time lags less than 15-30 min, and long-term memory onwards. Consistently, rainfall variance exhibits different temporal scaling regimes separated at 15-30 min and 24 h. Tests for the Hurst effect evidence the frailty of the R/S approach in discerning the kind of memory in high-resolution rainfall, whereas rigorous statistical tests for short-memory processes do reject the existence of the Hurst effect. Rainfall information entropy grows as a power law of aggregation time, S(T) ~ T^β with ⟨β⟩ = 0.51, up to a timescale T_MaxEnt (70-202 h) at which entropy saturates, with β = 0 onwards. Maximum entropy is reached through a dynamic Generalized Pareto distribution, consistently with the maximum information-entropy principle for heavy-tailed random variables, and with its asymptotically infinitely divisible property. The dynamics towards the limit distribution is quantified. Tsallis q-entropies also exhibit power laws with T, such that S_q(T) ~ T^{β(q)}, with β(q) ≤ 0 for q ≤ 0 and β(q) ≃ 0.5 for q ≥ 1. No clear patterns are found in the geographic distribution within and among the statistical parameters studied, confirming the strong variability of tropical Andean rainfall.
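    The entropy-versus-aggregation-time scaling S(T) ~ T^β can be estimated with a few lines of code; the sketch below does so on a synthetic, intermittent heavy-tailed series rather than the Andean data, so the fitted exponent is only illustrative.

```python
import numpy as np

def shannon_entropy(series, bins=50):
    """Histogram-based Shannon entropy (nats) of a rainfall series."""
    hist, _ = np.histogram(series, bins=bins)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)))

def aggregate(series, window):
    """Sum rainfall over non-overlapping windows of `window` samples."""
    n = (series.size // window) * window
    return series[:n].reshape(-1, window).sum(axis=1)

# Synthetic intermittent, heavy-tailed "rainfall" at 15-min resolution (not real data).
rng = np.random.default_rng(3)
rain = rng.pareto(2.5, 200_000) * (rng.random(200_000) < 0.1)

windows = np.array([1, 2, 4, 8, 16, 32, 64])          # aggregation times (x 15 min)
S = np.array([shannon_entropy(aggregate(rain, w)) for w in windows])
beta = np.polyfit(np.log(windows), np.log(S), 1)[0]   # slope of S(T) ~ T**beta
print(f"Estimated scaling exponent beta ~ {beta:.2f}")
```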

  9. On variational expressions for quantum relative entropies

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Fawzi, Omar; Tomamichel, Marco

    2017-12-01

    Distance measures between quantum states like the trace distance and the fidelity can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statistics, is strictly smaller than Umegaki's quantum relative entropy whenever the states do not commute. We extend this result in two ways. First, we show that Petz' conclusion remains true if we allow general positive operator-valued measures. Second, we extend the result to Rényi relative entropies and show that for non-commuting states the sandwiched Rényi relative entropy is strictly larger than the measured Rényi relative entropy for α ∈ (1/2, ∞) and strictly smaller for α ∈ [0, 1/2). The latter statement provides counterexamples for the data processing inequality of the sandwiched Rényi relative entropy for α < 1/2. Our main tool is a new variational expression for the measured Rényi relative entropy, which we further exploit to show that certain lower bounds on quantum conditional mutual information are superadditive.
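    A quick numerical illustration of the ordering discussed above: for α > 1/2 the sandwiched Rényi relative entropy upper-bounds the classical Rényi divergence of the statistics of any single measurement, and hence of the measured quantity. The qubit states below are invented, and scipy's fractional_matrix_power is used for the matrix powers.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def sandwiched_renyi(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy D_alpha(rho || sigma) in nats."""
    s = fractional_matrix_power(sigma, (1.0 - alpha) / (2.0 * alpha))
    core = s @ rho @ s
    evals = np.linalg.eigvalsh((core + core.conj().T) / 2)  # symmetrize numerically
    evals = np.clip(evals.real, 0.0, None)
    return np.log(np.sum(evals**alpha)) / (alpha - 1.0)

def classical_renyi(p, q, alpha):
    """Classical Renyi divergence of two probability vectors, in nats."""
    mask = p > 0
    return np.log(np.sum(p[mask]**alpha * q[mask]**(1.0 - alpha))) / (alpha - 1.0)

# Two non-commuting qubit states (illustrative numbers).
rho = np.array([[0.75, 0.25], [0.25, 0.25]], dtype=complex)
sigma = np.array([[0.6, -0.2], [-0.2, 0.4]], dtype=complex)

alpha = 2.0
# Statistics of one particular projective measurement (the computational basis);
# this gives a lower bound on the measured Renyi relative entropy.
p = np.real(np.diag(rho)); q = np.real(np.diag(sigma))
print("sandwiched:", sandwiched_renyi(rho, sigma, alpha))
print("one-measurement lower bound on measured:", classical_renyi(p, q, alpha))
```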

  10. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
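    Under the joint-normal assumption used in the study, the total system entropy of a station subset follows from the log-determinant of its covariance block, and a greedy search gives a simple (suboptimal) way to retain informative stations. The covariance below is randomly generated and hypothetical, not the Neuse River Estuary data.

```python
import numpy as np

def gaussian_joint_entropy(cov, idx):
    """Joint entropy (nats) of a subset `idx` of jointly normal monitoring stations."""
    sub = cov[np.ix_(idx, idx)]
    k = len(idx)
    sign, logdet = np.linalg.slogdet(sub)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def greedy_station_selection(cov, n_keep):
    """Greedily retain the stations that add the most total system entropy."""
    selected = []
    remaining = list(range(cov.shape[0]))
    for _ in range(n_keep):
        gains = [gaussian_joint_entropy(cov, selected + [j]) for j in remaining]
        best = remaining[int(np.argmax(gains))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical covariance of a water-quality variable at 6 stations
# (strongly correlated neighbors are nearly redundant).
rng = np.random.default_rng(4)
A = rng.normal(size=(6, 6))
cov = A @ A.T + 0.5 * np.eye(6)       # a random positive-definite covariance
print("Stations to retain:", greedy_station_selection(cov, 3))
```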

  11. Competition between Homophily and Information Entropy Maximization in Social Networks

    PubMed Central

    Zhao, Jichang; Liang, Xiao; Xu, Ke

    2015-01-01

    In social networks, it is conventionally thought that two individuals with more overlapping friends tend to establish a new friendship, which could be stated as homophily breeding new connections. Meanwhile, the recent hypothesis of maximum information entropy has been presented as a possible origin of effective navigation in small-world networks. Through both theoretical and experimental analysis, we find that there exists a competition between information entropy maximization and homophily in local structure. This competition suggests that a newly built relationship between two individuals with more common friends would lead to less information entropy gain for them. We demonstrate that in the evolution of the social network both assumptions coexist. The rule of maximum information entropy produces weak ties in the network, while the law of homophily makes the network highly clustered locally and gives individuals strong, trusted ties. A toy model is also presented to demonstrate the competition and evaluate the roles of the different rules in the evolution of real networks. Our findings could shed light on social network modeling from a new perspective. PMID:26334994

  12. Symplectic evolution of Wigner functions in Markovian open systems.

    PubMed

    Brodier, O; Almeida, A M Ozorio de

    2004-01-01

    The Wigner function is known to evolve classically under the exclusive action of a quadratic Hamiltonian. If the system also interacts with the environment through Lindblad operators that are complex linear functions of position and momentum, then the general evolution is the convolution of a non-Hamiltonian classical propagation of the Wigner function with a phase space Gaussian that broadens in time. We analyze the consequences of this in the three generic cases of elliptic, hyperbolic, and parabolic Hamiltonians. The Wigner function always becomes positive in a definite time, which does not depend on the initial pure state. We observe the influence of classical dynamics and dissipation upon this threshold. We also derive an exact formula for the evolving linear entropy as the average of a narrowing Gaussian taken over a probability distribution that depends only on the initial state. This leads to a long time asymptotic formula for the growth of linear entropy. We finally discuss the possibility of recovering the initial state.

  13. Nonequilibrium thermodynamics and maximum entropy production in the Earth system: applications and implications.

    PubMed

    Kleidon, Axel

    2009-06-01

    The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.

  14. Beyond maximum entropy: Fractal pixon-based image reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, R. C.; Pina, R. K.

    1994-01-01

    We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other methods, including Goodness-of-Fit (e.g. Least-Squares and Lucy-Richardson) and Maximum Entropy (ME). Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME.

  15. Assisted Distillation of Quantum Coherence.

    PubMed

    Chitambar, E; Streltsov, A; Rana, S; Bera, M N; Adesso, G; Lewenstein, M

    2016-02-19

    We introduce and study the task of assisted coherence distillation. This task arises naturally in bipartite systems where both parties work together to generate the maximal possible coherence on one of the subsystems. Only incoherent operations are allowed on the target system, while general local quantum operations are permitted on the other; this is an operational paradigm that we call local quantum-incoherent operations and classical communication. We show that the asymptotic rate of assisted coherence distillation for pure states is equal to the coherence of assistance, an analog of the entanglement of assistance, whose properties we characterize. Our findings imply a novel interpretation of the von Neumann entropy: it quantifies the maximum amount of extra quantum coherence a system can gain when receiving assistance from a collaborative party. Our results are generalized to coherence localization in a multipartite setting and possible applications are discussed.

  16. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the problems with the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we also know it is continuous, and work through full examples to illustrate the approach.
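    The flat, discontinuous shape of the fractile-constrained maximum entropy distribution mentioned above is easy to see in code: between consecutive assessed fractiles the density is constant. The sketch below builds that piecewise-uniform density from hypothetical elicited percentiles, and it deliberately ignores the probability mass lying outside the assessed range.

```python
import numpy as np

def maxent_pdf_from_fractiles(x_values, cum_probs):
    """Piecewise-uniform maximum-entropy density matching given fractile constraints.

    x_values: assessed fractiles (increasing); cum_probs: their cumulative probabilities.
    Between consecutive fractiles the maximum-entropy density is flat, which is exactly
    the discontinuous, 'flat over each fractile interval' behaviour noted in the abstract.
    Mass outside [x_values[0], x_values[-1]] is not modeled here.
    """
    x = np.asarray(x_values, dtype=float)
    F = np.asarray(cum_probs, dtype=float)
    heights = np.diff(F) / np.diff(x)       # constant density on each interval

    def pdf(t):
        t = np.atleast_1d(t).astype(float)
        out = np.zeros_like(t)
        for lo, hi, h in zip(x[:-1], x[1:], heights):
            out[(t >= lo) & (t < hi)] = h
        return out

    return pdf

# Hypothetical elicited 5th/25th/50th/75th/95th percentiles of an uncertain quantity.
pdf = maxent_pdf_from_fractiles([2.0, 4.0, 5.0, 6.5, 10.0], [0.05, 0.25, 0.5, 0.75, 0.95])
print(pdf([3.0, 5.5, 8.0]))
```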

  17. Fundamental properties of fracture and seismicity in a non extensive statistical physics framework.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2010-05-01

    A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful record in addressing the upscaling problem in physics. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that uses classical statistical physics to describe seismicity. A natural question then arises: what type of statistical physics is appropriate to describe, within a common framework, effects from the fracture level to the seismicity scale? The application of non-extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power-law behavior. We note that this is consistent with a classical thermodynamic approach to natural systems that rapidly attain equilibrium, leading to exponential-law behavior. In the frame of the non-extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998). The first one is the classical normalization of p(X). The second one is based on the definition of the expectation value, which has to be generalized to the "q-expectation value", according to the generalization of the entropy [Abe and Suzuki, 2003]. In order to calculate p(X), we apply the technique of Lagrange multipliers, maximizing an appropriate functional and leading to maximization of the Tsallis entropy under the constraints on the normalization and the q-expectation value. It is well known that the Gutenberg-Richter (G-R) power law distribution has to be modified for large seismic moments because of energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b value beyond a crossover magnitude, or based on a magnitude cut-off using an exponential taper. In the present work we point out that the non-extensivity viewpoint is applicable to seismic processes. In the frame of a non-extensive approach based on Tsallis entropy, we construct a generalized expression of the Gutenberg-Richter (GGR) law [Vallianatos, 2008]. The existence of a lower and/or upper bound to magnitude is discussed, and the conditions under which the GGR leads to the classical GR law are analysed. For the lowest earthquake sizes (i.e., energy levels), the correlations between the different elements involved in the evolution of an earthquake are short-ranged, and GR can be deduced on the basis of the maximum entropy principle using BG statistics.
As the size (i.e., energy) increases, long-range correlation becomes much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power-law behaviour is derived as a special case, leading to b-values being functions of the non-extensivity parameter q. Furthermore, a theoretical analysis of the similarities presented in stress-stimulated electric and acoustic emissions and earthquakes is discussed, not only in the frame of the GGR but also taking into account a universality in the description of the interevent-time distribution. Its particular form can be well expressed in the frame of a non-extensive approach. This formulation is very different from an exponential distribution expected for simple random Poisson processes and indicates the existence of a nontrivial universal mechanism in the generation process. All the aforementioned similarities between stress-stimulated electrical and acoustic emissions and seismicity suggest a connection with fracture phenomena at much larger scales, implying that a basic general mechanism is "actively hidden" behind all these phenomena [Vallianatos and Triantis, 2008b]. Examples from S. Aegean seismicity are given. Acknowledgements: This work is partially supported by the "NEXT EARTH" project FP7-PEOPLE, 2009-2011. References: Abe S. and Suzuki N., J. Geophys. Res. 108 (B2), 2113, 2003. Corral A., Phys. Rev. Lett. 92, 108501, 2004. Corral A., Nonlinear Proc. Geophys. 12, 89, 2005a. Corral A., Phys. Rev. E 71, 017101, 2005b. Corral A., Phys. Rev. Lett. 95, 028501, 2005c. Main I. G., Rev. of Geoph., 34, 433, 1996. Main I. G., O'Brien G. and Henderson R., J. Geoph. Res., 105, 6105, 2000. Main I. G. and Al-Kindy F. H., Geoph. Res. Let., 29, 7, 2002. Rundle J. B., Gross S., Klein W., Ferguson C. and Turcotte D., Tectonophysics, 277, 147-164, 1997. Rundle J. B., Turcotte D. L., Shcherbakov R., Klein W. and Sammis C., Rev. Geophys. 41, 1019, 2003. Tsallis C., J. Stat. Phys. 52, 479, 1988; See also http://tsallis.cat.cbpf.br/biblio.htm for an updated bibliography. Vallianatos, F., 2nd IASME/WSEAS International Conference on Geology and Seismology (GES08), Cambridge, U.K, 2008. Vallianatos F. and Triantis D., Physica A, 387, 4940-4946, 2008a.
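    A minimal numerical sketch of the ingredient underlying the generalized Gutenberg-Richter construction is the Tsallis q-exponential, which replaces the Boltzmann-Gibbs exponential when the entropy is generalized; the energies, β and q values below are invented, and the paper's specific GGR functional form is not reproduced.

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q) x]_+^{1/(1-q)}, -> exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = np.clip(1.0 + (1.0 - q) * x, 0.0, None)
    return base ** (1.0 / (1.0 - q))

# Maximizing Tsallis entropy with a (q-)expectation constraint on the energy E yields
# a q-exponential distribution p(E) ~ exp_q(-beta * E): a power-law tail for q > 1,
# reducing to the Boltzmann-Gibbs exponential as q -> 1.
E = np.linspace(0.0, 50.0, 6)    # arbitrary energy levels for illustration
beta = 0.5
for q in (1.0, 1.3, 1.6):
    p = q_exponential(-beta * E, q)
    p /= p.sum()
    print(f"q = {q}: tail weights {np.round(p[-3:], 4)}")
```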

  18. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.

  19. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.

  20. Clauser-Horne-Shimony-Holt violation and the entropy-concurrence plane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derkacz, Lukasz; Jakobczyk, Lech

    2005-10-15

    We characterize violation of Clauser-Horne-Shimony-Holt (CHSH) inequalities for mixed two-qubit states by their mixedness and entanglement. The class of states that have maximum degree of CHSH violation for a given linear entropy is also constructed.
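    The maximal CHSH value of a two-qubit state can be computed from the Horodecki criterion, and pairing it with the linear entropy gives points in the mixedness-versus-violation plane studied above; the Werner-state family below is a standard illustrative choice, not the extremal class constructed in the paper.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

def max_chsh(rho):
    """Maximal CHSH value 2*sqrt(M(rho)) via the Horodecki criterion."""
    T = np.array([[np.real(np.trace(rho @ np.kron(si, sj))) for sj in paulis]
                  for si in paulis])
    eig = np.sort(np.linalg.eigvalsh(T.T @ T))
    return 2.0 * np.sqrt(eig[-1] + eig[-2])   # two largest eigenvalues of T^T T

def linear_entropy(rho):
    """Normalized linear entropy S_L = (4/3) * (1 - Tr rho^2) for two qubits."""
    return 4.0 / 3.0 * (1.0 - np.real(np.trace(rho @ rho)))

# Werner states: p * |Phi+><Phi+| + (1-p) * I/4 (illustrative family).
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
proj = np.outer(phi_plus, phi_plus.conj())
for p in (1.0, 0.9, 0.7071, 0.5):
    rho = p * proj + (1 - p) * np.eye(4) / 4
    print(f"p = {p:.4f}:  CHSH_max = {max_chsh(rho):.3f},  S_L = {linear_entropy(rho):.3f}")
```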

  1. Classical conformal blocks and accessory parameters from isomonodromic deformations

    NASA Astrophysics Data System (ADS)

    Lencsés, Máté; Novaes, Fábio

    2018-04-01

    Classical conformal blocks appear in the large central charge limit of 2D Virasoro conformal blocks. In the AdS3 /CFT2 correspondence, they are related to classical bulk actions and used to calculate entanglement entropy and geodesic lengths. In this work, we discuss the identification of classical conformal blocks and the Painlevé VI action showing how isomonodromic deformations naturally appear in this context. We recover the accessory parameter expansion of Heun's equation from the isomonodromic τ -function. We also discuss how the c = 1 expansion of the τ -function leads to a novel approach to calculate the 4-point classical conformal block.

  2. Maximum entropy, fluctuations and priors

    NASA Astrophysics Data System (ADS)

    Caticha, A.

    2001-05-01

    The method of maximum entropy (ME) is extended to address the following problem: once one accepts that the ME distribution is to be preferred over all others, the question is to what extent distributions with lower entropy are supposed to be ruled out. Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coordinates, and allows fluctuations of both the extensive and the conjugate intensive variables. The second application is to the construction of an objective prior for Bayesian inference. The prior obtained by following the ME method to its inevitable conclusion turns out to be a special case (α=1) of what are currently known under the name of entropic priors.

  3. Analysing causal structures with entropy

    PubMed Central

    Weilenmann, Mirjam

    2017-01-01

    A central question for causal inference is to decide whether a set of correlations fits a given causal structure. In general, this decision problem is computationally infeasible and hence several approaches have emerged that look for certificates of compatibility. Here, we review several such approaches based on entropy. We bring together the key aspects of these entropic techniques with unified terminology, filling several gaps and establishing new connections, all illustrated with examples. We consider cases where unobserved causes are classical, quantum and post-quantum, and discuss what entropic analyses tell us about the difference. This difference has applications to quantum cryptography, where it can be crucial to eliminate the possibility of classical causes. We discuss the achievements and limitations of the entropic approach in comparison to other techniques and point out the main open problems. PMID:29225499

  4. Using a Classical Gluon Cascade to study the Equilibration of a Gluon-Plasma

    NASA Astrophysics Data System (ADS)

    McConnell, Lucas

    2015-10-01

    Using a classical gluon cascade, we study the thermalisation of a gluon-plasma in a homogeneous box by considering the time evolution of the entropy, and in particular how the thermalisation time depends on the strong coupling αs. We then partition the volume into cells with a linearly increasing temperature gradient in one direction, and homogeneous/isotropic in the other two directions. We allow the gluons to stream in one direction in order to study how they then evolve spatially. We examine cases with and without collisions. We study the entropy as well as the flow-velocity in the z-direction and find that the system initially has a flow which dissipates over time as the gluons become distributed homogeneously throughout the box.

  5. Analysing causal structures with entropy

    NASA Astrophysics Data System (ADS)

    Weilenmann, Mirjam; Colbeck, Roger

    2017-11-01

    A central question for causal inference is to decide whether a set of correlations fits a given causal structure. In general, this decision problem is computationally infeasible and hence several approaches have emerged that look for certificates of compatibility. Here, we review several such approaches based on entropy. We bring together the key aspects of these entropic techniques with unified terminology, filling several gaps and establishing new connections, all illustrated with examples. We consider cases where unobserved causes are classical, quantum and post-quantum, and discuss what entropic analyses tell us about the difference. This difference has applications to quantum cryptography, where it can be crucial to eliminate the possibility of classical causes. We discuss the achievements and limitations of the entropic approach in comparison to other techniques and point out the main open problems.

  6. Third law of thermodynamics as a key test of generalized entropies.

    PubMed

    Bento, E P; Viswanathan, G M; da Luz, M G E; Silva, R

    2015-02-01

    The laws of thermodynamics constrain the formulation of statistical mechanics at the microscopic level. The third law of thermodynamics states that the entropy must vanish at absolute zero temperature for systems with nondegenerate ground states in equilibrium. Conversely, the entropy can vanish only at absolute zero temperature. Here we ask whether or not generalized entropies satisfy this fundamental property. We propose a direct analytical procedure to test if a generalized entropy satisfies the third law, making only very general assumptions about the entropy S and energy U of an arbitrary N-level classical system. Mathematically, the method relies on exact calculation of β=dS/dU in terms of the microstate probabilities p(i). To illustrate this approach, we present exact results for the two best known generalizations of statistical mechanics. Specifically, we study the Kaniadakis entropy S(κ), which is additive, and the Tsallis entropy S(q), which is nonadditive. We show that the Kaniadakis entropy correctly satisfies the third law only for -1<κ<+1, thereby shedding light on why κ is conventionally restricted to this interval. Surprisingly, however, the Tsallis entropy violates the third law for q<1. Finally, we give a concrete example of the power of our proposed method by applying it to a paradigmatic system: the one-dimensional ferromagnetic Ising model with nearest-neighbor interactions.
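    The procedure described above hinges on evaluating β = dS/dU from the microstate probabilities; a minimal numerical version for a two-level system (energies 0 and 1, k_B = 1) is sketched below for the Tsallis entropy, using finite differences rather than the paper's exact analytical calculation and drawing no conclusions about the third law.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), with k_B = 1."""
    p = p[p > 0]
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# Two-level system with energies (0, 1): both U and S_q depend only on the
# excited-state probability p1, so beta = dS/dU can be taken by finite differences.
eps = 1.0
p1 = np.linspace(1e-4, 0.5, 2000)          # approach the ground state as p1 -> 0
probs = np.stack([1.0 - p1, p1], axis=1)
U = eps * p1

for q in (0.5, 1.0001, 1.5):               # q ~ 1 approximates Boltzmann-Gibbs
    S = np.array([tsallis_entropy(pr, q) for pr in probs])
    beta = np.gradient(S, U)                # beta = dS/dU
    print(f"q = {q}: S(U->0) = {S[0]:.4e},  beta near the ground state = {beta[0]:.1f}")
```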

  7. Evaluation of entropy for monitoring the depth of anesthesia compared with bispectral index: a multicenter clinical trial.

    PubMed

    Gao, Jian-dong; Zhao, Yu-jie; Xu, Chen-shi; Zhao, Jing; Huang, Yu-guang; Wang, Tian-long; Pei, Ling; Wang, Jian; Yao, Li-nong; Ding, Qian; Tan, Zhi-ming; Zhu, Zhi-rong; Yue, Yun

    2012-04-01

    As a new electroencephalogram (EEG) signal processing technique for monitoring the depth of anesthesia, entropy consists of two indices: reaction entropy (RE) and state entropy (SE). Our study compared entropy with classical bispectral index (BIS) in reduction of myoelectrical interference and noxious stimuli with EEG signals. Two hundred and eighty patients (ASA I-II, 18-60 years old) undergoing scheduled surgeries from seven medical centers were enrolled. Anesthesia induction was managed with propofol via the target-controlled infusion (TCI) system. The results of BIS, RE, SE, mean arterial pressure (MAP) and heart rate (HR) were recorded before anesthesia induction, at the moment of unconsciousness, before and 2 minutes after administration of muscle relaxant, and before and one and three minutes after the tracheal intubation. The values of half maximum effective concentrations (EC50), 5% effective concentrations (EC05) and 95% effective concentrations (EC95) of propofol effect-site concentration at the onset of unconsciousness were 1.2 (1.1-1.3 µg/ml), 2.5 (2.4-2.5 µg/ml) and 3.7 (3.7-3.8 µg/ml), while those of the predicted plasma propofol concentration were 2.8 (2.7-2.9 µg/ml), 3.9 (3.8-3.9 µg/ml) and 4.9 (4.8-5.0 µg/ml), respectively. The values of BIS, SE and RE were 62, 59 and 63 when 50% of patients lost consciousness, and 79, 80, 85 and 42, 37, 44, respectively, when 5% and 95% of patients were unconscious. The values of BIS, RE and SE dropped two minutes after the injection of muscle relaxant, but there were no significant differences between RE and SE. MAP and HR increased visibly, which indicated a reaction to tracheal intubation; the values of BIS, RE and SE, however, did not display any significant changes. This large-sample multicentric study confirmed the values of RE and SE as approximating BIS value, at the onset of unconsciousness during propofol TCI anesthesia. After elimination of myoelectrical activation, all values of RE, SE and BIS decreased significantly and the three indices were less sensitive to noxious stimuli than cardiovascular responses.

  8. REMARKS ON THE MAXIMUM ENTROPY METHOD APPLIED TO FINITE TEMPERATURE LATTICE QCD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    UMEDA, T.; MATSUFURU, H.

    2005-07-25

    We make remarks on the Maximum Entropy Method (MEM) for studies of the spectral function of hadronic correlators in finite-temperature lattice QCD. We discuss the virtues and subtleties of MEM in cases where one does not have a sufficient number of data points, such as at finite temperature. Taking these points into account, we suggest several tests that one should perform to ensure the reliability of the results, and apply them to mock and lattice QCD data.

  9. Single-Cell-Based Analysis Highlights a Surge in Cell-to-Cell Molecular Variability Preceding Irreversible Commitment in a Differentiation Process

    PubMed Central

    Boullu, Loïs; Morin, Valérie; Vallin, Elodie; Guillemin, Anissa; Papili Gao, Nan; Cosette, Jérémie; Arnaud, Ophélie; Kupiec, Jean-Jacques; Espinasse, Thibault

    2016-01-01

    In some recent studies, a view emerged that stochastic dynamics governing the switching of cells from one differentiation state to another could be characterized by a peak in gene expression variability at the point of fate commitment. We have tested this hypothesis at the single-cell level by analyzing primary chicken erythroid progenitors through their differentiation process and measuring the expression of selected genes at six sequential time-points after induction of differentiation. In contrast to population-based expression data, single-cell gene expression data revealed a high cell-to-cell variability, which was masked by averaging. We were able to show that the correlation network was a very dynamical entity and that a subgroup of genes tend to follow the predictions from the dynamical network biomarker (DNB) theory. In addition, we also identified a small group of functionally related genes encoding proteins involved in sterol synthesis that could act as the initial drivers of the differentiation. In order to assess quantitatively the cell-to-cell variability in gene expression and its evolution in time, we used Shannon entropy as a measure of the heterogeneity. Entropy values showed a significant increase in the first 8 h of the differentiation process, reaching a peak between 8 and 24 h, before decreasing to significantly lower values. Moreover, we observed that the previous point of maximum entropy precedes two paramount key points: an irreversible commitment to differentiation between 24 and 48 h followed by a significant increase in cell size variability at 48 h. In conclusion, when analyzed at the single cell level, the differentiation process looks very different from its classical population average view. New observables (like entropy) can be computed, the behavior of which is fully compatible with the idea that differentiation is not a “simple” program that all cells execute identically but results from the dynamical behavior of the underlying molecular network. PMID:28027290

  10. Single-Cell-Based Analysis Highlights a Surge in Cell-to-Cell Molecular Variability Preceding Irreversible Commitment in a Differentiation Process.

    PubMed

    Richard, Angélique; Boullu, Loïs; Herbach, Ulysse; Bonnafoux, Arnaud; Morin, Valérie; Vallin, Elodie; Guillemin, Anissa; Papili Gao, Nan; Gunawan, Rudiyanto; Cosette, Jérémie; Arnaud, Ophélie; Kupiec, Jean-Jacques; Espinasse, Thibault; Gonin-Giraud, Sandrine; Gandrillon, Olivier

    2016-12-01

    In some recent studies, a view emerged that stochastic dynamics governing the switching of cells from one differentiation state to another could be characterized by a peak in gene expression variability at the point of fate commitment. We have tested this hypothesis at the single-cell level by analyzing primary chicken erythroid progenitors through their differentiation process and measuring the expression of selected genes at six sequential time-points after induction of differentiation. In contrast to population-based expression data, single-cell gene expression data revealed a high cell-to-cell variability, which was masked by averaging. We were able to show that the correlation network was a very dynamical entity and that a subgroup of genes tend to follow the predictions from the dynamical network biomarker (DNB) theory. In addition, we also identified a small group of functionally related genes encoding proteins involved in sterol synthesis that could act as the initial drivers of the differentiation. In order to assess quantitatively the cell-to-cell variability in gene expression and its evolution in time, we used Shannon entropy as a measure of the heterogeneity. Entropy values showed a significant increase in the first 8 h of the differentiation process, reaching a peak between 8 and 24 h, before decreasing to significantly lower values. Moreover, we observed that the previous point of maximum entropy precedes two paramount key points: an irreversible commitment to differentiation between 24 and 48 h followed by a significant increase in cell size variability at 48 h. In conclusion, when analyzed at the single cell level, the differentiation process looks very different from its classical population average view. New observables (like entropy) can be computed, the behavior of which is fully compatible with the idea that differentiation is not a "simple" program that all cells execute identically but results from the dynamical behavior of the underlying molecular network.
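    The entropy-based variability measure used in this study is straightforward to compute per gene and per time point; the sketch below does so on synthetic cells-by-genes matrices whose spread is made to rise and then fall, merely to mimic the reported surge (the data, bin range and time labels are invented).

```python
import numpy as np

def expression_entropy(values, bins=12, value_range=(0.0, 12.0)):
    """Shannon entropy (bits) of one gene's expression across cells, on a fixed grid."""
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)))

# Synthetic cells x genes matrices at three time points: the spread of expression is
# low early, high mid-differentiation, and low again, mimicking the reported surge.
rng = np.random.default_rng(5)
time_points = {
    "0 h":  rng.normal(5.0, 0.5, size=(200, 20)),
    "16 h": rng.normal(5.0, 2.0, size=(200, 20)),
    "72 h": rng.normal(8.0, 0.6, size=(200, 20)),
}
for label, expr in time_points.items():
    mean_H = np.mean([expression_entropy(expr[:, g]) for g in range(expr.shape[1])])
    print(f"{label}: mean per-gene entropy = {mean_H:.2f} bits")
```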

  11. It is not the entropy you produce, rather, how you produce it

    PubMed Central

    Volk, Tyler; Pauluis, Olivier

    2010-01-01

    The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise the entropy production rate. Our analysis makes explicit fundamental questions for MEP that should be brought into focus: can MEP predict not just the overall state of entropy production of a system but also the details of the sub-systems of dissipaters within the system? Which fluxes of the system are those that are most likely to be maximized? How it is possible for MEP theory to be so domain-neutral that it can claim to apply equally to both purely physical–chemical systems and also systems governed by the ‘laws’ of biological evolution? We conclude that the principle of MEP needs to take on the issue of exactly how entropy is produced. PMID:20368249

  12. A subjective supply-demand model: the maximum Boltzmann/Shannon entropy solution

    NASA Astrophysics Data System (ADS)

    Piotrowski, Edward W.; Sładkowski, Jan

    2009-03-01

    The present authors have put forward a projective geometry model of rational trading. The expected (mean) value of the time that is necessary to strike a deal and the profit strongly depend on the strategies adopted. A frequent trader often prefers maximal profit intensity to the maximization of profit resulting from a separate transaction, because the gross profit/income is the adopted/recommended benchmark. To investigate activities that have different durations we define, following queuing theory, the profit intensity as a measure of this economic category. The profit intensity in repeated trading has the unique property of attaining its maximum at a fixed point regardless of the shape of demand curves, for a wide class of probability distributions of random reverse transactions (i.e. closing of the position). These conclusions remain valid for an analogous model based on supply analysis. This type of market game is often considered in research aiming at finding an algorithm that maximizes the profit of a trader who negotiates prices with the Rest of the World (a collective opponent) possessing a definite and objective supply profile. Such idealization neglects the sometimes important influence of an individual trader on the demand/supply profile of the Rest of the World and in extreme cases questions the very idea of a demand/supply profile. Therefore we put forward a trading model in which the demand/supply profile of the Rest of the World induces the (rational) trader to (subjectively) presume that he/she lacks (almost) all knowledge concerning the market but his/her average frequency of trade. This point of view introduces maximum entropy principles into the model and broadens the range of economic phenomena that can be perceived as a sort of thermodynamical system. As a consequence, the profit intensity has a fixed point with an astonishing connection with Fibonacci's classical work and with the search for the quickest algorithm for locating the extremum of a convex function: the profit intensity reaches its maximum when the probability of a transaction is given by the golden ratio rule (√5 − 1)/2. This condition sets a sharp criterion of validity of the model and can be tested with real market data.

  13. Inability of the entropy vector method to certify nonclassicality in linelike causal structures

    NASA Astrophysics Data System (ADS)

    Weilenmann, Mirjam; Colbeck, Roger

    2016-10-01

    Bell's theorem shows that our intuitive understanding of causation must be overturned in light of quantum correlations. Nevertheless, quantum mechanics does not permit signaling and hence a notion of cause remains. Understanding this notion is not only important at a fundamental level, but also for technological applications such as key distribution and randomness expansion. It has recently been shown that a useful way to decide which classical causal structures could give rise to a given set of correlations is to use entropy vectors. These are vectors whose components are the entropies of all subsets of the observed variables in the causal structure. The entropy vector method employs causal relationships among the variables to restrict the set of possible entropy vectors. Here, we consider whether the same approach can lead to useful certificates of nonclassicality within a given causal structure. Surprisingly, we find that for a family of causal structures that includes the usual bipartite Bell structure they do not. For all members of this family, no function of the entropies of the observed variables gives such a certificate, in spite of the existence of nonclassical correlations. It is therefore necessary to look beyond entropy vectors to understand cause from a quantum perspective.

  14. Entanglement entropy for 2D gauge theories with matters

    NASA Astrophysics Data System (ADS)

    Aoki, Sinya; Iizuka, Norihiro; Tamaoka, Kotaro; Yokoya, Tsuyoshi

    2017-08-01

    We investigate the entanglement entropy in 1+1-dimensional SU(N) gauge theories with various matter fields using the lattice regularization. Here we use the extended Hilbert space definition for entanglement entropy, which contains three contributions: (1) classical Shannon entropy associated with superselection sector distribution, where sectors are labeled by irreducible representations of boundary penetrating fluxes, (2) logarithm of the dimensions of their representations, which is associated with "color entanglement," and (3) EPR Bell pairs, which give "genuine" entanglement. We explicitly show that entanglement entropies (1) and (2) above indeed appear for various multiple "meson" states in gauge theories with matter fields. Furthermore, we employ the transfer matrix formalism for gauge theory with a fundamental matter field and analyze its ground state using hopping parameter expansion (HPE), where the hopping parameter K is roughly the inverse square of the mass for the matter. We evaluate the entanglement entropy for the ground state and show that all (1), (2), (3) above appear in the HPE, though the Bell pair part (3) appears in higher order than (1) and (2) do. With these results, we discuss how the ground state entanglement entropy in the continuum limit can be understood from the lattice ground state obtained in the HPE.

  15. Finite entanglement entropy of black holes

    NASA Astrophysics Data System (ADS)

    Giaccari, Stefano; Modesto, Leonardo; Rachwał, Lesław; Zhu, Yiwei

    2018-06-01

    We compute the area term contribution to black holes' entanglement entropy (using the conical technique) for a class of local or weakly non-local super-renormalizable gravitational theories coupled to matter. For the first time, we explicitly prove that all the beta functions in the proposed theory, except for the cosmological constant, are identically zero in the cut-off regularization scheme and not only in the dimensional regularization scheme. In particular, we show that there is no divergence quadratic in the cut-off and hence there is no contribution to the beta function of the Newton constant. As a consequence of this result, we argue that in these theories of gravity the conical entropy is a sensible definition of physical entropy; in particular, it is positive-definite and gauge independent. On top of this, the conical entropy, being expressed only in terms of the classical Newton constant, turns out to be finite and naturally coincides with the Bekenstein-Hawking entropy. Finally, we propose a theory in which the renormalization of the Newton constant is entirely due to the Standard Model matter, arguing that such a contribution does not give the usual interpretational problems of conical entropy discussed in the literature.

  16. Perspective: Maximum caliber is a general variational principle for dynamical systems

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A.

    2018-01-01

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics—such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production—are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.
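
    As a concrete, hedged illustration of the Max Cal recipe summarized above, the Python sketch below infers the transition matrix of a two-state Markov process by maximizing path entropy subject to a constraint on the average switching rate, using the standard exponential-tilting / Perron-eigenvector construction. The target rate and the uniform reference dynamics are invented for the example and are not taken from the paper.

    ```python
    import numpy as np

    def maxcal_two_state(target_rate, tol=1e-10):
        """Maximum-caliber transition matrix of a 2-state Markov chain,
        constrained so that the stationary switching rate equals target_rate.
        Path entropy is maximized by exponentially tilting a uniform reference
        dynamics and renormalizing with the Perron eigenvector."""
        def chain_for(lam):
            # Tilted weight matrix: switches (i != j) get weight exp(lam).
            W = np.array([[1.0, np.exp(lam)],
                          [np.exp(lam), 1.0]])
            evals, evecs = np.linalg.eig(W)
            k = np.argmax(evals.real)
            rho, v = evals.real[k], np.abs(evecs[:, k].real)
            # Doob-transformed (properly normalized, row-stochastic) matrix.
            P = W * v[None, :] / (rho * v[:, None])
            # Stationary distribution of P (left Perron eigenvector).
            evals_l, evecs_l = np.linalg.eig(P.T)
            pi = np.abs(evecs_l[:, np.argmax(evals_l.real)].real)
            pi /= pi.sum()
            rate = pi[0] * P[0, 1] + pi[1] * P[1, 0]
            return P, rate

        lo, hi = -20.0, 20.0          # bisection on the Lagrange multiplier
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            _, rate = chain_for(mid)
            if rate < target_rate:
                lo = mid
            else:
                hi = mid
        return chain_for(0.5 * (lo + hi))

    P, rate = maxcal_two_state(target_rate=0.25)   # hypothetical constraint value
    print("Max Cal transition matrix:\n", P)
    print("achieved switching rate:", rate)
    ```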

  17. Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung

    2015-04-01

    In environmental or other scientific applications, we must have a certain understanding of the geological lithological composition. Because of restrictions imposed by real conditions, only a limited amount of data can be acquired. To find out the lithological distribution in the study area, many spatial statistical methods are used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging method in the field of geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data and combine not only hard data but also soft data to improve estimation. Lithological classification data are discrete categorical data. Therefore, this research applied categorical BME to establish a complete three-dimensional lithological estimation model. We apply the limited hard data from cores, together with soft data generated from geological dating data and virtual wells, to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting

  18. Perspective: Maximum caliber is a general variational principle for dynamical systems.

    PubMed

    Dixit, Purushottam D; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A

    2018-01-07

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics, such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production, are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosales-Zarate, Laura E. C.; Drummond, P. D.

    We calculate the quantum Renyi entropy in a phase-space representation for either fermions or bosons. This can also be used to calculate purity and fidelity, or the entanglement between two systems. We show that it is possible to calculate the entropy from sampled phase-space distributions in normally ordered representations, although this is not possible for all quantum states. We give an example of the use of this method in an exactly soluble thermal case. The quantum entropy cannot be calculated at all using sampling methods in classical symmetric (Wigner) or antinormally ordered (Husimi) phase spaces, due to inner-product divergences. The preferred method is to use generalized Gaussian phase-space methods, which utilize a distribution over stochastic Green's functions. We illustrate this approach by calculating the reduced entropy and entanglement of bosonic or fermionic modes coupled to a time-evolving, non-Markovian reservoir.

  20. Information and Entropy

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2007-11-01

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
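
    To make the updating rule described above concrete, here is a small numerical sketch (ours, not the paper's): minimizing the relative entropy to a prior on a discrete grid, subject to a single expectation-value constraint, produces an exponentially tilted prior. The prior shape and the constraint value are arbitrary illustrative choices.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Discrete prior q(x) on a grid; constraint <x> = target under the posterior p.
    x = np.linspace(0.0, 10.0, 201)
    q = np.exp(-0.5 * (x - 3.0) ** 2)       # unnormalized Gaussian prior (illustrative)
    q /= q.sum()
    target = 5.0                             # assumed expectation constraint

    def tilted(lam):
        """Posterior p(x) proportional to q(x) exp(lam * x): the ME update."""
        w = q * np.exp(lam * x)
        return w / w.sum()

    def moment_gap(lam):
        return np.sum(tilted(lam) * x) - target

    lam_star = brentq(moment_gap, -50.0, 50.0)   # solve for the Lagrange multiplier
    p = tilted(lam_star)

    kl = np.sum(p * np.log(p / q))               # relative entropy actually paid
    print(f"lambda = {lam_star:.4f}, <x> = {np.sum(p * x):.4f}, KL(p||q) = {kl:.4f}")
    ```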

  1. Development and application of the maximum entropy method and other spectral estimation techniques

    NASA Astrophysics Data System (ADS)

    King, W. R.

    1980-09-01

    This summary report is a collection of four separate progress reports prepared under three contracts, which are all sponsored by the Office of Naval Research in Arlington, Virginia. This report contains the results of investigations into the application of the maximum entropy method (MEM), a high-resolution frequency and wavenumber estimation technique. The report also contains a description, provided in the final report section, of two new, stable, high-resolution spectral estimation techniques. Many examples of wavenumber spectral patterns for all investigated techniques are included throughout the report. The maximum entropy method is also known as the maximum entropy spectral analysis (MESA) technique, and both names are used in the report. Many MEM wavenumber spectral patterns are demonstrated using both simulated and measured radar signal and noise data. Methods for obtaining stable MEM wavenumber spectra are discussed, broadband signal detection using the MEM prediction error transform (PET) is discussed, and Doppler radar narrowband signal detection is demonstrated using the MEM technique. It is also shown that MEM cannot be applied to randomly sampled data. The two new, stable, high-resolution spectral estimation techniques discussed in the final report section are named the Wiener-King and the Fourier spectral estimation techniques. The two new techniques have a similar derivation based upon the Wiener prediction filter, but the two techniques are otherwise quite different. Further development of the techniques and measurement of their spectral characteristics are recommended for subsequent investigation.
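
    Since the record above centers on the maximum entropy method (MEM/MESA) for spectral estimation, the following Python sketch implements the textbook Burg recursion for an autoregressive (maximum entropy) spectrum. It is a generic illustration rather than the report's radar processing chain, and the test signal parameters are made up.

    ```python
    import numpy as np

    def burg(x, order):
        """Burg's maximum entropy (AR) estimate: returns AR coefficients a
        (with a[0] = 1) and the prediction-error power E."""
        x = np.asarray(x, dtype=float)
        a = np.array([1.0])
        E = np.dot(x, x) / len(x)
        f, b = x.copy(), x.copy()            # forward / backward prediction errors
        for _ in range(order):
            fk, bk = f[1:], b[:-1]
            k = -2.0 * np.dot(fk, bk) / (np.dot(fk, fk) + np.dot(bk, bk))
            a = np.concatenate([a, [0.0]])
            a = a + k * a[::-1]              # Levinson-type coefficient update
            f, b = fk + k * bk, bk + k * fk  # update and shorten the error series
            E *= (1.0 - k * k)
        return a, E

    def mem_spectrum(a, E, nfreq=512):
        """Maximum entropy PSD: P(f) = E / |A(e^{j 2 pi f})|^2."""
        A = np.fft.rfft(a, n=2 * nfreq)
        return E / np.abs(A) ** 2, np.linspace(0.0, 0.5, len(A))

    # Illustrative test signal: two sinusoids in white noise (made-up parameters).
    rng = np.random.default_rng(0)
    n = np.arange(256)
    x = np.sin(2 * np.pi * 0.12 * n) + 0.5 * np.sin(2 * np.pi * 0.30 * n) \
        + 0.2 * rng.standard_normal(n.size)

    a, E = burg(x, order=12)
    psd, freqs = mem_spectrum(a, E)
    print("strongest spectral peak near f =", freqs[np.argmax(psd)])
    ```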

  2. Molecular extended thermodynamics of rarefied polyatomic gases and wave velocities for increasing number of moments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arima, Takashi, E-mail: tks@stat.nitech.ac.jp; Mentrelli, Andrea, E-mail: andrea.mentrelli@unibo.it; Ruggeri, Tommaso, E-mail: tommaso.ruggeri@unibo.it

    Molecular extended thermodynamics of rarefied polyatomic gases is characterized by two hierarchies of equations for moments of a suitable distribution function in which the internal degrees of freedom of a molecule are taken into account. On the basis of physical relevance the truncation orders of the two hierarchies are proven not to be independent of each other, and the closure procedures based on the maximum entropy principle (MEP) and on the entropy principle (EP) are proven to be equivalent. The characteristic velocities of the emerging hyperbolic system of differential equations are compared to those obtained for monatomic gases and the lower bound estimate for the maximum equilibrium characteristic velocity established for monatomic gases (characterized by only one hierarchy for moments with truncation order of moments N) by Boillat and Ruggeri (1997), $\lambda_{(N)}^{E,\max}/c_0 \ge \sqrt{\frac{6}{5}\left(N-\frac{1}{2}\right)}$ with $c_0=\sqrt{\frac{5}{3}\frac{k}{m}T}$, is proven to hold also for rarefied polyatomic gases independently of the degrees of freedom of a molecule. -- Highlights: •Molecular extended thermodynamics of rarefied polyatomic gases is studied. •The relation between two hierarchies of equations for moments is derived. •The equivalence of maximum entropy principle and entropy principle is proven. •The characteristic velocities are compared to those of monatomic gases. •The lower bound of the maximum characteristic velocity is estimated.

  3. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better understanding of the behavior of these systems.

  4. Quantum Rényi relative entropies affirm universality of thermodynamics.

    PubMed

    Misra, Avijit; Singh, Uttam; Bera, Manabendra Nath; Rajagopal, A K

    2015-10-01

    We formulate a complete theory of quantum thermodynamics in the Rényi entropic formalism exploiting the Rényi relative entropies, starting from the maximum entropy principle. In establishing the first and second laws of quantum thermodynamics, we have correctly identified accessible work and heat exchange in both equilibrium and nonequilibrium cases. The free energy (internal energy minus temperature times entropy) remains unaltered, when all the entities entering this relation are suitably defined. Exploiting Rényi relative entropies we have shown that this "form invariance" holds even beyond equilibrium and has profound operational significance in isothermal process. These results reduce to the Gibbs-von Neumann results when the Rényi entropic parameter α approaches 1. Moreover, it is shown that the universality of the Carnot statement of the second law is the consequence of the form invariance of the free energy, which is in turn the consequence of maximum entropy principle. Further, the Clausius inequality, which is the precursor to the Carnot statement, is also shown to hold based on the data processing inequalities for the traditional and sandwiched Rényi relative entropies. Thus, we find that the thermodynamics of nonequilibrium state and its deviation from equilibrium together determine the thermodynamic laws. This is another important manifestation of the concepts of information theory in thermodynamics when they are extended to the quantum realm. Our work is a substantial step towards formulating a complete theory of quantum thermodynamics and corresponding resource theory.

  5. Low Streamflow Forcasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a manner that the relative entropy of the underlying process is minimized, so that the time series data can be forecasted. Different prior estimates, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.

  6. Energy conservation and maximal entropy production in enzyme reactions.

    PubMed

    Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš

    2017-08-01

    A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, fixed equilibrium constant of a reaction and fixed products of forward and backward enzyme rate constants the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield as well as the stationary reaction flux are calculated. The test of whether these calculated values of the reaction parameters are consistent with their corresponding measured values is performed for the enzyme Glucose Isomerase. It is found that calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and the product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme Glucose Isomerase, considered in a non-equilibrium stationary state, as found in experiments using the continuous stirred tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Shock heating of the solar wind plasma

    NASA Technical Reports Server (NTRS)

    Whang, Y. C.; Liu, Shaoliang; Burlaga, L. F.

    1990-01-01

    The role played by shocks in heating solar-wind plasma is investigated using data on 413 shocks which were identified from the plasma and magnetic-field data collected between 1973 and 1982 by Pioneer and Voyager spacecraft. It is found that the average shock strength increased with the heliocentric distance outside 1 AU, reaching a maximum near 5 AU, after which the shock strength decreased with the distance; the entropy of the solar wind protons also reached a maximum at 5 AU. An MHD simulation model in which shock heating is the only heating mechanism available was used to calculate the entropy changes for the November 1977 event. The calculated entropy agreed well with the value calculated from observational data, suggesting that shocks are chiefly responsible for heating solar wind plasma between 1 and 15 AU.

  8. Entropy maximization under the constraints on the generalized Gini index and its application in modeling income distributions

    NASA Astrophysics Data System (ADS)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2015-11-01

    In economics and social sciences, inequality measures such as the Gini index, the Pietra index, etc., are commonly used to measure the statistical dispersion. There is a generalization of the Gini index which includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income. The most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to the US family total money income in 2009, 2011 and 2013, and their relative performances with respect to the generalized beta of the second kind family are compared.
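
    The generic machinery behind the abstract above can be sketched as follows: a maximum entropy distribution on a grid under a mean constraint plus one additional expectation constraint. Note that the extra constraint used here (an expectation of log income) is only a stand-in for the generalized Gini index, whose Lorenz-type structure is more involved; the grid and target values are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    # Income grid (illustrative units) and constraint targets (made-up values).
    x = np.linspace(1.0, 200.0, 2000)
    mean_target = 60.0
    g = np.log(x)                 # stand-in constraint function (NOT the Gini index)
    g_target = 3.8

    def density(lams):
        """MaxEnt form: p proportional to exp(-lam1 * x - lam2 * g(x))."""
        lam1, lam2 = lams
        logw = -lam1 * x - lam2 * g
        logw -= logw.max()                    # stabilize before exponentiating
        w = np.exp(logw)
        return w / w.sum()

    def residuals(lams):
        p = density(lams)
        return [np.sum(p * x) - mean_target,
                np.sum(p * g) - g_target]

    lam_star = fsolve(residuals, x0=[0.01, 0.0])   # solve for the multipliers
    p = density(lam_star)
    entropy = -np.sum(p * np.log(p))
    print("multipliers:", lam_star, " entropy:", entropy)
    ```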

  9. A Maximum Entropy Method for Particle Filtering

    NASA Astrophysics Data System (ADS)

    Eyink, Gregory L.; Kim, Sangil

    2006-06-01

    Standard ensemble or particle filtering schemes do not properly represent states of low prior probability when the number of available samples is too small, as is often the case in practical applications. We introduce here a set of parametric resampling methods to solve this problem. Motivated by a general H-theorem for relative entropy, we construct parametric models for the filter distributions as maximum-entropy/minimum-information models consistent with moments of the particle ensemble. When the prior distributions are modeled as mixtures of Gaussians, our method naturally generalizes the ensemble Kalman filter to systems with highly non-Gaussian statistics. We apply the new particle filters presented here to two simple test cases: a one-dimensional diffusion process in a double-well potential and the three-dimensional chaotic dynamical system of Lorenz.
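
    As a minimal illustration of the resampling idea described above, and only in its simplest single-Gaussian form rather than the Gaussian-mixture version used in the paper, the Python sketch below replaces a weighted particle ensemble by draws from the maximum entropy distribution consistent with its first two moments, i.e. a Gaussian.

    ```python
    import numpy as np

    def maxent_resample(particles, weights, rng):
        """Replace a weighted ensemble by draws from the maximum entropy
        distribution matching its weighted mean and covariance (a Gaussian)."""
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        mean = particles.T @ w
        centered = particles - mean
        cov = (centered * w[:, None]).T @ centered      # weighted covariance
        return rng.multivariate_normal(mean, cov, size=len(particles))

    # Illustrative use: a small 2-D ensemble with very uneven weights.
    rng = np.random.default_rng(1)
    parts = rng.normal(size=(50, 2))
    wts = np.exp(-0.5 * np.sum((parts - 2.0) ** 2, axis=1))   # made-up likelihood
    new_parts = maxent_resample(parts, wts, rng)
    print("resampled ensemble mean:", new_parts.mean(axis=0))
    ```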

  10. Maximum entropy and equations of state for random cellular structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivier, N.

    Random, space-filling cellular structures (biological tissues, metallurgical grain aggregates, foams, etc.) are investigated. Maximum entropy inference under a few constraints yields structural equations of state, relating the size of cells to their topological shape. These relations are known empirically as Lewis's law in Botany, or Desch's relation in Metallurgy. Here, the functional form of the constraints is not known a priori, and one takes advantage of this arbitrariness to increase the entropy further. The resulting structural equations of state are independent of priors; they are measurable experimentally and therefore constitute a direct test for the applicability of MaxEnt inference (given that the structure is in statistical equilibrium, a fact which can be tested by another simple relation (Aboav's law)). 23 refs., 2 figs., 1 tab.

  11. Entanglement entropy of dispersive media from thermodynamic entropy in one higher dimension.

    PubMed

    Maghrebi, M F; Reid, M T H

    2015-04-17

    A dispersive medium becomes entangled with zero-point fluctuations in the vacuum. We consider an arbitrary array of material bodies weakly interacting with a quantum field and compute the quantum mutual information between them. It is shown that the mutual information in D dimensions can be mapped to classical thermodynamic entropy in D+1 dimensions. As a specific example, we compute the mutual information both analytically and numerically for a range of separation distances between two bodies in D=2 dimensions and find a logarithmic correction to the area law at short separations. A key advantage of our method is that it allows the strong subadditivity property to be easily verified.

  12. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

    This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190

  13. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S.

    1989-05-15

    Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value.

  14. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations

    NASA Astrophysics Data System (ADS)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-01

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  15. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    PubMed

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  16. Exact Maximum-Entropy Estimation with Feynman Diagrams

    NASA Astrophysics Data System (ADS)

    Netser Zernik, Amitai; Schlank, Tomer M.; Tessler, Ran J.

    2018-02-01

    A longstanding open problem in statistics is finding an explicit expression for the probability measure which maximizes entropy with respect to given constraints. In this paper a solution to this problem is found, using perturbative Feynman calculus. The explicit expression is given as a sum over weighted trees.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine A.

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide, has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 k_B per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.

  18. Bose-Einstein condensation of the classical axion field in cosmology?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Sacha; Elmer, Martin, E-mail: s.davidson@ipnl.in2p3.fr, E-mail: m.elmer@ipnl.in2p3.fr

    The axion is a motivated cold dark matter candidate, which it would be interesting to distinguish from weakly interacting massive particles. Sikivie has suggested that axions could behave differently during non-linear galaxy evolution, if they form a Bose-Einstein condensate, and argues that "gravitational thermalisation" drives them to a Bose-Einstein condensate during the radiation dominated era. Using classical equations of motion during linear structure formation, we explore whether the gravitational interactions of axions can generate enough entropy. At linear order in G_N, we interpret that the principal activities of gravity are to expand the Universe and grow density fluctuations. To quantify the rate of entropy creation we use the anisotropic stress to estimate a short dissipation scale for axions, which does not confirm previous estimates of their gravitational thermalisation rate.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khosla, D.; Singh, M.

    The estimation of three-dimensional dipole current sources on the cortical surface from the measured magnetoencephalogram (MEG) is a highly underdetermined inverse problem, as there are many "feasible" images which are consistent with the MEG data. Previous approaches to this problem have concentrated on the use of weighted minimum norm inverse methods. While these methods ensure a unique solution, they often produce overly smoothed solutions and exhibit severe sensitivity to noise. In this paper we explore the maximum entropy approach to obtain better solutions to the problem. This estimation technique selects that image from the possible set of feasible images which has the maximum entropy permitted by the information available to us. In order to account for the presence of noise in the data, we have also incorporated a noise rejection or likelihood term into our maximum entropy method. This makes our approach mirror a Bayesian maximum a posteriori (MAP) formulation. Additional information from other functional techniques like functional magnetic resonance imaging (fMRI) can be incorporated in the proposed method in the form of a prior bias function to improve solutions. We demonstrate the method with experimental phantom data from a clinical 122-channel MEG system.
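
    A heavily simplified sketch of the kind of objective described above, a Gaussian likelihood (noise rejection) term plus an entropy prior relative to a bias image m, is given below. The toy forward matrix, noise level, and regularization weight are all invented; real MEG source estimation involves far more structure than this.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)

    # Toy linear forward model y = A s + noise (dimensions are illustrative).
    n_sensors, n_sources = 30, 100
    A = rng.normal(size=(n_sensors, n_sources))
    s_true = np.abs(rng.normal(size=n_sources))
    s_true[60:70] += 5.0                       # a localized "active" patch
    sigma = 0.5
    y = A @ s_true + sigma * rng.normal(size=n_sensors)

    m = np.full(n_sources, s_true.mean())      # prior bias image (could come from fMRI)
    alpha = 1.0                                # regularization weight (assumed)

    def neg_map(s):
        """Negative MAP objective: chi-square/2 minus alpha times the entropy."""
        resid = y - A @ s
        chi2 = resid @ resid / (2.0 * sigma ** 2)
        ent = np.sum(s - m - s * np.log(s / m))    # Skilling-style entropy, <= 0
        return chi2 - alpha * ent

    res = minimize(neg_map, x0=m.copy(), method="L-BFGS-B",
                   bounds=[(1e-8, None)] * n_sources)
    print("relative reconstruction error:",
          np.linalg.norm(res.x - s_true) / np.linalg.norm(s_true))
    ```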

  20. Respiration-Averaged CT for Attenuation Correction of PET Images – Impact on PET Texture Features in Non-Small Cell Lung Cancer Patients

    PubMed Central

    Cheng, Nai-Ming; Fang, Yu-Hua Dean; Tsan, Din-Li

    2016-01-01

    Purpose We compared attenuation correction of PET images with helical CT (PET/HCT) and respiration-averaged CT (PET/ACT) in patients with non-small-cell lung cancer (NSCLC) with the goal of investigating the impact of respiration-averaged CT on 18F FDG PET texture parameters. Materials and Methods A total of 56 patients were enrolled. Tumors were segmented on pretreatment PET images using the adaptive threshold. Twelve different texture parameters were computed: standard uptake value (SUV) entropy, uniformity, entropy, dissimilarity, homogeneity, coarseness, busyness, contrast, complexity, grey-level nonuniformity, zone-size nonuniformity, and high grey-level large zone emphasis. Comparisons of PET/HCT and PET/ACT were performed using Wilcoxon signed-rank tests, intraclass correlation coefficients, and Bland-Altman analysis. Receiver operating characteristic (ROC) curves as well as univariate and multivariate Cox regression analyses were used to identify the parameters significantly associated with disease-specific survival (DSS). A fixed threshold at 45% of the maximum SUV (T45) was used for validation. Results SUV maximum and total lesion glycolysis (TLG) were significantly higher in PET/ACT. However, texture parameters obtained with PET/ACT and PET/HCT showed a high degree of agreement. The lowest levels of variation between the two modalities were observed for SUV entropy (9.7%) and entropy (9.8%). SUV entropy, entropy, and coarseness from both PET/ACT and PET/HCT were significantly associated with DSS. Validation analyses using T45 confirmed the usefulness of SUV entropy and entropy in both PET/HCT and PET/ACT for the prediction of DSS, but only coarseness from PET/ACT achieved the statistical significance threshold. Conclusions Our results indicate that 1) texture parameters from PET/ACT are clinically useful in the prediction of survival in NSCLC patients and 2) SUV entropy and entropy are robust to attenuation correction methods. PMID:26930211

  1. Maximum Entropy Production As a Framework for Understanding How Living Systems Evolve, Organize and Function

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.

    2014-12-01

    The maximum entropy production (MEP) principle holds that non-equilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information acquired via evolution and curated by natural selection in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and can orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.

  2. Effective field theory of dissipative fluids (II): classical limit, dynamical KMS symmetry and entropy current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glorioso, Paolo; Crossley, Michael; Liu, Hong

    2017-09-20

    In this paper we further develop the previously proposed fluctuating hydrodynamics in a number of ways. We first work out in detail the classical limit of the hydrodynamical action, which exhibits many simplifications. In particular, this enables a transparent formulation of the action in physical spacetime in the presence of arbitrary external fields. It also helps to clarify issues related to field redefinitions and frame choices. We then propose that the action is invariant under a Z2 symmetry to which we refer as the dynamical KMS symmetry. The dynamical KMS symmetry is physically equivalent to the previously proposed local KMS condition in the classical limit, but is more convenient to implement and more general. It is applicable to any states in local equilibrium rather than just a thermal density matrix perturbed by external background fields. Finally we elaborate the formulation for a conformal fluid, which contains some new features, and work out the explicit form of the entropy current to second order in derivatives for a neutral conformal fluid.

  3. Giant onsite electronic entropy enhances the performance of ceria for water splitting

    DOE PAGES

    Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine A.; ...

    2017-08-18

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide, has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 k_B per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.

  4. Giant onsite electronic entropy enhances the performance of ceria for water splitting.

    PubMed

    Naghavi, S Shahab; Emery, Antoine A; Hansen, Heine A; Zhou, Fei; Ozolins, Vidvuds; Wolverton, Chris

    2017-08-18

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide, has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 k_B per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions. Solid-state entropy of reduction increases the thermodynamic efficiency of ceria for two-step thermochemical water splitting. Here, the authors report a large and different source of entropy, the onsite electronic configurational entropy arising from coupling between orbital and spin angular momenta in f orbitals.

  5. Shallow water equations: viscous solutions and inviscid limit

    NASA Astrophysics Data System (ADS)

    Chen, Gui-Qiang; Perepelitsa, Mikhail

    2012-12-01

    We establish the inviscid limit of the viscous shallow water equations to the Saint-Venant system. For the viscous equations, the viscosity terms are more degenerate when the shallow water is close to the bottom, in comparison with the classical Navier-Stokes equations for barotropic gases; thus, the analysis in our earlier work for the classical Navier-Stokes equations does not apply directly, and new estimates are required to deal with the additional degeneracy. We first introduce a notion of entropy solutions to the viscous shallow water equations and develop an approach to establish the global existence of such solutions and their uniform energy-type estimates with respect to the viscosity coefficient. These uniform estimates yield the existence of measure-valued solutions to the Saint-Venant system generated by the viscous solutions. Based on the uniform energy-type estimates and the features of the Saint-Venant system, we further establish that the entropy dissipation measures of the viscous solutions for weak entropy-entropy flux pairs, generated by compactly supported C^2 test-functions, are confined in a compact set in H^{-1}, which yields that the measure-valued solutions are confined by the Tartar-Murat commutator relation. Then, the reduction theorem established in Chen and Perepelitsa [5] for the measure-valued solutions with unbounded support leads to the convergence of the viscous solutions to a finite-energy entropy solution of the Saint-Venant system with finite-energy initial data, which is relative with respect to the different end-states of the bottom topography of the shallow water at infinity. The analysis also applies to the inviscid limit problem for the Saint-Venant system in the presence of friction.

  6. Direct measurement of weakly nonequilibrium system entropy is consistent with Gibbs–Shannon form

    PubMed Central

    2017-01-01

    Stochastic thermodynamics extends classical thermodynamics to small systems in contact with one or more heat baths. It can account for the effects of thermal fluctuations and describe systems far from thermodynamic equilibrium. A basic assumption is that the expression for Shannon entropy is the appropriate description for the entropy of a nonequilibrium system in such a setting. Here we measure experimentally this function in a system that is in local but not global equilibrium. Our system is a micron-scale colloidal particle in water, in a virtual double-well potential created by a feedback trap. We measure the work to erase a fraction of a bit of information and show that it is bounded by the Shannon entropy for a two-state system. Further, by measuring directly the reversibility of slow protocols, we can distinguish unambiguously between protocols that can and cannot reach the expected thermodynamic bounds. PMID:29073017
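
    The bound referred to above can be written down directly: erasing a two-state system from initial probability 1/2 to a final probability p costs at least $k_B T\,[\ln 2 - H(p)]$ of work, where $H(p)=-p\ln p-(1-p)\ln(1-p)$ is the Shannon entropy in nats. The tiny Python sketch below simply evaluates this bound for partial erasure; the temperature and probabilities are illustrative, not the measured values from the experiment.

    ```python
    import numpy as np

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # bath temperature in kelvin (illustrative)

    def shannon_entropy(p):
        """Shannon entropy (in nats) of a two-state system with probabilities (p, 1-p)."""
        terms = [x * np.log(x) for x in (p, 1.0 - p) if x > 0.0]
        return -sum(terms)

    def min_erasure_work(p_final, p_initial=0.5):
        """Landauer-type lower bound on the work to take the state distribution
        from entropy H(p_initial) down to H(p_final)."""
        return k_B * T * (shannon_entropy(p_initial) - shannon_entropy(p_final))

    for p in (0.5, 0.75, 0.9, 1.0):
        print(f"erase to p = {p:.2f}:  W_min = {min_erasure_work(p) / (k_B * T):.3f} k_B T")
    ```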

  7. Gaussian States Minimize the Output Entropy of One-Mode Quantum Gaussian Channels

    NASA Astrophysics Data System (ADS)

    De Palma, Giacomo; Trevisan, Dario; Giovannetti, Vittorio

    2017-04-01

    We prove the long-standing conjecture stating that Gaussian thermal input states minimize the output von Neumann entropy of one-mode phase-covariant quantum Gaussian channels among all the input states with a given entropy. Phase-covariant quantum Gaussian channels model the attenuation and the noise that affect any electromagnetic signal in the quantum regime. Our result is crucial to prove the converse theorems for both the triple trade-off region and the capacity region for broadcast communication of the Gaussian quantum-limited amplifier. Our result extends to the quantum regime the entropy power inequality that plays a key role in classical information theory. Our proof exploits a completely new technique based on the recent determination of the p →q norms of the quantum-limited amplifier [De Palma et al., arXiv:1610.09967]. This technique can be applied to any quantum channel.

  8. Gaussian States Minimize the Output Entropy of One-Mode Quantum Gaussian Channels.

    PubMed

    De Palma, Giacomo; Trevisan, Dario; Giovannetti, Vittorio

    2017-04-21

    We prove the long-standing conjecture stating that Gaussian thermal input states minimize the output von Neumann entropy of one-mode phase-covariant quantum Gaussian channels among all the input states with a given entropy. Phase-covariant quantum Gaussian channels model the attenuation and the noise that affect any electromagnetic signal in the quantum regime. Our result is crucial to prove the converse theorems for both the triple trade-off region and the capacity region for broadcast communication of the Gaussian quantum-limited amplifier. Our result extends to the quantum regime the entropy power inequality that plays a key role in classical information theory. Our proof exploits a completely new technique based on the recent determination of the p→q norms of the quantum-limited amplifier [De Palma et al., arXiv:1610.09967]. This technique can be applied to any quantum channel.

  9. Entropy bounds in terms of the w parameter

    NASA Astrophysics Data System (ADS)

    Abreu, Gabriel; Barceló, Carlos; Visser, Matt

    2011-12-01

    In a pair of recent articles [PRL 105 (2010) 041302; JHEP 1103 (2011) 056] two of the current authors have developed an entropy bound for equilibrium uncollapsed matter using only classical general relativity, basic thermodynamics, and the Unruh effect. An odd feature of that bound was that the proportionality constant, 1/2, was weaker than that expected from black hole thermodynamics, 1/4. In the current article we strengthen the previous results by obtaining a bound involving the (suitably averaged) w parameter. Simple causality arguments restrict this averaged ⟨w⟩ parameter to be ≤ 1. When equality holds, the entropy bound saturates at the value expected based on black hole thermodynamics. We also add some clarifying comments regarding the (net) positivity of the chemical potential. Overall, we find that even in the absence of any black hole region, we can nevertheless get arbitrarily close to the Bekenstein entropy.

  10. The limit behavior of the evolution of the Tsallis entropy in self-gravitating systems

    NASA Astrophysics Data System (ADS)

    Zheng, Yahui; Du, Jiulin; Liang, Faku

    2017-06-01

    In this letter, we study the limit behavior of the evolution of the Tsallis entropy in self-gravitating systems. The study is carried out under two different situations, drawing the same conclusion. Whether in the energy transfer process or in the mass transfer process inside the system, when the nonextensive parameter q is greater than unity, the total entropy is bounded; on the contrary, when this parameter is less than unity, the total entropy is unbounded. There are proofs in both theory and observation that q is always greater than unity. So the Tsallis entropy in self-gravitating systems generally exhibits a bounded property. This indicates the existence of a global maximum of the Tsallis entropy. It is possible for self-gravitating systems to evolve to thermodynamically stable states.

  11. Temperature anisotropy at equilibrium reveals nonlocal entropic contributions to interfacial properties.

    PubMed

    Wilhelmsen, Øivind; Trinh, Thuat T; Lervik, Anders

    2018-01-01

    Density gradient theory for fluids has played a key role in the study of interfacial phenomena for a century. In this work, we revisit its fundamentals by examining the vapor-liquid interface of argon, represented by the cut and shifted Lennard-Jones fluid. The starting point has traditionally been a Helmholtz energy functional using mass densities as arguments. By using rather the internal energy as starting point and including the entropy density as an additional argument, following thereby the phenomenological approach from classical thermodynamics, the extended theory suggests that the configurational part of the temperature has different contributions from the parallel and perpendicular directions at the interface, even at equilibrium. We find a similar anisotropy by examining the configurational temperature in molecular dynamics simulations and obtain a qualitative agreement between theory and simulations. The extended theory shows that the temperature anisotropy originates in nonlocal entropic contributions, which are currently missing from the classical theory. The nonlocal entropic contributions discussed in this work are likely to play a role in the description of both equilibrium and nonequilibrium properties of interfaces. At equilibrium, they influence the temperature- and curvature-dependence of the surface tension. Across the vapor-liquid interface of the Lennard Jones fluid, we find that the maximum in the temperature anisotropy coincides precisely with the maximum in the thermal resistivity relative to the equimolar surface, where the integral of the thermal resistivity gives the Kapitza resistance. This links the temperature anisotropy at equilibrium to the Kapitza resistance of the vapor-liquid interface at nonequilibrium.

  12. Temperature anisotropy at equilibrium reveals nonlocal entropic contributions to interfacial properties

    NASA Astrophysics Data System (ADS)

    Wilhelmsen, Øivind; Trinh, Thuat T.; Lervik, Anders

    2018-01-01

    Density gradient theory for fluids has played a key role in the study of interfacial phenomena for a century. In this work, we revisit its fundamentals by examining the vapor-liquid interface of argon, represented by the cut and shifted Lennard-Jones fluid. The starting point has traditionally been a Helmholtz energy functional using mass densities as arguments. By using rather the internal energy as starting point and including the entropy density as an additional argument, following thereby the phenomenological approach from classical thermodynamics, the extended theory suggests that the configurational part of the temperature has different contributions from the parallel and perpendicular directions at the interface, even at equilibrium. We find a similar anisotropy by examining the configurational temperature in molecular dynamics simulations and obtain a qualitative agreement between theory and simulations. The extended theory shows that the temperature anisotropy originates in nonlocal entropic contributions, which are currently missing from the classical theory. The nonlocal entropic contributions discussed in this work are likely to play a role in the description of both equilibrium and nonequilibrium properties of interfaces. At equilibrium, they influence the temperature- and curvature-dependence of the surface tension. Across the vapor-liquid interface of the Lennard Jones fluid, we find that the maximum in the temperature anisotropy coincides precisely with the maximum in the thermal resistivity relative to the equimolar surface, where the integral of the thermal resistivity gives the Kapitza resistance. This links the temperature anisotropy at equilibrium to the Kapitza resistance of the vapor-liquid interface at nonequilibrium.

  13. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency domain quantities obtained using spectral analysis techniques. The object is to investigate an alternate approach which is entirely different from that of traditional signal processing. This alternate approach is to utilize the Shannon entropy as a tool for the processing of sonar signals with emphasis on detection, classification, and localization leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, and incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique will be based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H_0 (that the received process consists of noise alone) is true and decreases when a correlated signal is present (H_1). Therefore, the strategy used for detection is: (I) Calculate the entropy of the received data; then, (II) compare the entropy with the maximum value; and, finally, (III) make a decision: H_1 is assumed if the difference is large compared to a pre-assigned threshold and H_0 is otherwise assumed. The test statistic is thus the difference between the entropies under H_0 and H_1. Here, we shall show the simulated results for detecting stationary and non-stationary signals in noise, and results on detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
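
    The detection strategy (I)-(III) in the record above can be mocked up in a few lines. For robustness of the toy demo, the hedged Python sketch below computes the Shannon entropy of the normalized periodogram (a spectral-entropy variant, not necessarily the report's own statistic), compares it with the maximum attainable value log2(K), and declares a detection when the deficit exceeds a threshold; the bin count, threshold, and signal parameters are illustrative choices.

    ```python
    import numpy as np

    def spectral_entropy_deficit(x):
        """Steps (I)-(II): Shannon entropy of the normalized periodogram,
        reported as its deficit from the maximum value log2(K)."""
        psd = np.abs(np.fft.rfft(x)) ** 2
        p = psd / psd.sum()
        p = p[p > 0]
        h = -np.sum(p * np.log2(p))
        return np.log2(len(psd)) - h

    def detect(x, threshold=2.0):
        """Step (III): assume H1 (signal present) if the deficit is large."""
        return spectral_entropy_deficit(x) > threshold

    rng = np.random.default_rng(3)
    n = 4096
    noise_only = rng.normal(size=n)                                   # H0
    signal_plus_noise = 2.0 * np.sin(2 * np.pi * 200 * np.arange(n) / n) \
        + rng.normal(size=n)                                          # H1

    for label, x in [("H0", noise_only), ("H1", signal_plus_noise)]:
        d = spectral_entropy_deficit(x)
        print(f"{label}: entropy deficit = {d:.2f} bits, detect = {detect(x)}")
    ```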

  14. Predicting protein β-sheet contacts using a maximum entropy-based correlated mutation measure.

    PubMed

    Burkoff, Nikolas S; Várnai, Csilla; Wild, David L

    2013-03-01

    The problem of ab initio protein folding is one of the most difficult in modern computational biology. The prediction of residue contacts within a protein provides a more tractable immediate step. Recently introduced maximum entropy-based correlated mutation measures (CMMs), such as direct information, have been successful in predicting residue contacts. However, most correlated mutation studies focus on proteins that have large, good-quality multiple sequence alignments (MSAs) because the power of correlated mutation analysis falls as the size of the MSA decreases. Nevertheless, even with small autogenerated MSAs, maximum entropy-based CMMs contain information. To make use of this information, in this article, we focus not on general residue contacts but on contacts between residues in β-sheets. The strong constraints and prior knowledge associated with β-contacts are ideally suited for prediction using a method that incorporates an often noisy CMM. Using contrastive divergence, a statistical machine learning technique, we have calculated a maximum entropy-based CMM. We have integrated this measure with a new probabilistic model for β-contact prediction, which is used to predict both residue- and strand-level contacts. Using our model on a standard non-redundant dataset, we significantly outperform a 2D recurrent neural network architecture, achieving a 5% improvement in true positives at the 5% false-positive rate at the residue level. At the strand level, our approach is competitive with the state-of-the-art single methods, achieving a precision of 61.0% and recall of 55.4%, while not requiring residue solvent accessibility as an input. http://www2.warwick.ac.uk/fac/sci/systemsbiology/research/software/

  15. Human vision is determined based on information theory.

    PubMed

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-03

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.
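
    The following sketch is not the authors' calculation; it only illustrates, using standard blackbody formulas and ignoring atmospheric transmission (which the study includes), that the wavelength of maximum spectral entropy radiance differs from the wavelength of maximum spectral energy radiance for a Sun-like temperature. The per-mode photon entropy (1+n)ln(1+n) - n ln(n) and the Planck radiance are standard; the wavelength grid and temperature are assumptions.

      import numpy as np

      h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23     # Planck constant, speed of light, Boltzmann constant (SI)
      T_sun = 5778.0                                 # approximate solar surface temperature (K)

      lam = np.linspace(200e-9, 1500e-9, 5000)       # wavelength grid, 200-1500 nm
      x = h * c / (lam * k_B * T_sun)
      n = 1.0 / np.expm1(x)                          # mean photon occupation per mode

      B = (2 * h * c**2 / lam**5) * n                # Planck spectral radiance (energy)
      Ls = (2 * c * k_B / lam**4) * ((1 + n) * np.log1p(n) - n * np.log(n))   # spectral entropy radiance

      print("intensity peak: %.0f nm" % (lam[np.argmax(B)] * 1e9))
      print("entropy peak  : %.0f nm" % (lam[np.argmax(Ls)] * 1e9))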

  16. Human vision is determined based on information theory

    NASA Astrophysics Data System (ADS)

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.

  17. Quantifying Extrinsic Noise in Gene Expression Using the Maximum Entropy Framework

    PubMed Central

    Dixit, Purushottam D.

    2013-01-01

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. PMID:23790383
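
    The following toy simulation is not the maximum entropy inference itself; it only illustrates the qualitative claim that a fluctuating extrinsic factor widens the copy-number distribution beyond Poisson. The gamma-distributed rate, its coefficient of variation, and the cell count are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(2)
      n_cells = 100_000
      mean_rate = 10.0                         # mean mRNA copy number

      # Intrinsic noise only: fixed transcription rate, Poisson copy numbers
      intrinsic = rng.poisson(mean_rate, n_cells)

      # Extrinsic noise: the rate itself varies from cell to cell (gamma-distributed)
      cv_ext = 0.5                             # coefficient of variation of the extrinsic factor
      shape = 1.0 / cv_ext**2
      rates = rng.gamma(shape, mean_rate / shape, n_cells)
      extrinsic = rng.poisson(rates)

      for name, x in [("Poisson (intrinsic only)  ", intrinsic),
                      ("Poisson + extrinsic factor", extrinsic)]:
          print(f"{name}: mean = {x.mean():.2f}  Fano factor = {x.var() / x.mean():.2f}")
      # Expected: Fano factor near 1 in the first case, near 1 + cv^2 * mean = 3.5 in the second.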

  18. Quantifying extrinsic noise in gene expression using the maximum entropy framework.

    PubMed

    Dixit, Purushottam D

    2013-06-18

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  19. Human vision is determined based on information theory

    PubMed Central

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-01-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition. PMID:27808236

  20. Classification of pulmonary pathology from breath sounds using the wavelet packet transform and an extreme learning machine.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian; Huliraj, N; Revadi, S S

    2017-06-08

    Auscultation is a medical procedure used for the initial diagnosis and assessment of lung and heart diseases. From this perspective, we propose assessing the performance of the extreme learning machine (ELM) classifiers for the diagnosis of pulmonary pathology using breath sounds. Energy and entropy features were extracted from the breath sound using the wavelet packet transform. The statistical significance of the extracted features was evaluated by one-way analysis of variance (ANOVA). The extracted features were inputted into the ELM classifier. The maximum classification accuracies obtained for the conventional validation (CV) of the energy and entropy features were 97.36% and 98.37%, respectively, whereas the accuracies obtained for the cross validation (CRV) of the energy and entropy features were 96.80% and 97.91%, respectively. In addition, maximum classification accuracies of 98.25% and 99.25% were obtained for the CV and CRV of the ensemble features, respectively. The results indicate that the classification accuracy obtained with the ensemble features was higher than those obtained with the energy and entropy features.
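
    A minimal sketch of the same pipeline on synthetic narrowband signals (not breath sounds) is given below: wavelet packet energy and entropy features, followed by a from-scratch extreme learning machine. It assumes the PyWavelets package for the wavelet packet transform; the wavelet, decomposition level, feature definitions, and network size are illustrative choices rather than the settings used in the study.

      import numpy as np
      import pywt

      def wpt_features(x, wavelet="db4", level=3):
          """Per-node energy and Shannon entropy of the level-3 wavelet packet decomposition."""
          wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
          feats = []
          for node in wp.get_level(level, order="natural"):
              c = np.asarray(node.data)
              energy = float(np.sum(c**2))
              p = c**2 / (energy + 1e-12)
              feats.extend([energy, -float(np.sum(p * np.log(p + 1e-12)))])
          return np.array(feats)

      class ELM:
          """Extreme learning machine: random hidden layer, least-squares output weights."""
          def __init__(self, n_hidden=30, seed=0):
              self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)
          def fit(self, X, y):
              T = np.eye(int(y.max()) + 1)[y]                 # one-hot targets
              self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
              self.b = self.rng.standard_normal(self.n_hidden)
              self.beta = np.linalg.pinv(np.tanh(X @ self.W + self.b)) @ T
              return self
          def predict(self, X):
              return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

      # Synthetic two-class "sounds": narrowband signals at different frequencies plus noise
      rng = np.random.default_rng(3)
      t = np.arange(1024) / 1024.0
      def make_signal(freq):
          return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(t.size)

      X = np.array([wpt_features(make_signal(f)) for f in [30] * 40 + [120] * 40])
      y = np.array([0] * 40 + [1] * 40)
      X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)      # standardize features for the ELM
      elm = ELM().fit(X[::2], y[::2])                          # train on half, test on the other half
      print("held-out accuracy:", (elm.predict(X[1::2]) == y[1::2]).mean())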

  1. Thermodynamics of stoichiometric biochemical networks in living systems far from equilibrium.

    PubMed

    Qian, Hong; Beard, Daniel A

    2005-04-22

    The principles of thermodynamics apply to both equilibrium and nonequilibrium biochemical systems. The mathematical machinery of classical thermodynamics, however, mainly applies to systems in equilibrium. We introduce a thermodynamic formalism for the study of metabolic biochemical reaction (open, nonlinear) networks in both time-dependent and time-independent nonequilibrium states. Classical concepts of equilibrium thermodynamics (enthalpy, entropy, and Gibbs free energy of biochemical reaction systems) are generalized to nonequilibrium settings. Chemical motive force, heat dissipation rate, and entropy production (creation) rate, key concepts in nonequilibrium systems, are introduced. Dynamic equations for the thermodynamic quantities are presented in terms of the key observables of a biochemical network: the stoichiometric matrix Q, the reaction fluxes J, and the chemical potentials of the species μ, without invoking empirical rate laws. Energy conservation and the Second Law are established for steady-state and dynamic biochemical networks. The theory provides the physicochemical basis for analyzing large-scale metabolic networks in living organisms.
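
    The bookkeeping described above can be sketched numerically: given a stoichiometric matrix, steady-state fluxes, and chemical potentials, the reaction Gibbs energies, the heat dissipation rate, and the entropy production rate follow directly. The toy two-reaction chain and all numbers below are made up for illustration.

      import numpy as np

      R_gas, T = 8.314, 298.15              # J mol^-1 K^-1, K

      # Toy network: A -> B (r1), B -> C (r2); rows = species (A, B, C), columns = reactions
      S = np.array([[-1,  0],
                    [ 1, -1],
                    [ 0,  1]], dtype=float)

      J  = np.array([2.0, 2.0])             # steady-state fluxes (mol s^-1), equal along a linear chain
      mu = np.array([-10e3, -12e3, -15e3])  # chemical potentials of A, B, C (J mol^-1)

      dG = S.T @ mu                         # reaction Gibbs energies: Delta_r G = S^T mu (J mol^-1)
      affinity = -dG                        # reaction affinities
      diss = -(J @ dG)                      # free-energy (heat) dissipation rate (W)
      sigma = diss / T                      # entropy production rate (W K^-1)

      print("reaction affinities (kJ/mol):", affinity / 1e3)
      print("heat dissipation rate (W):   ", diss)
      print("entropy production (W/K):    ", round(float(sigma), 2))
      assert sigma >= 0                     # Second Law for this steady state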

  2. Derivation of Hunt equation for suspension distribution using Shannon entropy theory

    NASA Astrophysics Data System (ADS)

    Kundu, Snehasis

    2017-12-01

    In this study, the Hunt equation for computing suspension concentration in sediment-laden flows is derived using Shannon entropy theory. Considering the inverse of the void ratio as a random variable and using the principle of maximum entropy, the probability density function and cumulative distribution function of suspension concentration are derived. A new and more general cumulative distribution function for the flow domain is proposed, which includes several other specific CDF models reported in the literature. This general form of the cumulative distribution function also allows the Rouse equation to be derived. The entropy-based approach helps to estimate the model parameters from measured sediment concentration data, which shows the advantage of using entropy theory. Finally, the parameters of the entropy-based model are also expressed as functions of the Rouse number to establish a link between the parameters of the deterministic and probabilistic approaches.

  3. Entropy of uremia and dialysis technology.

    PubMed

    Ronco, Claudio

    2013-01-01

    The second law of thermodynamics applies with local exceptions to patient history and therapy interventions. Living things preserve their low level of entropy throughout time because they receive energy from their surroundings in the form of food. They gain their order at the expense of disordering the nutrients they consume. Death is the thermodynamically favored state: it represents a large increase in entropy as molecular structure yields to chaos. The kidney is an organ dissipating large amounts of energy to maintain the level of entropy of the organism as low as possible. Diseases, and in particular uremia, represent conditions of rapid increase in entropy. Therapeutic strategies are oriented towards a reduction in entropy or at least a decrease in the speed of entropy increase. Uremia is a process accelerating the trend towards randomness and disorder (increase in entropy). Dialysis is a factor external to the patient that tends to reduce the level of entropy caused by kidney disease. Since entropy can only increase in closed systems, energy and work must be spent to limit the entropy of uremia. This energy should be adapted to the system (patient) and be specifically oriented and personalized. This includes a multidimensional effort to achieve an adequate dialysis that goes beyond small molecular weight solute clearance. It includes a biological plan for recovery of homeostasis and a strategy towards long-term rehabilitation of the patient. Such objectives can be achieved with a combination of technology and innovation to answer specific questions that are still present after 60 years of dialysis history. This change in the individual bioentropy may represent a local exception to natural trends as the patient could be considered an isolated universe responding to the classic laws of thermodynamics. Copyright © 2013 S. Karger AG, Basel.

  4. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    PubMed

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b.
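
    A minimal numerical version of such a two-box MEP calculation is sketched below: absorbed shortwave fluxes and a Budyko-type linearized outgoing longwave radiation are assumed, and the inter-box heat transport is chosen to maximize the entropy production of the transport. The specific flux values and coefficients are illustrative, not those of any particular planet in the paper.

      import numpy as np

      # Absorbed shortwave flux in the warm (tropical) and cold (polar) boxes, W m^-2 (assumed values)
      I_warm, I_cold = 300.0, 160.0
      # Linearized outgoing longwave radiation: OLR = A + B*(T - 273.15), Budyko-type coefficients
      A, B = 203.3, 2.09

      def temperatures(F):
          """Steady-state box temperatures (K) for a given inter-box heat transport F (W m^-2)."""
          T_warm = 273.15 + (I_warm - F - A) / B
          T_cold = 273.15 + (I_cold + F - A) / B
          return T_warm, T_cold

      def entropy_production(F):
          T_warm, T_cold = temperatures(F)
          return F * (1.0 / T_cold - 1.0 / T_warm)   # W m^-2 K^-1

      F_grid = np.linspace(0.0, 60.0, 6001)
      sigma = np.array([entropy_production(F) for F in F_grid])
      F_mep = F_grid[np.argmax(sigma)]
      Tw, Tc = temperatures(F_mep)
      print(f"MEP transport: {F_mep:.1f} W m^-2, T_warm = {Tw:.1f} K, T_cold = {Tc:.1f} K")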

  5. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies

    PubMed Central

    Lorenz, Ralph D.

    2010-01-01

    The ‘two-box model’ of planetary climate is discussed. This model has been used to demonstrate consistency of the equator–pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b. PMID:20368253

  6. Optimal behavior of viscoelastic flow at resonant frequencies.

    PubMed

    Lambert, A A; Ibáñez, G; Cuevas, S; del Río, J A

    2004-11-01

    The global entropy generation rate in the zero-mean oscillatory flow of a Maxwell fluid in a pipe is analyzed with the aim of determining its behavior at resonant flow conditions. This quantity is calculated explicitly using the analytic expression for the velocity field and assuming isothermal conditions. The global entropy generation rate shows well-defined peaks at the resonant frequencies where the flow displays maximum velocities. It was found that resonant frequencies can be considered optimal in the sense that they maximize the power transmitted to the pulsating flow at the expense of maximum dissipation.

  7. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.

  8. Estimation of conformational entropy in protein-ligand interactions: a computational perspective.

    PubMed

    Polyansky, Anton A; Zubac, Ruben; Zagrovic, Bojan

    2012-01-01

    Conformational entropy is an important component of the change in free energy upon binding of a ligand to its target protein. As a consequence, development of computational techniques for reliable estimation of conformational entropies is currently receiving an increased level of attention in the context of computational drug design. Here, we review the most commonly used techniques for conformational entropy estimation from classical molecular dynamics simulations. Although by and large still not directly used in practical drug design, these techniques provide a gold standard for developing other, computationally less demanding methods for such applications, in addition to furthering our understanding of protein-ligand interactions in general. In particular, we focus on the quasi-harmonic approximation and discuss different approaches that can be used to go beyond it, most notably when it comes to treating anharmonic and/or correlated motions. In addition to reviewing basic theoretical formalisms, we provide a concrete set of steps required to successfully calculate conformational entropy from molecular dynamics simulations, as well as discuss a number of practical issues that may arise in such calculations.
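
    As a sketch of the quasi-harmonic approximation discussed in the review, the snippet below diagonalizes a mass-weighted coordinate covariance, converts its eigenvalues to quasi-harmonic frequencies, and sums quantum harmonic-oscillator entropies. The "trajectory" is random stand-in data; a real calculation would use MD coordinates with overall translation and rotation removed.

      import numpy as np

      kB, hbar, T = 1.380649e-23, 1.0545718e-34, 300.0          # SI units, T in kelvin

      def quasiharmonic_entropy(coords, masses):
          """Quasi-harmonic entropy (J/K) from Cartesian coordinates of shape (n_frames, 3*n_atoms)."""
          m = np.repeat(masses, 3)                              # one mass per Cartesian coordinate
          C = np.cov(coords, rowvar=False)                      # coordinate covariance (m^2)
          Cw = np.sqrt(np.outer(m, m)) * C                      # mass-weighted covariance (kg m^2)
          lam = np.linalg.eigvalsh(Cw)
          lam = lam[lam > 1e-60]                                # discard numerically zero modes
          omega = np.sqrt(kB * T / lam)                         # quasi-harmonic frequencies (rad/s)
          x = hbar * omega / (kB * T)
          s = x / np.expm1(x) - np.log1p(-np.exp(-x))           # harmonic-oscillator entropy per mode (k_B units)
          return kB * np.sum(s)

      # Stand-in "trajectory": 5 atoms with ~0.5 angstrom fluctuations (not a real MD simulation)
      rng = np.random.default_rng(4)
      masses = np.full(5, 12 * 1.6605e-27)                      # carbon-like masses (kg)
      coords = 5e-11 * rng.standard_normal((2000, 15))
      print("quasi-harmonic entropy: %.3e J/K" % quasiharmonic_entropy(coords, masses))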

  9. Out-of-equilibrium protocol for Rényi entropies via the Jarzynski equality.

    PubMed

    Alba, Vincenzo

    2017-06-01

    In recent years entanglement measures, such as the von Neumann and the Rényi entropies, provided a unique opportunity to access elusive features of quantum many-body systems. However, extracting entanglement properties analytically, experimentally, or in numerical simulations can be a formidable task. Here, by combining the replica trick and the Jarzynski equality we devise an alternative effective out-of-equilibrium protocol for measuring the equilibrium Rényi entropies. The key idea is to perform a quench in the geometry of the replicas. The Rényi entropies are obtained as the exponential average of the work performed during the quench. We illustrate an application of the method in classical Monte Carlo simulations, although it could be useful in different contexts, such as in quantum Monte Carlo, or experimentally in cold-atom systems. The method is most effective in the quasistatic regime, i.e., for a slow quench. As a benchmark, we compute the Rényi entropies in the Ising universality class in 1+1 dimensions. We find perfect agreement with the well-known conformal field theory predictions.

  10. BGK-MD, Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haack, Jeffrey; Shohet, Gil

    2016-12-02

    The software implements a heterogeneous multiscale method (HMM), which involves solving a classical molecular dynamics (MD) problem and then computing the entropy production in order to obtain the relaxation times towards equilibrium for use in a Bhatnagar-Gross-Krook (BGK) solver.

  11. Dipole Relaxation in an Electric Field.

    ERIC Educational Resources Information Center

    Neumann, Richard M.

    1980-01-01

    Derives an expression for the orientational entropy of a rigid rod (electric dipole) from Boltzmann's equation. Subsequent application of Newton's second law of motion produces Debye's classical expression for the relaxation of an electric dipole in a viscous medium. (Author/GS)

  12. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

    Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem.. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.

  13. Energy transports by ocean and atmosphere based on an entropy extremum principle. I - Zonal averaged transports

    NASA Technical Reports Server (NTRS)

    Sohn, Byung-Ju; Smith, Eric A.

    1993-01-01

    The maximum entropy production principle suggested by Paltridge (1975) is applied to separating the satellite-determined required total transports into atmospheric and oceanic components. Instead of using the excessively restrictive equal energy dissipation hypothesis as a deterministic tool for separating transports between the atmosphere and ocean fluids, the satellite-inferred required 2D energy transports are imposed on Paltridge's energy balance model, which is then solved as a variational problem using the equal energy dissipation hypothesis only to provide an initial guess field. It is suggested that Southern Ocean transports are weaker than previously reported. It is argued that a maximum entropy production principle can serve as a governing rule on macroscale global climate, and, in conjunction with conventional satellite measurements of the net radiation balance, provides a means to decompose atmosphere and ocean transports from the total transport field.

  14. Direct measurement of the electrocaloric effect in poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) terpolymer films

    NASA Astrophysics Data System (ADS)

    Basso, Vittorio; Russo, Florence; Gerard, Jean-François; Pruvost, Sébastien

    2013-11-01

    We investigated the entropy change in poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) (P(VDF-TrFE-CTFE)) films in the temperature range between -5 °C and 60 °C by direct heat flux calorimetry using Peltier cell heat flux sensors. At an electric field E = 50 MV m⁻¹ the isothermal entropy change attains a maximum of |Δs| = 4.2 J kg⁻¹ K⁻¹ at 31 °C, with an adiabatic temperature change ΔT_ad = 1.1 K. At temperatures below the maximum, in the range from 25 °C to -5 °C, the entropy change |Δs| decreases rapidly and the unipolar P vs E relationship becomes hysteretic. This phenomenon is interpreted as the fluctuations of the polar segments of the polymer chain, which are responsible for the electrocaloric effect (ECE) in the polymer, becoming progressively frozen below the relaxor transition.

  15. Maximum Renyi entropy principle for systems with power-law Hamiltonians.

    PubMed

    Bashkirov, A G

    2004-09-24

    The Renyi distribution ensuring the maximum of Renyi entropy is investigated for a particular case of a power-law Hamiltonian. Both Lagrange parameters alpha and beta can be eliminated. It is found that beta does not depend on a Renyi parameter q and can be expressed in terms of an exponent kappa of the power-law Hamiltonian and an average energy U. The Renyi entropy for the resulting Renyi distribution reaches its maximal value at q=1/(1+kappa) that can be considered as the most probable value of q when we have no additional information on the behavior of the stochastic process. The Renyi distribution for such q becomes a power-law distribution with the exponent -(kappa+1). When q=1/(1+kappa)+epsilon (0

  16. Maximum one-shot dissipated work from Rényi divergences

    NASA Astrophysics Data System (ADS)

    Yunger Halpern, Nicole; Garner, Andrew J. P.; Dahlsten, Oscar C. O.; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
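
    A tiny numerical illustration of the central quantity, the order-infinity Rényi divergence between two discrete distributions, is given below; the example distributions are arbitrary, and in the paper this divergence (in units of kT) bounds the maximum possible dissipated work.

      import numpy as np

      def renyi_divergence(p, q, alpha):
          """Renyi divergence D_alpha(p || q) in nats for discrete distributions."""
          p, q = np.asarray(p, float), np.asarray(q, float)
          if np.isinf(alpha):
              return np.log(np.max(p[p > 0] / q[p > 0]))        # order-infinity limit
          return np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

      # Example: an "actual" distribution over outcomes vs. a uniform reference
      p = np.array([0.5, 0.3, 0.15, 0.05])
      q = np.array([0.25, 0.25, 0.25, 0.25])
      for a in [1.0001, 2, 10, np.inf]:
          print(f"D_{a}(p||q) = {renyi_divergence(p, q, a):.4f} nats")
      # D_alpha is non-decreasing in alpha and approaches the order-infinity value log(max_i p_i/q_i).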

  17. Maximum one-shot dissipated work from Rényi divergences.

    PubMed

    Yunger Halpern, Nicole; Garner, Andrew J P; Dahlsten, Oscar C O; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.

  18. A graphic approach to include dissipative-like effects in reversible thermal cycles

    NASA Astrophysics Data System (ADS)

    Gonzalez-Ayala, Julian; Arias-Hernandez, Luis Antonio; Angulo-Brown, Fernando

    2017-05-01

    Since the 1980s, a connection between a family of maximum-work reversible thermal cycles and maximum-power finite-time endoreversible cycles has been established. The endoreversible cycles produce entropy at their couplings with the external heat baths. Thus, this kind of cycle can be optimized under criteria of merit that involve entropy production terms. While the relation between the concepts of work and power is quite direct, the finite-time objective functions involving entropy production apparently have no reversible counterparts. In the present paper we show that it is also possible to establish a connection between irreversible cycle models and reversible ones by means of the concept of "geometric dissipation", which relates a deficit of area between some reversible cycles and the Carnot cycle to the actual dissipative terms in a Curzon-Ahlborn engine.

  19. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  20. Relationship of The Tropical Cyclogenesis With Solar and Magnetospheric Activities

    NASA Astrophysics Data System (ADS)

    Vishnevsky, O. V.; Pankov, V. M.; Erokhine, N. S.

    The formation of tropical cyclones is a poorly understood stage of their life cycle, even though many papers have analyzed the influence of different parameters on cyclone occurrence frequency (see, e.g., Gray W.M.). The present paper studies the correlation of solar and magnetospheric activity with the appearance of tropical cyclones in the north-west region of the Pacific Ocean. The correlation study was performed using both classical statistical methods (including the maximum entropy method) and more modern ones, for example multifractal analysis. Information about Wolf numbers and cyclogenesis intensity for the period 1944-2000 was obtained from several Internet databases. It was shown that the power spectra of the Wolf numbers and of the tropical cyclone counts both have maxima at the 11-year period; solar activity and the intensity of cyclogenesis are in antiphase; and the maximum of the mutual correlation coefficient (~0.8) between Wolf numbers and cyclogenesis intensity occurs in the South China Sea. The multifractal characteristics calculated for both time series are related to the mutual correlation function, which is another indicator of the correlation between tropical cyclogenesis and solar-magnetospheric activity. Thus, there is a correlation between solar-magnetospheric activity and tropical cyclone intensity in this region. Possible physical mechanisms of this correlation, including anomalous precipitation of charged particles from the Earth's radiation belts and amplification of wind intensity in the troposphere, are discussed.

  1. Relationship of The Tropical Cyclogenesis With Solar and Magnetospheric Activities

    NASA Astrophysics Data System (ADS)

    Vishnevsky, O.; Pankov, V.; Erokhine, N.

    The formation of tropical cyclones is a poorly understood stage of their life cycle, even though many papers have analyzed the influence of different parameters on cyclone occurrence frequency (see, e.g., Gray W.M.). The present paper studies the correlation of solar and magnetospheric activity with the appearance of tropical cyclones in the north-west region of the Pacific Ocean. The correlation study was performed using both classical statistical methods (including the maximum entropy method) and more modern ones, for example multifractal analysis. Information about Wolf numbers and cyclogenesis intensity for the period 1944-2000 was obtained from several Internet databases. It was shown that the power spectra of the Wolf numbers and of the tropical cyclone counts both have maxima at the 11-year period; solar activity and the intensity of cyclogenesis are in antiphase; and the maximum of the mutual correlation coefficient (~0.8) between Wolf numbers and cyclogenesis intensity occurs in the South China Sea. The multifractal characteristics calculated for both time series are related to the mutual correlation function, which is another indicator of the correlation between tropical cyclogenesis and solar-magnetospheric activity. Thus, there is a correlation between solar-magnetospheric activity and tropical cyclone intensity in this region. Possible physical mechanisms of this correlation, including anomalous precipitation of charged particles from the Earth's radiation belts and amplification of wind intensity in the troposphere, are discussed.

  2. Steepest entropy ascent for two-state systems with slowly varying Hamiltonians

    NASA Astrophysics Data System (ADS)

    Militello, Benedetto

    2018-05-01

    The steepest entropy ascent approach is considered and applied to two-state systems. When the Hamiltonian of the system is time-dependent, the principle of maximum entropy production can still be exploited; arguments to support this fact are given. In the limit of slowly varying Hamiltonians, which allows for the adiabatic approximation for the unitary part of the dynamics, the system exhibits significant robustness to the thermalization process. Specific examples such as a spin in a rotating field and a generic two-state system undergoing an avoided crossing are considered.

  3. Entropy Inequality Violations from Ultraspinning Black Holes.

    PubMed

    Hennigar, Robie A; Mann, Robert B; Kubizňák, David

    2015-07-17

    We construct a new class of rotating anti-de Sitter (AdS) black hole solutions with noncompact event horizons of finite area in any dimension and study their thermodynamics. In four dimensions these black holes are solutions to gauged supergravity. We find that their entropy exceeds the maximum implied from the conjectured reverse isoperimetric inequality, which states that for a given thermodynamic volume, the black hole entropy is maximized for Schwarzschild-AdS space. We use this result to suggest more stringent conditions under which this conjecture may hold.

  4. Quantum-like model of brain's functioning: decision making from decoherence.

    PubMed

    Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Basieva, Irina; Khrennikov, Andrei

    2011-07-21

    We present a quantum-like model of decision making in games of the Prisoner's Dilemma type. By this model the brain processes information by using representation of mental states in a complex Hilbert space. Driven by the master equation the mental state of a player, say Alice, approaches an equilibrium point in the space of density matrices (representing mental states). This equilibrium state determines Alice's mixed (i.e., probabilistic) strategy. We use a master equation in which quantum physics describes the process of decoherence as the result of interaction with environment. Thus our model is a model of thinking through decoherence of the initially pure mental state. Decoherence is induced by the interaction with memory and the external mental environment. We study (numerically) the dynamics of quantum entropy of Alice's mental state in the process of decision making. We also consider classical entropy corresponding to Alice's choices. We introduce a measure of Alice's diffidence as the difference between classical and quantum entropies of Alice's mental state. We see that (at least in our model example) diffidence decreases (approaching zero) in the process of decision making. Finally, we discuss the problem of neuronal realization of quantum-like dynamics in the brain; especially roles played by lateral prefrontal cortex or/and orbitofrontal cortex. Copyright © 2011 Elsevier Ltd. All rights reserved.
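
    The "diffidence" measure described above can be sketched in a few lines: the Shannon entropy of the diagonal of the density matrix (the choice probabilities) minus its von Neumann entropy. The 2x2 density matrix below is an arbitrary example, not a state from the authors' dynamical model.

      import numpy as np

      def shannon(p):
          p = np.asarray(p, float)
          p = p[p > 1e-12]
          return -np.sum(p * np.log(p))

      def von_neumann(rho):
          return shannon(np.linalg.eigvalsh(rho))

      def diffidence(rho):
          """Classical (choice) entropy minus quantum (von Neumann) entropy, in nats."""
          return shannon(np.real(np.diag(rho))) - von_neumann(rho)

      # A partially coherent 2x2 "mental state" (arbitrary example: trace 1, positive semidefinite)
      rho = np.array([[0.6, 0.3],
                      [0.3, 0.4]])
      print("classical entropy:", round(shannon(np.diag(rho)), 3))
      print("quantum entropy  :", round(von_neumann(rho), 3))
      print("diffidence       :", round(diffidence(rho), 3))   # non-negative; vanishes as coherences decay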

  5. Demonstration and resolution of the Gibbs paradox of the first kind

    NASA Astrophysics Data System (ADS)

    Peters, Hjalmar

    2014-01-01

    The Gibbs paradox of the first kind (GP1) refers to the false increase in entropy which, in statistical mechanics, is calculated from the process of combining two gas systems S1 and S2 consisting of distinguishable particles. Presented in a somewhat modified form, the GP1 manifests as a contradiction to the second law of thermodynamics. Contrary to popular belief, this contradiction affects not only classical but also quantum statistical mechanics. This paper resolves the GP1 by considering two effects. (i) The uncertainty about which particles are located in S1 and which in S2 contributes to the entropies of S1 and S2. (ii) S1 and S2 are correlated by the fact that if a certain particle is located in one system, it cannot be located in the other. As a consequence, the entropy of the total system consisting of S1 and S2 is not the sum of the entropies of S1 and S2.

  6. Toward the Application of the Maximum Entropy Production Principle to a Broader Range of Far From Equilibrium Dissipative Systems

    NASA Astrophysics Data System (ADS)

    Lineweaver, C. H.

    2005-12-01

    The principle of Maximum Entropy Production (MEP) is being usefully applied to a wide range of non-equilibrium processes including flows in planetary atmospheres and the bioenergetics of photosynthesis. Our goal of applying the principle of maximum entropy production to an even wider range of Far From Equilibrium Dissipative Systems (FFEDS) depends on the reproducibility of the evolution of the system from macro-state A to macro-state B. In an attempt to apply the principle of MEP to astronomical and cosmological structures, we investigate the problematic relationship between gravity and entropy. In the context of open and non-equilibrium systems, we use a generalization of the Gibbs free energy to include the sources of free energy extracted by non-living FFEDS such as hurricanes and convection cells. Redox potential gradients and thermal and pressure gradients provide the free energy for a broad range of FFEDS, both living and non-living. However, these gradients have to be within certain ranges. If the gradients are too weak, FFEDS do not appear. If the gradients are too strong FFEDS disappear. Living and non-living FFEDS often have different source gradients (redox potential gradients vs thermal and pressure gradients) and when they share the same gradient, they exploit different ranges of the gradient. In a preliminary attempt to distinguish living from non-living FFEDS, we investigate the parameter space of: type of gradient and steepness of gradient.

  7. Learning Probabilities From Random Observables in High Dimensions: The Maximum Entropy Distribution and Others

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi

    2015-11-01

    We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).

  8. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in the kernel principal components analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by means of compacting the information in very few features (often in just one or two). The proposed method produces features which have higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigen decomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both the methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both the methods are illustrated in different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, the most successful rule for estimating the kernel parameter is based on maximum likelihood, and OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
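
    A minimal sketch of the kernel entropy decomposition underlying KECA is given below: with an RBF kernel, the Rényi quadratic entropy estimate decomposes over kernel eigenpairs as λ_i(e_iᵀ1)², and KECA keeps the components with the largest contributions instead of the largest eigenvalues. The OKECA rotation and the kernel-parameter selection rules studied in the brief are not reproduced; the data and kernel width are toy choices.

      import numpy as np

      def rbf_kernel(X, sigma):
          d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
          return np.exp(-d2 / (2 * sigma**2))

      def keca(X, n_components=2, sigma=1.0):
          """Kernel entropy component analysis: rank eigenpairs by entropy contribution."""
          K = rbf_kernel(X, sigma)
          lam, E = np.linalg.eigh(K)                     # eigenvalues in ascending order
          ones = np.ones(len(X))
          contrib = lam * (E.T @ ones) ** 2              # entropy contribution of each eigenpair
          order = np.argsort(contrib)[::-1][:n_components]
          # Projections onto the selected kernel eigen-directions
          return E[:, order] * np.sqrt(np.abs(lam[order])), contrib[order]

      rng = np.random.default_rng(5)
      X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
      Z, contrib = keca(X, n_components=2, sigma=1.0)
      print("top entropy contributions:", np.round(contrib, 2))
      print("projected data shape:", Z.shape)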

  9. Statistical theory on the analytical form of cloud particle size distributions

    NASA Astrophysics Data System (ADS)

    Wu, Wei; McFarquhar, Greg

    2017-11-01

    Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from a system-level perspective is examined here. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of the concept of relative entropy to avoid this problem is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions of power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g. bulk water mass). DOE ASR.

  10. Analysis of rapid eye movement periodicity in narcoleptics based on maximum entropy method.

    PubMed

    Honma, H; Ohtomo, N; Kohsaka, M; Fukuda, N; Kobayashi, R; Sakakibara, S; Nakamura, F; Koyama, T

    1999-04-01

    We examined REM sleep periodicity in typical narcoleptics and patients who had shown signs of a narcoleptic tetrad without HLA-DRB1*1501/DQB1*0602 or DR2 antigens, using spectral analysis based on the maximum entropy method. The REM sleep period of typical narcoleptics showed two peaks, one at 70-90 min and one at 110-130 min at night, and a single peak at around 70-90 min during the daytime. The nocturnal REM sleep period of typical narcoleptics may be composed of several different periods, one of which corresponds to that of their daytime REM sleep.
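
    The maximum entropy method used for this kind of spectral analysis is, in practice, Burg's autoregressive estimator; the sketch below applies a compact implementation to a synthetic series with a 90-minute cycle and reads the dominant period off the MEM power spectrum. The AR order, sampling interval, and synthetic data are illustrative assumptions, not the clinical analysis.

      import numpy as np

      def burg_ar(x, order):
          """Burg (maximum entropy method) AR coefficients and prediction-error power."""
          x = np.asarray(x, dtype=float)
          a = np.array([1.0])
          E = np.dot(x, x) / x.size
          f, b = x.copy(), x.copy()
          for m in range(order):
              fm, bm = f[m + 1:], b[m:-1]
              k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
              a = np.concatenate([a, [0.0]])
              a = a + k * a[::-1]
              fn, bn = fm + k * bm, bm + k * fm      # update forward/backward prediction errors
              f[m + 1:], b[m + 1:] = fn, bn
              E *= 1.0 - k * k
          return a, E

      def mem_spectrum(x, order, dt, n_freq=2048):
          a, E = burg_ar(x, order)
          freqs = np.linspace(0.0, 0.5 / dt, n_freq)
          z = np.exp(-2j * np.pi * freqs[:, None] * dt * np.arange(len(a)))
          return freqs, E * dt / np.abs(z @ a) ** 2

      # Synthetic "REM onset" series sampled every minute: a 90-minute cycle plus noise
      rng = np.random.default_rng(6)
      dt = 1.0                                        # minutes
      t = np.arange(600) * dt
      x = np.sin(2 * np.pi * t / 90.0) + 0.5 * rng.standard_normal(t.size)
      freqs, P = mem_spectrum(x - x.mean(), order=12, dt=dt)
      peak = freqs[np.argmax(P[1:]) + 1]              # skip the zero-frequency bin
      print("dominant period: %.1f minutes" % (1.0 / peak))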

  11. A homotopy algorithm for synthesizing robust controllers for flexible structures via the maximum entropy design equations

    NASA Technical Reports Server (NTRS)

    Collins, Emmanuel G., Jr.; Richter, Stephen

    1990-01-01

    One well known deficiency of LQG compensators is that they do not guarantee any measure of robustness. This deficiency is especially highlighted when considering control design for complex systems such as flexible structures. There has thus been a need to generalize LQG theory to incorporate robustness constraints. Here we describe the maximum entropy approach to robust control design for flexible structures, a generalization of LQG theory, pioneered by Hyland, which has proved useful in practice. The design equations consist of a set of coupled Riccati and Lyapunov equations. A homotopy algorithm that is used to solve these design equations is presented.

  12. 16QAM Blind Equalization via Maximum Entropy Density Approximation Technique and Nonlinear Lagrange Multipliers

    PubMed Central

    Mauda, R.; Pinchas, M.

    2014-01-01

    Recently a new blind equalization method was proposed for the 16QAM constellation input inspired by the maximum entropy density approximation technique with improved equalization performance compared to the maximum entropy approach, Godard's algorithm, and others. In addition, an approximated expression for the minimum mean square error (MSE) was obtained. The idea was to find those Lagrange multipliers that bring the approximated MSE to minimum. Since the derivation of the obtained MSE with respect to the Lagrange multipliers leads to a nonlinear equation for the Lagrange multipliers, the part in the MSE expression that caused the nonlinearity in the equation for the Lagrange multipliers was ignored. Thus, the obtained Lagrange multipliers were not those Lagrange multipliers that bring the approximated MSE to minimum. In this paper, we derive a new set of Lagrange multipliers based on the nonlinear expression for the Lagrange multipliers obtained from minimizing the approximated MSE with respect to the Lagrange multipliers. Simulation results indicate that for the high signal to noise ratio (SNR) case, a faster convergence rate is obtained for a channel causing a high initial intersymbol interference (ISI) while the same equalization performance is obtained for an easy channel (initial ISI low). PMID:24723813

  13. Stimulus-dependent Maximum Entropy Models of Neural Population Codes

    PubMed Central

    Segev, Ronen; Schneidman, Elad

    2013-01-01

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population. PMID:23516339
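
    A sketch of the static pairwise maximum entropy model that the SDME model extends is given below: for a small population the model can be fitted by exact enumeration, adjusting fields and couplings until the model's mean rates and pairwise correlations match the data. The synthetic spike words, learning rate, and iteration count are made up, and the stimulus-dependent part of the SDME model is not included.

      import numpy as np
      from itertools import product

      def fit_pairwise_maxent(words, n_iter=5000, lr=0.1):
          """Fit fields h and couplings J of P(s) ~ exp(h.s + s.J.s/2) to binary words by moment matching."""
          n = words.shape[1]
          states = np.array(list(product([0.0, 1.0], repeat=n)))      # all 2^n binary words
          emp_mean = words.mean(axis=0)
          emp_corr = words.T @ words / len(words)
          h, J = np.zeros(n), np.zeros((n, n))
          for _ in range(n_iter):
              logp = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
              p = np.exp(logp - logp.max())
              p /= p.sum()
              mod_mean = p @ states
              mod_corr = states.T @ (states * p[:, None])
              h += lr * (emp_mean - mod_mean)                          # ascent on the log-likelihood
              dJ = emp_corr - mod_corr
              np.fill_diagonal(dJ, 0.0)                                # the diagonal is absorbed by h
              J += lr * dJ
          return h, J, states, p

      # Synthetic spike words for 5 neurons driven partly by a shared input (made-up generative model)
      rng = np.random.default_rng(7)
      shared = rng.random((5000, 1)) < 0.2
      words = ((rng.random((5000, 5)) < 0.1) | (shared & (rng.random((5000, 5)) < 0.5))).astype(float)

      h, J, states, p = fit_pairwise_maxent(words)
      print("data mean rates :", np.round(words.mean(axis=0), 3))
      print("model mean rates:", np.round(p @ states, 3))
      print("data pair corr (0,1) :", round(float((words[:, 0] * words[:, 1]).mean()), 3))
      print("model pair corr (0,1):", round(float((states[:, 0] * states[:, 1]) @ p), 3))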

  14. Entropy in self-similar shock profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolin, Len G.; Reisner, Jon Michael; Jordan, Pedro M.

    In this paper, we study the structure of a gaseous shock, and in particular the distribution of entropy within, in both a thermodynamics and a statistical mechanics context. The problem of shock structure has a long and distinguished history that we review. We employ the Navier–Stokes equations to construct a self-similar version of Becker's solution for a shock assuming a particular (physically plausible) Prandtl number; that solution reproduces the well-known result of Morduchow & Libby that features a maximum of the equilibrium entropy inside the shock profile. We then construct an entropy profile, based on gas kinetic theory, that is smooth and monotonically increasing. The extension of equilibrium thermodynamics to irreversible processes is based in part on the assumption of local thermodynamic equilibrium. We show that this assumption is not valid except for the weakest shocks. Finally, we conclude by hypothesizing a thermodynamic nonequilibrium entropy and demonstrating that it closely estimates the gas kinetic nonequilibrium entropy within a shock.

  15. Entropy in self-similar shock profiles

    DOE PAGES

    Margolin, Len G.; Reisner, Jon Michael; Jordan, Pedro M.

    2017-07-16

    In this paper, we study the structure of a gaseous shock, and in particular the distribution of entropy within, in both a thermodynamics and a statistical mechanics context. The problem of shock structure has a long and distinguished history that we review. We employ the Navier–Stokes equations to construct a self-similar version of Becker's solution for a shock assuming a particular (physically plausible) Prandtl number; that solution reproduces the well-known result of Morduchow & Libby that features a maximum of the equilibrium entropy inside the shock profile. We then construct an entropy profile, based on gas kinetic theory, that is smooth and monotonically increasing. The extension of equilibrium thermodynamics to irreversible processes is based in part on the assumption of local thermodynamic equilibrium. We show that this assumption is not valid except for the weakest shocks. Finally, we conclude by hypothesizing a thermodynamic nonequilibrium entropy and demonstrating that it closely estimates the gas kinetic nonequilibrium entropy within a shock.

  16. Maximum Entropy Approach in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    PubMed

    Farsani, Zahra Amini; Schmid, Volker J

    2017-01-01

    In the estimation of physiological kinetic parameters from Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) data, the determination of the arterial input function (AIF) plays a key role. This paper proposes a Bayesian method to estimate the physiological parameters of DCE-MRI along with the AIF in situations where no measurement of the AIF is available. In the proposed algorithm, the maximum entropy method (MEM) is combined with the maximum a posteriori (MAP) approach. To this end, MEM is used to specify a prior probability distribution for the unknown AIF. The ability of this method to estimate the AIF is validated using the Kullback-Leibler divergence. Subsequently, the kinetic parameters can be estimated with MAP. The proposed algorithm is evaluated with a data set from a breast cancer MRI study. The application shows that the AIF can reliably be determined from the DCE-MRI data using MEM, and the kinetic parameters can be estimated subsequently. The maximum entropy method is a powerful tool for reconstructing images from many types of data and for generating a probability distribution based on the given information. The proposed method gives an alternative way to assess the input function from the existing data; it allows a good fit of the data and therefore a better estimation of the kinetic parameters, which in the end allows a more reliable use of DCE-MRI. Schattauer GmbH.

  17. Application of a multiscale maximum entropy image restoration algorithm to HXMT observations

    NASA Astrophysics Data System (ADS)

    Guan, Ju; Song, Li-Ming; Huo, Zhuo-Xi

    2016-08-01

    This paper introduces a multiscale maximum entropy (MSME) algorithm for image restoration of the Hard X-ray Modulation Telescope (HXMT), which is a collimated scan X-ray satellite mainly devoted to a sensitive all-sky survey and pointed observations in the 1-250 keV range. The novelty of the MSME method is to use wavelet decomposition and multiresolution support to control noise amplification at different scales. Our work is focused on the application and modification of this method to restore diffuse sources detected by HXMT scanning observations. An improved method, the ensemble multiscale maximum entropy (EMSME) algorithm, is proposed to alleviate the problem of mode mixing exiting in MSME. Simulations have been performed on the detection of the diffuse source Cen A by HXMT in all-sky survey mode. The results show that the MSME method is adapted to the deconvolution task of HXMT for diffuse source detection and the improved method could suppress noise and improve the correlation and signal-to-noise ratio, thus proving itself a better algorithm for image restoration. Through one all-sky survey, HXMT could reach a capacity of detecting a diffuse source with maximum differential flux of 0.5 mCrab. Supported by Strategic Priority Research Program on Space Science, Chinese Academy of Sciences (XDA04010300) and National Natural Science Foundation of China (11403014)

  18. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    PubMed

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been forwarded in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models. Each time, we calculate the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state shows always an increased entropy production over the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  19. Reversibility and stability of information processing systems

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1984-01-01

    Classical and quantum models of dynamically reversible computers are considered. Instabilities in the evolution of the classical 'billiard ball computer' are analyzed and shown to result in a one-bit increase of entropy per step of computation. 'Quantum spin computers', on the other hand, are not only microscopically but also operationally reversible. Readout of the output of quantum computation is shown not to interfere with this reversibility. Dissipation, while avoidable in principle, can be used in practice along with redundancy to prevent errors.

  20. Thermodynamic laws in isolated systems.

    PubMed

    Hilbert, Stefan; Hänggi, Peter; Dunkel, Jörn

    2014-12-01

    The recent experimental realization of exotic matter states in isolated quantum systems and the ensuing controversy about the existence of negative absolute temperatures demand a careful analysis of the conceptual foundations underlying microcanonical thermostatistics. Here we provide a detailed comparison of the most commonly considered microcanonical entropy definitions, focusing specifically on whether they satisfy or violate the zeroth, first, and second laws of thermodynamics. Our analysis shows that, for a broad class of systems that includes all standard classical Hamiltonian systems, only the Gibbs volume entropy fulfills all three laws simultaneously. To avoid ambiguities, the discussion is restricted to exact results and analytically tractable examples.
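
    A toy comparison of the two kinds of microcanonical entropy definitions for N independent two-level systems is sketched below: the Boltzmann (surface) entropy yields negative temperatures above half filling, while the Gibbs (volume) entropy stays monotone and keeps the temperature positive. Finite-difference temperatures and the system size are arbitrary choices for illustration.

      import numpy as np
      from scipy.special import gammaln

      N = 1000                                       # number of independent two-level systems
      n_up = np.arange(N + 1).astype(float)          # excited units; energy E = n_up (level spacing = 1, k_B = 1)

      log_omega = gammaln(N + 1) - gammaln(n_up + 1) - gammaln(N - n_up + 1)   # ln(density of states)
      S_surface = log_omega                                                     # Boltzmann (surface) entropy
      S_volume  = np.logaddexp.accumulate(log_omega)                            # Gibbs (volume) entropy

      def temperature(S, E):
          beta = np.gradient(S, E)
          with np.errstate(divide="ignore"):
              return 1.0 / beta

      T_B, T_G = temperature(S_surface, n_up), temperature(S_volume, n_up)
      for frac in (0.25, 0.40, 0.60):
          i = int(frac * N)
          print(f"E/N = {frac:.2f}:  T_Boltzmann = {T_B[i]:10.3g}   T_Gibbs = {T_G[i]:10.3g}")
      # Below half filling the two definitions nearly agree; above it the Boltzmann temperature
      # turns negative while the Gibbs temperature stays positive (and grows very large).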

  1. New constraints for holographic entropy from maximin: A no-go theorem

    NASA Astrophysics Data System (ADS)

    Rota, Massimiliano; Weinberg, Sean J.

    2018-04-01

    The Ryu-Takayanagi (RT) formula for static spacetimes arising in the AdS/CFT correspondence satisfies inequalities that are not yet proven in the case of the Hubeny-Rangamani-Takayanagi (HRT) formula, which applies to general dynamical spacetimes. Wall's maximin construction is the only known technique for extending inequalities of holographic entanglement entropy from the static to the dynamical case. We show that this method currently has no further utility when dealing with inequalities for five or fewer regions. Despite this negative result, we propose the validity of one new inequality for covariant holographic entanglement entropy for five regions. This inequality, while not maximin provable, is much weaker than many of the inequalities satisfied by the RT formula and should therefore be easier to prove. If it is valid, then there is strong evidence that holographic entanglement entropy plays a role in general spacetimes including those that arise in cosmology. Our new inequality is obtained by the assumption that the HRT formula satisfies every known balanced inequality obeyed by the Shannon entropies of classical probability distributions. This is a property that the RT formula has been shown to possess and which has been previously conjectured to hold for quantum mechanics in general.

  2. Entropy of international trades

    NASA Astrophysics Data System (ADS)

    Oh, Chang-Young; Lee, D.-S.

    2017-05-01

    The organization of international trade is highly complex, shaped by the collective efforts of participating countries toward economic profit given their inhomogeneous resources for production. Considering the trade flux as the probability of exporting a product from one country to another, we evaluate the entropy of world trade in the period 1950-2000. The trade entropy has increased with time, and we show that this is mainly due to the extension of trade partnership. For a given number of trade partners, the mean trade entropy is about 60% of the maximum possible entropy, independent of time, which can be regarded as a characteristic of the heterogeneity of the trade fluxes and is shown to derive from the scaling and functional behaviors of the universal trade-flux distribution. The correlation and time evolution of the individual countries' gross domestic products and the number of trade partners show that most countries achieved their economic growth partly by extending their trade relationships.
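    As a rough illustration of the quantity described above (not the authors' code), the following Python sketch computes the Shannon entropy of a normalized trade-flux matrix; the random matrix, its sparsity, and its normalization are assumptions made purely for demonstration.

      import numpy as np

      def trade_entropy(flux):
          """Shannon entropy of trade fluxes treated as probabilities.

          flux[i, j] is the (non-negative) export flux from country i to
          country j; entries are normalized so that they sum to one.
          """
          p = flux / flux.sum()          # convert fluxes to probabilities
          p = p[p > 0]                   # ignore absent trade links
          return -np.sum(p * np.log(p))  # entropy in nats

      # Hypothetical example: 50 countries with sparse random trade links.
      rng = np.random.default_rng(0)
      flux = rng.random((50, 50)) * (rng.random((50, 50)) < 0.2)
      np.fill_diagonal(flux, 0.0)        # no self-trade
      print(trade_entropy(flux))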

  3. Jarzynski equality in the context of maximum path entropy

    NASA Astrophysics Data System (ADS)

    González, Diego; Davis, Sergio

    2017-06-01

    Within the broader effort to derive nonequilibrium statistical mechanics axiomatically from fundamental principles, such as the maximum path entropy (also known as the Maximum Caliber principle), this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes in biological systems (protein folding), mechanical systems, and others. This equality relates the free energy difference between two equilibrium thermodynamic states to the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality is performed using the formalism of inference over path space. The derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamic settings such as social, financial, and ecological systems.
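    For readers unfamiliar with the identity, the Jarzynski equality states that <exp(-beta*W)> = exp(-beta*Delta F), with the average taken over repeated realizations of the nonequilibrium process. The sketch below illustrates only the corresponding free-energy estimator, not the authors' derivation: it uses synthetic Gaussian work values, for which Delta F = <W> - beta*sigma_W^2/2 is known in closed form.

      import numpy as np

      # Illustrative check of the Jarzynski free-energy estimator on a
      # Gaussian work distribution (all numbers below are made up).
      rng = np.random.default_rng(1)
      beta = 1.0                     # inverse temperature
      mean_w, sigma_w = 2.0, 0.8     # mean and std. dev. of the work
      work = rng.normal(mean_w, sigma_w, size=200_000)

      # Jarzynski estimator: Delta F = -(1/beta) * ln < exp(-beta W) >
      dF_jarzynski = -np.log(np.mean(np.exp(-beta * work))) / beta

      # Closed-form result for Gaussian work statistics.
      dF_exact = mean_w - beta * sigma_w**2 / 2

      print(dF_jarzynski, dF_exact)  # the two values should nearly agree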

  4. High resolution schemes and the entropy condition

    NASA Technical Reports Server (NTRS)

    Osher, S.; Chakravarthy, S.

    1983-01-01

    A systematic procedure is presented for constructing semidiscrete, second-order accurate, variation-diminishing, five-point bandwidth approximations to scalar conservation laws. These schemes are constructed to also satisfy a single discrete entropy inequality; thus, in the convex flux case, convergence to the unique physically correct solution is proven. For hyperbolic systems of conservation laws, this construction is used formally to extend the first author's first-order accurate scheme, and it is shown (under some minor technical hypotheses) that limit solutions satisfy an entropy inequality. Results concerning discrete shocks, a maximum principle, and maximal order of accuracy are obtained. Numerical applications are also presented.
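    For orientation (our notation, not necessarily the authors'), a semidiscrete scheme u_j(t) for the scalar law u_t + f(u)_x = 0 is said to satisfy a discrete entropy inequality when, for a convex entropy E with entropy flux F (F' = E' f'), there exists a consistent numerical entropy flux G such that

      \frac{d}{dt} E(u_j) + \frac{G_{j+1/2} - G_{j-1/2}}{\Delta x} \le 0,
      \qquad G(u, \dots, u) = F(u).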

  5. Physics of negative absolute temperatures.

    PubMed

    Abraham, Eitan; Penrose, Oliver

    2017-01-01

    Negative absolute temperatures were introduced into experimental physics by Purcell and Pound, who successfully applied this concept to nuclear spins; nevertheless, the concept has proved controversial: a recent article aroused considerable interest by its claim, based on a classical entropy formula (the "volume entropy") due to Gibbs, that negative temperatures violated basic principles of statistical thermodynamics. Here we give a thermodynamic analysis that confirms the negative-temperature interpretation of the Purcell-Pound experiments. We also examine the principal arguments that have been advanced against the negative temperature concept; we find that these arguments are not logically compelling, and moreover that the underlying "volume" entropy formula leads to predictions inconsistent with existing experimental results on nuclear spins. We conclude that, despite the counterarguments, negative absolute temperatures make good theoretical sense and did occur in the experiments designed to produce them.

  6. Functional entropy variables: A new methodology for deriving thermodynamically consistent algorithms for complex fluids, with particular reference to the isothermal Navier–Stokes–Korteweg equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ju, E-mail: jliu@ices.utexas.edu; Gomez, Hector; Evans, John A.

    2013-09-01

    We propose a new methodology for the numerical solution of the isothermal Navier–Stokes–Korteweg equations. Our methodology is based on a semi-discrete Galerkin method invoking functional entropy variables, a generalization of classical entropy variables, and a new time integration scheme. We show that the resulting fully discrete scheme is unconditionally stable-in-energy, second-order time-accurate, and mass-conservative. We utilize isogeometric analysis for spatial discretization and verify the aforementioned properties by adopting the method of manufactured solutions and comparing coarse mesh solutions with overkill solutions. Various problems are simulated to show the capability of the method. Our methodology provides a means of constructing unconditionally stable numerical schemes for nonlinear non-convex hyperbolic systems of conservation laws.

  7. Beating the Clauser-Horne-Shimony-Holt and the Svetlichny games with optimal states

    NASA Astrophysics Data System (ADS)

    Su, Hong-Yi; Ren, Changliang; Chen, Jing-Ling; Zhang, Fu-Lin; Wu, Chunfeng; Xu, Zhen-Peng; Gu, Mile; Vinjanampathy, Sai; Kwek, L. C.

    2016-02-01

    We study the relation between the maximal violation of Svetlichny's inequality and the mixedness of quantum states and obtain the optimal states (i.e., maximally nonlocal mixed states, or MNMS, for each value of linear entropy) to beat the Clauser-Horne-Shimony-Holt and the Svetlichny games. For the two-qubit and three-qubit MNMS, we show that these states are also the most tolerant against white noise, and thus serve as valuable quantum resources for such games. In particular, the quantum prediction of the MNMS decreases as the linear entropy increases, and then ceases to be nonlocal when the linear entropy reaches the critical points 2/3 and 9/14 for the two- and three-qubit cases, respectively. The MNMS are related to classical errors in the experimental preparation of maximally entangled states.

  8. Design of high-strength refractory complex solid-solution alloys

    DOE PAGES

    Singh, Prashant; Sharma, Aayush; Smirnov, A. V.; ...

    2018-03-28

    Nickel-based superalloys and near-equiatomic high-entropy alloys containing molybdenum are known for higher temperature strength and corrosion resistance. Yet, complex solid-solution alloys offer a huge design space to tune for optimal properties at slightly reduced entropy. For refractory Mo-W-Ta-Ti-Zr, we showcase KKR electronic structure methods via the coherent-potential approximation to identify alloys over a five-dimensional design space with improved mechanical properties and the necessary global (formation enthalpy) and local (short-range order) stability. Deformation is modeled with classical molecular dynamics simulations, validated against our first-principles data. We predict complex solid-solution alloys of improved stability with greatly enhanced modulus of elasticity (3× at 300 K) over near-equiatomic cases, as validated experimentally, and with higher moduli above 500 K over commercial alloys (2.3× at 2000 K). We also show that optimal complex solid-solution alloys are not described well by classical potentials due to critical electronic effects.

  9. Design of high-strength refractory complex solid-solution alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Prashant; Sharma, Aayush; Smirnov, A. V.

    Nickel-based superalloys and near-equiatomic high-entropy alloys containing molybdenum are known for higher temperature strength and corrosion resistance. Yet, complex solid-solution alloys offer a huge design space to tune for optimal properties at slightly reduced entropy. For refractory Mo-W-Ta-Ti-Zr, we showcase KKR electronic structure methods via the coherent-potential approximation to identify alloys over a five-dimensional design space with improved mechanical properties and the necessary global (formation enthalpy) and local (short-range order) stability. Deformation is modeled with classical molecular dynamics simulations, validated against our first-principles data. We predict complex solid-solution alloys of improved stability with greatly enhanced modulus of elasticity (3× at 300 K) over near-equiatomic cases, as validated experimentally, and with higher moduli above 500 K over commercial alloys (2.3× at 2000 K). We also show that optimal complex solid-solution alloys are not described well by classical potentials due to critical electronic effects.

  10. Book Review: Maxwell's Demon 2: Entropy, classical and quantum information, computing. Harvey Leff and Andrew Rex (Eds.); Institute of Physics, Bristol, 2003, 500pp., US$55, ISBN 0750307595

    NASA Astrophysics Data System (ADS)

    Shenker, Orly R.

    2004-09-01

    In 1867, James Clerk Maxwell proposed a perpetuum mobile of the second kind, that is, a counterexample to the Second Law of thermodynamics, which came to be known as "Maxwell's Demon." Unlike any other perpetual motion machine, this one escaped attempts by the best scientists and philosophers to show that the Second Law or its statistical mechanical counterparts are universal after all. "Maxwell's demon lives on. After more than 130 years of uncertain life and at least two pronouncements of death, this fanciful character seems more vibrant than ever." These words of Harvey Leff and Andrew Rex (1990), which open their introduction to Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (hereafter MD2), are very true: the Demon is as challenging and as intriguing as ever, and forces us to think and rethink about the foundations of thermodynamics and of statistical mechanics.

  11. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution.

    PubMed

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-08

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.

  12. Cross-conditional entropy and coherence analysis of pharmaco-EEG changes induced by alprazolam.

    PubMed

    Alonso, J F; Mañanas, M A; Romero, S; Rojas-Martínez, M; Riba, J

    2012-06-01

    Quantitative analysis of electroencephalographic (EEG) signals and their interpretation constitute a helpful tool in the assessment of the bioavailability of psychoactive drugs in the brain. Furthermore, psychotropic drug groups have typical signatures which relate biochemical mechanisms with specific EEG changes. The aim of this study was to analyze the pharmacological effect of a dose of alprazolam on brain connectivity during wakefulness by means of linear and nonlinear approaches. EEG signals were recorded after alprazolam administration in a placebo-controlled crossover clinical trial. Nonlinear couplings assessed by means of corrected cross-conditional entropy were compared to linear couplings measured with the classical magnitude squared coherence. Linear variables evidenced a statistically significant drug-induced decrease, whereas nonlinear variables showed significant increases. All changes were highly correlated with drug plasma concentrations. The spatial distribution of the observed connectivity changes clearly differed from that of a previous study: changes before and after the maximum drug effect were mainly observed over the anterior half of the scalp. Additionally, a new variable with very low computational cost was defined to evaluate nonlinear coupling, which is particularly interesting when all pairs of EEG channels are assessed, as in this study. Results showed that alprazolam induced changes in terms of uncoupling between regions of the scalp, with opposite trends depending on the variables: a decrease in linear ones and an increase in nonlinear features. The maps provided consistent information about how the brain changed in terms of connectivity, making it necessary to evaluate linear and nonlinear interactions separately.

  13. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production

    PubMed Central

    Kleidon, A.

    2010-01-01

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion. PMID:20368248

  14. Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions

    NASA Astrophysics Data System (ADS)

    Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus

    2017-10-01

    We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.

  15. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production.

    PubMed

    Kleidon, A

    2010-05-12

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.

  16. An entropy-based method for determining the flow depth distribution in natural channels

    NASA Astrophysics Data System (ADS)

    Moramarco, Tommaso; Corato, Giovanni; Melone, Florisa; Singh, Vijay P.

    2013-08-01

    A methodology is developed for determining the bathymetry of river cross-sections during floods from sampled surface flow velocity and existing low-flow hydraulic data. Similar to Chiu (1988), who proposed an entropy-based velocity distribution, the flow depth distribution in a cross-section of a natural channel is derived by entropy maximization. The depth distribution depends on one parameter, whose estimation is straightforward, and on the maximum flow depth. Applied to velocity data sets from five river gage sites, the method modeled the flow area observed during flow measurements and accurately assessed the corresponding discharge by coupling the flow depth distribution with the entropic relation between mean velocity and maximum velocity. The methodology opens a new perspective for flow monitoring by remote sensing, considering that the two main quantities on which it is based, i.e., surface flow velocity and flow depth, might be sensed by new sensors operating aboard aircraft or satellites.
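    The entropic relation between mean and maximum velocity used above is, in Chiu's standard formulation (quoted here for context, with M the entropy parameter estimated from site data),

      \frac{\bar{u}}{u_{\max}} = \frac{e^{M}}{e^{M} - 1} - \frac{1}{M}.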

  17. Bayesian Approach to Spectral Function Reconstruction for Euclidean Quantum Field Theories

    NASA Astrophysics Data System (ADS)

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.
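    For context (a standard MEM convention, not a formula taken from this paper), the Shannon-Jaynes entropy mentioned above is usually written relative to a default model m(omega) as

      S[\rho] = \int d\omega \left[ \rho(\omega) - m(\omega)
                - \rho(\omega) \ln\frac{\rho(\omega)}{m(\omega)} \right].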

  18. Bayesian approach to spectral function reconstruction for Euclidean quantum field theories.

    PubMed

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.

  19. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
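    As background to the construction described above (a generic sketch, not the authors' code), the maximum entropy density consistent with a set of raw-moment constraints has the exponential-family form p(x) ∝ exp(-sum_i lambda_i x^i), and the multipliers can be found by minimizing a convex dual function. The support and target moments below are made-up numbers chosen only for illustration.

      import numpy as np
      from scipy.optimize import minimize

      # Support grid and target raw moments E[x^k], k = 1..4 (hypothetical).
      x = np.linspace(-5.0, 5.0, 2001)
      dx = x[1] - x[0]
      target = np.array([0.5, 1.5, 1.2, 4.0])
      powers = np.vstack([x**k for k in range(1, 5)])

      def dual(lam):
          """Convex dual log Z(lam) + lam.target; its minimizer matches the moments."""
          expo = -lam @ powers
          m = expo.max()                         # numerical stabilization
          log_z = m + np.log(np.sum(np.exp(expo - m)) * dx)
          return log_z + lam @ target

      lam = minimize(dual, x0=np.zeros(4), method="BFGS").x
      p = np.exp(-lam @ powers)
      p /= np.sum(p) * dx                        # normalized maximum entropy PDF

      # Check: moments of the recovered density reproduce the targets.
      print([float(np.sum(x**k * p) * dx) for k in range(1, 5)])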

  20. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on gray-scale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. The process is divided into two steps. In the first step, the plain image is diffused using DHS with entropy maximization as the fitness function. In the second step, a horizontal and vertical permutation is applied to the best cipher image obtained in the previous step, with DHS used to minimize the correlation coefficient as the fitness function. The simulation results show that, using the proposed method, a maximum entropy of approximately 7.9998 and a minimum correlation coefficient of approximately 0.0001 are obtained.
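    The two fitness measures quoted above are standard image-encryption diagnostics: an 8-bit image has a maximum entropy of 8 bits per pixel, so 7.9998 is close to the ceiling, and a good cipher image should show near-zero correlation between adjacent pixels. A minimal sketch of how such metrics are commonly computed follows (illustrative only, not the paper's implementation).

      import numpy as np

      def image_entropy(img):
          """Shannon entropy (bits per pixel) of an 8-bit gray-scale image."""
          hist = np.bincount(img.ravel(), minlength=256).astype(float)
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def adjacent_correlation(img):
          """Correlation coefficient of horizontally adjacent pixel pairs."""
          x = img[:, :-1].ravel().astype(float)
          y = img[:, 1:].ravel().astype(float)
          return np.corrcoef(x, y)[0, 1]

      # Hypothetical cipher image: uniform random bytes (roughly what an
      # ideal cipher image should look like statistically).
      rng = np.random.default_rng(2)
      cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
      print(image_entropy(cipher), adjacent_correlation(cipher))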

  1. A maximum entropy thermodynamics of small systems.

    PubMed

    Dixit, Purushottam D

    2013-05-14

    We present a maximum entropy approach to analyze the state space of a small system in contact with a large bath, e.g., a solvated macromolecular system. For the solute, the fluctuations around the mean values of observables are not negligible and the probability distribution P(r) of the state space depends on the intricate details of the interaction of the solute with the solvent. Here, we employ a superstatistical approach: P(r) is expressed as a marginal distribution summed over the variation in β, the inverse temperature of the solute. The joint distribution P(β, r) is estimated by maximizing its entropy. We also calculate the first order system-size corrections to the canonical ensemble description of the state space. We test the development on a simple harmonic oscillator interacting with two baths with very different chemical identities, viz., (a) Lennard-Jones particles and (b) water molecules. In both cases, our method captures the state space of the oscillator sufficiently well. Future directions and connections with traditional statistical mechanics are discussed.
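    In symbols, the superstatistical decomposition described above reads (our paraphrase; the canonical weighting for P(r|beta) is shown only as the usual choice in superstatistics, not as a claim about how the paper estimates the joint distribution):

      P(r) = \int d\beta \, P(\beta, r) = \int d\beta \, P(\beta) \, P(r \mid \beta),
      \qquad P(r \mid \beta) = \frac{e^{-\beta E(r)}}{Z(\beta)} \ \text{(typical choice)}.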

  2. Statistical mechanics of letters in words

    PubMed Central

    Stephens, Greg J.; Bialek, William

    2013-01-01

    We consider words as a network of interacting letters, and approximate the probability distribution of states taken on by this network. Despite the intuition that the rules of English spelling are highly combinatorial and arbitrary, we find that maximum entropy models consistent with pairwise correlations among letters provide a surprisingly good approximation to the full statistics of words, capturing ~92% of the multi-information in four-letter words and even “discovering” words that were not represented in the data. These maximum entropy models incorporate letter interactions through a set of pairwise potentials and thus define an energy landscape on the space of possible words. Guided by the large letter redundancy we seek a lower-dimensional encoding of the letter distribution and show that distinctions between local minima in the landscape account for ~68% of the four-letter entropy. We suggest that these states provide an effective vocabulary which is matched to the frequency of word use and much smaller than the full lexicon. PMID:20866490
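    The pairwise maximum entropy model referred to above takes the usual Boltzmann-like form (our notation), with single-letter fields h_i and pairwise potentials J_ij defining an energy over four-letter strings x = (x_1, ..., x_4):

      P(x_1, \dots, x_4) = \frac{1}{Z} \exp\Big[ \sum_i h_i(x_i)
          + \sum_{i<j} J_{ij}(x_i, x_j) \Big],
      \qquad E(x) = -\sum_i h_i(x_i) - \sum_{i<j} J_{ij}(x_i, x_j).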

  3. Weak scale from the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings, are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which Big Bang nucleosynthesis starts, and M_pl is the Planck mass.

  4. Strong converse theorems using Rényi entropies

    NASA Astrophysics Data System (ADS)

    Leditzky, Felix; Wilde, Mark M.; Datta, Nilanjana

    2016-08-01

    We use a Rényi entropy method to prove strong converse theorems for certain information-theoretic tasks which involve local operations and quantum (or classical) communication between two parties. These include state redistribution, coherent state merging, quantum state splitting, measurement compression with quantum side information, randomness extraction against quantum side information, and data compression with quantum side information. The method we employ in proving these results extends ideas developed by Sharma [preprint arXiv:1404.5940 [quant-ph] (2014)], which he used to give a new proof of the strong converse theorem for state merging. For state redistribution, we prove the strong converse property for the boundary of the entire achievable rate region in the (e, q)-plane, where e and q denote the entanglement cost and quantum communication cost, respectively. In the case of measurement compression with quantum side information, we prove a strong converse theorem for the classical communication cost, which is a new result extending the previously known weak converse. For the remaining tasks, we provide new proofs for strong converse theorems previously established using smooth entropies. For each task, we obtain the strong converse theorem from explicit bounds on the figure of merit of the task in terms of a Rényi generalization of the optimal rate. Hence, we identify candidates for the strong converse exponents for each task discussed in this paper. To prove our results, we establish various new entropic inequalities, which might be of independent interest. These involve conditional entropies and mutual information derived from the sandwiched Rényi divergence. In particular, we obtain novel bounds relating these quantities, as well as the Rényi conditional mutual information, to the fidelity of two quantum states.

  5. Strong converse theorems using Rényi entropies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leditzky, Felix; Datta, Nilanjana; Wilde, Mark M.

    We use a Rényi entropy method to prove strong converse theorems for certain information-theoretic tasks which involve local operations and quantum (or classical) communication between two parties. These include state redistribution, coherent state merging, quantum state splitting, measurement compression with quantum side information, randomness extraction against quantum side information, and data compression with quantum side information. The method we employ in proving these results extends ideas developed by Sharma [preprint http://arxiv.org/abs/1404.5940 [quant-ph] (2014)], which he used to give a new proof of the strong converse theorem for state merging. For state redistribution, we prove the strong converse property for the boundary of the entire achievable rate region in the (e, q)-plane, where e and q denote the entanglement cost and quantum communication cost, respectively. In the case of measurement compression with quantum side information, we prove a strong converse theorem for the classical communication cost, which is a new result extending the previously known weak converse. For the remaining tasks, we provide new proofs for strong converse theorems previously established using smooth entropies. For each task, we obtain the strong converse theorem from explicit bounds on the figure of merit of the task in terms of a Rényi generalization of the optimal rate. Hence, we identify candidates for the strong converse exponents for each task discussed in this paper. To prove our results, we establish various new entropic inequalities, which might be of independent interest. These involve conditional entropies and mutual information derived from the sandwiched Rényi divergence. In particular, we obtain novel bounds relating these quantities, as well as the Rényi conditional mutual information, to the fidelity of two quantum states.

  6. Prediction of Metabolite Concentrations, Rate Constants and Post-Translational Regulation Using Maximum Entropy-Based Simulations with Application to Central Metabolism of Neurospora crassa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, William; Zucker, Jeremy; Baxter, Douglas

    We report the application of a recently proposed approach for modeling biological systems using a maximum entropy production rate principle in lieu of having in vivo rate constants. The method is applied in four steps: (1) a new ODE-based optimization approach based on Marcelin's 1910 mass action equation is used to obtain the maximum entropy distribution, (2) the predicted metabolite concentrations are compared to those generally expected from experiment using a loss function from which post-translational regulation of enzymes is inferred, (3) the system is re-optimized with the inferred regulation, from which rate constants are determined from the metabolite concentrations and reaction fluxes, and finally (4) a full ODE-based, mass action simulation with rate parameters and allosteric regulation is obtained. From the last step, the power characteristics and resistance of each reaction can be determined. The method is applied to the central metabolism of Neurospora crassa, and the flow of material through the three competing pathways of upper glycolysis, the non-oxidative pentose phosphate pathway, and the oxidative pentose phosphate pathway is evaluated as a function of the NADP/NADPH ratio. It is predicted that regulation of phosphofructokinase (PFK) and flow through the pentose phosphate pathway are essential for preventing an extreme level of fructose 1,6-bisphosphate accumulation. Such an extreme level of fructose 1,6-bisphosphate would otherwise result in a glassy cytoplasm with limited diffusion, dramatically decreasing the entropy and energy production rate and, consequently, biological competitiveness.

  7. Single-particle spectral density of the unitary Fermi gas: Novel approach based on the operator product expansion, sum rules and the maximum entropy method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gubler, Philipp, E-mail: pgubler@riken.jp; RIKEN Nishina Center, Wako, Saitama 351-0198; Yamamoto, Naoki

    2015-05-15

    Making use of the operator product expansion, we derive a general class of sum rules for the imaginary part of the single-particle self-energy of the unitary Fermi gas. The sum rules are analyzed numerically with the help of the maximum entropy method, which allows us to extract the single-particle spectral density as a function of both energy and momentum. These spectral densities contain basic information on the properties of the unitary Fermi gas, such as the dispersion relation and the superfluid pairing gap, for which we obtain reasonable agreement with the available results based on quantum Monte-Carlo simulations.

  8. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons of the success of MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measures of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.

  9. The calculation of transport properties in quantum liquids using the maximum entropy numerical analytic continuation method: Application to liquid para-hydrogen

    PubMed Central

    Rabani, Eran; Reichman, David R.; Krilov, Goran; Berne, Bruce J.

    2002-01-01

    We present a method based on augmenting an exact relation between a frequency-dependent diffusion constant and the imaginary time velocity autocorrelation function, combined with the maximum entropy numerical analytic continuation approach to study transport properties in quantum liquids. The method is applied to the case of liquid para-hydrogen at two thermodynamic state points: a liquid near the triple point and a high-temperature liquid. Good agreement for the self-diffusion constant and for the real-time velocity autocorrelation function is obtained in comparison to experimental measurements and other theoretical predictions. Improvement of the methodology and future applications are discussed. PMID:11830656

  10. Maximum entropy production, carbon assimilation, and the spatial organization of vegetation in river basins

    PubMed Central

    del Jesus, Manuel; Foti, Romano; Rinaldo, Andrea; Rodriguez-Iturbe, Ignacio

    2012-01-01

    The spatial organization of functional vegetation types in river basins is a major determinant of their runoff production, biodiversity, and ecosystem services. The optimization of different objective functions has been suggested to control the adaptive behavior of plants and ecosystems, often without a compelling justification. Maximum entropy production (MEP), rooted in thermodynamics principles, provides a tool to justify the choice of the objective function controlling vegetation organization. The application of MEP at the ecosystem scale results in maximum productivity (i.e., maximum canopy photosynthesis) as the thermodynamic limit toward which the organization of vegetation appears to evolve. Maximum productivity, which incorporates complex hydrologic feedbacks, allows us to reproduce the spatial macroscopic organization of functional types of vegetation in a thoroughly monitored river basin, without the need for a reductionist description of the underlying microscopic dynamics. The methodology incorporates the stochastic characteristics of precipitation and the associated soil moisture on a spatially disaggregated framework. Our results suggest that the spatial organization of functional vegetation types in river basins naturally evolves toward configurations corresponding to dynamically accessible local maxima of the maximum productivity of the ecosystem. PMID:23213227

  11. Crystal structure correlations with the intrinsic thermodynamics of human carbonic anhydrase inhibitor binding

    PubMed Central

    Smirnov, Alexey; Zubrienė, Asta; Manakova, Elena; Gražulis, Saulius

    2018-01-01

    The structure-thermodynamics correlation analysis was performed for a series of fluorine- and chlorine-substituted benzenesulfonamide inhibitors binding to several human carbonic anhydrase (CA) isoforms. The total of 24 crystal structures of 16 inhibitors bound to isoforms CA I, CA II, CA XII, and CA XIII provided the structural information on selective recognition between a compound and a CA isoform. The binding thermodynamics of all structures was determined by the analysis of binding-linked protonation events, yielding the intrinsic parameters, i.e., the enthalpy, entropy, and Gibbs energy of binding. Inhibitor binding was compared within structurally similar pairs that differ by para- or meta-substituents, enabling the contributing energies of ligand fragments to be obtained. The pairs were divided into two groups. First, similar binders, the pairs that keep the same orientation of the benzene ring, exhibited the classical hydrophobic effect: a less exothermic enthalpy and a more favorable entropy upon addition of hydrophobic fragments. Second, dissimilar binders, the pairs whose benzene rings adopt altered positions, exhibited the non-classical hydrophobic effect: a more favorable enthalpy and a variable entropy contribution. A deeper understanding of the energies contributing to protein-ligand recognition should lead toward the eventual goal of rational drug design, where chemical structures of ligands could be designed based on the target protein structure.

  12. Identifying topological-band insulator transitions in silicene and other 2D gapped Dirac materials by means of Rényi-Wehrl entropy

    NASA Astrophysics Data System (ADS)

    Calixto, M.; Romera, E.

    2015-02-01

    We propose a new method to identify transitions from a topological insulator to a band insulator in silicene (the silicon equivalent of graphene) in the presence of perpendicular magnetic and electric fields, by using the Rényi-Wehrl entropy of the quantum state in phase space. Electron-hole entropies display an inversion/crossing behavior at the charge neutrality point for any Landau level, and the combined entropy of particles plus holes turns out to be maximum at this critical point. The result is interpreted in terms of delocalization of the quantum state in phase space. The entropic description presented in this work will be valid in general 2D gapped Dirac materials, with a strong intrinsic spin-orbit interaction, isostructural with silicene.

  13. Relationships between self-diffusivity, packing fraction, and excess entropy in simple bulk and confined fluids.

    PubMed

    Mittal, Jeetain; Errington, Jeffrey R; Truskett, Thomas M

    2007-08-30

    Static measures such as density and entropy, which are intimately connected to structure, have featured prominently in modern thinking about the dynamics of the liquid state. Here, we explore the connections between self-diffusivity, density, and excess entropy for two of the most widely used model "simple" liquids, the equilibrium Lennard-Jones and square-well fluids, in both bulk and confined environments. We find that the self-diffusivity data of the Lennard-Jones fluid can be approximately collapsed onto a single curve (i) versus effective packing fraction and (ii) in appropriately reduced form versus excess entropy, as suggested by two well-known scaling laws. Similar data collapse does not occur for the square-well fluid, a fact that can be understood on the basis of the nontrivial effects that temperature has on its static structure. Nonetheless, we show that the implications of confinement for the self-diffusivity of both of these model fluids, over a broad range of equilibrium conditions, can be predicted on the basis of knowledge of the bulk fluid behavior and either the effective packing fraction or the excess entropy of the confined fluid. Excess entropy is perhaps the most preferable route due to its superior predictive ability and because it is a standard, unambiguous thermodynamic quantity that can be readily predicted via classical density functional theories of inhomogeneous fluids.
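    One of the two well-known scaling laws alluded to above is Rosenfeld's excess-entropy scaling, quoted here for context; a and b are empirical fit constants and s_ex is the excess entropy per particle (negative by construction):

      D^{*} \equiv D \, \frac{\rho^{1/3}}{\sqrt{k_{B} T / m}} \approx a \, e^{\, b \, s_{\mathrm{ex}}},
      \qquad s_{\mathrm{ex}} = \frac{S - S_{\mathrm{ideal}}}{N k_{B}} \le 0.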

  14. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract the target from a complex background more quickly and accurately, and to further improve defect detection, a method of dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization was proposed. Firstly, the method of single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then the intermediate variables in the formulae for Arimoto entropy dual-threshold selection were calculated recursively to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved by a chaotic sequence based on a tent map. The fast search for the two optimal thresholds was achieved using the improved bee colony optimization algorithm, accelerating the search considerably. A large number of experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation, and multi-threshold segmentation using reciprocal gray entropy, the proposed method can segment the target more quickly and accurately, with superior segmentation effect. It proves to be a fast and effective method for image segmentation.

  15. On the morphological instability of a bubble during inertia-controlled growth

    NASA Astrophysics Data System (ADS)

    Martyushev, L. M.; Birzina, A. I.; Soboleva, A. S.

    2018-06-01

    The morphological stability of a spherical bubble growing under inertia control is analyzed. Based on a comparison of the entropy production for distorted and undistorted surfaces and using the maximum entropy production principle, the morphological instability of the bubble under distortions of arbitrary amplitude is shown. This result makes it possible to explain a number of experiments in which surface roughness of bubbles was observed during their explosive-type growth.

  16. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.

  17. Trends in entropy production during ecosystem development in the Amazon Basin.

    PubMed

    Holdaway, Robert J; Sparrow, Ashley D; Coomes, David A

    2010-05-12

    Understanding successional trends in energy and matter exchange across the ecosystem-atmosphere boundary layer is an essential focus in ecological research; however, a general theory describing the observed pattern remains elusive. This paper examines whether the principle of maximum entropy production could provide the solution. A general framework is developed for calculating entropy production using data from terrestrial eddy covariance and micrometeorological studies. We apply this framework to data from eight tropical forest and pasture flux sites in the Amazon Basin and show that forest sites had consistently higher entropy production rates than pasture sites (0.461 versus 0.422 W m^-2 K^-1, respectively). It is suggested that during development, changes in canopy structure minimize surface albedo, and development of deeper root systems optimizes access to soil water and thus potential transpiration, resulting in lower surface temperatures and increased entropy production. We discuss our results in the context of a theoretical model of entropy production versus ecosystem developmental stage. We conclude that, although further work is required, entropy production could potentially provide a much-needed theoretical basis for understanding the effects of deforestation and land-use change on the land-surface energy balance.

  18. Prediction of pKa Values for Neutral and Basic Drugs based on Hybrid Artificial Intelligence Methods.

    PubMed

    Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin

    2018-03-05

    The pKa value of drugs is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm was proposed based on population entropy diversity. In the improved algorithm, when the population entropy was higher than the set maximum threshold, the convergence strategy was adopted; when the population entropy was lower than the set minimum threshold, the divergence strategy was adopted; when the population entropy was between the maximum and minimum thresholds, the self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied in the training of a radial basis function artificial neural network (RBF ANN) model and in the selection of molecular descriptors. A quantitative structure-activity relationship model based on an RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 neutral and basic drugs and then validated on another database containing 20 molecules. The validation results showed that the model had good prediction performance. The absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can be used as a reference for exploring other quantitative structure-activity relationships.

  19. On the pH Dependence of the Potential of Maximum Entropy of Ir(111) Electrodes.

    PubMed

    Ganassin, Alberto; Sebastián, Paula; Climent, Víctor; Schuhmann, Wolfgang; Bandarenka, Aliaksandr S; Feliu, Juan

    2017-04-28

    Studies of the entropy of the components forming the electrode/electrolyte interface can give fundamental insights into the properties of electrified interphases. In particular, the potential at which the entropy of formation of the double layer is maximal (the potential of maximum entropy, PME) is an important parameter for the characterization of electrochemical systems. Indeed, this parameter determines the majority of electrode processes. In this work, we determine PMEs for Ir(111) electrodes, which currently play an important role in understanding electrocatalysis for energy provision; at the same time, iridium is one of the most stable metals against corrosion. For the experiments, we used a combination of the laser-induced potential transient to determine the PME and CO charge displacement to determine the potential of zero total charge (E_PZTC). Both the PME and E_PZTC were assessed for perchlorate solutions in the pH range from 1 to 4. Surprisingly, we found that they are located in the potential regions where the adsorption of hydrogen and hydroxyl species takes place, respectively. The PMEs shifted by ~30 mV per pH unit (on the RHE scale). Connections between the PME and the electrocatalytic properties of the electrode surface are discussed.

  20. Developing the fuzzy c-means clustering algorithm based on maximum entropy for multitarget tracking in a cluttered environment

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing

    2018-01-01

    For fast and more effective tracking of multiple targets in a cluttered environment, we propose a multiple-target tracking (MTT) algorithm, called maximum entropy fuzzy c-means clustering joint probabilistic data association, that combines fuzzy c-means clustering and the joint probabilistic data association (PDA) algorithm. The algorithm uses the membership value to express the probability of a target originating from a measurement. The membership value is obtained through a fuzzy c-means clustering objective function optimized by the maximum entropy principle. When considering the effect of shared measurements, we use a correction factor to adjust the association probability matrix used to estimate the state of the target. As the algorithm avoids confirmation matrix splitting, it can solve the high computational load problem of the joint PDA algorithm. The results of simulations and analysis conducted for tracking neighboring parallel targets and crossing targets in cluttered environments of different densities show that the proposed algorithm can realize MTT quickly and efficiently in a cluttered environment. Further, the performance of the proposed algorithm remains constant with increasing process noise variance. The proposed algorithm has the advantages of efficiency and low computational load, which can ensure optimum performance when tracking multiple targets in a dense cluttered environment.

  1. Maximum entropy methods for extracting the learned features of deep neural networks.

    PubMed

    Finnegan, Alex; Song, Jun S

    2017-10-01

    New architectures of multilayer artificial neural networks and new methods for training them are rapidly revolutionizing the application of machine learning in diverse fields, including business, social science, physical sciences, and biology. Interpreting deep neural networks, however, currently remains elusive, and a critical challenge lies in understanding which meaningful features a network is actually learning. We present a general method for interpreting deep neural networks and extracting network-learned features from input data. We describe our algorithm in the context of biological sequence analysis. Our approach, based on ideas from statistical physics, samples from the maximum entropy distribution over possible sequences, anchored at an input sequence and subject to constraints implied by the empirical function learned by a network. Using our framework, we demonstrate that local transcription factor binding motifs can be identified from a network trained on ChIP-seq data and that nucleosome positioning signals are indeed learned by a network trained on chemical cleavage nucleosome maps. Imposing a further constraint on the maximum entropy distribution also allows us to probe whether a network is learning global sequence features, such as the high GC content in nucleosome-rich regions. This work thus provides valuable mathematical tools for interpreting and extracting learned features from feed-forward neural networks.

  2. Formulating the shear stress distribution in circular open channels based on the Renyi entropy

    NASA Astrophysics Data System (ADS)

    Khozani, Zohreh Sheikh; Bonakdari, Hossein

    2018-01-01

    The principle of maximum entropy is employed to derive the shear stress distribution by maximizing the Renyi entropy subject to some constraints and by assuming that dimensionless shear stress is a random variable. A Renyi entropy-based equation can be used to model the shear stress distribution along the entire wetted perimeter of circular channels and circular channels with flat beds and deposited sediments. A wide range of experimental results for 12 hydraulic conditions with different Froude numbers (0.375 to 1.71) and flow depths (20.3 to 201.5 mm) were used to validate the derived shear stress distribution. For circular channels, model performance enhanced with increasing flow depth (mean relative error (RE) of 0.0414) and only deteriorated slightly at the greatest flow depth (RE of 0.0573). For circular channels with flat beds, the Renyi entropy model predicted the shear stress distribution well at lower sediment depth. The Renyi entropy model results were also compared with Shannon entropy model results. Both models performed well for circular channels, but for circular channels with flat beds the Renyi entropy model displayed superior performance in estimating the shear stress distribution. The Renyi entropy model was highly precise and predicted the shear stress distribution in a circular channel with RE of 0.0480 and in a circular channel with a flat bed with RE of 0.0488.
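    For reference, the Rényi entropy of order alpha that is maximized in this type of approach has the standard form (continuous version shown; the Shannon entropy is recovered in the limit alpha -> 1):

      H_{\alpha}[f] = \frac{1}{1 - \alpha} \ln\!\left( \int f(\tau)^{\alpha} \, d\tau \right),
      \qquad \alpha > 0, \ \alpha \neq 1.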

  3. Linearity of holographic entanglement entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almheiri, Ahmed; Dong, Xi; Swingle, Brian

    Here, we consider the question of whether the leading contribution to the entanglement entropy in holographic CFTs is truly given by the expectation value of a linear operator as is suggested by the Ryu-Takayanagi formula. We investigate this property by computing the entanglement entropy, via the replica trick, in states dual to superpositions of macroscopically distinct geometries and find it consistent with evaluating the expectation value of the area operator within such states. However, we find that this fails once the number of semi-classical states in the superposition grows exponentially in the central charge of the CFT. Moreover, in certain such scenarios we find that the choice of surface on which to evaluate the area operator depends on the density matrix of the entire CFT. This nonlinearity is enforced in the bulk via the homology prescription of Ryu-Takayanagi. We thus conclude that the homology constraint is not a linear property in the CFT. We also discuss the existence of entropy operators in general systems with a large number of degrees of freedom.

  4. Linearity of holographic entanglement entropy

    DOE PAGES

    Almheiri, Ahmed; Dong, Xi; Swingle, Brian

    2017-02-14

    Here, we consider the question of whether the leading contribution to the entanglement entropy in holographic CFTs is truly given by the expectation value of a linear operator as is suggested by the Ryu-Takayanagi formula. We investigate this property by computing the entanglement entropy, via the replica trick, in states dual to superpositions of macroscopically distinct geometries and find it consistent with evaluating the expectation value of the area operator within such states. However, we find that this fails once the number of semi-classical states in the superposition grows exponentially in the central charge of the CFT. Moreover, in certain such scenarios we find that the choice of surface on which to evaluate the area operator depends on the density matrix of the entire CFT. This nonlinearity is enforced in the bulk via the homology prescription of Ryu-Takayanagi. We thus conclude that the homology constraint is not a linear property in the CFT. We also discuss the existence of entropy operators in general systems with a large number of degrees of freedom.

  5. Linear growth of the entanglement entropy and the Kolmogorov-Sinai rate

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Hackl, Lucas; Yokomizo, Nelson

    2018-03-01

    The rate of entropy production in a classical dynamical system is characterized by the Kolmogorov-Sinai entropy rate h_KS, given by the sum of all positive Lyapunov exponents of the system. We prove a quantum version of this result valid for bosonic systems with unstable quadratic Hamiltonian. The derivation takes into account the case of time-dependent Hamiltonians with Floquet instabilities. We show that the entanglement entropy S_A of a Gaussian state grows linearly for large times in unstable systems, with a rate Λ_A ≤ h_KS determined by the Lyapunov exponents and the choice of the subsystem A. We apply our results to the analysis of entanglement production in unstable quadratic potentials and due to periodic quantum quenches in many-body quantum systems. Our results are relevant for quantum field theory, for which we present three applications: a scalar field in a symmetry-breaking potential, parametric resonance during post-inflationary reheating, and cosmological perturbations during inflation. Finally, we conjecture that the same rate Λ_A appears in the entanglement growth of chaotic quantum systems prepared in a semiclassical state.

  6. Towards a second law for Lovelock theories

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sayantani; Haehl, Felix M.; Kundu, Nilay; Loganayagam, R.; Rangamani, Mukund

    2017-03-01

    In classical general relativity described by Einstein-Hilbert gravity, black holes behave as thermodynamic objects. In particular, the laws of black hole mechanics can be interpreted as laws of thermodynamics. The first law of black hole mechanics extends to higher derivative theories via the Noether charge construction of Wald. One also expects the statement of the second law, which in Einstein-Hilbert theory owes to Hawking's area theorem, to extend to higher derivative theories. To argue for this however one needs a notion of entropy for dynamical black holes, which the Noether charge construction does not provide. We propose such an entropy function for the family of Lovelock theories, treating the higher derivative terms as perturbations to the Einstein-Hilbert theory. Working around a dynamical black hole solution, and making no assumptions about the amplitude of departure from equilibrium, we construct a candidate entropy functional valid to all orders in the low energy effective field theory. This entropy functional satisfies a second law, modulo a certain subtle boundary term, which deserves further investigation in non-spherically symmetric situations.

  7. Coherent Behavior and the Bound State of Water and K+ Imply Another Model of Bioenergetics: Negative Entropy Instead of High-energy Bonds

    PubMed Central

    Jaeken, Laurent; Vasilievich Matveev, Vladimir

    2012-01-01

    Observations of coherent cellular behavior cannot be integrated into the widely accepted membrane (pump) theory (MT) and its steady-state energetics because of the thermal noise of the assumed ordinary cell water and freely soluble cytoplasmic K+. However, Ling disproved MT and proposed an alternative based on coherence, showing that rest (R) and action (A) are two different phases of protoplasm with different energy levels. The R-state is a coherent metastable low-entropy state, as water and K+ are bound to unfolded proteins. The A-state is the higher-entropy state because water and K+ are free. The R-to-A phase transition is regarded as a mechanism to release energy for biological work, replacing the classical concept of high-energy bonds. Subsequent inactivation during the endergonic A-to-R phase transition needs an input of metabolic energy to restore the low-entropy R-state. Matveev's native aggregation hypothesis allows the energetic details of globular proteins to be integrated into this view. PMID:23264833

  8. An Integrated Approach to Thermodynamics in the Introductory Physics Course.

    ERIC Educational Resources Information Center

    Alonso, Marcelo; Finn, Edward J.

    1995-01-01

    Presents an approach to combine the empirical approach of classical thermodynamics with the structural approach of statistical mechanics. Topics covered include dynamical foundation of the first law; mechanical work, heat, radiation, and the first law; thermal equilibrium; thermal processes; thermodynamic probability; entropy; the second law;…

  9. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power-law distributions play an increasingly important role in the study of complex systems. Starting from the idea of incomplete statistics, which is motivated by the insolvability of complex systems, three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy. Probability distribution functions of exponential form, of power-law form, and of the product form of a power function and an exponential function are then derived from the Shannon entropy and the maximum entropy principle. This shows that the maximum entropy principle can fully replace the equal-probability hypothesis. Because power-law distributions and distributions of the product form, which cannot be derived from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, it can also be concluded that the maximum entropy principle is a more basic principle, one that embodies broader concepts and reveals the motion laws of objects more fundamentally. At the same time, the principle reveals an intrinsic link between Nature and the various objects of human society and the principles they obey.
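
    The three distribution families mentioned above also follow from the standard (complete-statistics) maximum entropy calculation once different moment constraints are imposed; the lines below are that generic textbook result rather than the paper's incomplete-statistics derivation, with β and λ denoting Lagrange multipliers:

```latex
\max_{p}\ S[p] = -\int p(x)\,\ln p(x)\,dx
\quad\text{subject to}\quad \int p(x)\,dx = 1 \ \text{and, respectively:}
\langle x\rangle \ \text{fixed}
  \;\Rightarrow\; p(x) \propto e^{-\beta x} \quad\text{(exponential)},
\langle \ln x\rangle \ \text{fixed}
  \;\Rightarrow\; p(x) \propto x^{-\lambda} \quad\text{(power law)},
\langle x\rangle \ \text{and}\ \langle \ln x\rangle \ \text{fixed}
  \;\Rightarrow\; p(x) \propto x^{-\lambda}\,e^{-\beta x} \quad\text{(power law with exponential cutoff)}.
```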

  10. Statistics of Infima and Stopping Times of Entropy Production and Applications to Active Molecular Processes

    NASA Astrophysics Data System (ADS)

    Neri, Izaak; Roldán, Édgar; Jülicher, Frank

    2017-01-01

    We study the statistics of infima, stopping times, and passage probabilities of entropy production in nonequilibrium steady states, and we show that they are universal. We consider two examples of stopping times: first-passage times of entropy production and waiting times of stochastic processes, which are the times when a system reaches a given state for the first time. Our main results are as follows: (i) The distribution of the global infimum of entropy production is exponential with mean equal to minus Boltzmann's constant; (ii) we find exact expressions for the passage probabilities of entropy production; (iii) we derive a fluctuation theorem for stopping-time distributions of entropy production. These results have interesting implications for stochastic processes that can be discussed in simple colloidal systems and in active molecular processes. In particular, we show that the timing and statistics of discrete chemical transitions of molecular processes, such as the steps of molecular motors, are governed by the statistics of entropy production. We also show that the extreme-value statistics of active molecular processes are governed by entropy production; for example, we derive a relation between the maximal excursion of a molecular motor against the direction of an external force and the infimum of the corresponding entropy-production fluctuations. Using this relation, we make predictions for the distribution of the maximum backtrack depth of RNA polymerases, which follow from our universal results for entropy-production infima.

  11. Entanglement of Distillation for Lattice Gauge Theories.

    PubMed

    Van Acoleyen, Karel; Bultinck, Nick; Haegeman, Jutho; Marien, Michael; Scholz, Volkher B; Verstraete, Frank

    2016-09-23

    We study the entanglement structure of lattice gauge theories from the local operational point of view, and, similar to Soni and Trivedi [J. High Energy Phys. 1 (2016) 1], we show that the usual entanglement entropy for a spatial bipartition can be written as the sum of an undistillable gauge part and of another part corresponding to the local operations and classical communication distillable entanglement, which is obtained by depolarizing the local superselection sectors. We demonstrate that the distillable entanglement is zero for pure Abelian gauge theories at zero gauge coupling, while it is in general nonzero for the non-Abelian case. We also consider gauge theories with matter, and show in a perturbative approach how area laws-including a topological correction-emerge for the distillable entanglement. Finally, we also discuss the entanglement entropy of gauge fixed states and show that it has no relation to the physical distillable entropy.

  12. Application of exergetic sustainability index to a nano-scale irreversible Brayton cycle operating with ideal Bose and Fermi gasses

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-09-01

    In this study, a nano-scale irreversible Brayton cycle operating with quantum gasses, including Bose and Fermi gasses, is investigated. Developments in nanotechnology make the study of nano-scale machines, including thermal systems, unavoidable. A thermodynamic analysis of a nano-scale irreversible Brayton cycle operating with Bose and Fermi gasses was performed, with particular emphasis on the exergetic sustainability index. In addition, a thermodynamic analysis involving classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies was conducted. Results are presented numerically and some useful recommendations are given. Important results include the following: entropy generation and the exergetic sustainability index are affected most strongly by the parameter x for the Bose gas, whereas power output and exergy output are affected most strongly for the Fermi gas. At high-temperature conditions, work output and entropy generation take high values compared with other degeneracy conditions.

  13. Gravitational vacuum condensate stars.

    PubMed

    Mazur, Pawel O; Mottola, Emil

    2004-06-29

    A new final state of gravitational collapse is proposed. By extending the concept of Bose-Einstein condensation to gravitational systems, a cold, dark, compact object with an interior de Sitter condensate p_v = -ρ_v and an exterior Schwarzschild geometry of arbitrary total mass M is constructed. These regions are separated by a shell with a small but finite proper thickness l of fluid with equation of state p = +ρ, replacing both the Schwarzschild and de Sitter classical horizons. The new solution has no singularities, no event horizons, and a global time. Its entropy is maximized under small fluctuations and is given by the standard hydrodynamic entropy of the thin shell, which is of order k_B l M c/ℏ, instead of the Bekenstein-Hawking entropy formula, S_BH = 4π k_B G M²/(ℏ c). Hence, unlike black holes, the new solution is thermodynamically stable and has no information paradox.

  14. Rényi entropies characterizing the shape and the extension of the phase space representation of quantum wave functions in disordered systems.

    PubMed

    Varga, Imre; Pipek, János

    2003-08-01

    We discuss some properties of the generalized entropies, called Rényi entropies, and their application to the case of continuous distributions. In particular, it is shown that these measures of complexity can be divergent; however, their differences are free from these divergences, thus enabling them to be good candidates for the description of the extension and the shape of continuous distributions. We apply this formalism to the projection of wave functions onto the coherent state basis, i.e., to the Husimi representation. We also show how the localization properties of the Husimi distribution on average can be reconstructed from its marginal distributions that are calculated in position and momentum space in the case when the phase space has no structure, i.e., no classical limit can be defined. Numerical simulations on a one-dimensional disordered system corroborate our expectations.

  15. Spatial correlation in matter-wave interference as a measure of decoherence, dephasing, and entropy

    NASA Astrophysics Data System (ADS)

    Chen, Zilin; Beierle, Peter; Batelaan, Herman

    2018-04-01

    The loss of contrast in double-slit electron diffraction due to dephasing and decoherence processes is studied. It is shown that the spatial intensity correlation function of diffraction patterns can be used to distinguish between dephasing and decoherence. This establishes a measure of time reversibility that does not require the determination of coherence terms of the density matrix, while von Neumann entropy, another measure of time reversibility, does require coherence terms. This technique is exciting in view of the need to understand and control the detrimental experimental effect of contrast loss and for fundamental studies on the transition from the classical to the quantum regime.

  16. Natural Scale for Employee's Payment Based on the Entropy Law

    NASA Astrophysics Data System (ADS)

    Cosma, Ioan; Cosma, Adrian

    2009-05-01

    An econophysical model aimed at establishing an equitable scale of employees' salaries, in accordance with the importance and effectiveness of their labor, is considered. Our model, based on the concept and law of entropy, can designate all the parameters connected to the level of personal incomes and taxation, and also to the distribution of employees versus amount of salary in any remuneration system. Consistent with the laws of classical and statistical thermodynamics, this scale reveals that personal incomes increase progressively in a natural logarithmic way, unlike other scales arbitrarily established by the governments of each country or by employing companies.

  17. Use and validity of principles of extremum of entropy production in the study of complex systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitor Reis, A., E-mail: ahr@uevora.pt

    2014-07-15

    It is shown how both the principles of extremum of entropy production, which are often used in the study of complex systems, follow from the maximization of overall system conductivities, under appropriate constraints. In this way, the maximum rate of entropy production (MEP) occurs when all the forces in the system are kept constant. On the other hand, the minimum rate of entropy production (mEP) occurs when all the currents that cross the system are kept constant. A brief discussion on the validity of the application of the mEP and MEP principles in several cases, and in particular to the Earth's climate, is also presented. -- Highlights: •The principles of extremum of entropy production are not first principles. •They result from the maximization of conductivities under appropriate constraints. •The conditions of their validity are set explicitly. •Some long-standing controversies are discussed and clarified.
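
    A single linear-conductor example, added here only for concreteness and not taken from the paper, shows why the two extrema correspond to different constraints. For a conductor with conductivity G, thermodynamic force X and current J = GX, the entropy production rate σ can be written in two ways:

```latex
\sigma = J X = G X^{2} \quad (\text{fixed force } X:\ \sigma \text{ increases with } G\text{, i.e. MEP}),
\qquad
\sigma = \frac{J^{2}}{G} \quad (\text{fixed current } J:\ \sigma \text{ decreases with } G\text{, i.e. mEP}).
```

    Maximizing the conductivity therefore maximizes the entropy production rate under fixed forces and minimizes it under fixed currents, which is the constrained-maximization statement summarized in the abstract.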

  18. Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways

    PubMed Central

    Galinsky, Vitaly L.; Frank, Lawrence R.

    2015-01-01

    We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution, for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes guided by a global structure of the entropy spectrum coupled with a small-scale local diffusion. The intervoxel diffusion is sampled by multi b-shell multi q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in the most scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167
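
    The eigenvector step can be sketched in a few lines. The sketch below follows the generic maximum-entropy random-walk construction (transition probabilities built from the dominant eigenpair of a non-negative coupling matrix); the function name and the toy coupling matrix are illustrative and not part of the published pipeline:

```python
import numpy as np

def max_entropy_transitions(A):
    """Maximum-entropy transition probabilities from a symmetric,
    non-negative coupling matrix A: P_ij = A_ij * psi_j / (lam * psi_i),
    where (lam, psi) is the dominant (Perron) eigenpair of A."""
    eigvals, eigvecs = np.linalg.eigh(A)
    lam = eigvals[-1]                 # largest eigenvalue
    psi = np.abs(eigvecs[:, -1])      # Perron vector, taken positive
    return A * psi[None, :] / (lam * psi[:, None])

# toy example: a 4-node chain coupling
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
P = max_entropy_transitions(A)
print(P.sum(axis=1))   # each row sums to 1: a valid stochastic matrix
```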

  19. A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano

    In self-organization, energy gradients across complex systems lead to changes in the structure of those systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action, and is coupled to the total energy flowing through a system, which leads to an increase in action efficiency. We compare energy transport through a fluid cell whose molecules move randomly and a cell that can form convection cells. We examine the signs of the change of entropy and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission, compared to random motion. In more complex systems, those convection cells form a network of transport channels that obey the equations of motion in this geometry. Such transport networks are an essential feature of complex systems in biology, ecology, economy and society.

  20. Modelling the spreading rate of controlled communicable epidemics through an entropy-based thermodynamic model

    NASA Astrophysics Data System (ADS)

    Wang, WenBin; Wu, ZiNiu; Wang, ChunFeng; Hu, RuiFeng

    2013-11-01

    A model based on a thermodynamic approach is proposed for predicting the dynamics of communicable epidemics assumed to be governed by controlling efforts of multiple scales, so that an entropy is associated with the system. All the epidemic details are factored into a single, time-dependent coefficient; the functional form of this coefficient is found through four constraints, including notably the existence of an inflexion point and a maximum. The model is solved to give a log-normal distribution for the spread rate, for which a Shannon entropy can be defined. The only parameter, which characterizes the width of the distribution function, is uniquely determined through maximizing the rate of entropy production. This entropy-based thermodynamic (EBT) model predicts the number of hospitalized cases with reasonable accuracy for SARS in the year 2003. The EBT model can be of use for potential epidemics such as avian influenza and H7N9 in China.

  1. Entropic criterion for model selection

    NASA Astrophysics Data System (ADS)

    Tseng, Chih-Yuan

    2006-10-01

    Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet this raises two questions: why use this criterion, and are there any other criteria? Moreover, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [Relative entropy and inductive inference, in: G. Erickson, Y. Zhai (Eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings, vol. 707, 2004 (available from arXiv.org/abs/physics/0311093)], we show relative entropy to be a unique criterion, which requires no prior information and can be applied to different fields. We examine this criterion by considering a physical problem, simple fluids, and the results are promising.
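
    As a concrete illustration of ranking candidate models by relative entropy (a generic sketch, not the inductive-inference machinery discussed in the abstract; the model names and numbers are invented):

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q) between discrete distributions."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# rank two hypothetical model predictions against an observed histogram
observed = np.array([0.10, 0.40, 0.30, 0.20])
models = {"model_A": np.array([0.25, 0.25, 0.25, 0.25]),
          "model_B": np.array([0.12, 0.38, 0.28, 0.22])}
ranking = sorted(models, key=lambda m: relative_entropy(observed, models[m]))
print(ranking)   # the model with the smaller divergence is preferred
```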

  2. Steepest entropy ascent quantum thermodynamic model of electron and phonon transport

    NASA Astrophysics Data System (ADS)

    Li, Guanchen; von Spakovsky, Michael R.; Hin, Celine

    2018-01-01

    An advanced nonequilibrium thermodynamic model for electron and phonon transport is formulated based on the steepest-entropy-ascent quantum thermodynamics framework. This framework, based on the principle of steepest entropy ascent (or the equivalent maximum entropy production principle), inherently satisfies the laws of thermodynamics and mechanics and is applicable at all temporal and spatial scales even in the far-from-equilibrium realm. Specifically, the model is proven to recover the Boltzmann transport equations in the near-equilibrium limit and the two-temperature model of electron-phonon coupling when no dispersion is assumed. The heat and mass transport at a temperature discontinuity across a homogeneous interface where the dispersion and coupling of electron and phonon transport are both considered are then modeled. Local nonequilibrium system evolution and nonquasiequilibrium interactions are predicted and the results discussed.

  3. Conserved actions, maximum entropy and dark matter haloes

    NASA Astrophysics Data System (ADS)

    Pontzen, Andrew; Governato, Fabio

    2013-03-01

    We use maximum entropy arguments to derive the phase-space distribution of a virialized dark matter halo. Our distribution function gives an improved representation of the end product of violent relaxation. This is achieved by incorporating physically motivated dynamical constraints (specifically on orbital actions) which prevent arbitrary redistribution of energy. We compare the predictions with three high-resolution dark matter simulations of widely varying mass. The numerical distribution function is accurately predicted by our argument, producing an excellent match for the vast majority of particles. The remaining particles constitute the central cusp of the halo (≲4 per cent of the dark matter). They can be accounted for within the presented framework once the short dynamical time-scales of the centre are taken into account.

  4. Optimal protocol for maximum work extraction in a feedback process with a time-varying potential

    NASA Astrophysics Data System (ADS)

    Kwon, Chulan

    2017-12-01

    The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work both for the finite-time and the quasi-static process.

  5. Numerical optimization using flow equations.

    PubMed

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  6. Numerical optimization using flow equations

    NASA Astrophysics Data System (ADS)

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  7. LIBOR troubles: Anomalous movements detection based on maximum entropy

    NASA Astrophysics Data System (ADS)

    Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria

    2016-05-01

    According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. These procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.

  8. Maximum entropy modeling of metabolic networks by constraining growth-rate moments predicts coexistence of phenotypes

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele

    2017-12-01

    In this work maximum entropy distributions in the space of steady states of metabolic networks are considered upon constraining the first and second moments of the growth rate. Coexistence of fast and slow phenotypes, with bimodal flux distributions, emerges upon considering control on the average growth (optimization) and its fluctuations (heterogeneity). This is applied to the carbon catabolic core of Escherichia coli where it quantifies the metabolic activity of slow growing phenotypes and it provides a quantitative map with metabolic fluxes, opening the possibility to detect coexistence from flux data. A preliminary analysis on data for E. coli cultures in standard conditions shows degeneracy for the inferred parameters that extend in the coexistence region.
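
    For reference, constraining the first two moments of the growth rate λ in a maximum entropy distribution over the feasible steady-state flux polytope generically gives an exponential of a quadratic in λ; the expression below is that generic consequence written with assumed multiplier names, not a formula copied from the paper:

```latex
P(\mathbf{v}) \;\propto\; \exp\!\big(\beta_{1}\,\lambda(\mathbf{v}) + \beta_{2}\,\lambda(\mathbf{v})^{2}\big),
\qquad \mathbf{v} \in \text{steady-state flux polytope},
```

    where the Lagrange multipliers β₁ and β₂ are fixed by the imposed mean and variance of λ; bimodal growth-rate distributions can then arise from the interplay of this weight with the density of states of λ over the polytope.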

  9. An understanding of human dynamics in urban subway traffic from the Maximum Entropy Principle

    NASA Astrophysics Data System (ADS)

    Yong, Nuo; Ni, Shunjiang; Shen, Shifei; Ji, Xuewei

    2016-08-01

    We studied the distribution of entry time intervals in Beijing subway traffic by analyzing smart card transaction data, and then deduced the probability distribution function of the entry time interval based on the Maximum Entropy Principle. Both theoretical derivation and data statistics indicated that the entry time interval obeys a power-law distribution with an exponential cutoff. In addition, we pointed out the constraint conditions for this distribution form and discussed how the constraints affect the distribution function. It is speculated that for bursts and heavy tails in human dynamics, when the fitted power exponent is less than 1.0, the distribution cannot be a pure power law but must include an exponential cutoff, which may have been ignored in previous studies.

  10. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
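
    Schematically, the quantity optimized in this family of methods is the usual MaxEnt trade-off between data misfit and entropy relative to a default model; the expressions below use generic notation (G_n for the data, K_nm for the continuation kernel, A_m for the discretized spectrum, D_m for the default model, σ_n for the error bars) and state the standard form of the objective rather than the exact conventions of the software described:

```latex
A_{\alpha} = \arg\min_{A \ge 0}\ \Big[\tfrac{1}{2}\,\chi^{2}[A] \;-\; \alpha\, S[A]\Big],
\qquad
\chi^{2}[A] = \sum_{n}\frac{\big(G_{n} - \sum_{m} K_{nm} A_{m}\big)^{2}}{\sigma_{n}^{2}},
\qquad
S[A] = \sum_{m}\Big[A_{m} - D_{m} - A_{m}\ln\frac{A_{m}}{D_{m}}\Big],
```

    with the entropy weight α then selected from the behavior of χ²(α): large α over-smooths the spectrum toward the default model, while small α fits noise.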

  11. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  12. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Proctor, D.

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.

  13. How many universes are necessary for an ice cream to melt?

    NASA Astrophysics Data System (ADS)

    Cirkovic, Milan M.

    We investigate a quantitative consequence of the Acausal-Anthropic approach to solving the long-standing puzzle of the thermodynamical arrow of time. Notably, the size of the required multiverse is estimated on the basis of the classical Boltzmann connection between entropy and probability, as well as the thermodynamic properties of black holes.

  14. Contact symmetries and Hamiltonian thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravetti, A., E-mail: bravetti@correo.nucleares.unam.mx; Lopez-Monsalvo, C.S., E-mail: cesar.slm@correo.nucleares.unam.mx; Nettel, F., E-mail: Francisco.Nettel@roma1.infn.it

    It has been shown that contact geometry is the proper framework underlying classical thermodynamics and that thermodynamic fluctuations are captured by an additional metric structure related to Fisher’s Information Matrix. In this work we analyse several unaddressed aspects about the application of contact and metric geometry to thermodynamics. We consider here the Thermodynamic Phase Space and start by investigating the role of gauge transformations and Legendre symmetries for metric contact manifolds and their significance in thermodynamics. Then we present a novel mathematical characterization of first order phase transitions as equilibrium processes on the Thermodynamic Phase Space for which the Legendre symmetry is broken. Moreover, we use contact Hamiltonian dynamics to represent thermodynamic processes in a way that resembles the classical Hamiltonian formulation of conservative mechanics and we show that the relevant Hamiltonian coincides with the irreversible entropy production along thermodynamic processes. Therefore, we use such property to give a geometric definition of thermodynamically admissible fluctuations according to the Second Law of thermodynamics. Finally, we show that the length of a curve describing a thermodynamic process measures its entropy production.

  15. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services. The types of ecosystem services flow were reclassified. Using entropy theory, the degree of disorder and the development trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder, with the system close to the verge of being unhealthy. The system entropy reached maximum values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of the total permanent population of Beijing and the urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.

  16. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns.

    PubMed

    Osgood, Nathaniel D; Paul, Tuhin; Stanley, Kevin G; Qian, Weicheng

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist.
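
    The entropy rate in question is typically estimated directly from the symbol sequence with a Lempel-Ziv match-length estimator; the sketch below is a minimal, unoptimized version of that standard estimator (it illustrates the quantity whose scaling is discussed in the abstract, not the scaling derivation itself):

```python
import math

def lz_entropy_rate(symbols):
    """Lempel-Ziv match-length estimate of the entropy rate (bits per
    symbol) of a sequence of hashable location labels."""
    n = len(symbols)
    match_lengths = []
    for i in range(n):
        # shortest substring starting at i that has not appeared in symbols[:i]
        k = 1
        while i + k <= n and _appears(symbols[:i], symbols[i:i + k]):
            k += 1
        match_lengths.append(k)
    return n * math.log2(n) / sum(match_lengths)

def _appears(prefix, sub):
    m = len(sub)
    return any(prefix[j:j + m] == sub for j in range(len(prefix) - m + 1))

# toy trace of visited locations sampled at a fixed rate
trace = list("AABABABCABCDAABB")
print(round(lz_entropy_rate(trace), 3))
```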

  17. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; then the Choquet integral is introduced to calculate the comprehensive evaluation value of each city from the bottom up; finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
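
    The bottom-up aggregation step is a discrete Choquet integral of the indicator scores with respect to a capacity on the indicator set; the sketch below shows only that aggregation step, with a small hypothetical three-indicator capacity (in the paper the capacity is obtained from the maximum Shapley entropy principle, which is not reproduced here):

```python
def choquet_integral(scores, capacity):
    """Discrete Choquet integral of attribute scores with respect to a
    capacity: a function from frozensets of attributes to [0, 1] that is
    monotone, with capacity(empty set) = 0 and capacity(all attributes) = 1."""
    items = sorted(scores.items(), key=lambda kv: kv[1])   # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for attr, value in items:
        total += (value - prev) * capacity(frozenset(remaining))
        prev = value
        remaining.remove(attr)
    return total

# hypothetical capacity over three illustrative indicators
weights = {frozenset({"economy"}): 0.4, frozenset({"environment"}): 0.4,
           frozenset({"society"}): 0.3,
           frozenset({"economy", "environment"}): 0.6,   # partially redundant pair
           frozenset({"economy", "society"}): 0.8,
           frozenset({"environment", "society"}): 0.8,
           frozenset({"economy", "environment", "society"}): 1.0}
score = choquet_integral({"economy": 0.7, "environment": 0.5, "society": 0.9},
                         lambda s: weights[s])
print(round(score, 3))   # 0.72 for this toy example
```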

  18. Measuring Questions: Relevance and its Relation to Entropy

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

    The Boolean lattice of logical statements induces the free distributive lattice of questions. Inclusion on this lattice is based on whether one question answers another. Generalizing the zeta function of the question lattice leads to a valuation called relevance or bearing, which is a measure of the degree to which one question answers another. Richard Cox conjectured that this degree can be expressed as a generalized entropy. With the assistance of yet another important result from János Aczél, I show that this is indeed the case; and that the resulting inquiry calculus is a natural generalization of information theory. This approach provides a new perspective of the Principle of Maximum Entropy.

  19. Early history of extended irreversible thermodynamics (1953-1983): An exploration beyond local equilibrium and classical transport theory

    NASA Astrophysics Data System (ADS)

    Lebon, G.; Jou, D.

    2015-06-01

    This paper gives a historical account of the early years (1953-1983) of extended irreversible thermodynamics (EIT). The salient features of this formalism are to upgrade the thermodynamic fluxes of mass, momentum, energy, and others, to the status of independent variables, and to explore the consistency between generalized transport equations and a generalized version of the second law of thermodynamics. This requires going beyond classical irreversible thermodynamics by redefining entropy and entropy flux. EIT provides deeper foundations, closer relations with microscopic formalisms, a wider spectrum of applications, and a more exciting conceptual appeal to non-equilibrium thermodynamics. We first recall the historical contributions by Maxwell, Cattaneo, and Grad on generalized transport equations. A thermodynamic theory wide enough to cope with such transport equations was independently proposed between 1953 and 1983 by several authors, each emphasizing different kinds of problems. In 1983, the first international meeting on this theory took place in Bellaterra (Barcelona). It provided the opportunity for the various authors to meet together for the first time and to discuss the common points and the specific differences of their previous formulations. From then on, a large amount of applications and theoretical confirmations have emerged. From the historical point of view, the emergence of EIT has been an opportunity to revisit the foundations and to open new avenues in thermodynamics, one of the most classical and well consolidated physical theories.

  20. Estimation of depth to magnetic source using maximum entropy power spectra, with application to the Peru-Chile Trench

    USGS Publications Warehouse

    Blakely, Richard J.

    1981-01-01

    Estimations of the depth to magnetic sources using the power spectrum of magnetic anomalies generally require long magnetic profiles. The method developed here uses the maximum entropy power spectrum (MEPS) to calculate depth to source on short windows of magnetic data; resolution is thereby improved. The method operates by dividing a profile into overlapping windows, calculating a maximum entropy power spectrum for each window, linearizing the spectra, and calculating with least squares the various depth estimates. The assumptions of the method are that the source is two dimensional and that the intensity of magnetization includes random noise; knowledge of the direction of magnetization is not required. The method is applied to synthetic data and to observed marine anomalies over the Peru-Chile Trench. The analyses indicate a continuous magnetic basement extending from the eastern margin of the Nazca plate and into the subduction zone. The computed basement depths agree with acoustic basement seaward of the trench axis, but deepen as the plate approaches the inner trench wall. This apparent increase in the computed depths may result from the deterioration of magnetization in the upper part of the ocean crust, possibly caused by compressional disruption of the basaltic layer. Landward of the trench axis, the depth estimates indicate possible thrusting of the oceanic material into the lower slope of the continental margin.
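
    The core of the window-wise estimate is a straight-line fit to the logarithm of the power spectrum, whose slope is proportional to the source depth. The sketch below illustrates that step with an ordinary FFT periodogram standing in for the maximum entropy (Burg) spectrum used in the paper; function and variable names are illustrative:

```python
import numpy as np

def depth_from_window(profile, dx):
    """Depth to magnetic source from one window of a profile: for sources at
    depth h the power spectrum decays roughly as exp(-2*k*h), so a least-squares
    line through ln P(k) versus angular wavenumber k has slope about -2*h.
    An FFT periodogram stands in for the maximum entropy spectrum here."""
    x = np.asarray(profile, float)
    x = x - x.mean()
    P = np.abs(np.fft.rfft(x)) ** 2
    k = 2.0 * np.pi * np.fft.rfftfreq(x.size, d=dx)   # angular wavenumber
    keep = k > 0                                      # drop the DC bin
    slope, _ = np.polyfit(k[keep], np.log(P[keep]), 1)
    return -slope / 2.0

# synthetic check: build a profile whose amplitude spectrum decays as exp(-k*h)
dx, n, h = 1.0, 256, 5.0
k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)
rng = np.random.default_rng(0)
phase = np.exp(1j * 2.0 * np.pi * rng.random(k.size))
phase[0] = phase[-1] = 1.0                 # keep DC and Nyquist terms real
profile = np.fft.irfft(np.exp(-k * h) * phase, n)
print(round(depth_from_window(profile, dx), 2))   # close to h = 5.0
```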

  1. Multiple Diffusion Mechanisms Due to Nanostructuring in Crowded Environments

    PubMed Central

    Sanabria, Hugo; Kubota, Yoshihisa; Waxham, M. Neal

    2007-01-01

    One of the key questions regarding intracellular diffusion is how the environment affects molecular mobility. Mostly, intracellular diffusion has been described as hindered, and the physical reasons for this behavior are: immobile barriers, molecular crowding, and binding interactions with immobile or mobile molecules. Using results from multi-photon fluorescence correlation spectroscopy, we describe how immobile barriers and crowding agents affect translational mobility. To study the hindrance produced by immobile barriers, we used sol-gels (silica nanostructures) that consist of a continuous solid phase and aqueous phase in which fluorescently tagged molecules diffuse. In the case of molecular crowding, translational mobility was assessed in increasing concentrations of 500 kDa dextran solutions. Diffusion of fluorescent tracers in both sol-gels and dextran solutions shows clear evidence of anomalous subdiffusion. In addition, data from the autocorrelation function were analyzed using the maximum entropy method as adapted to fluorescence correlation spectroscopy data and compared with the standard model that incorporates anomalous diffusion. The maximum entropy method revealed evidence of different diffusion mechanisms that had not been revealed using the anomalous diffusion model. These mechanisms likely correspond to nanostructuring in crowded environments and to the relative dimensions of the crowding agent with respect to the tracer molecule. Analysis with the maximum entropy method also revealed information about the degree of heterogeneity in the environment as reported by the behavior of diffusive molecules. PMID:17040979

  2. Possible dynamical explanations for Paltridge's principle of maximum entropy production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virgo, Nathaniel, E-mail: nathanielvirgo@gmail.com; Ikegami, Takashi, E-mail: nathanielvirgo@gmail.com

    2014-12-05

    Throughout the history of non-equilibrium thermodynamics a number of theories have been proposed in which complex, far from equilibrium flow systems are hypothesised to reach a steady state that maximises some quantity. Perhaps the most celebrated is Paltridge's principle of maximum entropy production for the horizontal heat flux in Earth's atmosphere, for which there is some empirical support. There have been a number of attempts to derive such a principle from maximum entropy considerations. However, we currently lack a more mechanistic explanation of how any particular system might self-organise into a state that maximises some quantity. This is in contrast to equilibrium thermodynamics, in which models such as the Ising model have been a great help in understanding the relationship between the predictions of MaxEnt and the dynamics of physical systems. In this paper we show that, unlike in the equilibrium case, Paltridge-type maximisation in non-equilibrium systems cannot be achieved by a simple dynamical feedback mechanism. Nevertheless, we propose several possible mechanisms by which maximisation could occur. Showing that these occur in any real system is a task for future work. The possibilities presented here may not be the only ones. We hope that by presenting them we can provoke further discussion about the possible dynamical mechanisms behind extremum principles for non-equilibrium systems, and their relationship to predictions obtained through MaxEnt.

  3. The information geometry of chaos

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo

    2008-10-01

    In this Thesis, we propose a new theoretical information-geometric framework (IGAC, Information Geometrodynamical Approach to Chaos) suitable to characterize chaotic dynamical behavior of arbitrary complex systems. First, the problem being investigated is defined; its motivation and relevance are discussed. The basic tools of information physics and the relevant mathematical tools employed in this work are introduced. The basic aspects of Entropic Dynamics (ED) are reviewed. ED is an information-constrained dynamics developed by Ariel Caticha to investigate the possibility that laws of physics---either classical or quantum---may emerge as macroscopic manifestations of underlying microscopic statistical structures. ED is of primary importance in our IGAC. The notion of chaos in classical and quantum physics is introduced. Special focus is devoted to the conventional Riemannian geometrodynamical approach to chaos (Jacobi geometrodynamics) and to the Zurek-Paz quantum chaos criterion of linear entropy growth. After presenting this background material, we show that the ED formalism is not purely an abstract mathematical framework, but is indeed a general theoretical scheme from which conventional Newtonian dynamics is obtained as a special limiting case. The major elements of our IGAC and the novel notion of information geometrodynamical entropy (IGE) are introduced by studying two "toy models". To illustrate the potential power of our IGAC, one application is presented. An information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth is suggested. Finally, concluding remarks emphasizing strengths and weak points of our approach are presented and possible further research directions are addressed. At this stage of its development, IGAC remains an ambitious unifying information-geometric theoretical construct for the study of chaotic dynamics with several unsolved problems. However, based on our recent findings, we believe it already provides an interesting, innovative and potentially powerful way to study and understand the very important and challenging problems of classical and quantum chaos.

  4. Minimax Quantum Tomography: Estimators and Relative Entropy Bounds.

    PubMed

    Ferrie, Christopher; Blume-Kohout, Robin

    2016-03-04

    A minimax estimator has the minimum possible error ("risk") in the worst case. We construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case, but superior accuracy on most states.

  5. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled.
In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  6. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Bilsky, A. V.; Lozhkin, V. A.; Markovich, D. M.; Tokarev, M. P.

    2013-04-01

    This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents the theoretical computation performance comparison for MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and validation of synthetic images demonstrate significant computational time reduction. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART.

  7. Online Robot Dead Reckoning Localization Using Maximum Relative Entropy Optimization With Model Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urniezius, Renaldas

    2011-03-14

    The principle of Maximum relative Entropy optimization was analyzed for dead reckoning localization of a rigid body when observation data from two attached accelerometers were collected. Model constraints were derived from the relationships between the sensors. The experimental results confirmed that the noise on each accelerometer axis can be successfully filtered by utilizing the dependency between channels and the dependency between time series data. The dependency between channels was used for the a priori calculation, and the a posteriori distribution was derived utilizing the dependency between time series data. Data from an autocalibration experiment were revisited by removing the initial assumption that the instantaneous rotation axis of a rigid body was known. Performance results confirmed that such an approach could be used for online dead reckoning localization.

  8. LensEnt2: Maximum-entropy weak lens reconstruction

    NASA Astrophysics Data System (ADS)

    Marshall, P. J.; Hobson, M. P.; Gull, S. F.; Bridle, S. L.

    2013-08-01

    LensEnt2 is a maximum entropy reconstructor of weak lensing mass maps. The method takes each galaxy shape as an independent estimator of the reduced shear field and incorporates an intrinsic smoothness, determined by Bayesian methods, into the reconstruction. The uncertainties from both the intrinsic distribution of galaxy shapes and galaxy shape estimation are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures is calculated with corresponding uncertainties. The input is a galaxy ellipticity catalog, with each measured galaxy shape treated as a noisy tracer of the reduced shear field, which is inferred on a fine pixel grid assuming positivity and smoothness on scales of w arcsec, where w is an input parameter. The ICF width w can be chosen by computing the evidence for it.

  9. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    NASA Astrophysics Data System (ADS)

    Fowlie, Andrew

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.
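    As a hedged sketch of what a quantified-maximum-entropy analysis of this kind looks like in practice, the code below finds the maximum a posteriori profile under an entropic prior of strength beta combined with a Gaussian likelihood, and scans beta to move between prior-dominated ("halo-dependent") and data-dominated ("halo-independent") solutions. The linear response G, the data, and the default model m are synthetic placeholders, not the DAMA/LIBRA likelihood used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def entropic_map(G, d, sigma, m, beta):
    """MAP profile under an entropic prior: maximize beta*S(f) - chi2(f)/2.

    G     : (n_data, n_bins) linear response mapping profile f to predictions.
    d     : (n_data,) noisy measurements with Gaussian errors sigma.
    m     : (n_bins,) default model; S(f) = sum f - m - f*log(f/m).
    beta  : strength of the entropic prior (large beta pulls f toward m).
    """
    def neg_log_post(log_f):
        f = np.exp(log_f)                      # enforce positivity
        chi2 = np.sum(((G @ f - d) / sigma) ** 2)
        S = np.sum(f - m - f * np.log(f / m))  # entropy relative to the default model
        return 0.5 * chi2 - beta * S
    res = minimize(neg_log_post, np.log(m), method="L-BFGS-B")
    return np.exp(res.x)

# toy setup: smeared, noisy measurements of an unknown positive profile
rng = np.random.default_rng(0)
n_bins, n_data = 30, 10
G = rng.uniform(size=(n_data, n_bins)) / n_bins
f_true = np.exp(-0.5 * ((np.arange(n_bins) - 12) / 4.0) ** 2)
sigma = 0.02 * np.ones(n_data)
d = G @ f_true + sigma * rng.standard_normal(n_data)
m = np.full(n_bins, f_true.mean())             # "Maxwellian-like" default model

for beta in (0.1, 10.0):                       # data-dominated vs prior-dominated limits
    f = entropic_map(G, d, sigma, m, beta)
    print(beta, np.round(f[:5], 3))
```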

  10. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007).

    PubMed

    Pueyo, Salvador

    2012-05-01

    An increasing number of authors agree that the maximum entropy principle (MaxEnt) is essential for understanding macroecological patterns. However, there are subtle but crucial differences among the approaches of several of these authors, which poses a major obstacle for anyone interested in applying the MaxEnt methodology in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement over the earlier paper by Pueyo et al. (2007) and also over the views of Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach.

  11. Bayesian Image Segmentations by Potts Prior and Loopy Belief Propagation

    NASA Astrophysics Data System (ADS)

    Tanaka, Kazuyuki; Kataoka, Shun; Yasuda, Muneki; Waizumi, Yuji; Hsu, Chiou-Ting

    2014-12-01

    This paper presents a Bayesian image segmentation model based on a Potts prior and loopy belief propagation. The proposed Bayesian model involves several terms, including the pairwise interactions of the Potts model and the mean vectors and covariance matrices of the Gaussian distributions used in color image modeling. These terms are often referred to as hyperparameters in statistical machine learning theory. In order to determine these hyperparameters, we propose a new scheme for hyperparameter estimation based on conditional maximization of entropy in the Potts prior. The algorithm is formulated in terms of loopy belief propagation. In addition, we compare our conditional maximum entropy framework with the conventional maximum likelihood framework, and also clarify how first-order phase transitions in loopy belief propagation for Potts models influence our hyperparameter estimation procedure.
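    The hyperparameter-estimation scheme of the paper is not reproduced here; the sketch below is only a minimal sum-product loopy belief propagation routine for a Potts-prior MRF on a 4-connected grid, with fixed hand-chosen hyperparameters (coupling J and Gaussian unary likelihoods), to show the message-passing machinery the abstract refers to.

```python
import numpy as np

def potts_lbp(unary, J=1.0, n_iters=30):
    """Sum-product loopy BP for a Potts MRF on a 4-connected H x W grid.

    unary : (H, W, Q) per-pixel likelihoods P(y_i | x_i = q).
    J     : Potts coupling; the pairwise factor is exp(J) if labels agree, 1 otherwise.
    Returns approximate label marginals of shape (H, W, Q).
    """
    H, W, Q = unary.shape
    psi = np.ones((Q, Q)) + (np.exp(J) - 1.0) * np.eye(Q)   # Potts pairwise factor
    dirs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    # msgs[d][i, j, :] = message arriving at pixel (i, j) from its neighbour at (i+di, j+dj)
    msgs = [np.ones((H, W, Q)) / Q for _ in dirs]
    for _ in range(n_iters):
        new_msgs = []
        for di, dj in dirs:
            # belief at each pixel acting as sender, excluding the message from its receiver
            belief = unary.copy()
            for d2, m in enumerate(msgs):
                if d2 != dirs.index((-di, -dj)):
                    belief = belief * m
            out = belief @ psi                               # marginalize the sender's label
            out /= out.sum(axis=-1, keepdims=True) + 1e-300
            # re-index so the message lands at the receiving pixel (border pixels stay uniform)
            shifted = np.ones((H, W, Q)) / Q
            if di == -1: shifted[1:] = out[:-1]
            if di == 1:  shifted[:-1] = out[1:]
            if dj == -1: shifted[:, 1:] = out[:, :-1]
            if dj == 1:  shifted[:, :-1] = out[:, 1:]
            new_msgs.append(shifted)
        msgs = new_msgs
    belief = unary.copy()
    for m in msgs:
        belief = belief * m
    return belief / (belief.sum(axis=-1, keepdims=True) + 1e-300)

# toy usage: two-class segmentation of a noisy binary image
rng = np.random.default_rng(1)
img = np.zeros((20, 20)); img[:, 10:] = 1.0
obs = img + 0.8 * rng.standard_normal(img.shape)
unary = np.stack([np.exp(-0.5 * (obs - mu) ** 2) for mu in (0.0, 1.0)], axis=-1)
labels = potts_lbp(unary, J=1.5).argmax(axis=-1)
```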

  12. On the sufficiency of pairwise interactions in maximum entropy models of networks

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya; Merchan, Lina

    Biological information processing networks consist of many components, which are coupled by an even larger number of complex multivariate interactions. However, analyses of data sets from fields as diverse as neuroscience, molecular biology, and behavior have reported that observed statistics of states of some biological networks can be approximated well by maximum entropy models with only pairwise interactions among the components. Based on simulations of random Ising spin networks with p-spin (p > 2) interactions, here we argue that this reduction in complexity can be thought of as a natural property of some densely interacting networks in certain regimes, and not necessarily as a special property of living systems. This work was supported in part by James S. McDonnell Foundation Grant No. 220020321.
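    A minimal version of the pairwise maximum entropy (Ising) fit mentioned above can be written down exactly for small systems by enumerating all states and matching sample means and pairwise correlations by gradient ascent; the sketch below does this. The data, learning rate, and system size are illustrative assumptions, not the simulation setup of the work described.

```python
import itertools
import numpy as np

def fit_pairwise_maxent(data, lr=0.1, n_steps=3000):
    """Fit P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j), s_i in {-1, +1},
    by gradient ascent on the log-likelihood, matching <s_i> and <s_i s_j>.
    Exact enumeration of all 2^N states -- only sensible for small N."""
    n_samples, N = data.shape
    states = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)
    target_mean = data.mean(axis=0)
    target_corr = data.T @ data / n_samples
    h = np.zeros(N)
    J = np.zeros((N, N))
    for _ in range(n_steps):
        logp = states @ h + 0.5 * np.einsum("ki,ij,kj->k", states, J, states)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        model_mean = p @ states
        model_corr = states.T @ (states * p[:, None])
        h += lr * (target_mean - model_mean)
        dJ = lr * (target_corr - model_corr)
        np.fill_diagonal(dJ, 0.0)        # no self-couplings
        J += dJ
    return h, J

# toy data: 6 binary units with weak shared correlations
rng = np.random.default_rng(0)
data = np.sign(rng.standard_normal((500, 6)) + 0.3 * rng.standard_normal((500, 1)))
h, J = fit_pairwise_maxent(data)
print(np.round(h, 2))
```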

  13. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007)

    PubMed Central

    Pueyo, Salvador

    2012-01-01

    An increasing number of authors agree that the maximum entropy principle (MaxEnt) is essential for understanding macroecological patterns. However, there are subtle but crucial differences among the approaches of several of these authors, which poses a major obstacle for anyone interested in applying the MaxEnt methodology in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement over the earlier paper by Pueyo et al. (2007) and also over the views of Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach. PMID:22837843

  15. MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory

    NASA Astrophysics Data System (ADS)

    Harte, J.

    2017-12-01

    The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.
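    The core MaxEnt step underlying theories of this kind is the solution of a Lagrange-multiplier condition: given a constraint (for example a mean abundance), find the least-biased distribution consistent with it. The sketch below illustrates this for a single mean constraint over abundances n = 1..N; it is a generic MaxEnt calculation under assumed values, not the METE state-variable machinery itself.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_abundance(N_max, mean_n):
    """Least-biased P(n) on n = 1..N_max subject to <n> = mean_n.

    MaxEnt with a single linear constraint gives P(n) ~ exp(-lam * n);
    the Lagrange multiplier lam is found by root bracketing."""
    n = np.arange(1, N_max + 1)

    def mean_of(lam):
        w = np.exp(-lam * n)
        return (n * w).sum() / w.sum()

    lam = brentq(lambda l: mean_of(l) - mean_n, -50.0 / N_max, 50.0)
    w = np.exp(-lam * n)
    return n, w / w.sum(), lam

n, p, lam = maxent_abundance(N_max=1000, mean_n=20.0)
print(f"lambda = {lam:.4f}, check <n> = {(n * p).sum():.2f}")
```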

  16. A non-uniformly sampled 4D HCC(CO)NH-TOCSY experiment processed using maximum entropy for rapid protein sidechain assignment

    PubMed Central

    Mobli, Mehdi; Stern, Alan S.; Bermel, Wolfgang; King, Glenn F.; Hoch, Jeffrey C.

    2010-01-01

    One of the stiffest challenges in structural studies of proteins using NMR is the assignment of sidechain resonances. Typically, a panel of lengthy 3D experiments is acquired in order to establish connectivities and resolve ambiguities due to overlap. We demonstrate that these experiments can be replaced by a single 4D experiment that is time-efficient, yields excellent resolution, and captures unique carbon-proton connectivity information. The approach is made practical by the use of non-uniform sampling in the three indirect time dimensions and maximum entropy reconstruction of the corresponding 3D frequency spectrum. This 4D method will facilitate automated resonance assignment procedures, and it should be particularly beneficial for increasing throughput in NMR-based structural genomics initiatives. PMID:20299257

  17. A maximum entropy model for chromatin structure

    NASA Astrophysics Data System (ADS)

    Farre, Pau; Emberly, Eldon; Emberly Group Team

    The DNA inside the nucleus of eukaryotic cells shows a variety of conserved structures at different length scales. These structures are formed by interactions between protein complexes that bind to the DNA and regulate gene activity. Recent high-throughput sequencing techniques allow for the measurement both of the genome-wide contact map of the folded DNA within a cell (Hi-C) and of where various proteins are bound to the DNA (ChIP-seq). In this talk I will present a maximum-entropy method capable of both predicting Hi-C contact maps from binding data, and binding data from Hi-C contact maps. This method results in an intuitive Ising-type model that is able to predict how altering the presence of binding factors can modify chromosome conformation, without the need for polymer simulations.

  18. On the shape of things: From holography to elastica

    NASA Astrophysics Data System (ADS)

    Fonda, Piermarco; Jejjala, Vishnu; Veliz-Osorio, Alvaro

    2017-10-01

    We explore the question of which shape a manifold is compelled to take when immersed in another one, provided it must be the extremum of some functional. We consider a family of functionals which depend quadratically on the extrinsic curvatures and on projections of the ambient curvatures. These functionals capture a number of physical setups ranging from holography to the study of membranes and elastica. We present a detailed derivation of the equations of motion, known as the shape equations, placing particular emphasis on the issue of gauge freedom in the choice of normal frame. We apply these equations to the particular case of holographic entanglement entropy for higher-curvature three-dimensional gravity and find new classes of entangling curves. In particular, we discuss the case of New Massive Gravity, where we show that non-geodesic entangling curves always have a smaller on-shell value of the entropy functional; nevertheless, the correct value for the entanglement entropy is provided by geodesics. We then apply this formalism to the computation of the entanglement entropy for dual logarithmic CFTs. Finally, we discuss the importance of these equations in the context of classical elastica and comment on terms that break gauge invariance.

  19. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  20. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  1. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
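    DeConv_Tool itself is IDL code; as a language-neutral illustration of one of the algorithm families it implements, the sketch below runs a Richardson-Lucy (maximum-likelihood for Poisson noise) deconvolution in one dimension. The point-spread function and test signal are synthetic, and the routine is not a port of the package.

```python
import numpy as np

def richardson_lucy(image, psf, n_iters=50, eps=1e-12):
    """Richardson-Lucy maximum-likelihood deconvolution (1-D sketch).

    Multiplicative updates preserve positivity and increase the Poisson
    likelihood of the blurred-image model at every step."""
    psf = psf / psf.sum()
    psf_flipped = psf[::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iters):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / (blurred + eps)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# toy example: two Gaussian-blurred point sources
x = np.zeros(64); x[20] = 5.0; x[40] = 3.0
psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
blurred = np.convolve(x, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
print(np.argsort(restored)[-2:])   # indices of the two largest restored values
```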

  2. Like Beauty, Complexity is Hard to Define

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    Like beauty, complexity is hard to define and rather easy to identify: nonlinear dynamics, strongly interconnected simple elements, some sort of divisoria aquorum between order and disorder. Before focusing on complexity, let us remember that the theoretical pillars of contemporary physics are mechanics (Newtonian, relativistic, quantum), Maxwell electromagnetism, and (Boltzmann-Gibbs, BG) statistical mechanics - obligatory basic disciplines in any advanced course in physics. The first-principle statistical-mechanical approach starts from (microscopic) electro-mechanics and the theory of probabilities, and, through a variety of possible mesoscopic descriptions, arrives at (macroscopic) thermodynamics. In the middle of this trip, we cross energy and entropy. Energy is related to the possible microscopic configurations of the system, whereas entropy is related to the corresponding probabilities. Therefore, in some sense, entropy represents a concept which, epistemologically speaking, is one step further with regard to energy. The fact that energy is not parameter-independent is very familiar: the kinetic energy of a truck is very different from that of a fly, and the relativistic energy of a fast electron is very different from its classical value, and so on. What about entropy? One hundred and forty years of tradition, and hundreds - we may even say thousands - of impressive theoretical successes of the parameter-free BG entropy have sedimented, in the minds of many scientists, the conviction that it is unique. However, it can be straightforwardly argued that, in general, this is not the case...

  3. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    performing. All reasonable permutations of factors will be used to develop a multitude of unique combinations. These combinations are considered different...are seen below (Duda et al., 2001). Entropy impurity: $i(N) = -\sum_j P(\omega_j)\log_2 P(\omega_j)$ (9); Gini impurity: $i(N) = \sum_{i\neq j} P(\omega_i)P(\omega_j) = \tfrac{1}{2}\bigl[1 - \sum_j P^2(\omega_j)\bigr]$...proportion of one class to another approaches 0.5, the impurity measure reaches its maximum, which for Entropy is 1.0, while it is 0.5 for Gini and
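    A quick numerical check of the impurity measures quoted in the snippet (a sketch under the standard definitions; the maximum of 0.5 quoted for Gini corresponds to the unscaled form 1 - sum_j P^2):

```python
import numpy as np

def entropy_impurity(p):
    """i(N) = -sum_j P(w_j) * log2 P(w_j)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gini_impurity(p):
    """i(N) = 1 - sum_j P(w_j)^2  (a 1/2-scaled convention over unordered
    class pairs also appears in the literature)."""
    p = np.asarray(p, dtype=float)
    return float(1.0 - (p ** 2).sum())

# a 50/50 two-class split maximizes both impurities: 1.0 and 0.5 respectively
print(entropy_impurity([0.5, 0.5]), gini_impurity([0.5, 0.5]))
```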

  4. Regularization of Grad’s 13 -Moment-Equations in Kinetic Gas Theory

    DTIC Science & Technology

    2011-01-01

    variant of the moment method has been proposed by Eu (1980) and is used, e.g., in Myong (2001). Recently, a maximum-entropy 10-moment system has been used...small amplitude linear waves, the R13 system is linearly stable in time for all modes and wavelengths. The instability of the Burnett system indicates...Boltzmann equation. Related to the problem of global hyperbolicity is the question of the existence of an entropy law for the R13 system. In the linear

  5. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    NASA Astrophysics Data System (ADS)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of Kolmogorov Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximisation of KSE is analytical and easier to compute in general than mixing time, this link provides a new faster method to approximate the minimum mixing time dynamics. It could be interesting in computer sciences and statistical physics, for computations that use random walks on graphs that can be represented as Markov chains.
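    For a stationary Markov chain the Kolmogorov-Sinai entropy has the closed form h = -Σ_i π_i Σ_j P_ij log P_ij, with π the stationary distribution, which is why its maximization is analytically convenient; the sketch below computes it alongside the spectral gap, a standard proxy for mixing speed. The example chain is illustrative, not taken from the paper.

```python
import numpy as np

def ks_entropy(P):
    """Kolmogorov-Sinai entropy rate of a Markov chain:
    h = -sum_i pi_i * sum_j P_ij log P_ij, with pi the stationary distribution."""
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    return -float(pi @ plogp.sum(axis=1))

def spectral_gap(P):
    """1 - |lambda_2|: a larger gap generally means faster mixing."""
    evals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
    return float(1.0 - evals[1])

P = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
print(ks_entropy(P), spectral_gap(P))
```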

  6. The Adiabatic Piston and the Second Law of Thermodynamics

    NASA Astrophysics Data System (ADS)

    Crosignani, Bruno; Di Porto, Paolo; Conti, Claudio

    2002-11-01

    A detailed analysis of the adiabatic-piston problem reveals peculiar dynamical features that challenge the general belief that isolated systems necessarily reach a static equilibrium state. In particular, the fact that the piston behaves like a perpetuum mobile, i.e., it never stops but keeps wandering, undergoing sizable oscillations, around the position corresponding to maximum entropy, has remarkable implications on the entropy variations of the system and on the validity of the second law when dealing with systems of mesoscopic dimensions.

  7. A Novel Multivoxel-Based Quantitation of Metabolites and Lipids Noninvasively Combined with Diffusion-Weighted Imaging in Breast Cancer

    DTIC Science & Technology

    2013-10-01

    cancer for improving the overall specificity. Our recent work has focused on testing retrospective Maximum Entropy and Compressed Sensing of the 4D...terparts and increases the entropy or sparsity of the reconstructed spectrum by narrowing the peak linewidths and de-noising smaller features. This, in...tightened' beyond the standard deviation of the noise in an effort to reduce the RMSE and reconstruction non-linearity, but this prevents the

  8. Dissipated energy and entropy production for an unconventional heat engine: the stepwise `circular cycle'

    NASA Astrophysics Data System (ADS)

    di Liberto, Francesco; Pastore, Raffaele; Peruggi, Fulvio

    2011-05-01

    When some entropy is transferred, by means of a reversible engine, from a hot heat source to a colder one, the maximum efficiency occurs, i.e. the maximum available work is obtained. Similarly, a reversible heat pump transfers entropy from a cold heat source to a hotter one with the minimum expense of energy. In contrast, if we are faced with non-reversible devices, there is some lost work for heat engines, and some extra work for heat pumps. These quantities are both related to entropy production. The lost work is also called 'degraded energy' or 'energy unavailable to do work'. The extra work is the excess of work performed on the system in the irreversible process with respect to the reversible one (or the excess of heat given to the hotter source in the irreversible process). Both quantities are analysed in detail and are evaluated for a complex process, i.e. the stepwise circular cycle, which is similar to the stepwise Carnot cycle. The stepwise circular cycle is a cycle performed by means of N small weights, dw, which are first added and then removed from the piston of the vessel containing the gas, or vice versa. The work performed by the gas can be found as the increase of the potential energy of the dw's. Each single dw is identified and its increase in potential energy evaluated. In this way it is found how the energy output of the cycle is distributed among the dw's. The size of the dw's affects entropy production and therefore the lost and extra work. The distribution of increases depends on the chosen removal process.

  9. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

    Background We introduce approximate entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene expression patterns resulting from a clinical sample set. Since entropy is a measure of disorder in a system, we believe that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set we will be able to distinguish those genes which display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify genes of interest with the highest probability of producing an accurate, predictive classification model from our data set. Results In the development of a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells. This may be measured in terms of the ApSE when compared to normal tissue. We validated 10 of these genes on 10 normal and 20 cancer and CIN3 samples. We report that the predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity from a biological system with such a relatively small sample set, and in extracting biologically relevant genes of interest, holds great promise. PMID:19232110
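    A minimal implementation of approximate entropy in the usual Pincus form is sketched below; the ApSE variant used in the study may differ in detail, and the template length m and tolerance r chosen here are conventional defaults rather than the authors' settings.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series (Pincus-style definition).

    Counts, for template lengths m and m+1, the fraction of template pairs
    within tolerance r (Chebyshev distance), and returns phi(m) - phi(m+1)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distances between all template pairs (self-matches included)
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        C = (dist <= r).mean(axis=1)
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 300))
noisy = rng.standard_normal(300)
print(approximate_entropy(regular), approximate_entropy(noisy))  # low vs high
```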

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bialas, A.; Czyz, W.; Zalewski, K.

    The relation between Renyi entropies and moments of the Wigner function, representing the quantum mechanical description of the M-particle semi-inclusive distribution at freeze-out, is investigated. It is shown that in the limit of infinite volume of the system, the classical and quantum descriptions are equivalent. Finite volume corrections are derived and shown to be small for systems encountered in relativistic heavy ion collisions.

  11. Introduction of Differential Scanning Calorimetry in a General Chemistry Laboratory Course: Determination of Thermal Properties of Organic Hydrocarbons

    ERIC Educational Resources Information Center

    D'Amelia, Ronald; Franks, Thomas; Nirode, William F.

    2007-01-01

    In first-year general chemistry undergraduate courses, thermodynamics and thermal properties such as melting points and changes in enthalpy (ΔH) and entropy (ΔS) of phase changes are frequently discussed. Typically, classical calorimetric methods of analysis are used to determine ΔH of reactions. Differential scanning calorimetry…

  12. GABAergic excitation of spider mechanoreceptors increases information capacity by increasing entropy rather than decreasing jitter.

    PubMed

    Pfeiffer, Keram; French, Andrew S

    2009-09-02

    Neurotransmitter chemicals excite or inhibit a range of sensory afferents and sensory pathways. These changes in firing rate or static sensitivity can also be associated with changes in dynamic sensitivity or membrane noise and thus action potential timing. We measured action potential firing produced by random mechanical stimulation of spider mechanoreceptor neurons during long-duration excitation by the GABAA agonist muscimol. Information capacity was estimated from signal-to-noise ratio by averaging responses to repeated identical stimulation sequences. Information capacity was also estimated from the coherence function between input and output signals. Entropy rate was estimated by a data compression algorithm and maximum entropy rate from the firing rate. Action potential timing variability, or jitter, was measured as normalized interspike interval distance. Muscimol increased firing rate, information capacity, and entropy rate, but jitter was unchanged. We compared these data with the effects of increasing firing rate by current injection. Our results indicate that the major increase in information capacity by neurotransmitter action arose from the increased entropy rate produced by increased firing rate, not from reduction in membrane noise and action potential jitter.
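    As a rough sketch of the two entropy-rate quantities mentioned (a compression-based estimate and the maximum entropy rate implied by the firing rate alone), the code below bins a spike train into binary words, compresses it with a generic compressor, and compares the result with the entropy rate of a memoryless process of the same rate. The compressor (zlib) and the toy spike trains are stand-ins, not the algorithm or data of the study.

```python
import zlib
import numpy as np

def compression_entropy_rate(binary_bins):
    """Rough entropy-rate estimate (bits per bin) from a generic compressor;
    compressor overhead means this is only an upper-bound-style figure."""
    packed = np.packbits(binary_bins).tobytes()
    compressed = zlib.compress(packed, level=9)
    return 8.0 * len(compressed) / len(binary_bins)

def max_entropy_rate(p_spike):
    """Entropy rate (bits per bin) of a memoryless binary process with the
    same mean firing probability -- the maximum possible for that rate."""
    p = np.clip(p_spike, 1e-12, 1 - 1e-12)
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

rng = np.random.default_rng(0)
independent = (rng.random(20000) < 0.1).astype(np.uint8)          # unstructured train
bursty = (independent[:-1] | independent[1:]).astype(np.uint8)    # temporally correlated train
for train in (independent, bursty):
    print(compression_entropy_rate(train), max_entropy_rate(train.mean()))
```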

  13. An improved method for predicting the evolution of the characteristic parameters of an information system

    NASA Astrophysics Data System (ADS)

    Dushkin, A. V.; Kasatkina, T. I.; Novoseltsev, V. I.; Ivanov, S. V.

    2018-03-01

    The article proposes a forecasting method that allows, based on given values of entropy and of the error levels of the first and second kind, the allowable time horizon for forecasting the development of the characteristic parameters of a complex information system to be determined. The main feature of the method under consideration is that changes in the characteristic parameters of the development of the information system are expressed as the magnitude of the increment in the ratios of its entropy. When a predetermined value of the prediction error ratio, that is, of the entropy of the system, is reached, the characteristic parameters of the system and the depth of the prediction in time are estimated. The resulting values of the characteristics will be optimal, since at that moment the system possessed the best ratio of entropy as a measure of the degree of organization and orderliness of the structure of the system. To construct a method for estimating the depth of prediction, it is expedient to use the principle of maximum entropy.

  14. Force-Time Entropy of Isometric Impulse.

    PubMed

    Hsieh, Tsung-Yu; Newell, Karl M

    2016-01-01

    The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979 ) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993 ). Two experiments in an isometric single finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that the peak force variability increased either with the increment of force level or through a shorter time to peak force that also reduced timing error variability. The peak force entropy and entropy of time to peak force increased on the respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework with the joint force-time entropy at a minimum in the middle parameter range of discrete impulse.

  15. Optimality and inference in hydrology from entropy production considerations: synthetic hillslope numerical experiments

    NASA Astrophysics Data System (ADS)

    Kollet, S. J.

    2015-05-01

    In this study, entropy production optimization and inference principles are applied to a synthetic semi-arid hillslope in high-resolution, physics-based simulations. The results suggest that entropy or power is indeed maximized, because of the strong nonlinearity of variably saturated flow and competing processes related to soil moisture fluxes, the depletion of gradients, and the movement of a free water table. Thus, it appears that the maximum entropy production (MEP) principle may indeed be applicable to hydrologic systems. In the application to hydrologic systems, the free water table constitutes an important degree of freedom in the optimization of entropy production and may also relate the theory to actual observations. In an ensuing analysis, an attempt is made to transfer the complex, "microscopic" hillslope model into a macroscopic model of reduced complexity, using the MEP principle as an inference tool to obtain effective conductance coefficients and forces/gradients. The results demonstrate a new approach for the application of MEP to hydrologic systems and may form the basis for fruitful discussions and research in the future.

  16. Beamforming using subspace estimation from a diagonally averaged sample covariance.

    PubMed

    Quijano, Jorge E; Zurk, Lisa M

    2017-08-01

    The potential benefit of a large-aperture sonar array for high resolution target localization is often challenged by the lack of sufficient data required for adaptive beamforming. This paper introduces a Toeplitz-constrained estimator of the clairvoyant signal covariance matrix corresponding to multiple far-field targets embedded in background isotropic noise. The estimator is obtained by averaging along subdiagonals of the sample covariance matrix, followed by covariance extrapolation using the method of maximum entropy. The sample covariance is computed from limited data snapshots, a situation commonly encountered with large-aperture arrays in environments characterized by short periods of local stationarity. Eigenvectors computed from the Toeplitz-constrained covariance are used to construct signal-subspace projector matrices, which are shown to reduce background noise and improve detection of closely spaced targets when applied to subspace beamforming. Monte Carlo simulations corresponding to increasing array aperture suggest convergence of the proposed projector to the clairvoyant signal projector, thereby outperforming the classic projector obtained from the sample eigenvectors. Beamforming performance of the proposed method is analyzed using simulated data, as well as experimental data from the Shallow Water Array Performance experiment.
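    The diagonal-averaging step and the signal-subspace projector can be sketched compactly; the code below averages the subdiagonals of a few-snapshot sample covariance to impose Toeplitz structure and builds a projector from the leading eigenvectors. The maximum entropy covariance extrapolation described in the abstract is omitted, and the array geometry and noise levels are illustrative assumptions.

```python
import numpy as np

def toeplitz_average(R):
    """Average each subdiagonal of a Hermitian sample covariance to
    produce a Toeplitz-constrained estimate."""
    n = R.shape[0]
    first_col = np.array([np.mean(np.diag(R, -k)) for k in range(n)])
    T = np.empty_like(R)
    for i in range(n):
        for j in range(n):
            T[i, j] = first_col[i - j] if i >= j else np.conj(first_col[j - i])
    return T

def signal_projector(R, n_sources):
    """Projector onto the span of the leading eigenvectors (signal subspace)."""
    _, vecs = np.linalg.eigh(R)
    Es = vecs[:, -n_sources:]
    return Es @ Es.conj().T

# toy example: two far-field plane waves on a 16-element array, few snapshots
rng = np.random.default_rng(0)
n, snapshots = 16, 8
angles = np.deg2rad([-10.0, 12.0])
A = np.exp(1j * np.pi * np.outer(np.arange(n), np.sin(angles)))   # steering matrix
S = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
X = A @ S + 0.1 * (rng.standard_normal((n, snapshots)) + 1j * rng.standard_normal((n, snapshots)))
R_sample = X @ X.conj().T / snapshots
P = signal_projector(toeplitz_average(R_sample), n_sources=2)
print(np.round(np.linalg.norm(P @ A, axis=0), 2))   # how much of each steering vector is retained
```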

  17. Physics of Inference

    NASA Astrophysics Data System (ADS)

    Toroczkai, Zoltan

    Jaynes's maximum entropy method provides a family of principled models that allow the prediction of a system's properties as constrained by empirical data (observables). However, their use is often hindered by the degeneracy problem characterized by spontaneous symmetry breaking, where predictions fail. Here we show that degeneracy appears when the corresponding density of states function is not log-concave, which is typically the consequence of nonlinear relationships between the constraining observables. We illustrate this phenomenon on several examples, including from complex networks, combinatorics and classical spin systems (e.g., Blume-Emery-Griffiths lattice-spin models). Exploiting these nonlinear relationships we then propose a solution to the degeneracy problem for a large class of systems via transformations that render the density of states function log-concave. The effectiveness of the method is demonstrated on real-world network data. Finally, we discuss the implications of these findings on the relationship between the geometrical properties of the density of states function and phase transitions in spin systems. Supported in part by Grant No. FA9550-12-1-0405 from AFOSR/DARPA and by Grant No. HDTRA 1-09-1-0039 from DTRA.

  18. Thermodynamic precursors, liquid-liquid transitions, dynamic and topological anomalies in densified liquid germania

    NASA Astrophysics Data System (ADS)

    Pacaud, F.; Micoulaut, M.

    2015-08-01

    The thermodynamic, dynamic, structural, and rigidity properties of densified liquid germania (GeO2) have been investigated using classical molecular dynamics simulation. We construct from a thermodynamic framework an analytical equation of state for the liquid allowing the possible detection of thermodynamic precursors (extrema of the derivatives of the free energy), which usually indicate the possibility of a liquid-liquid transition. It is found that for the present germania system, such precursors and the possible underlying liquid-liquid transition are hidden by the slowing down of the dynamics with decreasing temperature. In this respect, germania behaves quite differently when compared to parent tetrahedral systems such as silica or water. We then detect a diffusivity anomaly (a maximum of diffusion with changing density/volume) that is strongly correlated with changes in coordinated species, and the softening of bond-bending (BB) topological constraints that decrease the liquid rigidity and enhance transport. The diffusivity anomaly is finally substantiated from a Rosenfeld-type scaling law linked to the pair correlation entropy, and to structural relaxation.

  19. Numerical solutions of the semiclassical Boltzmann ellipsoidal-statistical kinetic model equation

    PubMed Central

    Yang, Jaw-Yen; Yan, Chin-Yuan; Huang, Juan-Chen; Li, Zhihui

    2014-01-01

    Computations of rarefied gas dynamical flows governed by the semiclassical Boltzmann ellipsoidal-statistical (ES) kinetic model equation using an accurate numerical method are presented. The semiclassical ES model was derived through the maximum entropy principle and not only conserves mass, momentum and energy, but also contains additional higher-order moments that differ from the standard quantum distributions. A different decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. The numerical method in phase space combines the discrete-ordinate method in momentum space and the high-resolution shock-capturing method in physical space. Numerical solutions of two-dimensional Riemann problems for two configurations covering various degrees of rarefaction are presented, and various contours of the quantities unique to this new model are illustrated. When the relaxation time becomes very small, the main flow features display behavior similar to that of ideal quantum gas dynamics, and the present solutions are found to be consistent with existing calculations for a classical gas. The effect of a parameter that permits an adjustable Prandtl number in the flow is also studied. PMID:25104904

  20. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution

    PubMed Central

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-01

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799–1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi–Dirac or Bose–Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas. PMID:24399919

  1. Generalized entropy production fluctuation theorems for quantum systems

    NASA Astrophysics Data System (ADS)

    Rana, Shubhashis; Lahiri, Sourabh; Jayannavar, A. M.

    2013-02-01

    Based on a trajectory-dependent path probability formalism in state space, we derive generalized entropy production fluctuation relations for a quantum system in the presence of measurement and feedback. We obtain these results for three different cases: (i) the system evolving in isolation from its surroundings; (ii) the system weakly coupled to a heat bath; and (iii) the system in contact with a reservoir, treated using the quantum Crooks fluctuation theorem. In case (iii), we build on the treatment carried out in [H. T. Quan and H. Dong, arxiv/cond-mat: 0812.4955], where a quantum trajectory has been defined as a sequence of alternating work and heat steps. The obtained entropy production fluctuation theorems retain the same form as in the classical case. The inequality of the second law of thermodynamics is modified in the presence of information. These fluctuation theorems are robust against intermediate measurements of any observable, performed either with von Neumann projective measurements or with weak or positive operator-valued measurements.

  2. Memory behaviors of entropy production rates in heat conduction

    NASA Astrophysics Data System (ADS)

    Li, Shu-Nan; Cao, Bing-Yang

    2018-02-01

    Based on the relaxation time approximation and a first-order expansion, memory behaviors in heat conduction are found between the macroscopic and Boltzmann-Gibbs-Shannon (BGS) entropy production rates, with exponentially decaying memory kernels. In the frameworks of classical irreversible thermodynamics (CIT) and BGS statistical mechanics, the memory dependence on the integrated history is unidirectional, while for the extended irreversible thermodynamics (EIT) and BGS entropy production rates, the memory dependences are bidirectional and coexist with the linear terms. When the macroscopic and microscopic relaxation times satisfy a specific relationship, the entropic memory dependences are eliminated. There also exist initial effects in the entropic memory behaviors, which decay exponentially. The second-order term, which can be understood as the global non-equilibrium degree, is also discussed. The effects of the second-order term consist of three parts: a memory dependence, an initial value, and a linear term. The corresponding memory kernels are still exponential, and the initial effects of the global non-equilibrium degree also decay exponentially.

  3. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, $S_{BG} = -k\sum_i p_i \ln p_i$, and associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., when ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as $S_q = k\frac{1 - \sum_i p_i^q}{q - 1}$ (with $S_1 = S_{BG}$). Here we comment on some central aspects of this theory, and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
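    A small numerical check (a sketch with k = 1) that the nonadditive entropy reduces to the BG form as q approaches 1:

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """S_q = k * (1 - sum_i p_i^q) / (q - 1); S_1 is the Boltzmann-Gibbs limit."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-k * np.sum(p * np.log(p)))      # BG limit
    return float(k * (1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.array([0.5, 0.3, 0.2])
for q in (0.5, 0.99, 1.0, 1.01, 2.0):
    print(q, round(tsallis_entropy(p, q), 4))          # values approach S_BG near q = 1
```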

  4. A Stationary Wavelet Entropy-Based Clustering Approach Accurately Predicts Gene Expression

    PubMed Central

    Nguyen, Nha; Vo, An; Choi, Inchan

    2015-01-01

    Abstract Studying epigenetic landscapes is important to understand the condition for gene regulation. Clustering is a useful approach to study epigenetic landscapes by grouping genes based on their epigenetic conditions. However, classical clustering approaches that often use a representative value of the signals in a fixed-sized window do not fully use the information written in the epigenetic landscapes. Clustering approaches to maximize the information of the epigenetic signals are necessary for better understanding gene regulatory environments. For effective clustering of multidimensional epigenetic signals, we developed a method called Dewer, which uses the entropy of stationary wavelet of epigenetic signals inside enriched regions for gene clustering. Interestingly, the gene expression levels were highly correlated with the entropy levels of epigenetic signals. Dewer separates genes better than a window-based approach in the assessment using gene expression and achieved a correlation coefficient above 0.9 without using any training procedure. Our results show that the changes of the epigenetic signals are useful to study gene regulation. PMID:25383910

  5. Entropy of the information retrieved from black holes

    NASA Astrophysics Data System (ADS)

    Mersini-Houghton, Laura

    2016-07-01

    The retrieval of black hole information was recently presented in two interesting proposals at the ‘Hawking Radiation’ conference: a revised version by 't Hooft of a proposal he initially suggested 20 years ago, and a new proposal by Hawking. Both proposals address the problem of black hole information loss at the classical level and derive an expression for the scattering matrix. The former uses the gravitational back-reaction of incoming particles, which imprints their information on the outgoing modes. The latter uses the supertranslation symmetry of horizons to relate a phase delay of the outgoing wave packets compared to their incoming wave partners. The difficulty in both proposals is that the entropy obtained from them appears to be infinite. By including quantum effects in Hawking's and 't Hooft's proposals, I show that a subtlety arising from the inescapable measurement process, the quantum Zeno effect, not only tames divergences but actually recovers the correct one-quarter-of-the-area Bekenstein-Hawking entropy law of black holes.

  6. Hamiltonian and Thermodynamic Modeling of Quantum Turbulence

    NASA Astrophysics Data System (ADS)

    Grmela, Miroslav

    2010-10-01

    The state variables in the novel model introduced in this paper are the fields playing this role in the classical Landau-Tisza model, together with additional fields of mass, entropy (or temperature), superfluid velocity, and the gradient of the superfluid velocity, all depending on the position vector and on another three-dimensional vector labeling the scale, describing the small-scale structure developed in 4He superfluid experiencing turbulent motion. The fluxes of mass, momentum, energy, and entropy in position space, as well as the fluxes of energy and entropy in scale, appear in the time evolution equations as explicit functions of the state variables and of their conjugates. The fundamental thermodynamic relation relating the fields to their conjugates is left undetermined in this paper. The GENERIC structure of the equations serves two purposes: (i) it guarantees that solutions to the governing equations, independently of the choice of the fundamental thermodynamic relation, agree with the observed compatibility with thermodynamics, and (ii) it is used as a guide in the construction of the novel model.

  7. Classical and quantum Reissner-Nordström black hole thermodynamics and first order phase transition

    NASA Astrophysics Data System (ADS)

    Ghaffarnejad, Hossein

    2016-01-01

    First we consider the classical Reissner-Nordström black hole (CRNBH) metric, obtained by solving the Einstein-Maxwell field equations for a point electric charge e inside a spherical static body with mass M. It has two horizons, interior and exterior. Using the Bekenstein-Hawking entropy theorem, we calculate the interior and exterior entropy, temperature, Gibbs free energy, and heat capacity at constant electric charge. We calculate the first derivative of the Gibbs free energy with respect to temperature, which is a singular function with a singularity at the critical point $M_c = 2|e|/\sqrt{3}$, with corresponding temperature $T_c = 1/(24\pi\sqrt{3|e|})$. Hence we claim that a first-order phase transition occurs there. The temperature, like the Gibbs free energy, takes strictly positive (negative) values on the exterior (interior) horizon. The Gibbs free energy takes two different positive values simultaneously for $0 < T < T_c$ but not for negative values, which means the system is made of two subsystems. For negative temperatures the entropy approaches zero as $T \to -\infty$, corresponding to a single Bose-Einstein-condensed state. The entropy increases monotonically in the range $0 < T < T_c$. Following the results of the work presented in Wang and Huang (Phys. Rev. D 63:124014, 2001), we recalculate the mentioned thermodynamical variables for the stable remnant final state of an evaporating quantum Reissner-Nordström black hole (QRNBH) and obtain results analogous to those for the CRNBH. Finally, we solve the mass loss equation of the QRNBH with respect to the advanced Eddington-Finkelstein time coordinate and derive the luminosity function. We find that QRNBH evaporation switches off before the mass vanishes completely. The evaporation ends in a cold, lukewarm-type RN black hole whose final remnant mass is $m_{\mathrm{final}} = |e|$ in geometrical units. Its temperature and luminosity vanish, unlike in the Schwarzschild case of evaporation. Our calculations permit some statements about the information loss paradox (ILP).

  8. Forest Tree Species Distribution Mapping Using Landsat Satellite Imagery and Topographic Variables with the Maximum Entropy Method in Mongolia

    NASA Astrophysics Data System (ADS)

    Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn

    2016-06-01

    Forest is a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming, because it needs intensive physical labor and the costs are high, especially when surveying remote mountainous regions. A reliable forest inventory can give us more accurate and timely information with which to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at large scales. To produce an informative forest inventory, forest attributes, including tree species, unavoidably need to be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest comprising almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. A forest tree species inventory map was obtained from the Forest Division of the Mongolian Ministry of Nature and Environment as training data, and was also used as ground truth to assess the accuracy of the tree species classification. The Landsat images and DEM were processed for maximum entropy modeling, and the model was applied in two experiments. The first used only Landsat surface reflectance for tree species classification, while the second incorporated terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess the classification accuracy. The results show that the second experiment, which used Landsat surface reflectance coupled with terrain variables, produced a better result, with higher overall accuracy and kappa coefficient than the first experiment. The results indicate that the Maximum Entropy method is applicable here, and that classifying tree species using satellite imagery coupled with terrain information can improve the classification of tree species in the study area.
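    The study uses a MaxEnt species-mapping workflow on real imagery; as a hedged stand-in, the sketch below runs a multinomial logistic regression, which is mathematically a conditional maximum entropy classifier, on synthetic spectral and terrain features to mirror the two experiments (reflectance only versus reflectance plus terrain). All arrays are random placeholders, so the accuracies printed are near chance; only the workflow is illustrated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# synthetic stand-ins: 6 Landsat reflectance bands plus elevation and slope,
# flattened to (n_pixels, n_features); labels are forest tree-species classes
rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 5000, 6, 4
reflectance = rng.random((n_pixels, n_bands))
terrain = rng.random((n_pixels, 2))                     # elevation, slope (scaled)
labels = rng.integers(0, n_classes, size=n_pixels)      # random, so scores stay near chance

def train_maxent(features, labels):
    """Multinomial logistic regression == conditional maximum entropy classifier."""
    model = LogisticRegression(max_iter=1000)
    return model.fit(features, labels)

spectral_only = train_maxent(reflectance, labels)
with_terrain = train_maxent(np.hstack([reflectance, terrain]), labels)
print(spectral_only.score(reflectance, labels),
      with_terrain.score(np.hstack([reflectance, terrain]), labels))
```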

  9. Predictive uncertainty in auditory sequence processing

    PubMed Central

    Hansen, Niels Chr.; Pearce, Marcus T.

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
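    A minimal sketch of predictive uncertainty as the Shannon entropy of the next-symbol distribution, using a first-order Markov (bigram) model with add-one smoothing; the paper's variable-order model is richer than this, and the toy melody and alphabet are invented for illustration.

```python
import numpy as np
from collections import Counter, defaultdict

def bigram_model(sequence, alphabet, alpha=1.0):
    """First-order Markov model with add-alpha smoothing:
    returns P(next | current) as a dict of probability vectors."""
    counts = defaultdict(Counter)
    for a, b in zip(sequence[:-1], sequence[1:]):
        counts[a][b] += 1
    model = {}
    for a in alphabet:
        c = np.array([counts[a][b] + alpha for b in alphabet], dtype=float)
        model[a] = c / c.sum()
    return model

def predictive_entropy(model, context):
    """Shannon entropy (bits) of the next-symbol distribution given the context."""
    p = model[context]
    return float(-(p * np.log2(p)).sum())

alphabet = list("CDEFGAB")                      # toy pitch classes
melody = list("CDECDECDEFGFEDC")                # highly repetitive -> low uncertainty
model = bigram_model(melody, alphabet)
for note in "CEG":
    print(note, round(predictive_entropy(model, note), 3))
```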

  10. On relativistic generalization of Perelman's W-entropy and thermodynamic description of gravitational fields and cosmology

    NASA Astrophysics Data System (ADS)

    Ruchin, Vyacheslav; Vacaru, Olivia; Vacaru, Sergiu I.

    2017-03-01

    Using double 2+2 and 3+1 nonholonomic fibrations on Lorentz manifolds, we extend the concept of W-entropy for gravitational fields in general relativity (GR). Such F- and W-functionals were introduced in the Ricci flow theory of three dimensional (3-d) Riemannian metrics by Perelman (the entropy formula for the Ricci flow and its geometric applications. arXiv:math.DG/0211159). Non-relativistic 3-d Ricci flows are characterized by associated statistical thermodynamical values determined by W-entropy. Generalizations for geometric flows of 4-d pseudo-Riemannian metrics are considered for models with local thermodynamical equilibrium and separation of dissipative and non-dissipative processes in relativistic hydrodynamics. The approach is elaborated in the framework of classical field theories (relativistic continuum and hydrodynamic models) without an underlying kinetic description, which will be elaborated in other work. The 3+1 splitting allows us to provide a general relativistic definition of gravitational entropy in the Lyapunov-Perelman sense. It increases monotonically as structure forms in the Universe. We can formulate a thermodynamic description of exact solutions in GR depending, in general, on all spacetime coordinates. A corresponding 2+2 splitting with nonholonomic deformation of linear connection and frame structures is necessary for generating in very general form various classes of exact solutions of the Einstein and general relativistic geometric flow equations. Finally, we speculate on physical macrostates and microstate interpretations of the W-entropy in GR, geometric flow theories and possible connections to string theory (a second unsolved problem also contained in Perelman's work) in Polyakov's approach.

  11. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns

    PubMed Central

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist. PMID:27571423
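    The match-length (Lempel-Ziv/Kontoyiannis-style) estimator commonly used for mobility entropy rate, S_hat = n log2(n) / sum_i Lambda_i, with Lambda_i the length of the shortest substring starting at position i that does not appear earlier in the sequence, can be sketched directly; the trajectories below are invented, and the scaling analysis of the paper is not reproduced.

```python
import numpy as np

def lz_entropy_rate(symbols):
    """Entropy-rate estimate (bits/symbol) from shortest-unseen-substring lengths:
    S_hat = n * log2(n) / sum_i Lambda_i  (Kontoyiannis-style estimator)."""
    s = list(symbols)
    n = len(s)
    lambdas = []
    for i in range(n):
        k = 1
        # shortest substring starting at i that has not appeared in s[:i]
        while i + k <= n and _contains(s[:i], s[i:i + k]):
            k += 1
        lambdas.append(k)
    return n * np.log2(n) / sum(lambdas)

def _contains(history, pattern):
    m, L = len(pattern), len(history)
    return any(history[j:j + m] == pattern for j in range(L - m + 1))

rng = np.random.default_rng(0)
routine = ["home", "work", "home", "work", "home", "gym"] * 40     # repetitive trajectory
random_walk = [str(x) for x in rng.integers(0, 6, size=len(routine))]
print(lz_entropy_rate(routine), lz_entropy_rate(random_walk))      # low vs high
```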

  12. One-Particle Representation of Heat Conduction Described within the Scope of the Second Law.

    PubMed

    Jesudason, Christopher Gunaseelan

    2016-01-01

    The Carnot cycle and its deduction of maximum conversion efficiency of heat inputted and outputted isothermally at different temperatures necessitated the construction of isothermal and adiabatic pathways within the cycle that were mechanically "reversible", leading eventually to the Kelvin-Clausius development of the entropy function S with differential dS = dq/T such that ∮C dS = 0, where the heat absorption occurs at the isothermal paths of the elementary Carnot cycle. Another required condition is that the heat transfer processes take place infinitely slowly and "reversibly", implying that rates of transfer are not explicitly featured in the theory. The definition of 'heat' as that form of energy that is transferred as a result of a temperature difference suggests that the local mode of transfer of "heat" in the isothermal segments of the pathway implies a Fourier-like heat conduction mechanism which is apparently irreversible, leading to an increase in entropy of the combined reservoirs at either end of the conducting material, and which is deemed reversible mechanically. These paradoxes are circumvented here by first clarifying the terms used before modeling heat transfer as a thermodynamically reversible but mechanically irreversible process and applied to a one dimensional atomic lattice chain of interacting particles subjected to a temperature difference exemplifying Fourier heat conduction. The basis of a "recoverable trajectory" i.e. that which follows a zero entropy trajectory is identified. The Second Law is strictly maintained in this development. A corollary to this zero entropy trajectory is the generalization of the Zeroth law for steady state non-equilibrium systems with varying temperature, and thus to a statement about "equilibrium" in steady state non-thermostatic conditions. An energy transfer rate term is explicitly identified for each particle and agrees quantitatively (and independently) with the rate of heat absorbed at the reservoirs held at different temperatures and located at the two ends of the lattice chain in MD simulations, where all energy terms in the simulation refer to a single particle interacting with its neighbors. These results validate the theoretical model and provide the necessary boundary conditions (for instance with regard to temperature differentials and force fields) that thermodynamical variables must comply with to satisfy the conditions for a recoverable trajectory, and thus determine the solution of the differential and integral equations that are used to model these processes. These developments and results, if fully pursued, would imply that not only can the Carnot cycle be viewed as describing a local process of energy-work conversion by a single interacting particle which features rates of energy transfer and conversion not possible in the classical Carnot development, but that even irreversible local processes might be brought within the scope of this cycle, implying a unified treatment of thermodynamically (i) irreversible (ii) reversible (iii) isothermal and (iv) adiabatic processes by conflating the classically distinct concepts of work and heat energy into a single particle interactional process. A resolution to the fundamental and long-standing conjecture of Benofy and Quay concerning the Fourier principle is one consequence of the analysis.

  13. One-Particle Representation of Heat Conduction Described within the Scope of the Second Law

    PubMed Central

    Jesudason, Christopher Gunaseelan

    2016-01-01

    The Carnot cycle and its deduction of maximum conversion efficiency of heat inputted and outputted isothermally at different temperatures necessitated the construction of isothermal and adiabatic pathways within the cycle that were mechanically “reversible”, leading eventually to the Kelvin-Clausius development of the entropy function S with differential dS=dq/T such that ∮CdS=0 where the heat absorption occurs at the isothermal paths of the elementary Carnot cycle. Another required condition is that the heat transfer processes take place infinitely slowly and “reversibly”, implying that rates of transfer are not explicitly featured in the theory. The definition of ‘heat’ as that form of energy that is transferred as a result of a temperature difference suggests that the local mode of transfer of “heat” in the isothermal segments of the pathway implies a Fourier-like heat conduction mechanism which is apparently irreversible, leading to an increase in entropy of the combined reservoirs at either end of the conducting material, and which is deemed reversible mechanically. These paradoxes are circumvented here by first clarifying the terms used before modeling heat transfer as a thermodynamically reversible but mechanically irreversible process, applied to a one-dimensional atomic lattice chain of interacting particles subjected to a temperature difference exemplifying Fourier heat conduction. The basis of a “recoverable trajectory”, i.e. one that follows a zero-entropy trajectory, is identified. The Second Law is strictly maintained in this development. A corollary to this zero-entropy trajectory is the generalization of the Zeroth law for steady state non-equilibrium systems with varying temperature, and thus to a statement about “equilibrium” in steady state non-thermostatic conditions. An energy transfer rate term is explicitly identified for each particle and agrees quantitatively (and independently) with the rate of heat absorbed at the reservoirs held at different temperatures and located at the two ends of the lattice chain in MD simulations, where all energy terms in the simulation refer to a single particle interacting with its neighbors. These results validate the theoretical model and provide the necessary boundary conditions (for instance with regard to temperature differentials and force fields) that thermodynamical variables must comply with to satisfy the conditions for a recoverable trajectory, and thus determine the solution of the differential and integral equations that are used to model these processes. These developments and results, if fully pursued, would imply that not only can the Carnot cycle be viewed as describing a local process of energy-work conversion by a single interacting particle which features rates of energy transfer and conversion not possible in the classical Carnot development, but that even irreversible local processes might be brought within the scope of this cycle, implying a unified treatment of thermodynamically (i) irreversible, (ii) reversible, (iii) isothermal and (iv) adiabatic processes by conflating the classically distinct concepts of work and heat energy into a single particle interactional process. A resolution to the fundamental and long-standing conjecture of Benofy and Quay concerning the Fourier principle is one consequence of the analysis. PMID:26760507

  14. Quantifying the entropic cost of cellular growth control

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele; Capuani, Fabrizio; De Martino, Andrea

    2017-07-01

    Viewing the ways a living cell can organize its metabolism as the phase space of a physical system, regulation can be seen as the ability to reduce the entropy of that space by selecting specific cellular configurations that are, in some sense, optimal. Here we quantify the amount of regulation required to control a cell's growth rate by a maximum-entropy approach to the space of underlying metabolic phenotypes, where a configuration corresponds to a metabolic flux pattern as described by genome-scale models. We link the mean growth rate achieved by a population of cells to the minimal amount of metabolic regulation needed to achieve it through a phase diagram that highlights how growth suppression can be as costly (in regulatory terms) as growth enhancement. Moreover, we provide an interpretation of the inverse temperature β controlling maximum-entropy distributions based on the underlying growth dynamics. Specifically, we show that the asymptotic value of β for a cell population can be expected to depend on (i) the carrying capacity of the environment, (ii) the initial size of the colony, and (iii) the probability distribution from which the inoculum was sampled. Results obtained for E. coli and human cells are found to be remarkably consistent with empirical evidence.
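
    The role of the inverse temperature β can be made concrete with a small sketch. Assuming the space of metabolic phenotypes has already been sampled (replaced here by a toy uniform sample of growth rates rather than a genome-scale model), the maximum-entropy weights are p_i ∝ exp(β·λ_i); β is tuned to reach a target mean growth rate, and the regulatory cost is the entropy reduction relative to the unregulated ensemble. All variable names and numbers below are illustrative.

        import numpy as np
        from scipy.optimize import brentq

        rng = np.random.default_rng(0)
        growth = rng.uniform(0.0, 1.0, size=5000)   # stand-in for growth rates of sampled flux patterns

        def mean_growth(beta, lam):
            """Mean growth rate under max-ent weights p_i proportional to exp(beta * lam_i)."""
            w = np.exp(beta * (lam - lam.max()))     # shift the exponent for numerical stability
            return np.sum(w * lam) / np.sum(w)

        def beta_for_target(target, lam, lo=-200.0, hi=200.0):
            """Solve <lambda>_beta = target for the inverse temperature beta."""
            return brentq(lambda b: mean_growth(b, lam) - target, lo, hi)

        beta = beta_for_target(0.9, growth)
        p = np.exp(beta * (growth - growth.max()))
        p /= p.sum()
        # Regulatory cost: entropy reduction (bits) relative to the flat, unregulated ensemble
        cost = np.log2(len(growth)) + np.sum(p * np.log2(p))
        print(beta, cost)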

  15. A comparative study on the mechanical energy of the normal, ACL, osteoarthritis, and Parkinson subjects.

    PubMed

    Bahreinizad, Hossein; Salimi Bani, Milad; Hasani, Mojtaba; Karimi, Mohammad Taghi; Sharifmoradi, Keyvan; Karimi, Alireza

    2017-08-09

    The influence of various musculoskeletal disorders has been evaluated using different kinetic and kinematic parameters. However, the efficiency of walking can also be evaluated by measuring the effort of the subject, in other words the energy required to walk. The aim of this study was to identify mechanical energy differences between normal and pathological groups. Four groups of 15 healthy subjects, 13 Parkinson subjects, 4 osteoarthritis (OA) subjects, and 4 ACL-reconstructed subjects participated in this study. The motions of the foot, shank and thigh were recorded using a three-dimensional motion analysis system. The kinetic, potential and total mechanical energy of each segment was calculated using 3D marker positions and anthropometric measurements. The maximum value and sample entropy of the energies were compared between the normal and abnormal subjects. The maximum value of the potential energy of the OA subjects was lower than that of the normal subjects. Furthermore, the sample entropy of mechanical energy for the Parkinson subjects was low in comparison to the normal subjects, while the sample entropy of mechanical energy for the ACL subjects was higher than that of the normal subjects. The findings of this study suggest that subjects with different abilities show different mechanical energy during walking.
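
    Sample entropy, the irregularity measure used in this comparison, has a standard definition; a compact generic implementation (with the common m = 2 and r = 0.2·SD defaults, which may differ from the parameters chosen in the study) is sketched below.

        import numpy as np

        def sample_entropy(x, m=2, r=None):
            """Sample entropy SampEn(m, r) of a 1-D series: -ln(A/B), where B counts pairs
            of length-m templates within tolerance r (Chebyshev distance) and A does the
            same for length m+1. Self-matches are excluded."""
            x = np.asarray(x, dtype=float)
            if r is None:
                r = 0.2 * x.std()
            def count_matches(mm):
                templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                count = 0
                for i in range(len(templates) - 1):
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    count += np.sum(d <= r)
                return count
            B = count_matches(m)
            A = count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        rng = np.random.default_rng(1)
        regular = np.sin(np.linspace(0, 20 * np.pi, 500))      # highly regular "energy" trace
        noisy = regular + 0.5 * rng.standard_normal(500)       # more irregular trace
        print(sample_entropy(regular), sample_entropy(noisy))  # the noisy trace gives the larger value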

  16. Multi-GPU maximum entropy image synthesis for radio astronomy

    NASA Astrophysics Data System (ADS)

    Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.

    2018-01-01

    The maximum entropy method (MEM) is a well-known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it has better resolution and better image quality under certain conditions. This work presents a high performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that a speedup from 1000 to 5000 times relative to a sequential version can be achieved, depending on data and image size. This allows the HD142527 CO(6-5) short baseline data set to be reconstructed in 2.1 min, instead of the 2.5 days taken by a sequential version on CPU.
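
    The optimization problem that MEM solves can be illustrated on a toy one-dimensional deconvolution. The sketch below shows only the generic entropy-regularized objective, chi^2/2 - lam*S(I) with S(I) = -sum_i I_i*ln(I_i/M); it reflects neither the interferometric measurement equation nor the GPU implementation described in the paper, and the regularization weight and all names are arbitrary choices of ours.

        import numpy as np
        from scipy.linalg import toeplitz
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)
        n = 64
        x = np.arange(n)
        I_true = np.exp(-0.5 * ((x - 20) / 2.0) ** 2) + 0.6 * np.exp(-0.5 * ((x - 45) / 3.0) ** 2)
        A = toeplitz(np.exp(-0.5 * (x / 3.0) ** 2))       # Gaussian blurring matrix (toy "beam")
        A /= A.sum(axis=1, keepdims=True)
        sigma = 0.01
        d = A @ I_true + sigma * rng.standard_normal(n)    # blurred, noisy data

        M, lam = I_true.mean(), 5.0                        # default image level, regularization weight

        def objective(I):
            chi2 = np.sum(((A @ I - d) / sigma) ** 2)
            entropy = -np.sum(I * np.log(I / M))
            return 0.5 * chi2 - lam * entropy

        res = minimize(objective, x0=np.full(n, M), bounds=[(1e-9, None)] * n, method="L-BFGS-B")
        print("rms error of the entropy-regularized estimate:", np.sqrt(np.mean((res.x - I_true) ** 2)))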

  17. Non-life insurance pricing: multi-agent model

    NASA Astrophysics Data System (ADS)

    Darooneh, A. H.

    2004-11-01

    We use the maximum entropy principle for the pricing of non-life insurance and recover the Bühlmann results for the economic premium principle. The concept of economic equilibrium is revised in this respect.

  18. Maximum entropy principle for stationary states underpinned by stochastic thermodynamics.

    PubMed

    Ford, Ian J

    2015-11-01

    The selection of an equilibrium state by maximizing the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximizing the change, averaged over all realizations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realizations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.

  19. Origin of generalized entropies and generalized statistical mechanics for superstatistical multifractal systems

    NASA Astrophysics Data System (ADS)

    Gadjiev, Bahruz; Progulova, Tatiana

    2015-01-01

    We consider a multifractal structure as a mixture of fractal substructures and introduce a distribution function f(α), where α is a fractal dimension. Then we can introduce g(p) ~ ∫ μ e^(-yμ) f(y) dy, with μ = -ln p, and show that distribution functions f(α) of the form f(α) = δ(α-1), f(α) = δ(α-θ), f(α) = 1/(α-1), and f(y) = y^(α-1) lead to the Boltzmann-Gibbs, Shafee, Tsallis and Anteneodo-Plastino entropies, respectively. Here δ(x) is the Dirac delta function. Therefore the Shafee entropy corresponds to a fractal structure, the Tsallis entropy describes a multifractal structure with a homogeneous distribution of fractal substructures, and the Anteneodo-Plastino entropy appears in the case of a power-law distribution f(y). We consider the Fokker-Planck equation for a fractal substructure and determine its stationary solution. To determine the distribution function of a multifractal structure we solve the two-dimensional Fokker-Planck equation and obtain its stationary solution. Then, applying the Bayes theorem, we obtain a distribution function for the entire system in the form of a q-exponential function. We compare the results of the distribution functions obtained via the superstatistical approach with the ones obtained according to the maximum entropy principle.

  20. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  1. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowlie, Andrew, E-mail: andrew.j.fowlie@googlemail.com

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.

  2. A measure of uncertainty regarding the interval constraint of normal mean elicited by two stages of a prior hierarchy.

    PubMed

    Kim, Hea-Jung

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty, regarding the interval constraint, accounted for by using the HSGM is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  3. Multi-Group Maximum Entropy Model for Translational Non-Equilibrium

    NASA Technical Reports Server (NTRS)

    Jayaraman, Vegnesh; Liu, Yen; Panesi, Marco

    2017-01-01

    The aim of the current work is to describe a new model for flows in translational non-equilibrium. Starting from the statistical description of a gas proposed by Boltzmann, the model relies on a domain decomposition technique in velocity space. Using the maximum entropy principle, the logarithm of the distribution function in each velocity sub-domain (group) is expressed with a power series in molecular velocity. New governing equations are obtained using the method of weighted residuals by taking the velocity moments of the Boltzmann equation. The model is applied to a spatially homogeneous Boltzmann equation with a Bhatnagar-Gross-Krook (BGK) model collision operator and the relaxation of an initial non-equilibrium distribution to a Maxwellian is studied using the model. In addition, numerical results obtained using the model for a 1D shock tube problem are also reported.

  4. Maximum entropy perception-action space: a Bayesian model of eye movement selection

    NASA Astrophysics Data System (ADS)

    Colas, Francis; Bessière, Pierre; Girard, Benoît

    2011-03-01

    In this article, we investigate the issue of the selection of eye movements in a free-eye Multiple Object Tracking task. We propose a Bayesian model of retinotopic maps with a complex logarithmic mapping. This model is structured in two parts: a representation of the visual scene, and a decision model based on the representation. We compare different decision models based on different features of the representation and we show that taking into account uncertainty helps predict the eye movements of subjects recorded in a psychophysics experiment. Finally, based on experimental data, we postulate that the complex logarithmic mapping has a functional relevance, as the density of objects in this space is more uniform than expected. This may indicate that the representation space and control strategies are such that the object density is of maximum entropy.

  5. Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Pei-Jui, Wu; Hwa-Lung, Yu

    2016-04-01

    Heavy rainfall from typhoons is the main cause of natural disasters in Taiwan, leading to significant losses of human lives and property. On average, 3.5 typhoons strike Taiwan every year, and Typhoon Morakot in 2009 was among the most severe in recorded history. Because the duration, path and intensity of a typhoon also affect the temporal and spatial rainfall pattern in a specific region, identifying the characteristics of typhoon rainfall types is advantageous when estimating rainfall amounts. This study developed a rainfall prediction model in three parts. First, the EEOF (extended empirical orthogonal function) method is used to classify typhoon events, decomposing the standardized rainfall pattern of all stations for each typhoon event into EOFs and PCs (principal components), so that typhoon events which vary similarly in time and space can be grouped into similar typhoon types. Next, according to this classification, we construct the PDF (probability density function) at different locations and times by means of the multivariate maximum entropy method using the first to fourth statistical moments, which gives the probability at each station for each time. Finally, we use the BME (Bayesian Maximum Entropy) method to construct the typhoon rainfall prediction model and to estimate the rainfall for the case of the GaoPing River, located in southern Taiwan. This study could be useful for future typhoon rainfall predictions and for government typhoon disaster prevention.

  6. How the Second Law of Thermodynamics Has Informed Ecosystem Ecology through Its History

    NASA Astrophysics Data System (ADS)

    Chapman, E. J.; Childers, D. L.; Vallino, J. J.

    2014-12-01

    Throughout the history of ecosystem ecology many attempts have been made to develop a general principle governing how systems develop and organize. We reviewed the historical developments that led to conceptualization of several goal-oriented principles in ecosystem ecology and the relationships among them. We focused our review on two prominent principles—the Maximum Power Principle (MPP) and the Maximum Entropy Production Principle (MEPP)—and the literature that applies to both. While these principles have considerable conceptual overlap and both use concepts in physics (power and entropy), we found considerable differences in their historical development, the disciplines that apply these principles, and their adoption in the literature. We reviewed the literature using Web of Science keyword searches for the MPP and the MEPP, as well as for papers that cited pioneers in the MPP and the MEPP development. From the 6000 papers that our keyword searches returned, we limited our further meta-analysis to 32 papers by focusing on studies with a foundation in ecosystems research. Despite these seemingly disparate pasts, we concluded that the conceptual approaches of these two principles were more similar than dissimilar and that maximization of power in ecosystems occurs with maximum entropy production. We also found that these two principles have great potential to explain how systems develop, organize, and function, but there are no widely agreed upon theoretical derivations for the MEPP or the MPP, possibly hindering their broader use in ecological research. We end with recommendations for how ecosystems-level studies may better use these principles.

  7. Lorentz violation and perpetual motion

    NASA Astrophysics Data System (ADS)

    Eling, Christopher; Foster, Brendan Z.; Jacobson, Ted; Wall, Aron C.

    2007-05-01

    We show that any Lorentz-violating theory with two or more propagation speeds is in conflict with the generalized second law of black hole thermodynamics. We do this by identifying a classical energy-extraction method, analogous to the Penrose process, which would decrease the black hole entropy. Although the usual definitions of black hole entropy are ambiguous in this context, we require only very mild assumptions about its dependence on the mass. This extends the result found by Dubovsky and Sibiryakov, which uses the Hawking effect and applies only if the fields with different propagation speeds interact just through gravity. We also point out instabilities that could interfere with their black hole perpetuum mobile, but argue that these can be neglected if the black hole mass is sufficiently large.

  8. Frenetic Bounds on the Entropy Production

    NASA Astrophysics Data System (ADS)

    Maes, Christian

    2017-10-01

    We give a systematic derivation of positive lower bounds for the expected entropy production (EP) rate in classical statistical mechanical systems obeying a dynamical large deviation principle. The logic is the same for the return to thermodynamic equilibrium as it is for steady nonequilibria working under the condition of local detailed balance. We recover there recently studied "uncertainty" relations for the EP, appearing in studies about the effectiveness of mesoscopic machines. In general our refinement of the positivity of the expected EP rate is obtained in terms of a positive and even function of the expected current(s) which measures the dynamical activity in the system, a time-symmetric estimate of the changes in the system's configuration. Also underdamped diffusions can be included in the analysis.

  9. [Evaluation of a simplified index (spectral entropy) about sleep state of electrocardiogram recorded by a simplified polygraph, MemCalc-Makin2].

    PubMed

    Ohisa, Noriko; Ogawa, Hiromasa; Murayama, Nobuki; Yoshida, Katsumi

    2010-02-01

    Polysomnography (PSG) is the gold standard for the diagnosis of sleep apnea hypopnea syndrome (SAHS), but analyzing PSG takes time and PSG cannot be performed repeatedly because of the effort and cost involved. Therefore, simplified sleep respiratory disorder indices that reflect the PSG results are needed. The Memcalc method, which is a combination of the maximum entropy method for spectral analysis and the non-linear least squares method for fitting analysis (Makin2, Suwa Trust, Tokyo, Japan), has recently been developed. Spectral entropy derived by the Memcalc method might be useful for expressing the trend of time-series behavior. Spectral entropy of the ECG calculated with the Memcalc method was evaluated by comparison with the PSG results. The ECG of obstructive SAHS patients (n = 79) and control volunteers (n = 7) was recorded using MemCalc-Makin2 (GMS) together with PSG recording using Alice IV (Respironics) from 20:00 to 6:00. Spectral entropy of the ECG, which was calculated every 2 seconds using the Memcalc method, was compared to sleep stages which were analyzed manually from PSG recordings. Spectral entropy values (-0.473 vs. -0.418, p < 0.05) were significantly increased in the OSAHS group compared to the controls. For an entropy cutoff level of -0.423, the sensitivity and specificity for OSAHS were 86.1% and 71.4%, respectively, resulting in a receiver operating characteristic curve with an area under the curve of 0.837. The absolute value of entropy had an inverse correlation with stage 3 sleep. Spectral entropy calculated with the Memcalc method might be a possible index for evaluating the quality of sleep.
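
    The MemCalc software is proprietary, so its exact spectral-entropy computation is not reproduced here. As a rough stand-in, the sketch below computes a normalized Shannon entropy of a Welch power spectrum of a resampled RR-interval series instead of a maximum-entropy-method spectrum; it only illustrates the general idea that broadband (irregular) variability yields a higher spectral entropy than narrowband (regular) variability. All names and parameters are illustrative.

        import numpy as np
        from scipy.signal import welch

        def spectral_entropy(x, fs, nperseg=256):
            """Normalized Shannon entropy of the power spectral density of `x`.
            Values near 1 correspond to a flat (white-noise-like) spectrum."""
            f, psd = welch(x, fs=fs, nperseg=nperseg)
            p = psd / psd.sum()              # treat the PSD as a probability distribution
            p = p[p > 0]
            return -np.sum(p * np.log(p)) / np.log(len(psd))

        rng = np.random.default_rng(8)
        fs = 4.0                             # Hz, a typical resampling rate for RR series
        t = np.arange(0, 300, 1 / fs)        # five minutes of data
        rr_regular = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)       # strongly periodic variability
        rr_irregular = 0.8 + 0.05 * rng.standard_normal(len(t))     # broadband variability
        print(spectral_entropy(rr_regular, fs), spectral_entropy(rr_irregular, fs))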

  10. On the derivation of linear irreversible thermodynamics for classical fluids

    PubMed Central

    Theodosopulu, M.; Grecos, A.; Prigogine, I.

    1978-01-01

    We consider the microscopic derivation of the linearized hydrodynamic equations for an arbitrary simple fluid. Our discussion is based on the concept of hydrodynamical modes, and use is made of the ideas and methods of the theory of subdynamics. We also show that this analysis leads to the Gibbs relation for the entropy of the system. PMID:16592516

  11. Experimental demonstration of information to energy conversion in a quantum system at the Landauer limit.

    PubMed

    Peterson, J P S; Sarthour, R S; Souza, A M; Oliveira, I S; Goold, J; Modi, K; Soares-Pinto, D O; Céleri, L C

    2016-04-01

    Landauer's principle sets fundamental thermodynamical constraints for classical and quantum information processing, thus affecting not only various branches of physics, but also computer science and engineering. Despite its importance, this principle was only recently experimentally considered for classical systems. Here we employ a nuclear magnetic resonance set-up to experimentally address the information to energy conversion in a quantum system. Specifically, we consider a molecule of three nuclear spins (qubits), comprising the system, the reservoir and the ancilla, to measure the heat dissipated during the implementation of a global system-reservoir unitary interaction that changes the information content of the system. By employing an interferometric technique, we were able to reconstruct the heat distribution associated with the unitary interaction. Then, through quantum state tomography, we measured the relative change in the entropy of the system. In this way, we were able to verify that an operation that changes the information content of the system must necessarily generate heat in the reservoir, exactly as predicted by Landauer's principle. The scheme presented here allows for the detailed study of irreversible entropy production in quantum information processors.

  12. Magnetocaloric effect in potassium doped lanthanum manganite perovskites prepared by a pyrophoric method

    NASA Astrophysics Data System (ADS)

    Das, Soma; Dey, T. K.

    2006-08-01

    The magnetocaloric effect (MCE) in fine grained perovskite manganites of the type La1-xKxMnO3 (0

  13. Generic isolated horizons in loop quantum gravity

    NASA Astrophysics Data System (ADS)

    Beetle, Christopher; Engle, Jonathan

    2010-12-01

    Isolated horizons model equilibrium states of classical black holes. A detailed quantization, starting from a classical phase space restricted to spherically symmetric horizons, exists in the literature and has since been extended to axisymmetry. This paper extends the quantum theory to horizons of arbitrary shape. Surprisingly, the Hilbert space obtained by quantizing the full phase space of all generic horizons with a fixed area is identical to that originally found in spherical symmetry. The entropy of a large horizon remains one-quarter its area, with the Barbero-Immirzi parameter retaining its value from symmetric analyses. These results suggest a reinterpretation of the intrinsic quantum geometry of the horizon surface.

  14. Discriminative components of data.

    PubMed

    Peltonen, Jaakko; Kaski, Samuel

    2005-01-01

    A simple probabilistic model is introduced to generalize classical linear discriminant analysis (LDA) in finding components that are informative of or relevant for data classes. The components maximize the predictability of the class distribution which is asymptotically equivalent to 1) maximizing mutual information with the classes, and 2) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments, the method outperformed, in addition to more classical methods, a Renyi entropy-based alternative while having essentially equivalent computational cost.

  15. 1/ f noise from the laws of thermodynamics for finite-size fluctuations.

    PubMed

    Chamberlin, Ralph V; Nasir, Derek M

    2014-07-01

    Computer simulations of the Ising model exhibit white noise if thermal fluctuations are governed by Boltzmann's factor alone; whereas we find that the same model exhibits 1/f noise if Boltzmann's factor is extended to include local alignment entropy to all orders. We show that this nonlinear correction maintains maximum entropy during equilibrium fluctuations. Indeed, as with the usual way to resolve Gibbs' paradox that avoids entropy reduction during reversible processes, the correction yields the statistics of indistinguishable particles. The correction also ensures conservation of energy if an instantaneous contribution from local entropy is included. Thus, a common mechanism for 1/f noise comes from assuming that finite-size fluctuations strictly obey the laws of thermodynamics, even in small parts of a large system. Empirical evidence for the model comes from its ability to match the measured temperature dependence of the spectral-density exponents in several metals and to show non-Gaussian fluctuations characteristic of nanoscale systems.

  16. Entropy in an expanding universe.

    PubMed

    Frautschi, S

    1982-08-13

    The question of how the observed evolution of organized structures from initial chaos in the expanding universe can be reconciled with the laws of statistical mechanics is studied, with emphasis on effects of the expansion and gravity. Some major sources of entropy increase are listed. An expanding "causal" region is defined in which the entropy, though increasing, tends to fall further and further behind its maximum possible value, thus allowing for the development of order. The related questions of whether entropy will continue increasing without limit in the future, and whether such increase in the form of Hawking radiation or radiation from positronium might enable life to maintain itself permanently, are considered. Attempts to find a scheme for preserving life based on solid structures fail because events such as quantum tunneling recurrently disorganize matter on a very long but fixed time scale, whereas all energy sources slow down progressively in an expanding universe. However, there remains hope that other modes of life capable of maintaining themselves permanently can be found.

  17. Quantum and Ecosystem Entropies

    NASA Astrophysics Data System (ADS)

    Kirwan, A. D.

    2008-06-01

    Ecosystems and quantum gases share a number of superficial similarities including enormous numbers of interacting elements and the fundamental role of energy in such interactions. A theory for the synthesis of data and prediction of new phenomena is well established in quantum statistical mechanics. The premise of this paper is that the reason a comparable unifying theory has not emerged in ecology is that a proper role for entropy has yet to be assigned. To this end, a phase space entropy model of ecosystems is developed. Specification of an ecosystem phase space cell size based on microbial mass, length, and time scales gives an ecosystem uncertainty parameter only about three orders of magnitude larger than Planck’s constant. Ecosystem equilibria are specified by conservation of biomass and total metabolic energy, along with the principle of maximum entropy at equilibrium. Both Bose-Einstein and Fermi-Dirac equilibrium conditions arise in ecosystem applications. The paper concludes with a discussion of some broader aspects of an ecosystem phase space.

  18. Application of digital image processing techniques to astronomical imagery 1980

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1981-01-01

    Topics include: (1) polar coordinate transformations (M83); (2) multispectral ratios (M82); (3) maximum entropy restoration (M87); (4) automated computation of stellar magnitudes in nebulosity; (5) color and polarization; (6) aliasing.

  19. Nonextensivity in a Dark Maximum Entropy Landscape

    NASA Astrophysics Data System (ADS)

    Leubner, M. P.

    2011-03-01

    Nonextensive statistics along with network science, an emerging branch of graph theory, are increasingly recognized as potential interdisciplinary frameworks whenever systems are subject to long-range interactions and memory. Such settings are characterized by non-local interactions evolving in a non-Euclidean fractal/multi-fractal space-time making their behavior nonextensive. After summarizing the theoretical foundations from first principles, along with a discussion of entropy bifurcation and duality in nonextensive systems, we focus on selected significant astrophysical consequences. Those include the gravitational equilibria of dark matter (DM) and hot gas in clustered structures, the dark energy(DE) negative pressure landscape governed by the highest degree of mutual correlations and the hierarchy of discrete cosmic structure scales, available upon extremizing the generalized nonextensive link entropy in a homogeneous growing network.

  20. Using the principle of entropy maximization to infer genetic interaction networks from gene expression patterns.

    PubMed

    Lezon, Timothy R; Banavar, Jayanth R; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V

    2006-12-12

    We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems.
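
    For continuous expression levels, the simplest pairwise maximum-entropy model consistent with means and covariances is a multivariate Gaussian, whose couplings can be read off the inverse covariance matrix. The sketch below illustrates that idea on synthetic data; the Gaussian simplification and all names are ours, and the paper's treatment of the microarray data involves more than this.

        import numpy as np

        rng = np.random.default_rng(3)
        n_genes, n_samples = 6, 500
        true_precision = np.eye(n_genes)
        true_precision[0, 1] = true_precision[1, 0] = 0.6       # one planted interaction
        expr = rng.multivariate_normal(np.zeros(n_genes),
                                       np.linalg.inv(true_precision), size=n_samples)

        cov = np.cov(expr, rowvar=False)
        J = -np.linalg.inv(cov)                                  # inferred pairwise couplings
        np.fill_diagonal(J, 0.0)                                 # keep only off-diagonal terms
        i, j = np.unravel_index(np.argmax(np.abs(J)), J.shape)
        print("strongest inferred interaction between genes", i, "and", j)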

  1. Holographic Rényi entropy in AdS3/LCFT2 correspondence

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Song, Feng-yan; Zhang, Jia-ju

    2014-03-01

    The recent study in AdS3/CFT2 correspondence shows that the tree level contribution and 1-loop correction of holographic Rényi entanglement entropy (HRE) exactly match the direct CFT computation in the large central charge limit. This allows the Rényi entanglement entropy to be a new window to study the AdS/CFT correspondence. In this paper we generalize the study of Rényi entanglement entropy in pure AdS3 gravity to the massive gravity theories at the critical points. For the cosmological topological massive gravity (CTMG), the dual conformal field theory (CFT) could be a chiral conformal field theory or a logarithmic conformal field theory (LCFT), depending on the asymptotic boundary conditions imposed. In both cases, by studying the short interval expansion of the Rényi entanglement entropy of two disjoint intervals with small cross ratio x, we find that the classical and 1-loop HRE are in exact match with the CFT results, up to order x^6. To this order, the difference between the massless graviton and logarithmic mode can be seen clearly. Moreover, for the cosmological new massive gravity (CNMG) at critical point, which could be dual to a logarithmic CFT as well, we find the similar agreement in the CNMG/LCFT correspondence. Furthermore, we read the 2-loop correction of graviton and logarithmic mode to HRE from the CFT computation. It has a distinct feature from the one in pure AdS3 gravity.

  2. Entropy, Ergodicity, and Stem Cell Multipotency

    NASA Astrophysics Data System (ADS)

    Ridden, Sonya J.; Chang, Hannah H.; Zygalakis, Konstantinos C.; MacArthur, Ben D.

    2015-11-01

    Populations of mammalian stem cells commonly exhibit considerable cell-cell variability. However, the functional role of this diversity is unclear. Here, we analyze expression fluctuations of the stem cell surface marker Sca1 in mouse hematopoietic progenitor cells using a simple stochastic model and find that the observed dynamics naturally lie close to a critical state, thereby producing a diverse population that is able to respond rapidly to environmental changes. We propose an information-theoretic interpretation of these results that views cellular multipotency as an instance of maximum entropy statistical inference.

  3. Nonequilibrium-thermodynamics approach to open quantum systems

    NASA Astrophysics Data System (ADS)

    Semin, Vitalii; Petruccione, Francesco

    2014-11-01

    Open quantum systems are studied from the thermodynamical point of view unifying the principle of maximum informational entropy and the hypothesis of relaxation times hierarchy. The result of the unification is a non-Markovian and local-in-time master equation that provides a direct connection for dynamical and thermodynamical properties of open quantum systems. The power of the approach is illustrated by the application to the damped harmonic oscillator and the damped driven two-level system, resulting in analytical expressions for the non-Markovian and nonequilibrium entropy and inverse temperature.

  4. The existence of negative absolute temperatures in Axelrod’s social influence model

    NASA Astrophysics Data System (ADS)

    Villegas-Febres, J. C.; Olivares-Rivas, W.

    2008-06-01

    We introduce the concept of temperature as an order parameter in Axelrod’s standard social influence model. It is defined as the relation between suitably defined entropy and energy functions, T = (∂E/∂S). We show that at the critical point, where the order/disorder transition occurs, this absolute temperature changes sign. At this point, which corresponds to the homogeneous/heterogeneous culture transition, the entropy of the system shows a maximum. We discuss the relationship between the temperature and other properties of the model in terms of cultural traits.

  5. Metallization of vanadium dioxide driven by large phonon entropy

    DOE PAGES

    Budai, John D.; Hong, Jiawang; Manley, Michael E.; ...

    2014-11-10

    Phase competition underlies many remarkable and technologically important phenomena in transition-metal oxides. Vanadium dioxide exhibits a first-order metal-insulator transition (MIT) near room temperature, where conductivity is suppressed and the lattice changes from tetragonal to monoclinic on cooling. Ongoing attempts to explain this coupled structural and electronic transition begin with two classic starting points: a Peierls MIT driven by instabilities in electron-lattice dynamics versus a Mott MIT where strong electron-electron correlations drive charge localization. A key missing piece of the VO2 puzzle is the role of lattice vibrations. Moreover, a comprehensive thermodynamic treatment must integrate both entropic and energetic aspects of the transition. Our measurements establish that the entropy driving the MIT is dominated by strongly anharmonic phonons rather than electronic contributions, and provide a direct determination of phonon dispersions. Our calculations identify softer bonding as the origin of the large vibrational entropy stabilizing the metallic rutile phase. They further reveal how a balance between higher entropy in the metal and orbital-driven lower energy in the insulator fully describes the thermodynamic forces controlling the MIT. This study illustrates the critical role of anharmonic lattice dynamics in metal-oxide phase competition, and provides guidance for the predictive design of new materials.

  6. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.

  7. Implications of Modern Non-Equilibrium Thermodynamics for Georgescu-Roegen's Macro-Economics: lessons from a comprehensive historical review

    NASA Astrophysics Data System (ADS)

    Poisson, Alexandre

    2011-12-01

    In the early 1970s, mathematician and economist Nicolas Georgescu-Roegen developed an alternative framework to macro-economics (his hourglass model) based on two principles of classical thermodynamics applied to the earth-system as a whole. The new model led him to the radical conclusion that "not only growth, but also a zero-growth state, nay, even a declining state which does not converge toward annihilation, cannot exist forever in a finite environment" (Georgescu-Roegen 1976, p.23). Georgescu-Roegen's novel approach long served as a devastating critique of standard neoclassical growth theories. It also helped establish the foundations for the new trans-disciplinary field of ecological economics. In recent decades however, it has remained unclear whether revolutionary developments in "modern non-equilibrium thermodynamics" (Kondepudi and Prigogine 1998) refute some of Georgescu-Roegen's initial conclusions and provide fundamentally new lessons for very long-term macro-economic analysis. Based on a broad historical review of literature from many fields (thermodynamics, cosmology, ecosystems ecology and economics), I argue that Georgescu-Roegen's hourglass model is largely based on old misconceptions and assumptions from 19th century thermodynamics (including an out-dated cosmology) which make it very misleading. Ironically, these assumptions (path independence and linearity of the entropy function in particular) replicate the non-evolutionary thinking he seemed to despise in his colleagues. In light of modern NET, I propose a different model. Contrary to Georgescu-Roegen's hourglass, I do not assume the path independence of the entropy function. In the new model, achieving critical free energy rate density thresholds can abruptly increase the level of complexity and maximum remaining lifespan of stock-based civilizations.

  8. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, and then three-dimensional perceptual maps of the results are provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and the difference in the stock index, which is caused by the country or region and the different financial policies, can reflect the irregularity in the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asian-Pacific (with the exception of mainland China), mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
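
    Once a symmetric matrix of modified cross-sample-entropy dissimilarities has been computed, the embedding step itself can be reproduced with standard tools by passing the matrix to MDS as a precomputed dissimilarity. In the sketch below the random placeholder matrix D stands in for the MDS-KCSE or MDS-PCSE dissimilarities, which are not implemented here.

        import numpy as np
        from sklearn.manifold import MDS

        rng = np.random.default_rng(7)
        n_series = 18
        D = rng.uniform(0.1, 1.0, size=(n_series, n_series))   # placeholder pairwise dissimilarities
        D = (D + D.T) / 2.0                                     # enforce symmetry
        np.fill_diagonal(D, 0.0)                                # zero self-dissimilarity

        embedding = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
        coords = embedding.fit_transform(D)                     # 3-D perceptual map of the 18 indices
        print(coords.shape)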

  9. Transition from the mechanics of material points to the mechanics of structured particles

    NASA Astrophysics Data System (ADS)

    Somsikov, V. M.

    2016-01-01

    In this paper, the necessity of creating a mechanics of structured particles is discussed. The way to create this mechanics within the laws of classical mechanics, with the use of the energy equation, is shown. The emergence of time-symmetry breaking within the mechanics of structured particles is shown, as is the introduction of the concept of entropy in the framework of classical mechanics. The way to create the mechanics of non-equilibrium systems in the thermodynamic approach is shown. It is also shown that the use of the hypothesis of holonomic constraints while deriving the canonical Lagrange equation made it impossible to describe irreversible dynamics. The difference between the mechanics of structured particles and the mechanics of material points is discussed. It is also shown that matter is infinitely divisible according to the laws of classical mechanics.

  10. Enthalpy-entropy compensation for the solubility of drugs in solvent mixtures: paracetamol, acetanilide, and nalidixic acid in dioxane-water.

    PubMed

    Bustamante, P; Romero, S; Pena, A; Escalera, B; Reillo, A

    1998-12-01

    In earlier work, a nonlinear enthalpy-entropy compensation was observed for the solubility of phenacetin in dioxane-water mixtures. This effect had not been earlier reported for the solubility of drugs in solvent mixtures. To gain insight into the compensation effect, the behavior of the apparent thermodynamic magnitudes for the solubility of paracetamol, acetanilide, and nalidixic acid is studied in this work. The solubility of these drugs was measured at several temperatures in dioxane-water mixtures. DSC analysis was performed on the original powders and on the solid phases after equilibration with the solvent mixture. The thermal properties of the solid phases did not show significant changes. The three drugs display a solubility maximum against the cosolvent ratio. The solubility peaks of acetanilide and nalidixic acid shift to a more polar region at the higher temperatures. Nonlinear van't Hoff plots were observed for nalidixic acid whereas acetanilide and paracetamol show linear behavior at the temperature range studied. The apparent enthalpies of solution are endothermic going through a maximum at 50% dioxane. Two different mechanisms, entropy and enthalpy, are suggested to be the driving forces that increase the solubility of the three drugs. Solubility is entropy controlled at the water-rich region (0-50% dioxane) and enthalpy controlled at the dioxane-rich region (50-100% dioxane). The enthalpy-entropy compensation analysis also suggests that two different mechanisms, dependent on cosolvent ratio, are involved in the solubility enhancement of the three drugs. The plots of deltaH versus deltaG are nonlinear, and the slope changes from positive to negative above 50% dioxane. The compensation effect for the thermodynamic magnitudes of transfer from water to the aqueous mixtures can be described by a common empirical nonlinear relationship, with the exception of paracetamol, which follows a separate linear relationship at dioxane ratios above 50%. The results corroborate earlier findings with phenacetin. The similar pattern shown by the drugs studied suggests that the nonlinear enthalpy-entropy compensation effect may be characteristic of the solubility of semipolar drugs in dioxane-water mixtures.

  11. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    NASA Astrophysics Data System (ADS)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
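
    The maximum-entropy step of such a procedure can be sketched generically: given a few fractional moments of a positive performance function G, the max-ent density p(g) ~ exp(-sum_k lam_k * g^alpha_k) is obtained by minimizing the convex dual Gamma(lam) = ln Z(lam) + sum_k lam_k * mu_k. The moments below come from Monte Carlo samples of a toy G; the rotational quasi-symmetric point set of the paper is not reproduced, and all names are illustrative.

        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.optimize import minimize

        rng = np.random.default_rng(4)
        samples = rng.lognormal(mean=0.0, sigma=0.4, size=20000)   # toy performance-function values
        alphas = np.array([0.5, 1.0, 1.5, 2.0])                     # chosen fractional orders
        mu = np.array([np.mean(samples ** a) for a in alphas])      # "measured" fractional moments

        g = np.linspace(1e-6, samples.max() * 1.5, 4000)            # bounded integration grid
        powers = g[None, :] ** alphas[:, None]                      # shape (n_moments, n_grid)

        def dual(lam):
            expo = np.clip(-lam @ powers, -700.0, 700.0)            # guard against overflow
            return np.log(trapezoid(np.exp(expo), g)) + lam @ mu

        lam = minimize(dual, np.zeros(len(alphas)), method="Nelder-Mead").x
        p = np.exp(np.clip(-lam @ powers, -700.0, 700.0))
        p /= trapezoid(p, g)                                        # normalized max-ent density
        print("max-ent estimate of P[G > 3]:", trapezoid(p[g > 3], g[g > 3]))

    In practice one would also verify that the fitted density reproduces the input moments before integrating it for a failure probability.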

  12. Application of the maximum entropy principle to determine ensembles of intrinsically disordered proteins from residual dipolar couplings.

    PubMed

    Sanchez-Martinez, M; Crehuet, R

    2014-12-21

    We present a method based on the maximum entropy principle that can re-weight an ensemble of protein structures based on data from residual dipolar couplings (RDCs). The RDCs of intrinsically disordered proteins (IDPs) provide information on the secondary structure elements present in an ensemble; however even two sets of RDCs are not enough to fully determine the distribution of conformations, and the force field used to generate the structures has a pervasive influence on the refined ensemble. Two physics-based coarse-grained force fields, Profasi and Campari, are able to predict the secondary structure elements present in an IDP, but even after including the RDC data, the re-weighted ensembles differ between both force fields. Thus the spread of IDP ensembles highlights the need for better force fields. We distribute our algorithm in an open-source Python code.
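
    The authors distribute their own code; independently of it, the core re-weighting step can be sketched as follows. Starting from uniform weights over conformers, the maximum-entropy (minimum relative-entropy) weights that reproduce a set of target ensemble averages are exponential in the back-calculated observables, with Lagrange multipliers obtained from a convex dual. The synthetic "RDC-like" observables and all array names below are illustrative assumptions.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(5)
        n_conf, n_obs = 400, 6
        O = rng.normal(size=(n_conf, n_obs))        # back-calculated observables per conformer
        target = O[:50].mean(axis=0)                # pretend the "experiment" favors a sub-ensemble

        def dual(lam):
            """Convex dual ln Z(lam) + lam . target; its minimizer matches the averages."""
            logw = -O @ lam
            shift = logw.max()                      # for numerical stability
            return np.log(np.exp(logw - shift).sum()) + shift + lam @ target

        lam = minimize(dual, np.zeros(n_obs), method="BFGS").x
        w = np.exp(-O @ lam)
        w /= w.sum()                                # re-weighted ensemble
        print("max |<O>_w - target| :", np.abs(w @ O - target).max())
        print("effective ensemble size:", np.exp(-np.sum(w * np.log(w))))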

  13. Maximum entropy analysis of NMR data of flexible multirotor molecules partially oriented in nematic solution: 2,2':5',2″-terthiophene, 2,2'- and 3,3'-dithiophene

    NASA Astrophysics Data System (ADS)

    Caldarelli, Stefano; Catalano, Donata; Di Bari, Lorenzo; Lumetti, Marco; Ciofalo, Maurizio; Alberto Veracini, Carlo

    1994-07-01

    The dipolar couplings observed by NMR spectroscopy of solutes in nematic solvents (LX-NMR) are used to build up the maximum entropy (ME) probability distribution function of the variables describing the orientational and internal motion of the molecule. The ME conformational distributions of 2,2'- and 3,3'-dithiophene and 2,2':5',2″-terthiophene (α-terthienyl) thus obtained are compared with the results of previous studies. The 2,2'- and 3,3'-dithiophene molecules exhibit equilibria among cisoid and transoid forms; the probability maxima correspond to planar and twisted conformers for 2,2'- or 3,3'-dithiophene, respectively. 2,2':5',2″-Terthiophene has two internal degrees of freedom; the ME approach indicates that the trans, trans and cis, trans planar conformations are the most probable. The correlation between the two intramolecular rotations is also discussed.

  14. Image reconstruction of IRAS survey scans

    NASA Technical Reports Server (NTRS)

    Bontekoe, Tj. Romke

    1990-01-01

    The IRAS survey data can be used successfully to produce images of extended objects. The major difficulties, viz. non-uniform sampling, different response functions for each detector, and varying signal-to-noise levels for each detector for each scan, were resolved. The results of three different image construction techniques are compared: co-addition, constrained least squares, and maximum entropy. The maximum entropy result is superior. An image of the galaxy M51 with an average spatial resolution of 45 arc seconds is presented, using 60 micron survey data. This exceeds the telescope diffraction limit of 1 minute of arc, at this wavelength. Data fusion is a proposed method for combining data from different instruments, with different spatial resolutions, at different wavelengths. Estimates of the physical parameters (temperature, density, and composition) can be made from the data without prior image (re-)construction. An increase in the accuracy of these parameters is expected as the result of this more systematic approach.

  15. Energy and maximum norm estimates for nonlinear conservation laws

    NASA Technical Reports Server (NTRS)

    Olsson, Pelle; Oliger, Joseph

    1994-01-01

    We have devised a technique that makes it possible to obtain energy estimates for initial-boundary value problems for nonlinear conservation laws. The two major tools to achieve the energy estimates are a certain splitting of the flux vector derivative f(u)_x, and a structural hypothesis, referred to as a cone condition, on the flux vector f(u). These hypotheses are fulfilled for many equations that occur in practice, such as the Euler equations of gas dynamics. It should be noted that the energy estimates are obtained without any assumptions on the gradient of the solution u. The results extend to weak solutions that are obtained as pointwise limits of vanishing viscosity solutions. As a byproduct we obtain explicit expressions for the entropy function and the entropy flux of symmetrizable systems of conservation laws. Under certain circumstances the proposed technique can be applied repeatedly so as to yield estimates in the maximum norm.

  16. Test images for the maximum entropy image restoration method

    NASA Technical Reports Server (NTRS)

    Mackey, James E.

    1990-01-01

    One of the major activities of any experimentalist is data analysis and reduction. In solar physics, remote observations are made of the sun in a variety of wavelengths and circumstances. In no case is the data collected free from the influence of the design and operation of the data gathering instrument as well as the ever present problem of noise. The presence of significant noise invalidates the simple inversion procedure regardless of the range of known correlation functions. The Maximum Entropy Method (MEM) attempts to perform this inversion by making minimal assumptions about the data. To provide a means of testing the MEM and characterizing its sensitivity to noise, choice of point spread function, type of data, etc., one would like to have test images of known characteristics that can represent the type of data being analyzed. A means of reconstructing these images is presented.

  17. Computing quantum discord is NP-complete

    NASA Astrophysics Data System (ADS)

    Huang, Yichen

    2014-03-01

    We study the computational complexity of quantum discord (a measure of quantum correlation beyond entanglement), and prove that computing quantum discord is NP-complete. Therefore, quantum discord is computationally intractable: the running time of any algorithm for computing quantum discord is believed to grow exponentially with the dimension of the Hilbert space so that computing quantum discord in a quantum system of moderate size is not possible in practice. As by-products, some entanglement measures (namely entanglement cost, entanglement of formation, relative entropy of entanglement, squashed entanglement, classical squashed entanglement, conditional entanglement of mutual information, and broadcast regularization of mutual information) and constrained Holevo capacity are NP-hard/NP-complete to compute. These complexity-theoretic results are directly applicable in common randomness distillation, quantum state merging, entanglement distillation, superdense coding, and quantum teleportation; they may offer significant insights into quantum information processing. Moreover, we prove the NP-completeness of two typical problems: linear optimization over classical states and detecting classical states in a convex set, providing evidence that working with classical states is generically computationally intractable.
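
    For orientation, the standard definition whose evaluation is shown to be NP-complete (measurement on subsystem A; the notation is the usual one, not quoted from the paper):

      I(\rho_{AB}) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), \qquad
      J_A(\rho_{AB}) = \max_{\{\Pi_k^A\}} \Big[\, S(\rho_B) - \sum_k p_k\, S(\rho_{B|k}) \,\Big], \qquad
      D_A(\rho_{AB}) = I(\rho_{AB}) - J_A(\rho_{AB}) ,

    where S is the von Neumann entropy, the maximum runs over measurements {Π_k^A} on A, p_k is the probability of outcome k, and ρ_{B|k} is the corresponding post-measurement state of B; the maximization over all measurements is the computationally demanding step.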

  18. Symbolic Analysis of Heart Rate Variability During Exposure to Musical Auditory Stimulation.

    PubMed

    Vanderlei, Franciele Marques; de Abreu, Luiz Carlos; Garner, David Matthew; Valenti, Vitor Engrácia

    2016-01-01

    In recent years, the application of nonlinear methods for the analysis of heart rate variability (HRV) has increased. However, studies on the influence of music on cardiac autonomic modulation analyzed with such methods are rare. The research team aimed to evaluate the acute effects on HRV of auditory stimulation with 2 selected musical styles, measuring the results with nonlinear methods of analysis: Shannon entropy, symbolic analysis, and correlation-dimension analysis. The study was a prospective controlled study in which volunteers were exposed to music and HRV variables were compared between a control condition (no auditory stimulation) and the period of exposure to music. All procedures were performed in a sound-proofed room at the Faculty of Science and Technology at São Paulo State University (UNESP), São Paulo, Brazil. Participants were 22 healthy female students aged between 18 and 30 y. Prior to the intervention, the participants remained at rest for 20 min and were then exposed to one of the selected types of music, either classical baroque (64-84 dB) or heavy-metal (75-84 dB). Each musical session lasted a total of 5 min and 15 s. Up to 1 wk later, the participants listened to the second type of music; the 2 types of music were delivered in a random sequence that depended on the group to which the participant was assigned. The HRV indices analyzed were Shannon entropy; the symbolic-analysis indices 0V%, 1V%, 2LV%, and 2ULV%; and correlation-dimension analysis. During auditory stimulation with either heavy-metal or classical baroque music, no statistically significant changes were found in Shannon entropy, in the 0V%, 1V%, and 2ULV% symbolic-analysis indices, or in correlation-dimension analysis. However, during heavy-metal music the 2LV% symbolic-analysis index was reduced compared with the control condition. Auditory stimulation with heavy-metal music therefore reduced the parasympathetic modulation of HRV, whereas no significant changes in cardiac autonomic modulation occurred during exposure to classical music.
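
    A minimal sketch of the symbolic-analysis indices referred to above (0V%, 1V%, 2LV%, 2ULV%), following the usual convention of six quantization levels and three-beat words; the function and variable names are assumptions, not taken from the study.

      import numpy as np

      def symbolic_indices(rr, n_levels=6, word_len=3):
          """Classify three-beat words of the quantised RR series into variation families."""
          rr = np.asarray(rr, dtype=float)
          edges = np.linspace(rr.min(), rr.max(), n_levels + 1)
          symbols = np.digitize(rr, edges[1:-1])         # symbols in 0 .. n_levels-1
          counts = {"0V": 0, "1V": 0, "2LV": 0, "2ULV": 0}
          for i in range(len(symbols) - word_len + 1):
              d = np.diff(symbols[i:i + word_len])
              if np.all(d == 0):
                  counts["0V"] += 1                      # no variation
              elif np.count_nonzero(d) == 1:
                  counts["1V"] += 1                      # exactly one variation
              elif np.all(d > 0) or np.all(d < 0):
                  counts["2LV"] += 1                     # two like (same-sign) variations
              else:
                  counts["2ULV"] += 1                    # two unlike (opposite-sign) variations
          total = sum(counts.values())
          return {k: 100.0 * v / total for k, v in counts.items()}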

  19. Moisture sorption isotherms and thermodynamic properties of mexican mennonite-style cheese.

    PubMed

    Martinez-Monteagudo, Sergio I; Salais-Fierro, Fabiola

    2014-10-01

    Moisture adsorption isotherms of fresh and ripened Mexican Mennonite-style cheese were investigated using the static gravimetric method at 4, 8, and 12 °C in a water activity (aw) range of 0.08-0.96. These isotherms were modeled using the GAB, BET, Oswin, and Halsey equations through weighted nonlinear regression. All isotherms were sigmoid in shape, corresponding to a type II BET isotherm, and the data were best described by the GAB model. The GAB model coefficients revealed that water adsorption by the cheese matrix is a multilayer process characterized by molecules that are strongly bound in the monolayer and molecules that are only weakly structured in the multilayer. Using the GAB model, it was possible to estimate thermodynamic functions (net isosteric heat, differential entropy, integral enthalpy and entropy, and enthalpy-entropy compensation) as a function of moisture content. For both samples, the isosteric heat and differential entropy decreased exponentially with moisture content. The integral enthalpy gradually decreased with increasing moisture content after reaching a maximum value, while the integral entropy decreased with increasing moisture content after reaching a minimum value. A linear compensation was found between integral enthalpy and entropy, suggesting enthalpy-controlled adsorption. Determination of the relationship between moisture content and aw yields important information for controlling ripening, drying, and storage operations, as well as for understanding the state of water within the cheese matrix.
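
    A hedged sketch of fitting the GAB equation by weighted nonlinear regression, as described above; the data here are synthetic (generated from the model plus noise) and every parameter value is illustrative rather than the paper's.

      import numpy as np
      from scipy.optimize import curve_fit

      def gab(aw, m0, c, k):
          """GAB isotherm: equilibrium moisture content as a function of water activity."""
          return m0 * c * k * aw / ((1.0 - k * aw) * (1.0 - k * aw + c * k * aw))

      rng = np.random.default_rng(1)
      aw = np.linspace(0.08, 0.96, 12)
      m_obs = gab(aw, 0.06, 12.0, 0.85) * (1.0 + 0.03 * rng.normal(size=aw.size))

      # Weighting: sigma proportional to the measurement gives a relative-error fit.
      popt, pcov = curve_fit(gab, aw, m_obs, p0=[0.05, 10.0, 0.8], sigma=0.05 * m_obs)
      m0, c, k = popt
      print(f"monolayer moisture M0 = {m0:.3f}, C = {c:.1f}, K = {k:.2f}")

    The isosteric heat and the entropy terms are then typically obtained from the temperature dependence of aw at constant moisture content via the Clausius-Clapeyron relation, using fits of this kind at each temperature.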

  20. Minimax Quantum Tomography: Estimators and Relative Entropy Bounds

    DOE PAGES

    Ferrie, Christopher; Blume-Kohout, Robin

    2016-03-04

    A minimax estimator has the minimum possible error (“risk”) in the worst case. Here we construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This mismatch makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case but superior accuracy on most states.
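
    In standard notation (not reproduced from the paper), the quantity being bounded is the worst-case expected relative entropy between the true state and the estimate:

      D(\rho\,\|\,\sigma) = \operatorname{Tr}\!\big[\rho\,(\log\rho - \log\sigma)\big], \qquad
      R_{\mathrm{minimax}} = \min_{\hat\rho}\,\max_{\rho}\;
        \mathbb{E}_{\mathrm{data}\sim\rho^{\otimes N}}\!\Big[ D\big(\rho \,\big\|\, \hat\rho(\mathrm{data})\big) \Big] ,

    which the paper shows scales as O(1/√N) for nonadaptive tomography, compared with O(1/N) for the analogous classical estimation problem.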
