Sample records for maximum entropy approach

  1. Unification of field theory and maximum entropy methods for learning probability densities

    NASA Astrophysics Data System (ADS)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
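
A minimal numerical sketch of the maximum entropy density estimation discussed above (not the paper's software): on a grid, the density maximizing entropy subject to mean and second-moment constraints has the exponential-family form p(x) ∝ exp(λ₁x + λ₂x²), and the multipliers can be found by minimizing the convex dual (log-partition minus empirical moments). The grid and data here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.5, size=5000)   # synthetic sample
x = np.linspace(-2, 4, 601)                        # evaluation grid
dx = x[1] - x[0]
moments = np.stack([x, x**2])                      # sufficient statistics T(x)
targets = np.array([data.mean(), (data**2).mean()])

def dual(lam):
    # Convex dual of the maxent problem: log Z(lam) - lam . targets
    logw = lam @ moments
    logZ = np.log(np.sum(np.exp(logw - logw.max()) * dx)) + logw.max()
    return logZ - lam @ targets

res = minimize(dual, x0=np.zeros(2), method="BFGS")
logw = res.x @ moments
p = np.exp(logw - logw.max())
p /= p.sum() * dx                                  # normalized maxent density

# At the dual optimum, the fitted density reproduces the constrained moments.
mean_fit = np.sum(x * p * dx)
```

The same dual construction extends to any set of moment constraints; smoothness priors, as the abstract notes, are what Bayesian field theory adds on top.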

  3. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
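
A small sketch of the information entropy production rate central to the abstract above, computed for a hypothetical three-state chain: for a stationary Markov chain it is e_p = Σ_ij π_i P_ij log(P_ij / P_ji), which vanishes exactly when detailed balance (time reversibility) holds.

```python
import numpy as np

# Hypothetical irreversible transition matrix (rows sum to 1).
P = np.array([[0.1, 0.6, 0.3],
              [0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Entropy production rate; strictly positive for an irreversible chain.
ep = sum(pi[i] * P[i, j] * np.log(P[i, j] / P[j, i])
         for i in range(3) for j in range(3))
```

Here the Kolmogorov cycle products P₀₁P₁₂P₂₀ and P₀₂P₂₁P₁₀ differ, so the chain is irreversible and e_p > 0.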

  4. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.
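
A toy sketch of the pairwise maximum entropy fit described above, on hypothetical binary "spike words" rather than the retinal data: P(s) ∝ exp(h·s + ½ sᵀJs) is fit by gradient ascent until the model matches the observed mean rates and pairwise correlations, using exact enumeration for a small population.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 4
words = rng.integers(0, 2, size=(2000, n)).astype(float)  # fake spike words
mu = words.mean(0)                                        # target rates
C = (words.T @ words) / len(words)                        # target <s_i s_j>

states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
h = np.zeros(n)
J = np.zeros((n, n))
for _ in range(5000):                # moment matching = likelihood ascent
    E = states @ h + 0.5 * np.einsum("ki,ij,kj->k", states, J, states)
    w = np.exp(E - E.max())
    p = w / w.sum()
    m = states.T @ p                 # model rates
    S = (states.T * p) @ states      # model <s_i s_j>
    h += 0.2 * (mu - m)
    J += 0.2 * (C - S)
    np.fill_diagonal(J, 0.0)         # diagonal is absorbed by h (s**2 = s)
```

For populations beyond ~20 neurons the 2ⁿ enumeration must be replaced by Monte Carlo or mean-field estimates of the model moments.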

  5. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  6. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; its present stage of development embodies a ... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical ...

  7. A unified approach to computational drug discovery.

    PubMed

    Tseng, Chih-Yuan; Tuszynski, Jack

    2015-11-01

    It has been reported that a slowdown in the development of new medical therapies is affecting clinical outcomes. The FDA has thus initiated the Critical Path Initiative project investigating better approaches. We review the current strategies in drug discovery and focus on the advantages of the maximum entropy method being introduced in this area. The maximum entropy principle is derived from statistical thermodynamics and has been demonstrated to be an inductive inference tool. We propose a unified method to drug discovery that hinges on robust information processing using entropic inductive inference. Increasingly, applications of maximum entropy in drug discovery employ this unified approach and demonstrate the usefulness of the concept in the area of pharmaceutical sciences. Copyright © 2015. Published by Elsevier Ltd.

  8. Convex Accelerated Maximum Entropy Reconstruction

    PubMed Central

    Worley, Bradley

    2016-01-01

    Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm – called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm – is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra. PMID:26894476

  9. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed, which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservations are considered as optimization constraints. In such a way computed optimal enzyme rate constants in a steady state yield also the most uniform probability distribution of the enzyme states. This accounts for the maximal Shannon information entropy. By means of the stability analysis it is also demonstrated that maximal density of entropy production in that enzyme reaction requires flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example, in which density of entropy production and Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.

  10. Fast and Efficient Stochastic Optimization for Analytic Continuation

    DOE PAGES

    Bao, Feng; Zhang, Guannan; Webster, Clayton G; ...

    2016-09-28

    The analytic continuation of imaginary-time quantum Monte Carlo data to extract real-frequency spectra remains a key problem in connecting theory with experiment. Here we present a fast and efficient stochastic optimization method (FESOM) as a more accessible variant of the stochastic optimization method introduced by Mishchenko et al. [Phys. Rev. B 62, 6317 (2000)], and we benchmark the resulting spectra with those obtained by the standard maximum entropy method for three representative test cases, including data taken from studies of the two-dimensional Hubbard model. Generally, we find that our FESOM approach yields spectra similar to the maximum entropy results. In particular, while the maximum entropy method yields superior results when the quality of the data is good, we find that FESOM is able to resolve fine structure with more detail when the quality of the data is poor. In addition, because of its stochastic nature, the method provides detailed information on the frequency-dependent uncertainty of the resulting spectra, while the maximum entropy method does so only for the spectral weight integrated over a finite frequency region. Therefore, we believe that this variant of the stochastic optimization approach provides a viable alternative to the routinely used maximum entropy method, especially for data of poor quality.

  11. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
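
A worked instance of the Maximum Entropy Principle (MEP) reviewed above, using Jaynes' classic loaded-die example: among all distributions on {1,...,6} with a prescribed mean, the maximum entropy one is p_k ∝ exp(-λk), with λ fixed by the mean constraint.

```python
import numpy as np
from scipy.optimize import brentq

k = np.arange(1, 7)
target_mean = 4.5                    # assumed constraint, for illustration

def mean_of(lam):
    w = np.exp(-lam * k)
    return (k * w).sum() / w.sum()

# Solve the one-dimensional constraint equation for the multiplier.
lam = brentq(lambda l: mean_of(l) - target_mean, -5, 5)
w = np.exp(-lam * k)
p = w / w.sum()
entropy = -(p * np.log(p)).sum()     # strictly below log(6), the uniform case
```

Because the target mean exceeds 3.5, λ comes out negative and the distribution tilts toward the high faces, the least biased way to honor the constraint.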

  12. Exploiting the Maximum Entropy Principle to Increase Retrieval Effectiveness.

    ERIC Educational Resources Information Center

    Cooper, William S.

    1983-01-01

    Presents information retrieval design approach in which queries of computer-based system consist of sets of terms, either unweighted or weighted with subjective term precision estimates, and retrieval outputs ranked by probability of usefulness estimated by "maximum entropy principle." Boolean and weighted request systems are discussed.…

  13. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    NASA Technical Reports Server (NTRS)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  14. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method

    PubMed Central

    Roux, Benoît; Weare, Jonathan

    2013-01-01

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
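
A toy illustration of the Jaynes-style maximum entropy biasing discussed above (hypothetical scalar observable, not a molecular simulation): given samples from an unbiased ensemble, the minimally perturbed reweighting that enforces an "experimental" average has exponential form w_i ∝ exp(-λ O_i), with a single multiplier to determine.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(2)
O = rng.normal(0.0, 1.0, size=10000)    # observable over simulation frames
target = 0.3                            # value "measured" in experiment

def avg(lam):
    # Shift by the mean for numerical stability of the exponentials.
    w = np.exp(-lam * (O - O.mean()))
    return np.sum(w * O) / np.sum(w)

lam = brentq(lambda l: avg(l) - target, -10, 10)
w = np.exp(-lam * (O - O.mean()))
w /= w.sum()
reweighted_mean = np.sum(w * O)
```

With several observables this becomes the iterative multi-coefficient determination the abstract calls cumbersome; restrained-ensemble simulation is the alternative whose equivalence the paper establishes.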

  15. Stationary properties of maximum-entropy random walks.

    PubMed

    Dixit, Purushottam D

    2015-10-01

    Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
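
A compact sketch of a maximum-entropy random walk (MERW) on a small assumed graph. The path-entropy-maximizing transition probabilities are built from the leading eigenvector ψ of the adjacency matrix A, P_ij = (A_ij/λ) ψ_j/ψ_i, with stationary distribution π_i ∝ ψ_i², which generally differs from the degree-proportional stationary distribution of the ordinary walk, illustrating the abstract's point.

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # small connected graph

vals, vecs = np.linalg.eigh(A)
lam = vals[-1]                              # Perron eigenvalue
psi = np.abs(vecs[:, -1])                   # positive Perron eigenvector

P = (A / lam) * psi[None, :] / psi[:, None]  # MERW transition matrix
pi = psi**2 / np.sum(psi**2)                 # MERW stationary distribution
```

Row-stochasticity and stationarity follow directly from A ψ = λ ψ, which the checks below confirm numerically.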

  16. Exact computation of the maximum-entropy potential of spiking neural-network models.

    PubMed

    Cofré, R; Cessac, B

    2014-05-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.

  17. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

    DTIC Science & Technology

    2017-08-21

    distributions, and we discuss some applications for engineered and biological information transmission systems. Keywords: information theory; minimum ... of its interpretation as a measure of the amount of information communicable by a neural system to groups of downstream neurons. Previous authors ... of the maximum entropy approach. Our results also have relevance for engineered information transmission systems. We show that empirically measured ...

  18. Elements of the cognitive universe

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2017-06-01

    "The least biased inference, taking available information into account, is the one with maximum entropy". So we are taught by Jaynes. The many followers from a broad spectrum of the natural and social sciences point to the wisdom of this principle, the maximum entropy principle, MaxEnt. But "entropy" need not be tied only to classical entropy and thus to probabilistic thinking. In fact, the arguments found in Jaynes' writings and elsewhere can, as we shall attempt to demonstrate, profitably be revisited, elaborated and transformed to apply in a much more general abstract setting. The approach is based on game theoretical thinking. Philosophical considerations dealing with notions of cognition - basically truth and belief - lie behind. Quantitative elements are introduced via a concept of description effort. An interpretation of Tsallis Entropy is indicated.

  19. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
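
A hedged sketch of the entropy-based design idea above, using a synthetic covariance rather than the Neuse Estuary model: for jointly Gaussian stations the entropy of a subset S is ½ log det(2πe Σ_SS), so a greedy design repeatedly retains the station that adds the most joint entropy (information).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
B = rng.normal(size=(n, n))
Sigma = B @ B.T + n * np.eye(n)       # synthetic station covariance

def subset_entropy(Sigma, idx):
    # Differential entropy of a Gaussian restricted to stations in idx.
    sub = Sigma[np.ix_(idx, idx)]
    return 0.5 * np.linalg.slogdet(2 * np.pi * np.e * sub)[1]

chosen, remaining = [], list(range(n))
for _ in range(3):                    # retain a 3-station network
    gains = [subset_entropy(Sigma, chosen + [j]) for j in remaining]
    best = remaining[int(np.argmax(gains))]
    chosen.append(best)
    remaining.remove(best)
```

The paper's multi-criteria version would run this kind of selection under several entropy definitions (total system, chlorophyll-a violation, dissolved oxygen violation) before reconciling the designs.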

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khosla, D.; Singh, M.

    The estimation of three-dimensional dipole current sources on the cortical surface from the measured magnetoencephalogram (MEG) is a highly underdetermined inverse problem, as there are many "feasible" images which are consistent with the MEG data. Previous approaches to this problem have concentrated on the use of weighted minimum norm inverse methods. While these methods ensure a unique solution, they often produce overly smoothed solutions and exhibit severe sensitivity to noise. In this paper we explore the maximum entropy approach to obtain better solutions to the problem. This estimation technique selects, from the possible set of feasible images, the image which has the maximum entropy permitted by the information available to us. In order to account for the presence of noise in the data, we have also incorporated a noise rejection or likelihood term into our maximum entropy method. This makes our approach mirror a Bayesian maximum a posteriori (MAP) formulation. Additional information from other functional techniques like functional magnetic resonance imaging (fMRI) can be incorporated in the proposed method in the form of a prior bias function to improve solutions. We demonstrate the method with experimental phantom data from a clinical 122-channel MEG system.

  1. Maximum-Entropy Inference with a Programmable Annealer

    PubMed Central

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-01-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition. PMID:26936311

  2. Thermodynamic resource theories, non-commutativity and maximum entropy principles

    NASA Astrophysics Data System (ADS)

    Lostaglio, Matteo; Jennings, David; Rudolph, Terry

    2017-04-01

    We discuss some features of thermodynamics in the presence of multiple conserved quantities. We prove a generalisation of Landauer principle illustrating tradeoffs between the erasure costs paid in different ‘currencies’. We then show how the maximum entropy and complete passivity approaches give different answers in the presence of multiple observables. We discuss how this seems to prevent current resource theories from fully capturing thermodynamic aspects of non-commutativity.

  3. Maximum entropy PDF projection: A review

    NASA Astrophysics Data System (ADS)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T (x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.
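
A toy instance of the MaxEnt PDF projection idea above: take the feature z = T(x) = ||x|| for x ∈ R² with an assumed feature density p(z). The maximum entropy p(x) consistent with p(z) is uniform on each level set (circle), so sampling reduces to "draw z, then a uniform angle", illustrating why the MaxEnt p(x) is easy to sample in Monte Carlo use.

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.rayleigh(scale=1.0, size=50000)        # assumed feature distribution
theta = rng.uniform(0, 2 * np.pi, size=z.size) # uniform over the level set
x = np.stack([z * np.cos(theta), z * np.sin(theta)], axis=1)

# Pushing the samples back through the feature map recovers z exactly.
radii = np.linalg.norm(x, axis=1)
```

With a Rayleigh feature density this particular MaxEnt projection reproduces the standard 2-D Gaussian, a case where the construction can be checked against a known closed form.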

  4. From Maximum Entropy Models to Non-Stationarity and Irreversibility

    NASA Astrophysics Data System (ADS)

    Cofre, Rodrigo; Cessac, Bruno; Maldonado, Cesar

    The maximum entropy distribution can be obtained from a variational principle. This is important as a matter of principle and for the purpose of finding approximate solutions. One can exploit this fact to obtain relevant information about the underlying stochastic process. We report here on recent progress in three aspects of this approach. (1) Biological systems are expected to show some degree of irreversibility in time. Based on the transfer matrix technique to find the spatio-temporal maximum entropy distribution, we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. (2) The maximum entropy solution is characterized by a functional called the Gibbs free energy (the solution of the variational principle). The Legendre transformation of this functional is the rate function, which controls the speed of convergence of empirical averages to their ergodic mean. We show how the correct description of this functional is determinant for a more rigorous characterization of first- and higher-order phase transitions. (3) We assess the impact of a weak time-dependent external stimulus on the collective statistics of spiking neuronal networks. We show how to evaluate this impact on any higher-order spatio-temporal correlation. RC supported by ERC advanced Grant "Bridges"; BC: KEOPS ANR-CONICYT, Renvision; CM: CONICYT-FONDECYT No. 3140572.

  5. Maximum Relative Entropy of Coherence: An Operational Coherence Measure.

    PubMed

    Bu, Kaifeng; Singh, Uttam; Fei, Shao-Ming; Pati, Arun Kumar; Wu, Junde

    2017-10-13

    The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on maximum relative entropy. We prove that the maximum relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of the maximum relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the maximum relative entropy of coherence. By introducing a suitable smooth maximum relative entropy of coherence, we prove that the smooth maximum relative entropy of coherence provides a lower bound of one-shot coherence cost, and the maximum relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similar to the maximum relative entropy of coherence, the minimum relative entropy of coherence has also been investigated. We show that the minimum relative entropy of coherence provides an upper bound of one-shot coherence distillation, and in the asymptotic limit the minimum relative entropy of coherence is equivalent to the relative entropy of coherence.
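
A small numerical check of the (ordinary) relative entropy of coherence, the quantity the abstract's asymptotic equivalence refers to: C_r(ρ) = S(Δ(ρ)) - S(ρ), where Δ dephases ρ in the incoherent (computational) basis. For the maximally coherent qubit |+⟩⟨+| this equals log 2.

```python
import numpy as np

def vn_entropy(rho):
    # Von Neumann entropy S(rho) = -Tr(rho log rho), via eigenvalues.
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return -np.sum(vals * np.log(vals))

plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])              # |+><+|, a pure coherent state
dephased = np.diag(np.diag(plus))          # Delta(rho): keep the diagonal
C_r = vn_entropy(dephased) - vn_entropy(plus)
```

Here S(ρ) = 0 (pure state) and S(Δ(ρ)) = log 2 (maximally mixed diagonal), so C_r = log 2; the maximum and minimum relative entropies of coherence studied in the paper bracket this quantity and coincide with it asymptotically.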

  6. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.

  7. Derivation of Hunt equation for suspension distribution using Shannon entropy theory

    NASA Astrophysics Data System (ADS)

    Kundu, Snehasis

    2017-12-01

    In this study, the Hunt equation for computing suspension concentration in sediment-laden flows is derived using Shannon entropy theory. Considering the inverse of the void ratio as a random variable and using the principle of maximum entropy, the probability density function and cumulative distribution function of suspension concentration are derived. A new and more general cumulative distribution function for the flow domain is proposed, which includes several other specific CDF models reported in the literature. This general form of the cumulative distribution function also helps to derive the Rouse equation. The entropy-based approach helps to estimate model parameters using suspension data of sediment concentration, which shows the advantage of using entropy theory. Finally, model parameters in the entropy-based model are also expressed as functions of the Rouse number, establishing a link between the parameters of the deterministic and probabilistic approaches.

  8. Bayesian Approach to Spectral Function Reconstruction for Euclidean Quantum Field Theories

    NASA Astrophysics Data System (ADS)

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.

  10. Pareto versus lognormal: A maximum entropy test

    NASA Astrophysics Data System (ADS)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with a lognormal body and a Pareto tail can be generated as mixtures of lognormally distributed units.
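
As an illustration of one ingredient of such a test (not the authors' actual statistic), a maximum-likelihood (Hill) fit of a Pareto exponent to the upper percentile of a lognormal sample can be sketched in a few lines; the cutoff quantile, sample size, and the helper `hill_alpha` are illustrative choices:

```python
import numpy as np

def hill_alpha(tail, xmin):
    """Maximum-likelihood (Hill) estimate of the Pareto tail exponent."""
    tail = np.asarray(tail, dtype=float)
    return len(tail) / np.sum(np.log(tail / xmin))

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)   # lognormal body
xmin = np.quantile(x, 0.99)                            # study the last percentile
tail = x[x >= xmin]
alpha_hat = hill_alpha(tail, xmin)                     # apparent Pareto exponent of the tail
```

For a genuinely lognormal sample the fitted exponent drifts as the cutoff moves further into the tail, which is one symptom a tail test can exploit.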

  11. Maximum one-shot dissipated work from Rényi divergences

    NASA Astrophysics Data System (ADS)

    Yunger Halpern, Nicole; Garner, Andrew J. P.; Dahlsten, Oscar C. O.; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.

  12. Maximum one-shot dissipated work from Rényi divergences.

    PubMed

    Yunger Halpern, Nicole; Garner, Andrew J P; Dahlsten, Oscar C O; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
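
The order-infinity Rényi divergence appearing in this result has the closed form D_∞(P‖Q) = log max_i p_i/q_i; a minimal numerical sketch (the two distributions are arbitrary examples):

```python
import numpy as np

def renyi_infinity(p, q):
    """Order-infinity Renyi divergence: D_inf(P||Q) = log max_i p_i / q_i."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                      # zero-probability outcomes do not contribute
    return float(np.log(np.max(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])
d_inf = renyi_infinity(p, q)          # here log(0.5/0.25) = log 2
```

Since D_∞ dominates the ordinary relative entropy, the one-shot (worst-case) dissipated work it quantifies is never below the average dissipated work.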

  13. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves toward such a state, we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations, the specific growth rate of the strain increased continuously, together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state in which the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
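
A minimal sketch of the Boltzmann-weighted mode-usage idea, with hypothetical per-mode entropy production rates standing in for the measured secretion data (the numbers and the scale factor `beta` are illustrative only):

```python
import numpy as np

def boltzmann_usage(scores, beta=1.0):
    """Boltzmann-law usage probabilities over elementary modes (illustrative)."""
    w = np.exp(beta * np.asarray(scores, dtype=float))
    return w / w.sum()

# Hypothetical entropy production rate of each elementary mode:
sigma = np.array([0.5, 1.2, 2.0, 0.8])

usage = boltzmann_usage(sigma)            # usage probability of each mode
predicted_rate = float(usage @ sigma)     # network-level entropy production rate
```

The network-level rate is the usage-weighted average of the per-mode rates, which is the quantity the experiments track as the strain evolves.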

  14. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

Entropy generation in an AMTEC cell represents inherent power loss to the cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is ongoing at AMPS to identify, quantify, and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum-power cell designs. In many cases, the various sources of entropy generation are interrelated, such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between entropy generation mechanisms are quantified and explained, and their implications for cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  15. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007).

    PubMed

    Pueyo, Salvador

    2012-05-01

An increasing number of authors agree that the maximum entropy principle (MaxEnt) is essential for the understanding of macroecological patterns. However, there are subtle but crucial differences among the approaches of several of these authors. This poses a major obstacle for anyone interested in applying the methodology of MaxEnt in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement over the earlier paper by Pueyo et al. (2007) and also over the views of Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach.

  16. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007)

    PubMed Central

    Pueyo, Salvador

    2012-01-01

An increasing number of authors agree that the maximum entropy principle (MaxEnt) is essential for the understanding of macroecological patterns. However, there are subtle but crucial differences among the approaches of several of these authors. This poses a major obstacle for anyone interested in applying the methodology of MaxEnt in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement over the earlier paper by Pueyo et al. (2007) and also over the views of Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach. PMID:22837843

  17. Maximum Tsallis entropy with generalized Gini and Gini mean difference indices constraints

    NASA Astrophysics Data System (ADS)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2017-04-01

Using the maximum entropy principle with Tsallis entropy, some distribution families for modeling income distribution are obtained. By considering income inequality measures, maximum Tsallis entropy distributions under constraints on the generalized Gini and Gini mean difference indices are derived. It is shown that the Tsallis entropy maximizers with the considered constraints belong to the generalized Pareto family.

  18. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as open-source, user-friendly software.

  19. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as open-source, user-friendly software.

  20. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy, or frequency-domain quantities obtained with spectral analysis techniques. The objective is to investigate an alternative approach, entirely different from traditional signal processing: to use the Shannon entropy as a tool for processing sonar signals, with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, or incoherently, depending on the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique is based on the concept of the entropy of a random process. Under a constant energy constraint, the entropy of a received process with a finite number of sample points is maximum when hypothesis H₀ (that the received process consists of noise alone) is true, and decreases when a correlated signal is present (H₁). Therefore, the detection strategy is: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) decide: H₁ is assumed if the difference is large compared to a preassigned threshold, and H₀ is assumed otherwise. The test statistic is the difference between the entropies under H₀ and H₁. We show simulated results for detecting stationary and nonstationary signals in noise, and results on detecting defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
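
A toy version of this entropy detector, assuming Gaussian noise and an arbitrary sinusoidal target; the signal parameters, histogram bin edges, and decision threshold are illustrative, not the paper's:

```python
import numpy as np

def shannon_entropy(x, edges):
    """Shannon entropy (nats) of the empirical amplitude histogram."""
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
n = 4096
t = np.arange(n)
noise = rng.normal(size=n)                              # H0: noise alone
received = noise + 4.0 * np.sin(2 * np.pi * 0.05 * t)   # H1: correlated signal present

edges = np.linspace(-4.0, 4.0, 65)                      # common bins on unit-variance scale
h0 = shannon_entropy(noise / noise.std(), edges)        # maximum-entropy reference (Gaussian)
h1 = shannon_entropy(received / received.std(), edges)
signal_detected = (h0 - h1) > 0.1                       # threshold is illustrative
```

At fixed variance the Gaussian maximizes entropy, so the correlated (bimodal) signal-plus-noise amplitude distribution scores measurably lower, which is exactly the decision statistic described above.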

  1. A maximum entropy thermodynamics of small systems.

    PubMed

    Dixit, Purushottam D

    2013-05-14

    We present a maximum entropy approach to analyze the state space of a small system in contact with a large bath, e.g., a solvated macromolecular system. For the solute, the fluctuations around the mean values of observables are not negligible and the probability distribution P(r) of the state space depends on the intricate details of the interaction of the solute with the solvent. Here, we employ a superstatistical approach: P(r) is expressed as a marginal distribution summed over the variation in β, the inverse temperature of the solute. The joint distribution P(β, r) is estimated by maximizing its entropy. We also calculate the first order system-size corrections to the canonical ensemble description of the state space. We test the development on a simple harmonic oscillator interacting with two baths with very different chemical identities, viz., (a) Lennard-Jones particles and (b) water molecules. In both cases, our method captures the state space of the oscillator sufficiently well. Future directions and connections with traditional statistical mechanics are discussed.
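
The superstatistical construction, P(r) as a β-average of Gaussian state-space densities, can be sketched with an arbitrary discrete P(β); the weights and β values below are illustrative only, whereas the paper obtains P(β, r) by maximizing its entropy:

```python
import numpy as np

# Illustrative discrete distribution over inverse temperatures beta:
betas = np.array([0.5, 1.0, 2.0])
weights = np.array([0.25, 0.5, 0.25])   # P(beta), sums to 1

def p_marginal(r):
    """Superstatistical P(r): Gaussian densities averaged over beta."""
    dens = np.sqrt(betas / (2 * np.pi)) * np.exp(-0.5 * betas * r**2)
    return float(weights @ dens)

beta_mean = float(weights @ betas)

def p_canonical(r):
    """Single-beta (canonical) Gaussian at the mean inverse temperature."""
    return float(np.sqrt(beta_mean / (2 * np.pi)) * np.exp(-0.5 * beta_mean * r**2))
```

The mixture has heavier tails than the canonical density at the mean inverse temperature, reflecting the non-negligible fluctuations the abstract emphasizes for small systems.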

  2. Numerical optimization using flow equations.

    PubMed

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  3. Numerical optimization using flow equations

    NASA Astrophysics Data System (ADS)

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  4. Steepest entropy ascent for two-state systems with slowly varying Hamiltonians

    NASA Astrophysics Data System (ADS)

    Militello, Benedetto

    2018-05-01

    The steepest entropy ascent approach is considered and applied to two-state systems. When the Hamiltonian of the system is time-dependent, the principle of maximum entropy production can still be exploited; arguments to support this fact are given. In the limit of slowly varying Hamiltonians, which allows for the adiabatic approximation for the unitary part of the dynamics, the system exhibits significant robustness to the thermalization process. Specific examples such as a spin in a rotating field and a generic two-state system undergoing an avoided crossing are considered.

  5. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    PubMed

    Haseli, Y

    2016-05-01

The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines in the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output, and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, maximum thermal efficiency, and maximum power may become equivalent under the condition of fixed heat input.
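
The claimed link between lower entropy production and higher efficiency is easy to verify numerically for a Novikov-type endoreversible engine (a textbook model sketched under assumed parameters; the conductance and reservoir temperatures are arbitrary):

```python
import numpy as np

Th, Tc, K = 600.0, 300.0, 1.0           # reservoir temperatures (K) and conductance; illustrative

def novikov(T1):
    """Novikov engine with hot-side working temperature T1 (Tc < T1 < Th)."""
    Qh = K * (Th - T1)                  # heat-transfer rate into the engine
    eta = 1.0 - Tc / T1                 # internally reversible cycle
    sigma = Qh / T1 - Qh / Th           # entropy production rate (hot-side heat transfer)
    power = eta * Qh
    return eta, sigma, power

T1 = np.linspace(Tc + 1.0, Th - 1.0, 500)
eta, sigma, power = np.vectorize(novikov)(T1)
T1_maxpower = float(T1[np.argmax(power)])   # maximum power near sqrt(Th*Tc) (Curzon-Ahlborn)
```

Along the sweep, efficiency rises monotonically exactly as entropy production falls, and the maximum-power point sits at T1 ≈ √(Th·Tc), i.e., at the Curzon-Ahlborn efficiency 1 − √(Tc/Th).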

  6. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about its fractiles or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds, and provide an interpretation for the shape of the fractile-constrained maximum entropy distribution (FMED). We also discuss a drawback of the FMED: it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we know it is continuous, and work through full examples to illustrate the approach.
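
Between assessed fractiles the maximum entropy density is flat, which is exactly the discontinuity discussed above; a sketch with hypothetical elicited fractiles:

```python
import numpy as np

# Hypothetical elicited fractiles: x-values at cumulative probabilities 0, .25, .75, 1.
fractiles = np.array([0.0, 2.0, 5.0, 10.0])
cum_probs = np.array([0.0, 0.25, 0.75, 1.0])

widths = np.diff(fractiles)
masses = np.diff(cum_probs)
density = masses / widths              # piecewise-constant maximum entropy density

def fmed_pdf(x):
    """Density of the fractile-constrained maximum entropy distribution."""
    i = np.searchsorted(fractiles, x, side="right") - 1
    i = np.clip(i, 0, len(density) - 1)
    return density[i]
```

Within each interval the density spreads the interval's probability mass uniformly — the entropy-maximizing choice given only the fractile constraints — so the pdf jumps at every assessed fractile.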

7. In vivo potassium-39 NMR spectra by the Burg maximum-entropy method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Takanori; Minamitani, Haruyuki

The Burg maximum-entropy method was applied to estimate 39K NMR spectra of mung bean root tips. The maximum-entropy spectra show as good a linearity between peak areas and potassium concentrations as those obtained by fast Fourier transform, and give a better estimate of intracellular potassium concentrations. Potassium uptake and loss processes in mung bean root tips are therefore traced more clearly by the maximum-entropy method.
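
Burg's recursion itself is standard and can be sketched compactly; the implementation below is a textbook form (not the authors' code) and the test signal is synthetic:

```python
import numpy as np

def burg_ar(x, order):
    """Burg-recursion AR coefficients and residual power (textbook form)."""
    x = np.asarray(x, dtype=float)
    f, b = x[1:].copy(), x[:-1].copy()       # forward / backward prediction errors
    a = np.array([1.0])
    power = float(np.dot(x, x)) / len(x)
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))   # reflection coefficient
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        power *= 1.0 - k * k
        f, b = (f + k * b)[1:], (b + k * f)[:-1]
    return a, power

def burg_psd(a, power, freqs):
    """Maximum-entropy PSD at normalized frequencies (cycles/sample)."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a))))
    return power / np.abs(z @ a) ** 2

rng = np.random.default_rng(2)
n = 512
t = np.arange(n)
x = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.normal(size=n)

a, power = burg_ar(x, order=8)
freqs = np.linspace(0.0, 0.5, 1024)
psd = burg_psd(a, power, freqs)
peak_freq = float(freqs[np.argmax(psd)])     # spectral line of the sinusoid
```

The all-pole (maximum-entropy) spectrum places a sharp peak at the sinusoid's frequency from a short record, which is the property exploited for the NMR spectra.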

  8. Maximum Entropy Approach in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    PubMed

    Farsani, Zahra Amini; Schmid, Volker J

    2017-01-01

In the estimation of physiological kinetic parameters from Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) data, the determination of the arterial input function (AIF) plays a key role. This paper proposes a Bayesian method to estimate the physiological parameters of DCE-MRI along with the AIF in situations where no measurement of the AIF is available. In the proposed algorithm, the maximum entropy method (MEM) is combined with the maximum a posteriori (MAP) approach. To this end, MEM is used to specify a prior probability distribution of the unknown AIF. The ability of this method to estimate the AIF is validated using the Kullback-Leibler divergence. Subsequently, the kinetic parameters can be estimated with MAP. The proposed algorithm is evaluated with a data set from a breast cancer MRI study. The application shows that the AIF can reliably be determined from the DCE-MRI data using MEM, and the kinetic parameters can be estimated subsequently. The maximum entropy method is a powerful tool for reconstructing images from many types of data and for generating a probability distribution from given information. The proposed method gives an alternative way to assess the input function from the existing data, allows a good fit of the data, and therefore yields a better estimation of the kinetic parameters. In the end, this allows for a more reliable use of DCE-MRI. Schattauer GmbH.

  9. Combining Experiments and Simulations Using the Maximum Entropy Principle

    PubMed Central

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus on how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion of remaining challenges. PMID:24586124

  10. [Maximum entropy model versus remote sensing-based methods for extracting Oncomelania hupensis snail habitats].

    PubMed

    Cong-Cong, Xia; Cheng-Fang, Lu; Si, Li; Tie-Jun, Zhang; Sui-Heng, Lin; Yi, Hu; Ying, Liu; Zhi-Jie, Zhang

    2016-12-02

To explore the use of the maximum entropy model for extracting Oncomelania hupensis snail habitats in the Poyang Lake zone, information on snail habitats and related environmental factors collected in the zone was integrated to set up a maximum entropy based species model and generate a snail habitat distribution map. Two Landsat 7 ETM+ remote sensing images of the Poyang Lake zone, one from the wet season and one from the drought season, were obtained, and the modified normalized difference water index (MNDWI) and normalized difference vegetation index (NDVI) were applied to extract snail habitats. The ROC curve, sensitivity, and specificity were used to assess the results, and the importance of the variables for snail habitats was analyzed using the jackknife approach. The area under the receiver operating characteristic curve (AUC) of the testing data was only 0.56 for the remote sensing-based method, with sensitivity 0.23 and specificity 0.89; the corresponding values for the maximum entropy model were 0.876, 0.89, and 0.74, respectively. The main concentrations of snail habitats in the Poyang Lake zone covered the northeast part of Yongxiu County, the northwest of Yugan County, the southwest of Poyang County, and the middle of Xinjian County. Elevation was the most important environmental variable affecting the distribution of snails, followed by land surface temperature (LST). The maximum entropy model is more reliable and accurate than the remote sensing-based method for extracting snail habitats, which has guiding significance for the relevant departments in carrying out measures to prevent and control high-risk snail habitats.
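
The two spectral indices used as the remote-sensing baseline have standard band-ratio definitions, sketched here (the band reflectance values are placeholders):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def mndwi(green, mir):
    """Modified normalized difference water index: (green - MIR) / (green + MIR)."""
    green, mir = np.asarray(green, dtype=float), np.asarray(mir, dtype=float)
    return (green - mir) / (green + mir)
```

Both indices lie in [-1, 1]; vegetated pixels push NDVI positive, open water pushes MNDWI positive, which is how the seasonal images delimit candidate habitat.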

  11. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Bilsky, A. V.; Lozhkin, V. A.; Markovich, D. M.; Tokarev, M. P.

    2013-04-01

This paper studies a novel approach for reducing the computational complexity of tomographic PIV. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory, while the reconstruction quality remains nearly the same as with SMART. This paper presents a theoretical comparison of the computational performance of MENT, SMART and MART, followed by validation on synthetic particle images. Both the theoretical assessment and the validation on synthetic images demonstrate a significant reduction in computational time. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment; a comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART.
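
MENT itself is not reproduced here, but the flavor of multiplicative algebraic reconstruction can be conveyed by a toy MART iteration on a tiny consistent system (not the authors' algorithm; matrix and data are arbitrary):

```python
import numpy as np

def mart(A, b, iters=200, x0=None):
    """Toy multiplicative ART for a nonnegative consistent system Ax = b."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    x = np.ones(A.shape[1]) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        for i in range(len(b)):
            proj = A[i] @ x                       # current "projection" for ray i
            if proj > 0:
                x *= (b[i] / proj) ** A[i]        # multiplicative row update
    return x

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true
x_rec = mart(A, b)
```

Multiplicative updates keep every voxel nonnegative and, for consistent data, drive the reconstruction to the entropy-maximizing feasible solution — the same family of ideas MENT optimizes for speed and memory.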

  12. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.
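
The point about avoiding Stirling at small occupation numbers is easy to demonstrate: the log-gamma function gives ln x! essentially exactly for all x, while the leading-order Stirling approximation fails badly when x is small (the comparison values below are illustrative):

```python
import math

def ln_factorial_exact(n):
    """ln n! via the log-gamma function, valid for all n >= 0."""
    return math.lgamma(n + 1)

def ln_factorial_stirling(n):
    """Leading-order Stirling approximation n*ln(n) - n (invalid for small n)."""
    return n * math.log(n) - n if n > 0 else 0.0

# Small occupation number: Stirling is off by more than the value itself.
err_small = abs(ln_factorial_exact(2) - ln_factorial_stirling(2))

# Large argument: relative error becomes negligible.
err_large = abs(ln_factorial_exact(10**6) - ln_factorial_stirling(10**6)) / ln_factorial_exact(10**6)
```

This is why an analysis that must handle few-particle phase-space cells cannot rely on Stirling and needs an accurate smooth approximation to ln x! instead.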

  13. Nonequilibrium-thermodynamics approach to open quantum systems

    NASA Astrophysics Data System (ADS)

    Semin, Vitalii; Petruccione, Francesco

    2014-11-01

Open quantum systems are studied from the thermodynamic point of view, unifying the principle of maximum informational entropy and the hypothesis of a hierarchy of relaxation times. The result of the unification is a non-Markovian and local-in-time master equation that provides a direct connection between the dynamical and thermodynamical properties of open quantum systems. The power of the approach is illustrated by application to the damped harmonic oscillator and the damped driven two-level system, resulting in analytical expressions for the non-Markovian and nonequilibrium entropy and inverse temperature.

  14. A homotopy algorithm for synthesizing robust controllers for flexible structures via the maximum entropy design equations

    NASA Technical Reports Server (NTRS)

    Collins, Emmanuel G., Jr.; Richter, Stephen

    1990-01-01

One well-known deficiency of LQG compensators is that they do not guarantee any measure of robustness. This deficiency is especially highlighted when considering control design for complex systems such as flexible structures. There has thus been a need to generalize LQG theory to incorporate robustness constraints. Here we describe the maximum entropy approach to robust control design for flexible structures, a generalization of LQG theory pioneered by Hyland that has proved useful in practice. The design equations consist of a set of coupled Riccati and Lyapunov equations. A homotopy algorithm that is used to solve these design equations is presented.

  15. Nonequilibrium Thermodynamics in Biological Systems

    NASA Astrophysics Data System (ADS)

    Aoki, I.

    2005-12-01

    1. Respiration Oxygen-uptake by respiration in organisms decomposes macromolecules such as carbohydrate, protein and lipid and liberates chemical energy of high quality, which is then used to chemical reactions and motions of matter in organisms to support lively order in structure and function in organisms. Finally, this chemical energy becomes heat energy of low quality and is discarded to the outside (dissipation function). Accompanying this heat energy, entropy production which inevitably occurs by irreversibility also is discarded to the outside. Dissipation function and entropy production are estimated from data of respiration. 2. Human body From the observed data of respiration (oxygen absorption), the entropy production in human body can be estimated. Entropy production from 0 to 75 years old human has been obtained, and extrapolated to fertilized egg (beginning of human life) and to 120 years old (maximum period of human life). Entropy production show characteristic behavior in human life span : early rapid increase in short growing phase and later slow decrease in long aging phase. It is proposed that this tendency is ubiquitous and constitutes a Principle of Organization in complex biotic systems. 3. Ecological communities From the data of respiration of eighteen aquatic communities, specific (i.e. per biomass) entropy productions are obtained. They show two phase character with respect to trophic diversity : early increase and later decrease with the increase of trophic diversity. The trophic diversity in these aquatic ecosystems is shown to be positively correlated with the degree of eutrophication, and the degree of eutrophication is an "arrow of time" in the hierarchy of aquatic ecosystems. Hence specific entropy production has the two phase: early increase and later decrease with time. 4. Entropy principle for living systems The Second Law of Thermodynamics has been expressed as follows. 
1) In isolated systems, entropy increases with time and approaches a maximum value. This is the well-known classical Clausius principle. 2) In open systems near equilibrium, entropy production always decreases with time, approaching a minimum stationary level. This is the minimum entropy production principle of Prigogine. These two principles are well established. However, living systems are neither isolated nor near equilibrium, so neither principle can be applied to them. What, then, is the entropy principle for living systems? Answer: entropy production in living systems consists of multiple stages in time: early increasing, later decreasing, and/or intermediate stages. This tendency is supported by various living systems.

  16. Entropy and equilibrium via games of complexity

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2004-09-01

It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
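
    Both generalized entropies named in the abstract have simple closed forms. As a minimal sketch (the distribution and parameter values below are arbitrary illustrations), both reduce to Boltzmann-Gibbs-Shannon entropy in the appropriate limits:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy; reduces to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def kaniadakis_entropy(p, kappa):
    """Kaniadakis kappa-entropy; reduces to Shannon entropy as kappa -> 0."""
    p = np.asarray(p, dtype=float)
    if abs(kappa) < 1e-12:
        return -np.sum(p * np.log(p))
    ln_k = (p ** kappa - p ** (-kappa)) / (2.0 * kappa)  # kappa-logarithm
    return -np.sum(p * ln_k)

p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log(p))
# Both generalized entropies approach the Shannon value in their limits
print(tsallis_entropy(p, 1.0001), kaniadakis_entropy(p, 1e-4), shannon)
```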

  17. Mixed memory, (non) Hurst effect, and maximum entropy of rainfall in the tropical Andes

    NASA Astrophysics Data System (ADS)

    Poveda, Germán

    2011-02-01

Diverse linear and nonlinear statistical parameters of rainfall under aggregation in time and the kind of temporal memory are investigated. Data sets from the Andes of Colombia at different resolutions (15 min and 1 h) and record lengths (21 months and 8-40 years) are used. A mixture of two timescales is found in the autocorrelation and autoinformation functions, with short-term memory holding for time lags less than 15-30 min, and long-term memory onwards. Consistently, rainfall variance exhibits different temporal scaling regimes separated at 15-30 min and 24 h. Tests for the Hurst effect evidence the frailty of the R/S approach in discerning the kind of memory in high-resolution rainfall, whereas rigorous statistical tests for short-memory processes do reject the existence of the Hurst effect. Rainfall information entropy grows as a power law of aggregation time, S(T) ~ T^β with ⟨β⟩ = 0.51, up to a timescale T_MaxEnt (70-202 h) at which entropy saturates, with β = 0 onwards. Maximum entropy is reached through a dynamic Generalized Pareto distribution, consistently with the maximum information-entropy principle for heavy-tailed random variables, and with its asymptotically infinitely divisible property. The dynamics towards the limit distribution is quantified. Tsallis q-entropies also exhibit power laws with T, such that S_q(T) ~ T^β(q), with β(q) ⩽ 0 for q ⩽ 0, and β(q) ≃ 0.5 for q ⩾ 1. No clear patterns are found in the geographic distribution within and among the statistical parameters studied, confirming the strong variability of tropical Andean rainfall.
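
    The power-law growth of entropy under aggregation can be illustrated with a sketch of the procedure. The synthetic i.i.d. series below is an assumption for illustration only; it will not reproduce the observed ⟨β⟩ = 0.51 or the saturation at T_MaxEnt, but it shows how β is estimated from aggregated records:

```python
import numpy as np

rng = np.random.default_rng(0)
rain = rng.exponential(scale=1.0, size=2**14)   # synthetic 15-min "intensities"
edges = np.arange(0.0, 65.0)                    # fixed unit-width histogram bins

def shannon_entropy(x):
    """Plug-in Shannon entropy (nats) of a fixed-bin histogram of x."""
    hist, _ = np.histogram(x, bins=edges)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log(p))

# Aggregate over windows of T base intervals and record S(T)
scales = np.array([1, 2, 4, 8, 16, 32])
S = np.array([shannon_entropy(rain[: len(rain) // T * T].reshape(-1, T).sum(axis=1))
              for T in scales])

# Estimate beta as the slope of log S(T) against log T
beta = np.polyfit(np.log(scales), np.log(S), 1)[0]
print(S, beta)
```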

  18. Prediction of Metabolite Concentrations, Rate Constants and Post-Translational Regulation Using Maximum Entropy-Based Simulations with Application to Central Metabolism of Neurospora crassa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, William; Zucker, Jeremy; Baxter, Douglas

We report the application of a recently proposed approach for modeling biological systems using a maximum entropy production rate principle in lieu of having in vivo rate constants. The method is applied in four steps: (1) a new ODE-based optimization approach based on Marcelin's 1910 mass action equation is used to obtain the maximum entropy distribution, (2) the predicted metabolite concentrations are compared to those generally expected from experiment using a loss function from which post-translational regulation of enzymes is inferred, (3) the system is re-optimized with the inferred regulation, from which rate constants are determined from the metabolite concentrations and reaction fluxes, and finally (4) a full ODE-based, mass action simulation with rate parameters and allosteric regulation is obtained. From the last step, the power characteristics and resistance of each reaction can be determined. The method is applied to the central metabolism of Neurospora crassa, and the flow of material through the three competing pathways of upper glycolysis, the non-oxidative pentose phosphate pathway, and the oxidative pentose phosphate pathway is evaluated as a function of the NADP/NADPH ratio. It is predicted that regulation of phosphofructokinase (PFK) and flow through the pentose phosphate pathway are essential for preventing an extreme level of fructose 1,6-bisphosphate accumulation. Such an extreme level of fructose 1,6-bisphosphate would otherwise result in a glassy cytoplasm with limited diffusion, dramatically decreasing the entropy and energy production rate and, consequently, biological competitiveness.

  19. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.

  20. A graphic approach to include dissipative-like effects in reversible thermal cycles

    NASA Astrophysics Data System (ADS)

    Gonzalez-Ayala, Julian; Arias-Hernandez, Luis Antonio; Angulo-Brown, Fernando

    2017-05-01

Since the 1980s, a connection between a family of maximum-work reversible thermal cycles and maximum-power finite-time endoreversible cycles has been established. The endoreversible cycles produce entropy at their couplings with the external heat baths, so cycles of this kind can be optimized under criteria of merit that involve entropy production terms. While the relation between the concepts of work and power is quite direct, the finite-time objective functions involving entropy production apparently have no reversible counterparts. In the present paper we show that it is also possible to establish a connection between irreversible cycle models and reversible ones by means of the concept of "geometric dissipation", which has to do with the equivalent roles played by a deficit of areas between some reversible cycles and the Carnot cycle, on the one hand, and actual dissipative terms in a Curzon-Ahlborn engine, on the other.

  1. Quantum Rényi relative entropies affirm universality of thermodynamics.

    PubMed

    Misra, Avijit; Singh, Uttam; Bera, Manabendra Nath; Rajagopal, A K

    2015-10-01

We formulate a complete theory of quantum thermodynamics in the Rényi entropic formalism exploiting the Rényi relative entropies, starting from the maximum entropy principle. In establishing the first and second laws of quantum thermodynamics, we have correctly identified accessible work and heat exchange in both equilibrium and nonequilibrium cases. The free energy (internal energy minus temperature times entropy) remains unaltered when all the entities entering this relation are suitably defined. Exploiting Rényi relative entropies, we have shown that this "form invariance" holds even beyond equilibrium and has profound operational significance in isothermal processes. These results reduce to the Gibbs-von Neumann results when the Rényi entropic parameter α approaches 1. Moreover, it is shown that the universality of the Carnot statement of the second law is a consequence of the form invariance of the free energy, which is in turn a consequence of the maximum entropy principle. Further, the Clausius inequality, which is the precursor to the Carnot statement, is also shown to hold based on the data processing inequalities for the traditional and sandwiched Rényi relative entropies. Thus, we find that the thermodynamics of the nonequilibrium state and its deviation from equilibrium together determine the thermodynamic laws. This is another important manifestation of the concepts of information theory in thermodynamics when they are extended to the quantum realm. Our work is a substantial step towards formulating a complete theory of quantum thermodynamics and the corresponding resource theory.

  2. Single-particle spectral density of the unitary Fermi gas: Novel approach based on the operator product expansion, sum rules and the maximum entropy method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gubler, Philipp, E-mail: pgubler@riken.jp; RIKEN Nishina Center, Wako, Saitama 351-0198; Yamamoto, Naoki

    2015-05-15

    Making use of the operator product expansion, we derive a general class of sum rules for the imaginary part of the single-particle self-energy of the unitary Fermi gas. The sum rules are analyzed numerically with the help of the maximum entropy method, which allows us to extract the single-particle spectral density as a function of both energy and momentum. These spectral densities contain basic information on the properties of the unitary Fermi gas, such as the dispersion relation and the superfluid pairing gap, for which we obtain reasonable agreement with the available results based on quantum Monte-Carlo simulations.

  3. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
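
    The probability assignment step can be sketched for the simplest case: discrete outcomes with a single expectation-value constraint, for which the maximum entropy method yields a Gibbs/Boltzmann-type distribution. The energies and constraint value below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])  # energies of four states (illustrative)
E_target = 1.2                       # constraint on the expected energy

def gibbs(beta):
    """Maximum entropy distribution under a mean-energy constraint."""
    w = np.exp(-beta * E)
    return w / w.sum()

# Solve <E> = E_target for the Lagrange multiplier beta
beta = brentq(lambda b: gibbs(b) @ E - E_target, -50.0, 50.0)
p = gibbs(beta)
print(beta, p, p @ E)
```

Because the target mean lies below the uniform-distribution mean, the solved multiplier is positive and the probabilities decrease with energy.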

  4. The calculation of transport properties in quantum liquids using the maximum entropy numerical analytic continuation method: Application to liquid para-hydrogen

    PubMed Central

    Rabani, Eran; Reichman, David R.; Krilov, Goran; Berne, Bruce J.

    2002-01-01

    We present a method based on augmenting an exact relation between a frequency-dependent diffusion constant and the imaginary time velocity autocorrelation function, combined with the maximum entropy numerical analytic continuation approach to study transport properties in quantum liquids. The method is applied to the case of liquid para-hydrogen at two thermodynamic state points: a liquid near the triple point and a high-temperature liquid. Good agreement for the self-diffusion constant and for the real-time velocity autocorrelation function is obtained in comparison to experimental measurements and other theoretical predictions. Improvement of the methodology and future applications are discussed. PMID:11830656

  5. Maximum Entropy Production As a Framework for Understanding How Living Systems Evolve, Organize and Function

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.

    2014-12-01

The maximum entropy production (MEP) principle holds that nonequilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information, acquired via evolution and curated by natural selection, in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and to orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.

  6. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  7. Maximum-entropy probability distributions under Lp-norm constraints

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1991-01-01

Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
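
    The p = 2 case can be checked directly: among continuous densities with a given L_2 norm (second moment, zero mean), the Gaussian attains the maximum differential entropy h = 0.5 ln(2πeσ²). A small numerical sketch (the grid and the value of σ are arbitrary choices) compares it with a Laplace density of the same norm:

```python
import numpy as np

sigma = 1.7                                # shared L_2 norm (zero-mean case)
x = np.linspace(-60.0, 60.0, 400001)
dx = x[1] - x[0]

def diff_entropy(pdf):
    """Differential entropy -integral p ln p dx on a uniform grid."""
    p = pdf(x)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * dx

gauss = lambda t: np.exp(-t**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
b = sigma / np.sqrt(2)                     # Laplace scale with equal variance
laplace = lambda t: np.exp(-np.abs(t) / b) / (2 * b)

h_gauss, h_lap = diff_entropy(gauss), diff_entropy(laplace)
# Closed form for the maximizer: h = 0.5 * ln(2*pi*e*sigma^2)
print(h_gauss, 0.5 * np.log(2 * np.pi * np.e * sigma**2), h_lap)
```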

  8. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we discuss briefly maximum entropy production principle and raises two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics and (ii) is it possible to 'prove' the principle? We adduce one more proof which is most concise today.

  9. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

performing. All reasonable permutations of factors will be used to develop a multitude of unique combinations. These combinations are considered different... are seen below (Duda et al., 2001). Entropy impurity: i(N) = −∑_j P(ω_j) log₂ P(ω_j) (9). Gini impurity: i(N) = ∑_{i≠j} P(ω_i)P(ω_j) = ½[1 − ∑_j P²(ω_j)] (10). ... As the proportion of one class to another approaches 0.5, the impurity measure reaches its maximum, which for entropy is 1.0, while it is 0.5 for Gini.
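
    The two impurity measures cited from Duda et al. are easy to reproduce. The sketch below uses the un-halved Gini form 1 − ∑_j P²(ω_j), which matches the excerpt's stated two-class maximum of 0.5:

```python
import numpy as np

def entropy_impurity(p):
    """Entropy impurity i(N) = -sum_j P(w_j) * log2 P(w_j), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def gini_impurity(p):
    """Gini impurity, un-halved form 1 - sum_j P(w_j)^2."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

# Two equiprobable classes: entropy peaks at 1.0 bit, Gini at 0.5
print(entropy_impurity([0.5, 0.5]), gini_impurity([0.5, 0.5]))
print(entropy_impurity([0.9, 0.1]), gini_impurity([0.9, 0.1]))
```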

  10. Nonadditive entropy maximization is inconsistent with Bayesian updating

    NASA Astrophysics Data System (ADS)

    Pressé, Steve

    2014-11-01

    The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  11. Nonadditive entropy maximization is inconsistent with Bayesian updating.

    PubMed

    Pressé, Steve

    2014-11-01

    The maximum entropy method-used to infer probabilistic models from data-is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  12. DEM interpolation weight calculation modulus based on maximum entropy

    NASA Astrophysics Data System (ADS)

    Chen, Tian-wei; Yang, Xia

    2015-12-01

Negative weights arise in traditional interpolation of gridded DEMs. In this article, the principle of maximum entropy is utilized to analyze the model system, which depends on the modulus of the spatial weights. The negative-weight problem of DEM interpolation is investigated by building a maximum entropy model; by adding nonnegativity and first- and second-order moment constraints, the negative-weight problem is solved. The correctness and accuracy of the method were validated with a genetic algorithm implemented in MATLAB. The method is compared with the Yang Chizhong interpolation method and with quadratic programming. The comparison shows that the magnitude and scaling of the maximum entropy weights fit the spatial relations, and the accuracy is superior to that of the latter two methods.
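
    The construction can be sketched numerically: maximize the entropy of the interpolation weights subject to normalization and first-order moment (coordinate-reproducing) constraints, with nonnegativity enforced. The node layout and target point below are hypothetical, and the solver choice (SLSQP) is an assumption, not the authors' genetic-algorithm implementation:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical layout: four grid nodes around a target point. The weights must
# sum to one and reproduce the target coordinates (first-order moments);
# maximizing entropy selects nonnegative weights among the feasible solutions.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
target = np.array([0.3, 0.6])

def neg_entropy(w):
    w = np.clip(w, 1e-12, None)
    return np.sum(w * np.log(w))

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},
    {"type": "eq", "fun": lambda w: w @ nodes - target},
]
res = minimize(neg_entropy, np.full(4, 0.25), bounds=[(0.0, 1.0)] * 4,
               constraints=constraints, method="SLSQP")
w = res.x
print(w, w @ nodes)   # nonnegative weights that reproduce the target point
```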

  13. Monitoring of Time-Dependent System Profiles by Multiplex Gas Chromatography with Maximum Entropy Demodulation

    NASA Technical Reports Server (NTRS)

    Becker, Joseph F.; Valentin, Jose

    1996-01-01

The maximum entropy technique was successfully applied to the deconvolution of overlapped chromatographic peaks. An algorithm was written in which the chromatogram was represented as a vector of sample concentrations multiplied by a peak shape matrix. Simulation results demonstrated that there is a trade-off between detector noise and peak resolution, in the sense that an increase in the noise level reduced the peak separation that could be recovered by the maximum entropy method. Real data originating from a sample storage column were also deconvoluted using maximum entropy. Deconvolution is useful in this type of system because the conservation of time-dependent profiles depends on the band-spreading processes in the chromatographic column, which might smooth out the finer details in the concentration profile. The method was also applied to the deconvolution of previously interpreted Pioneer Venus chromatograms. It was found in this case that the correct choice of peak shape function was critical to the sensitivity of maximum entropy in the reconstruction of these chromatograms.
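
    The model described here (a chromatogram represented as a peak shape matrix multiplied by a vector of sample concentrations) can be sketched with a simple entropy-regularized least-squares recovery. All numbers (peak width, noise level, regularization weight) are assumptions, and the generic optimizer stands in for the authors' algorithm:

```python
import numpy as np
from scipy.optimize import minimize

n = 80
t = np.arange(n, dtype=float)
G = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 3.0) ** 2)  # peak shape, sigma = 3

c_true = np.zeros(n)
c_true[30], c_true[38] = 1.0, 0.6                # two overlapping injections
rng = np.random.default_rng(1)
y = G @ c_true + rng.normal(0.0, 0.01, n)        # detector noise

lam = 1e-3  # entropy regularization weight (tuned by hand)

def objective(c):
    c = np.clip(c, 1e-10, None)
    chi2 = np.sum((G @ c - y) ** 2)              # misfit to the data
    S = -np.sum(c * np.log(c / c.sum()))         # entropy of the profile
    return 0.5 * chi2 - lam * S

res = minimize(objective, np.full(n, 0.1), bounds=[(0.0, None)] * n,
               method="L-BFGS-B")
c_hat = res.x
chi2 = np.sum((G @ c_hat - y) ** 2)
print(chi2, c_hat.sum())   # fit residual and recovered total mass
```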

  14. Quantitative LC-MS of polymers: determining accurate molecular weight distributions by combined size exclusion chromatography and electrospray mass spectrometry with maximum entropy data processing.

    PubMed

    Gruendling, Till; Guilhaus, Michael; Barner-Kowollik, Christopher

    2008-09-15

    We report on the successful application of size exclusion chromatography (SEC) combined with electrospray ionization mass spectrometry (ESI-MS) and refractive index (RI) detection for the determination of accurate molecular weight distributions of synthetic polymers, corrected for chromatographic band broadening. The presented method makes use of the ability of ESI-MS to accurately depict the peak profiles and retention volumes of individual oligomers eluting from the SEC column, whereas quantitative information on the absolute concentration of oligomers is obtained from the RI-detector only. A sophisticated computational algorithm based on the maximum entropy principle is used to process the data gained by both detectors, yielding an accurate molecular weight distribution, corrected for chromatographic band broadening. Poly(methyl methacrylate) standards with molecular weights up to 10 kDa serve as model compounds. Molecular weight distributions (MWDs) obtained by the maximum entropy procedure are compared to MWDs, which were calculated by a conventional calibration of the SEC-retention time axis with peak retention data obtained from the mass spectrometer. Comparison showed that for the employed chromatographic system, distributions below 7 kDa were only weakly influenced by chromatographic band broadening. However, the maximum entropy algorithm could successfully correct the MWD of a 10 kDa standard for band broadening effects. Molecular weight averages were between 5 and 14% lower than the manufacturer stated data obtained by classical means of calibration. The presented method demonstrates a consistent approach for analyzing data obtained by coupling mass spectrometric detectors and concentration sensitive detectors to polymer liquid chromatography.

  15. Predicting protein β-sheet contacts using a maximum entropy-based correlated mutation measure.

    PubMed

    Burkoff, Nikolas S; Várnai, Csilla; Wild, David L

    2013-03-01

    The problem of ab initio protein folding is one of the most difficult in modern computational biology. The prediction of residue contacts within a protein provides a more tractable immediate step. Recently introduced maximum entropy-based correlated mutation measures (CMMs), such as direct information, have been successful in predicting residue contacts. However, most correlated mutation studies focus on proteins that have large good-quality multiple sequence alignments (MSA) because the power of correlated mutation analysis falls as the size of the MSA decreases. However, even with small autogenerated MSAs, maximum entropy-based CMMs contain information. To make use of this information, in this article, we focus not on general residue contacts but contacts between residues in β-sheets. The strong constraints and prior knowledge associated with β-contacts are ideally suited for prediction using a method that incorporates an often noisy CMM. Using contrastive divergence, a statistical machine learning technique, we have calculated a maximum entropy-based CMM. We have integrated this measure with a new probabilistic model for β-contact prediction, which is used to predict both residue- and strand-level contacts. Using our model on a standard non-redundant dataset, we significantly outperform a 2D recurrent neural network architecture, achieving a 5% improvement in true positives at the 5% false-positive rate at the residue level. At the strand level, our approach is competitive with the state-of-the-art single methods achieving precision of 61.0% and recall of 55.4%, while not requiring residue solvent accessibility as an input. http://www2.warwick.ac.uk/fac/sci/systemsbiology/research/software/

  16. Statistical theory on the analytical form of cloud particle size distributions

    NASA Astrophysics Data System (ADS)

    Wu, Wei; McFarquhar, Greg

    2017-11-01

Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from a systems perspective is examined here. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of the concept of relative entropy to avoid these problems is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions on power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g., bulk water mass). DOE ASR.

  17. 16QAM Blind Equalization via Maximum Entropy Density Approximation Technique and Nonlinear Lagrange Multipliers

    PubMed Central

    Mauda, R.; Pinchas, M.

    2014-01-01

    Recently a new blind equalization method was proposed for the 16QAM constellation input inspired by the maximum entropy density approximation technique with improved equalization performance compared to the maximum entropy approach, Godard's algorithm, and others. In addition, an approximated expression for the minimum mean square error (MSE) was obtained. The idea was to find those Lagrange multipliers that bring the approximated MSE to minimum. Since the derivation of the obtained MSE with respect to the Lagrange multipliers leads to a nonlinear equation for the Lagrange multipliers, the part in the MSE expression that caused the nonlinearity in the equation for the Lagrange multipliers was ignored. Thus, the obtained Lagrange multipliers were not those Lagrange multipliers that bring the approximated MSE to minimum. In this paper, we derive a new set of Lagrange multipliers based on the nonlinear expression for the Lagrange multipliers obtained from minimizing the approximated MSE with respect to the Lagrange multipliers. Simulation results indicate that for the high signal to noise ratio (SNR) case, a faster convergence rate is obtained for a channel causing a high initial intersymbol interference (ISI) while the same equalization performance is obtained for an easy channel (initial ISI low). PMID:24723813

  18. Measuring Questions: Relevance and its Relation to Entropy

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

The Boolean lattice of logical statements induces the free distributive lattice of questions. Inclusion on this lattice is based on whether one question answers another. Generalizing the zeta function of the question lattice leads to a valuation called relevance or bearing, which is a measure of the degree to which one question answers another. Richard Cox conjectured that this degree can be expressed as a generalized entropy. With the assistance of yet another important result from János Aczél, I show that this is indeed the case, and that the resulting inquiry calculus is a natural generalization of information theory. This approach provides a new perspective on the Principle of Maximum Entropy.

  19. Towards operational interpretations of generalized entropies

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2010-12-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  20. Entropic criterion for model selection

    NASA Astrophysics Data System (ADS)

    Tseng, Chih-Yuan

    2006-10-01

    Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet this raises two questions: why use this criterion, and are there any others? Moreover, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [Relative entropy and inductive inference, in: G. Erickson, Y. Zhai (Eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings, vol. 707, 2004 (available from arXiv.org/abs/physics/0311093)], we show relative entropy to be a unique criterion, which requires no prior information and can be applied to different fields. We examine this criterion on a physical problem, simple fluids, and the results are promising.
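    The ranking idea in this abstract can be illustrated with a toy calculation; the distributions below are hypothetical, not from the paper. Candidate models are ordered by increasing relative entropy to an empirical distribution:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i log(p_i / q_i); smaller means q is closer to p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical empirical distribution and two candidate models.
p_data = [0.1, 0.4, 0.5]
models = {"model_a": [0.2, 0.3, 0.5],
          "model_b": [1 / 3, 1 / 3, 1 / 3]}

# Rank models by increasing relative entropy to the data distribution.
ranked = sorted(models, key=lambda name: kl_divergence(p_data, models[name]))
```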

  1. Entropy-based goodness-of-fit test: Application to the Pareto distribution

    NASA Astrophysics Data System (ADS)

    Lequesne, Justine

    2013-08-01

    Goodness-of-fit tests based on entropy have been introduced in [13] for testing normality. The maximum entropy distribution in a class of probability distributions defined by linear constraints induces a Pythagorean equality between the Kullback-Leibler information and an entropy difference. This allows one to propose a goodness-of-fit test for maximum entropy parametric distributions which is based on the Kullback-Leibler information. We will focus on the application of the method to the Pareto distribution. The power of the proposed test is computed through Monte Carlo simulation.

  2. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.

  3. Maximum entropy method applied to deblurring images on a MasPar MP-1 computer

    NASA Technical Reports Server (NTRS)

    Bonavito, N. L.; Dorband, John; Busse, Tim

    1991-01-01

    A statistical inference method based on the principle of maximum entropy is developed for the purpose of enhancing and restoring satellite images. The proposed maximum entropy image restoration method is shown to overcome the difficulties associated with image restoration and provide the smoothest and most appropriate solution consistent with the measured data. An implementation of the method on the MP-1 computer is described, and results of tests on simulated data are presented.

  4. Bayesian Maximum Entropy Integration of Ozone Observations and Model Predictions: A National Application.

    PubMed

    Xu, Yadong; Serre, Marc L; Reyes, Jeanette; Vizuete, William

    2016-04-19

    To improve ozone exposure estimates for ambient concentrations at a national scale, we introduce our novel Regionalized Air Quality Model Performance (RAMP) approach to integrate chemical transport model (CTM) predictions with the available ozone observations using the Bayesian Maximum Entropy (BME) framework. The framework models the nonlinear and nonhomoscedastic relation between air pollution observations and CTM predictions and for the first time accounts for variability in CTM model performance. A validation analysis using only noncollocated data outside of a validation radius rv was performed, and the R² between observations and re-estimated values for two daily metrics, the daily maximum 8-h average (DM8A) and the daily 24-h average (D24A) ozone concentrations, was obtained for the OBS scenario (using ozone observations only) in contrast with the RAMP and Constant Air Quality Model Performance (CAMP) scenarios. We show that, by accounting for the spatial and temporal variability in model performance, our novel RAMP approach is able to extract more information from CTM predictions than the CAMP approach, which assumes that model performance does not change across space and time: the R² increases by over 12 times for the DM8A and over 3.5 times for the D24A ozone concentrations.
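    The validation comparison described above reduces to computing the coefficient of determination between observations and re-estimated values under each scenario. A minimal sketch with purely hypothetical ozone values (the scenario names follow the abstract; the numbers are illustrative only):

```python
def r_squared(obs, est):
    """Coefficient of determination between observations and re-estimated values."""
    mean = sum(obs) / len(obs)
    ss_tot = sum((o - mean) ** 2 for o in obs)
    ss_res = sum((o - e) ** 2 for o, e in zip(obs, est))
    return 1.0 - ss_res / ss_tot

obs = [40.0, 55.0, 60.0, 72.0]            # hypothetical ozone observations (ppb)
est_obs_only = [45.0, 50.0, 58.0, 65.0]   # stand-in for the OBS scenario
est_ramp = [41.0, 54.0, 61.0, 70.0]       # stand-in for the RAMP scenario
```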

  5. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    PubMed

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America
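    The canonical form described in this abstract, a probability proportional to exp(−βE) with marginals obtained by summing over the remaining parameters, can be sketched on a toy parameter grid. The error function and β below are illustrative assumptions, not the paper's geoacoustic model:

```python
import math

# Toy 2-parameter grid; energy() is a hypothetical stand-in for the error function.
a_vals = [0.0, 0.5, 1.0]
b_vals = [0.0, 1.0]
beta = 2.0  # sensitivity factor to the error function

def energy(a, b):
    return (a - 0.5) ** 2 + (b - 1.0) ** 2  # hypothetical misfit surface

# Canonical (maximum entropy) joint distribution p(a, b) proportional to exp(-beta * E).
weights = {(a, b): math.exp(-beta * energy(a, b)) for a in a_vals for b in b_vals}
Z = sum(weights.values())
joint = {k: w / Z for k, w in weights.items()}

# Marginal for a single parameter: sum (integrate) over the other parameter.
marginal_a = {a: sum(joint[(a, b)] for b in b_vals) for a in a_vals}
```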

  6. Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways

    PubMed Central

    Galinsky, Vitaly L.; Frank, Lawrence R.

    2015-01-01

    We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes guided by a global structure of the entropy spectrum coupled with a small scale local diffusion. The intervoxel diffusion is sampled by multi b-shell multi q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in the most scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167
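    The "simple eigenvector problem" underlying entropy spectrum pathways can be illustrated with a maximal-entropy random-walk construction on a toy coupling matrix: the dominant eigenpair of a symmetric coupling matrix Q yields transition probabilities P_ij = Q_ij ψ_j / (λ ψ_i). The matrix below is an illustrative assumption, not the authors' diffusion data:

```python
import numpy as np

# Toy symmetric coupling (adjacency) matrix between 4 locations.
Q = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Dominant eigenpair: eigh returns eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(Q)
lam = eigvals[-1]
psi = np.abs(eigvecs[:, -1])  # Perron vector of a connected graph can be taken positive

# Maximum-entropy transition probabilities P_ij = Q_ij * psi_j / (lam * psi_i);
# rows sum to 1 because Q @ psi = lam * psi.
P = Q * psi[None, :] / (lam * psi[:, None])
```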

  7. A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano

    In self-organization, energy gradients across complex systems lead to changes in the structure of systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action. It is coupled to the total energy flowing through a system, which leads to an increase in action efficiency. We compare energy transport through a fluid cell which has random motion of its molecules, and a cell which can form convection cells. We examine the signs of the change of entropy, and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission, compared to random motion. For more complex systems, those convection cells form a network of transport channels, for the purpose of obeying the equations of motion in this geometry. Those transport networks are an essential feature of complex systems in biology, ecology, economy and society.

  8. Modelling the spreading rate of controlled communicable epidemics through an entropy-based thermodynamic model

    NASA Astrophysics Data System (ADS)

    Wang, WenBin; Wu, ZiNiu; Wang, ChunFeng; Hu, RuiFeng

    2013-11-01

    A model based on a thermodynamic approach is proposed for predicting the dynamics of communicable epidemics assumed to be governed by controlling efforts of multiple scales, so that an entropy is associated with the system. All the epidemic details are factored into a single, time-dependent coefficient whose functional form is found through four constraints, notably including the existence of an inflexion point and a maximum. The model is solved to give a log-normal distribution for the spread rate, for which a Shannon entropy can be defined. The only parameter, which characterizes the width of the distribution function, is uniquely determined through maximizing the rate of entropy production. This entropy-based thermodynamic (EBT) model predicts the number of hospitalized cases with reasonable accuracy for SARS in the year 2003. The EBT model can be of use for potential epidemics such as avian influenza and H7N9 in China.

  9. Online Robot Dead Reckoning Localization Using Maximum Relative Entropy Optimization With Model Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urniezius, Renaldas

    2011-03-14

    The principle of Maximum relative Entropy optimization was analyzed for dead reckoning localization of a rigid body when observation data from two attached accelerometers were collected. Model constraints were derived from the relationships between the sensors. The experimental results confirmed that the noise on each accelerometer axis can be successfully filtered by utilizing the dependency between channels and the dependency between time-series data. Dependency between channels was used for the a priori calculation, and the a posteriori distribution was derived utilizing the dependency between time-series data. Data from an autocalibration experiment were revisited by removing the initial assumption that the instantaneous rotation axis of a rigid body was known. Performance results confirmed that such an approach could be used for online dead reckoning localization.

  10. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
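    The surrogate-ensemble idea, preserving chosen statistics while remaining otherwise maximally unstructured, can be sketched for the simplest (first-order) case: shuffling each neuron's activity independently across trials preserves firing rates but destroys correlations. The raster below is a purely illustrative toy, not neural data:

```python
import random

random.seed(0)

# Toy binary population raster: trials x neurons (1 = spike in a time bin).
raster = [[1, 0, 1],
          [1, 1, 0],
          [0, 1, 1],
          [1, 0, 0]]

def surrogate(r):
    """First-order MaxEnt surrogate: shuffle each neuron's column independently,
    preserving per-neuron firing rates while destroying pairwise correlations."""
    cols = [[row[j] for row in r] for j in range(len(r[0]))]
    for c in cols:
        random.shuffle(c)
    return [[cols[j][i] for j in range(len(cols))] for i in range(len(r))]

s = surrogate(raster)
```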

  11. MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory

    NASA Astrophysics Data System (ADS)

    Harte, J.

    2017-12-01

    The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.

  12. A non-uniformly sampled 4D HCC(CO)NH-TOCSY experiment processed using maximum entropy for rapid protein sidechain assignment

    PubMed Central

    Mobli, Mehdi; Stern, Alan S.; Bermel, Wolfgang; King, Glenn F.; Hoch, Jeffrey C.

    2010-01-01

    One of the stiffest challenges in structural studies of proteins using NMR is the assignment of sidechain resonances. Typically, a panel of lengthy 3D experiments is acquired in order to establish connectivities and resolve ambiguities due to overlap. We demonstrate that these experiments can be replaced by a single 4D experiment that is time-efficient, yields excellent resolution, and captures unique carbon-proton connectivity information. The approach is made practical by the use of non-uniform sampling in the three indirect time dimensions and maximum entropy reconstruction of the corresponding 3D frequency spectrum. This 4D method will facilitate automated resonance assignment procedures and it should be particularly beneficial for increasing throughput in NMR-based structural genomics initiatives. PMID:20299257

  13. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    PubMed

    de Beer, Alex G F; Samson, Jean-Sébastien; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  14. Consistent maximum entropy representations of pipe flow networks

    NASA Astrophysics Data System (ADS)

    Waldrip, Steven H.; Niven, Robert K.; Abel, Markus; Schlegel, Michael

    2017-06-01

    The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016) Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters that the entropy is defined over ensures consistency between different representations of the same network. The performance of the proposed reduced parameter method is demonstrated with a one-loop network case study.

  15. Generalized Maximum Entropy

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John

    2005-01-01

    A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
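    The "classic MaxEnt" baseline that the authors generalize can be made concrete with Jaynes' well-known die example: maximize entropy subject to an exact mean constraint, which yields an exponential-family solution whose Lagrange multiplier can be found by bisection. A minimal sketch of that classic case (not the authors' generalized method):

```python
import math

def maxent_die(target_mean, faces=(1, 2, 3, 4, 5, 6)):
    """Classic MaxEnt: p_i proportional to exp(-lam * i), with lam chosen by
    bisection so that the mean-value constraint E[i] = target_mean holds."""
    def mean(lam):
        w = [math.exp(-lam * f) for f in faces]
        Z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / Z

    lo, hi = -50.0, 50.0  # mean(lam) is strictly decreasing in lam
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * f) for f in faces]
    Z = sum(w)
    return [wi / Z for wi in w]

p = maxent_die(4.5)  # Jaynes' die with observed mean 4.5
```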

  16. Maximum entropy production: Can it be used to constrain conceptual hydrological models?

    Treesearch

    M.C. Westhoff; E. Zehe

    2013-01-01

    In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is subject of this study. It states that a steady state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in...

  17. Maximum entropy methods for extracting the learned features of deep neural networks.

    PubMed

    Finnegan, Alex; Song, Jun S

    2017-10-01

    New architectures of multilayer artificial neural networks and new methods for training them are rapidly revolutionizing the application of machine learning in diverse fields, including business, social science, physical sciences, and biology. Interpreting deep neural networks, however, currently remains elusive, and a critical challenge lies in understanding which meaningful features a network is actually learning. We present a general method for interpreting deep neural networks and extracting network-learned features from input data. We describe our algorithm in the context of biological sequence analysis. Our approach, based on ideas from statistical physics, samples from the maximum entropy distribution over possible sequences, anchored at an input sequence and subject to constraints implied by the empirical function learned by a network. Using our framework, we demonstrate that local transcription factor binding motifs can be identified from a network trained on ChIP-seq data and that nucleosome positioning signals are indeed learned by a network trained on chemical cleavage nucleosome maps. Imposing a further constraint on the maximum entropy distribution also allows us to probe whether a network is learning global sequence features, such as the high GC content in nucleosome-rich regions. This work thus provides valuable mathematical tools for interpreting and extracting learned features from feed-forward neural networks.

  18. Convex foundations for generalized MaxEnt models

    NASA Astrophysics Data System (ADS)

    Frongillo, Rafael; Reid, Mark D.

    2014-12-01

    We present an approach to maximum entropy models that highlights the convex geometry and duality of generalized exponential families (GEFs) and their connection to Bregman divergences. Using our framework, we are able to resolve a puzzling aspect of the bijection of Banerjee and coauthors between classical exponential families and what they call regular Bregman divergences. Their regularity condition rules out all but Bregman divergences generated from log-convex generators. We recover their bijection and show that a much broader class of divergences correspond to GEFs via two key observations: 1) Like classical exponential families, GEFs have a "cumulant" C whose subdifferential contains the mean: E_{o~p_θ}[φ(o)] ∈ ∂C(θ); 2) Generalized relative entropy is a C-Bregman divergence between parameters: D_F(p_θ, p_θ') = D_C(θ, θ'), where D_F becomes the KL divergence for F = -H. We also show that every incomplete market with cost function C can be expressed as a complete market, where the prices are constrained to be a GEF with cumulant C. This provides an entirely new interpretation of prediction markets, relating their design back to the principle of maximum entropy.
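    The Bregman divergence in observation 2 has a simple definitional sketch: D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩. Taking F to be the negative Shannon entropy recovers the KL divergence, as the abstract notes. The toy distributions below are illustrative:

```python
import math

def bregman(F, grad_F, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - sum(g * (pi - qi) for g, pi, qi in zip(grad_F(q), p, q))

# F = -H (negative Shannon entropy) generates the KL divergence.
def neg_entropy(p):
    return sum(pi * math.log(pi) for pi in p)

def grad_neg_entropy(p):
    return [math.log(pi) + 1.0 for pi in p]

p = [0.2, 0.8]
q = [0.5, 0.5]
d = bregman(neg_entropy, grad_neg_entropy, p, q)
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))  # matches d
```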

  19. Maximum and minimum entropy states yielding local continuity bounds

    NASA Astrophysics Data System (ADS)

    Hanson, Eric P.; Datta, Nilanjana

    2018-04-01

    Given an arbitrary quantum state σ, we obtain an explicit construction of a state ρ*_ε(σ) [respectively, ρ_{*,ε}(σ)] which has the maximum (respectively, minimum) entropy among all states which lie in a specified neighborhood (ε-ball) of σ. Computing the entropy of these states leads to a local strengthening of the continuity bound of the von Neumann entropy, i.e., the Audenaert-Fannes inequality. Our bound is local in the sense that it depends on the spectrum of σ. The states ρ*_ε(σ) and ρ_{*,ε}(σ) depend only on the geometry of the ε-ball and are in fact optimizers for a larger class of entropies. These include the Rényi entropy and the min- and max-entropies, providing explicit formulas for certain smoothed quantities. This allows us to obtain local continuity bounds for these quantities as well. In obtaining this bound, we first derive a more general result which may be of independent interest, namely, a necessary and sufficient condition under which a state maximizes a concave and Gâteaux-differentiable function in an ε-ball around a given state σ. Examples of such a function include the von Neumann entropy and the conditional entropy of bipartite states. Our proofs employ tools from the theory of convex optimization under non-differentiable constraints, in particular Fermat's rule, and majorization theory.

  20. Using the principle of entropy maximization to infer genetic interaction networks from gene expression patterns.

    PubMed

    Lezon, Timothy R; Banavar, Jayanth R; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V

    2006-12-12

    We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems.
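    For continuous data with mean and pairwise-covariance constraints, the maximum entropy distribution is Gaussian, and the pairwise interaction matrix is (up to sign and scaling conventions) the inverse of the covariance matrix. A toy sketch of this simplest reading of the method, using synthetic data rather than the paper's microarray measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic expression data: samples x genes, with genes 0 and 1 directly
# coupled and gene 2 independent (illustrative assumption).
n = 2000
z = rng.normal(size=(n, 3))
expr = np.column_stack([z[:, 0], 0.9 * z[:, 0] + 0.45 * z[:, 1], z[:, 2]])

# MaxEnt with pairwise constraints -> Gaussian; the interaction matrix is the
# inverse sample covariance. Large |J_ij| (i != j) suggests a direct interaction.
C = np.cov(expr, rowvar=False)
J = np.linalg.inv(C)
```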

  1. A maximum (non-extensive) entropy approach to equity options bid-ask spread

    NASA Astrophysics Data System (ADS)

    Tapiero, Oren J.

    2013-07-01

    The cross-section of options bid-ask spreads with their strikes is modelled by maximising the Kaniadakis entropy. A theoretical model results, with the bid-ask spread depending explicitly on the implied volatility, the probability of expiring at-the-money, and an asymmetric information parameter (κ). Considering AIG as a test case for the period between January 2006 and October 2008, we find that information flows uniquely from the trading activity in the underlying asset to its derivatives, suggesting that κ is possibly an option-implied measure of the current state of trading liquidity in the underlying asset.

  2. Steepest entropy ascent model for far-nonequilibrium thermodynamics: Unified implementation of the maximum entropy production principle

    NASA Astrophysics Data System (ADS)

    Beretta, Gian Paolo

    2014-10-01

    By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. 
Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.

  3. Computational Amide I Spectroscopy for Refinement of Disordered Peptide Ensembles: Maximum Entropy and Related Approaches

    NASA Astrophysics Data System (ADS)

    Reppert, Michael; Tokmakoff, Andrei

    The structural characterization of intrinsically disordered peptides (IDPs) presents a challenging biophysical problem. Extreme heterogeneity and rapid conformational interconversion make traditional methods difficult to interpret. Due to its ultrafast (ps) shutter speed, Amide I vibrational spectroscopy has received considerable interest as a novel technique to probe IDP structure and dynamics. Historically, Amide I spectroscopy has been limited to delivering global secondary structural information. More recently, however, the method has been adapted to study structure at the local level through incorporation of isotope labels into the protein backbone at specific amide bonds. Thanks to the acute sensitivity of Amide I frequencies to local electrostatic interactions, particularly hydrogen bonds, spectroscopic data on isotope-labeled residues directly report on local peptide conformation. Quantitative information can be extracted using electrostatic frequency maps which translate molecular dynamics trajectories into Amide I spectra for comparison with experiment. Here we present our recent efforts in the development of a rigorous approach to incorporating Amide I spectroscopic restraints into refined molecular dynamics structural ensembles using maximum entropy and related approaches. By combining force field predictions with experimental spectroscopic data, we construct refined structural ensembles for a family of short, strongly disordered, elastin-like peptides in aqueous solution.
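    The maximum entropy reweighting idea, minimally perturbing force-field ensemble weights until a simulated observable matches experiment, can be sketched with a one-constraint exponential reweighting. The frequencies and target below are hypothetical stand-ins, not the authors' Amide I data:

```python
import math

# Hypothetical simulated Amide I frequencies (cm^-1) for 5 conformers,
# and a hypothetical target experimental mean frequency.
freqs = [1630.0, 1645.0, 1650.0, 1660.0, 1675.0]
target = 1655.0

def reweighted_mean(lam):
    """Mean under exponentially tilted weights w_i proportional to exp(lam * f_i)."""
    w = [math.exp(lam * (f - 1650.0)) for f in freqs]  # shift for numerical stability
    Z = sum(w)
    return sum(f * wi for f, wi in zip(freqs, w)) / Z

# MaxEnt reweighting: find lam by bisection so the reweighted mean hits the target
# (reweighted_mean is increasing in lam).
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if reweighted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
weights = [math.exp(lam * (f - 1650.0)) for f in freqs]
Z = sum(weights)
weights = [w / Z for w in weights]  # minimal-perturbation ensemble weights
```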

  4. Causal nexus between energy consumption and carbon dioxide emission for Malaysia using maximum entropy bootstrap approach.

    PubMed

    Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid

    2015-12-01

    This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission in bivariate as well as multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of nonstationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences, which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emission both in the bivariate model and in the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.

  5. Force-Time Entropy of Isometric Impulse.

    PubMed

    Hsieh, Tsung-Yu; Newell, Karl M

    2016-01-01

    The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993). Two experiments in an isometric single-finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that peak force variability increased either with the increment of force level or through a shorter time to peak force, which also reduced timing error variability. The peak force entropy and the entropy of time to peak force increased on the respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework, with the joint force-time entropy at a minimum in the middle parameter range of discrete impulse.

  6. Optimality and inference in hydrology from entropy production considerations: synthetic hillslope numerical experiments

    NASA Astrophysics Data System (ADS)

    Kollet, S. J.

    2015-05-01

    In this study, entropy production optimization and inference principles are applied to a synthetic semi-arid hillslope in high-resolution, physics-based simulations. The results suggest that entropy production, or power, is indeed maximized, because of the strong nonlinearity of variably saturated flow and competing processes related to soil moisture fluxes, the depletion of gradients, and the movement of a free water table. Thus, it appears that the maximum entropy production (MEP) principle may indeed be applicable to hydrologic systems. In the application to hydrologic systems, the free water table constitutes an important degree of freedom in the optimization of entropy production and may also relate the theory to actual observations. In an ensuing analysis, an attempt is made to transfer the complex, "microscopic" hillslope model into a macroscopic model of reduced complexity, using the MEP principle as an inference tool to obtain effective conductance coefficients and forces/gradients. The results demonstrate a new approach for the application of MEP to hydrologic systems and may form the basis for fruitful discussions and research in the future.

  7. Crowd macro state detection using entropy model

    NASA Astrophysics Data System (ADS)

    Zhao, Ying; Yuan, Mengqi; Su, Guofeng; Chen, Tao

    2015-08-01

    In the crowd security research area, a primary concern is to identify the macro state of crowd behaviors in order to prevent disasters and supervise crowd behavior. In physics, entropy is used to describe the macro state of a self-organizing system: a change in entropy indicates a change in the system's macro state. This paper provides a method to construct crowd behavior microstates and the corresponding probability distribution using individuals' velocity information (magnitude and direction). An entropy model is then built to describe the crowd behavior macro state. Simulation experiments and video detection experiments were conducted. They verified that in the disordered state the crowd behavior entropy is close to the theoretical maximum entropy, while in the ordered state the entropy is much lower than half of the theoretical maximum. A sudden change in the crowd behavior macro state leads to a change in entropy. The proposed entropy model is more applicable than the order-parameter model in crowd behavior detection. By recognizing the entropy mutation, it is possible to detect the crowd behavior macro state automatically using cameras. The results will provide data support for crowd emergency prevention and manual emergency intervention.
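The microstate construction described above can be illustrated with a toy direction-binning entropy. Everything below (the bin count, the Gaussian "crowds") is a hypothetical choice for illustration, not the paper's setup.

```python
import numpy as np

def direction_entropy(vx, vy, n_bins=8):
    """Shannon entropy of the binned distribution of movement directions."""
    angles = np.arctan2(vy, vx)                 # headings in (-pi, pi]
    counts, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(1)
# disordered crowd: isotropic velocities, headings uniform on the circle
vx_d, vy_d = rng.normal(size=(2, 5000))
h_disordered = direction_entropy(vx_d, vy_d)
# ordered crowd: everyone moves roughly in the +x direction
vx_o = 1.0 + 0.05 * rng.normal(size=5000)
vy_o = 0.05 * rng.normal(size=5000)
h_ordered = direction_entropy(vx_o, vy_o)
h_max = np.log(8)                               # theoretical maximum entropy
```

Consistent with the abstract, the disordered crowd's entropy sits near the theoretical maximum log(n_bins), while the ordered crowd's entropy falls well below half of it.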

  8. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. Conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or fix the form of the marginal distributions. This paper therefore presents an approach to derive an entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, in two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; and (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The univariate analysis shows that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow a mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff.
The results indicate that (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) K-S goodness-of-fit tests confirm that the re-derived marginal distributions recover the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution captures dependence structures that conventional bivariate joint distributions cannot. (Figure: joint rainfall-runoff entropy-based PDF, with corresponding marginal PDFs and histograms, for the W12 watershed. Table: K-S test results and RMSE for the univariate distributions derived from the maximum-entropy-based joint probability distribution.)
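The moment-constrained maximization at the heart of this construction can be sketched in miniature. The toy below uses a single mean constraint on a discrete support (the study uses several constraints and continuous marginals): the maximizer has the exponential-family form p_i proportional to exp(lam * x_i), and a one-dimensional Newton iteration finds lam. All names are illustrative.

```python
import numpy as np

def maxent_given_mean(support, target_mean, iters=100):
    """Maximum-entropy distribution on a discrete support with a fixed mean.

    The constrained maximization gives p_i proportional to exp(lam * x_i);
    Newton's method on lam enforces the mean constraint.
    """
    x = np.asarray(list(support), dtype=float)
    lam = 0.0
    for _ in range(iters):
        w = np.exp(lam * x)
        p = w / w.sum()
        mean = p @ x
        var = p @ (x - mean) ** 2    # d(mean)/d(lam), always positive
        lam -= (mean - target_mean) / var
    return p

# classic example: a die constrained to average 4.5
p = maxent_given_mean(range(1, 7), 4.5)
```

With more constraints (log-moments, a covariance) the same dual optimization runs over a vector of multipliers instead of a scalar, which is the multivariate analogue used for the joint rainfall-runoff density.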

  9. Application of maximum entropy to statistical inference for inversion of data from a single track segment.

    PubMed

    Stotts, Steven A; Koch, Robert A

    2017-08-01

    In this paper an approach is presented to estimate the constraint required to apply maximum entropy (ME) for statistical inference with underwater acoustic data from a single track segment. Previous algorithms for estimating the ME constraint require multiple source track segments to determine the constraint. The approach is relevant for addressing model mismatch effects, i.e., inaccuracies in parameter values determined from inversions because the propagation model does not account for all acoustic processes that contribute to the measured data. One effect of model mismatch is that the lowest cost inversion solution may be well outside a relatively well-known parameter value's uncertainty interval (prior), e.g., source speed from track reconstruction or towed source levels. The approach requires, for some particular parameter value, the ME constraint to produce an inferred uncertainty interval that encompasses the prior. Motivating this approach is the hypothesis that the proposed constraint determination procedure would produce a posterior probability density that accounts for the effect of model mismatch on inferred values of other inversion parameters for which the priors might be quite broad. Applications to both measured and simulated data are presented for model mismatch that produces minimum cost solutions either inside or outside some priors.

  10. Bayesian maximum entropy integration of ozone observations and model predictions: an application for attainment demonstration in North Carolina.

    PubMed

    de Nazelle, Audrey; Arunachalam, Saravanan; Serre, Marc L

    2010-08-01

    States in the USA are required to demonstrate future compliance of criteria air pollutant standards by using both air quality monitors and model outputs. In the case of ozone, the demonstration tests aim at relying heavily on measured values, due to their perceived objectivity and enforceable quality. Weight given to numerical models is diminished by integrating them in the calculations only in a relative sense. For unmonitored locations, the EPA has suggested the use of a spatial interpolation technique to assign current values. We demonstrate that this approach may lead to erroneous assignments of nonattainment and may make it difficult for States to establish future compliance. We propose a method that combines different sources of information to map air pollution, using the Bayesian Maximum Entropy (BME) Framework. The approach gives precedence to measured values and integrates modeled data as a function of model performance. We demonstrate this approach in North Carolina, using the State's ozone monitoring network in combination with outputs from the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. We show that the BME data integration approach, compared to a spatial interpolation of measured data, improves the accuracy and the precision of ozone estimations across the state.

  11. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches.

    PubMed

    Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M

    2009-11-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators.
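A maximum entropy classifier over character n-gram counts is equivalent to logistic regression on those features. The sketch below uses hypothetical four-line "reports" and character 3-grams to fit a tiny corpus (the study's best system used character 6-grams on real report corpora); every document and label here is invented for illustration.

```python
import numpy as np

def char_ngrams(text, n=3):
    """Overlapping character n-grams of a string."""
    return [text[i:i + n] for i in range(len(text) - n + 1)]

# hypothetical toy 'reports' standing in for the labeled corpora
docs = ["bilateral infiltrates and edema", "bilateral opacities with edema",
        "clear lungs no infiltrate", "lungs are clear and normal"]
labels = np.array([1.0, 1.0, 0.0, 0.0])        # 1 = ALI-consistent

# build the n-gram count matrix
vocab = sorted({g for d in docs for g in char_ngrams(d)})
index = {g: j for j, g in enumerate(vocab)}
X = np.zeros((len(docs), len(vocab)))
for i, d in enumerate(docs):
    for g in char_ngrams(d):
        X[i, index[g]] += 1.0

# maximum-entropy (binary logistic) model trained by gradient ascent
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w += 0.1 * (X.T @ (labels - prob))
    b += 0.1 * np.sum(labels - prob)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
```

On realistic data one would add regularization and evaluate on held-out reports with precision/recall, as the paper does.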

  12. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches

    PubMed Central

    Solti, Imre; Cooke, Colin R.; Xia, Fei; Wurfel, Mark M.

    2010-01-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators. PMID:21152268

  13. Block entropy and quantum phase transition in the anisotropic Kondo necklace model

    NASA Astrophysics Data System (ADS)

    Mendoza-Arenas, J. J.; Franco, R.; Silva-Valencia, J.

    2010-06-01

    We study the von Neumann block entropy in the Kondo necklace model for different anisotropies η in the XY interaction between conduction spins using the density matrix renormalization group method. It was found that the block entropy presents a maximum for each η considered, and, comparing it with the results of the quantum criticality of the model based on the behavior of the energy gap, we observe that the maximum block entropy occurs at the quantum critical point between an antiferromagnetic and a Kondo singlet state, so this measure of entanglement is useful for giving information about where a quantum phase transition occurs in this model. We observe that the block entropy also presents a maximum at the quantum critical points that are obtained when an anisotropy Δ is included in the Kondo exchange between localized and conduction spins; when Δ diminishes for a fixed value of η, the critical point increases, favoring the antiferromagnetic phase.
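For a pure state, the von Neumann block entropy can be computed from the Schmidt decomposition. The generic numpy sketch below (not the DMRG calculation used in the paper) reshapes the state vector, takes the SVD, and sums -p log p over the squared Schmidt coefficients.

```python
import numpy as np

def block_entropy(psi, dim_a, dim_b):
    """Von Neumann entropy of subsystem A for a pure state psi on A (x) B."""
    m = np.asarray(psi, dtype=float).reshape(dim_a, dim_b)
    s = np.linalg.svd(m, compute_uv=False)   # Schmidt coefficients
    p = s ** 2
    p = p[p > 1e-12]                         # drop numerical zeros
    return float(-np.sum(p * np.log(p)))

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # maximally entangled pair
prod = np.kron([1.0, 0.0], [0.0, 1.0])               # product state
s_bell = block_entropy(bell, 2, 2)
s_prod = block_entropy(prod, 2, 2)
```

The Bell pair gives entropy log 2 and the product state gives 0; in the many-body setting of the paper, the same quantity is tracked across parameter values and peaks at the quantum critical point.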

  14. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    NASA Astrophysics Data System (ADS)

    Schaerer, Roman Pascal; Bansal, Pratyuksh; Torrilhon, Manuel

    2017-07-01

    We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015) [13], we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  15. Entropy of adsorption of mixed surfactants from solutions onto the air/water interface

    USGS Publications Warehouse

    Chen, L.-W.; Chen, J.-H.; Zhou, N.-F.

    1995-01-01

    The partial molar entropy change for mixed surfactant molecules adsorbed from solution at the air/water interface has been investigated by surface thermodynamics, based upon experimental surface tension isotherms at various temperatures. Results for different surfactant mixtures of sodium dodecyl sulfate and sodium tetradecyl sulfate, and of decylpyridinium chloride and sodium alkylsulfonates, have shown that the partial molar entropy changes for adsorption of the mixed surfactants were generally negative, decreasing with increasing adsorption to a minimum near the maximum adsorption and then increasing abruptly. The entropy decrease can be explained by the adsorption orientation of surfactant molecules in the adsorbed monolayer, and the abrupt entropy increase at the maximum adsorption is possibly due to the strong repulsion between the adsorbed molecules.

  16. Estimation of depth to magnetic source using maximum entropy power spectra, with application to the Peru-Chile Trench

    USGS Publications Warehouse

    Blakely, Richard J.

    1981-01-01

    Estimates of the depth to magnetic sources using the power spectrum of magnetic anomalies generally require long magnetic profiles. The method developed here uses the maximum entropy power spectrum (MEPS) to calculate depth to source on short windows of magnetic data; resolution is thereby improved. The method operates by dividing a profile into overlapping windows, calculating a maximum entropy power spectrum for each window, linearizing the spectra, and calculating the various depth estimates by least squares. The assumptions of the method are that the source is two dimensional and that the intensity of magnetization includes random noise; knowledge of the direction of magnetization is not required. The method is applied to synthetic data and to observed marine anomalies over the Peru-Chile Trench. The analyses indicate a continuous magnetic basement extending from the eastern margin of the Nazca plate into the subduction zone. The computed basement depths agree with acoustic basement seaward of the trench axis, but deepen as the plate approaches the inner trench wall. This apparent increase in the computed depths may result from the deterioration of magnetization in the upper part of the ocean crust, possibly caused by compressional disruption of the basaltic layer. Landward of the trench axis, the depth estimates indicate possible thrusting of the oceanic material into the lower slope of the continental margin.
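The maximum entropy power spectrum of a short window is the autoregressive spectrum fitted by Burg's recursion. A compact sketch, assuming a real-valued series and frequency in cycles per sample; the AR order and test signal are arbitrary choices for illustration.

```python
import numpy as np

def burg_spectrum(x, order, n_freq=512):
    """Maximum-entropy (Burg) power spectrum of a short data window.

    Fit AR(order) coefficients with Burg's recursion, then evaluate
    P(f) = E / |1 + sum_k a_k exp(-2*pi*i*f*k)|^2.
    """
    x = np.asarray(x, dtype=float)
    ef, eb = x.copy(), x.copy()          # forward/backward prediction errors
    a = np.array([1.0])                  # prediction-error filter
    e = float(np.mean(x ** 2))           # residual power
    for _ in range(order):
        efp, ebp = ef[1:], eb[:-1]
        k = -2.0 * np.dot(efp, ebp) / (np.dot(efp, efp) + np.dot(ebp, ebp))
        ef, eb = efp + k * ebp, ebp + k * efp
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        e *= 1.0 - k * k
    freqs = np.linspace(0.0, 0.5, n_freq)
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(order + 1)))
    power = e / np.abs(z @ a) ** 2
    return freqs, power

t = np.arange(200)
x = np.sin(2 * np.pi * 0.2 * t) + 0.1 * np.random.default_rng(5).normal(size=200)
freqs, power = burg_spectrum(x, order=4)
f_peak = freqs[np.argmax(power)]
```

Unlike periodogram methods, the Burg spectrum stays sharp on short windows, which is the property the depth-estimation scheme above exploits.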

  17. Maximum entropy analysis of NMR data of flexible multirotor molecules partially oriented in nematic solution: 2,2':5',2″-terthiophene, 2,2'- and 3,3'-dithiophene

    NASA Astrophysics Data System (ADS)

    Caldarelli, Stefano; Catalano, Donata; Di Bari, Lorenzo; Lumetti, Marco; Ciofalo, Maurizio; Alberto Veracini, Carlo

    1994-07-01

    The dipolar couplings observed by NMR spectroscopy of solutes in nematic solvents (LX-NMR) are used to build up the maximum entropy (ME) probability distribution function of the variables describing the orientational and internal motion of the molecule. The ME conformational distributions of 2,2'- and 3,3'-dithiophene and of 2,2':5',2″-terthiophene (α-terthienyl) thus obtained are compared with the results of previous studies. The 2,2'- and 3,3'-dithiophene molecules exhibit equilibria among cisoid and transoid forms; the probability maxima correspond to planar and twisted conformers for 2,2'- and 3,3'-dithiophene, respectively. 2,2':5',2″-Terthiophene has two internal degrees of freedom; the ME approach indicates that the trans,trans and cis,trans planar conformations are the most probable. The correlation between the two intramolecular rotations is also discussed.

  18. Image reconstruction of IRAS survey scans

    NASA Technical Reports Server (NTRS)

    Bontekoe, Tj. Romke

    1990-01-01

    The IRAS survey data can be used successfully to produce images of extended objects. The major difficulties, viz. non-uniform sampling, different response functions for each detector, and varying signal-to-noise levels for each detector and each scan, were resolved. The results of three different image construction techniques are compared: co-addition, constrained least squares, and maximum entropy. The maximum entropy result is superior. An image of the galaxy M51 with an average spatial resolution of 45 arc seconds is presented, using 60 micron survey data. This exceeds the telescope diffraction limit of 1 arc minute at this wavelength. Data fusion is a proposed method for combining data from different instruments, with different spatial resolutions, at different wavelengths. Estimates of the physical parameters (temperature, density, and composition) can be made from the data without prior image (re-)construction. An increase in the accuracy of these parameters is expected as the result of this more systematic approach.

  19. Tsallis Entropy and the Transition to Scaling in Fragmentation

    NASA Astrophysics Data System (ADS)

    Sotolongo-Costa, Oscar; Rodriguez, Arezky H.; Rodgers, G. J.

    2000-12-01

    By using the maximum entropy principle with Tsallis entropy we obtain a fragment size distribution function which undergoes a transition to scaling. This distribution function reduces to those obtained by other authors using Shannon entropy. The treatment is easily generalisable to any process of fractioning with suitable constraints.
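Maximizing the Tsallis entropy with a mean-size constraint yields a q-exponential fragment-size law that reduces to the Shannon (exponential) result as q approaches 1. A small numerical check; the value of beta and the grid are arbitrary, and the normalization constant is omitted.

```python
import numpy as np

def q_exponential(x, q, beta):
    """Fragment-size law from maximizing Tsallis entropy (q != 1):
    p(x) proportional to [1 - (1 - q)*beta*x]_+ ** (1/(1 - q)),
    which tends to the Shannon result exp(-beta*x) as q -> 1."""
    base = 1.0 - (1.0 - q) * beta * np.asarray(x, dtype=float)
    out = np.zeros_like(base)
    pos = base > 0                       # support cutoff for q < 1
    out[pos] = base[pos] ** (1.0 / (1.0 - q))
    return out

x = np.linspace(0.0, 5.0, 6)
near_shannon = q_exponential(x, 1.0001, 1.0)   # q close to 1: ~ exp(-x)
power_law = q_exponential(x, 2.0, 1.0)         # q = 2: heavy-tailed 1/(1+x)
```

The q > 1 branch decays as a power law, which is the scaling regime the abstract describes; the q -> 1 limit recovers the Shannon-entropy distributions obtained by other authors.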

  20. A pairwise maximum entropy model accurately describes resting-state human brain networks

    PubMed Central

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks. PMID:23340410
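A pairwise maximum entropy model over binarized activity is an Ising-type model fitted by moment matching. The sketch below fits a three-"region" toy by exact enumeration and gradient ascent; the paper's networks are far larger and the fitting more careful, so treat every name and number here as illustrative.

```python
import numpy as np
from itertools import product

def fit_pairwise_maxent(data, lr=0.2, iters=5000):
    """Fit fields h and couplings J of a pairwise maximum-entropy model.

    Gradient ascent on the log-likelihood matches the model's means and
    pairwise moments to the data; exact enumeration over the 2^n states
    is feasible only for a handful of binarized 'regions'.
    """
    n = data.shape[1]
    states = np.array(list(product([0, 1], repeat=n)), dtype=float)
    m_data = data.mean(axis=0)
    c_data = data.T @ data / len(data)
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(iters):
        energy = states @ h + np.einsum('si,ij,sj->s', states, J, states)
        p = np.exp(energy - energy.max())
        p /= p.sum()
        m = p @ states                          # model means
        c = states.T @ (states * p[:, None])    # model pairwise moments
        h += lr * (m_data - m)
        J += lr * (c_data - c)
        np.fill_diagonal(J, 0.0)                # self-couplings live in h
    return h, J, m, c

rng = np.random.default_rng(6)
base = rng.random(500)
data = np.column_stack([base > 0.5,
                        (base + 0.3 * rng.random(500)) > 0.6,
                        rng.random(500) > 0.5]).astype(float)
h, J, m_model, c_model = fit_pairwise_maxent(data)
```

At convergence the fitted model reproduces the empirical activation rates and pairwise co-activations, which is exactly the sense in which the model "accurately describes" the data in the abstract.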

  1. Using the principle of entropy maximization to infer genetic interaction networks from gene expression patterns

    PubMed Central

    Lezon, Timothy R.; Banavar, Jayanth R.; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V.

    2006-01-01

    We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems. PMID:17138668

  2. MaxEnt alternatives to pearson family distributions

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie J.

    2012-05-01

    In a previous MaxEnt conference [11] a method of obtaining MaxEnt univariate distributions under a variety of constraints was presented. The Mathematica function Interpolation[], normally used with numerical data, can also process "semi-symbolic" data, and Lagrange Multiplier equations were solved for a set of symbolic ordinates describing the required MaxEnt probability density function. We apply a more developed version of this approach to finding MaxEnt distributions having prescribed β1 and β2 values, and compare the entropy of the MaxEnt distribution to that of the Pearson family distribution having the same β1 and β2. These MaxEnt distributions do have, in general, greater entropy than the related Pearson distribution. In accordance with Jaynes' Maximum Entropy Principle, these MaxEnt distributions are thus to be preferred to the corresponding Pearson distributions as priors in Bayes' Theorem.

  3. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noises effectively, an infrared image segmentation method based on spatial coherence histogram and maximum entropy is proposed. First, spatial coherence histogram is presented by weighting the importance of the different position of these pixels with the same gray-level, which is obtained by computing their local density. Then, after enhancing the image by spatial coherence histogram, 1D maximum entropy method is used to segment the image. The novel method can not only get better segmentation results, but also have a faster computation time than traditional 2D histogram-based segmentation methods.
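The 1D maximum entropy thresholding step referred to above is Kapur's criterion: pick the gray level that maximizes the summed Shannon entropies of the background and foreground histogram segments. A sketch on a synthetic bright-target image; the spatial coherence weighting of the paper is omitted, and the test image is invented.

```python
import numpy as np

def kapur_threshold(image, bins=256):
    """Gray level maximizing the summed entropies of the background and
    foreground portions of the histogram (Kapur's 1D maximum entropy)."""
    hist, edges = np.histogram(image, bins=bins, range=(0.0, 256.0))
    p = hist / hist.sum()
    cum = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        w0, w1 = cum[t], 1.0 - cum[t]
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        p0 = p[:t + 1][p[:t + 1] > 0] / w0   # background distribution
        p1 = p[t + 1:][p[t + 1:] > 0] / w1   # foreground distribution
        h = -np.sum(p0 * np.log(p0)) - np.sum(p1 * np.log(p1))
        if h > best_h:
            best_h, best_t = h, t
    return edges[best_t + 1]

rng = np.random.default_rng(3)
# synthetic IR-like frame: dim background with one bright target patch
img = rng.normal(60.0, 10.0, (64, 64))
img[20:30, 20:30] = rng.normal(180.0, 10.0, (10, 10))
img = np.clip(img, 0.0, 255.0)
thr = kapur_threshold(img)
mask = img > thr
```

The threshold lands between the background and target modes, so the mask isolates the bright patch; the paper's contribution is to precede this step with the spatial-coherence histogram enhancement.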

  4. Quantifying the entropic cost of cellular growth control

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele; Capuani, Fabrizio; De Martino, Andrea

    2017-07-01

    Viewing the ways a living cell can organize its metabolism as the phase space of a physical system, regulation can be seen as the ability to reduce the entropy of that space by selecting specific cellular configurations that are, in some sense, optimal. Here we quantify the amount of regulation required to control a cell's growth rate by a maximum-entropy approach to the space of underlying metabolic phenotypes, where a configuration corresponds to a metabolic flux pattern as described by genome-scale models. We link the mean growth rate achieved by a population of cells to the minimal amount of metabolic regulation needed to achieve it through a phase diagram that highlights how growth suppression can be as costly (in regulatory terms) as growth enhancement. Moreover, we provide an interpretation of the inverse temperature β controlling maximum-entropy distributions based on the underlying growth dynamics. Specifically, we show that the asymptotic value of β for a cell population can be expected to depend on (i) the carrying capacity of the environment, (ii) the initial size of the colony, and (iii) the probability distribution from which the inoculum was sampled. Results obtained for E. coli and human cells are found to be remarkably consistent with empirical evidence.

  5. Application of Bayesian Maximum Entropy Filter in parameter calibration of groundwater flow model in PingTung Plain

    NASA Astrophysics Data System (ADS)

    Cheung, Shao-Yong; Lee, Chieh-Han; Yu, Hwa-Lung

    2017-04-01

    Due to limited hydrogeological observation data and the high levels of uncertainty within them, parameter estimation for groundwater models has been an important issue. There are many methods of parameter estimation; for example, the Kalman filter provides real-time calibration of parameters through measurements at groundwater monitoring wells, and related methods such as the Extended Kalman Filter and the Ensemble Kalman Filter are widely applied in groundwater research. However, Kalman filter methods are limited to linearity. This study proposes a novel method, Bayesian Maximum Entropy Filtering, which can account for the uncertainty of data in parameter estimation: parameters can be estimated from hard (certain) and soft (uncertain) data at the same time. In this study, we use Python and QGIS with the MODFLOW groundwater model and implement both the Extended Kalman Filter and Bayesian Maximum Entropy Filtering in Python for parameter estimation, providing a conventional filtering method alongside one that considers the uncertainty of data. The study was conducted through numerical model experiments combining the Bayesian maximum entropy filter with a hypothetical MODFLOW groundwater model architecture, using virtual observation wells to observe the simulated groundwater model periodically. The results show that, by considering the uncertainty of data, the Bayesian maximum entropy filter provides an ideal result for real-time parameter estimation.

  6. An adaptive technique to maximize lossless image data compression of satellite images

    NASA Technical Reports Server (NTRS)

    Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe

    1994-01-01

    Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower-entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques on regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost-effectively and at acceptable performance rates with a combination of techniques which are selected based on image contextual information.
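The entropy-based segmentation idea (estimate the first-order entropy of each region and let it guide coder selection) can be sketched as a per-block entropy map; the block size and the test image below are arbitrary illustrations.

```python
import numpy as np

def block_entropy_map(img, block=8):
    """First-order entropy (bits/pixel) of each block x block tile of an
    8-bit image; low-entropy tiles are the regions an entropy coder
    compresses best."""
    h, w = img.shape
    out = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            tile = img[i * block:(i + 1) * block, j * block:(j + 1) * block]
            counts = np.bincount(tile.ravel(), minlength=256)
            p = counts[counts > 0] / tile.size
            out[i, j] = -np.sum(p * np.log2(p))
    return out

rng = np.random.default_rng(4)
flat = np.full((8, 8), 7, dtype=np.uint8)              # constant region
noisy = rng.integers(0, 256, (8, 8), dtype=np.uint8)   # incompressible region
emap = block_entropy_map(np.hstack([flat, noisy]))
```

The constant tile has entropy 0 bits/pixel and the random tile nearly 6, so a rule base keyed on this map can route each region to the coder that suits it.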

  7. Determining Dynamical Path Distributions usingMaximum Relative Entropy

    DTIC Science & Technology

    2015-05-31

    …entropy to a one-dimensional continuum labeled by a parameter η. The resulting η-entropies are equivalent to those proposed by Rényi [12] or by Tsallis [13] … (1995). [12] A. Rényi, "On measures of entropy and information," Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 547-561 (1961).

  8. Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager's Reciprocal Relations

    NASA Astrophysics Data System (ADS)

    Benfenati, Francesco; Beretta, Gian Paolo

    2018-04-01

    We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).

  9. Adaptive Statistical Language Modeling; A Maximum Entropy Approach

    DTIC Science & Technology

    1994-04-19

    …models exploit the immediate past only. To extract information from further back in the document's history, I use trigger pairs as the basic information… 2.2 Context-Free Estimation (Unigram) … 2.3 Short-Term History (Conventional N-gram) … 2.4 Short-Term Class History (Class-Based N-gram) … 2.5 Intermediate Distance …

  10. Automatic Detection of Preposition Errors in Learner Writing

    ERIC Educational Resources Information Center

    De Felice, Rachele; Pulman, Stephen

    2009-01-01

    In this article, we present an approach to the automatic correction of preposition errors in L2 English. Our system, based on a maximum entropy classifier, achieves average precision of 42% and recall of 35% on this task. The discussion of results obtained on correct and incorrect data aims to establish what characteristics of L2 writing prove…

  11. Origin of generalized entropies and generalized statistical mechanics for superstatistical multifractal systems

    NASA Astrophysics Data System (ADS)

    Gadjiev, Bahruz; Progulova, Tatiana

    2015-01-01

We consider a multifractal structure as a mixture of fractal substructures and introduce a distribution function f(α), where α is a fractal dimension. Then we can introduce g(p) ~ ∫_{-ln p}^{μ} e^{-y} f(y) dy and show that distribution functions of the form f(α) = δ(α-1), f(α) = δ(α-θ), f(α) = 1/(α-1) and f(y) = y^{α-1} lead to the Boltzmann-Gibbs, Shafee, Tsallis and Anteneodo-Plastino entropies, respectively. Here δ(x) is the Dirac delta function. Therefore the Shafee entropy corresponds to a fractal structure, the Tsallis entropy describes a multifractal structure with a homogeneous distribution of fractal substructures, and the Anteneodo-Plastino entropy appears in the case of a power-law distribution f(y). We consider the Fokker-Planck equation for a fractal substructure and determine its stationary solution. To determine the distribution function of a multifractal structure we solve the two-dimensional Fokker-Planck equation and obtain its stationary solution. Applying the Bayes theorem, we then obtain a distribution function for the entire system in the form of a q-exponential function. We compare the distribution functions obtained via the superstatistical approach with those obtained from the maximum entropy principle.
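
The q-exponential form of the superstatistical marginal can be checked numerically: averaging Boltzmann factors e^{-βE} over a Gamma (χ²-type) distribution of inverse temperatures reproduces (1 + (q-1)β₀E)^{-1/(q-1)} with q = 1 + 2/n. A minimal sketch of this standard Beck-Cohen calculation (my illustration with assumed parameters n = 4, β₀ = 1; not code from the paper):

```python
import math
import numpy as np

def gamma_pdf(beta, n, beta0):
    """Chi-square-type Gamma density for the inverse temperature beta,
    with n degrees of freedom and mean beta0."""
    k, rate = n / 2.0, n / (2.0 * beta0)          # shape and rate
    return rate**k * beta**(k - 1) * np.exp(-rate * beta) / math.gamma(k)

def trapezoid(y, x):
    """Plain trapezoid rule (avoids version differences in NumPy's API)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def superstat_factor(E, n=4, beta0=1.0):
    """Average the Boltzmann factor exp(-beta*E) over the Gamma distribution."""
    beta = np.linspace(1e-9, 60.0, 200_001)
    return trapezoid(gamma_pdf(beta, n, beta0) * np.exp(-beta * E), beta)

def q_exponential(E, n=4, beta0=1.0):
    """Closed form of the same average: (1+(q-1)*beta0*E)^(-1/(q-1)), q = 1+2/n."""
    qm1 = 2.0 / n
    return (1.0 + qm1 * beta0 * E) ** (-1.0 / qm1)

err = max(abs(superstat_factor(E) - q_exponential(E))
          for E in np.linspace(0.0, 5.0, 11))
assert err < 1e-4
```

The numerically averaged factor and the closed-form q-exponential agree to the accuracy of the quadrature, illustrating how mixing over inverse temperatures generates a q-exponential marginal.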

  12. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause for the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  13. Chapman Enskog-maximum entropy method on time-dependent neutron transport equation

    NASA Astrophysics Data System (ADS)

    Abdou, M. A.

    2006-09-01

The time-dependent neutron transport equation in semi-infinite and infinite media with linear anisotropic and Rayleigh scattering is considered. The problem is solved by means of the flux-limited Chapman-Enskog maximum entropy approach to obtain the solution of the time-dependent neutron transport equation. The solution gives the neutron distribution density function, which is used to compute numerically the radiant energy density E(x,t), net flux F(x,t) and reflectivity R_f. The behaviour of the approximate flux-limited maximum entropy neutron density function is compared with that found by other theories. Numerical calculations for the radiant energy, net flux and reflectivity of the proposed medium are presented at different times and positions.

  14. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  15. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.

    PubMed

    Shalymov, Dmitry S; Fradkov, Alexander L

    2016-01-01

We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle, which originated in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass-conservation and energy-conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.

  16. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle

    PubMed Central

    2016-01-01

We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle, which originated in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass-conservation and energy-conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined. PMID:26997886

  17. Maximum-entropy description of animal movement.

    PubMed

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall within this class of maximum-entropy distributions when the constraints are purely kinematic.

  18. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

Background We introduce approximate entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene-expression patterns resulting from a clinical sample set. Since entropy is a measure of disorder in a system, we believe that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set, we will be able to distinguish those genes which display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify genes of interest with the highest probability of producing an accurate, predictive classification model from our data set. Results In the development of a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells. This may be measured in terms of the ApSE when compared to normal tissue. We have validated 10 of these genes on 10 normal and 20 cancer and CIN3 samples. We report that the predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity from a biological system with such a relatively small sample set, and in extracting biologically relevant genes of interest, holds great promise. PMID:19232110
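
Entropy measures of this family count template matches of length m and m+1 within a tolerance r: a perfectly regular series scores near zero, while an irregular series scores higher. A minimal sketch of the standard Richman-Moorman sample entropy (my illustration of the general idea, not the authors' exact ApSE pipeline; parameter choices m = 2, r = 0.2·SD are conventional assumptions):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts pairs of
    length-m templates matching within tolerance r (Chebyshev distance)
    and A counts pairs of length-(m+1) templates matching within r."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def count(mm):
        # All overlapping templates of length mm, compared pairwise.
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    B, A = count(m), count(m + 1)
    if A == 0 or B == 0:
        return float("inf")   # undefined for too-short or too-irregular series
    return -np.log(A / B)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))   # low-complexity signal
noisy = rng.standard_normal(300)                   # high-complexity signal
assert sample_entropy(regular) < sample_entropy(noisy)
```

The regular (sinusoidal) series yields a much lower entropy than white noise, which is the contrast the gene-selection criterion above exploits.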

  19. The moving-window Bayesian maximum entropy framework: estimation of PM(2.5) yearly average concentration across the contiguous United States.

    PubMed

    Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L

    2012-09-01

Geostatistical methods are widely used in estimating long-term exposures for epidemiological studies on air pollution, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and the uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian maximum entropy (BME) method and applied this framework to estimate fine particulate matter (PM(2.5)) yearly average concentrations over the contiguous US. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air-monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM(2.5) data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least a 17.8% reduction in mean square error (MSE) in estimating the yearly PM(2.5). Moreover, the MWBME method further reduces the MSE by 8.4-43.7% as the proportion of incomplete data increases from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM(2.5) across large geographical domains with expected spatial non-stationarity.

  20. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

  1. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic-prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic-syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and 79.8% and 90.3% on the Boston Directions corpus. The phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of the maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083

  2. Holographic equipartition and the maximization of entropy

    NASA Astrophysics Data System (ADS)

    Krishna, P. B.; Mathew, Titus K.

    2017-09-01

The accelerated expansion of the Universe can be interpreted as a tendency to satisfy holographic equipartition. It can be expressed by a simple law, ΔV = Δt (N_surf − ε N_bulk), where V is the Hubble volume in Planck units, t is the cosmic time in Planck units, and N_surf/bulk is the number of degrees of freedom on the horizon/bulk of the Universe. We show that this holographic equipartition law effectively implies the maximization of entropy. In the cosmological context, a system that obeys the holographic equipartition law behaves as an ordinary macroscopic system that proceeds to an equilibrium state of maximum entropy. We consider the standard ΛCDM model of the Universe and show that it is consistent with the holographic equipartition law. Analyzing the entropy evolution, we find that it also proceeds to an equilibrium state of maximum entropy.

  3. How the Second Law of Thermodynamics Has Informed Ecosystem Ecology through Its History

    NASA Astrophysics Data System (ADS)

    Chapman, E. J.; Childers, D. L.; Vallino, J. J.

    2014-12-01

Throughout the history of ecosystem ecology many attempts have been made to develop a general principle governing how systems develop and organize. We reviewed the historical developments that led to the conceptualization of several goal-oriented principles in ecosystem ecology and the relationships among them. We focused our review on two prominent principles, the Maximum Power Principle (MPP) and the Maximum Entropy Production Principle (MEPP), and the literature that applies to both. While these principles have considerable conceptual overlap and both use concepts from physics (power and entropy), we found considerable differences in their historical development, the disciplines that apply these principles, and their adoption in the literature. We reviewed the literature using Web of Science keyword searches for the MPP and the MEPP, as well as for papers that cited pioneers in MPP and MEPP development. From the 6000 papers that our keyword searches returned, we limited our further meta-analysis to 32 papers by focusing on studies with a foundation in ecosystems research. Despite these seemingly disparate pasts, we concluded that the conceptual approaches of these two principles were more similar than dissimilar and that maximization of power in ecosystems occurs with maximum entropy production. We also found that these two principles have great potential to explain how systems develop, organize, and function, but there are no widely agreed upon theoretical derivations for the MEPP or the MPP, possibly hindering their broader use in ecological research. We end with recommendations for how ecosystems-level studies may better use these principles.

  4. Entropic Inference

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2011-03-01

In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops, the Maximum Entropy and the Bayesian methods, into a single general inference scheme.
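
The MaxEnt special case can be made concrete with the classic Brandeis dice example: maximizing Shannon entropy over die faces 1..6 subject to a fixed mean yields Gibbs weights p_i ∝ exp(-λi), with the multiplier λ fixed by the mean constraint. A small sketch (a standard illustration, not code from this tutorial; the bisection bounds are my assumptions):

```python
import math

def maxent_die(target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution on faces 1..6 with the given mean,
    found by bisecting on the Lagrange multiplier in p_i = exp(-lam*i)/Z."""
    faces = range(1, 7)

    def mean(lam):
        w = [math.exp(-lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    # mean(lam) decreases monotonically in lam, so bisection applies.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)   # biased die with mean 4.5 (Jaynes' example)
assert abs(sum(i * pi for i, pi in zip(range(1, 7), p)) - 4.5) < 1e-9
```

With target mean 3.5 the same routine returns the uniform distribution (λ = 0), recovering the unconstrained entropy maximum.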

  5. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Treesearch

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...

  6. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  7. Maximum efficiency of ideal heat engines based on a small system: correction to the Carnot efficiency at the nanoscale.

    PubMed

    Quan, H T

    2014-06-01

We study the maximum efficiency of a heat engine based on a small system. It is revealed that, due to the finiteness of the system, irreversibility may arise when the working substance is in contact with a heat reservoir. As a result, there is a working-substance-dependent correction to the Carnot efficiency. We derive a general and simple expression for the maximum efficiency of a Carnot cycle heat engine in terms of the relative entropy. This maximum efficiency approaches the Carnot efficiency asymptotically as the size of the working substance increases to the thermodynamic limit. Our study extends Carnot's result of the maximum efficiency to an arbitrary working substance and elucidates the subtlety of thermodynamic laws in small systems.

  8. Identification of a Threshold Value for the DEMATEL Method: Using the Maximum Mean De-Entropy Algorithm

    NASA Astrophysics Data System (ADS)

    Chung-Wei, Li; Gwo-Hshiung, Tzeng

To deal with complex problems, structuring them through graphical representations and analyzing causal influences can aid in illuminating complex issues, systems, or concepts. The DEMATEL method is a methodology which can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation, the impact-relations map, by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. The most important property of the DEMATEL method used in the multi-criteria decision making (MCDM) field is to construct interrelations between criteria. In order to obtain a suitable impact-relations map, an appropriate threshold value is needed to obtain adequate information for further analysis and decision making. In this paper, we propose a method based on the entropy approach, the maximum mean de-entropy algorithm, to achieve this purpose. Using real cases of finding the interrelationships between criteria for evaluating effects in E-learning programs as an example, we compare the results obtained from the respondents and from our method, and discuss the differences between the impact-relations maps produced by the two approaches.

  9. Economics and Maximum Entropy Production

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.

    2003-04-01

Price differentials, sales volume and profit can be seen as analogues of temperature difference, heat flow and work or entropy production in the climate system. One aspect in which economic systems exhibit more clarity than the climate is that the empirical and/or statistical mechanical tendency for systems to seek a maximum in production is very evident in economics, in that the profit motive is very clear. Noting the common link between 1/f noise, power laws and self-organized criticality with maximum entropy production, the power-law fluctuations in security and commodity prices are not inconsistent with the analogy. There is an additional thermodynamic analogy, in that scarcity is valued. A commodity concentrated among a few traders is valued highly by the many who do not have it. The market therefore encourages via prices the spreading of those goods among a wider group, just as heat tends to diffuse, increasing entropy. I explore some empirical price-volume relationships of metals and meteorites in this context.

  10. Image construction from the IRAS survey and data fusion

    NASA Technical Reports Server (NTRS)

    Bontekoe, Tj. R.

    1990-01-01

The IRAS survey data can be used successfully to produce images of extended objects. The major difficulties, viz. non-uniform sampling, different response functions for each detector, and varying signal-to-noise levels for each detector for each scan, were resolved. The results of three different image construction techniques are compared: co-addition, constrained least squares, and maximum entropy. The maximum entropy result is superior. An image of the galaxy M51 with an average spatial resolution of 45 arc seconds is presented, using 60 micron survey data. This exceeds the telescope diffraction limit of 1 minute of arc at this wavelength. Data fusion is a proposed method for combining data from different instruments, with different spatial resolutions, at different wavelengths. Direct estimates of the physical parameters, temperature, density and composition, can be made from the data without prior image (re)construction. An increase in the accuracy of these parameters is expected as the result of this more systematic approach.

  11. Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.

    PubMed

    O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao

    2017-07-01

Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what a mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.

  12. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    USGS Publications Warehouse

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
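
The "iteratively select the most environmentally dissimilar site" step resembles greedy farthest-point sampling in the space of standardized environmental factors. A toy sketch of that greedy rule (my illustration of the general idea, not the authors' MaxEnt implementation; the one-factor example data are invented):

```python
import numpy as np

def select_dissimilar_sites(features, k):
    """Greedy farthest-point selection: start from the site farthest from
    the centroid, then repeatedly add the site whose minimum distance to
    the already-selected set is largest."""
    X = np.asarray(features, dtype=float)
    centroid = X.mean(axis=0)
    selected = [int(np.argmax(np.linalg.norm(X - centroid, axis=1)))]
    while len(selected) < k:
        # Distance of every candidate to its nearest already-selected site.
        dists = np.min(
            [np.linalg.norm(X - X[s], axis=1) for s in selected], axis=0)
        selected.append(int(np.argmax(dists)))
    return selected

# One environmental factor, eleven candidate sites along a gradient:
sites = np.arange(11.0).reshape(-1, 1)
picked = select_dissimilar_sites(sites, 3)
assert sorted(sites[picked].ravel()) == [0.0, 5.0, 10.0]
```

On this gradient the rule picks the two extremes and the midpoint, i.e., the sites spanning the widest range of the environmental envelope, which is the behaviour the abstract describes.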

  13. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1 weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
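
The moment-constrained MaxEnt density has the exponential-family form p(x) ∝ exp(-Σ_k λ_k x^k), with the multipliers fixed by matching the prescribed moments; numerically one can minimize the convex dual log Z(λ) + λ·m. A rough grid-based sketch of the classical method (my illustration with assumed grid, step size, and two moment constraints; not Bretthorst's Bayesian treatment):

```python
import numpy as np

def maxent_moments(moments, x, steps=5000, eta=0.2):
    """Fit p(x) ∝ exp(-lam1*x - lam2*x**2) on the grid x so that its first
    two moments match `moments`, by gradient descent on the convex dual
    G(lam) = log Z(lam) + lam . m, whose gradient is m - E[f]."""
    m = np.asarray(moments, dtype=float)
    feats = np.stack([x, x**2])          # constraint functions f_1, f_2
    lam = np.zeros(2)
    dx = x[1] - x[0]
    for _ in range(steps):
        logp = -(lam @ feats)
        p = np.exp(logp - logp.max())    # stabilized exponential
        p /= p.sum() * dx                # normalized density on the grid
        expect = feats @ p * dx          # current moments E[f_k]
        lam -= eta * (m - expect)        # descend the dual
    return lam, p

x = np.linspace(-6, 6, 2001)
lam, p = maxent_moments([0.0, 1.0], x)   # mean 0, second moment 1
dx = x[1] - x[0]
assert abs((x * p).sum() * dx) < 1e-6          # fitted mean
assert abs((x**2 * p).sum() * dx - 1.0) < 1e-6  # fitted second moment
```

With mean 0 and second moment 1 the recovered density is the standard Gaussian (λ₂ → 1/2), the textbook check for a moment-constrained MaxEnt fit.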

  14. Time dependence of Hawking radiation entropy

    NASA Astrophysics Data System (ADS)

    Page, Don N.

    2013-09-01

If a black hole starts in a pure quantum state and evaporates completely by a unitary process, the von Neumann entropy of the Hawking radiation initially increases and then decreases back to zero when the black hole has disappeared. Here numerical results are given for an approximation to the time dependence of the radiation entropy under an assumption of fast scrambling, for large nonrotating black holes that emit essentially only photons and gravitons. The maximum of the von Neumann entropy then occurs after about 53.81% of the evaporation time, when the black hole has lost about 40.25% of its original Bekenstein-Hawking (BH) entropy (an upper bound for its von Neumann entropy) and then has a BH entropy that equals the entropy in the radiation, which is about 59.75% of the original BH entropy 4πM_0^2, or about 7.509 M_0^2 ≈ 6.268 × 10^76 (M_0/M_solar)^2, using my 1976 calculations that the photon and graviton emission process into empty space gives about 1.4847 times the BH entropy loss of the black hole. Results are also given for black holes in initially impure states. If the black hole starts in a maximally mixed state, the von Neumann entropy of the Hawking radiation increases from zero up to a maximum of about 119.51% of the original BH entropy, or about 15.018 M_0^2 ≈ 1.254 × 10^77 (M_0/M_solar)^2, and then decreases back down to 4πM_0^2 = 1.049 × 10^77 (M_0/M_solar)^2.
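
The quoted figures are internally consistent and can be sanity-checked in a few lines, since an entropy of c·M_0^2 in Planck units equals c·(M_0/m_Planck)^2 for a mass in kilograms. A quick arithmetic check (the SI constants and solar mass value are my assumptions, rounded to four figures):

```python
import math

# Rounded SI constants (assumed): G, c, hbar, and the solar mass in kg.
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
M_sun = 1.989e30
m_planck = math.sqrt(hbar * c / G)     # Planck mass in kg
ratio2 = (M_sun / m_planck) ** 2       # (M_0/m_Planck)^2 for M_0 = 1 solar mass

# Each (coefficient of M_0^2, quoted numeric value) pair from the abstract:
for coeff, quoted in [(7.509, 6.268e76),
                      (15.018, 1.254e77),
                      (4 * math.pi, 1.049e77)]:
    assert abs(coeff * ratio2 / quoted - 1) < 0.01   # agree to within 1%

# 59.75% of 4*pi*M_0^2 is indeed about 7.509*M_0^2:
assert abs(0.5975 * 4 * math.pi - 7.509) < 0.01
```

All three quoted values reproduce to well within 1%, confirming the exponents recovered above.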

  15. Using maximum entropy modeling to identify and prioritize red spruce forest habitat in West Virginia

    Treesearch

    Nathan R. Beane; James S. Rentch; Thomas M. Schuler

    2013-01-01

    Red spruce forests in West Virginia are found in island-like distributions at high elevations and provide essential habitat for the endangered Cheat Mountain salamander and the recently delisted Virginia northern flying squirrel. Therefore, it is important to identify restoration priorities of red spruce forests. Maximum entropy modeling was used to identify areas of...

  16. n-Order and maximum fuzzy similarity entropy for discrimination of signals of different complexity: Application to fetal heart rate signals.

    PubMed

    Zaylaa, Amira; Oudjemia, Souad; Charara, Jamal; Girault, Jean-Marc

    2015-09-01

    This paper presents two new concepts for discrimination of signals of different complexity. The first focused initially on solving the problem of setting entropy descriptors by varying the pattern size instead of the tolerance. This led to the search for the optimal pattern size that maximized the similarity entropy. The second paradigm was based on the n-order similarity entropy that encompasses the 1-order similarity entropy. To improve the statistical stability, n-order fuzzy similarity entropy was proposed. Fractional Brownian motion was simulated to validate the different methods proposed, and fetal heart rate signals were used to discriminate normal from abnormal fetuses. In all cases, it was found that it was possible to discriminate time series of different complexity such as fractional Brownian motion and fetal heart rate signals. The best levels of performance in terms of sensitivity (90%) and specificity (90%) were obtained with the n-order fuzzy similarity entropy. However, it was shown that the optimal pattern size and the maximum similarity measurement were related to intrinsic features of the time series. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Forest Tree Species Distribution Mapping Using Landsat Satellite Imagery and Topographic Variables with the Maximum Entropy Method in Mongolia

    NASA Astrophysics Data System (ADS)

    Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn

    2016-06-01

    Forests are important ecosystems and natural resources. Based on forest inventories, governments can make decisions to conserve, improve and manage forests in a sustainable way. Field work for forest investigation is difficult and time consuming because it requires intensive physical labor and is costly, especially when surveying remote mountainous regions. A reliable forest inventory provides more accurate and timely information for developing new and efficient approaches to forest management, and remote sensing has recently been used for large-scale forest investigation. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul Province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest, comprising almost 70% of the total territory of Erdenebulgan County, and is located in a high mountain region in northern Mongolia. Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was obtained from the Forest Division of the Mongolian Ministry of Nature and Environment as training data, and was also used as ground truth for the accuracy assessment of the tree species classification. The Landsat images and DEM were processed for maximum entropy modeling, and the model was applied in two experiments: the first used Landsat surface reflectance alone for tree species classification; the second incorporated terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess classification accuracy.
    Results show that the second experiment, which used Landsat surface reflectance coupled with terrain variables, produced the better result, with a higher overall accuracy and kappa coefficient than the first. The results indicate that the Maximum Entropy method is applicable here, and that classifying tree species from satellite imagery coupled with terrain information can improve the classification of tree species in the study area.
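    As a sketch of the classifier family involved (multinomial logistic regression is the maximum entropy model for classification), the workflow can be illustrated with synthetic stand-ins for the spectral and terrain features; the data, dimensions and species count below are illustrative, not the study's actual Landsat/DEM inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-pixel features: 6 "spectral bands" plus
# 2 "terrain variables"; 3 hypothetical tree species as class labels.
n, d, k = 600, 8, 3
X = rng.normal(size=(n, d))
true_W = rng.normal(size=(d, k))
y = np.argmax(X @ true_W + rng.normal(scale=0.5, size=(n, k)), axis=1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Multinomial logistic regression is the maximum-entropy classifier: among all
# conditional distributions matching the empirical feature expectations, it has
# maximal entropy.  Fit by plain gradient descent on the negative log-likelihood.
W = np.zeros((d, k))
Y = np.eye(k)[y]                           # one-hot labels
for _ in range(1000):
    P = softmax(X @ W)
    W -= 0.5 * (X.T @ (P - Y)) / n         # averaged gradient step

accuracy = np.mean(np.argmax(X @ W, axis=1) == y)
```

    Adding the two terrain columns to the feature matrix is exactly how the study's second experiment extends the first: the model form is unchanged, only the feature set grows.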

  18. Automatic recognition of topic-classified relations between prostate cancer and genes using MEDLINE abstracts

    PubMed Central

    Chun, Hong-Woo; Tsuruoka, Yoshimasa; Kim, Jin-Dong; Shiba, Rie; Nagata, Naoki; Hishiki, Teruyoshi; Tsujii, Jun'ichi

    2006-01-01

    Background: Automatic recognition of relations between a specific disease term and its relevant genes or protein terms is an important practice of bioinformatics. Considering the utility of the results of this approach, we identified prostate cancer and gene terms with the ID tags of public biomedical databases. Moreover, considering that genetics experts will use our results, we classified them based on six topics that can be used to analyze the type of prostate cancers, genes, and their relations. Methods: We developed a maximum entropy-based named entity recognizer and a relation recognizer and applied them to a corpus-based approach. We collected prostate cancer-related abstracts from MEDLINE, and constructed an annotated corpus of gene and prostate cancer relations based on six topics by biologists. We used it to train the maximum entropy-based named entity recognizer and relation recognizer. Results: Topic-classified relation recognition achieved 92.1% precision for the relation (an increase of 11.0% from that obtained in a baseline experiment). For all topics, the precision was between 67.6 and 88.1%. Conclusion: A series of experimental results revealed two important findings: a carefully designed relation recognition system using named entity recognition can improve the performance of relation recognition, and topic-classified relation recognition can be effectively addressed through a corpus-based approach using manual annotation and machine learning techniques. PMID:17134477

  19. Automatic recognition of topic-classified relations between prostate cancer and genes using MEDLINE abstracts.

    PubMed

    Chun, Hong-Woo; Tsuruoka, Yoshimasa; Kim, Jin-Dong; Shiba, Rie; Nagata, Naoki; Hishiki, Teruyoshi; Tsujii, Jun'ichi

    2006-11-24

    Automatic recognition of relations between a specific disease term and its relevant genes or protein terms is an important practice of bioinformatics. Considering the utility of the results of this approach, we identified prostate cancer and gene terms with the ID tags of public biomedical databases. Moreover, considering that genetics experts will use our results, we classified them based on six topics that can be used to analyze the type of prostate cancers, genes, and their relations. We developed a maximum entropy-based named entity recognizer and a relation recognizer and applied them to a corpus-based approach. We collected prostate cancer-related abstracts from MEDLINE, and constructed an annotated corpus of gene and prostate cancer relations based on six topics by biologists. We used it to train the maximum entropy-based named entity recognizer and relation recognizer. Topic-classified relation recognition achieved 92.1% precision for the relation (an increase of 11.0% from that obtained in a baseline experiment). For all topics, the precision was between 67.6 and 88.1%. A series of experimental results revealed two important findings: a carefully designed relation recognition system using named entity recognition can improve the performance of relation recognition, and topic-classified relation recognition can be effectively addressed through a corpus-based approach using manual annotation and machine learning techniques.

  20. The moving-window Bayesian Maximum Entropy framework: Estimation of PM2.5 yearly average concentration across the contiguous United States

    PubMed Central

    Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L.

    2013-01-01

    Geostatistical methods are widely used in estimating long-term exposures for air pollution epidemiological studies, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian Maximum Entropy (BME) method and applied this framework to estimate fine particulate matter (PM2.5) yearly average concentrations over the contiguous U.S. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM2.5 data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least a 17.8% reduction in mean square error (MSE) in estimating the yearly PM2.5. Moreover, the MWBME method further reduces the MSE by 8.4% to 43.7% as the proportion of incomplete data increases from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM2.5 across large geographical domains with expected spatial non-stationarity. PMID:22739679

  1. Efficient Bayesian experimental design for contaminant source identification

    NASA Astrophysics Data System (ADS)

    Zhang, Jiangjiang; Zeng, Lingzao; Chen, Cheng; Chen, Dingjiang; Wu, Laosheng

    2015-01-01

    In this study, an efficient full Bayesian approach is developed for the optimal sampling well location design and source parameters identification of groundwater contaminants. An information measure, i.e., the relative entropy, is employed to quantify the information gain from concentration measurements in identifying unknown parameters. In this approach, the sampling locations that give the maximum expected relative entropy are selected as the optimal design. After the sampling locations are determined, a Bayesian approach based on Markov Chain Monte Carlo (MCMC) is used to estimate unknown parameters. In both the design and estimation, the contaminant transport equation is required to be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on the adaptive sparse grid is utilized to construct a surrogate for the contaminant transport equation. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. It is shown that the methods can be used to assist in both single sampling location and monitoring network design for contaminant source identifications in groundwater.
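    The design criterion (choose the sampling location with the maximum expected relative entropy between posterior and prior) can be illustrated on a toy linear-Gaussian stand-in for the transport model, where the posterior and the KL divergence are available in closed form and the expectation over data is taken by Monte Carlo. The sensitivity function `g`, the noise level, and the candidate locations below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the contaminant-transport forward model: unknown source
# strength theta ~ N(0,1); a sample at location x measures y = g(x)*theta + noise.
def g(x):
    return np.exp(-x)          # hypothetical sensitivity decaying with distance

sigma = 0.5                    # measurement noise standard deviation

def expected_relative_entropy(x, n_mc=4000):
    """Monte Carlo estimate of the expected KL divergence (information gain)
    from the prior N(0,1) to the posterior, averaged over simulated data."""
    gain = 0.0
    for _ in range(n_mc):
        theta = rng.normal()
        y = g(x) * theta + sigma * rng.normal()
        # conjugate posterior N(mu, tau2) for this linear-Gaussian model
        tau2 = 1.0 / (1.0 + g(x) ** 2 / sigma ** 2)
        mu = tau2 * g(x) * y / sigma ** 2
        # KL( N(mu, tau2) || N(0, 1) ) in closed form
        gain += 0.5 * (tau2 + mu ** 2 - 1.0 - np.log(tau2))
    return gain / n_mc

candidates = [0.2, 1.0, 3.0]
gains = {x: expected_relative_entropy(x) for x in candidates}
best = max(gains, key=gains.get)   # optimal design: largest expected gain
```

    In the paper the forward model is expensive, which is why a sparse-grid surrogate replaces it inside exactly this kind of Monte Carlo loop.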

  2. Interatomic potentials in condensed matter via the maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Carlsson, A. E.

    1987-09-01

    A general method is described for the calculation of interatomic potentials in condensed-matter systems by use of a maximum-entropy Ansatz for the interatomic correlation functions. The interatomic potentials are given explicitly in terms of statistical correlation functions involving the potential energy and the structure factor of a "reference medium." Illustrations are given for Al-Cu alloys and a model transition metal.

  3. Stochastic characteristics of different duration annual maximum rainfall and its spatial difference in China based on information entropy

    NASA Astrophysics Data System (ADS)

    Li, X.; Sang, Y. F.

    2017-12-01

    Mountain torrents, urban floods and other disasters caused by extreme precipitation bring great losses to the ecological environment, social and economic development, and people's lives and property. Studying the spatial distribution of extreme rainfall is therefore of great significance for flood prevention and control. Based on annual maximum rainfall data for durations of 60 min, 6 h and 24 h, this paper generates long sequences following the Pearson-III distribution and then uses an information entropy index to study the spatial distribution and differences across durations. The results show that the information entropy of annual maximum rainfall in the southern region is greater than that in the northern region, indicating more obvious stochastic characteristics of annual maximum rainfall in the south. However, the spatial distribution of these stochastic characteristics differs with duration. For example, the stochastic characteristics of the 60 min annual maximum rainfall in eastern Tibet are weaker than in the surrounding area, whereas those of the 6 h and 24 h annual maximum rainfall are stronger. In the Haihe and Huaihe River Basins, the stochastic characteristics of the 60 min annual maximum rainfall do not differ significantly from the surrounding area, while those of the 6 h and 24 h rainfall are weaker. We conclude that the spatial distribution of the information entropy of annual maximum rainfall at different durations can reflect the spatial distribution of its stochastic characteristics, and the results can thus provide an important scientific basis for flood prevention and control, agriculture, economic and social development, and urban waterlogging control.
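    The entropy index can be sketched numerically. Gamma draws stand in for Pearson-III annual-maximum series (the gamma distribution is the two-parameter Pearson type III; the shape and scale values below are illustrative only): a more variable series yields a larger sample entropy.

```python
import numpy as np

rng = np.random.default_rng(2)

def hist_entropy(x, bins=30):
    """Estimate the information entropy (nats) of a sample via a histogram;
    adding log(bin width) makes the estimate comparable across scales."""
    counts, edges = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    width = edges[1] - edges[0]
    return -(p * np.log(p)).sum() + np.log(width)

# Two synthetic annual-maximum series: same shape, but the second is
# three times as variable, so its entropy is larger by about log(3).
series_a = rng.gamma(shape=2.0, scale=10.0, size=20000)
series_b = rng.gamma(shape=2.0, scale=30.0, size=20000)
h_a, h_b = hist_entropy(series_a), hist_entropy(series_b)
```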

  4. Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies: a basis for q-exponential distributions.

    PubMed

    Abe, Sumiyoshi

    2002-10-01

    The q-exponential distributions, which are generalizations of the Zipf-Mandelbrot power-law distribution, are frequently encountered in complex systems at their stationary states. From the viewpoint of the principle of maximum entropy, they can apparently be derived from three different generalized entropies: the Rényi entropy, the Tsallis entropy, and the normalized Tsallis entropy. Accordingly, mere fittings of observed data by the q-exponential distributions do not lead to identification of the correct physical entropy. Here, stabilities of these entropies, i.e., their behaviors under arbitrary small deformation of a distribution, are examined. It is shown that, among the three, the Tsallis entropy is stable and can provide an entropic basis for the q-exponential distributions, whereas the others are unstable and cannot represent any experimentally observable quantities.
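    For reference, the three candidate entropies are one-line formulas, and a quick numerical check confirms that both generalized forms recover the ordinary Shannon (Boltzmann-Gibbs) entropy in the q → 1 limit; the probability vector below is an arbitrary example:

```python
import numpy as np

def shannon(p):
    p = np.asarray(p, dtype=float)
    return -(p * np.log(p)).sum()

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - (p ** q).sum()) / (q - 1.0)

def renyi(p, q):
    """Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q)."""
    p = np.asarray(p, dtype=float)
    return np.log((p ** q).sum()) / (1.0 - q)

p = np.array([0.5, 0.3, 0.2])   # arbitrary example distribution
```

    The stability question in the abstract concerns how these functionals respond to small deformations of `p` as the number of outcomes grows, which the definitions alone make easy to experiment with.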

  5. Time dependence of Hawking radiation entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, Don N., E-mail: profdonpage@gmail.com

    2013-09-01

    If a black hole starts in a pure quantum state and evaporates completely by a unitary process, the von Neumann entropy of the Hawking radiation initially increases and then decreases back to zero when the black hole has disappeared. Here numerical results are given for an approximation to the time dependence of the radiation entropy under an assumption of fast scrambling, for large nonrotating black holes that emit essentially only photons and gravitons. The maximum of the von Neumann entropy then occurs after about 53.81% of the evaporation time, when the black hole has lost about 40.25% of its original Bekenstein-Hawking (BH) entropy (an upper bound for its von Neumann entropy) and then has a BH entropy that equals the entropy in the radiation, which is about 59.75% of the original BH entropy 4πM₀², or about 7.509M₀² ≈ 6.268 × 10⁷⁶(M₀/M_sun)², using my 1976 calculations that the photon and graviton emission process into empty space gives about 1.4847 times the BH entropy loss of the black hole. Results are also given for black holes in initially impure states. If the black hole starts in a maximally mixed state, the von Neumann entropy of the Hawking radiation increases from zero up to a maximum of about 119.51% of the original BH entropy, or about 15.018M₀² ≈ 1.254 × 10⁷⁷(M₀/M_sun)², and then decreases back down to 4πM₀² = 1.049 × 10⁷⁷(M₀/M_sun)².
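    The quoted percentages all follow from the single emission ratio 1.4847, as a short arithmetic check shows (the factor of two for the maximally mixed state is taken from the abstract's own figures):

```python
import math

# S_BH = 4*pi*M0^2 in Planck units.  Page's 1976 result: photon + graviton
# emission into empty space carries radiation entropy equal to about
# 1.4847 times the BH entropy lost by the hole.
ratio = 1.4847

# The radiation entropy equals the remaining BH entropy when
# 1 - x = ratio * x, with x the fraction of BH entropy lost:
x = 1.0 / (1.0 + ratio)          # fraction lost at the entropy crossover (~40.25%)
s_rad = ratio * x                # radiation entropy as a fraction of 4*pi*M0^2 (~59.75%)
coeff = 4.0 * math.pi * s_rad    # the same entropy in units of M0^2 (~7.509)
s_mixed_peak = 2.0 * s_rad       # maximally mixed initial state peaks at twice this (~119.51%)
```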

  6. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models

    PubMed Central

    Grün, Sonja; Helias, Moritz

    2017-01-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities that experimentally would be equivalent to 90% of the neuron population being active within time windows of a few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of a macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundred or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition. PMID:28968396

  7. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models.

    PubMed

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-10-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities that experimentally would be equivalent to 90% of the neuron population being active within time windows of a few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of a macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundred or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.
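    The bistability and non-ergodicity can be sketched with a homogeneous (all-to-all, uniform-coupling) pairwise model; the coupling and bias below are illustrative values chosen to make the population activity bistable, not parameters fitted to any recording. Two Glauber runs that differ only in their initial state settle into different activity modes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Homogeneous pairwise (Ising-like) model of N binary neurons.
# J, h are illustrative: chosen so the population activity is bistable
# between a low-activity and an unrealistic high-activity mode.
N, J, h = 100, 0.2, -10.0

def glauber_run(s0, steps=5000):
    """Single-site Glauber dynamics; returns the final population activity."""
    s = s0.copy()
    for _ in range(steps):
        i = rng.integers(N)
        field = h + J * (s.sum() - s[i])              # input from the other neurons
        s[i] = rng.random() < 1.0 / (1.0 + np.exp(-field))
    return s.mean()

low = glauber_run(np.zeros(N, dtype=int))    # started silent: stays in the low mode
high = glauber_run(np.ones(N, dtype=int))    # started active: stays in the high mode
```

    The same trapping is what breaks Boltzmann learning: a sampler stuck in one mode cannot estimate the model's true (bimodal) statistics.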

  8. Cosmic equilibration: A holographic no-hair theorem from the generalized second law

    NASA Astrophysics Data System (ADS)

    Carroll, Sean M.; Chatwin-Davies, Aidan

    2018-02-01

    In a wide class of cosmological models, a positive cosmological constant drives cosmological evolution toward an asymptotically de Sitter phase. Here we connect this behavior to the increase of entropy over time, based on the idea that de Sitter spacetime is a maximum-entropy state. We prove a cosmic no-hair theorem for Robertson-Walker and Bianchi I spacetimes that admit a Q-screen ("quantum" holographic screen) with certain entropic properties: If generalized entropy, in the sense of the cosmological version of the generalized second law conjectured by Bousso and Engelhardt, increases up to a finite maximum value along the screen, then the spacetime is asymptotically de Sitter in the future. Moreover, the limiting value of generalized entropy coincides with the de Sitter horizon entropy. We do not use the Einstein field equations in our proof, nor do we assume the existence of a positive cosmological constant. As such, asymptotic relaxation to a de Sitter phase can, in a precise sense, be thought of as cosmological equilibration.

  9. Hydrodynamic cavitation: from theory towards a new experimental approach

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto; Gervino, Gianpiero

    2009-09-01

    Hydrodynamic cavitation is analysed by a global thermodynamic principle, following an approach based on the maximum irreversible entropy variation that has already given promising results for open systems and has been successfully applied to specific engineering problems. In this paper we present a new phenomenological method to evaluate the conditions inducing cavitation. We think this method could be useful in the design of turbomachinery and related technologies: it represents both an original physical approach to cavitation and an economic saving in planning, because the theoretical analysis could allow engineers to reduce the experimental tests and the costs of the design process.

  10. Maximum Entropy for the International Division of Labor.

    PubMed

    Lei, Hongmei; Chen, Ying; Li, Ruiqi; He, Deli; Zhang, Jiang

    2015-01-01

    As a result of the international division of labor, the trade value distribution on different products substantiated by international trade flows can be regarded as one country's strategy for competition. According to the empirical data of trade flows, countries may spend a large fraction of export values on ubiquitous and competitive products. Meanwhile, countries may also diversify their export shares across different types of products to reduce risk. In this paper, we report that the export share distribution curves can be derived by maximizing the entropy of shares on different products under the product's complexity constraint once the international market structure (the country-product bipartite network) is given. Therefore, a maximum entropy model provides a good fit to empirical data. The empirical data is consistent with maximum entropy subject to a constraint on the expected value of the product complexity for each country. One country's strategy is mainly determined by the types of products this country can export. In addition, our model is able to fit the empirical export share distribution curves of nearly every country very well by tuning only one parameter.
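    Maximizing the entropy of the shares under a fixed mean complexity has the standard exponential (Boltzmann) solution, with one Lagrange multiplier per country. A sketch with hypothetical product complexities, solving for the multiplier by bisection:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical product complexities for one country's exportable products
# (illustrative stand-ins for the empirical country-product data).
c = rng.uniform(0.0, 5.0, size=50)
target = 1.5                      # required mean complexity of the export basket

def shares(lam):
    """Maximum-entropy shares under the mean-complexity constraint:
    the Boltzmann-like solution p_i proportional to exp(-lam * c_i)."""
    z = -lam * c
    w = np.exp(z - z.max())       # shift exponents for numerical stability
    return w / w.sum()

def mean_complexity(lam):
    return (shares(lam) * c).sum()

# mean_complexity is monotonically decreasing in lam, so the constraint
# mean_complexity(lam) = target can be solved by bisection.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_complexity(mid) > target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = shares(lam)                   # the country's maximum-entropy share curve
```

    The single tuned parameter per country in the paper plays the role of `lam` here.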

  11. Maximum Entropy for the International Division of Labor

    PubMed Central

    Lei, Hongmei; Chen, Ying; Li, Ruiqi; He, Deli; Zhang, Jiang

    2015-01-01

    As a result of the international division of labor, the trade value distribution on different products substantiated by international trade flows can be regarded as one country’s strategy for competition. According to the empirical data of trade flows, countries may spend a large fraction of export values on ubiquitous and competitive products. Meanwhile, countries may also diversify their export shares across different types of products to reduce risk. In this paper, we report that the export share distribution curves can be derived by maximizing the entropy of shares on different products under the product’s complexity constraint once the international market structure (the country-product bipartite network) is given. Therefore, a maximum entropy model provides a good fit to empirical data. The empirical data is consistent with maximum entropy subject to a constraint on the expected value of the product complexity for each country. One country’s strategy is mainly determined by the types of products this country can export. In addition, our model is able to fit the empirical export share distribution curves of nearly every country very well by tuning only one parameter. PMID:26172052

  12. Maximum entropy production in environmental and ecological systems.

    PubMed

    Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M

    2010-05-12

    The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex with numerous interactions and feedbacks, and they are irreversible in their nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest development of applications of MEP to the biosphere-atmosphere system including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.

  13. Propane spectral resolution enhancement by the maximum entropy method

    NASA Technical Reports Server (NTRS)

    Bonavito, N. L.; Stewart, K. P.; Hurley, E. J.; Yeh, K. C.; Inguva, R.

    1990-01-01

    The Burg algorithm for maximum entropy power spectral density estimation is applied to a time series of data obtained from a Michelson interferometer and compared with a standard FFT estimate for resolution capability. The propane transmittance spectrum was estimated by use of the FFT with a 2^18-sample interferogram, giving a maximum unapodized resolution of 0.06/cm. This estimate was then interpolated by zero filling an additional 2^18 points, and the final resolution was taken to be 0.06/cm. Comparison of the maximum entropy method (MEM) estimate with the FFT was made over a 45/cm region of the spectrum for several increasing record lengths of interferogram data beginning at 2^10 samples. It is found that over this region the MEM estimate with 2^16 data samples is in close agreement with the FFT estimate using 2^18 samples.
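    Burg's recursion estimates an all-pole (autoregressive) model directly from the data, and the MEM spectrum follows from the model; this is why it resolves sharp lines from records much shorter than the FFT needs. A minimal sketch on a synthetic noisy sinusoid (not the interferometer data):

```python
import numpy as np

def burg(x, order):
    """Burg's method: reflection-coefficient recursion giving AR coefficients
    a (convention A(z) = 1 + a1*z^-1 + ...) and residual error power e."""
    x = np.asarray(x, dtype=float)
    f, b = x.copy(), x.copy()           # forward / backward prediction errors
    a = np.zeros(order)
    e = np.mean(x ** 2)
    for m in range(order):
        fp, bp = f[m + 1:].copy(), b[m:-1].copy()
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        a_prev = a[:m].copy()
        a[m] = k
        a[:m] = a_prev + k * a_prev[::-1]   # Levinson-style coefficient update
        f[m + 1:] = fp + k * bp
        b[m + 1:] = bp + k * fp
        e *= 1.0 - k * k
    return a, e

def mem_psd(a, e, freqs):
    """Maximum-entropy (all-pole) spectral estimate at normalized frequencies."""
    m = np.arange(1, len(a) + 1)
    A = 1.0 + np.exp(-2j * np.pi * np.outer(freqs, m)) @ a
    return e / np.abs(A) ** 2

# Noisy sinusoid at 0.2 cycles/sample: MEM pins the line from a short record.
rng = np.random.default_rng(6)
n = np.arange(200)
x = np.sin(2 * np.pi * 0.2 * n) + 0.05 * rng.normal(size=200)
a, e = burg(x, order=4)
freqs = np.linspace(0.0, 0.5, 501)
peak = freqs[np.argmax(mem_psd(a, e, freqs))]
```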

  14. Entropy production rate as a criterion for inconsistency in decision theory

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.

    2018-05-01

    Individual and group decisions are complex, often involving choosing an apt alternative from a multitude of options. Evaluating pairwise comparisons breaks down such complex decision problems into tractable ones. Pairwise comparison matrices (PCMs) are regularly used to solve multiple-criteria decision-making problems, for example, using Saaty’s analytic hierarchy process (AHP) framework. However, there are two significant drawbacks of using PCMs. First, humans evaluate PCMs in an inconsistent manner. Second, not all entries of a large PCM can be reliably filled by human decision makers. We address these two issues by first establishing a novel connection between PCMs and time-irreversible Markov processes. Specifically, we show that every PCM induces a family of dissipative maximum path entropy random walks (MERW) over the set of alternatives. We show that only ‘consistent’ PCMs correspond to detailed balanced MERWs. We identify the non-equilibrium entropy production in the induced MERWs as a metric of inconsistency of the underlying PCMs. Notably, the entropy production satisfies all of the recently laid out criteria for reasonable consistency indices. We also propose an approach to use incompletely filled PCMs in AHP. Potential future avenues are discussed as well.
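    The link can be sketched with a simplified stand-in: below, the PCM induces an ordinary row-normalized random walk rather than the paper's maximum-path-entropy walk, but the same qualitative behavior holds, with steady-state entropy production vanishing exactly for a consistent matrix and turning positive once consistency is broken. The matrices are toy examples:

```python
import numpy as np

def entropy_production(A, iters=2000):
    """Steady-state entropy production rate of the random walk induced by a
    positive pairwise comparison matrix A (simplified row-normalized walk)."""
    P = A / A.sum(axis=1, keepdims=True)
    pi = np.ones(len(A)) / len(A)
    for _ in range(iters):                  # power iteration -> stationary dist.
        pi = pi @ P
    J = pi[:, None] * P                     # probability fluxes pi_i * p_ij
    off = ~np.eye(len(A), dtype=bool)
    # sigma = 0 iff detailed balance (J symmetric), i.e. a reversible walk
    return 0.5 * np.sum((J - J.T)[off] * np.log(J / J.T)[off])

v = np.array([1.0, 2.0, 4.0])               # hypothetical priority weights
consistent = v[:, None] / v[None, :]        # a_ij = v_i / v_j: fully consistent
inconsistent = consistent.copy()
inconsistent[0, 2] *= 3.0                   # break a_ij * a_jk = a_ik ...
inconsistent[2, 0] = 1.0 / inconsistent[0, 2]   # ... while keeping reciprocity
```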

  15. Using the Maximum Entropy Principle as a Unifying Theory Characterization and Sampling of Multi-Scaling Processes in Hydrometeorology

    DTIC Science & Technology

    2015-08-20

    evapotranspiration (ET) over oceans may be significantly lower than previously thought. The MEP model parameterized turbulent transfer coefficients...fluxes, ocean freshwater fluxes, regional crop yield among others. An on-going study suggests that the global annual evapotranspiration (ET) over...Bras, Jingfeng Wang. A model of evapotranspiration based on the theory of maximum entropy production, Water Resources Research, (03 2011): 0. doi

  16. Hydrodynamic equations for electrons in graphene obtained from the maximum entropy principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barletti, Luigi, E-mail: luigi.barletti@unifi.it

    2014-08-15

    The maximum entropy principle is applied to the formal derivation of isothermal, Euler-like equations for semiclassical fermions (electrons and holes) in graphene. After proving general mathematical properties of the equations so obtained, their asymptotic form corresponding to significant physical regimes is investigated. In particular, the diffusive regime, the Maxwell-Boltzmann regime (high temperature), the collimation regime and the degenerate gas limit (vanishing temperature) are considered.

  17. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  18. Assessing Resilience in Power Grids as a Particular Case of Supply Chain Management

    DTIC Science & Technology

    2010-03-01

    system, the budget needs, or the subject in question, would point to a differentiated approach. Table 1. Protection and Resilience Relationship...coast. Likewise, the US National Oceanic and Atmospheric Administration (NOAA) publishes 19 statistics about severe weather. Climatological models...toward maximum entropy. However, living systems are "open" in the sense that they continually draw upon external sources of energy and maintain a

  19. Estimation of Fine Particulate Matter in Taipei Using Landuse Regression and Bayesian Maximum Entropy Methods

    PubMed Central

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-01-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005–2007. PMID:21776223

  20. Lattice NRQCD study of S- and P-wave bottomonium states in a thermal medium with Nf=2 +1 light flavors

    NASA Astrophysics Data System (ADS)

    Kim, Seyong; Petreczky, Peter; Rothkopf, Alexander

    2015-03-01

    We investigate the properties of S- and P-wave bottomonium states in the vicinity of the deconfinement transition temperature. The light degrees of freedom are represented by dynamical lattice quantum chromodynamics (QCD) configurations of the HotQCD collaboration with Nf = 2+1 flavors. Bottomonium correlators are obtained from bottom quark propagators, computed in nonrelativistic QCD under the background of these gauge field configurations. The spectral functions for the ³S₁ (ϒ) and ³P₁ (χ_b1) channels are extracted from the Euclidean time correlators using a novel Bayesian approach in the temperature region 140 MeV ≤ T ≤ 249 MeV, and the results are contrasted with those from the standard maximum entropy method. We find that the new Bayesian approach is far superior to the maximum entropy method. It enables us to study reliably the presence or absence of the lowest state signal in the spectral function of a given channel, even under the limitations present in the finite temperature setup. We find that χ_b1 survives up to T = 249 MeV, the highest temperature considered in our study, and put stringent constraints on the size of the medium modification of the ϒ and χ_b1 states.

  1. Estimation of fine particulate matter in Taipei using landuse regression and bayesian maximum entropy methods.

    PubMed

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.

  2. Chemical library subset selection algorithms: a unified derivation using spatial statistics.

    PubMed

    Hamprecht, Fred A; Thiel, Walter; van Gunsteren, Wilfred F

    2002-01-01

    If similar compounds have similar activity, rational subset selection becomes superior to random selection in screening for pharmacological lead discovery programs. Traditional approaches to this experimental design problem fall into two classes: (i) a linear or quadratic response function is assumed, or (ii) some space-filling criterion is optimized. The assumptions underlying the first approach are clear but not always defensible; the second approach yields more intuitive designs but lacks a clear theoretical foundation. We model activity in a bioassay as the realization of a stochastic process and use the best linear unbiased estimator to construct spatial sampling designs that optimize the integrated mean square prediction error, the maximum mean square prediction error, or the entropy. We argue that our approach constitutes a unifying framework encompassing most proposed techniques as limiting cases and sheds light on their underlying assumptions. In particular, vector quantization is obtained, in dimensions up to eight, in the limiting case of very smooth response surfaces for the integrated mean square error criterion. Closest packing is obtained for very rough surfaces under the integrated mean square error and entropy criteria. We suggest using either the integrated mean square prediction error or the entropy as optimization criteria rather than approximations thereof, and we propose a scheme for direct iterative minimization of the integrated mean square prediction error. Finally, we discuss how the quality of chemical descriptors manifests itself and clarify the assumptions underlying the selection of diverse or representative subsets.
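
The entropy criterion can be made concrete with a toy greedy design: under a Gaussian-process model of the response surface, repeatedly picking the candidate with the largest posterior variance is the standard greedy approximation to maximum-entropy design. A minimal 1-D sketch (the squared-exponential covariance and its length scale are illustrative assumptions, not taken from the paper):

```python
import math

def rbf(a, b, ls=0.3):
    # squared-exponential covariance on a 1-D descriptor axis (assumed kernel)
    return math.exp(-((a - b) ** 2) / (2.0 * ls ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting, adequate for tiny systems
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def posterior_var(x, chosen, noise=1e-9):
    # GP predictive variance of x given the already-selected design points
    if not chosen:
        return rbf(x, x)
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(chosen)] for i, a in enumerate(chosen)]
    k = [rbf(x, c) for c in chosen]
    w = solve(K, k)
    return rbf(x, x) - sum(wi * ki for wi, ki in zip(w, k))

def entropy_design(candidates, m):
    # greedy: each step adds the point of maximum conditional variance,
    # which greedily maximizes the design entropy log det K
    chosen = []
    for _ in range(m):
        best = max(candidates, key=lambda x: posterior_var(x, chosen))
        chosen.append(best)
        candidates = [c for c in candidates if c != best]
    return chosen

candidates = [i / 20 for i in range(21)]  # grid on [0, 1]
design = entropy_design(candidates, 4)
```

The greedy picks spread out over the interval, illustrating why entropy-optimal designs resemble space-filling ones for rough covariances.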

  3. Moisture sorption isotherms and thermodynamic properties of bovine leather

    NASA Astrophysics Data System (ADS)

    Fakhfakh, Rihab; Mihoubi, Daoued; Kechaou, Nabil

    2018-04-01

    This study aimed to determine the moisture sorption characteristics of bovine leather using a static gravimetric method at 30, 40, 50, 60 and 70 °C. The curves exhibit type II behaviour according to the BET classification. Fitting the sorption isotherms with seven equations shows that the GAB model is able to reproduce the evolution of equilibrium moisture content with water activity over a moisture range of 0.02 to 0.83 kg/kg d.b. (0.9898 < R2 < 0.999). The sorption isotherms exhibit a hysteresis effect. Additionally, the sorption isotherm data were used to determine thermodynamic properties such as the isosteric heat of sorption, sorption entropy, spreading pressure, and net integral enthalpy and entropy. The net isosteric heat of sorption and the differential entropy were evaluated directly from the moisture isotherms by applying the Clausius-Clapeyron equation, and were used to investigate the enthalpy-entropy compensation theory. For desorption, both sorption enthalpy and entropy increase to a maximum with increasing moisture content and then decrease sharply as moisture content rises further. Adsorption enthalpy decreases with increasing moisture content, whereas adsorption entropy increases smoothly with increasing moisture content to a maximum of 6.29 J/(K mol). Spreading pressure increases with rising water activity. The net integral enthalpy decreases and then increases, becoming asymptotic, while the net integral entropy decreases with increasing moisture content.
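
The Clausius-Clapeyron step can be sketched numerically: at a fixed moisture content, the net isosteric heat is the slope of ln(aw) versus 1/T, times -R. A minimal illustration with synthetic data (the 20 kJ/mol value and the water activities below are invented for the demonstration, not taken from the study):

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def net_isosteric_heat(temps_K, water_activities):
    # linear regression of ln(aw) on 1/T; slope = -qst/R (Clausius-Clapeyron)
    xs = [1.0 / t for t in temps_K]
    ys = [math.log(a) for a in water_activities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -R * slope  # J/mol

# synthetic isotherm points at one moisture content, generated with a
# known qst so the routine can be checked (values are not physical data)
temps = [303.15, 313.15, 323.15, 333.15, 343.15]  # 30-70 degrees C
qst_true = 20000.0
aws = [0.5 * math.exp(-qst_true / (R * t)) for t in temps]
qst = net_isosteric_heat(temps, aws)
```

Because the synthetic ln(aw) is exactly linear in 1/T, the regression recovers the generating heat of sorption.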

  4. Validating predictions from climate envelope models

    USGS Publications Warehouse

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
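
Sensitivity and specificity as used above are plain confusion-matrix ratios over presence/absence records; a minimal sketch with hypothetical data (not the study's bird occurrences):

```python
def sensitivity_specificity(observed, predicted):
    # observed/predicted: iterables of booleans, True = species present
    tp = sum(1 for o, p in zip(observed, predicted) if o and p)
    fn = sum(1 for o, p in zip(observed, predicted) if o and not p)
    tn = sum(1 for o, p in zip(observed, predicted) if not o and not p)
    fp = sum(1 for o, p in zip(observed, predicted) if not o and p)
    # sensitivity = correctly classified presences; specificity = absences
    return tp / (tp + fn), tn / (tn + fp)

obs  = [True, True, True, True, False, False, False, False]
pred = [True, True, True, False, False, False, True, True]
sens, spec = sensitivity_specificity(obs, pred)
```

A model that over-predicts range expansion (many false presences) raises sensitivity at the cost of specificity, which is the trade-off the hybrid approach in the abstract exhibits.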

  5. Entropy jump across an inviscid shock wave

    NASA Technical Reports Server (NTRS)

    Salas, Manuel D.; Iollo, Angelo

    1995-01-01

    The shock jump conditions for the Euler equations in their primitive form are derived by using generalized functions. The shock profiles for specific volume, speed, and pressure are shown to be the same; however, density has a different shock profile. Careful study of the equations that govern the entropy shows that the inviscid entropy profile has a local maximum within the shock layer. We demonstrate that because of this phenomenon, the entropy propagation equation cannot be used as a conservation law.
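
The net entropy jump itself follows from the Rankine-Hugoniot relations; for a calorically perfect gas it can be sketched from the upstream Mach number alone (gamma and cv below are values for air, assumed for illustration):

```python
import math

def entropy_jump(M, gamma=1.4, cv=718.0):
    # Rankine-Hugoniot ratios across a normal shock in terms of Mach number M
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M * M - 1.0)   # p2/p1
    rho_ratio = (gamma + 1.0) * M * M / ((gamma - 1.0) * M * M + 2.0)  # rho2/rho1
    # ideal gas: ds = cv * ln[(p2/p1) * (rho1/rho2)^gamma], J/(kg K)
    return cv * math.log(p_ratio * rho_ratio ** (-gamma))
```

At M = 1 the jump vanishes, and it grows monotonically with shock strength, consistent with the second law for compression shocks.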

  6. An improved wavelet neural network medical image segmentation algorithm with combined maximum entropy

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoqian; Tao, Jinxu; Ye, Zhongfu; Qiu, Bensheng; Xu, Jinzhang

    2018-05-01

    In order to solve the problem of medical image segmentation, a wavelet neural network medical image segmentation algorithm based on a combined maximum entropy criterion is proposed. First, a bee colony algorithm is used to optimize the parameters of the wavelet neural network (network structure, initial weights, threshold values, and so on), so that training converges quickly to high precision and avoids falling into local extrema. Then, the optimal number of iterations is obtained by calculating the maximum entropy of the segmented image, achieving automatic and accurate segmentation. Medical image segmentation experiments show that the proposed algorithm effectively reduces sample training time and improves convergence precision, and that its segmentation is more accurate and effective than that of a traditional BP neural network (back-propagation neural network: a multilayer feed-forward network trained with the error back-propagation algorithm).
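
The maximum-entropy criterion for segmentation is easiest to see in its classic thresholding form (Kapur's method), which picks the gray-level threshold maximizing the summed entropies of foreground and background. This is a generic illustration of the criterion, not the authors' wavelet-network pipeline:

```python
import math

def kapur_threshold(hist):
    # hist: gray-level histogram; returns the bin index t that maximizes
    # H(background bins < t) + H(foreground bins >= t)
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_H = 0, float("-inf")
    for t in range(1, len(hist)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        H0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        H1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if H0 + H1 > best_H:
            best_H, best_t = H0 + H1, t
    return best_t

hist = [10, 20, 10, 0, 0, 0, 5, 15, 5]  # toy bimodal histogram
t = kapur_threshold(hist)
```

On the toy bimodal histogram the chosen threshold falls in the empty valley between the two modes.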

  7. Maximum entropy deconvolution of the optical jet of 3C 273

    NASA Technical Reports Server (NTRS)

    Evans, I. N.; Ford, H. C.; Hui, X.

    1989-01-01

    The technique of maximum entropy image restoration is applied to the problem of deconvolving the point spread function from a deep, high-quality V band image of the optical jet of 3C 273. The resulting maximum entropy image has an approximate spatial resolution of 0.6 arcsec and has been used to study the morphology of the optical jet. Four regularly-spaced optical knots are clearly evident in the data, together with an optical 'extension' at each end of the optical jet. The jet oscillates around its center of gravity, and the spatial scale of the oscillations is very similar to the spacing between the optical knots. The jet is marginally resolved in the transverse direction and has an asymmetric profile perpendicular to the jet axis. The distribution of V band flux along the length of the jet, and accurate astrometry of the optical knot positions are presented.

  8. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics but also of how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for the construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics (including luminance distributions, pairwise correlations, and higher-order correlations) are explicitly specified and all other statistics are determined implicitly by maximum entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397

  9. Quantum entropy and uncertainty for two-mode squeezed, coherent and intelligent spin states

    NASA Technical Reports Server (NTRS)

    Aragone, C.; Mundarain, D.

    1993-01-01

    We compute the quantum entropy for monomode and two-mode systems set in squeezed states. Thereafter, the quantum entropy is also calculated for angular momentum algebra when the system is either in a coherent or in an intelligent spin state. These values are compared with the corresponding values of the respective uncertainties. In general, quantum entropies and uncertainties have the same minimum and maximum points. However, for coherent and intelligent spin states, it is found that some minima for the quantum entropy turn out to be uncertainty maxima. We feel that the quantum entropy we use provides the right answer, since it is given in an essentially unique way.

  10. Entropy and climate. I - ERBE observations of the entropy production of the earth

    NASA Technical Reports Server (NTRS)

    Stephens, G. L.; O'Brien, D. M.

    1993-01-01

    An approximate method for estimating the global distributions of the entropy fluxes flowing through the upper boundary of the climate system is introduced, and an estimate of the entropy exchange between the earth and space and the entropy production of the planet is provided. Entropy fluxes calculated from the Earth Radiation Budget Experiment measurements show how the long-wave entropy flux densities dominate the total entropy fluxes at all latitudes compared with the entropy flux densities associated with reflected sunlight, although the short-wave flux densities are important in the context of clear-sky versus cloudy-sky net entropy flux differences. It is suggested that the entropy production of the planet is both constant for the 36 months of data considered and very near its maximum possible value. The mean value of this production is 0.68 × 10^15 W/K, and the amplitude of the annual cycle is approximately 1 to 2 percent of this value.
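
The order of magnitude quoted above can be checked with a back-of-envelope blackbody estimate. The numbers below (absorbed solar flux, effective emission and solar temperatures, Earth surface area) are rough textbook values assumed for illustration, not ERBE data:

```python
def planetary_entropy_production(F_abs=240.0,   # absorbed solar flux, W/m^2
                                 T_earth=255.0, # effective emission temp, K
                                 T_sun=5760.0,  # effective solar temp, K
                                 area=5.1e14):  # Earth surface area, m^2
    # entropy production = entropy exported by long-wave emission minus
    # entropy imported with sunlight; 4/3 is the blackbody radiation factor
    total_power = F_abs * area
    return (4.0 / 3.0) * total_power * (1.0 / T_earth - 1.0 / T_sun)

sigma = planetary_entropy_production()
```

The estimate lands near 0.6 × 10^15 W/K, the same order as the 0.68 × 10^15 W/K ERBE-based value in the abstract.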

  11. Biological evolution of replicator systems: towards a quantitative approach.

    PubMed

    Martin, Osmel; Horvath, J E

    2013-04-01

    The aim of this work is to study, in a simple replicator chemical model, the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do so, the scenario described originally by Pross (J Phys Org Chem 17:312-316, 2004) is revisited and new criteria to define the kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, assumed here to be the only evolutionary force acting. We demonstrate that the process is driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles such as the Maximum Metabolic Flux (MMF) or the Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.

  12. Biological Evolution of Replicator Systems: Towards a Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Martin, Osmel; Horvath, J. E.

    2013-04-01

    The aim of this work is to study, in a simple replicator chemical model, the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do so, the scenario described originally by Pross (J Phys Org Chem 17:312-316, 2004) is revisited and new criteria to define the kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, assumed here to be the only evolutionary force acting. We demonstrate that the process is driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles such as the Maximum Metabolic Flux (MMF) or the Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.

  13. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the important geotechnical parameter compression modulus Es contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and sources of uncertainty. Single CPT samplings were modeled as rational probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated by the Bayesian reverse interpolation framework. The results were compared between the Gaussian Sequential Stochastic Simulation and Bayesian methods. The differences between single CPT samplings assumed normally distributed and simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.

  14. An Improved Evidential-IOWA Sensor Data Fusion Approach in Fault Diagnosis

    PubMed Central

    Zhou, Deyun; Zhuang, Miaoyan; Fang, Xueyi; Xie, Chunhe

    2017-01-01

    As an important tool of information fusion, Dempster–Shafer evidence theory is widely applied in handling uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to failure in locating the fault. To deal with this problem, an improved evidential-Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed in the frame of Dempster–Shafer evidence theory. In the new method, the IOWA operator is used to determine the weights of the different sensor data sources; in determining the parameters of the IOWA, both the distance of evidence and the belief entropy are taken into consideration. First, based on the global distance of evidence and the global belief entropy, the α value of the IOWA is obtained. Simultaneously, a weight vector is given based on the maximum entropy model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method performs better in conflict management and fault diagnosis because the information volume of each piece of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method. PMID:28927017
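
Dempster's combination rule, around which the method is built, can be sketched for two mass functions over frozenset focal elements. The numbers are a textbook-style example, unrelated to the paper's case study:

```python
def dempster_combine(m1, m2):
    # m1, m2: dicts mapping frozenset focal elements to mass values
    combined = {}
    conflict = 0.0
    for A, ma in m1.items():
        for B, mb in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    k = 1.0 - conflict
    # normalize by (1 - conflict): Dempster's rule redistributes conflict
    return {A: v / k for A, v in combined.items()}, conflict

m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
m2 = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
fused, conflict = dempster_combine(m1, m2)
```

When the conflict term approaches 1, the normalization blows up; that is the failure mode the evidential-IOWA weighting is designed to mitigate.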

  15. A Bayesian maximum entropy-based methodology for optimal spatiotemporal design of groundwater monitoring networks.

    PubMed

    Hosseini, Marjan; Kerachian, Reza

    2017-09-01

    This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern, and a new method is proposed to assign a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is also applied to account for uncertainty caused by lack of information. In this approach, different time lag values are tested against another source of information, namely the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the existing uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data, improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid of side length 3600 m is proposed, with a time lag between samples of 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing network of 52 stations with monthly sampling frequency.

  16. Multivariate quadrature for representing cloud condensation nuclei activity of aerosol populations

    DOE PAGES

    Fierce, Laura; McGraw, Robert L.

    2017-07-26

    Sparse representations of atmospheric aerosols are needed for efficient regional- and global-scale chemical transport models. Here we introduce a new framework for representing aerosol distributions, based on the quadrature method of moments. Given a set of moment constraints, we show how linear programming, combined with an entropy-inspired cost function, can be used to construct optimized quadrature representations of aerosol distributions. The sparse representations derived from this approach accurately reproduce cloud condensation nuclei (CCN) activity for realistically complex distributions simulated by a particle-resolved model. Additionally, the linear programming techniques described in this study can be used to bound key aerosol properties, such as the number concentration of CCN. Unlike commonly used sparse representations, such as modal and sectional schemes, the maximum-entropy approach described here is not constrained to pre-determined size bins or assumed distribution shapes. This study is a first step toward a particle-based aerosol scheme that will track multivariate aerosol distributions with sufficient computational efficiency for large-scale simulations.
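
The quadrature-method-of-moments idea underlying this framework can be sketched in its simplest closed form: a two-node quadrature that exactly reproduces the first four raw moments of a univariate distribution. The paper's linear-programming, entropy-guided construction is more general; this is only the textbook special case:

```python
def two_node_quadrature(m0, m1, m2, m3):
    # Return two abscissas and weights matching raw moments m0..m3 exactly.
    mu = m1 / m0
    c2 = m2 / m0 - mu * mu                                  # central variance
    c3 = m3 / m0 - 3.0 * mu * (m2 / m0) + 2.0 * mu ** 3    # central 3rd moment
    d = c3 / (2.0 * c2)
    s = (d * d + c2) ** 0.5
    xs = [mu + d + s, mu + d - s]
    ws = [m0 * (1.0 - d / s) / 2.0, m0 * (1.0 + d / s) / 2.0]
    return xs, ws

# raw moments of a small synthetic "particle population" (illustrative data)
data = [1.0, 2.0, 2.0, 3.0, 7.0]
moments = [sum(x ** k for x in data) for k in range(4)]
nodes, weights = two_node_quadrature(*moments)
```

The two weighted nodes stand in for the whole population wherever only low-order moments matter, which is the compression the abstract exploits.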

  17. Multivariate quadrature for representing cloud condensation nuclei activity of aerosol populations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fierce, Laura; McGraw, Robert L.

    Sparse representations of atmospheric aerosols are needed for efficient regional- and global-scale chemical transport models. Here we introduce a new framework for representing aerosol distributions, based on the quadrature method of moments. Given a set of moment constraints, we show how linear programming, combined with an entropy-inspired cost function, can be used to construct optimized quadrature representations of aerosol distributions. The sparse representations derived from this approach accurately reproduce cloud condensation nuclei (CCN) activity for realistically complex distributions simulated by a particle-resolved model. Additionally, the linear programming techniques described in this study can be used to bound key aerosol properties, such as the number concentration of CCN. Unlike commonly used sparse representations, such as modal and sectional schemes, the maximum-entropy approach described here is not constrained to pre-determined size bins or assumed distribution shapes. This study is a first step toward a particle-based aerosol scheme that will track multivariate aerosol distributions with sufficient computational efficiency for large-scale simulations.

  18. Competition between Homophily and Information Entropy Maximization in Social Networks

    PubMed Central

    Zhao, Jichang; Liang, Xiao; Xu, Ke

    2015-01-01

    In social networks, it is conventionally thought that two individuals with more overlapping friends tend to establish a new friendship, which could be stated as homophily breeding new connections. Meanwhile, the recent hypothesis of maximum information entropy has been presented as a possible origin of effective navigation in small-world networks. Through both theoretical and experimental analysis, we find that there exists a competition between information entropy maximization and homophily in local structure. This competition suggests that a newly built relationship between two individuals with more common friends would lead to less information entropy gain for them. We demonstrate that both assumptions coexist in the evolution of the social network. The rule of maximum information entropy produces weak ties in the network, while the law of homophily makes the network highly clustered locally, so that individuals obtain strong and trusted ties. A toy model is also presented to demonstrate the competition and evaluate the roles of the different rules in the evolution of real networks. Our findings could shed light on social network modeling from a new perspective. PMID:26334994

  19. Nonequilibrium thermodynamics and maximum entropy production in the Earth system: applications and implications.

    PubMed

    Kleidon, Axel

    2009-06-01

    The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.

  20. Beyond maximum entropy: Fractal pixon-based image reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, R. C.; Pina, R. K.

    1994-01-01

    We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other methods, including Goodness-of-Fit (e.g. Least-Squares and Lucy-Richardson) and Maximum Entropy (ME). Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME.

  1. Applications of the principle of maximum entropy: from physics to ecology.

    PubMed

    Banavar, Jayanth R; Maritan, Amos; Volkov, Igor

    2010-02-17

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.
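
The constrained maximization referred to above reduces, via a Lagrange multiplier, to an exponential-family distribution whose multiplier is fixed by the constraint. A minimal sketch for a distribution over a finite set of states with a prescribed mean (an illustrative constraint, not the ecological ensemble of the paper):

```python
import math

def maxent_mean(values, target_mean, lo=-50.0, hi=50.0, iters=100):
    # Maximum entropy distribution over `values` with a fixed mean has the
    # Boltzmann form p_i ~ exp(-lam * x_i); solve for lam by bisection.
    def mean(lam):
        w = [math.exp(-lam * v) for v in values]
        Z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / Z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:   # mean(lam) is decreasing in lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * v) for v in values]
    Z = sum(w)
    return [wi / Z for wi in w]

p = maxent_mean(list(range(6)), 2.0)
```

A target mean below the uniform value 2.5 yields a positive multiplier, tilting probability toward the low states while keeping every state populated.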

  2. New approach in the quantum statistical parton distribution

    NASA Astrophysics Data System (ADS)

    Sohaily, Sozha; Vaziri (Khamedi), Mohammad

    2017-12-01

    An attempt to find simple parton distribution functions (PDFs) based on a quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help in understanding the structure of partons. The longitudinal portion of the distribution functions is given by applying the maximum entropy principle. An interesting and simple approach to determining the statistical variables exactly, without fitting and fixing parameters, is surveyed. Analytic expressions of the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with the experimental observations. The agreement with experimental data gives a robust confirmation of our simple statistical model.

  3. Clauser-Horne-Shimony-Holt violation and the entropy-concurrence plane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derkacz, Lukasz; Jakobczyk, Lech

    2005-10-15

    We characterize the violation of Clauser-Horne-Shimony-Holt (CHSH) inequalities for mixed two-qubit states by their mixedness and entanglement. The class of states that have the maximum degree of CHSH violation for a given linear entropy is also constructed.
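
For the one-parameter Werner family these quantities have closed forms, which makes the mixedness-versus-violation trade-off easy to tabulate. This standard textbook family is used here only to illustrate the Horodecki criterion, not the extremal states constructed in the paper:

```python
import math

def werner_chsh_max(p):
    # Horodecki criterion: B_max = 2*sqrt(M1 + M2), where M1, M2 are the two
    # largest eigenvalues of T^T T. For the Werner state
    # rho = p |psi-><psi-| + (1 - p) I/4 the correlation matrix is -p*I,
    # so both eigenvalues equal p^2.
    return 2.0 * math.sqrt(2.0 * p * p)

def werner_linear_entropy(p):
    # S_L = (4/3)(1 - Tr rho^2), with Tr rho^2 = (1 + 3 p^2)/4 for Werner states
    return 1.0 - p * p
```

CHSH is violated (B_max > 2) only for p > 1/sqrt(2), i.e. only when the linear entropy drops below 1/2, illustrating how mixedness suppresses violation.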

  4. Maximum entropy, fluctuations and priors

    NASA Astrophysics Data System (ADS)

    Caticha, A.

    2001-05-01

    The method of maximum entropy (ME) is extended to address the following problem: once one accepts that the ME distribution is to be preferred over all others, the question is to what extent distributions with lower entropy are supposed to be ruled out. Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coordinates, and allows fluctuations of both the extensive and the conjugate intensive variables. The second application is to the construction of an objective prior for Bayesian inference. The prior obtained by following the ME method to its inevitable conclusion turns out to be a special case (α=1) of what are currently known under the name of entropic priors.

  5. REMARKS ON THE MAXIMUM ENTROPY METHOD APPLIED TO FINITE TEMPERATURE LATTICE QCD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    UMEDA, T.; MATSUFURU, H.

    2005-07-25

    We make remarks on the Maximum Entropy Method (MEM) for studies of the spectral functions of hadronic correlators in finite temperature lattice QCD. We discuss the virtues and subtleties of MEM in cases where one does not have a sufficient number of data points, such as at finite temperature. Taking these points into account, we suggest several tests which one should examine to ensure the reliability of the results, and also apply them using mock and lattice QCD data.

  6. Tendency towards maximum complexity in a nonequilibrium isolated system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calbet, Xavier; Lopez-Ruiz, Ricardo

    2001-06-01

The time evolution equations of a simplified isolated ideal gas, the "tetrahedral" gas, are derived. The dynamical behavior of the Lopez-Ruiz–Mancini–Calbet complexity [R. Lopez-Ruiz, H. L. Mancini, and X. Calbet, Phys. Lett. A 209, 321 (1995)] is studied in this system. In general, it is shown that the complexity remains within the bounds of minimum and maximum complexity. We find that there are certain restrictions when the isolated "tetrahedral" gas evolves towards equilibrium. In addition to the well-known increase in entropy, the quantity called disequilibrium decreases monotonically with time. Furthermore, the trajectories of the system in phase space approach the maximum complexity path as it evolves toward equilibrium.

  7. Characterizing Protease Specificity: How Many Substrates Do We Need?

    PubMed Central

    Schauperl, Michael; Fuchs, Julian E.; Waldner, Birgit J.; Huber, Roland G.; Kramer, Christian; Liedl, Klaus R.

    2015-01-01

Calculation of cleavage entropies allows one to quantify, map and compare protease substrate specificity by an information-entropy-based approach. The metric intrinsically depends on the number of experimentally determined substrates (data points). Thus, a statistical analysis of its numerical stability is crucial to estimate the systematic error made by estimating specificity based on a limited number of substrates. In this contribution, we show the mathematical basis for estimating the uncertainty in cleavage entropies. Sets of cleavage entropies are calculated using experimental cleavage data and modeled extreme cases. By analyzing the underlying mathematics and applying statistical tools, a linear dependence of the metric with respect to 1/n was found. This allows us to extrapolate the values to an infinite number of samples and to estimate the errors. Analyzing the errors, a minimum number of 30 substrates was found to be necessary to characterize substrate specificity, in terms of amino acid variability, for a protease (S4-S4’) with an uncertainty of 5 percent. Therefore, we encourage experimental researchers in the protease field to record specificity profiles of novel proteases, aiming to identify at least 30 peptide substrates of maximum sequence diversity. We expect a full characterization of protease specificity to be helpful in rationalizing biological functions of proteases and in assisting rational drug design. PMID:26559682
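The 1/n dependence underlying this analysis can be illustrated with a toy simulation; everything below (the biased position preference `p_true`, the sample sizes) is hypothetical, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
AA = 20  # amino-acid alphabet size

def cleavage_entropy(counts):
    """Shannon entropy (bits) of one substrate position from counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Hypothetical position preference: one strongly preferred residue
p_true = np.full(AA, 0.2 / (AA - 1)); p_true[0] = 0.8

ns = [10, 20, 40, 80, 160, 320]
est = []
for n in ns:
    draws = rng.choice(AA, size=n, p=p_true)
    est.append(cleavage_entropy(np.bincount(draws, minlength=AA)))

# Finite-sample entropy estimates depend roughly linearly on 1/n
# (cf. the Miller-Madow bias term): extrapolate to 1/n -> 0.
slope, intercept = np.polyfit(1.0 / np.array(ns), est, 1)
true_H = -(p_true * np.log2(p_true)).sum()
print(intercept, true_H)
```

The intercept of the linear fit in 1/n approximates the infinite-sample entropy, which is the extrapolation idea the abstract describes.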

  8. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel

We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  9. Direct 4D reconstruction of parametric images incorporating anato-functional joint entropy.

    PubMed

    Tang, Jing; Kuwabara, Hiroto; Wong, Dean F; Rahmim, Arman

    2010-08-07

We developed an anatomy-guided 4D closed-form algorithm to directly reconstruct parametric images from projection data for (nearly) irreversible tracers. Conventional methods consist of individually reconstructing 2D/3D PET data, followed by graphical analysis on the sequence of reconstructed image frames. The proposed direct reconstruction approach maintains the simplicity and accuracy of the expectation-maximization (EM) algorithm by extending the system matrix to include the relation between the parametric images and the measured data. A closed-form solution was achieved using a different hidden complete-data formulation within the EM framework. Furthermore, the proposed method was extended to maximum a posteriori reconstruction via incorporation of MR image information, taking the joint entropy between MR and parametric PET features as the prior. Using realistic simulated noisy [(11)C]-naltrindole PET and MR brain images/data, the quantitative performance of the proposed methods was investigated. Significant improvements in terms of noise versus bias performance were demonstrated when performing direct parametric reconstruction, and additionally upon extending the algorithm to its Bayesian counterpart using the MR-PET joint entropy measure.
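The closed-form parametric algorithm builds on the standard EM (MLEM) update for emission tomography. As background only, here is a minimal sketch of plain MLEM on a tiny synthetic system (not the authors' 4D algorithm; the 3×3 system matrix is invented for illustration):

```python
import numpy as np

def mlem(A, y, n_iter=5000):
    """Standard MLEM update for Poisson emission data:
    x <- x / (A^T 1) * A^T (y / (A x)).
    A: system matrix (detectors x voxels), y: measured counts."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)          # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12   # guard against division by zero
        x *= (A.T @ (y / proj)) / sens
    return x

# Tiny noiseless example: MLEM converges toward a solution of A x = y
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0],
              [0.5, 1.0, 0.5]])
x_true = np.array([2.0, 1.0, 3.0])
y = A @ x_true
x_hat = mlem(A, y)
print(x_hat)
```

The multiplicative form keeps the estimate nonnegative at every iteration, which is the property the direct parametric extension in the abstract preserves.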

  10. It is not the entropy you produce, rather, how you produce it

    PubMed Central

    Volk, Tyler; Pauluis, Olivier

    2010-01-01

The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise the entropy production rate. Our analysis makes explicit fundamental questions for MEP that should be brought into focus: can MEP predict not just the overall state of entropy production of a system but also the details of the sub-systems of dissipaters within the system? Which fluxes of the system are those that are most likely to be maximized? How is it possible for MEP theory to be so domain-neutral that it can claim to apply equally to both purely physical–chemical systems and also systems governed by the ‘laws’ of biological evolution? We conclude that the principle of MEP needs to take on the issue of exactly how entropy is produced. PMID:20368249

  11. Reconstruction of calmodulin single-molecule FRET states, dye interactions, and CaMKII peptide binding by MultiNest and classic maximum entropy

    NASA Astrophysics Data System (ADS)

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2013-08-01

    We analyzed single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.

  12. Reconstruction of Calmodulin Single-Molecule FRET States, Dye-Interactions, and CaMKII Peptide Binding by MultiNest and Classic Maximum Entropy

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2013-01-01

    We analyze single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data. PMID:24223465

  13. Perspective: Maximum caliber is a general variational principle for dynamical systems

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A.

    2018-01-01

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics—such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production—are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.
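The path-entropy maximization can be illustrated on a toy model: enumerate all binary trajectories of a short length, impose one dynamical constraint (a target mean number of state flips, a hypothetical observable chosen for illustration), and tilt the uniform path measure exponentially, as Max Cal prescribes:

```python
import numpy as np
from itertools import product
from scipy.optimize import brentq

L = 6  # trajectory length (state 0 or 1 at each step); toy model

paths = np.array(list(product([0, 1], repeat=L)))
flips = np.abs(np.diff(paths, axis=1)).sum(axis=1)  # dynamical observable

def mean_flips(lam):
    # Max Cal solution: path weights exponential in the constrained observable
    w = np.exp(-lam * flips)
    return (w / w.sum() * flips).sum()

target = 1.5  # desired average number of flips per trajectory
lam = brentq(lambda l: mean_flips(l) - target, -10, 10)
w = np.exp(-lam * flips); p = w / w.sum()
print(lam, (p * flips).sum())
```

The Lagrange multiplier `lam` plays the role a temperature-like parameter plays in equilibrium MaxEnt; at `lam = 0` the path measure is uniform and the mean flip count is (L−1)/2.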

  14. Reconstruction of Calmodulin Single-Molecule FRET States, Dye-Interactions, and CaMKII Peptide Binding by MultiNest and Classic Maximum Entropy.

    PubMed

    Devore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2013-08-30

    We analyze single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca 2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.

  15. Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung

    2015-04-01

In environmental and other scientific applications, we must have a certain understanding of the lithological composition of the subsurface. Because of practical constraints, only a limited amount of data can be acquired. To determine the lithological distribution in a study area, many spatial statistical methods are used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging method in the field of geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data, and can combine not only hard data but also soft data to improve estimation. Lithological classification data are discrete categorical data; therefore, this research applied categorical BME to establish a complete three-dimensional lithological estimation model. We applied the limited hard data from cores, together with soft data generated from geological dating data and virtual wells, to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting

  16. Perspective: Maximum caliber is a general variational principle for dynamical systems.

    PubMed

    Dixit, Purushottam D; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A

    2018-01-07

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics-such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production-are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.

  17. Information and Entropy

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2007-11-01

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.

  18. Development and application of the maximum entropy method and other spectral estimation techniques

    NASA Astrophysics Data System (ADS)

    King, W. R.

    1980-09-01

This summary report is a collection of four separate progress reports prepared under three contracts, all sponsored by the Office of Naval Research in Arlington, Virginia. The report contains the results of investigations into the application of the maximum entropy method (MEM), a high-resolution frequency and wavenumber estimation technique, together with a description, provided in the final section, of two new, stable, high-resolution spectral estimation techniques. Many examples of wavenumber spectral patterns for all investigated techniques are included throughout the report. The maximum entropy method is also known as the maximum entropy spectral analysis (MESA) technique, and both names are used in the report. Many MEM wavenumber spectral patterns are demonstrated using both simulated and measured radar signal and noise data. Methods for obtaining stable MEM wavenumber spectra are discussed, broadband signal detection using the MEM prediction error transform (PET) is discussed, and Doppler radar narrowband signal detection is demonstrated using the MEM technique. It is also shown that MEM cannot be applied to randomly sampled data. The two new, stable, high-resolution spectral estimation techniques discussed in the final section are named the Wiener-King and the Fourier spectral estimation techniques. The two techniques have a similar derivation based upon the Wiener prediction filter, but are otherwise quite different. Further development of the techniques and measurement of their spectral characteristics are recommended for subsequent investigation.

  19. Molecular extended thermodynamics of rarefied polyatomic gases and wave velocities for increasing number of moments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arima, Takashi, E-mail: tks@stat.nitech.ac.jp; Mentrelli, Andrea, E-mail: andrea.mentrelli@unibo.it; Ruggeri, Tommaso, E-mail: tommaso.ruggeri@unibo.it

Molecular extended thermodynamics of rarefied polyatomic gases is characterized by two hierarchies of equations for moments of a suitable distribution function in which the internal degrees of freedom of a molecule are taken into account. On the basis of physical relevance, the truncation orders of the two hierarchies are proven to be not independent of each other, and the closure procedures based on the maximum entropy principle (MEP) and on the entropy principle (EP) are proven to be equivalent. The characteristic velocities of the emerging hyperbolic system of differential equations are compared to those obtained for monatomic gases, and the lower bound estimate for the maximum equilibrium characteristic velocity established for monatomic gases (characterized by only one hierarchy of moments with truncation order N) by Boillat and Ruggeri (1997), λ_(N)^{E,max}/c_0 ≥ √((6/5)(N − 1/2)), with c_0 = √((5/3)(k/m)T), is proven to hold also for rarefied polyatomic gases, independently of the degrees of freedom of a molecule. Highlights: • Molecular extended thermodynamics of rarefied polyatomic gases is studied. • The relation between two hierarchies of equations for moments is derived. • The equivalence of maximum entropy principle and entropy principle is proven. • The characteristic velocities are compared to those of monatomic gases. • The lower bound of the maximum characteristic velocity is estimated.

  20. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation function in such a way that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different prior estimates, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density, depending on the autocorrelation structure. Seasonal and nonseasonal low-streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low-streamflow series with higher resolution than the conventional method. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
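For reference, Burg's MESA, the benchmark method named in this entry, fits an autoregressive model by minimizing the combined forward and backward prediction error; a compact sketch (illustrative only, not the paper's minimum-relative-entropy method) applied to a noiseless sinusoid, which satisfies an exact AR(2) recursion:

```python
import numpy as np

def burg_ar(x, order):
    """Burg's maximum-entropy (MESA) AR estimate.
    Returns coefficients a with model x[n] ~ -sum_i a[i] * x[n-1-i]."""
    f = np.asarray(x, float).copy()   # forward prediction errors
    b = f.copy()                      # backward prediction errors
    a = np.zeros(0)
    for _ in range(order):
        fa, ba = f[1:], b[:-1]
        k = -2.0 * fa.dot(ba) / (fa.dot(fa) + ba.dot(ba))  # reflection coeff
        a = np.concatenate([a + k * a[::-1], [k]])          # Levinson update
        f, b = fa + k * ba, ba + k * fa
    return a

# A clean sinusoid obeys x[n] = 2*cos(w)*x[n-1] - x[n-2] exactly,
# so a 2-pole Burg fit forecasts the next sample almost exactly.
n = np.arange(100)
x = np.sin(0.3 * n)
a = burg_ar(x, 2)
forecast = -(a[0] * x[-1] + a[1] * x[-2])
print(forecast, np.sin(0.3 * 100))
```

The one-step forecast is just the negated AR recursion applied to the last samples; for noisy streamflow series one would choose the order by an information criterion rather than fixing it at 2.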

  1. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…

  2. Energy conservation and maximal entropy production in enzyme reactions.

    PubMed

    Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš

    2017-08-01

A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, a fixed equilibrium constant of the reaction and fixed products of forward and backward enzyme rate constants, the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production, the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield as well as the stationary reaction flux are calculated. A test of whether these calculated values of the reaction parameters are consistent with their corresponding measured values is performed for the enzyme Glucose Isomerase. It is found that calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and the product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme Glucose Isomerase, considered in a non-equilibrium stationary state, as found in experiments using continuous stirred tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Shock heating of the solar wind plasma

    NASA Technical Reports Server (NTRS)

    Whang, Y. C.; Liu, Shaoliang; Burlaga, L. F.

    1990-01-01

    The role played by shocks in heating solar-wind plasma is investigated using data on 413 shocks which were identified from the plasma and magnetic-field data collected between 1973 and 1982 by Pioneer and Voyager spacecraft. It is found that the average shock strength increased with the heliocentric distance outside 1 AU, reaching a maximum near 5 AU, after which the shock strength decreased with the distance; the entropy of the solar wind protons also reached a maximum at 5 AU. An MHD simulation model in which shock heating is the only heating mechanism available was used to calculate the entropy changes for the November 1977 event. The calculated entropy agreed well with the value calculated from observational data, suggesting that shocks are chiefly responsible for heating solar wind plasma between 1 and 15 AU.

  4. Entropy maximization under the constraints on the generalized Gini index and its application in modeling income distributions

    NASA Astrophysics Data System (ADS)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2015-11-01

In economics and the social sciences, inequality measures such as the Gini index and the Pietra index are commonly used to measure statistical dispersion. There is a generalization of the Gini index which includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income. The most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to US family total money income in 2009, 2011 and 2013, and their relative performances with respect to the generalized beta of the second kind family are compared.
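The simplest instance of this program, with only a mean constraint (the generalized Gini constraint of the paper is omitted here for brevity, and the discrete income support is hypothetical), can be sketched directly: MaxEnt under a mean constraint yields an exponential family whose multiplier is found by root finding.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical discrete income support and target mean
x = np.linspace(1.0, 100.0, 200)
target_mean = 30.0

# MaxEnt with a mean constraint gives p_i proportional to exp(-lam * x_i);
# the paper's Gini constraint would add a second Lagrange multiplier.
def mean_of(lam):
    w = np.exp(-lam * (x - x.min()))  # shift exponent for numerical stability
    p = w / w.sum()
    return (p * x).sum()

# mean_of is monotone decreasing in lam, so bracket and solve
lam = brentq(lambda l: mean_of(l) - target_mean, -1.0, 1.0)
p = np.exp(-lam * (x - x.min())); p /= p.sum()
print(lam, (p * x).sum())
```

Among all distributions on this support with the given mean, `p` has maximal Shannon entropy; adding the generalized Gini constraint deforms this exponential form toward the income-distribution shapes studied in the paper.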

  5. A Maximum Entropy Method for Particle Filtering

    NASA Astrophysics Data System (ADS)

    Eyink, Gregory L.; Kim, Sangil

    2006-06-01

Standard ensemble or particle filtering schemes do not properly represent states of low prior probability when the number of available samples is too small, as is often the case in practical applications. We introduce here a set of parametric resampling methods to solve this problem. Motivated by a general H-theorem for relative entropy, we construct parametric models for the filter distributions as maximum-entropy/minimum-information models consistent with moments of the particle ensemble. When the prior distributions are modeled as mixtures of Gaussians, our method naturally generalizes the ensemble Kalman filter to systems with highly non-Gaussian statistics. We apply the new particle filters presented here to two simple test cases: a one-dimensional diffusion process in a double-well potential and the three-dimensional chaotic dynamical system of Lorenz.
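The moment-matching idea can be sketched in its simplest form: replace a weight-degenerate ensemble by draws from the Gaussian, which is the maximum-entropy density for a fixed mean and covariance (the paper's mixture-of-Gaussians construction generalizes this; the toy ensemble below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

def maxent_gaussian_resample(particles, weights, n_out):
    """Moment-matched resampling: draw from the Gaussian (the maximum-
    entropy density for a given mean and covariance) whose first two
    moments match the weighted particle ensemble."""
    mu = np.average(particles, axis=0, weights=weights)
    d = particles - mu
    cov = (weights[:, None] * d).T @ d / weights.sum()
    return rng.multivariate_normal(mu, cov, size=n_out)

# A weighted 2-D ensemble with few effective particles
pts = rng.normal(size=(50, 2)) @ np.array([[1.0, 0.3], [0.0, 0.5]])
w = rng.random(50) ** 4          # severely uneven weights
w /= w.sum()
new = maxent_gaussian_resample(pts, w, n_out=5000)
print(new.mean(axis=0))
```

Resampling from the moment-matched Gaussian repopulates low-probability regions that a naive multinomial resampler would leave empty, at the cost of imposing the Gaussian (maximum-entropy) shape.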

  6. Maximum entropy and equations of state for random cellular structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivier, N.

Random, space-filling cellular structures (biological tissues, metallurgical grain aggregates, foams, etc.) are investigated. Maximum entropy inference under a few constraints yields structural equations of state, relating the size of cells to their topological shape. These relations are known empirically as Lewis's law in botany, or Desch's relation in metallurgy. Here, the functional form of the constraints is not known a priori, and one takes advantage of this arbitrariness to increase the entropy further. The resulting structural equations of state are independent of priors; they are measurable experimentally and therefore constitute a direct test for the applicability of MaxEnt inference (given that the structure is in statistical equilibrium, a fact which can be tested by another simple relation (Aboav's law)). 23 refs., 2 figs., 1 tab.

  7. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S.

    1989-05-15

Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value.

  8. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations

    NASA Astrophysics Data System (ADS)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-01

Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  9. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    PubMed

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  10. Exact Maximum-Entropy Estimation with Feynman Diagrams

    NASA Astrophysics Data System (ADS)

    Netser Zernik, Amitai; Schlank, Tomer M.; Tessler, Ran J.

    2018-02-01

    A longstanding open problem in statistics is finding an explicit expression for the probability measure which maximizes entropy with respect to given constraints. In this paper a solution to this problem is found, using perturbative Feynman calculus. The explicit expression is given as a sum over weighted trees.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine A.

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.

  12. Nonequilibrium Entropy in a Shock

    DOE PAGES

    Margolin, Len G.

    2017-07-19

    In a classic paper, Morduchow and Libby use an analytic solution for the profile of a Navier–Stokes shock to show that the equilibrium thermodynamic entropy has a maximum inside the shock. There is no general nonequilibrium thermodynamic formulation of entropy; the extension of equilibrium theory to nonequilibrium processes is usually made through the assumption of local thermodynamic equilibrium (LTE). However, gas kinetic theory provides a perfectly general formulation of a nonequilibrium entropy in terms of the probability distribution function (PDF) solutions of the Boltzmann equation. In this paper I will evaluate the Boltzmann entropy for the PDF that underlies the Navier–Stokes equations and also for the PDF of the Mott–Smith shock solution. I will show that both monotonically increase in the shock. As a result, I will propose a new nonequilibrium thermodynamic entropy and show that it is also monotone and closely approximates the Boltzmann entropy.

  13. Nonequilibrium Entropy in a Shock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolin, Len G.

    In a classic paper, Morduchow and Libby use an analytic solution for the profile of a Navier–Stokes shock to show that the equilibrium thermodynamic entropy has a maximum inside the shock. There is no general nonequilibrium thermodynamic formulation of entropy; the extension of equilibrium theory to nonequilibrium processes is usually made through the assumption of local thermodynamic equilibrium (LTE). However, gas kinetic theory provides a perfectly general formulation of a nonequilibrium entropy in terms of the probability distribution function (PDF) solutions of the Boltzmann equation. In this paper I will evaluate the Boltzmann entropy for the PDF that underlies the Navier–Stokes equations and also for the PDF of the Mott–Smith shock solution. I will show that both monotonically increase in the shock. As a result, I will propose a new nonequilibrium thermodynamic entropy and show that it is also monotone and closely approximates the Boltzmann entropy.

  14. Efficient Bayesian experimental design for contaminant source identification

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Zeng, L.

    2013-12-01

    In this study, an efficient full Bayesian approach is developed for optimal sampling well location design and source parameter identification of groundwater contaminants. An information measure, the relative entropy, is employed to quantify the information gain from indirect concentration measurements in identifying unknown source parameters such as the release time, strength, and location. In this approach, the sampling location that gives the maximum relative entropy is selected as the optimal one. Once the sampling location is determined, a Bayesian approach based on Markov Chain Monte Carlo (MCMC) is used to estimate the unknown source parameters. In both the design and estimation, the contaminant transport equation must be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on an adaptive sparse grid is utilized to construct a surrogate for the contaminant transport. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. Compared with the traditional optimal design, which is based on the Gaussian linear assumption, the method developed in this study can cope with arbitrary nonlinearity. It can be used to assist in groundwater monitoring network design and identification of unknown contaminant sources. Figure captions: Contours of the expected information gain; the optimal observing location corresponds to the maximum value. Posterior marginal probability densities of unknown parameters; the thick solid black lines are for the designed location, the other 7 lines are for randomly chosen locations, and the true values are denoted by vertical lines. The unknown parameters are clearly estimated better with the designed location.
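
    The design rule above (pick the sampling location with maximum expected relative entropy between posterior and prior) has a closed form in the linear-Gaussian special case, which makes for a compact sanity check. A hedged sketch; the well names and noise levels are invented for illustration and are not from the study:

```python
import math

def expected_information_gain(sigma_prior, sigma_noise):
    """Linear-Gaussian special case: the expected KL divergence from prior to
    posterior reduces to 0.5 * ln(1 + sigma_prior^2 / sigma_noise^2)."""
    return 0.5 * math.log(1.0 + (sigma_prior / sigma_noise) ** 2)

# Hypothetical candidate wells with different measurement-noise levels:
candidate_noise = {"well_A": 2.0, "well_B": 0.5, "well_C": 1.0}
gains = {w: expected_information_gain(1.5, s) for w, s in candidate_noise.items()}
best = max(gains, key=gains.get)   # design rule: maximize relative entropy
```

    In the nonlinear setting of the abstract no such closed form exists, which is why the expected gain must be estimated by Monte Carlo over the (surrogate) transport model.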

  15. Respiration-Averaged CT for Attenuation Correction of PET Images – Impact on PET Texture Features in Non-Small Cell Lung Cancer Patients

    PubMed Central

    Cheng, Nai-Ming; Fang, Yu-Hua Dean; Tsan, Din-Li

    2016-01-01

    Purpose We compared attenuation correction of PET images with helical CT (PET/HCT) and respiration-averaged CT (PET/ACT) in patients with non-small-cell lung cancer (NSCLC) with the goal of investigating the impact of respiration-averaged CT on 18F FDG PET texture parameters. Materials and Methods A total of 56 patients were enrolled. Tumors were segmented on pretreatment PET images using the adaptive threshold. Twelve different texture parameters were computed: standard uptake value (SUV) entropy, uniformity, entropy, dissimilarity, homogeneity, coarseness, busyness, contrast, complexity, grey-level nonuniformity, zone-size nonuniformity, and high grey-level large zone emphasis. Comparisons of PET/HCT and PET/ACT were performed using Wilcoxon signed-rank tests, intraclass correlation coefficients, and Bland-Altman analysis. Receiver operating characteristic (ROC) curves as well as univariate and multivariate Cox regression analyses were used to identify the parameters significantly associated with disease-specific survival (DSS). A fixed threshold at 45% of the maximum SUV (T45) was used for validation. Results SUV maximum and total lesion glycolysis (TLG) were significantly higher in PET/ACT. However, texture parameters obtained with PET/ACT and PET/HCT showed a high degree of agreement. The lowest levels of variation between the two modalities were observed for SUV entropy (9.7%) and entropy (9.8%). SUV entropy, entropy, and coarseness from both PET/ACT and PET/HCT were significantly associated with DSS. Validation analyses using T45 confirmed the usefulness of SUV entropy and entropy in both PET/HCT and PET/ACT for the prediction of DSS, but only coarseness from PET/ACT achieved the statistical significance threshold. Conclusions Our results indicate that 1) texture parameters from PET/ACT are clinically useful in the prediction of survival in NSCLC patients and 2) SUV entropy and entropy are robust to attenuation correction methods. PMID:26930211

  16. Maximum entropy approach to H -theory: Statistical mechanics of hierarchical systems

    NASA Astrophysics Data System (ADS)

    Vasconcelos, Giovani L.; Salazar, Domingos S. P.; Macêdo, A. M. S.

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem—representing the region where the measurements are made—in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017), 10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.

  17. Maximum entropy approach to H-theory: Statistical mechanics of hierarchical systems.

    PubMed

    Vasconcelos, Giovani L; Salazar, Domingos S P; Macêdo, A M S

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem-representing the region where the measurements are made-in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017), 10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.
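
    The central construction above (averaging a Boltzmann factor over a fluctuating reservoir temperature) can be checked numerically in a stripped-down, one-level version: a single gamma-distributed inverse temperature rather than the paper's nested hierarchy. All parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta0, E = 2.0, 1.0, 1.5    # shape, mean inverse temperature, energy (illustrative)
betas = rng.gamma(n, beta0 / n, size=400_000)   # fluctuating inverse temperature

# Marginal Boltzmann factor: average exp(-beta * E) over the temperature pdf.
p_numeric = np.exp(-betas * E).mean()

# A gamma mixture of exponentials has the closed form (1 + beta0*E/n)^(-n),
# the power-law (q-exponential) tail familiar from superstatistics.
p_exact = (1.0 + beta0 * E / n) ** (-n)
```

    The full H-theory result replaces this single gamma average with an iterated average over the whole reservoir hierarchy, which is what produces the Fox H functions.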

  18. Spectral functions at small energies and the electrical conductivity in hot quenched lattice QCD.

    PubMed

    Aarts, Gert; Allton, Chris; Foley, Justin; Hands, Simon; Kim, Seyong

    2007-07-13

    In lattice QCD, the maximum entropy method can be used to reconstruct spectral functions from Euclidean correlators obtained in numerical simulations. We show that at finite temperature the most commonly used algorithm, employing Bryan's method, is inherently unstable at small energies, and we give a modification that avoids this. We demonstrate this approach using the vector current-current correlator obtained in quenched QCD at finite temperature. Our first results indicate a small electrical conductivity above the deconfinement transition.

  19. Using Maximum Entropy to Find Patterns in Genomes

    NASA Astrophysics Data System (ADS)

    Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes. National Institute of General Medical Science, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher Scholar Award.
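
    The core idea above can be sketched as an exponential-family (max-entropy) tilt: within each amino acid's synonymous codon set, codons are drawn with probability proportional to exp(lam * GC-count), and lam tunes the overall GC content while the amino acid sequence stays fixed. This is a hedged illustration, not the authors' implementation; the codon table is a toy subset of the genetic code:

```python
import math, random

# Toy synonymous-codon table (illustrative subset; the standard code has 61 sense codons).
CODONS = {
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "A": ["GCT", "GCC", "GCA", "GCG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
    "K": ["AAA", "AAG"],
}

def gc(codon):
    """Number of G or C bases in a codon."""
    return sum(base in "GC" for base in codon)

def sample_sequence(protein, lam, rng):
    """Draw one codon per amino acid from the max-entropy distribution
    tilted by GC count: P(codon) ∝ exp(lam * gc(codon)); lam = 0 is uniform."""
    dna = []
    for aa in protein:
        codons = CODONS[aa]
        weights = [math.exp(lam * gc(c)) for c in codons]
        dna.append(rng.choices(codons, weights=weights)[0])
    return "".join(dna)

rng = random.Random(0)
seq_hi = sample_sequence("LAGK" * 25, 3.0, rng)   # tilted towards GC-rich codons
seq_lo = sample_sequence("LAGK" * 25, -3.0, rng)  # tilted towards AT-rich codons
```

    In practice lam would be solved for (e.g. by bisection) so that the expected GC fraction matches the target, which is exactly the Lagrange-multiplier step of the maximum entropy principle.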

  20. Spatiotemporal modeling of PM2.5 concentrations at the national scale combining land use regression and Bayesian maximum entropy in China.

    PubMed

    Chen, Li; Gao, Shuang; Zhang, Hui; Sun, Yanling; Ma, Zhenxing; Vedal, Sverre; Mao, Jian; Bai, Zhipeng

    2018-05-03

    Concentrations of particulate matter with aerodynamic diameter <2.5 μm (PM2.5) are relatively high in China. Estimation of PM2.5 exposure is complex because PM2.5 exhibits complex spatiotemporal patterns. To improve the validity of exposure predictions, several methods have been developed and applied worldwide. A hybrid approach combining a land use regression (LUR) model with Bayesian Maximum Entropy (BME) interpolation of the LUR space-time residuals was developed to estimate PM2.5 concentrations on a national scale in China. This hybrid model could potentially provide more valid predictions than a commonly-used LUR model. The LUR/BME model had good performance characteristics, with R² = 0.82 and root mean square error (RMSE) of 4.6 μg/m³. Prediction errors of the LUR/BME model were reduced by incorporating soft data accounting for data uncertainty, with the R² increasing by 6%. The performance of LUR/BME is better than that of OK/BME. The LUR/BME model is the most accurate fine-spatial-scale PM2.5 model developed to date for China. Copyright © 2018. Published by Elsevier Ltd.

  1. Fast Maximum Entropy Moment Closure Approach to Solving the Boltzmann Equation

    NASA Astrophysics Data System (ADS)

    Summy, Dustin; Pullin, Dale

    2015-11-01

    We describe a method for a moment-based solution of the Boltzmann Equation (BE). This is applicable to an arbitrary set of velocity moments whose transport is governed by partial-differential equations (PDEs) derived from the BE. The equations are unclosed, containing both higher-order moments and molecular-collision terms. These are evaluated using a maximum-entropy reconstruction of the velocity distribution function f (c , x , t) , from the known moments, within a finite-box domain of single-particle velocity (c) space. Use of a finite domain alleviates known problems (Junk and Unterreiter, Continuum Mech. Thermodyn., 2002) concerning existence and uniqueness of the reconstruction. Unclosed moments are evaluated with quadrature while collision terms are calculated using any desired method. This allows integration of the moment PDEs in time. The high computational cost of the general method is greatly reduced by careful choice of the velocity moments, allowing the necessary integrals to be reduced from three- to one-dimensional in the case of strictly 1D flows. A method to extend this enhancement to fully 3D flows is discussed. Comparison with relaxation and shock-wave problems using the DSMC method will be presented. Partially supported by NSF grant DMS-1418903.
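
    A minimal version of the reconstruction step described above — recovering a maximum-entropy distribution on a finite velocity box from a few known moments by Newton iteration on the dual — might look like the following. The grid, target moments, and iteration count are illustrative; the paper's method handles larger moment sets and the collision terms besides:

```python
import numpy as np

def maxent_from_moments(c, targets, iters=100):
    """Reconstruct p(c) ∝ exp(lam1*c + lam2*c^2) on the finite grid c so that
    its first two raw moments match `targets`; Newton on the (concave) dual."""
    feats = np.vstack([c, c ** 2])          # feature functions (c, c^2)
    lam = np.zeros(2)
    for _ in range(iters):
        w = np.exp(lam @ feats)
        p = w / w.sum()
        m = feats @ p                       # current moments
        cov = (feats * p) @ feats.T - np.outer(m, m)   # Jacobian dm/dlam
        lam += np.linalg.solve(cov, targets - m)       # Newton step
    return p

c = np.linspace(-3.0, 3.0, 401)             # finite velocity-space box
p = maxent_from_moments(c, np.array([0.3, 0.8]))
```

    The finite box is what keeps the reconstruction well-posed here, echoing the Junk and Unterreiter issue cited in the abstract: on an unbounded domain, moment sets like these need not admit a maximum-entropy density.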

  2. Computational design of hepatitis C vaccines using maximum entropy models and population dynamics

    NASA Astrophysics Data System (ADS)

    Hart, Gregory; Ferguson, Andrew

    Hepatitis C virus (HCV) afflicts 170 million people and kills 350,000 annually. Vaccination offers the most realistic and cost-effective hope of controlling this epidemic. Despite 20 years of research, no vaccine is available. A major obstacle is the virus' extreme genetic variability and rapid mutational escape from immune pressure. Improvements in the vaccine design process are urgently needed. Coupling data mining with spin glass models and maximum entropy inference, we have developed a computational approach to translate sequence databases into empirical fitness landscapes. These landscapes explicitly connect viral genotype to phenotypic fitness and reveal vulnerable targets that can be exploited to rationally design immunogens. Viewing these landscapes as the mutational "playing field" over which the virus is constrained to evolve, we have integrated them with agent-based models of the viral mutational and host immune response dynamics, establishing a data-driven immune simulator of HCV infection. We have employed this simulator to perform in silico screening of HCV immunogens. By systematically identifying a small number of promising vaccine candidates, these models can accelerate the search for a vaccine by massively reducing the experimental search space.

  3. Information theory lateral density distribution for Earth inferred from global gravity field

    NASA Technical Reports Server (NTRS)

    Rubincam, D. P.

    1981-01-01

    Information Theory Inference, better known as the Maximum Entropy Method, was used to infer the lateral density distribution inside the Earth. The approach assumed that the Earth consists of indistinguishable Maxwell-Boltzmann particles populating infinitesimal volume elements, and followed the standard methods of statistical mechanics (maximizing the entropy function). The GEM 10B spherical harmonic gravity field coefficients, complete to degree and order 36, were used as constraints on the lateral density distribution. The spherically symmetric part of the density distribution was assumed to be known. The lateral density variation was assumed to be small compared to the spherically symmetric part. The resulting information theory density distribution for the cases of no crust removed, 30 km of compensated crust removed, and 30 km of uncompensated crust removed all gave broad density anomalies extending deep into the mantle, but with the density contrasts being the greatest towards the surface (typically ±0.004 g/cm³ in the first two cases and ±0.04 g/cm³ in the third). None of the density distributions resemble classical organized convection cells. The information theory approach may have use in choosing Standard Earth Models, but the inclusion of seismic data into the approach appears difficult.

  4. Giant onsite electronic entropy enhances the performance of ceria for water splitting

    DOE PAGES

    Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine A.; ...

    2017-08-18

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.

  5. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    NASA Astrophysics Data System (ADS)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease epidemic in Taiwan, especially in the southern area, which has high annual incidence. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process, and most studies have understated its composite space-time effects. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, including weekly minimum temperature and maximum 24-hour rainfall with a continuous 15-week lagged time to dengue case variation under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show the early warning system is useful for providing potential outbreak spatio-temporal predictions of dengue fever distribution. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.

  6. Giant onsite electronic entropy enhances the performance of ceria for water splitting.

    PubMed

    Naghavi, S Shahab; Emery, Antoine A; Hansen, Heine A; Zhou, Fei; Ozolins, Vidvuds; Wolverton, Chris

    2017-08-18

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions. Solid-state entropy of reduction increases the thermodynamic efficiency of ceria for two-step thermochemical water splitting. Here, the authors report a large and different source of entropy, the onsite electronic configurational entropy arising from coupling between orbital and spin angular momenta in f orbitals.

  7. The limit behavior of the evolution of the Tsallis entropy in self-gravitating systems

    NASA Astrophysics Data System (ADS)

    Zheng, Yahui; Du, Jiulin; Liang, Faku

    2017-06-01

    In this letter, we study the limit behavior of the evolution of the Tsallis entropy in self-gravitating systems. The study is carried out under two different situations, drawing the same conclusion. Whether in the energy transfer process or in the mass transfer process inside the system, when the nonextensive parameter q is greater than unity the total entropy is bounded; on the contrary, when this parameter is less than unity the total entropy is unbounded. There is evidence in both theory and observation that q is always greater than unity, so the Tsallis entropy in self-gravitating systems generally exhibits a bounded property. This indicates the existence of a global maximum of the Tsallis entropy. It is thus possible for self-gravitating systems to evolve to thermodynamically stable states.

  8. The More the Merrier? Entropy and Statistics of Asexual Reproduction in Freshwater Planarians

    NASA Astrophysics Data System (ADS)

    Quinodoz, Sofia; Thomas, Michael A.; Dunkel, Jörn; Schötz, Eva-Maria

    2011-04-01

    The trade-off between traits in life-history strategies has been widely studied for sexual and parthenogenetic organisms, but relatively little is known about the reproduction strategies of asexual animals. Here, we investigate clonal reproduction in the freshwater planarian Schmidtea mediterranea, an important model organism for regeneration and stem cell research. We find that these flatworms adopt a randomized reproduction strategy that comprises both asymmetric binary fission and fragmentation (generation of multiple offspring during a reproduction cycle). Fragmentation in planarians has primarily been regarded as an abnormal behavior in the past; using a large-scale experimental approach, we now show that about one third of the reproduction events in S. mediterranea are fragmentations, implying that fragmentation is part of their normal reproductive behavior. Our analysis further suggests that certain characteristic aspects of the reproduction statistics can be explained in terms of a maximum relative entropy principle.

  9. A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Yang, Yuantao; Li, Guoyan; Xu, Minqiang; Huang, Wenhu

    2017-07-01

    Health condition identification of planetary gearboxes is crucial to reduce downtime and maximize productivity. This paper aims to develop a novel fault diagnosis method based on modified multi-scale symbolic dynamic entropy (MMSDE) and minimum redundancy maximum relevance (mRMR) to identify the different health conditions of planetary gearboxes. MMSDE is proposed to quantify the regularity of time series, which can assess the dynamical characteristics over a range of scales. MMSDE has obvious advantages in the detection of dynamical changes and computational efficiency. Then, the mRMR approach is introduced to refine the fault features. Lastly, the obtained new features are fed into the least-squares support vector machine (LSSVM) to complete the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault types of planetary gearboxes.
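
    The multi-scale, symbolization-then-entropy pipeline described above can be sketched in simplified form: coarse-grain the signal at each scale, map samples to a small symbol alphabet, and take the Shannon entropy of the symbol distribution. This is a generic stand-in for MMSDE (the paper's symbolization and its modifications are more involved); the quantile-based symbolization and scale range are assumptions for illustration:

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping moving average, as in standard multi-scale entropy."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def symbolic_entropy(x, n_symbols=4):
    """Shannon entropy of a quantile-based symbolization of the signal."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    symbols = np.digitize(x, edges)
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def multiscale_symbolic_entropy(x, max_scale=5):
    return [symbolic_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

noise = np.random.default_rng(0).normal(size=20000)
profile = multiscale_symbolic_entropy(noise, max_scale=4)
```

    The resulting entropy-versus-scale profile is the feature vector that a selector such as mRMR would then refine before classification.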

  10. Using max entropy ratio of recurrence plot to measure electrocorticogram changes in epilepsy patients

    NASA Astrophysics Data System (ADS)

    Yan, Jiaqing; Wang, Yinghua; Ouyang, Gaoxiang; Yu, Tao; Li, Xiaoli

    2016-02-01

    A maximum entropy ratio (MER) method is adapted for the first time to investigate high-dimensional electrocorticogram (ECoG) data from epilepsy patients. MER is a symbolic analysis approach for the detection of recurrence domains of complex dynamical systems from time series. Data were chosen from eight patients undergoing pre-surgical evaluation for drug-resistant temporal lobe epilepsy. MERs for interictal and ictal data were calculated and compared. A statistical test was performed to evaluate the ability of MER to separate the interictal state from the ictal state. MER showed significant changes from the interictal state into the ictal state: MER was low in the ictal state and significantly different from that in the interictal state. These results suggest that MER is able to separate the ictal state from the interictal state based on ECoG data. It has the potential to detect the transition between normal brain activity and the ictal state.

  11. Maximum caliber inference of nonequilibrium processes

    NASA Astrophysics Data System (ADS)

    Otten, Moritz; Stock, Gerhard

    2010-07-01

    Thirty years ago, Jaynes suggested a general theoretical approach to nonequilibrium statistical mechanics, called maximum caliber (MaxCal) [Annu. Rev. Phys. Chem. 31, 579 (1980)]. MaxCal is a variational principle for dynamics in the same spirit that maximum entropy is a variational principle for equilibrium statistical mechanics. Motivated by the success of maximum entropy inference methods for equilibrium problems, in this work the MaxCal formulation is applied to the inference of nonequilibrium processes. That is, given some time-dependent observables of a dynamical process, one constructs a model that reproduces these input data and moreover, predicts the underlying dynamics of the system. For example, the observables could be some time-resolved measurements of the folding of a protein, which are described by a few-state model of the free energy landscape of the system. MaxCal then calculates the probabilities of an ensemble of trajectories such that on average the data are reproduced. From this probability distribution, any dynamical quantity of the system can be calculated, including population probabilities, fluxes, or waiting time distributions. After briefly reviewing the formalism, the practical numerical implementation of MaxCal in the case of an inference problem is discussed. Adopting various few-state models of increasing complexity, it is demonstrated that the MaxCal principle indeed works as a practical method of inference: The scheme is fairly robust and yields correct results as long as the input data are sufficient. As the method is unbiased and general, it can deal with any kind of time dependency such as oscillatory transients and multitime decays.
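
    The MaxCal construction described above can be illustrated on the smallest possible case: a two-state system observed for T steps, with the mean number of state switches as the single dynamical constraint. Maximizing path entropy subject to that constraint gives a tilted path ensemble P(path) ∝ exp(-γ·n_switches), with γ fixed by the data. A toy sketch (T, the target, and the bisection solver are illustrative choices):

```python
import itertools, math

def maxcal_two_state(T, target_switches):
    """Max-caliber toy inference: over all 2^T two-state trajectories, the
    max-entropy path distribution with a constrained mean switch count is
    P(path) ∝ exp(-gamma * n_switches). Bisection finds gamma."""
    paths = itertools.product([0, 1], repeat=T)
    n_sw = [sum(a != b for a, b in zip(p, p[1:])) for p in paths]

    def mean_switches(gamma):
        w = [math.exp(-gamma * n) for n in n_sw]
        return sum(wi * ni for wi, ni in zip(w, n_sw)) / sum(w)

    lo, hi = -20.0, 20.0           # mean_switches is decreasing in gamma
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if mean_switches(mid) > target_switches:
            lo = mid
        else:
            hi = mid
    gamma = 0.5 * (lo + hi)
    return gamma, mean_switches(gamma)

gamma, achieved = maxcal_two_state(8, 2.0)   # constrain mean switches to 2
```

    From the resulting path distribution, any dynamical quantity (fluxes, waiting times) can be computed, which is the sense in which MaxCal predicts dynamics beyond the input observables; realistic applications replace the exhaustive path enumeration with sampling.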

  12. Human vision is determined based on information theory.

    PubMed

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-03

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.
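
    The intensity-only baseline that the abstract argues is insufficient can be computed directly from Planck's law. A sketch assuming a solar effective temperature of 5778 K (grid bounds are arbitrary); note the bare intensity peak (~502 nm, per Wien's displacement law) differs from the 555 nm and 508 nm optima quoted above, which is the abstract's point:

```python
import numpy as np

h, c, kB = 6.62607015e-34, 2.99792458e8, 1.380649e-23   # SI (exact since 2019)

def planck_lambda(lam, T):
    """Blackbody spectral radiance B_lambda(T), in W·sr⁻¹·m⁻³."""
    return (2.0 * h * c ** 2 / lam ** 5) / np.expm1(h * c / (lam * kB * T))

wavelengths = np.linspace(200e-9, 1200e-9, 100001)
peak = wavelengths[np.argmax(planck_lambda(wavelengths, 5778.0))]  # ≈ 502 nm
```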

  13. Human vision is determined based on information theory

    NASA Astrophysics Data System (ADS)

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.

  14. Quantifying Extrinsic Noise in Gene Expression Using the Maximum Entropy Framework

    PubMed Central

    Dixit, Purushottam D.

    2013-01-01

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. PMID:23790383
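The wider-than-Poisson effect mentioned above can be demonstrated with a minimal numerical sketch (illustrative, not the paper's derivation): if an extrinsic factor sets the mRNA production rate, the marginal copy-number distribution is a Poisson mixture, whose variance exceeds its mean (Fano factor > 1), unlike a single Poisson where variance equals mean.

```python
import math

# Hypothetical two-value extrinsic factor: the transcription rate is 5 or 15
# with equal probability.  The marginal mRNA distribution is then a Poisson
# mixture; its variance is E[r] + Var(r) > E[r], i.e. wider than Poisson.

rates = [5.0, 15.0]                 # hypothetical extrinsic rate values
weights = [0.5, 0.5]

def pmf(n):
    return sum(w * math.exp(-r) * r**n / math.factorial(n)
               for w, r in zip(weights, rates))

ns = range(0, 60)                   # truncation; tail beyond 60 is negligible
mean = sum(n * pmf(n) for n in ns)
var = sum((n - mean) ** 2 * pmf(n) for n in ns)
print(round(mean, 3), round(var, 3), round(var / mean, 3))  # Fano factor > 1
```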

  15. Quantifying extrinsic noise in gene expression using the maximum entropy framework.

    PubMed

    Dixit, Purushottam D

    2013-06-18

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. Human vision is determined based on information theory

    PubMed Central

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-01-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition. PMID:27808236

  17. Optimization of rainfall networks using information entropy and temporal variability analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy not only represents the uncertainty of the rainfall distribution but also reflects the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability. Through an easy-to-implement greedy ranking algorithm based on the Maximum Information Minimum Redundancy (MIMR) criterion, stations of the networks in the two areas (each further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability that should be considered during network evaluation. We propose a dynamic network evaluation framework that accounts for this variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). In this way, rainfall stations that are temporarily important or redundant can be identified, providing useful guidance for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, the optimal networks from MIMR differ in entropy value between periods (wet season or dry season), with the wet-season optimal network tending to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions is recommended.
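A greedy entropy-based station ranking of the kind described above can be sketched as follows. This is a simplified stand-in for MIMR (the paper's exact objective weights its information and redundancy terms differently): each step adds the station that maximizes joint entropy of the selected set minus its pairwise redundancy (mutual information) with stations already chosen. The station data here are synthetic.

```python
from collections import Counter
import math, random

# Synthetic "rainfall" series: station A carries a base signal, B is a noisy
# copy of A (redundant), C is independent (informative but uncorrelated).
random.seed(0)
n_obs = 500
base = [random.randint(0, 3) for _ in range(n_obs)]
stations = {
    "A": base,
    "B": [(b + random.randint(0, 1)) % 4 for b in base],   # noisy copy of A
    "C": [random.randint(0, 3) for _ in range(n_obs)],     # independent
}

def H(*series):
    """Shannon entropy (bits) of one or more jointly binned series."""
    counts = Counter(zip(*series))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def mutual_info(x, y):
    return H(x) + H(y) - H(x, y)

ranking = []
while len(ranking) < len(stations):
    def score(name):
        joint = H(*(stations[s] for s in ranking + [name]))
        redundancy = sum(mutual_info(stations[name], stations[s])
                         for s in ranking)
        return joint - redundancy
    best = max((s for s in stations if s not in ranking), key=score)
    ranking.append(best)
print(ranking)
```

The noisy copy shares roughly a bit of information with its source, while the independent station shares almost none, which is what drives the redundancy penalty.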

  18. Classification of pulmonary pathology from breath sounds using the wavelet packet transform and an extreme learning machine.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian; Huliraj, N; Revadi, S S

    2017-06-08

    Auscultation is a medical procedure used for the initial diagnosis and assessment of lung and heart diseases. From this perspective, we propose assessing the performance of extreme learning machine (ELM) classifiers for the diagnosis of pulmonary pathology using breath sounds. Energy and entropy features were extracted from the breath sounds using the wavelet packet transform. The statistical significance of the extracted features was evaluated by one-way analysis of variance (ANOVA). The extracted features were then fed into the ELM classifier. The maximum classification accuracies obtained for the conventional validation (CV) of the energy and entropy features were 97.36% and 98.37%, respectively, whereas the accuracies obtained for the cross validation (CRV) of the energy and entropy features were 96.80% and 97.91%, respectively. In addition, maximum classification accuracies of 98.25% and 99.25% were obtained for the CV and CRV of the ensemble features, respectively. These results indicate that the classification accuracy obtained with the ensemble features was higher than that obtained with the energy and entropy features individually.
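The ELM at the heart of the pipeline above is simple enough to sketch: a single hidden layer with random, untrained weights, and an output layer fit in closed form by least squares. This is a generic minimal ELM, not the paper's implementation, and the features here are synthetic stand-ins rather than wavelet-packet energy/entropy features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: class 0 near the origin, class 1 shifted by 3 sigma.
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(3, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)

def elm_fit(X, y, n_hidden=50, rng=rng):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases (never trained)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # least-squares output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return (np.tanh(X @ W + b) @ beta > 0.5).astype(int)

model = elm_fit(X, y)
acc = (elm_predict(model, X) == y).mean()
print(acc)
```

The appeal of the ELM is that only the output weights are solved for, so training reduces to a single pseudo-inverse rather than iterative backpropagation.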

  19. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation

    PubMed Central

    Yuan, Xin; Martínez, José-Fernán; Eckert, Martina; López-Santidrián, Lourdes

    2016-01-01

    The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with four other TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above-mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in the classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions than with those detected by the maximum entropy TSM. Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure, which is shown in simulated experiments. PMID:27455279
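The two thresholding criteria being compared above are both classical histogram methods and fit in a few lines each: Otsu maximizes between-class variance, while the maximum-entropy (Kapur-style) criterion maximizes the sum of foreground and background entropies. The sketch below runs both on a synthetic bimodal histogram standing in for a sonar image; it illustrates the criteria only, not the paper's improved Otsu variant.

```python
import numpy as np

# Synthetic bimodal "image": dark background around 60, bright objects around 180.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 10, 5000)])
img = np.clip(img, 0, 255).astype(int)
p = np.bincount(img, minlength=256).astype(float)
p /= p.sum()

def otsu(p):
    """Threshold maximizing between-class variance."""
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def max_entropy(p):
    """Threshold maximizing summed foreground/background entropies."""
    best_t, best_H = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        q0, q1 = p[:t] / w0, p[t:] / w1
        Ht = (-sum(x * np.log(x) for x in q0 if x > 0)
              - sum(x * np.log(x) for x in q1 if x > 0))
        if Ht > best_H:
            best_t, best_H = t, Ht
    return best_t

print(otsu(p), max_entropy(p))
```

Both thresholds land between the two modes for well-separated histograms; the entropy criterion's extra logarithms over every bin at every candidate threshold are one source of its higher computational cost noted in the abstract.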

  20. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation.

    PubMed

    Yuan, Xin; Martínez, José-Fernán; Eckert, Martina; López-Santidrián, Lourdes

    2016-07-22

    The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with four other TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above-mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in the classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions than with those detected by the maximum entropy TSM. Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure, which is shown in simulated experiments.

  1. Comment on "Inference with minimal Gibbs free energy in information field theory".

    PubMed

    Iatsenko, D; Stefanovska, A; McClintock, P V E

    2012-03-01

    Enßlin and Weig [Phys. Rev. E 82, 051112 (2010)] have introduced a "minimum Gibbs free energy" (MGFE) approach for estimation of the mean signal and signal uncertainty in Bayesian inference problems: it aims to combine the maximum a posteriori (MAP) and maximum entropy (ME) principles. We point out, however, that there are some important questions to be clarified before the new approach can be considered fully justified, and therefore able to be used with confidence. In particular, after obtaining a Gaussian approximation to the posterior in terms of the MGFE at some temperature T, this approximation should always be raised to the power of T to yield a reliable estimate. In addition, we show explicitly that MGFE indeed incorporates the MAP principle, as well as the MDI (minimum discrimination information) approach, but not the well-known ME principle of Jaynes [E.T. Jaynes, Phys. Rev. 106, 620 (1957)]. We also illuminate some related issues and resolve apparent discrepancies. Finally, we investigate the performance of MGFE estimation for different values of T, and we discuss the advantages and shortcomings of the approach.

  2. Molecular classification of pesticides including persistent organic pollutants, phenylurea and sulphonylurea herbicides.

    PubMed

    Torrens, Francisco; Castellano, Gloria

    2014-06-05

    Pesticide residues in wine were analyzed by liquid chromatography-tandem mass spectrometry, and the retentions are modelled by structure-property relationships. Bioplastic evolution is an evolutionary perspective that conjugates the effect of acquired characters with the principles of evolutionary indeterminacy, morphological determination and natural selection; its application to the design of a co-ordination index barely improves the correlations. Fractal dimensions and the partition coefficient differentiate the pesticides. The classification algorithms are based on information entropy and its production. The pesticides allow a structural classification by nonplanarity and by the number of O, S, N and Cl atoms and of cycles; different behaviours depend on the number of cycles. The novelty of the approach is that the structural parameters are related to the retentions. When the procedures are applied to moderate-sized sets, an excessive number of results appears, compatible with a combinatorial explosion of the data; however, the equipartition conjecture selects the classification criterion among hierarchical trees. Information entropy permits classifying the compounds in agreement with principal component analyses. The periodic classification shows that pesticides in the same group present similar properties, and those also in the same period show maximum resemblance. The advantage of the classification is that it predicts retentions for molecules not included in the categorization. The classification extends to phenyl/sulphonylureas, and the application will be to predict their retentions.

  3. New Fault Recognition Method for Rotary Machinery Based on Information Entropy and a Probabilistic Neural Network.

    PubMed

    Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu

    2018-01-24

    Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signal of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from vibration signals, namely, singular spectrum entropy, power spectrum entropy, and approximate entropy. Then the feature fusion model is constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of methods using the three kinds of information entropy separately. The new approach is thus shown to be an effective fault recognition method for rotating machinery.
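Two of the three entropy features named above can be sketched for a 1-D signal: power spectrum entropy (Shannon entropy of the normalized FFT power spectrum) and singular spectrum entropy (entropy of the normalized singular values of a time-delay embedding). The definitions below follow common usage; the paper's exact normalizations may differ, and the "healthy"/"faulty" signals are synthetic.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def power_spectrum_entropy(x):
    spec = np.abs(np.fft.rfft(x)) ** 2
    return shannon(spec / spec.sum())

def singular_spectrum_entropy(x, dim=10):
    # Trajectory (Hankel) matrix of lagged windows, then SVD.
    M = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
    s = np.linalg.svd(M, compute_uv=False)
    return shannon(s / s.sum())

t = np.linspace(0, 1, 1024, endpoint=False)
pure = np.sin(2 * np.pi * 50 * t)                              # single tone
noisy = pure + np.random.default_rng(2).normal(0, 1, t.size)   # broadband

# A broadband (fault-like) signal has higher entropy in both senses.
print(power_spectrum_entropy(pure) < power_spectrum_entropy(noisy))
print(singular_spectrum_entropy(pure) < singular_spectrum_entropy(noisy))
```

This ordering (complex, broadband vibration yields higher entropy) is what makes such features usable as inputs to a downstream classifier like the probabilistic neural network.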

  4. Applications of quantum entropy to statistics

    NASA Astrophysics Data System (ADS)

    Silver, R. N.; Martz, H. F.

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and of smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
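The von Neumann entropy that replaces Shannon entropy above is S(ρ) = -Tr(ρ log ρ), computed from the eigenvalues of the density matrix ρ; for a diagonal ρ it reduces to the Shannon entropy of the diagonal. A minimal numerical check:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerically zero eigenvalues
    return float(-(evals * np.log(evals)).sum())

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                       # maximally mixed qubit: ln 2

print(von_neumann_entropy(pure), von_neumann_entropy(mixed))  # 0.0 and ln 2
```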

  5. Thermodynamics of extremal rotating thin shells in an extremal BTZ spacetime and the extremal black hole entropy

    NASA Astrophysics Data System (ADS)

    Lemos, José P. S.; Minamitsuji, Masato; Zaslavskii, Oleg B.

    2017-02-01

    In a (2+1)-dimensional spacetime with a negative cosmological constant, the thermodynamics and the entropy of an extremal rotating thin shell, i.e., an extremal rotating ring, are investigated. The outer and inner regions with respect to the shell are taken to be the Bañados-Teitelboim-Zanelli (BTZ) spacetime and the vacuum ground state anti-de Sitter spacetime, respectively. By applying the first law of thermodynamics to the extremal thin shell, one shows that the entropy of the shell is an arbitrary well-behaved function of the gravitational area A+ alone, S = S(A+). When the thin shell approaches its own gravitational radius r+ and turns into an extremal rotating BTZ black hole, it is found that the entropy of the spacetime remains such a function of A+, both when the local temperature of the shell at the gravitational radius is zero and nonzero. It is thus vindicated by this analysis that extremal black holes, here extremal BTZ black holes, have different properties from the corresponding nonextremal black holes, which have a definite entropy, the Bekenstein-Hawking entropy S(A+) = A+/(4G), where G is the gravitational constant. It is argued that for extremal black holes, in particular for extremal BTZ black holes, one should set 0 ≤ S(A+) ≤ A+/(4G); i.e., the extremal black hole entropy has values in between zero and the maximum Bekenstein-Hawking entropy A+/(4G). Thus, rather than having just two entropies for extremal black holes, as previous results have debated, namely, 0 and A+/(4G), it is shown here that extremal black holes, in particular extremal BTZ black holes, may have a continuous range of entropies, limited by precisely those two values. Surely, the entropy that a particular extremal black hole picks must depend on past processes, notably on how it was formed. A remarkable relation between the third law of thermodynamics and the impossibility for a massive body to reach the velocity of light is also found. In addition, in the procedure, it becomes clear that there are two distinct angular velocities for the shell, the mechanical and thermodynamic angular velocities. We comment on the relationship between these two velocities. In passing, we clarify, for a static spacetime with a thermal shell, the meaning of the Tolman temperature formula at a generic radius and at the shell.

  6. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    PubMed

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b.
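The two-box MEP construction summarized above can be sketched numerically: box 1 (equator) absorbs more sunlight than box 2 (pole); a heat transport F from box 1 to box 2 closes each box's energy balance I - σT⁴ ∓ F = 0, and MEP selects the F that maximizes the entropy production F(1/T₂ - 1/T₁). The forcing numbers below are illustrative, not a fit to any planet.

```python
import numpy as np

SIGMA = 5.670374419e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
I1, I2 = 300.0, 150.0             # absorbed shortwave per box, W m^-2 (toy values)

def temperatures(F):
    """Box temperatures closing the energy balance for a given transport F."""
    T1 = ((I1 - F) / SIGMA) ** 0.25
    T2 = ((I2 + F) / SIGMA) ** 0.25
    return T1, T2

# Scan transports between 0 (no transport) and (I1 - I2)/2 (isothermal);
# entropy production vanishes at both ends and peaks in between.
Fs = np.linspace(0.0, 74.0, 7401)
EP = [F * (1.0 / temperatures(F)[1] - 1.0 / temperatures(F)[0]) for F in Fs]
F_mep = Fs[int(np.argmax(EP))]
T1, T2 = temperatures(F_mep)
print(round(F_mep, 1), round(T1, 1), round(T2, 1))
```

The MEP selection is a one-dimensional maximization here, which is exactly why the two-box model is useful for exposition but too coarse for feedback-rich climate systems, as the abstract notes.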

  7. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies

    PubMed Central

    Lorenz, Ralph D.

    2010-01-01

    The ‘two-box model’ of planetary climate is discussed. This model has been used to demonstrate consistency of the equator–pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b. PMID:20368253

  8. Optimal behavior of viscoelastic flow at resonant frequencies.

    PubMed

    Lambert, A A; Ibáñez, G; Cuevas, S; del Río, J A

    2004-11-01

    The global entropy generation rate in the zero-mean oscillatory flow of a Maxwell fluid in a pipe is analyzed with the aim of determining its behavior at resonant flow conditions. This quantity is calculated explicitly using the analytic expression for the velocity field and assuming isothermal conditions. The global entropy generation rate shows well-defined peaks at the resonant frequencies where the flow displays maximum velocities. It was found that resonant frequencies can be considered optimal in the sense that they maximize the power transmitted to the pulsating flow at the expense of maximum dissipation.

  9. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
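The MEP's use for characterizing distributions, mentioned above, is easy to demonstrate: on the support {0, ..., N} with only the mean fixed, the maximum-entropy distribution is p_k ∝ exp(-λk), the truncated geometric family. The sketch solves for the multiplier by bisection and checks the constraint.

```python
import math

N, target_mean = 20, 4.0

def mean_of(lam):
    """Mean of p_k ∝ exp(-lam*k) on {0,...,N}; decreasing in lam."""
    w = [math.exp(-lam * k) for k in range(N + 1)]
    Z = sum(w)
    return sum(k * wk for k, wk in enumerate(w)) / Z

# Bisection for the Lagrange multiplier enforcing the mean constraint.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_of(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

p = [math.exp(-lam * k) for k in range(N + 1)]
Z = sum(p)
p = [x / Z for x in p]
entropy = -sum(x * math.log(x) for x in p if x > 0)
print(round(lam, 4), round(entropy, 4))
```

Swapping in different constraints (variance, expected log, etc.) characterizes Gaussian, power-law and other families in the same way, which is the sense in which MEP "characterizes statistical distributions."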

  10. Ergodicity, configurational entropy and free energy in pigment solutions and plant photosystems: influence of excited state lifetime.

    PubMed

    Jennings, Robert C; Zucchelli, Giuseppe

    2014-01-01

    We examine ergodicity and configurational entropy for a dilute pigment solution and for a suspension of plant photosystem particles in which both ground and excited state pigments are present. It is concluded that the pigment solution, due to the extreme brevity of the excited state lifetime, is non-ergodic and the configurational entropy approaches zero. Conversely, due to the rapid energy transfer among pigments, each photosystem is ergodic and the configurational entropy is positive. This decreases the free energy of the single photosystem pigment array by a small amount. On the other hand, the suspension of photosystems is non-ergodic and the configurational entropy approaches zero. The overall configurational entropy which, in principle, includes contributions from both the single excited photosystems and the suspension which contains excited photosystems, also approaches zero. Thus the configurational entropy upon photon absorption by either a pigment solution or a suspension of photosystem particles is approximately zero. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.

  12. Energy transports by ocean and atmosphere based on an entropy extremum principle. I - Zonal averaged transports

    NASA Technical Reports Server (NTRS)

    Sohn, Byung-Ju; Smith, Eric A.

    1993-01-01

    The maximum entropy production principle suggested by Paltridge (1975) is applied to separating the satellite-determined required total transports into atmospheric and oceanic components. Instead of using the excessively restrictive equal energy dissipation hypothesis as a deterministic tool for separating transports between the atmosphere and ocean fluids, the satellite-inferred required 2D energy transports are imposed on Paltridge's energy balance model, which is then solved as a variational problem using the equal energy dissipation hypothesis only to provide an initial guess field. It is suggested that Southern Ocean transports are weaker than previously reported. It is argued that a maximum entropy production principle can serve as a governing rule on macroscale global climate, and, in conjunction with conventional satellite measurements of the net radiation balance, provides a means to decompose atmosphere and ocean transports from the total transport field.

  13. Direct measurement of the electrocaloric effect in poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) terpolymer films

    NASA Astrophysics Data System (ADS)

    Basso, Vittorio; Russo, Florence; Gerard, Jean-François; Pruvost, Sébastien

    2013-11-01

    We investigated the entropy change in poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) (P(VDF-TrFE-CTFE)) films in the temperature range between -5 °C and 60 °C by direct heat flux calorimetry using Peltier cell heat flux sensors. At the electric field E = 50 MV m^-1 the isothermal entropy change attains a maximum of |Δs| = 4.2 J kg^-1 K^-1 at 31 °C, with an adiabatic temperature change ΔT_ad = 1.1 K. At temperatures below the maximum, in the range from 25 °C to -5 °C, the entropy change |Δs| rapidly decreases and the unipolar P vs E relationship becomes hysteretic. This phenomenon is interpreted as indicating that the fluctuations of the polar segments of the polymer chain, responsible for the electrocaloric effect (ECE) in the polymer, become progressively frozen below the relaxor transition.

  14. Maximum Renyi entropy principle for systems with power-law Hamiltonians.

    PubMed

    Bashkirov, A G

    2004-09-24

    The Renyi distribution ensuring the maximum of Renyi entropy is investigated for a particular case of a power-law Hamiltonian. Both Lagrange parameters alpha and beta can be eliminated. It is found that beta does not depend on a Renyi parameter q and can be expressed in terms of an exponent kappa of the power-law Hamiltonian and an average energy U. The Renyi entropy for the resulting Renyi distribution reaches its maximal value at q=1/(1+kappa) that can be considered as the most probable value of q when we have no additional information on the behavior of the stochastic process. The Renyi distribution for such q becomes a power-law distribution with the exponent -(kappa+1). When q=1/(1+kappa)+epsilon (0
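The Renyi entropy underlying the abstract above is H_q(p) = log(Σ pᵢ^q)/(1-q), with the Shannon entropy recovered in the limit q → 1. The sketch below only checks that limit numerically for an arbitrary distribution; it does not reproduce the power-law Hamiltonian analysis.

```python
import math

def renyi(p, q):
    """Renyi entropy of order q; the q -> 1 limit is the Shannon entropy."""
    if abs(q - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** q for x in p)) / (1.0 - q)

p = [0.5, 0.25, 0.125, 0.125]
shannon = renyi(p, 1.0)
near_one = renyi(p, 1.0 + 1e-6)       # should approximate the Shannon value
print(round(shannon, 6), round(near_one, 6))
```

H_q is non-increasing in q, so distinct orders weight the distribution's tails differently, which is what makes the "most probable q" question in the abstract meaningful.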

  15. Entropy Inequality Violations from Ultraspinning Black Holes.

    PubMed

    Hennigar, Robie A; Mann, Robert B; Kubizňák, David

    2015-07-17

    We construct a new class of rotating anti-de Sitter (AdS) black hole solutions with noncompact event horizons of finite area in any dimension and study their thermodynamics. In four dimensions these black holes are solutions to gauged supergravity. We find that their entropy exceeds the maximum implied from the conjectured reverse isoperimetric inequality, which states that for a given thermodynamic volume, the black hole entropy is maximized for Schwarzschild-AdS space. We use this result to suggest more stringent conditions under which this conjecture may hold.

  16. Bayesian approach to inverse statistical mechanics.

    PubMed

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  17. Bayesian approach to inverse statistical mechanics

    NASA Astrophysics Data System (ADS)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  18. Data free inference with processed data products

    DOE PAGES

    Chowdhary, K.; Najm, H. N.

    2014-07-12

    Here, we consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the data is unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate consistent data with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors, to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.
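    The pooling idea in this record can be sketched under simplifying assumptions: a single model parameter theta, hypothetical summary values (mean and standard deviation of a model output), and the fact that the maximum entropy density matching a mean and variance is Gaussian. This is an illustration of the approach, not the authors' algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical reported summary: mean and standard deviation of a model
    # output y, with the underlying data unavailable.
    y_mean, y_sd, n_obs = 3.0, 0.5, 20
    noise_sd = 0.5                      # assumed known observation noise

    # The maximum entropy density matching a mean and variance is Gaussian,
    # so synthetic data sets consistent with the summary are drawn from it.
    theta = np.linspace(0.0, 6.0, 601)  # grid over the model parameter
    dx = theta[1] - theta[0]
    pooled = np.zeros_like(theta)
    n_sets = 200
    for _ in range(n_sets):
        y = rng.normal(y_mean, y_sd, size=n_obs)
        # Gaussian likelihood, flat prior: posterior over theta on the grid
        loglik = -0.5 * ((y[:, None] - theta[None, :]) ** 2).sum(0) / noise_sd**2
        post = np.exp(loglik - loglik.max())
        post /= post.sum() * dx         # normalize to a density
        pooled += post / n_sets         # average the per-data-set posteriors
    ```

    The averaged density `pooled` can then be used for forward uncertainty propagation consistent with the reported statistics.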

  19. Co-complex protein membership evaluation using Maximum Entropy on GO ontology and InterPro annotation.

    PubMed

    Armean, Irina M; Lilley, Kathryn S; Trotter, Matthew W B; Pilkington, Nicholas C V; Holden, Sean B

    2018-06-01

    Protein-protein interactions (PPI) play a crucial role in our understanding of protein function and biological processes. The standardization and recording of experimental findings is increasingly stored in ontologies, with the Gene Ontology (GO) being one of the most successful projects. Several PPI evaluation algorithms have been based on the application of probabilistic frameworks or machine learning algorithms to GO properties. Here, we introduce a new training set design and machine learning based approach that combines dependent heterogeneous protein annotations from the entire ontology to evaluate putative co-complex protein interactions determined by empirical studies. PPI annotations are built combinatorially using corresponding GO terms and InterPro annotation. We use a S. cerevisiae high-confidence complex dataset as a positive training set. A series of classifiers based on Maximum Entropy and support vector machines (SVMs), each with a composite counterpart algorithm, are trained on a series of training sets. These achieve a high area under the ROC curve of up to 0.97, outperforming go2ppi, a previously established PPI prediction tool based on GO annotations. Availability and implementation: https://github.com/ima23/maxent-ppi. Contact: sbh11@cl.cam.ac.uk. Supplementary data are available at Bioinformatics online.

  20. Linearized semiclassical initial value time correlation functions with maximum entropy analytic continuation.

    PubMed

    Liu, Jian; Miller, William H

    2008-09-28

    The maximum entropy analytic continuation (MEAC) method is used to extend the range of accuracy of the linearized semiclassical initial value representation (LSC-IVR)/classical Wigner approximation for real time correlation functions. LSC-IVR provides a very effective "prior" for the MEAC procedure, since it is very good at short times, exact for all times and temperatures for harmonic potentials (even for correlation functions of nonlinear operators), and becomes exact in the classical high temperature limit. This combined MEAC+LSC/IVR approach is applied here to two highly nonlinear dynamical systems: a one-dimensional pure quartic potential, and liquid para-hydrogen at two thermal state points (25 and 14 K under nearly zero external pressure). The former example shows the MEAC procedure to be a very significant enhancement of the LSC-IVR for correlation functions of both linear and nonlinear operators, especially at low temperature where semiclassical approximations are least accurate. For liquid para-hydrogen, the LSC-IVR is already excellent at T=25 K, but the MEAC procedure produces a significant correction at the lower temperature (T=14 K). Comparisons are also made as to how the MEAC procedure is able to provide corrections for other trajectory-based dynamical approximations when used as priors.

  1. Predicting Changes in Macrophyte Community Structure from Functional Traits in a Freshwater Lake: A Test of Maximum Entropy Model

    PubMed Central

    Fu, Hui; Zhong, Jiayou; Yuan, Guixiang; Guo, Chunjing; Lou, Qian; Zhang, Wei; Xu, Jun; Ni, Leyi; Xie, Ping; Cao, Te

    2015-01-01

    Trait-based approaches have been widely applied to investigate how community dynamics respond to environmental gradients. In this study, we applied a series of maximum entropy (maxent) models incorporating functional traits to unravel the processes governing macrophyte community structure along a water depth gradient in a freshwater lake. We sampled 42 plots and 1513 individual plants, and measured 16 functional traits and the abundance of 17 macrophyte species. Results showed that the maxent model can be highly robust (99.8%) in predicting the relative abundance of macrophyte species when observed community-weighted mean (CWM) traits are used as constraints, but relatively weak (about 30%) when CWM traits fitted from the water depth gradient are used as constraints. The measured traits showed notably distinct importance in predicting species abundances, lowest for perennial growth form and highest for leaf dry mass content. For tuber and leaf nitrogen content, there were significant shifts in their effects on species relative abundance, from positive in shallow water to negative in deep water. This result suggests that macrophyte species with a tuber organ and greater leaf nitrogen content become more abundant in shallow water, but less abundant in deep water. Our study highlights how functional traits distributed across gradients provide a robust path towards predictive community ecology. PMID:26167856
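    The core computation in this kind of trait-constrained maxent (CATS-style) model can be sketched in a few lines: find the species relative abundances of maximum entropy whose community-weighted means match the observed trait constraints. The trait values below are made up for illustration; this is a minimal sketch, not the authors' pipeline:

    ```python
    import numpy as np

    def maxent_abundance(traits, cwm, lr=0.5, steps=20000, tol=1e-10):
        """Maximum entropy estimate of relative abundances.

        traits: (S, K) matrix of K functional traits for S species.
        cwm:    (K,) observed community-weighted mean traits (constraints).
        The maxent solution has exponential form p_i ~ exp(traits_i . lam);
        the Lagrange multipliers lam are fit by gradient descent on the dual.
        """
        lam = np.zeros(traits.shape[1])
        for _ in range(steps):
            z = traits @ lam
            w = np.exp(z - z.max())          # stabilized exponential weights
            p = w / w.sum()
            resid = cwm - traits.T @ p       # constraint residual
            if np.abs(resid).max() < tol:
                break
            lam += lr * resid
        return p

    # Toy community: 4 species, 2 traits (e.g. plant height, leaf N).
    traits = np.array([[1.0, 0.2],
                       [2.0, 0.4],
                       [3.0, 0.1],
                       [4.0, 0.3]])
    cwm = np.array([2.0, 0.3])               # must lie in the traits' convex hull
    p = maxent_abundance(traits, cwm)        # predicted relative abundances
    ```

    The returned `p` sums to one and reproduces the CWM constraints; comparing it against observed abundances is what yields the predictive-power figures quoted in the abstract.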

  2. Representing and computing regular languages on massively parallel networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M.I.; O'Sullivan, J.A.; Boysam, B.

    1991-01-01

    This paper proposes a general method for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach first establishes the formal connection of rules to Chomsky grammars, and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. The coupling to stochastic search methods yields the all-important practical result that fully parallel stochastic cellular automata may be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor, consisting of 1024 mesh-connected bit-serial processing elements, for performing automated segmentation of electron-micrograph images.

  3. Toward the Application of the Maximum Entropy Production Principle to a Broader Range of Far From Equilibrium Dissipative Systems

    NASA Astrophysics Data System (ADS)

    Lineweaver, C. H.

    2005-12-01

    The principle of Maximum Entropy Production (MEP) is being usefully applied to a wide range of non-equilibrium processes including flows in planetary atmospheres and the bioenergetics of photosynthesis. Our goal of applying the principle of maximum entropy production to an even wider range of Far From Equilibrium Dissipative Systems (FFEDS) depends on the reproducibility of the evolution of the system from macro-state A to macro-state B. In an attempt to apply the principle of MEP to astronomical and cosmological structures, we investigate the problematic relationship between gravity and entropy. In the context of open and non-equilibrium systems, we use a generalization of the Gibbs free energy to include the sources of free energy extracted by non-living FFEDS such as hurricanes and convection cells. Redox potential gradients and thermal and pressure gradients provide the free energy for a broad range of FFEDS, both living and non-living. However, these gradients have to be within certain ranges. If the gradients are too weak, FFEDS do not appear. If the gradients are too strong FFEDS disappear. Living and non-living FFEDS often have different source gradients (redox potential gradients vs thermal and pressure gradients) and when they share the same gradient, they exploit different ranges of the gradient. In a preliminary attempt to distinguish living from non-living FFEDS, we investigate the parameter space of: type of gradient and steepness of gradient.

  4. Learning Probabilities From Random Observables in High Dimensions: The Maximum Entropy Distribution and Others

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi

    2015-11-01

    We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).

  5. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
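    The entropy-based ranking that KECA performs (and that OKECA refines) can be sketched briefly: eigenpairs of an RBF kernel matrix are sorted by their contribution to a Rényi entropy estimate, lam_i * (1^T e_i)^2, rather than by eigenvalue alone as in kernel PCA. A minimal sketch, not the authors' implementation:

    ```python
    import numpy as np

    def keca(X, sigma, n_components):
        """Entropy-ranked kernel components (KECA-style sketch).

        Eigenpairs of the RBF kernel matrix are ranked by their
        contribution lam_i * (1^T e_i)^2 to a Renyi entropy estimate,
        not by eigenvalue magnitude.
        """
        sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-sq / (2 * sigma ** 2))          # RBF kernel matrix
        lam, E = np.linalg.eigh(K)                  # ascending eigenvalues
        contrib = lam * (E.sum(axis=0) ** 2)        # entropy contribution
        idx = np.argsort(contrib)[::-1][:n_components]
        return E[:, idx] * np.sqrt(np.abs(lam[idx]))

    rng = np.random.default_rng(1)
    X = rng.normal(size=(20, 2))
    Z = keca(X, sigma=1.0, n_components=2)          # (20, 2) projection
    ```

    OKECA's additional ICA-style rotation of these components is omitted here; the sketch only shows the entropy-ordered decomposition the brief starts from.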

  6. Analysis of rapid eye movement periodicity in narcoleptics based on maximum entropy method.

    PubMed

    Honma, H; Ohtomo, N; Kohsaka, M; Fukuda, N; Kobayashi, R; Sakakibara, S; Nakamura, F; Koyama, T

    1999-04-01

    We examined REM sleep periodicity in typical narcoleptics and patients who had shown signs of a narcoleptic tetrad without HLA-DRB1*1501/DQB1*0602 or DR2 antigens, using spectral analysis based on the maximum entropy method. The REM sleep period of typical narcoleptics showed two peaks, one at 70-90 min and one at 110-130 min at night, and a single peak at around 70-90 min during the daytime. The nocturnal REM sleep period of typical narcoleptics may be composed of several different periods, one of which corresponds to that of their daytime REM sleep.

  7. Stimulus-dependent Maximum Entropy Models of Neural Population Codes

    PubMed Central

    Segev, Ronen; Schneidman, Elad

    2013-01-01

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population. PMID:23516339

  8. Maximum entropy production principle for geostrophic turbulence

    NASA Astrophysics Data System (ADS)

    Sommeria, J.; Bouchet, F.; Chavanis, P. H.

    2003-04-01

    In 2D turbulence, complex stirring leads to the formation of steady organized states, once fine scale fluctuations have been filtered out. This self-organization can be explained in terms of statistical equilibrium for vorticity, as the most likely outcome of vorticity parcel rearrangements under the constraints of the conservation laws. A mixing entropy describing the vorticity rearrangements is introduced. Extension to the shallow water system has been proposed by Chavanis P.H. and Sommeria J. (2002), Phys. Rev. E. Generalization to multi-layer geostrophic flows is formally straightforward. Outside equilibrium, eddy fluxes should drive the system toward equilibrium, in the spirit of non-equilibrium linear thermodynamics. This can be formalized in terms of a principle of maximum entropy production (MEP), as shown by Robert and Sommeria (1991), Phys. Rev. Lett. 69. A parameterization of eddy fluxes is then obtained, involving an eddy diffusivity plus a drift term acting at larger scale. These two terms balance each other at equilibrium, resulting in a nontrivial steady flow, which is the mean state of the statistical equilibrium. Applications of this eddy parameterization will be presented in the context of oceanic circulation and Jupiter's Great Red Spot. Quantitative tests, obtained by comparison with direct numerical simulations, will be discussed. Kinetic models, inspired by plasma physics, provide a more precise description of the relaxation toward equilibrium, as shown by Chavanis P.H. 2000 ``Quasilinear theory of the 2D Euler equation'', Phys. Rev. Lett. 84. This approach provides relaxation equations with a form similar to the MEP, but not identical. In conclusion, the MEP captures the right trends of the system, but its precise justification remains elusive.

  9. Bioinspired Resource Management for Multiple-Sensor Target Tracking Systems

    DTIC Science & Technology

    2011-06-20

    Section 2, we also present the Rényi α-entropy and α-divergence [13] that are extensively utilized in our information-theoretic approach (cf. [9] and ...). The Rényi α-entropy provides a general scalar measure of uncertainty [10]: H_α(x_k | z_{1:k}) = (1/(1-α)) log ∫ p^α(x_k | z_{1:k}) dx_k (7). It follows that as α approaches unity, the Rényi α-entropy (7) reduces to the Shannon entropy: H_1(x_k | z_{1:k}) = lim_{α→1} H_α(x_k | z_{1:k}) = -∫ p(x_k | z_{1:k}) log p(x_k | z_{1:k}) dx_k
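    The limiting behaviour this record relies on, the Rényi α-entropy reducing to the Shannon entropy as α approaches 1, is easy to verify numerically for a discrete distribution; a small self-contained check (the example distribution is made up):

    ```python
    import numpy as np

    def renyi_entropy(p, alpha):
        """Renyi alpha-entropy of a discrete distribution p, in nats."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()
        if np.isclose(alpha, 1.0):           # alpha -> 1 limit: Shannon entropy
            q = p[p > 0]
            return -(q * np.log(q)).sum()
        return np.log((p ** alpha).sum()) / (1.0 - alpha)

    p = [0.5, 0.25, 0.125, 0.125]
    shannon = renyi_entropy(p, 1.0)          # Shannon entropy in nats
    near_one = renyi_entropy(p, 1.0001)      # approaches Shannon as alpha -> 1
    ```

    For this distribution the Shannon entropy is about 1.213 nats, and `near_one` agrees with it to several decimal places, as the limit in equation (7) predicts.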

  10. Entropy in self-similar shock profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolin, Len G.; Reisner, Jon Michael; Jordan, Pedro M.

    In this paper, we study the structure of a gaseous shock, and in particular the distribution of entropy within, in both a thermodynamic and a statistical mechanical context. The problem of shock structure has a long and distinguished history that we review. We employ the Navier-Stokes equations to construct a self-similar version of Becker's solution for a shock, assuming a particular (physically plausible) Prandtl number; that solution reproduces the well-known result of Morduchow & Libby that features a maximum of the equilibrium entropy inside the shock profile. We then construct an entropy profile, based on gas kinetic theory, that is smooth and monotonically increasing. The extension of equilibrium thermodynamics to irreversible processes is based in part on the assumption of local thermodynamic equilibrium. We show that this assumption is not valid except for the weakest shocks. Finally, we conclude by hypothesizing a thermodynamic nonequilibrium entropy and demonstrating that it closely estimates the gas kinetic nonequilibrium entropy within a shock.

  11. Entropy in self-similar shock profiles

    DOE PAGES

    Margolin, Len G.; Reisner, Jon Michael; Jordan, Pedro M.

    2017-07-16

    In this paper, we study the structure of a gaseous shock, and in particular the distribution of entropy within, in both a thermodynamic and a statistical mechanical context. The problem of shock structure has a long and distinguished history that we review. We employ the Navier-Stokes equations to construct a self-similar version of Becker's solution for a shock, assuming a particular (physically plausible) Prandtl number; that solution reproduces the well-known result of Morduchow & Libby that features a maximum of the equilibrium entropy inside the shock profile. We then construct an entropy profile, based on gas kinetic theory, that is smooth and monotonically increasing. The extension of equilibrium thermodynamics to irreversible processes is based in part on the assumption of local thermodynamic equilibrium. We show that this assumption is not valid except for the weakest shocks. Finally, we conclude by hypothesizing a thermodynamic nonequilibrium entropy and demonstrating that it closely estimates the gas kinetic nonequilibrium entropy within a shock.

  12. Finding the quantum thermoelectric with maximal efficiency and minimal entropy production at given power output

    NASA Astrophysics Data System (ADS)

    Whitney, Robert S.

    2015-03-01

    We investigate the nonlinear scattering theory for quantum systems with strong Seebeck and Peltier effects, and consider their use as heat engines and refrigerators with finite power outputs. This paper gives detailed derivations of the results summarized in a previous paper [R. S. Whitney, Phys. Rev. Lett. 112, 130601 (2014), 10.1103/PhysRevLett.112.130601]. It shows how to use the scattering theory to find (i) the quantum thermoelectric with maximum possible power output, and (ii) the quantum thermoelectric with maximum efficiency at given power output. The latter corresponds to minimal entropy production at that power output. These quantities are of quantum origin, since they depend on system size over electronic wavelength, and so have no analog in classical thermodynamics. The maximal efficiency coincides with Carnot efficiency at zero power output, but decreases with increasing power output. This gives a fundamental lower bound on entropy production, which means that reversibility (in the thermodynamic sense) is impossible at finite power output. The suppression of efficiency by (nonlinear) phonon and photon effects is addressed in detail; when these effects are strong, maximum efficiency coincides with maximum power. Finally, we show in particular limits (typically without magnetic fields) that relaxation within the quantum system does not allow the system to exceed the bounds derived for relaxation-free systems; however, a general proof of this remains elusive.

  13. Application of a multiscale maximum entropy image restoration algorithm to HXMT observations

    NASA Astrophysics Data System (ADS)

    Guan, Ju; Song, Li-Ming; Huo, Zhuo-Xi

    2016-08-01

    This paper introduces a multiscale maximum entropy (MSME) algorithm for image restoration of the Hard X-ray Modulation Telescope (HXMT), a collimated scanning X-ray satellite mainly devoted to a sensitive all-sky survey and pointed observations in the 1-250 keV range. The novelty of the MSME method is to use wavelet decomposition and multiresolution support to control noise amplification at different scales. Our work is focused on the application and modification of this method to restore diffuse sources detected by HXMT scanning observations. An improved method, the ensemble multiscale maximum entropy (EMSME) algorithm, is proposed to alleviate the problem of mode mixing existing in MSME. Simulations have been performed on the detection of the diffuse source Cen A by HXMT in all-sky survey mode. The results show that the MSME method is suited to the deconvolution task of HXMT for diffuse source detection, and that the improved method suppresses noise and improves the correlation and signal-to-noise ratio, proving itself the better algorithm for image restoration. Through one all-sky survey, HXMT could reach the capacity of detecting a diffuse source with a maximum differential flux of 0.5 mCrab. Supported by Strategic Priority Research Program on Space Science, Chinese Academy of Sciences (XDA04010300) and National Natural Science Foundation of China (11403014)

  14. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    PubMed

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been proposed in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models. Each time, we calculate the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state always shows increased entropy production over the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  15. Consistent Application of the Boltzmann Distribution to Residual Entropy in Crystals

    ERIC Educational Resources Information Center

    Kozliak, Evguenii I.

    2007-01-01

    Four different approaches to residual entropy (the entropy remaining in crystals comprised of nonsymmetric molecules like CO, N[subscript 2]O, FClO[subscript 3], and H[subscript 2]O as temperatures approach 0 K) are analyzed and a new method of its calculation is developed based on application of the Boltzmann distribution. The inherent connection…

  16. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes.
The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  17. A Maximum Entropy Model of the Bearded Capuchin Monkey Habitat Incorporating Topography and Spectral Unmixing Analysis

    NASA Astrophysics Data System (ADS)

    Howard, A. M.; Bernardes, S.; Nibbelink, N.; Biondi, L.; Presotto, A.; Fragaszy, D. M.; Madden, M.

    2012-07-01

    Movement patterns of bearded capuchin monkeys (Cebus (Sapajus) libidinosus) in northeastern Brazil are likely impacted by environmental features such as elevation, vegetation density, or vegetation type. Habitat preferences of these monkeys provide insights regarding the impact of environmental features on species ecology and the degree to which they incorporate these features in movement decisions. In order to evaluate environmental features influencing movement patterns and predict areas suitable for movement, we employed a maximum entropy modelling approach, using observation points along capuchin monkey daily routes as species presence points. We combined these presence points with spatial data on important environmental features from remotely sensed data on land cover and topography. A spectral mixing analysis procedure was used to generate fraction images that represent green vegetation, shade and soil of the study area. A Landsat Thematic Mapper scene of the area of study was geometrically and atmospherically corrected and used as input in a Minimum Noise Fraction (MNF) procedure and a linear spectral unmixing approach was used to generate the fraction images. These fraction images and elevation were the environmental layer inputs for our logistic MaxEnt model of capuchin movement. Our models' predictive power (test AUC) was 0.775. Areas of high elevation (>450 m) showed low probabilities of presence, and percent green vegetation was the greatest overall contributor to model AUC. This work has implications for predicting daily movement patterns of capuchins in our field site, as suitability values from our model may relate to habitat preference and facility of movement.

  18. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    PubMed

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently many promising approaches for determination of an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples. In other words, no information at all is provided by the uniform prior density distribution employed, which reflects complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to study a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests using non-overlapping sets of examples. Experimental results show that ME based priors improve the CIs when applied to four quite different simulated and two real world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
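    The effect described here, a sharper empirically informed prior tightening the credibility interval relative to the conventional uniform prior, can be illustrated with a grid-based Beta posterior. The informed prior parameters below are hypothetical, not derived from the paper's ME procedure:

    ```python
    import numpy as np

    def error_rate_ci(k, n, a=1.0, b=1.0, level=0.95):
        """Bayesian credibility interval for a classifier error rate.

        k errors in n holdout test examples; Beta(a, b) prior on the rate.
        a = b = 1 is the conventional uniform prior; an informed prior
        concentrates the Beta(a+k, b+n-k) posterior and tightens the CI.
        Quantiles are read off a grid approximation of the posterior CDF.
        """
        p = np.linspace(1e-6, 1 - 1e-6, 100001)
        logpdf = (a + k - 1) * np.log(p) + (b + n - k - 1) * np.log(1 - p)
        pdf = np.exp(logpdf - logpdf.max())
        cdf = np.cumsum(pdf)
        cdf /= cdf[-1]
        lo = p[np.searchsorted(cdf, (1 - level) / 2)]
        hi = p[np.searchsorted(cdf, 1 - (1 - level) / 2)]
        return lo, hi

    # 5 errors on 50 test examples: uniform vs a sharper (hypothetical) prior
    wide = error_rate_ci(5, 50)                 # uniform prior, Beta(1, 1)
    narrow = error_rate_ci(5, 50, a=4, b=36)    # informed prior, Beta(4, 36)
    ```

    With only 50 test examples the uniform-prior interval is wide, while the informed prior, here simply a Beta density centered near the same error rate, yields a visibly tighter interval, which is the motivation for the ME-derived priors studied in the paper.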

  19. Rogue waves and entropy consumption

    NASA Astrophysics Data System (ADS)

    Hadjihoseini, Ali; Lind, Pedro G.; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim

    2017-11-01

    Based on data from the Sea of Japan and the North Sea, the occurrence of rogue waves is analyzed by a scale-dependent stochastic approach, which interlinks fluctuations of waves for different spacings. With this approach we are able to determine a stochastic cascade process, which provides information on the general multipoint statistics. Furthermore, the evolution of single trajectories in scale, which characterize wave height fluctuations in the surroundings of a chosen location, can be determined. The explicit knowledge of the stochastic process enables us to assign entropy values to all wave events. We show that for these entropies the integral fluctuation theorem, a basic law of non-equilibrium thermodynamics, is valid. This implies that positive and negative entropy events must occur. Extreme events like rogue waves are characterized as negative entropy events. The statistics of these entropy fluctuations changes with the wave state; for the Sea of Japan, the statistics of the entropies has a more pronounced tail for negative entropy values, indicating a higher probability of rogue waves.

  20. Quantum chaos: An entropy approach

    NASA Astrophysics Data System (ADS)

    Słomczyński, Wojciech; Życzkowski, Karol

    1994-11-01

    A new definition of the entropy of a given dynamical system and of an instrument describing the measurement process is proposed within the operational approach to quantum mechanics. It generalizes other definitions of entropy, in both the classical and quantum cases. The Kolmogorov-Sinai (KS) entropy is obtained for a classical system and the sharp measurement instrument. For a quantum system and a coherent states instrument, a new quantity, coherent states entropy, is defined. It may be used to measure chaos in quantum mechanics. The following correspondence principle is proved: the upper limit of the coherent states entropy of a quantum map as ℏ→0 is less than or equal to the KS-entropy of the corresponding classical map. ``Chaos umpire sits, And by decision more imbroils the fray By which he reigns: next him high arbiter Chance governs all.'' John Milton, Paradise Lost, Book II

  1. Entropy from State Probabilities: Hydration Entropy of Cations

    PubMed Central

    2013-01-01

    Entropy is an important energetic quantity determining the progression of chemical processes. We propose a new approach to obtain hydration entropy directly from probability density functions in state space. We demonstrate the validity of our approach for a series of cations in aqueous solution. Extensive validation of simulation results was performed. Our approach does not make prior assumptions about the shape of the potential energy landscape and is capable of calculating accurate hydration entropy values. Sampling times in the low nanosecond range are sufficient for the investigated ionic systems. Although the presented strategy is at the moment limited to systems for which a scalar order parameter can be derived, this is not a principal limitation of the method. The strategy presented is applicable to any chemical system where sufficient sampling of conformational space is accessible, for example, by computer simulations. PMID:23651109

  2. Entropy of international trades

    NASA Astrophysics Data System (ADS)

    Oh, Chang-Young; Lee, D.-S.

    2017-05-01

    The organization of international trade is highly complex, shaped by the collective efforts of participating countries toward economic profit given inhomogeneous resources for production. Considering the trade flux as the probability of exporting a product from one country to another, we evaluate the entropy of world trade in the period 1950-2000. The trade entropy has increased with time, and we show that this is mainly due to the extension of trade partnership. For a given number of trade partners, the mean trade entropy is about 60% of the maximum possible entropy, independent of time, which can be regarded as a characteristic of the heterogeneity of the trade fluxes and is shown to derive from the scaling and functional behaviors of the universal trade-flux distribution. The correlation and time evolution of individual countries' gross domestic products and numbers of trade partners show that most countries achieved their economic growth partly by extending their trade relationships.
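
    The entropy measure described above can be sketched in a few lines: treating a country's normalized export fluxes as probabilities, its trade entropy is compared with the maximum possible value ln(K) for K partners. The flux numbers below are invented for illustration.

```python
import numpy as np

flux = np.array([120.0, 80.0, 40.0, 10.0, 5.0])  # exports to K = 5 partners
p = flux / flux.sum()                            # flux as probability
H = -np.sum(p * np.log(p))                       # trade entropy (nats)
H_max = np.log(len(p))                           # uniform-flux maximum ln(K)
print(f"H = {H:.3f} nats, H / H_max = {H / H_max:.2f}")
```

    The more unevenly trade is spread over partners, the further the ratio H / H_max falls below 1.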

  3. Jarzynski equality in the context of maximum path entropy

    NASA Astrophysics Data System (ADS)

    González, Diego; Davis, Sergio

    2017-06-01

    In the global framework of finding an axiomatic derivation of nonequilibrium Statistical Mechanics from fundamental principles, such as the maximum path entropy (also known as the Maximum Caliber principle), this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy difference between two equilibrium thermodynamic states to the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality is performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social, financial, and ecological systems.
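
    A minimal numerical check of the equality itself (a sketch, independent of the paper's path-space derivation) uses the simplest nonequilibrium protocol, an instantaneous quench of a discrete system: sampling initial states from equilibrium and then switching the energy levels, the work is W = E_new - E, and <exp(-beta W)> must equal exp(-beta dF) = Z_new / Z. The energy levels are arbitrary illustrative numbers.

```python
# Numerical check of the Jarzynski equality for an instantaneous quench.
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0
E = np.array([0.0, 1.0, 2.5])        # initial energy levels (illustrative)
E_new = np.array([0.3, 0.7, 3.0])    # post-quench energy levels

p = np.exp(-beta * E)
Z = p.sum()
p /= Z                               # initial equilibrium probabilities
Z_new = np.exp(-beta * E_new).sum()

states = rng.choice(len(E), size=200_000, p=p)  # sample equilibrium states
W = E_new[states] - E[states]                   # work done by the quench
lhs = np.exp(-beta * W).mean()                  # <exp(-beta W)>
rhs = Z_new / Z                                 # exp(-beta dF)
print(lhs, rhs)                                 # should agree to ~1%
```

    For this protocol the identity holds exactly, so the sampled average converges to Z_new / Z as the number of samples grows.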

  4. High resolution schemes and the entropy condition

    NASA Technical Reports Server (NTRS)

    Osher, S.; Chakravarthy, S.

    1983-01-01

    A systematic procedure is presented for constructing semidiscrete, second-order accurate, variation-diminishing, five-point-bandwidth approximations to scalar conservation laws. These schemes are constructed to also satisfy a single discrete entropy inequality. Thus, in the convex flux case, convergence to the unique physically correct solution is proven. For hyperbolic systems of conservation laws, this construction is used formally to extend the first author's first-order accurate scheme, and it is shown (under some minor technical hypotheses) that limit solutions satisfy an entropy inequality. Results concerning discrete shocks, a maximum principle, and maximal order of accuracy are obtained. Numerical applications are also presented.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Lin, E-mail: godyalin@163.com; Singh, Uttam, E-mail: uttamsingh@hri.res.in; Pati, Arun K., E-mail: akpati@hri.res.in

    Compact expressions for the average subentropy and coherence are obtained for random mixed states that are generated via various probability measures. Surprisingly, our results show that the average subentropy of random mixed states approaches the maximum value of the subentropy, which is attained for the maximally mixed state, as we increase the dimension. In the special case of random mixed states sampled from the induced measure via partial tracing of random bipartite pure states, we establish the typicality of the relative entropy of coherence for random mixed states, invoking the concentration of measure phenomenon. Our results also indicate that mixed quantum states are less useful than pure quantum states in higher dimensions when we extract quantum coherence as a resource. This is because the average coherence of random mixed states is bounded uniformly, whereas the average coherence of random pure states increases with increasing dimension. As an important application, we establish the typicality of the relative entropy of entanglement and distillable entanglement for a specific class of random bipartite mixed states. In particular, most of the random states in this specific class have relative entropy of entanglement and distillable entanglement equal to some fixed number (to within an arbitrarily small error), thereby hugely reducing the complexity of computing these entanglement measures for this specific class of mixed states.

  6. An exploratory statistical approach to depression pattern identification

    NASA Astrophysics Data System (ADS)

    Feng, Qing Yi; Griffiths, Frances; Parsons, Nick; Gunn, Jane

    2013-02-01

    Depression is a complex phenomenon thought to be due to the interaction of biological, psychological and social factors. Currently, depression assessment uses self-reported depressive symptoms, but this is limited in the degree to which it can characterise the different expressions of depression emerging from the complex causal pathways that are thought to underlie it. In this study, we aimed to represent the different patterns of depression with pattern values unique to each individual, where each value combines all the available information about an individual's depression. We considered the depressed individual as a subsystem of an open complex system, proposed Generalized Information Entropy (GIE) to represent the general characteristics of information entropy of the system, and then implemented Maximum Entropy Estimates to derive equations for depression patterns. We also introduced a numerical simulation method to process depression-related data obtained from the Diamond Cohort Study, which has been underway in Australia since 2005 and involves 789 people. Unlike traditional assessment, we obtained a unique value for each depressed individual which gives an overall assessment of the depression pattern. Our work provides a novel way to visualise and quantitatively measure the depression pattern of a depressed individual, which could be used for pattern categorisation. This may have potential for tailoring health interventions to depressed individuals to maximize health benefit.

  7. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production

    PubMed Central

    Kleidon, A.

    2010-01-01

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion. PMID:20368248

  8. Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions

    NASA Astrophysics Data System (ADS)

    Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus

    2017-10-01

    We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.

  9. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production.

    PubMed

    Kleidon, A

    2010-05-12

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.

  10. An entropy-based method for determining the flow depth distribution in natural channels

    NASA Astrophysics Data System (ADS)

    Moramarco, Tommaso; Corato, Giovanni; Melone, Florisa; Singh, Vijay P.

    2013-08-01

    A methodology is developed for determining the bathymetry of river cross-sections during floods by sampling surface flow velocity and using existing low-flow hydraulic data. Similar to Chiu (1988), who proposed an entropy-based velocity distribution, the flow depth distribution in a cross-section of a natural channel is derived by entropy maximization. The depth distribution depends on one parameter, whose estimate is straightforward, and on the maximum flow depth. Applied to a velocity data set from five river gauge sites, the method modeled the flow area observed during flow measurements and accurately assessed the corresponding discharge by coupling the flow depth distribution with the entropic relation between mean velocity and maximum velocity. The methodology unfolds a new perspective for flow monitoring by remote sensing, considering that the two main quantities on which it is based, i.e., surface flow velocity and flow depth, might potentially be sensed by new sensors operating aboard aircraft or satellites.
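
    The entropic relation between mean and maximum velocity referred to above is commonly written, in Chiu's entropy-based framework, as u_mean / u_max = Phi(M) = e^M / (e^M - 1) - 1/M, where M is the entropy parameter of the cross-section. A sketch with made-up values (not data from the paper):

```python
# Chiu's entropic mean-to-maximum velocity ratio Phi(M); values illustrative.
import numpy as np

def phi(M):
    """Mean-to-maximum velocity ratio for entropy parameter M."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

u_max = 2.4   # sampled surface/maximum velocity in m/s (illustrative)
M = 2.1       # entropy parameter, e.g. estimated from historical gaugings
u_mean = phi(M) * u_max
print(f"u_mean = {u_mean:.2f} m/s (Phi = {phi(M):.3f})")
```

    Once u_mean is known, discharge follows from the flow area, which is where the entropy-derived depth distribution enters.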

  11. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
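
    The core PME step, choosing the least-biased distribution consistent with given moment information, can be sketched for a single mean constraint: on a discrete support, the maximum entropy solution is the exponential family p_i proportional to exp(lambda x_i), with lambda fixed by root solving. The support and target mean below are illustrative, not from the paper.

```python
# Maximum entropy pmf on a discrete support subject to a mean constraint.
# Support points and the target mean are made-up illustrative numbers.
import numpy as np
from scipy.optimize import brentq

x = np.linspace(0.0, 10.0, 11)   # support of the component parameter
target_mean = 3.0                # prescribed first-moment constraint

def mean_of(lam):
    """Mean of the exponential-family pmf p_i ~ exp(lam * x_i)."""
    w = np.exp(lam * x)
    return (x * w).sum() / w.sum()

lam = brentq(lambda l: mean_of(l) - target_mean, -10.0, 10.0)
p = np.exp(lam * x)
p /= p.sum()                     # maximum entropy pmf with the given mean
print(p.round(4), (x * p).sum())
```

    Higher moments (variance, skewness, kurtosis) add further Lagrange multipliers in the exponent and are solved for in the same way.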

  12. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on gray-scale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, the diffusion of a plain image using DHS to maximize the entropy as a fitness function is performed. In the second step, a horizontal and vertical permutation is applied to the best cipher image obtained in the previous step, with DHS used to minimize the correlation coefficient as the fitness function. The simulation results show that with the proposed method the maximum entropy and the minimum correlation coefficient, approximately 7.9998 and 0.0001, respectively, have been obtained.
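
    The two fitness functions used above can be sketched directly (illustrative only; a uniformly random array stands in for a cipher image): the Shannon entropy of the grey-level histogram, ideally close to 8 bits for an 8-bit cipher, and the correlation coefficient of adjacent pixel pairs, ideally close to 0.

```python
# Entropy and adjacent-pixel correlation of an 8-bit "image" (random stand-in).
import numpy as np

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(256, 256))   # stand-in for a cipher image

counts = np.bincount(img.ravel(), minlength=256)
p = counts / counts.sum()
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # bits per pixel

# Correlation of horizontally adjacent pixel pairs.
corr = np.corrcoef(img[:, :-1].ravel(), img[:, 1:].ravel())[0, 1]
print(f"entropy = {entropy:.4f} bits, adjacent correlation = {corr:.4f}")
```

    A natural image scores far worse on both measures (entropy well below 8 bits, adjacent correlation near 1), which is what the DHS search is driving away from.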

  13. Statistical mechanics of letters in words

    PubMed Central

    Stephens, Greg J.; Bialek, William

    2013-01-01

    We consider words as a network of interacting letters, and approximate the probability distribution of states taken on by this network. Despite the intuition that the rules of English spelling are highly combinatorial and arbitrary, we find that maximum entropy models consistent with pairwise correlations among letters provide a surprisingly good approximation to the full statistics of words, capturing ~92% of the multi-information in four-letter words and even “discovering” words that were not represented in the data. These maximum entropy models incorporate letter interactions through a set of pairwise potentials and thus define an energy landscape on the space of possible words. Guided by the large letter redundancy we seek a lower-dimensional encoding of the letter distribution and show that distinctions between local minima in the landscape account for ~68% of the four-letter entropy. We suggest that these states provide an effective vocabulary which is matched to the frequency of word use and much smaller than the full lexicon. PMID:20866490

  14. Weak scale from the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings, are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which Big Bang nucleosynthesis starts, and M_pl is the Planck mass.
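
    The quoted scaling can be checked numerically with standard order-of-magnitude inputs (T_BBN ~ 1 MeV, M_pl ~ 1.22 x 10^19 GeV, and y_e = sqrt(2) m_e / v with v = 246 GeV; these input values are ours, not the paper's):

```python
# Order-of-magnitude check of v_h ~ T_BBN^2 / (M_pl * y_e^5).
import math

T_BBN = 1e-3                            # GeV, onset of Big Bang nucleosynthesis
M_pl = 1.22e19                          # GeV, Planck mass
y_e = math.sqrt(2) * 0.000511 / 246.0   # electron Yukawa coupling

v_h = T_BBN**2 / (M_pl * y_e**5)
print(f"v_h ~ {v_h:.0f} GeV")           # comes out at a few hundred GeV
```

    The estimate indeed lands at O(300 GeV), consistent with the observed weak scale as the abstract states.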

  15. Minimax Estimation of Functionals of Discrete Distributions

    PubMed Central

    Jiao, Jiantao; Venkat, Kartik; Han, Yanjun; Weissman, Tsachy

    2017-01-01

    We propose a general methodology for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite dimensional parameters, and elaborate on the case of discrete distributions, where the support size S is unknown and may be comparable with or even much larger than the number of observations n. We treat the respective regions where the functional is nonsmooth and smooth separately. In the nonsmooth regime, we apply an unbiased estimator for the best polynomial approximation of the functional whereas, in the smooth regime, we apply a bias-corrected version of the maximum likelihood estimator (MLE). We illustrate the merit of this approach by thoroughly analyzing the performance of the resulting schemes for estimating two important information measures: 1) the entropy H(P) = Σ_{i=1}^S −p_i ln p_i and 2) F_α(P) = Σ_{i=1}^S p_i^α, α > 0. We obtain the minimax L2 rates for estimating these functionals. In particular, we demonstrate that our estimator achieves the optimal sample complexity n ≍ S/ln S for entropy estimation. We also demonstrate that the sample complexity for estimating F_α(P), 0 < α < 1, is n ≍ S^{1/α}/ln S, which can be achieved by our estimator but not the MLE. For 1 < α < 3/2, we show the minimax L2 rate for estimating F_α(P) is (n ln n)^{−2(α−1)} for infinite support size, while the maximum L2 rate for the MLE is n^{−2(α−1)}. For all the above cases, the behavior of the minimax rate-optimal estimators with n samples is essentially that of the MLE (plug-in rule) with n ln n samples, which we term “effective sample size enlargement.” We highlight the practical advantages of our schemes for the estimation of entropy and mutual information. We compare our performance with various existing approaches, and demonstrate that our approach reduces running time and boosts the accuracy.
Moreover, we show that the minimax rate-optimal mutual information estimator yielded by our framework leads to significant performance boosts over the Chow–Liu algorithm in learning graphical models. The wide use of information measure estimation suggests that the insights and estimators obtained in this paper could be broadly applicable. PMID:29375152
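
    For contrast with the bias-corrected estimators discussed above, the plain plug-in (MLE) entropy estimator and the classical Miller-Madow correction can be sketched as follows; this illustrates the downward bias when S is comparable to n, and is not the paper's polynomial-approximation estimator.

```python
# Plug-in (MLE) entropy estimate vs. the Miller-Madow bias correction on a
# uniform distribution whose support size rivals the sample size.
import numpy as np

rng = np.random.default_rng(2)
S, n = 100, 200                        # support size comparable to sample size
p = np.full(S, 1.0 / S)                # true distribution: uniform, H = ln S
sample = rng.choice(S, size=n, p=p)

counts = np.bincount(sample, minlength=S)
q = counts[counts > 0] / n
H_mle = -np.sum(q * np.log(q))                           # plug-in estimate
K = np.count_nonzero(counts)                             # observed symbols
H_mm = H_mle + (K - 1) / (2 * n)                         # Miller-Madow

H_true = np.log(S)
print(f"true {H_true:.3f}  plug-in {H_mle:.3f}  Miller-Madow {H_mm:.3f}")
```

    The plug-in estimate sits well below ln S, and the correction recovers part of that bias, which is the gap the minimax-optimal estimators in the paper close much more aggressively.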

  16. Entropy Is Simple, Qualitatively.

    ERIC Educational Resources Information Center

    Lambert, Frank L.

    2002-01-01

    Suggests that qualitatively, entropy is simple. Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. (MM)

  17. Swine influenza and vaccines: an alternative approach for decision making about pandemic prevention.

    PubMed

    Basili, Marcello; Ferrini, Silvia; Montomoli, Emanuele

    2013-08-01

    During the global pandemic of A/H1N1/California/07/2009 (A/H1N1/Cal) influenza, many governments signed contracts with vaccine producers for a universal influenza immunization program and bought hundreds of millions of vaccine doses. We argue that, as Health Ministers assumed the occurrence of the worst possible scenario (generalized pandemic influenza) and followed the strong version of the Precautionary Principle, they undervalued the possibility of a mild or weak pandemic wave. An alternative decision rule, based on the non-extensive entropy principle, is introduced, and a different characterization of the Precautionary Principle is applied. This approach values extreme negative results (catastrophic events) in a different way and predicts more plausible and mild events. It introduces less pessimistic forecasts in the case of uncertain influenza pandemic outbreaks. A simplified application is presented using seasonal data on morbidity and severity of influenza-like illness among Italian children for the period 2003-10. Established literature results predict an average attack rate of not less than 15% for the next pandemic influenza [Meltzer M, Cox N, Fukuda K. The economic impact of pandemic influenza in the United States: implications for setting priorities for interventions. Emerg Infect Dis 1999;5:659-71; Meltzer M, Cox N, Fukuda K. Modeling the Economic Impact of Pandemic Influenza in the United States: Implications for Setting Priorities for Intervention. Background paper. Atlanta, GA: CDC, 1999. Available at: http://www.cdc.gov/ncidod/eid/vol5no5/melt_back.htm (7 January 2011, date last accessed)]. The strong version of the Precautionary Principle would suggest using this prediction for vaccination campaigns. On the contrary, the non-extensive maximum entropy principle predicts a lower attack rate, which induces a 20% saving in public funding for vaccine doses. 
The need for an effective influenza pandemic prevention program, coupled with an efficient use of public funding, calls for a rethinking of the Precautionary Principle. The non-extensive maximum entropy principle, which incorporates vague and incomplete information available to decision makers, produces a more coherent forecast of possible influenza pandemic and a conservative spending in public funding.

  18. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measures of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.

  19. Maximum entropy production, carbon assimilation, and the spatial organization of vegetation in river basins

    PubMed Central

    del Jesus, Manuel; Foti, Romano; Rinaldo, Andrea; Rodriguez-Iturbe, Ignacio

    2012-01-01

    The spatial organization of functional vegetation types in river basins is a major determinant of their runoff production, biodiversity, and ecosystem services. The optimization of different objective functions has been suggested to control the adaptive behavior of plants and ecosystems, often without a compelling justification. Maximum entropy production (MEP), rooted in thermodynamics principles, provides a tool to justify the choice of the objective function controlling vegetation organization. The application of MEP at the ecosystem scale results in maximum productivity (i.e., maximum canopy photosynthesis) as the thermodynamic limit toward which the organization of vegetation appears to evolve. Maximum productivity, which incorporates complex hydrologic feedbacks, allows us to reproduce the spatial macroscopic organization of functional types of vegetation in a thoroughly monitored river basin, without the need for a reductionist description of the underlying microscopic dynamics. The methodology incorporates the stochastic characteristics of precipitation and the associated soil moisture on a spatially disaggregated framework. Our results suggest that the spatial organization of functional vegetation types in river basins naturally evolves toward configurations corresponding to dynamically accessible local maxima of the maximum productivity of the ecosystem. PMID:23213227

  20. Identifying topological-band insulator transitions in silicene and other 2D gapped Dirac materials by means of Rényi-Wehrl entropy

    NASA Astrophysics Data System (ADS)

    Calixto, M.; Romera, E.

    2015-02-01

    We propose a new method to identify transitions from a topological insulator to a band insulator in silicene (the silicon equivalent of graphene) in the presence of perpendicular magnetic and electric fields, by using the Rényi-Wehrl entropy of the quantum state in phase space. Electron-hole entropies display an inversion/crossing behavior at the charge neutrality point for any Landau level, and the combined entropy of particles plus holes turns out to be maximum at this critical point. The result is interpreted in terms of delocalization of the quantum state in phase space. The entropic description presented in this work will be valid in general 2D gapped Dirac materials, with a strong intrinsic spin-orbit interaction, isostructural with silicene.

  1. Maximum entropy approach to fuzzy control

    NASA Technical Reports Server (NTRS)

    Ramer, Arthur; Kreinovich, Vladik YA.

    1992-01-01

    For the same expert knowledge, if one uses different &- (AND) and ∨- (OR) operations in a fuzzy control methodology, one ends up with different control strategies. Each choice of these operations restricts the set of possible control strategies. Since a wrong choice can lead to low-quality control, it is reasonable to try to lose as few possibilities as possible. This idea is formalized, and it is shown that it leads to the choice of min(a + b, 1) for ∨ and min(a, b) for &. This choice was tried on a NASA Shuttle simulator; it led to a maximally stable control.
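
    The selected operations are straightforward to state as code, applied here to illustrative membership degrees:

```python
# The fuzzy operations selected by the paper's argument: bounded sum for OR
# and minimum for AND. Membership degrees a, b are illustrative.
def fuzzy_or(a, b):
    """Bounded sum: min(a + b, 1)."""
    return min(a + b, 1.0)

def fuzzy_and(a, b):
    """Minimum: min(a, b)."""
    return min(a, b)

a, b = 0.7, 0.5
print(fuzzy_or(a, b), fuzzy_and(a, b))   # 1.0 and 0.5
```

    Note that the bounded sum saturates at 1, so strong partial evidence for either condition already yields full membership in the OR.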

  2. Detecting recurrence domains of dynamical systems by symbolic dynamics.

    PubMed

    beim Graben, Peter; Hutt, Axel

    2013-04-12

    We propose an algorithm for the detection of recurrence domains of complex dynamical systems from time series. Our approach exploits the characteristic checkerboard texture of recurrence domains exhibited in recurrence plots. In phase space, recurrence plots yield intersecting balls around sampling points that could be merged into cells of a phase space partition. We construct this partition by a rewriting grammar applied to the symbolic dynamics of time indices. A maximum entropy principle defines the optimal size of intersecting balls. The final application to high-dimensional brain signals yields an optimal symbolic recurrence plot revealing functional components of the signal.
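
    A minimal recurrence plot computation of the kind the algorithm above builds on: points i, j of a scalar time series are "recurrent" when their distance falls below a ball radius eps, and recurrence domains show up as block (checkerboard) structure in the resulting matrix. The signal below is synthetic, with two regimes.

```python
# Recurrence matrix of a synthetic two-regime signal.
import numpy as np

t = np.arange(400)
# Regime 1: oscillation around 0; regime 2: same oscillation shifted up by 3.
x = np.where(t < 200, np.sin(0.2 * t), np.sin(0.2 * t) + 3.0)

eps = 0.5                                                # ball radius
R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)  # recurrence matrix

# The two regimes rarely recur with each other: the off-diagonal block of R
# is nearly empty compared with the two diagonal blocks.
print(R[:200, :200].mean(), R[200:, 200:].mean(), R[:200, 200:].mean())
```

    Choosing eps is exactly where the maximum entropy principle enters in the paper: it fixes the optimal size of the intersecting balls.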

  3. Charmonium ground and excited states at finite temperature from complex Borel sum rules

    NASA Astrophysics Data System (ADS)

    Araki, Ken-Ji; Suzuki, Kei; Gubler, Philipp; Oka, Makoto

    2018-05-01

    Charmonium spectral functions in the vector and pseudoscalar channels at finite temperature are investigated through complex Borel sum rules and the maximum entropy method. Our approach enables us to extract the peaks corresponding to the excited charmonia, ψ′ and η_c′, as well as those of the ground states, J/ψ and η_c, which has never been achieved in usual QCD sum rule analyses. We show the spectral functions in vacuum and their thermal modification around the critical temperature, which leads to the almost simultaneous melting (or peak disappearance) of the ground and excited states.

  4. Developing Soil Moisture Profiles Utilizing Remotely Sensed MW and TIR Based SM Estimates Through Principle of Maximum Entropy

    NASA Astrophysics Data System (ADS)

    Mishra, V.; Cruise, J. F.; Mecikalski, J. R.

    2015-12-01

    Developing accurate vertical soil moisture profiles with minimum input requirements is important to agricultural as well as land surface modeling. Earlier studies show that the principle of maximum entropy (POME) can be utilized to develop vertical soil moisture profiles with accuracy (MAE of about 1% for a monotonically dry profile; nearly 2% for monotonically wet profiles and 3.8% for mixed profiles) with minimum constraints (surface, mean and bottom soil moisture contents). In this study, the constraints for the vertical soil moisture profiles were obtained from remotely sensed data. Low-resolution (25 km) MW soil moisture estimates (AMSR-E) were downscaled to 4 km using a soil evaporation efficiency index based disaggregation approach. The downscaled MW soil moisture estimates served as a surface boundary condition, while 4 km resolution TIR-based Atmospheric Land Exchange Inverse (ALEXI) estimates provided the required mean root-zone soil moisture content. Bottom soil moisture content is assumed to be a soil-dependent constant. Multi-year (2002-2011) gridded profiles were developed for the southeastern United States using the POME method. The soil moisture profiles were compared to those generated by land surface models (the Land Information System (LIS) and the agricultural model DSSAT) along with available NRCS SCAN sites in the study region. The end product, spatial soil moisture profiles, can be assimilated into agricultural and hydrologic models in lieu of precipitation for data-scarce regions.
    Previous studies have shown that the principle of maximum entropy (POME) can be utilized with minimal constraints to develop vertical soil moisture profiles accurately (MAE = 1% for monotonically dry profiles, 2% for monotonically wet profiles, and 3.8% for mixed profiles) when compared to laboratory and field data. In this study, vertical soil moisture profiles were developed using the POME model to evaluate an irrigation schedule over a maize field in north central Alabama (USA). The model was validated using both field data and a physically based mathematical model. The results demonstrate that a simple two-constraint entropy model under the assumption of a uniform initial soil moisture distribution can simulate most soil moisture profiles within the field area for 6 different soil types. The irrigation simulation demonstrated that the POME model produced a very efficient irrigation strategy, with a loss of only about 1.9% of the total applied irrigation water. However, areas of fine-textured soil (i.e., silty clay) experienced plant stress of nearly 30% of the available moisture content due to insufficient water supply on the last day of the drying phase of the irrigation cycle. Overall, the POME approach showed promise as a general strategy to guide irrigation in humid environments with minimum input requirements.
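
    The POME construction in these two records can be sketched in code. The sketch below, with illustrative names and discretization (not the authors' implementation), maximizes Shannon entropy of the moisture distribution subject to normalization and a depth-averaged mean, which yields an exponential-in-depth profile pinned to the surface and bottom boundary values; the Lagrange multiplier is found by bisection, assuming the mean lies between the two boundary values.

```python
import math

def pome_profile(theta_s, theta_b, theta_mean, n=101):
    """Monotone max-entropy soil moisture profile theta(z), z in [0, 1].

    Maximizing Shannon entropy of the moisture distribution subject to
    normalization and a depth-averaged mean gives an exponential form;
    the Lagrange multiplier `a` is found by bisection on the mean constraint.
    """
    def theta_of_z(z, a):
        if abs(a) < 1e-9:                      # a -> 0 limit: linear profile
            return theta_s + z * (theta_b - theta_s)
        return math.log(math.exp(a * theta_s)
                        + z * (math.exp(a * theta_b) - math.exp(a * theta_s))) / a

    def mean_of(a):                            # trapezoid depth average
        zs = [i / (n - 1) for i in range(n)]
        ts = [theta_of_z(z, a) for z in zs]
        return sum((ts[i] + ts[i + 1]) / 2 for i in range(n - 1)) / (n - 1)

    lo, hi = -50.0, 50.0                       # bracket for the multiplier
    for _ in range(80):                        # bisection on the mean constraint
        mid = (lo + hi) / 2
        if (mean_of(mid) - theta_mean) * (theta_b - theta_s) > 0:
            hi = mid
        else:
            lo = mid
    a = (lo + hi) / 2
    return [theta_of_z(i / (n - 1), a) for i in range(n)]
```

    For example, `pome_profile(0.10, 0.35, 0.20)` returns a monotone profile running from 0.10 at the surface to 0.35 at the bottom with a depth-averaged moisture of 0.20.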

  5. A long-term target detection approach in infrared image sequence

    NASA Astrophysics Data System (ADS)

    Li, Hang; Zhang, Qi; Wang, Xin; Hu, Chao

    2016-10-01

    An automatic target detection method for long-term infrared (IR) image sequences from a moving platform is proposed. First, target candidates are iteratively segmented based on the principle of maximum entropy (POME). Then the real target is captured via two different selection approaches. At the beginning of the image sequence, the genuine target, which has little texture, is discriminated from other candidates by using a contrast-based confidence measure. Later, when the target becomes larger, an online EM method is applied to estimate and update the distributions of the target's size and position from prior detection results, and the genuine target is recognized as the candidate satisfying both the size and position constraints. Experimental results demonstrate that the presented method is accurate, robust, and efficient.
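
    The abstract does not spell out the segmentation step; a standard single-threshold maximum-entropy (Kapur-style) selection over a gray-level histogram, shown below as an illustrative sketch rather than the authors' code, is one common way a POME-based candidate segmentation is realized.

```python
import math

def max_entropy_threshold(hist):
    """Kapur-style maximum entropy threshold for a gray-level histogram.

    Picks the level t maximizing the sum of the entropies of the background
    (levels <= t) and foreground (levels > t) distributions.
    """
    total = float(sum(hist))
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(len(hist) - 1):
        w0 = sum(p[: t + 1])                   # background mass
        w1 = 1.0 - w0                          # foreground mass
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[: t + 1] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t + 1:] if q > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

    Iterating such a threshold on the segmented region yields the kind of iterative candidate extraction the abstract describes.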

  6. An Accessible Approach to Understanding Entropy and Change

    ERIC Educational Resources Information Center

    Johnson, Philip

    2018-01-01

    This article challenges the notion that entropy is something to be avoided. A line of argument is presented that is accessible to those not having specialist knowledge and that offers a new perspective to those more familiar with the concept. It shows that temperature is better understood by addressing entropy. Entropy change diagrams are…

  7. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract targets from complex backgrounds more quickly and accurately, and to further improve defect detection, a dual-threshold segmentation method using Arimoto entropy based on chaotic bee colony optimization was proposed. First, single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then the intermediate variables in the Arimoto entropy dual-threshold selection formulae were computed recursively to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on the tent mapping, and the fast search for the two optimal thresholds was carried out with the improved bee colony optimization algorithm, noticeably accelerating the search. Extensive experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation, and multi-threshold segmentation using reciprocal gray entropy, the proposed method segments targets more quickly and accurately, with superior segmentation results. It proves to be a fast and effective method for image segmentation.
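
    The chaotic driver named in the abstract is the tent map; a minimal sketch follows. Substituting such a sequence for uniform random draws in the local search phase is the standard pattern, though how this paper mixes it into the bee colony update is not specified here. In double precision, a slope slightly below 2 avoids the known collapse of the exact mu = 2 map onto short cycles.

```python
def tent_map_sequence(x0, n, mu=2.0):
    """Generate n iterates of the tent map:
    x_{k+1} = mu * x_k        if x_k < 0.5
            = mu * (1 - x_k)  otherwise.

    With mu = 2 the map is chaotic on (0, 1); in floating point, mu
    slightly below 2 (e.g. 1.99) is safer for long sequences.
    """
    xs, x = [], x0
    for _ in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        xs.append(x)
    return xs
```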

  8. On the morphological instability of a bubble during inertia-controlled growth

    NASA Astrophysics Data System (ADS)

    Martyushev, L. M.; Birzina, A. I.; Soboleva, A. S.

    2018-06-01

    The morphological stability of a spherical bubble growing under inertia control is analyzed. Based on a comparison of the entropy productions for distorted and undistorted surfaces, and using the maximum entropy production principle, the morphological instability of the bubble under arbitrary-amplitude distortions is shown. This result makes it possible to explain a number of experiments in which surface roughness of bubbles was observed during their explosive-type growth.

  9. Trends in entropy production during ecosystem development in the Amazon Basin.

    PubMed

    Holdaway, Robert J; Sparrow, Ashley D; Coomes, David A

    2010-05-12

    Understanding successional trends in energy and matter exchange across the ecosystem-atmosphere boundary layer is an essential focus in ecological research; however, a general theory describing the observed pattern remains elusive. This paper examines whether the principle of maximum entropy production could provide the solution. A general framework is developed for calculating entropy production using data from terrestrial eddy covariance and micrometeorological studies. We apply this framework to data from eight tropical forest and pasture flux sites in the Amazon Basin and show that forest sites had consistently higher entropy production rates than pasture sites (0.461 versus 0.422 W m⁻² K⁻¹, respectively). It is suggested that during development, changes in canopy structure minimize surface albedo, and development of deeper root systems optimizes access to soil water and thus potential transpiration, resulting in lower surface temperatures and increased entropy production. We discuss our results in the context of a theoretical model of entropy production versus ecosystem developmental stage. We conclude that, although further work is required, entropy production could potentially provide a much-needed theoretical basis for understanding the effects of deforestation and land-use change on the land-surface energy balance.

  10. Prediction of pKa Values for Neutral and Basic Drugs based on Hybrid Artificial Intelligence Methods.

    PubMed

    Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin

    2018-03-05

    The pKa value of a drug is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm based on population entropy diversity was proposed. In the improved algorithm, when the population entropy was higher than the set maximum threshold, a convergence strategy was adopted; when the population entropy was lower than the set minimum threshold, a divergence strategy was adopted; when the population entropy was between the maximum and minimum thresholds, a self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied to the training of a radial basis function artificial neural network (RBF ANN) model and the selection of molecular descriptors. A quantitative structure-activity relationship model based on an RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 neutral and basic drugs and then validated on another database containing 20 molecules. The validation results showed that the model had good prediction performance: the absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can serve as a reference for exploring other quantitative structure-activity relationships.
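
    The threshold rule of the improved PSO can be sketched as follows; the bin-based diversity measure and all names are assumptions, since the abstract does not specify how the population entropy is computed.

```python
import math

def population_entropy(positions, bins=10, lo=0.0, hi=1.0):
    """Shannon entropy of a 1-D swarm's positions over equal-width bins."""
    counts = [0] * bins
    for x in positions:
        i = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        counts[i] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

def pick_strategy(entropy, h_min, h_max):
    """Threshold rule described in the abstract: converge while diversity is
    high, diverge when it is low, otherwise keep the self-adaptive update."""
    if entropy > h_max:
        return "converge"
    if entropy < h_min:
        return "diverge"
    return "self-adaptive"
```

    A fully spread swarm has entropy ln(bins), a collapsed swarm has entropy 0, so the two thresholds partition the diversity range.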

  11. On the pH Dependence of the Potential of Maximum Entropy of Ir(111) Electrodes.

    PubMed

    Ganassin, Alberto; Sebastián, Paula; Climent, Víctor; Schuhmann, Wolfgang; Bandarenka, Aliaksandr S; Feliu, Juan

    2017-04-28

    Studies of the entropy of the components forming the electrode/electrolyte interface can give fundamental insights into the properties of electrified interphases. In particular, the potential at which the entropy of formation of the double layer is maximal (the potential of maximum entropy, PME) is an important parameter for the characterization of electrochemical systems, since it influences the majority of electrode processes. In this work, we determine PMEs for Ir(111) electrodes, which currently play an important role in understanding electrocatalysis for energy provision; at the same time, iridium is one of the most stable metals against corrosion. For the experiments, we used a combination of the laser-induced potential transient to determine the PME and CO charge displacement to determine the potentials of zero total charge (E_PZTC). Both the PME and E_PZTC were assessed for perchlorate solutions in the pH range from 1 to 4. Surprisingly, we found that they are located in the potential regions where the adsorption of hydrogen and hydroxyl species takes place, respectively. The PMEs shifted by ~30 mV per pH unit (on the RHE scale). Connections between the PME and the electrocatalytic properties of the electrode surface are discussed.

  12. Developing the fuzzy c-means clustering algorithm based on maximum entropy for multitarget tracking in a cluttered environment

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing

    2018-01-01

    For fast and more effective tracking of multiple targets in a cluttered environment, we propose a multiple-target tracking (MTT) algorithm called maximum entropy fuzzy c-means clustering joint probabilistic data association, which combines fuzzy c-means clustering with the joint probabilistic data association (JPDA) algorithm. The algorithm uses the membership value to express the probability that a target originated from a measurement; the membership value is obtained by optimizing the fuzzy c-means clustering objective function under the maximum entropy principle. To account for the effect of shared measurements, a correction factor adjusts the association probability matrix used to estimate the target state. Because the algorithm avoids splitting the confirmation matrix, it sidesteps the high computational load of the JPDA algorithm. Simulations and analysis of tracking neighboring parallel targets and crossing targets in cluttered environments of different densities show that the proposed algorithm realizes MTT quickly and efficiently, and its performance remains constant with increasing process noise variance. The proposed algorithm is efficient and has a low computational load, ensuring good performance when tracking multiple targets in a dense cluttered environment.
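
    The membership update at the core of such maximum entropy fuzzy c-means schemes has a closed form: maximizing the membership entropy alongside the clustering cost turns the memberships into a softmax over squared distances. A one-dimensional sketch with illustrative names (not the paper's tracker, which couples this to JPDA):

```python
import math

def memberships(points, centers, gamma):
    """Entropy-regularized fuzzy memberships: the maximum entropy principle
    applied to the clustering objective yields a softmax over squared
    distances, with gamma the entropy (temperature) weight."""
    out = []
    for x in points:
        w = [math.exp(-((x - c) ** 2) / gamma) for c in centers]
        s = sum(w)
        out.append([v / s for v in w])
    return out

def update_centers(points, u, k):
    """Membership-weighted mean update for each cluster center."""
    return [sum(u[i][j] * points[i] for i in range(len(points)))
            / sum(u[i][j] for i in range(len(points))) for j in range(k)]
```

    Alternating the two functions until the centers settle gives the usual fixed-point iteration; each membership row sums to one by construction.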

  13. Formulating the shear stress distribution in circular open channels based on the Renyi entropy

    NASA Astrophysics Data System (ADS)

    Khozani, Zohreh Sheikh; Bonakdari, Hossein

    2018-01-01

    The principle of maximum entropy is employed to derive the shear stress distribution by maximizing the Renyi entropy subject to some constraints and by assuming that dimensionless shear stress is a random variable. A Renyi entropy-based equation can be used to model the shear stress distribution along the entire wetted perimeter of circular channels and circular channels with flat beds and deposited sediments. A wide range of experimental results for 12 hydraulic conditions with different Froude numbers (0.375 to 1.71) and flow depths (20.3 to 201.5 mm) were used to validate the derived shear stress distribution. For circular channels, model performance improved with increasing flow depth (mean relative error (RE) of 0.0414) and only deteriorated slightly at the greatest flow depth (RE of 0.0573). For circular channels with flat beds, the Renyi entropy model predicted the shear stress distribution well at lower sediment depths. The Renyi entropy model results were also compared with Shannon entropy model results. Both models performed well for circular channels, but for circular channels with flat beds the Renyi entropy model displayed superior performance in estimating the shear stress distribution. The Renyi entropy model was highly precise, predicting the shear stress distribution in a circular channel with an RE of 0.0480 and in a circular channel with a flat bed with an RE of 0.0488.
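
    The constrained-maximization step behind such a derivation can be sketched as follows (a sketch with illustrative multiplier names; the paper's exact constraint set may differ):

```latex
\max_{f}\ H_\alpha[f] = \frac{1}{1-\alpha}\,\ln\!\int_0^1 f(\hat\tau)^{\alpha}\,d\hat\tau
\quad\text{s.t.}\quad
\int_0^1 f(\hat\tau)\,d\hat\tau = 1,\qquad
\int_0^1 \hat\tau\,f(\hat\tau)\,d\hat\tau = \bar\tau
\;\;\Longrightarrow\;\;
f(\hat\tau) = \bigl(\lambda_0 + \lambda_1\,\hat\tau\bigr)^{\frac{1}{\alpha-1}},
```

    where the multipliers λ0 and λ1 are fixed numerically by the two constraints, and the shear stress distribution along the wetted perimeter follows by inverting the cumulative distribution. In the limit α → 1 the Renyi functional reduces to Shannon entropy and the density becomes exponential, recovering the Shannon-based model the paper compares against.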

  14. Gradient Dynamics and Entropy Production Maximization

    NASA Astrophysics Data System (ADS)

    Janečka, Adam; Pavelka, Michal

    2018-01-01

    We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies the Onsager reciprocal relations as well as their nonlinear generalization (the Maxwell-Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of a free energy (or another thermodynamic potential) and the entropy production; it also leads to the linear Onsager reciprocal relations and has proven successful in the thermodynamics of complex materials. Both methods are thermodynamically sound, as they ensure approach to equilibrium; we compare them and discuss their advantages and shortcomings. In particular, conditions under which the two approaches coincide and provide the same constitutive relations are identified. In addition, a commonly used but rarely mentioned step in entropy production maximization is pinpointed, and the condition of incompressibility is incorporated into gradient dynamics.

  15. Single water entropy: hydrophobic crossover and application to drug binding.

    PubMed

    Sasikala, Wilbee D; Mukherjee, Arnab

    2014-09-11

    The entropy of water plays an important role in both chemical and biological processes, e.g., the hydrophobic effect and molecular recognition. Here we use a new approach to calculate the translational and rotational entropy of individual water molecules around different hydrophobic and charged solutes. We show that for small hydrophobic solutes, the translational and rotational entropies of each water molecule increase as a function of its distance from the solute, finally reaching a constant bulk value. As the size of the solute increases (0.746 nm), the behavior of the translational entropy is opposite: water molecules closest to the solute have higher entropy, which decreases with distance from the solute. This indicates a crossover in the translational entropy of water molecules around hydrophobic solutes from negative to positive values as the size of the solute is increased. The rotational entropy of water molecules around hydrophobic solutes of all sizes increases with distance from the solute, indicating the absence of a crossover in rotational entropy. This pushes the crossover in the total entropy (translation + rotation) of a water molecule to much larger sizes (>1.5 nm) for hydrophobic solutes. The translational entropy of a single water molecule scales logarithmically (S_tr(QH) = C + k_B ln V) with the volume V obtained from the ellipsoid of inertia. We further discuss the origin of the higher entropy of water around water and show the possibility of recovering the entropy loss of some hypothetical solutes. The results help in understanding water entropy behavior in various hydrophobic and charged environments within biomolecules. Finally, we show how our approach can be used to calculate the entropy of individual water molecules in a protein cavity that may be replaced during ligand binding.

  16. Two aspects of black hole entropy in Lanczos-Lovelock models of gravity

    NASA Astrophysics Data System (ADS)

    Kolekar, Sanved; Kothawala, Dawood; Padmanabhan, T.

    2012-03-01

    We consider two specific approaches to evaluate the black hole entropy which are known to produce correct results in the case of Einstein’s theory and generalize them to Lanczos-Lovelock models. In the first approach (which could be called extrinsic), we use a procedure motivated by earlier work by Pretorius, Vollick, and Israel, and by Oppenheim, and evaluate the entropy of a configuration of densely packed gravitating shells on the verge of forming a black hole in Lanczos-Lovelock theories of gravity. We find that this matter entropy is not equal to (it is less than) Wald entropy, except in the case of Einstein theory, where they are equal. The matter entropy is proportional to the Wald entropy if we consider a specific mth-order Lanczos-Lovelock model, with the proportionality constant depending on the spacetime dimensions D and the order m of the Lanczos-Lovelock theory as (D-2m)/(D-2). Since the proportionality constant depends on m, the proportionality between matter entropy and Wald entropy breaks down when we consider a sum of Lanczos-Lovelock actions involving different m. In the second approach (which could be called intrinsic), we generalize a procedure, previously introduced by Padmanabhan in the context of general relativity, to study off-shell entropy of a class of metrics with horizon using a path integral method. We consider the Euclidean action of Lanczos-Lovelock models for a class of metrics off shell and interpret it as a partition function. We show that in the case of spherically symmetric metrics, one can interpret the Euclidean action as the free energy and read off both the entropy and energy of a black hole spacetime. Surprisingly enough, this leads to exactly the Wald entropy and the energy of the spacetime in Lanczos-Lovelock models obtained by other methods. We comment on possible implications of the result.

  17. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power-law distributions play an increasingly important role in the study of complex systems. Starting from the intractability of complex systems, the idea of incomplete statistics is adopted and extended: three different exponential factors are introduced into the normalization condition, the statistical average, and the Shannon entropy, and probability distribution functions of exponential form, power-law form, and the product form of a power function and an exponential function are derived from the Shannon entropy via the maximum entropy principle. This shows that the maximum entropy principle can fully replace the equal-probability hypothesis. Since power-law distributions and product-form distributions, which cannot be obtained from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, it is concluded that the maximum entropy principle is the more basic principle: it embodies broader concepts and reveals the underlying laws of motion more fundamentally. At the same time, the principle reveals an intrinsic link between Nature and the objects of human society and the principles obeyed by both.
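
    The claim can be made concrete with the standard Shannon-entropy maximizations (a textbook sketch, not the paper's incomplete-statistics generalization): maximizing S[p] = -∫ p ln p dx under different moment constraints gives

```latex
p(x) \propto
\begin{cases}
e^{-\beta x}, & \langle x\rangle \text{ fixed (exponential)},\\[2pt]
x^{-\gamma}, & \langle \ln x\rangle \text{ fixed (power law)},\\[2pt]
x^{-\gamma}\,e^{-\beta x}, & \text{both fixed (product form)},
\end{cases}
```

    with β and γ the Lagrange multipliers of the respective constraints; the constraint-free case recovers the uniform distribution, i.e., the equal-probability hypothesis.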

  18. Stochastic approach to equilibrium and nonequilibrium thermodynamics.

    PubMed

    Tomé, Tânia; de Oliveira, Mário J

    2015-04-01

    We develop the stochastic approach to thermodynamics based on stochastic dynamics, which can be discrete (master equation) and continuous (Fokker-Planck equation), and on two assumptions concerning entropy. The first is the definition of entropy itself and the second the definition of entropy production rate, which is non-negative and vanishes in thermodynamic equilibrium. Based on these assumptions, we study interacting systems with many degrees of freedom in equilibrium or out of thermodynamic equilibrium and how the macroscopic laws are derived from the stochastic dynamics. These studies include the quasiequilibrium processes; the convexity of the equilibrium surface; the monotonic time behavior of thermodynamic potentials, including entropy; the bilinear form of the entropy production rate; the Onsager coefficients and reciprocal relations; and the nonequilibrium steady states of chemical reactions.
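
    For the master-equation case, the entropy production rate satisfying both assumptions has the standard Schnakenberg form: non-negative, and zero exactly at detailed balance. A minimal sketch (k_B = 1; the convention W[i][j] = rate of the jump j → i is an illustrative choice):

```python
import math

def entropy_production_rate(W, p):
    """Schnakenberg entropy production rate for a master equation:
    sigma = (1/2) * sum_{i,j} (W[i][j]p[j] - W[j][i]p[i])
                            * ln(W[i][j]p[j] / (W[j][i]p[i]))
    Each term is of the form (a - b) * ln(a/b) >= 0, so sigma >= 0,
    and sigma = 0 exactly at detailed balance (k_B = 1)."""
    n = len(p)
    sigma = 0.0
    for i in range(n):
        for j in range(n):
            if i == j or W[i][j] <= 0 or W[j][i] <= 0:
                continue
            a, b = W[i][j] * p[j], W[j][i] * p[i]
            sigma += 0.5 * (a - b) * math.log(a / b)
    return sigma
```

    For a symmetric three-state network with uniform probabilities the rate vanishes; biasing the cycle (forward rates 2, backward rates 1) keeps the uniform distribution stationary but produces entropy at rate ln 2 per unit time.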

  19. Statistics of Infima and Stopping Times of Entropy Production and Applications to Active Molecular Processes

    NASA Astrophysics Data System (ADS)

    Neri, Izaak; Roldán, Édgar; Jülicher, Frank

    2017-01-01

    We study the statistics of infima, stopping times, and passage probabilities of entropy production in nonequilibrium steady states, and we show that they are universal. We consider two examples of stopping times: first-passage times of entropy production and waiting times of stochastic processes, which are the times when a system reaches a given state for the first time. Our main results are as follows: (i) The distribution of the global infimum of entropy production is exponential with mean equal to minus Boltzmann's constant; (ii) we find exact expressions for the passage probabilities of entropy production; (iii) we derive a fluctuation theorem for stopping-time distributions of entropy production. These results have interesting implications for stochastic processes that can be discussed in simple colloidal systems and in active molecular processes. In particular, we show that the timing and statistics of discrete chemical transitions of molecular processes, such as the steps of molecular motors, are governed by the statistics of entropy production. We also show that the extreme-value statistics of active molecular processes are governed by entropy production; for example, we derive a relation between the maximal excursion of a molecular motor against the direction of an external force and the infimum of the corresponding entropy-production fluctuations. Using this relation, we make predictions for the distribution of the maximum backtrack depth of RNA polymerases, which follow from our universal results for entropy-production infima.
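
    Result (i) can be checked numerically. A common minimal model of steady-state entropy production is a drifted Brownian motion with drift mu and variance rate 2*mu, so that exp(-S) is a martingale; theory then predicts the mean trajectory infimum is -k_B (that is, -1 in k_B = 1 units). The modeling choice and parameters below are illustrative, and the small positive bias from time discretization is expected.

```python
import math
import random

def infimum_of_entropy_production(mu=1.0, dt=0.01, t_max=20.0,
                                  n_traj=800, seed=7):
    """Monte Carlo estimate of the mean infimum of entropy production
    S(t), modeled as Brownian motion with drift mu and variance rate
    2*mu (fluctuation-theorem scaling, k_B = 1). Theory: mean = -1."""
    rng = random.Random(seed)
    steps = int(t_max / dt)
    sigma = math.sqrt(2.0 * mu * dt)           # per-step noise amplitude
    total = 0.0
    for _ in range(n_traj):
        s, inf_s = 0.0, 0.0
        for _ in range(steps):
            s += mu * dt + sigma * rng.gauss(0.0, 1.0)
            if s < inf_s:
                inf_s = s                      # track running infimum
        total += inf_s
    return total / n_traj
```

    With the defaults the estimate lands close to -1, slightly above it because a discrete-time walk misses part of each downward excursion.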

  20. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach, part 2

    NASA Technical Reports Server (NTRS)

    Hsia, Wei Shen

    1989-01-01

    A validated technology data base is being developed in the areas of control/structures interaction, deployment dynamics, and system performance for Large Space Structures (LSS). A Ground Facility (GF), in which the dynamics and control systems being considered for LSS applications can be verified, was designed and built. One important purpose of the GF is to verify the analytical model used for control system design. The procedure is to describe the control system mathematically as well as possible, then to perform tests on the control system, and finally to factor those results into the mathematical model. The reduction of the order of a higher-order control plant was addressed. The computer program for the maximum entropy principle adopted in Hyland's MEOP method was improved and tested against a benchmark problem, producing a very close match. Two methods of model reduction were examined: Wilson's model reduction method and Hyland's optimal projection (OP) method. Design of a computer program for Hyland's OP method was attempted; because a special matrix factorization technique is needed to obtain the required projection matrix, the program succeeded up to finding the Linear Quadratic Gaussian solution but not beyond. Numerical results are presented along with computer programs that employed ORACLS.

  1. Phonological Concept Learning.

    PubMed

    Moreton, Elliott; Pater, Joe; Pertsova, Katya

    2017-01-01

    Linguistic and non-linguistic pattern learning have been studied separately, but we argue for a comparative approach. Analogous inductive problems arise in phonological and visual pattern learning. Evidence from three experiments shows that human learners can solve them in analogous ways, and that human performance in both cases can be captured by the same models. We test GMECCS (Gradual Maximum Entropy with a Conjunctive Constraint Schema), an implementation of the Configural Cue Model (Gluck & Bower, ) in a Maximum Entropy phonotactic-learning framework (Goldwater & Johnson, ; Hayes & Wilson, ) with a single free parameter, against the alternative hypothesis that learners seek featurally simple algebraic rules ("rule-seeking"). We study the full typology of patterns introduced by Shepard, Hovland, and Jenkins () ("SHJ"), instantiated as both phonotactic patterns and visual analogs, using unsupervised training. Unlike SHJ, Experiments 1 and 2 found that both phonotactic and visual patterns that depended on fewer features could be more difficult than those that depended on more features, as predicted by GMECCS but not by rule-seeking. GMECCS also correctly predicted performance differences between stimulus subclasses within each pattern. A third experiment tried supervised training (which can facilitate rule-seeking in visual learning) to elicit simple rule-seeking phonotactic learning, but cue-based behavior persisted. We conclude that similar cue-based cognitive processes are available for phonological and visual concept learning, and hence that studying either kind of learning can lead to significant insights about the other. Copyright © 2015 Cognitive Science Society, Inc.

  2. Phase equilibria computations of multicomponent mixtures at specified internal energy and volume

    NASA Astrophysics Data System (ADS)

    Myint, Philip C.; Nichols, Albert L., III; Springer, H. Keo

    2017-06-01

    Hydrodynamic simulation codes for high-energy density science applications often use internal energy and volume as their working variables. As a result, the codes must determine the thermodynamic state that corresponds to the specified energy and volume by finding the global maximum in entropy. This task is referred to as the isoenergetic-isochoric flash. Solving it for multicomponent mixtures is difficult because one must find not only the temperature and pressure consistent with the energy and volume, but also the number of phases present and the composition of the phases. The few studies on isoenergetic-isochoric flash that currently exist all require the evaluation of many derivatives that can be tedious to implement. We present an alternative approach that is based on a derivative-free method: particle swarm optimization. The global entropy maximum is found by running several instances of particle swarm optimization over different sets of randomly selected points in the search space. For verification, we compare the predicted temperature and pressure to results from the related, but simpler problem of isothermal-isobaric flash. All of our examples involve the equation of state we have recently developed for multiphase mixtures of the energetic materials HMX, RDX, and TNT. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
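
    The derivative-free optimizer at the heart of this record can be sketched generically. In the toy below, the paper's objective (the EOS-based mixture entropy over temperature, pressure, and phase compositions) is replaced by an arbitrary callable, and the swarm parameters are conventional defaults rather than the paper's choices.

```python
import random

def pso_maximize(f, lo, hi, n_particles=20, n_iter=60, seed=1,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm maximization of f over the box [lo, hi]^d.

    Standard update: inertia w plus cognitive (personal best) and social
    (global best) pulls; positions are clamped to the box."""
    rng = random.Random(seed)
    d = len(lo)
    xs = [[rng.uniform(lo[k], hi[k]) for k in range(d)]
          for _ in range(n_particles)]
    vs = [[0.0] * d for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pval = [f(x) for x in xs]
    g = max(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for k in range(d):
                vs[i][k] = (w * vs[i][k]
                            + c1 * rng.random() * (pbest[i][k] - xs[i][k])
                            + c2 * rng.random() * (gbest[k] - xs[i][k]))
                xs[i][k] = min(max(xs[i][k] + vs[i][k], lo[k]), hi[k])
            v = f(xs[i])
            if v > pval[i]:
                pval[i], pbest[i] = v, list(xs[i])
                if v > gval:
                    gval, gbest = v, list(xs[i])
    return gbest, gval
```

    Running several such swarms from different random seeds, as the paper describes, guards against a single swarm stalling on a local maximum of the entropy surface.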

  3. Excess entropy and crystallization in Stillinger-Weber and Lennard-Jones fluids

    NASA Astrophysics Data System (ADS)

    Dhabal, Debdas; Nguyen, Andrew Huy; Singh, Murari; Khatua, Prabir; Molinero, Valeria; Bandyopadhyay, Sanjoy; Chakravarty, Charusita

    2015-10-01

    Molecular dynamics simulations are used to contrast the supercooling and crystallization behaviour of monatomic liquids that exemplify the transition from simple to anomalous, tetrahedral liquids. As examples of simple fluids, we use the Lennard-Jones (LJ) liquid and a pair-dominated Stillinger-Weber liquid (SW16). As examples of tetrahedral, water-like fluids, we use the Stillinger-Weber model with variable tetrahedrality parameterized for germanium (SW20), silicon (SW21), and water (SW23.15 or mW model). The thermodynamic response functions show clear qualitative differences between simple and water-like liquids. For simple liquids, the compressibility and the heat capacity remain small on isobaric cooling. The tetrahedral liquids in contrast show a very sharp rise in these two response functions as the lower limit of liquid-phase stability is reached. While the thermal expansivity decreases with temperature but never crosses zero in simple liquids, in all three tetrahedral liquids at the studied pressure, there is a temperature of maximum density below which thermal expansivity is negative. In contrast to the thermodynamic response functions, the excess entropy on isobaric cooling does not show qualitatively different features for simple and water-like liquids; however, the slope and curvature of the entropy-temperature plots reflect the heat capacity trends. Two trajectory-based computational estimation methods for the entropy and the heat capacity are compared for possible structural insights into supercooling, with the entropy obtained from thermodynamic integration. The two-phase thermodynamic estimator for the excess entropy proves to be fairly accurate in comparison to the excess entropy values obtained by thermodynamic integration, for all five Lennard-Jones and Stillinger-Weber liquids. The entropy estimator based on the multiparticle correlation expansion that accounts for both pair and triplet correlations, denoted by S_trip, is also studied.
S_trip is a good entropy estimator for liquids where pair and triplet correlations are important, such as Ge and Si, but loses accuracy for purely pair-dominated liquids, like the LJ fluid, or near the crystallization temperature (T_thr). Since local tetrahedral order is compatible with both liquid and crystalline states, the reorganisation of tetrahedral liquids is accompanied by a clear rise in the pair, triplet, and thermodynamic contributions to the heat capacity, resulting in the heat capacity anomaly. In contrast, the pair-dominated liquids show increasing dominance of triplet correlations on approaching crystallization, but no sharp rise in either the pair or thermodynamic heat capacities.
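
    For reference, the pair term of the multiparticle correlation expansion used here has the standard form (the triplet contribution entering S_trip is the analogous three-body integral):

```latex
\frac{s_2}{k_B} \;=\; -\,2\pi\rho \int_0^{\infty}
\bigl[\, g(r)\ln g(r) \;-\; g(r) \;+\; 1 \,\bigr]\, r^{2}\, dr,
```

    where ρ is the number density and g(r) the pair correlation function; the integrand vanishes as g(r) → 1, so the integral converges once structural correlations decay.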

  4. Use and validity of principles of extremum of entropy production in the study of complex systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitor Reis, A., E-mail: ahr@uevora.pt

    2014-07-15

    It is shown how both principles of extremum of entropy production, which are often used in the study of complex systems, follow from the maximization of overall system conductivities under appropriate constraints. In this way, the maximum rate of entropy production (MEP) occurs when all the forces in the system are kept constant, while the minimum rate of entropy production (mEP) occurs when all the currents that cross the system are kept constant. A brief discussion of the validity of the mEP and MEP principles in several cases, and in particular for the Earth's climate, is also presented. Highlights: (i) The principles of extremum of entropy production are not first principles. (ii) They result from the maximization of conductivities under appropriate constraints. (iii) The conditions of their validity are set explicitly. (iv) Some long-standing controversies are discussed and clarified.

  5. Steepest entropy ascent quantum thermodynamic model of electron and phonon transport

    NASA Astrophysics Data System (ADS)

    Li, Guanchen; von Spakovsky, Michael R.; Hin, Celine

    2018-01-01

    An advanced nonequilibrium thermodynamic model for electron and phonon transport is formulated based on the steepest-entropy-ascent quantum thermodynamics framework. This framework, based on the principle of steepest entropy ascent (or the equivalent maximum entropy production principle), inherently satisfies the laws of thermodynamics and mechanics and is applicable at all temporal and spatial scales, even in the far-from-equilibrium realm. Specifically, the model is proven to recover the Boltzmann transport equations in the near-equilibrium limit and the two-temperature model of electron-phonon coupling when no dispersion is assumed. Heat and mass transport at a temperature discontinuity across a homogeneous interface, where both the dispersion and the coupling of electron and phonon transport are considered, are then modeled. Local nonequilibrium system evolution and nonquasiequilibrium interactions are predicted and the results discussed.

  6. Shifting distributions of adult Atlantic sturgeon amidst post-industrialization and future impacts in the Delaware River: a maximum entropy approach.

    PubMed

    Breece, Matthew W; Oliver, Matthew J; Cimino, Megan A; Fox, Dewayne A

    2013-01-01

    Atlantic sturgeon (Acipenser oxyrinchus oxyrinchus) experienced severe declines due to habitat destruction and overfishing beginning in the late 19th century. Subsequent to the boom and bust period of exploitation, there has been minimal fishing pressure and improving habitats. However, lack of recovery led to the 2012 listing of Atlantic sturgeon under the Endangered Species Act. Although habitats may be improving, the availability of high-quality spawning habitat, essential for the survival and development of eggs and larvae, may still be a limiting factor in the recovery of Atlantic sturgeon. To estimate adult Atlantic sturgeon spatial distributions during riverine occupancy in the Delaware River, we utilized a maximum entropy (MaxEnt) approach along with passive biotelemetry during the likely spawning season. We found that substrate composition and distance from the salt front significantly influenced the locations of adult Atlantic sturgeon in the Delaware River. To broaden the scope of this study, we projected our model onto four scenarios depicting varying locations of the salt front in the Delaware River: the contemporary location of the salt front during the likely spawning season, the location of the salt front during the historic fishery in the late 19th century, an estimated shift in the salt front by the year 2100 due to climate change, and an extreme drought scenario, similar to that which occurred in the 1960s. The movement of the salt front upstream as a result of dredging and climate change likely eliminated historic spawning habitats and currently threatens areas where Atlantic sturgeon spawning may be taking place. Identifying where suitable spawning substrate and water chemistry intersect with the likely occurrence of adult Atlantic sturgeon in the Delaware River highlights essential spawning habitats, enhancing recovery prospects for this imperiled species.

  7. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    NASA Astrophysics Data System (ADS)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.

  8. Conserved actions, maximum entropy and dark matter haloes

    NASA Astrophysics Data System (ADS)

    Pontzen, Andrew; Governato, Fabio

    2013-03-01

    We use maximum entropy arguments to derive the phase-space distribution of a virialized dark matter halo. Our distribution function gives an improved representation of the end product of violent relaxation. This is achieved by incorporating physically motivated dynamical constraints (specifically on orbital actions) which prevent arbitrary redistribution of energy. We compare the predictions with three high-resolution dark matter simulations of widely varying mass. The numerical distribution function is accurately predicted by our argument, producing an excellent match for the vast majority of particles. The remaining particles constitute the central cusp of the halo (≲4 per cent of the dark matter). They can be accounted for within the presented framework once the short dynamical time-scales of the centre are taken into account.

  9. Optimal protocol for maximum work extraction in a feedback process with a time-varying potential

    NASA Astrophysics Data System (ADS)

    Kwon, Chulan

    2017-12-01

    The nonequilibrium nature of information thermodynamics is characterized by the inequality, i.e. the non-negativity, of the total entropy change of the system, memory, and reservoir. The mutual information change plays a crucial role in this inequality, in particular when work is extracted and the paradox of Maxwell's demon is raised. We consider a Brownian information engine in which the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work in both the finite-time and the quasi-static process.

  10. LIBOR troubles: Anomalous movements detection based on maximum entropy

    NASA Astrophysics Data System (ADS)

    Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria

    2016-05-01

    According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. Such procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.

  11. Maximum entropy modeling of metabolic networks by constraining growth-rate moments predicts coexistence of phenotypes

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele

    2017-12-01

    In this work, maximum entropy distributions in the space of steady states of metabolic networks are considered upon constraining the first and second moments of the growth rate. Coexistence of fast and slow phenotypes, with bimodal flux distributions, emerges upon considering control of the average growth (optimization) and of its fluctuations (heterogeneity). This is applied to the carbon catabolic core of Escherichia coli, where it quantifies the metabolic activity of slow-growing phenotypes and provides a quantitative map of metabolic fluxes, opening the possibility to detect coexistence from flux data. A preliminary analysis of data for E. coli cultures in standard conditions shows degeneracy for the inferred parameters that extends into the coexistence region.
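
    The constrained-moments construction can be sketched numerically: maximizing entropy subject to fixed first and second moments yields a Gibbs distribution p(x) ∝ exp(λ₁x + λ₂x²), whose multipliers can be found by Newton iteration on the convex dual. This is a generic illustration; the grid and target moments below are hypothetical placeholders, not values from the paper's E. coli analysis.

```python
import numpy as np

# Hypothetical discrete grid of growth rates and illustrative target moments.
x = np.linspace(0.0, 2.0, 401)
F = np.vstack([x, x**2])            # sufficient statistics: first and second moment
targets = np.array([0.8, 0.9])      # imposed <x> and <x^2> (made-up values)

# Newton iterations on the convex dual: the gradient is (model moments - targets)
# and the Hessian is the covariance matrix of the statistics under the model.
lam = np.zeros(2)
for _ in range(100):
    logits = lam @ F
    p = np.exp(logits - logits.max())
    p /= p.sum()
    moments = F @ p
    grad = moments - targets
    if np.abs(grad).max() < 1e-10:
        break
    cov = (F * p) @ F.T - np.outer(moments, moments)
    lam -= np.linalg.solve(cov, grad)

print(np.round(F @ p, 6))   # both imposed moments are reproduced
```

    The same dual scheme extends to any set of linear constraints on flux-space averages; only the feature rows of `F` change.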

  12. An understanding of human dynamics in urban subway traffic from the Maximum Entropy Principle

    NASA Astrophysics Data System (ADS)

    Yong, Nuo; Ni, Shunjiang; Shen, Shifei; Ji, Xuewei

    2016-08-01

    We studied the distribution of entry time intervals in Beijing subway traffic by analyzing smart card transaction data, and then deduced the probability distribution function of the entry time interval based on the Maximum Entropy Principle. Both theoretical derivation and data statistics indicate that the entry time interval obeys a power-law distribution with an exponential cutoff. In addition, we point out the constraint conditions for this distribution form and discuss how the constraints affect the distribution function. It is speculated that, for bursts and heavy tails in human dynamics, a fitted power exponent of less than 1.0 cannot correspond to a pure power-law distribution but rather to one with an exponential cutoff, a feature that may have been ignored in previous studies.

  13. Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations

    PubMed Central

    Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro

    2015-01-01

    Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method in the most representative biological processes involving proteins, and provides a valuable alternative, principally in the shown cases, where other approaches are problematic. PMID:26177039
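
    As a toy illustration of the nearest-neighbor idea behind this class of estimators (not the authors' code, and reduced to one dimension for brevity), the standard Kozachenko-Leonenko estimator recovers differential entropy from sample-to-neighbor distances alone:

```python
import numpy as np

def nn_entropy_1d(samples):
    """Kozachenko-Leonenko nearest-neighbor estimate of differential entropy
    (in nats) for 1-D samples (k = 1, d = 1); the paper applies the same idea
    to high-dimensional conformational coordinates."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    # distance from each ordered sample to its nearest neighbor
    left = np.r_[np.inf, np.diff(x)]
    right = np.r_[np.diff(x), np.inf]
    eps = np.minimum(left, right)
    # H ~ mean(ln eps) + ln(2) + ln(n - 1) + Euler-Mascheroni constant
    return np.mean(np.log(eps)) + np.log(2.0) + np.log(n - 1) + np.euler_gamma

rng = np.random.default_rng(0)
h = nn_entropy_1d(rng.uniform(0.0, 1.0, 4000))
print(h)   # differential entropy of U(0,1) is 0; the estimate should be close
```

    No histogram or harmonic approximation enters: the local sample density is read off directly from the neighbor distances, which is what makes the approach applicable to rugged conformational landscapes.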

  14. Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations.

    PubMed

    Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro

    2015-01-01

    Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method in the most representative biological processes involving proteins, and provides a valuable alternative, principally in the shown cases, where other approaches are problematic.

  15. Exact solutions for the entropy production rate of several irreversible processes.

    PubMed

    Ross, John; Vlad, Marcel O

    2005-11-24

    We investigate thermal conduction described by Newton's law of cooling and by Fourier's transport equation and chemical reactions based on mass action kinetics where we detail a simple example of a reaction mechanism with one intermediate. In these cases we derive exact expressions for the entropy production rate and its differential. We show that at a stationary state the entropy production rate is an extremum if and only if the stationary state is a state of thermodynamic equilibrium. These results are exact and independent of any expansions of the entropy production rate. In the case of thermal conduction we compare our exact approach with the conventional approach based on the expansion of the entropy production rate near equilibrium. If we expand the entropy production rate in a series and keep terms up to the third order in the deviation variables and then differentiate, we find out that the entropy production rate is not an extremum at a nonequilibrium steady state. If there is a strict proportionality between fluxes and forces, then the entropy production rate is an extremum at the stationary state even if the stationary state is far away from equilibrium.

  16. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Proctor, D.

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.
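
    For context, the accelerated Lucy-Richardson method named above builds on the basic Richardson-Lucy multiplicative update. A minimal one-dimensional sketch (a synthetic two-source scene, unrelated to the TAISIR data) looks like this:

```python
import numpy as np

def richardson_lucy(data, psf, n_iter=50):
    """Minimal 1-D Richardson-Lucy deconvolution: multiplicative updates
    that preserve non-negativity and (approximately) total flux."""
    est = np.full_like(data, data.mean())
    psf_mirror = psf[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(est, psf, mode="same")
        ratio = data / np.maximum(blurred, 1e-12)   # avoid division by zero
        est *= np.convolve(ratio, psf_mirror, mode="same")
    return est

# Synthetic scene: two point sources blurred by a Gaussian PSF.
n = 64
truth = np.zeros(n); truth[20] = 1.0; truth[40] = 0.7
t = np.arange(-8, 9)
psf = np.exp(-t**2 / (2 * 2.0**2)); psf /= psf.sum()
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
print(restored.argmax())   # the brighter source at index 20 is recovered
```

    The update sharpens the blurred peaks back toward point sources; noise sensitivity at low signal-to-noise ratio is precisely what comparisons like the one in this record probe.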

  17. Entropy stable discontinuous interfaces coupling for the three-dimensional compressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Parsani, Matteo; Carpenter, Mark H.; Nielsen, Eric J.

    2015-06-01

    Non-linear entropy stability and a summation-by-parts (SBP) framework are used to derive entropy stable interior interface coupling for the semi-discretized three-dimensional (3D) compressible Navier-Stokes equations. A complete semi-discrete entropy estimate for the interior domain is achieved by combining a discontinuous entropy conservative operator of any order [1,2] with an entropy stable coupling condition for the inviscid terms, and a local discontinuous Galerkin (LDG) approach with an interior penalty (IP) procedure for the viscous terms. The viscous penalty contributions scale with the inverse of the Reynolds number (Re), so that for Re → ∞ their contributions vanish and only the entropy stable inviscid interface penalty term is recovered. This paper extends the interface couplings presented in [1,2] and provides a simple and automatic way to compute the magnitude of the viscous IP term. The approach presented herein is compatible with any diagonal norm summation-by-parts (SBP) spatial operator, including finite element, finite volume, and finite difference schemes, and the class of high-order accurate methods which includes the large family of discontinuous Galerkin discretizations and flux reconstruction schemes.

  18. Introduction of Entropy via the Boltzmann Distribution in Undergraduate Physical Chemistry: A Molecular Approach

    ERIC Educational Resources Information Center

    Kozliak, Evguenii I.

    2004-01-01

    A molecular approach for introducing entropy in an undergraduate physical chemistry course is described. It incorporates the features of Davies' treatment that meet the needs of the students while avoiding the complexities of statistics, and upgrades the qualitative, intuitive approach of Lambert for general chemistry to a semiquantitative treatment using Boltzmann…
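
    The semiquantitative treatment sketched in this abstract amounts to computing occupation probabilities from the Boltzmann distribution and then the entropy S = -k_B Σ pᵢ ln pᵢ. A short numerical illustration (a hypothetical three-level system, in units where k_B = 1) shows entropy rising from near zero at low temperature toward ln 3 at high temperature:

```python
import numpy as np

def boltzmann_entropy(energies, T):
    """Entropy (in units of k_B) of a system with the given energy levels at
    temperature T, from the Boltzmann distribution p_i ~ exp(-E_i / T)."""
    e = np.asarray(energies, dtype=float)
    w = np.exp(-(e - e.min()) / T)   # subtract the ground state for stability
    p = w / w.sum()
    return float(-np.sum(p * np.log(p)))

levels = [0.0, 1.0, 2.0]   # hypothetical, evenly spaced energy levels
for T in (0.1, 1.0, 100.0):
    print(f"T = {T:6.1f}  S = {boltzmann_entropy(levels, T):.4f}")
# entropy grows from ~0 at low T toward ln(3) ~ 1.0986 at high T
```

    The limits make the qualitative point of the molecular approach: at low T only the ground state is populated (S → 0), while at high T all accessible states become equally likely (S → ln of the number of states).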

  19. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services, and reclassified the types of ecosystem services flow. Using entropy theory, the disorder degree and development trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder and a system near the verge of ill health. The system reached maximum values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of the total permanent population of Beijing and the urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.

  20. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns.

    PubMed

    Osgood, Nathaniel D; Paul, Tuhin; Stanley, Kevin G; Qian, Weicheng

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist.
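
    The Lempel-Ziv construction underlying mobility entropy rate can be sketched directly. The estimator below is a common match-length variant (a naive O(n²) implementation for illustration, not the authors' code): it returns low rates for regular trajectories and rates near one bit per symbol for unpredictable binary ones.

```python
import math
import random

def lz_entropy_rate(s):
    """Match-length (Lempel-Ziv) estimate of the entropy rate of string s in
    bits per symbol: H ~ n*log2(n) / sum(Lambda_i), where Lambda_i is the
    length of the shortest substring starting at i that never appears in s[:i]."""
    n = len(s)
    total = 1  # Lambda_0 is 1 by convention (nothing precedes position 0)
    for i in range(1, n):
        k = 1
        # grow the match until s[i:i+k] no longer appears in the prefix s[:i]
        while i + k <= n and s[i:i + k] in s[:i]:
            k += 1
        total += k
    return n * math.log2(n) / total

periodic = "ab" * 250                 # a perfectly regular "trajectory"
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(500))
print(lz_entropy_rate(periodic), lz_entropy_rate(noisy))
```

    Re-sampling the same trajectory at a coarser spatial or temporal granularity changes the symbol string and hence the estimate, which is exactly the scale dependence this paper formalizes.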

  1. Time-dependent entropy evolution in microscopic and macroscopic electromagnetic relaxation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker-Jarvis, James

    This paper is a study of entropy and its evolution in the time and frequency domains upon application of electromagnetic fields to materials. An understanding of entropy and its evolution in electromagnetic interactions bridges the boundaries between electromagnetism and thermodynamics. The approach used here is a Liouville-based statistical-mechanical theory. I show that the microscopic entropy is reversible and the macroscopic entropy satisfies an H theorem. The spectral entropy development can be very useful for studying the frequency response of materials. Using a projection-operator based nonequilibrium entropy, different equations are derived for the entropy and entropy production and are applied to the polarization, magnetization, and macroscopic fields. I begin by proving an exact H theorem for the entropy, progress to application of time-dependent entropy in electromagnetics, and then apply the theory to relevant applications in electromagnetics. The paper concludes with a discussion of the relationship of the frequency-domain form of the entropy to the permittivity, permeability, and impedance.

  2. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up; finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.

  3. A two-phase copula entropy-based multiobjective optimization approach to hydrometeorological gauge network design

    NASA Astrophysics Data System (ADS)

    Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin

    2017-12-01

    Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures, including the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.

  4. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and of base-scale entropy, and two local ones, namely the symbolizations of permutation and of differential entropy, constitute four double symbolic joint entropies that achieve accurate complexity detection in chaotic models, the logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by the double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results prove that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.

  5. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data

    NASA Astrophysics Data System (ADS)

    White, Andrew D.; Knight, Chris; Hocky, Glen M.; Voth, Gregory A.

    2017-01-01

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.
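
    The minimal-bias idea can be illustrated in a simplified post-processing form: given frames from an unbiased run, maximum entropy prescribes reweighting each frame by exp(-λO_i), with the single multiplier λ chosen so that the reweighted average of the observable O matches the experimental value. The samples and target below are synthetic stand-ins (the paper applies the bias on the fly during AIMD rather than by reweighting):

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(3.0, 1.0, 5000)   # observable from an unbiased run (synthetic)
target = 2.5                       # "experimental" value to match (hypothetical)

# Solve for the Lagrange multiplier by Newton iteration; the derivative of
# the reweighted mean with respect to lam is minus the reweighted variance.
lam = 0.0
for _ in range(100):
    w = np.exp(-lam * (obs - obs.mean()))   # shift exponent for stability
    w /= w.sum()
    mean = float(w @ obs)
    if abs(mean - target) < 1e-10:
        break
    var = float(w @ obs**2) - mean**2
    lam += (mean - target) / var

print(lam, mean)   # a small bias strength; the reweighted average hits the target
```

    Because a single linear bias is the maximum-entropy (minimally perturbing) correction, observables other than the constrained one are distorted as little as possible, which is the sense in which the AIMD bias is "minimal".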

  6. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data.

    PubMed

    White, Andrew D; Knight, Chris; Hocky, Glen M; Voth, Gregory A

    2017-01-28

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.

  7. Multiple Diffusion Mechanisms Due to Nanostructuring in Crowded Environments

    PubMed Central

    Sanabria, Hugo; Kubota, Yoshihisa; Waxham, M. Neal

    2007-01-01

    One of the key questions regarding intracellular diffusion is how the environment affects molecular mobility. Mostly, intracellular diffusion has been described as hindered, and the physical reasons for this behavior are: immobile barriers, molecular crowding, and binding interactions with immobile or mobile molecules. Using results from multi-photon fluorescence correlation spectroscopy, we describe how immobile barriers and crowding agents affect translational mobility. To study the hindrance produced by immobile barriers, we used sol-gels (silica nanostructures) that consist of a continuous solid phase and aqueous phase in which fluorescently tagged molecules diffuse. In the case of molecular crowding, translational mobility was assessed in increasing concentrations of 500 kDa dextran solutions. Diffusion of fluorescent tracers in both sol-gels and dextran solutions shows clear evidence of anomalous subdiffusion. In addition, data from the autocorrelation function were analyzed using the maximum entropy method as adapted to fluorescence correlation spectroscopy data and compared with the standard model that incorporates anomalous diffusion. The maximum entropy method revealed evidence of different diffusion mechanisms that had not been revealed using the anomalous diffusion model. These mechanisms likely correspond to nanostructuring in crowded environments and to the relative dimensions of the crowding agent with respect to the tracer molecule. Analysis with the maximum entropy method also revealed information about the degree of heterogeneity in the environment as reported by the behavior of diffusive molecules. PMID:17040979

  8. Possible dynamical explanations for Paltridge's principle of maximum entropy production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virgo, Nathaniel, E-mail: nathanielvirgo@gmail.com; Ikegami, Takashi, E-mail: nathanielvirgo@gmail.com

    2014-12-05

    Throughout the history of non-equilibrium thermodynamics a number of theories have been proposed in which complex, far from equilibrium flow systems are hypothesised to reach a steady state that maximises some quantity. Perhaps the most celebrated is Paltridge's principle of maximum entropy production for the horizontal heat flux in Earth's atmosphere, for which there is some empirical support. There have been a number of attempts to derive such a principle from maximum entropy considerations. However, we currently lack a more mechanistic explanation of how any particular system might self-organise into a state that maximises some quantity. This is in contrast to equilibrium thermodynamics, in which models such as the Ising model have been a great help in understanding the relationship between the predictions of MaxEnt and the dynamics of physical systems. In this paper we show that, unlike in the equilibrium case, Paltridge-type maximisation in non-equilibrium systems cannot be achieved by a simple dynamical feedback mechanism. Nevertheless, we propose several possible mechanisms by which maximisation could occur. Showing that these occur in any real system is a task for future work. The possibilities presented here may not be the only ones. We hope that by presenting them we can provoke further discussion about the possible dynamical mechanisms behind extremum principles for non-equilibrium systems, and their relationship to predictions obtained through MaxEnt.

  9. Information entropy and dark energy evolution

    NASA Astrophysics Data System (ADS)

    Capozziello, Salvatore; Luongo, Orlando

    Here, the information entropy is investigated in the context of early and late cosmology under the hypothesis that distinct phases of universe evolution are entangled with each other. The approach is based on the entangled state ansatz, representing a coarse-grained definition of primordial dark temperature associated to an effective entangled energy density. The dark temperature definition comes from assuming either Von Neumann or linear entropy as sources of cosmological thermodynamics. We interpret the involved information entropies by means of probabilities of forming structures during cosmic evolution. Following this recipe, we propose that quantum entropy is simply associated to the thermodynamical entropy and we investigate the consequences of our approach using the adiabatic sound speed. As byproducts, we analyze two phases of universe evolution: the late and early stages. To do so, we first recover that dark energy reduces to a pure cosmological constant, as zero-order entanglement contribution, and second that inflation is well-described by means of an effective potential. In both cases, we infer numerical limits which are compatible with current observations.
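    The two entropy measures the abstract invokes, von Neumann and linear entropy, can be sketched for a qubit reduced density matrix. This shows only the generic definitions S = −Tr(ρ ln ρ) and S_lin = 1 − Tr(ρ²), not the paper's cosmological ansatz; the example states are standard textbook ones.

```python
import numpy as np

def von_neumann_entropy(rho):
    # S = -Tr(rho ln rho), computed from the eigenvalues of rho
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log(w))

def linear_entropy(rho):
    # S_lin = 1 - Tr(rho^2): 0 for pure states, 1/2 for a maximally mixed qubit
    return 1.0 - np.trace(rho @ rho)

# Reduced state of one qubit of a maximally entangled Bell pair
rho_bell = np.eye(2) / 2.0
# Reduced state of an unentangled (product) pure state
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
```

Both entropies vanish for the unentangled state and are maximal for the Bell pair's reduced state, which is why either can serve as an entanglement measure.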

  10. Bounding species distribution models

    USGS Publications Warehouse

    Stohlgren, T.J.; Jarnevich, C.S.; Esaias, W.E.; Morisette, J.T.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used. © 2011 Current Zoology.
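    The "clamping" alteration tested here — bounding extrapolations to the minimum and maximum values of the primary environmental predictors seen in training — can be sketched as a simple clip of each predictor grid. The function and layer names and all values below are hypothetical.

```python
import numpy as np

def bound_predictors(layers, train_values):
    # Clamp each environmental predictor grid to the range observed in the
    # model-training data before projecting the model, so the model is never
    # queried outside the environmental bounds of its development data.
    bounded = {}
    for name, grid in layers.items():
        lo, hi = train_values[name].min(), train_values[name].max()
        bounded[name] = np.clip(grid, lo, hi)
    return bounded

# Hypothetical temperature grid extending beyond the training range [0, 30]
layers = {"temp": np.array([[-5.0, 10.0], [25.0, 40.0]])}
train = {"temp": np.array([0.0, 12.0, 30.0])}
out = bound_predictors(layers, train)
```

Values inside the training range pass through unchanged; only the out-of-bounds cells are pulled back to the nearest observed extreme, which is the conservative behaviour the authors recommend.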

  11. Bounding Species Distribution Models

    NASA Technical Reports Server (NTRS)

    Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  12. Discretization and Preconditioning Algorithms for the Euler and Navier-Stokes Equations on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Kutler, Paul (Technical Monitor)

    1998-01-01

    Chapter 1 briefly reviews several related topics associated with the symmetrization of systems of conservation laws and quasi-conservation laws: (1) Basic Entropy Symmetrization Theory; (2) Symmetrization and eigenvector scaling; (3) Symmetrization of the compressible Navier-Stokes equations; and (4) Symmetrization of the quasi-conservative form of the magnetohydrodynamic (MHD) equations. Chapter 2 describes one of the best known tools employed in the study of differential equations, the maximum principle: any function f(x) which satisfies the inequality f''(x) > 0 on the interval [a,b] attains its maximum value at one of the endpoints of the interval. Chapter 3 examines the upwind finite volume schemes for scalar and system conservation laws. The basic tasks in the upwind finite volume approach have already been presented: reconstruction, flux evaluation, and evolution. By far, the most difficult task in this process is the reconstruction step.
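    The maximum principle quoted above can be checked numerically with any convex function; a minimal sketch, using the arbitrarily chosen f(x) = x² − x on [0, 2] (f'' = 2 > 0 there, so the maximum must sit at an endpoint):

```python
import numpy as np

# f(x) = x**2 - x is strictly convex (f'' = 2 > 0) on [0, 2], so by the
# maximum principle its maximum over the interval lies at an endpoint.
x = np.linspace(0.0, 2.0, 1001)
f = x**2 - x
i_max = int(np.argmax(f))   # index of the maximiser; expected at x = 2
```

Here f(0) = 0 and f(2) = 2, and the interior minimum at x = 1/2 never competes with the endpoints, exactly as the principle guarantees.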

  13. A centroid molecular dynamics study of liquid para-hydrogen and ortho-deuterium.

    PubMed

    Hone, Tyler D; Voth, Gregory A

    2004-10-01

    Centroid molecular dynamics (CMD) is applied to the study of collective and single-particle dynamics in liquid para-hydrogen at two state points and liquid ortho-deuterium at one state point. The CMD results are compared with the results of classical molecular dynamics, quantum mode coupling theory, a maximum entropy analytic continuation approach, pair-product forward-backward semiclassical dynamics, and available experimental results. The self-diffusion constants are in excellent agreement with the experimental measurements for all systems studied. Furthermore, it is shown that the method is able to adequately describe both the single-particle and collective dynamics of quantum liquids. (c) 2004 American Institute of Physics

  14. Exact analytical thermodynamic expressions for a Brownian heat engine

    NASA Astrophysics Data System (ADS)

    Taye, Mesfin Asfaw

    2015-09-01

    The nonequilibrium thermodynamics feature of a Brownian motor operating between two different heat baths is explored as a function of time t . Using the Gibbs entropy and Schnakenberg microscopic stochastic approach, we find exact closed form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Furthermore, via the present model, many thermodynamic theories can be checked.
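    The entropy balance described here can be sketched with the Schnakenberg expression for the simplest possible case: a two-state system coupled to a single reservoir (not the paper's two-bath motor), relaxing to equilibrium. The rates and step size below are illustrative; the point is only that the entropy production rate J·ln(k12·p1 / (k21·p2)) is non-negative, decays in time, and vanishes at equilibrium.

```python
import numpy as np

k12, k21 = 2.0, 1.0   # transition rates 1->2 and 2->1 (illustrative)
p1, dt = 0.9, 1e-3    # start far from the equilibrium value p1 = 1/3
ep_trace = []
for _ in range(10000):
    p2 = 1.0 - p1
    J = k12 * p1 - k21 * p2                    # net probability current 1->2
    ep = J * np.log((k12 * p1) / (k21 * p2))   # Schnakenberg entropy production rate
    ep_trace.append(ep)
    p1 -= J * dt                               # Euler step of the master equation
```

Both factors in `ep` share the sign of the current, so the production rate is non-negative along the whole relaxation and reaches zero only when detailed balance k12·p1 = k21·p2 holds, mirroring the long-time behaviour derived in the paper.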

  15. Exact analytical thermodynamic expressions for a Brownian heat engine.

    PubMed

    Taye, Mesfin Asfaw

    2015-09-01

    The nonequilibrium thermodynamics feature of a Brownian motor operating between two different heat baths is explored as a function of time t. Using the Gibbs entropy and Schnakenberg microscopic stochastic approach, we find exact closed form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Furthermore, via the present model, many thermodynamic theories can be checked.

  16. Soft context clustering for F0 modeling in HMM-based speech synthesis

    NASA Astrophysics Data System (ADS)

    Khorram, Soheil; Sameti, Hossein; King, Simon

    2015-12-01

    This paper proposes the use of a new binary decision tree, which we call a soft decision tree, to improve generalization performance compared to the conventional `hard' decision tree method that is used to cluster context-dependent model parameters in statistical parametric speech synthesis. We apply the method to improve the modeling of fundamental frequency, which is an important factor in synthesizing natural-sounding high-quality speech. Conventionally, hard decision tree-clustered hidden Markov models (HMMs) are used, in which each model parameter is assigned to a single leaf node. However, this `divide-and-conquer' approach leads to data sparsity, with the consequence that it suffers from poor generalization, meaning that it is unable to accurately predict parameters for models of unseen contexts: the hard decision tree is a weak function approximator. To alleviate this, we propose the soft decision tree, which is a binary decision tree with soft decisions at the internal nodes. In this soft clustering method, internal nodes select both their children with certain membership degrees; therefore, each node can be viewed as a fuzzy set with a context-dependent membership function. The soft decision tree improves model generalization and provides a superior function approximator because it is able to assign each context to several overlapped leaves. In order to use such a soft decision tree to predict the parameters of the HMM output probability distribution, we derive the smoothest (maximum entropy) distribution which captures all partial first-order moments and a global second-order moment of the training samples. Employing such a soft decision tree architecture with maximum entropy distributions, a novel speech synthesis system is trained using maximum likelihood (ML) parameter re-estimation and synthesis is achieved via maximum output probability parameter generation. 
In addition, a soft decision tree construction algorithm optimizing a log-likelihood measure is developed. Both subjective and objective evaluations were conducted and indicate a considerable improvement over the conventional method.
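    The soft decision idea — internal nodes assigning membership degrees to both children, so every context reaches several overlapped leaves — can be sketched with a one-node tree and a sigmoid gate. All weights and leaf values below are invented for illustration; a real system has many nodes and maximum entropy leaf distributions rather than scalars.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_tree_predict(x, w, b, mu_left, mu_right):
    # Soft decision at the root: membership degree g in (0, 1) for the left
    # child, 1 - g for the right; the prediction blends both leaves instead
    # of committing to one, unlike a hard decision tree.
    g = sigmoid(w * x + b)
    return g * mu_left + (1.0 - g) * mu_right

# Hard tree analogue: x < 0 -> left leaf (100), else right leaf (200).
# The soft version interpolates smoothly across the boundary.
y_mid = soft_tree_predict(0.0, w=-4.0, b=0.0, mu_left=100.0, mu_right=200.0)
```

Far from the decision boundary the soft tree behaves like the hard one; near the boundary it averages the leaves, which is what gives the smoother function approximation and better generalization claimed above.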

  17. Using heteroclinic orbits to quantify topological entropy in fluid flows

    NASA Astrophysics Data System (ADS)

    Sattari, Sulimon; Chen, Qianting; Mitchell, Kevin A.

    2016-03-01

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or "ghost," rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  18. Entropic and near-field improvements of thermoradiative cells

    DOE PAGES

    Hsu, Wei -Chun; Tong, Jonathan K.; Liao, Bolin; ...

    2016-10-13

    A p-n junction maintained at above ambient temperature can work as a heat engine, converting some of the supplied heat into electricity and rejecting entropy by interband emission. Such thermoradiative cells have potential to harvest low-grade heat into electricity. By analyzing the entropy content of different spectral components of thermal radiation, we identify an approach to increase the efficiency of thermoradiative cells via spectrally selecting long-wavelength photons for radiative exchange. Furthermore, we predict that the near-field photon extraction by coupling photons generated from interband electronic transition to phonon polariton modes on the surface of a heat sink can increase the conversion efficiency as well as the power generation density, providing more opportunities to efficiently utilize terrestrial emission for clean energy. An ideal InSb thermoradiative cell can achieve a maximum efficiency and power density up to 20.4% and 327 W m−2, respectively, between a hot source at 500 K and a cold sink at 300 K. However, sub-bandgap and non-radiative losses will significantly degrade the cell performance.

  19. Entropic and Near-Field Improvements of Thermoradiative Cells

    PubMed Central

    Hsu, Wei-Chun; Tong, Jonathan K.; Liao, Bolin; Huang, Yi; Boriskina, Svetlana V.; Chen, Gang

    2016-01-01

    A p-n junction maintained at above ambient temperature can work as a heat engine, converting some of the supplied heat into electricity and rejecting entropy by interband emission. Such thermoradiative cells have potential to harvest low-grade heat into electricity. By analyzing the entropy content of different spectral components of thermal radiation, we identify an approach to increase the efficiency of thermoradiative cells via spectrally selecting long-wavelength photons for radiative exchange. Furthermore, we predict that the near-field photon extraction by coupling photons generated from interband electronic transition to phonon polariton modes on the surface of a heat sink can increase the conversion efficiency as well as the power generation density, providing more opportunities to efficiently utilize terrestrial emission for clean energy. An ideal InSb thermoradiative cell can achieve a maximum efficiency and power density up to 20.4% and 327 W m−2, respectively, between a hot source at 500 K and a cold sink at 300 K. However, sub-bandgap and non-radiative losses will significantly degrade the cell performance. PMID:27734902

  20. Thermodynamic geometry for a non-extensive ideal gas

    NASA Astrophysics Data System (ADS)

    López, J. L.; Obregón, O.; Torres-Arenas, J.

    2018-05-01

    A generalized entropy arising in the context of superstatistics is applied to an ideal gas. The curvature scalar associated to the thermodynamic space generated by this modified entropy is calculated using two formalisms of the geometric approach to thermodynamics. By means of the curvature/interaction hypothesis of thermodynamic geometry it is found that, as a consequence of considering a generalized statistics, an effective interaction arises, but the interaction is not enough to generate a phase transition. This generalized entropy seems to be relevant in confinement or in systems with not so many degrees of freedom, so it could be interesting to use such entropies to characterize the thermodynamics of small systems.

  1. A general Bayesian image reconstruction algorithm with entropy prior: Preliminary application to HST data

    NASA Astrophysics Data System (ADS)

    Nunez, Jorge; Llacer, Jorge

    1993-10-01

    This paper describes a general Bayesian iterative algorithm with entropy prior for image reconstruction. It solves the cases of both pure Poisson data and Poisson data with Gaussian readout noise. The algorithm maintains positivity of the solution; it includes case-specific prior information (default map) and flatfield corrections; it removes background and can be accelerated to be faster than the Richardson-Lucy algorithm. In order to determine the hyperparameter that balances the entropy and likelihood terms in the Bayesian approach, we have used a likelihood cross-validation technique. Cross-validation is more robust than other methods because it is less demanding in terms of the knowledge of exact data characteristics and of the point-spread function. We have used the algorithm to reconstruct successfully images obtained in different space- and ground-based imaging situations. It has been possible to recover most of the original intended capabilities of the Hubble Space Telescope (HST) wide field and planetary camera (WFPC) and faint object camera (FOC) from images obtained in their present state. Semireal simulations for the future wide field planetary camera 2 show that even after the repair of the spherical aberration problem, image reconstruction can play a key role in improving the resolution of the cameras, well beyond the design of the Hubble instruments. We also show that ground-based images can be reconstructed successfully with the algorithm. A technique which consists of dividing the CCD observations into two frames, with one-half the exposure time each, emerges as a recommended procedure for the utilization of the described algorithms. We have compared our technique with two commonly used reconstruction algorithms: the Richardson-Lucy and the Cambridge maximum entropy algorithms.
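    For context on the comparison baseline, a minimal 1-D version of the Richardson-Lucy iteration (one of the two algorithms the paper benchmarks against, not the paper's entropy-prior algorithm) can be sketched as follows. Like the Bayesian algorithm, it maintains positivity for pure Poisson data; the test scene and PSF are hypothetical.

```python
import numpy as np

def richardson_lucy(data, psf, n_iter=200):
    # Minimal 1-D Richardson-Lucy deconvolution for Poisson data.
    # Each iteration multiplies the estimate by the PSF-correlated ratio
    # of data to the reblurred estimate, preserving positivity and flux.
    psf = psf / psf.sum()
    psf_flip = psf[::-1]
    estimate = np.full_like(data, data.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = data / np.maximum(blurred, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_flip, mode="same")
    return estimate

# Hypothetical scene: one point source blurred by a 3-tap PSF, no noise
truth = np.zeros(21)
truth[10] = 100.0
psf = np.array([0.25, 0.5, 0.25])
data = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(data, psf)
```

The restored image stays non-negative, keeps the total flux of the data, and re-concentrates it at the true source position; the paper's contribution is a prior-regularized alternative that can be accelerated beyond this scheme.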

  2. Origin and characteristics of high Shannon entropy at the pivot of locally stable rotors: insights from computational simulation.

    PubMed

    Ganesan, Anand N; Kuklik, Pawel; Gharaviri, Ali; Brooks, Anthony; Chapman, Darius; Lau, Dennis H; Roberts-Thomson, Kurt C; Sanders, Prashanthan

    2014-01-01

    Rotors are postulated to maintain cardiac fibrillation. Despite the importance of bipolar electrograms in clinical electrophysiology, few data exist on the properties of bipolar electrograms at rotor sites. The pivot of a spiral wave is characterized by relative uncertainty of wavefront propagation direction compared to the periphery. The bipolar electrograms used in electrophysiology recording encode information on both direction and timing of approaching wavefronts. We therefore tested the hypothesis that bipolar electrograms from the pivot of rotors have higher Shannon entropy (ShEn) than electrograms recorded at the periphery due to the spatial dynamics of spiral waves. We studied spiral wave propagation in 2-dimensional sheets constructed using a simple cell automaton (FitzHugh-Nagumo), atrial (Courtemanche-Ramirez-Nattel) and ventricular (Luo-Rudy) myocyte cell models and in a geometric model spiral wave. In each system, bipolar electrogram recordings were simulated, and Shannon entropy maps constructed as a measure of electrogram information content. ShEn was consistently highest in the pivoting region associated with the phase singularity of the spiral wave. This property was consistently preserved across: (i) variation of the model system, (ii) alterations in bipolar electrode spacing, (iii) alternative bipolar electrode orientations, (iv) bipolar electrogram filtering, and (v) the presence of rotor meander. Directional activation plots demonstrated that the origin of high ShEn at the pivot was the directional diversity of wavefront propagation observed in this location. The pivot of the rotor is consistently associated with high Shannon entropy of bipolar electrograms despite differences in action potential model, bipolar electrode spacing, signal filtering and rotor meander. Maximum ShEn is co-located with the pivot for rotors observed in the bipolar electrogram recording mode, and may be an intrinsic property of spiral wave dynamic behaviour.
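    The ShEn computation on electrogram amplitude histograms can be sketched as follows. The two synthetic signals are purely illustrative stand-ins, not simulated electrograms from the cell models above: a "periphery-like" signal with stereotyped deflections over a flat baseline concentrates its amplitudes in few histogram bins, while a "pivot-like" signal whose amplitudes are spread by directionally diverse wavefronts fills many bins.

```python
import numpy as np

def shannon_entropy(signal, n_bins=32):
    # ShEn of the amplitude histogram of an electrogram, in bits
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
# Periphery stand-in: identical deflections separated by flat baseline
periphery = np.zeros(2000)
periphery[::100] = 1.0
# Pivot stand-in: amplitudes spread over many values (illustrative)
pivot = rng.uniform(-1.0, 1.0, size=2000)
```

The histogram-based ShEn is bounded by log2(n_bins); the pivot-like signal approaches that bound while the stereotyped signal stays near zero, which is the contrast the entropy maps exploit.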

  3. A general methodology for population analysis

    NASA Astrophysics Data System (ADS)

    Lazov, Petar; Lazov, Igor

    2014-12-01

    For a given population with N - current and M - maximum number of entities, modeled by a Birth-Death Process (BDP) with size M+1, we introduce utilization parameter ρ, ratio of the primary birth and death rates in that BDP, which, physically, determines (equilibrium) macrostates of the population, and information parameter ν, which has an interpretation as population information stiffness. The BDP, modeling the population, is in the state n, n=0,1,…,M, if N=n. In presence of these two key metrics, applying continuity law, equilibrium balance equations concerning the probability distribution pn, n=0,1,…,M, of the quantity N, pn=Prob{N=n}, in equilibrium, and conservation law, and relying on the fundamental concepts population information and population entropy, we develop a general methodology for population analysis; here, by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which determine uniquely the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology. Namely, if one instead supposes a population of infinite size, most of the key quantities and results for finite-size populations that emerge in this methodology vanish.
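    For the special case of a constant birth/death rate ratio ρ, the equilibrium balance equations reduce to detailed balance and give p_n ∝ ρⁿ on n = 0..M, a truncated geometric distribution, from which the population entropy (the mean uncertainty of the population state) follows directly. A minimal sketch; the paper's full methodology, with its information spectrum and stiffness parameter ν, is considerably richer.

```python
import numpy as np

def bdp_equilibrium(rho, M):
    # Equilibrium distribution of a birth-death process with constant
    # birth/death rate ratio rho and maximum population M:
    # detailed balance gives p_n proportional to rho**n, n = 0..M.
    p = rho ** np.arange(M + 1)
    return p / p.sum()

def population_entropy(p):
    # Population entropy as the Shannon entropy of the state distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p = bdp_equilibrium(0.5, 10)   # rho < 1: small populations dominate
```

At ρ = 1 the distribution is uniform over the M+1 states and the entropy reaches its maximum ln(M+1), consistent with ρ determining the equilibrium macrostates.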

  4. Stochastic thermodynamics, fluctuation theorems and molecular machines.

    PubMed

    Seifert, Udo

    2012-12-01

    Stochastic thermodynamics as reviewed here systematically provides a framework for extending the notions of classical thermodynamics such as work, heat and entropy production to the level of individual trajectories of well-defined non-equilibrium ensembles. It applies whenever a non-equilibrium process is still coupled to one (or several) heat bath(s) of constant temperature. Paradigmatic systems are single colloidal particles in time-dependent laser traps, polymers in external flow, enzymes and molecular motors in single molecule assays, small biochemical networks and thermoelectric devices involving single electron transport. For such systems, a first-law like energy balance can be identified along fluctuating trajectories. For a basic Markovian dynamics implemented either on the continuum level with Langevin equations or on a discrete set of states as a master equation, thermodynamic consistency imposes a local-detailed balance constraint on noise and rates, respectively. Various integral and detailed fluctuation theorems, which are derived here in a unifying approach from one master theorem, constrain the probability distributions for work, heat and entropy production depending on the nature of the system and the choice of non-equilibrium conditions. For non-equilibrium steady states, particularly strong results hold like a generalized fluctuation-dissipation theorem involving entropy production. Ramifications and applications of these concepts include optimal driving between specified states in finite time, the role of measurement-based feedback processes and the relation between dissipation and irreversibility. Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.
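    One of the integral fluctuation theorems reviewed, the Jarzynski relation ⟨e^(−βw)⟩ = e^(−βΔF), can be verified exactly for the simplest driving protocol: an instantaneous quench of a two-level system prepared in equilibrium, where the work is just the energy change of the occupied microstate. The energies and temperature below are illustrative.

```python
import numpy as np

beta = 1.0                     # inverse bath temperature (illustrative)
E_old = np.array([0.0, 1.0])   # two-level system before the quench
E_new = np.array([0.0, 2.0])   # energies after the instantaneous quench

# Start in equilibrium for E_old; the quench is instantaneous, so the
# work done on the system equals the energy jump of the occupied state.
p_old = np.exp(-beta * E_old)
p_old /= p_old.sum()
w = E_new - E_old

mean_w = np.sum(p_old * w)                          # average work
jarzynski_avg = np.sum(p_old * np.exp(-beta * w))   # <exp(-beta*w)>
delta_F = (np.log(np.exp(-beta * E_old).sum())
           - np.log(np.exp(-beta * E_new).sum())) / beta
```

The identity holds term by term because p_old·e^(−βw) = e^(−βE_new)/Z_old, so the sum is Z_new/Z_old = e^(−βΔF); the second-law inequality ⟨w⟩ ≥ ΔF follows from it by Jensen's inequality.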

  5. LensEnt2: Maximum-entropy weak lens reconstruction

    NASA Astrophysics Data System (ADS)

    Marshall, P. J.; Hobson, M. P.; Gull, S. F.; Bridle, S. L.

    2013-08-01

    LensEnt2 is a maximum entropy reconstructor of weak lensing mass maps. The method takes each galaxy shape as an independent estimator of the reduced shear field and incorporates an intrinsic smoothness, determined by Bayesian methods, into the reconstruction. The uncertainties from both the intrinsic distribution of galaxy shapes and galaxy shape estimation are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures is calculated with corresponding uncertainties. The input is a galaxy ellipticity catalog with each measured galaxy shape treated as a noisy tracer of the reduced shear field, which is inferred on a fine pixel grid assuming positivity, and smoothness on scales of w arcsec where w is an input parameter. The ICF width w can be chosen by computing the evidence for it.

  6. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    NASA Astrophysics Data System (ADS)

    Fowlie, Andrew

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.
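    The role of the regularisation factor β can be illustrated with a toy discretised "profile" of three bins and a single noisy moment measurement: maximising βS(f) − χ²/2 by brute-force search over the simplex interpolates between an entropy-dominated (near-uniform, "halo-dependent") answer at large β and a likelihood-dominated answer pinned to the datum at small β. Everything here — bin values, datum, noise level — is invented for illustration and has nothing to do with the actual DAMA/LIBRA likelihood.

```python
import numpy as np
from itertools import product

def entropy(f):
    f = f[f > 0]
    return -np.sum(f * np.log(f))

def map_profile(beta, m_obs=0.8, sigma=0.05, step=0.02):
    # Brute-force search over the 3-bin probability simplex for the profile
    # maximising beta*S(f) - chi^2/2, where the single noisy measurement is
    # the first moment <x> with x the bin centres (0, 1, 2).
    x = np.array([0.0, 1.0, 2.0])
    grid = np.arange(0.0, 1.0 + step / 2, step)
    best, best_val = None, -np.inf
    for f1, f2 in product(grid, repeat=2):
        f3 = 1.0 - f1 - f2
        if f3 < 0.0:
            continue
        f = np.array([f1, f2, f3])
        val = beta * entropy(f) - (x @ f - m_obs) ** 2 / (2.0 * sigma ** 2)
        if val > best_val:
            best, best_val = f, val
    return best

f_strong = map_profile(beta=1e4)   # strong entropic prior: near-uniform profile
f_weak = map_profile(beta=1e-4)    # weak prior: moment pinned to the datum
```

Varying β thus sweeps smoothly between the two analyses, which is the interpolation between halo-dependent and halo-independent inference described above.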

  7. Bayesian Image Segmentations by Potts Prior and Loopy Belief Propagation

    NASA Astrophysics Data System (ADS)

    Tanaka, Kazuyuki; Kataoka, Shun; Yasuda, Muneki; Waizumi, Yuji; Hsu, Chiou-Ting

    2014-12-01

    This paper presents a Bayesian image segmentation model based on a Potts prior and loopy belief propagation. The proposed Bayesian model involves several terms, including the pairwise interactions of Potts models, and the mean vectors and covariance matrices of Gaussian distributions in color image modeling. These terms are often referred to as hyperparameters in statistical machine learning theory. In order to determine these hyperparameters, we propose a new scheme for hyperparameter estimation based on conditional maximization of entropy in the Potts prior. The algorithm is derived using loopy belief propagation. In addition, we compare our conditional maximum entropy framework with the conventional maximum likelihood framework, and also clarify how the first order phase transitions in loopy belief propagation for Potts models influence our hyperparameter estimation procedures.

  8. On the sufficiency of pairwise interactions in maximum entropy models of networks

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya; Merchan, Lina

    Biological information processing networks consist of many components, which are coupled by an even larger number of complex multivariate interactions. However, analyses of data sets from fields as diverse as neuroscience, molecular biology, and behavior have reported that observed statistics of states of some biological networks can be approximated well by maximum entropy models with only pairwise interactions among the components. Based on simulations of random Ising spin networks with p-spin (p > 2) interactions, here we argue that this reduction in complexity can be thought of as a natural property of some densely interacting networks in certain regimes, and not necessarily as a special property of living systems. This work was supported in part by James S. McDonnell Foundation Grant No. 220020321.

  9. A maximum entropy model for chromatin structure

    NASA Astrophysics Data System (ADS)

    Farre, Pau; Emberly, Eldon; Emberly Group Team

    The DNA inside the nucleus of eukaryotic cells shows a variety of conserved structures at different length scales. These structures are formed by interactions between protein complexes that bind to the DNA and regulate gene activity. Recent high-throughput sequencing techniques allow for the measurement both of the genome-wide contact map of the folded DNA within a cell (HiC) and of where various proteins are bound to the DNA (ChIP-seq). In this talk I will present a maximum-entropy method capable of both predicting HiC contact maps from binding data, and binding data from HiC contact maps. This method results in an intuitive Ising-type model that is able to predict how altering the presence of binding factors can modify chromosome conformation, without the need of polymer simulations.

  10. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhabal, Debdas; Chakravarty, Charusita, E-mail: charus@chemistry.iitd.ac.in; Nguyen, Andrew Huy

    Molecular dynamics simulations are used to contrast the supercooling and crystallization behaviour of monatomic liquids that exemplify the transition from simple to anomalous, tetrahedral liquids. As examples of simple fluids, we use the Lennard-Jones (LJ) liquid and a pair-dominated Stillinger-Weber liquid (SW{sub 16}). As examples of tetrahedral, water-like fluids, we use the Stillinger-Weber model with variable tetrahedrality parameterized for germanium (SW{sub 20}), silicon (SW{sub 21}), and water (SW{sub 23.15} or mW model). The thermodynamic response functions show clear qualitative differences between simple and water-like liquids. For simple liquids, the compressibility and the heat capacity remain small on isobaric cooling. The tetrahedral liquids in contrast show a very sharp rise in these two response functions as the lower limit of liquid-phase stability is reached. While the thermal expansivity decreases with temperature but never crosses zero in simple liquids, in all three tetrahedral liquids at the studied pressure, there is a temperature of maximum density below which thermal expansivity is negative. In contrast to the thermodynamic response functions, the excess entropy on isobaric cooling does not show qualitatively different features for simple and water-like liquids; however, the slope and curvature of the entropy-temperature plots reflect the heat capacity trends. Two trajectory-based computational estimation methods for the entropy and the heat capacity are compared for possible structural insights into supercooling, with the entropy obtained from thermodynamic integration. The two-phase thermodynamic estimator for the excess entropy proves to be fairly accurate in comparison to the excess entropy values obtained by thermodynamic integration, for all five Lennard-Jones and Stillinger-Weber liquids.
The entropy estimator based on the multiparticle correlation expansion that accounts for both pair and triplet correlations, denoted by S{sub trip}, is also studied. S{sub trip} is a good entropy estimator for liquids where pair and triplet correlations are important such as Ge and Si, but loses accuracy for purely pair-dominated liquids, like the LJ fluid, or near the crystallization temperature (T{sub thr}). Since local tetrahedral order is compatible with both liquid and crystalline states, the reorganisation of tetrahedral liquids is accompanied by a clear rise in the pair, triplet, and thermodynamic contributions to the heat capacity, resulting in the heat capacity anomaly. In contrast, the pair-dominated liquids show increasing dominance of triplet correlations on approaching crystallization but no sharp rise in either the pair or thermodynamic heat capacities.

  12. Investigations on entropy layer along hypersonic hyperboloids using a defect boundary layer

    NASA Technical Reports Server (NTRS)

    Brazier, J. P.; Aupoix, B.; Cousteix, J.

    1992-01-01

    A defect approach coupled with matched asymptotic expansions is used to derive a new set of boundary layer equations. This method ensures a smooth matching of the boundary layer with the inviscid solution. These equations are solved to calculate boundary layers over hypersonic blunt bodies involving the entropy gradient effect. Systematic comparisons are made for both axisymmetric and plane flows in several cases with different Mach and Reynolds numbers. After a brief survey of the entropy layer characteristics, the defect boundary layer results are compared with standard boundary layer and full Navier-Stokes solutions. The entropy gradient effects are found to be more important in the axisymmetric case than in the plane one. The wall temperature has a great influence on the results through the displacement effect. Good predictions can be obtained with the defect approach over a cold wall in the nose region, with a first order solution. However, the defect approach gives less accurate results far from the nose on axisymmetric bodies because of the thinning of the entropy layer.

  13. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    NASA Technical Reports Server (NTRS)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.

  14. Entropy Generation in a Chemical Reaction

    ERIC Educational Resources Information Center

    Miranda, E. N.

    2010-01-01

    Entropy generation in a chemical reaction is analysed without using the general formalism of non-equilibrium thermodynamics at a level adequate for advanced undergraduates. In a first approach to the problem, the phenomenological kinetic equation of an elementary first-order reaction is used to show that entropy production is always positive. A…

  15. Regularization of Grad’s 13-Moment Equations in Kinetic Gas Theory

    DTIC Science & Technology

    2011-01-01

    variant of the moment method has been proposed by Eu (1980) and is used, e.g., in Myong (2001). Recently, a maximum-entropy 10-moment system has been used...small amplitude linear waves, the R13 system is linearly stable in time for all modes and wavelengths. The instability of the Burnett system indicates...Boltzmann equation. Related to the problem of global hyperbolicity is the question of the existence of an entropy law for the R13 system. In the linear

  16. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    NASA Astrophysics Data System (ADS)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximization of KSE is analytical and, in general, easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum-mixing-time dynamics. It could be of interest in computer science and statistical physics, for computations that use random walks on graphs that can be represented as Markov chains.
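    The two quantities being linked can be computed directly for a toy chain. The sketch below, using an invented two-state transition matrix, evaluates the KSE from the stationary distribution and uses the spectral gap as a standard proxy for mixing speed; it is an illustration of the definitions, not the authors' method.

    ```python
    import numpy as np

    # Invented 2-state Markov chain transition matrix (rows sum to 1).
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    # Kolmogorov-Sinai entropy of the chain: H = -sum_i pi_i sum_j P_ij ln P_ij.
    logP = np.where(P > 0, np.log(P), 0.0)
    kse = -np.sum(pi[:, None] * P * logP)

    # Mixing-speed proxy: spectral gap 1 - |lambda_2|; a larger gap means
    # faster convergence to the stationary distribution.
    mods = np.sort(np.abs(evals))[::-1]
    gap = 1.0 - mods[1]
    print(kse, gap)
    ```

    For this matrix the stationary distribution is (2/3, 1/3) and the gap is 0.3; varying P shows how chains with larger KSE tend to have larger spectral gaps, the trend the paper formalizes.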

  17. The Adiabatic Piston and the Second Law of Thermodynamics

    NASA Astrophysics Data System (ADS)

    Crosignani, Bruno; Di Porto, Paolo; Conti, Claudio

    2002-11-01

    A detailed analysis of the adiabatic-piston problem reveals peculiar dynamical features that challenge the general belief that isolated systems necessarily reach a static equilibrium state. In particular, the fact that the piston behaves like a perpetuum mobile, i.e., it never stops but keeps wandering, undergoing sizable oscillations, around the position corresponding to maximum entropy, has remarkable implications on the entropy variations of the system and on the validity of the second law when dealing with systems of mesoscopic dimensions.

  18. Maximum Entropy Calculations on a Discrete Probability Space

    DTIC Science & Technology

    1986-01-01

    constraints acting besides normalization. Statement 3: "The aim of this paper is to show that the die experiment just spoken of has solutions by classical analysis. Statement 4: We shall solve this problem in a purely classical way, without the need for recourse to any exotic estimator, such as ME." Note...The Maximum Entropy Principle. In a remarkable series of papers beginning in 1957, E. T. Jaynes (1957) began a revolution in inductive

  19. A Novel Multivoxel-Based Quantitation of Metabolites and Lipids Noninvasively Combined with Diffusion-Weighted Imaging in Breast Cancer

    DTIC Science & Technology

    2013-10-01

    cancer for improving the overall specificity. Our recent work has focused on testing retrospective Maximum Entropy and Compressed Sensing of the 4D...terparts and increases the entropy or sparsity of the reconstructed spectrum by narrowing the peak linewidths and de-noising smaller features. This, in...tightened' beyond the standard deviation of the noise in an effort to reduce the RMSE and reconstruction non-linearity, but this prevents the

  20. Dissipated energy and entropy production for an unconventional heat engine: the stepwise `circular cycle'

    NASA Astrophysics Data System (ADS)

    di Liberto, Francesco; Pastore, Raffaele; Peruggi, Fulvio

    2011-05-01

    When some entropy is transferred, by means of a reversible engine, from a hot heat source to a colder one, the maximum efficiency occurs, i.e. the maximum available work is obtained. Similarly, a reversible heat pump transfers entropy from a cold heat source to a hotter one with the minimum expense of energy. In contrast, if we are faced with non-reversible devices, there is some lost work for heat engines, and some extra work for heat pumps. These quantities are both related to entropy production. The lost work is also called 'degraded energy' or 'energy unavailable to do work'. The extra work is the excess of work performed on the system in the irreversible process with respect to the reversible one (or the excess of heat given to the hotter source in the irreversible process). Both quantities are analysed in detail and are evaluated for a complex process, i.e. the stepwise circular cycle, which is similar to the stepwise Carnot cycle. The stepwise circular cycle is a cycle performed by means of N small weights, dw, which are first added to and then removed from the piston of the vessel containing the gas, or vice versa. The work performed by the gas can be found as the increase in the potential energy of the dw's. Each single dw is identified and its increase in potential energy evaluated. In such a way it is found how the energy output of the cycle is distributed among the dw's. The size of the dw's affects the entropy production and therefore the lost and extra work. The distribution of increases depends on the chosen removal process.
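    The lost-work bookkeeping described above follows the Gouy-Stodola relation, W_lost = T_cold · ΔS_universe. A minimal numerical sketch, with invented temperatures, heat, and work values:

    ```python
    # Invented illustrative values: heat Q drawn per cycle from a hot source.
    T_hot, T_cold = 500.0, 300.0   # reservoir temperatures [K]
    Q = 1000.0                     # heat extracted from the hot source [J]

    # Reversible (Carnot) engine: maximum available work.
    W_max = Q * (1 - T_cold / T_hot)

    # An irreversible engine producing less work per cycle:
    W_irr = 300.0
    Q_cold = Q - W_irr                       # heat dumped to the cold source
    dS_univ = Q_cold / T_cold - Q / T_hot    # entropy produced per cycle [J/K]
    W_lost = T_cold * dS_univ                # degraded energy (Gouy-Stodola)
    print(W_max, W_lost, W_max - W_irr)
    ```

    The computed lost work equals W_max - W_irr exactly, illustrating that the degraded energy is fully accounted for by the entropy production times the cold-reservoir temperature.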

  1. GABAergic excitation of spider mechanoreceptors increases information capacity by increasing entropy rather than decreasing jitter.

    PubMed

    Pfeiffer, Keram; French, Andrew S

    2009-09-02

    Neurotransmitter chemicals excite or inhibit a range of sensory afferents and sensory pathways. These changes in firing rate or static sensitivity can also be associated with changes in dynamic sensitivity or membrane noise and thus action potential timing. We measured action potential firing produced by random mechanical stimulation of spider mechanoreceptor neurons during long-duration excitation by the GABA-A agonist muscimol. Information capacity was estimated from signal-to-noise ratio by averaging responses to repeated identical stimulation sequences. Information capacity was also estimated from the coherence function between input and output signals. Entropy rate was estimated by a data compression algorithm and maximum entropy rate from the firing rate. Action potential timing variability, or jitter, was measured as normalized interspike interval distance. Muscimol increased firing rate, information capacity, and entropy rate, but jitter was unchanged. We compared these data with the effects of increasing firing rate by current injection. Our results indicate that the major increase in information capacity by neurotransmitter action arose from the increased entropy rate produced by increased firing rate, not from reduction in membrane noise and action potential jitter.
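    The two entropy-rate estimates mentioned above (a compression-based estimate and a firing-rate upper bound) can be sketched for a synthetic binned spike train; the firing probability and bin count below are invented, and zlib stands in for whatever compressor the authors used.

    ```python
    import zlib, math, random

    random.seed(0)
    p, n = 0.2, 20000          # assumed firing probability per bin, bin count
    spikes = bytes(1 if random.random() < p else 0 for _ in range(n))

    # Maximum entropy rate implied by the firing rate alone: binary entropy H(p).
    h_max = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))   # bits per bin

    # Compression-based estimate: the compressed size upper-bounds the entropy
    # of the sequence (zlib is not an optimal coder, so the bound is loose).
    h_comp = 8 * len(zlib.compress(spikes, 9)) / n             # bits per bin
    print(h_max, h_comp)
    ```

    Raising p toward 0.5 raises both numbers, mirroring the paper's finding that a higher firing rate by itself increases the available entropy rate.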

  2. An improved method for predicting the evolution of the characteristic parameters of an information system

    NASA Astrophysics Data System (ADS)

    Dushkin, A. V.; Kasatkina, T. I.; Novoseltsev, V. I.; Ivanov, S. V.

    2018-03-01

    The article proposes a forecasting method that allows, for given values of the entropy and of the levels of errors of the first and second kind, the allowable time horizon for forecasting the development of the characteristic parameters of a complex information system to be determined. The main feature of the method is that changes in the characteristic parameters of the development of the information system are expressed as increments in the ratios of its entropy. When a predetermined value of the prediction error ratio, that is, of the system entropy, is reached, the characteristic parameters of the system and the depth of the prediction in time are estimated. The resulting values of the characteristics will be optimal, since at that moment the system possesses the best ratio of entropy as a measure of the degree of organization and orderliness of its structure. To construct a method for estimating the depth of prediction, it is expedient to use the principle of maximum entropy.

  3. Estimating Thermal Inertia with a Maximum Entropy Boundary Condition

    NASA Astrophysics Data System (ADS)

    Nearing, G.; Moran, M. S.; Scott, R.; Ponce-Campos, G.

    2012-04-01

    Thermal inertia, P [Jm-2s-1/2K-1], is a physical property of the land surface which determines its resistance to temperature change under seasonal or diurnal heating. It is a function of the volumetric heat capacity, c [Jm-3K-1], and thermal conductivity, k [Wm-1K-1], of the soil near the surface: P = √(ck). Thermal inertia of soil varies with moisture content due to the difference between the thermal properties of water and air, and a number of studies have demonstrated that it is feasible to estimate soil moisture given thermal inertia (e.g. Lu et al., 2009; Murray and Verhoef, 2007). We take the common approach to estimating thermal inertia from measurements of surface temperature by modeling the Earth's surface as a 1-dimensional homogeneous diffusive half-space. In this case, surface temperature is a function of the ground heat flux (G) boundary condition and the thermal inertia, and a daily value of P was estimated by matching measured and modeled diurnal surface temperature fluctuations. The difficulty is in measuring G; we demonstrate that the new maximum entropy production (MEP) method for partitioning net radiation into surface energy fluxes (Wang and Bras, 2011) provides a suitable boundary condition for estimating P. Adding the diffusion representation of heat transfer in the soil reduces the number of free parameters in the MEP model from two to one, and we provide a sensitivity analysis which suggests that, for the purpose of estimating P, it is preferable to parameterize the coupled MEP-diffusion model by the ratio of the thermal inertia of the soil to the effective thermal inertia of convective heat transfer to the atmosphere. 
We used this technique to estimate thermal inertia at two semiarid, non-vegetated locations in the Walnut Gulch Experimental Watershed in southeast AZ, USA and compared these estimates to estimates of P made using the Xue and Cracknell (1995) solution for a linearized ground heat flux boundary condition, and we found that the MEP-diffusion model produced superior thermal inertia estimates. The MEP-diffusion estimates also agreed well with P estimates made using a boundary condition measured with buried flux plates. We further demonstrated the new method using diurnal surface temperature fluctuations estimated from day/night MODIS image pairs and, excluding instances where the soil was extremely dry, found a strong relationship between estimated thermal inertia and measured 5 cm soil moisture. Lu, S., Ju, Z.Q., Ren, T.S. & Horton, R. (2009). A general approach to estimate soil water content from thermal inertia. Agricultural and Forest Meteorology, 149, 1693-1698. Murray, T. & Verhoef, A. (2007). Moving towards a more mechanistic approach in the determination of soil heat flux from remote measurements - I. A universal approach to calculate thermal inertia. Agricultural and Forest Meteorology, 147, 80-87. Wang, J.F. & Bras, R.L. (2011). A model of evapotranspiration based on the theory of maximum entropy production. Water Resources Research, 47. Xue, Y. & Cracknell, A.P. (1995). Advanced thermal inertia modeling. International Journal of Remote Sensing, 16, 431-446.
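    For reference, the half-space relations underlying this approach can be sketched numerically; the soil properties and flux amplitude below are invented illustrative values, not the study's measurements.

    ```python
    import math

    # Assumed soil properties (illustrative values for a dry soil).
    c = 1.2e6   # volumetric heat capacity [J m^-3 K^-1]
    k = 0.30    # thermal conductivity [W m^-1 K^-1]

    # Thermal inertia of a homogeneous diffusive half-space: P = sqrt(c*k).
    P = math.sqrt(c * k)   # [J m^-2 s^-1/2 K^-1]

    # For a sinusoidal ground heat flux G(t) = G0*sin(w*t) at the diurnal
    # frequency, the half-space model gives a surface temperature amplitude
    # dT = G0 / (P * sqrt(w)); inverting this relation from measured diurnal
    # temperature swings is the basis for estimating P.
    w = 2 * math.pi / 86400.0   # diurnal angular frequency [s^-1]
    G0 = 100.0                  # flux amplitude [W m^-2]
    dT = G0 / (P * math.sqrt(w))
    print(P, dT)
    ```

    Wetter soil raises c and k, hence P, and damps the diurnal temperature swing dT, which is why diurnal amplitude carries soil-moisture information.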

  4. From Finite Time to Finite Physical Dimensions Thermodynamics: The Carnot Engine and Onsager's Relations Revisited

    NASA Astrophysics Data System (ADS)

    Feidt, Michel; Costea, Monica

    2018-04-01

    Many works have been devoted to finite time thermodynamics since the contribution of Curzon and Ahlborn [1], which is generally considered as its origin. Nevertheless, earlier works in this domain have been brought to light [2], [3], and recently, results of an attempt to correlate Finite Time Thermodynamics with Linear Irreversible Thermodynamics according to Onsager's theory were reported [4]. The aim of the present paper is to extend and improve the approach to the thermodynamic optimization of generic objective functions of a Carnot engine in the linear response regime presented in [4]. The case study of the Carnot engine is revisited under the steady-state hypothesis, with non-adiabaticity of the system considered and heat loss accounted for by an overall heat leak between the engine's heat reservoirs. The optimization is focused on the main objective functions connected to engineering conditions, namely maximum efficiency or power output, apart from the more fundamental one relative to entropy. Results given in reference [4] relative to maximum power output and minimum entropy production as objective functions are reconsidered and clarified, and the change from finite time to finite physical dimensions is shown to be effected by the heat flow rate at the source. Our modeling has led to new results for the Carnot engine optimization and proved that the primary interest for an engineer is mainly connected to what we call Finite Physical Dimensions Optimal Thermodynamics.

  5. Unified approach to the entropy of an extremal rotating BTZ black hole: Thin shells and horizon limits

    NASA Astrophysics Data System (ADS)

    Lemos, José P. S.; Minamitsuji, Masato; Zaslavskii, Oleg B.

    2017-10-01

    Using a thin shell, the first law of thermodynamics, and a unified approach, we study the thermodynamics and find the entropy of a (2+1)-dimensional extremal rotating Bañados-Teitelboim-Zanelli (BTZ) black hole. The shell in (2+1) dimensions, i.e., a ring, is taken to be circularly symmetric and rotating, with the inner region being a ground state of the anti-de Sitter spacetime and the outer region being the rotating BTZ spacetime. The extremal rotating BTZ black hole can be obtained in three different ways depending on the way the shell approaches its own gravitational or horizon radius. These ways are explicitly worked out. The resulting three cases give that the BTZ black hole entropy is either the Bekenstein-Hawking entropy, S = A+/(4G), or an arbitrary function of A+, S = S(A+), where A+ = 2πr+ is the area, i.e., the perimeter, of the event horizon in (2+1) dimensions. We speculate that the entropy of an extremal black hole should obey 0 ≤ S(A+) ≤ A+/(4G). We also show that the contributions from the various thermodynamic quantities, namely, the mass, the circular velocity, and the temperature, to the entropy in all three cases are distinct. This study complements previous studies in thin shell thermodynamics and entropy for BTZ black holes. It also corroborates the results found for a (3+1)-dimensional extremal electrically charged Reissner-Nordström black hole.

  6. Gypsophila bermejoi G. López: A possible case of speciation repressed by bioclimatic factors.

    PubMed

    de Luis, Miguel; Bartolomé, Carmen; García Cardo, Óscar; Álvarez-Jiménez, Julio

    2018-01-01

    Gypsophila bermejoi G. López is an allopolyploid species derived from the parental G. struthium L. subsp. struthium and G. tomentosa L. All these plants are gypsophytes endemic to the Iberian Peninsula of particular ecological, evolutionary and biochemical interest. In this study, we present evidence of a possible repression of the process of G. bermejoi speciation by climatic factors. We modelled the ecological niches of the three taxa considered here using a maximum entropy approach and employing a series of bioclimatic variables. Subsequently, we projected these models onto the geographical space of the Iberian Peninsula in the present age and at two past ages: the Last Glacial Maximum and the mid-Holocene period. Furthermore, we compared these niches using the statistical method devised by Warren to calculate their degree of overlap. We also evaluated the evolution of the bioclimatic habitat suitability at those sites where the soil favors the growth of these species. Both the maximum entropy model and the degree of overlap indicated that the ecological behavior of the hybrid differs notably from that of the parental species. During the Last Glacial Maximum, the two parental species appear to take refuge in the western coastal strip of the Peninsula, a region in which there are virtually no sites where G. bermejoi could potentially be found. However, in the mid-Holocene period, the suitability of G. bermejoi to sites with favorable soils shifts from almost null to strong adaptation, a clear change in tendency. These results suggest that the ecological niches of hybrid allopolyploids can be considerably different from those of their parental species, which may have evolutionarily and ecologically relevant consequences. The data obtained indicate that certain bioclimatic variables may repress the processes by which new species are formed. The difference in the ecological niche of G. bermejoi with respect to its parental species prevented it from prospering during the Last Glacial Maximum. However, the climatic change in the mid-Holocene period released this block and, as such, permitted the new species to establish itself. Accordingly, we favor a recent origin of the current populations of G. bermejoi.

  7. Gypsophila bermejoi G. López: A possible case of speciation repressed by bioclimatic factors

    PubMed Central

    de Luis, Miguel; García Cardo, Óscar; Álvarez-Jiménez, Julio

    2018-01-01

    Gypsophila bermejoi G. López is an allopolyploid species derived from the parental G. struthium L. subsp. struthium and G. tomentosa L. All these plants are gypsophytes endemic to the Iberian Peninsula of particular ecological, evolutionary and biochemical interest. In this study, we present evidence of a possible repression of the process of G. bermejoi speciation by climatic factors. We modelled the ecological niches of the three taxa considered here using a maximum entropy approach and employing a series of bioclimatic variables. Subsequently, we projected these models onto the geographical space of the Iberian Peninsula in the present age and at two past ages: the Last Glacial Maximum and the mid-Holocene period. Furthermore, we compared these niches using the statistical method devised by Warren to calculate their degree of overlap. We also evaluated the evolution of the bioclimatic habitat suitability at those sites where the soil favors the growth of these species. Both the maximum entropy model and the degree of overlap indicated that the ecological behavior of the hybrid differs notably from that of the parental species. During the Last Glacial Maximum, the two parental species appear to take refuge in the western coastal strip of the Peninsula, a region in which there are virtually no sites where G. bermejoi could potentially be found. However, in the mid-Holocene period, the suitability of G. bermejoi to sites with favorable soils shifts from almost null to strong adaptation, a clear change in tendency. These results suggest that the ecological niches of hybrid allopolyploids can be considerably different from those of their parental species, which may have evolutionarily and ecologically relevant consequences. The data obtained indicate that certain bioclimatic variables may repress the processes by which new species are formed. The difference in the ecological niche of G. bermejoi with respect to its parental species prevented it from prospering during the Last Glacial Maximum. However, the climatic change in the mid-Holocene period released this block and, as such, permitted the new species to establish itself. Accordingly, we favor a recent origin of the current populations of G. bermejoi. PMID:29338010

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattari, Sulimon, E-mail: ssattari2@ucmerced.edu; Chen, Qianting, E-mail: qchen2@ucmerced.edu; Mitchell, Kevin A., E-mail: kmitchell@ucmerced.edu

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or “ghost,” rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  9. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns

    PubMed Central

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist. PMID:27571423
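    A Lempel-Ziv-style entropy-rate estimator of the kind this paper builds on can be sketched in a few lines; this is a simple O(n²) variant for short, invented location strings (production estimators use suffix structures), not the paper's exact formulation.

    ```python
    import math, random

    def lz_entropy_rate(seq):
        # Lempel-Ziv entropy-rate estimate (bits/symbol): H ~ n*log2(n)/sum(L_i),
        # where L_i is the length of the shortest substring starting at i that
        # does not occur in the prefix seq[:i] (Kontoyiannis-style match lengths).
        n = len(seq)
        total = 0
        for i in range(n):
            l = 1
            while i + l <= n and seq[i:i + l] in seq[:i]:
                l += 1
            total += l
        return n * math.log2(n) / total

    random.seed(3)
    regular = "ABAB" * 50                                           # predictable path
    irregular = "".join(random.choice("ABCD") for _ in range(200))  # erratic path
    h_reg, h_irr = lz_entropy_rate(regular), lz_entropy_rate(irregular)
    print(h_reg, h_irr)
    ```

    The repetitive trajectory scores far below the erratic one; resampling either string at a coarser granularity changes the estimate, which is exactly the scale-dependence the paper analyzes.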

  10. Statistical field estimators for multiscale simulations.

    PubMed

    Eapen, Jacob; Li, Ju; Yip, Sidney

    2005-11-01

    We present a systematic approach for generating smooth and accurate fields from particle simulation data using the notions of statistical inference. As an extension to a parametric representation based on the maximum likelihood technique previously developed for velocity and temperature fields, a nonparametric estimator based on the principle of maximum entropy is proposed for particle density and stress fields. Both estimators are applied to represent molecular dynamics data on shear-driven flow in an enclosure which exhibits a high degree of nonlinear characteristics. We show that the present density estimator is a significant improvement over ad hoc bin averaging and is also free of systematic boundary artifacts that appear in the method of smoothing kernel estimates. Similarly, the velocity fields generated by the maximum likelihood estimator do not show any edge effects that can be erroneously interpreted as slip at the wall. For low Reynolds numbers, the velocity fields and streamlines generated by the present estimator are benchmarked against Newtonian continuum calculations. For shear velocities that are a significant fraction of the thermal speed, we observe a form of shear localization that is induced by the confining boundary.
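    The maximum-entropy estimator referred to above constrains moments of the density. As a generic sketch (not the authors' implementation), the snippet below fits the exponential-family max-ent density p(x) ∝ exp(λ₁x + λ₂x²) to the first two sample moments by moment-matching gradient ascent; the grid, step size, and iteration count are invented.

    ```python
    import math, random

    random.seed(1)
    data = [random.gauss(0.0, 1.0) for _ in range(5000)]
    m1 = sum(data) / len(data)                # sample mean
    m2 = sum(x * x for x in data) / len(data) # sample second moment

    # Max-ent density with first- and second-moment constraints has the form
    # p(x) ~ exp(l1*x + l2*x^2); fit l1, l2 on a grid by matching moments.
    xs = [-6 + 12 * i / 600 for i in range(601)]
    dx = xs[1] - xs[0]
    l1, l2 = 0.0, -0.5   # initial guess (l2 must stay negative)
    for _ in range(500):
        w = [math.exp(l1 * x + l2 * x * x) for x in xs]
        Z = sum(w) * dx
        p = [wi / Z for wi in w]
        e1 = sum(p[i] * xs[i] for i in range(len(xs))) * dx       # model mean
        e2 = sum(p[i] * xs[i] ** 2 for i in range(len(xs))) * dx  # model 2nd moment
        l1 += 0.1 * (m1 - e1)
        l2 += 0.1 * (m2 - e2)
    print(l1, l2)
    ```

    With only these two constraints the max-ent solution is Gaussian, so l2 converges near -1/(2·variance) ≈ -0.5; unlike bin averaging, the resulting density is smooth everywhere.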

  11. Surface entropy of liquids via a direct Monte Carlo approach - Application to liquid Si

    NASA Technical Reports Server (NTRS)

    Wang, Z. Q.; Stroud, D.

    1990-01-01

    Two methods are presented for a direct Monte Carlo evaluation of the surface entropy S(s) of a liquid interacting by specified, volume-independent potentials. The first method is based on an application of the approach of Ferrenberg and Swendsen (1988, 1989) to Monte Carlo simulations at two different temperatures; it gives much more reliable results for S(s) in liquid Si than previous calculations based on numerical differentiation. The second method expresses the surface entropy directly as a canonical average at fixed temperature.

  12. A comparative study on the mechanical energy of the normal, ACL, osteoarthritis, and Parkinson subjects.

    PubMed

    Bahreinizad, Hossein; Salimi Bani, Milad; Hasani, Mojtaba; Karimi, Mohammad Taghi; Sharifmoradi, Keyvan; Karimi, Alireza

    2017-08-09

    The influence of various musculoskeletal disorders has been evaluated using different kinetic and kinematic parameters, but the efficiency of walking can also be evaluated by measuring the effort of the subject, in other words the energy required to walk. The aim of this study was to identify mechanical energy differences between normal and pathological groups. Four groups of 15 healthy subjects, 13 Parkinson subjects, 4 osteoarthritis subjects, and 4 ACL-reconstructed subjects participated in this study. The motions of the foot, shank and thigh were recorded using a three-dimensional motion analysis system. The kinetic, potential and total mechanical energy of each segment was calculated using 3D marker positions and anthropometric measurements. The maximum value and sample entropy of the energies were compared between the normal and abnormal subjects. The maximum potential energy of the OA subjects was lower than that of the normal subjects. Furthermore, the sample entropy of the mechanical energy of the Parkinson subjects was low in comparison to the normal subjects, while that of the ACL subjects was higher than that of the normal subjects. The findings of this study suggest that subjects with different abilities show different mechanical energy patterns during walking.
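    Sample entropy, the regularity measure used in this study, can be sketched as follows. This is a simple O(n²) variant in which the tolerance r is taken as an absolute value (it is usually scaled by the signal's standard deviation), and the test signals are invented, not the study's gait data.

    ```python
    import math, random

    def sample_entropy(ts, m=2, r=0.2):
        # SampEn(m, r) = -ln(A/B): B counts template pairs of length m and A
        # pairs of length m+1 that match within tolerance r (Chebyshev
        # distance), self-matches excluded. Lower values = more regular signal.
        n = len(ts)
        def matches(mm):
            c = 0
            for i in range(n - mm):
                for j in range(i + 1, n - mm):
                    if max(abs(ts[i + k] - ts[j + k]) for k in range(mm)) <= r:
                        c += 1
            return c
        B, A = matches(m), matches(m + 1)
        return -math.log(A / B) if A > 0 and B > 0 else float("inf")

    random.seed(2)
    periodic = [math.sin(0.5 * i) for i in range(300)]    # regular signal
    noisy = [random.uniform(-1, 1) for _ in range(300)]   # irregular signal
    se_p, se_n = sample_entropy(periodic), sample_entropy(noisy)
    print(se_p, se_n)
    ```

    The regular signal yields a much lower value, mirroring how a more stereotyped energy trace (as reported for the Parkinson group) produces lower sample entropy than a more variable one.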

  13. Multi-GPU maximum entropy image synthesis for radio astronomy

    NASA Astrophysics Data System (ADS)

    Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.

    2018-01-01

    The maximum entropy method (MEM) is a well-known deconvolution technique in radio-interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it has better resolution and better image quality under certain conditions. This work presents a high-performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that a speedup of 1000 to 5000 times over a sequential version can be achieved, depending on the data and image size. This makes it possible to reconstruct the HD142527 CO(6-5) short-baseline data set in 2.1 min, instead of the 2.5 days taken by a sequential version on a CPU.
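    The entropy-regularized objective that MEM minimizes can be illustrated on a toy 1-D deconvolution. Everything below (the PSF, sizes, step size, and weight λ) is invented, and the paper's GPU code works on interferometric visibilities rather than an image-plane convolution; this is only a sketch of the objective's structure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "sky" with two point sources, blurred by a Gaussian PSF plus noise.
    true = np.zeros(64)
    true[20], true[45] = 1.0, 0.6
    psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
    psf /= psf.sum()
    data = np.convolve(true, psf, mode="same") + 0.001 * rng.standard_normal(64)

    # MEM-style objective: 0.5*||conv(img) - data||^2 + lam*sum(img*log(img)),
    # minimized by projected gradient descent with an explicit positivity clip.
    lam = 1e-3
    img = np.full(64, 0.1)
    for _ in range(3000):
        resid = np.convolve(img, psf, mode="same") - data
        grad = np.convolve(resid, psf[::-1], mode="same") + lam * (np.log(img) + 1.0)
        img = np.clip(img - 0.5 * grad, 1e-8, None)
    print(img[20], img[45])
    ```

    The entropy term keeps every pixel strictly positive and penalizes structure not demanded by the data, which is what makes MEM unsupervised compared to CLEAN's user-steered component selection.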

  14. “It sounds like…”: A Natural Language Processing Approach to Detecting Counselor Reflections in Motivational Interviewing

    PubMed Central

    Can, Doğan; Marín, Rebeca A.; Georgiou, Panayiotis G.; Imel, Zac E.; Atkins, David C.; Narayanan, Shrikanth S.

    2016-01-01

    The dissemination and evaluation of evidence-based behavioral treatments for substance abuse problems rely on the evaluation of counselor interventions. In Motivational Interviewing (MI), a treatment that directs the therapist to utilize a particular linguistic style, proficiency is assessed via behavioral coding, a time-consuming, non-technological approach. Natural language processing techniques have the potential to scale up the evaluation of behavioral treatments like MI. We present a novel computational approach to assessing components of MI, focusing on one specific counselor behavior, reflections, which are believed to be a critical MI ingredient. Using 57 sessions from 3 MI clinical trials, we automatically detected counselor reflections in a Maximum Entropy Markov Modeling framework using the raw linguistic data derived from session transcripts. We achieved 93% recall, 90% specificity, and 73% precision. Results provide insight into the linguistic information used by coders to make ratings and demonstrate the feasibility of new computational approaches to scaling up the evaluation of behavioral treatments. PMID:26784286
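
    The paper's model is a Maximum Entropy Markov Model over transcript sequences, which is not reproduced here. As a toy illustration of the underlying maximum-entropy (logistic) classification idea, the following sketch fits a bag-of-words reflection detector on invented example utterances:

```python
import math

def train_maxent(docs, labels, lr=0.5, epochs=200):
    """Binary maximum-entropy classifier: P(y=1 | x) = sigmoid(w.x + b),
    fit by stochastic gradient ascent on the log-likelihood."""
    vocab = sorted({w for d in docs for w in d.split()})
    idx = {w: i for i, w in enumerate(vocab)}

    def features(doc):
        v = [0.0] * len(vocab)
        for w in doc.split():
            if w in idx:
                v[idx[w]] += 1.0
        return v

    w, b = [0.0] * len(vocab), 0.0
    X = [features(d) for d in docs]
    for _ in range(epochs):
        for x, y in zip(X, labels):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-score))
            err = y - p
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err

    def predict(doc):
        x = features(doc)
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

    return predict

# Hypothetical snippets: label 1 = reflection, 0 = other counselor speech.
docs = ["it sounds like you feel stuck",
        "you are saying that it is hard",
        "have you tried cutting down",
        "tell me about your week"]
labels = [1, 1, 0, 0]
predict = train_maxent(docs, labels)
```

    The actual system additionally conditions each label on the previous one (the Markov part) and uses far richer features.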

  15. Planck absolute entropy of a rotating BTZ black hole

    NASA Astrophysics Data System (ADS)

    Riaz, S. M. Jawwad

    2018-04-01

    In this paper, the Planck absolute entropy and the Bekenstein-Smarr formula of the rotating Banados-Teitelboim-Zanelli (BTZ) black hole are presented via a complex thermodynamical system contributed by its inner and outer horizons. The redefined entropy approaches zero as the temperature of the rotating BTZ black hole tends to absolute zero, satisfying the Nernst formulation of a black hole. Hence, it can be regarded as the Planck absolute entropy of the rotating BTZ black hole.

  16. High-Order Entropy Stable Formulations for Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Fisher, Travis C.

    2013-01-01

    A systematic approach is presented for developing entropy stable (SS) formulations of any order for the Navier-Stokes equations. These SS formulations discretely conserve mass, momentum, energy and satisfy a mathematical entropy inequality. They are valid for smooth as well as discontinuous flows provided sufficient dissipation is added at shocks and discontinuities. Entropy stable formulations exist for all diagonal norm, summation-by-parts (SBP) operators, including all centered finite-difference operators, Legendre collocation finite-element operators, and certain finite-volume operators. Examples are presented using various entropy stable formulations that demonstrate the current state-of-the-art of these schemes.

  17. Non-life insurance pricing: multi-agent model

    NASA Astrophysics Data System (ADS)

    Darooneh, A. H.

    2004-11-01

    We use the maximum entropy principle for the pricing of non-life insurance and recover the Bühlmann results for the economic premium principle. The concept of economic equilibrium is revised in this respect.

  18. Maximum entropy principle for stationary states underpinned by stochastic thermodynamics.

    PubMed

    Ford, Ian J

    2015-11-01

    The selection of an equilibrium state by maximizing the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximizing the change, averaged over all realizations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realizations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.

  19. Equivalence of MAXENT and Poisson point process models for species distribution modeling in ecology.

    PubMed

    Renner, Ian W; Warton, David I

    2013-03-01

    Modeling the spatial distribution of a species is a fundamental problem in ecology. A number of modeling methods have been developed, an extremely popular one being MAXENT, a maximum entropy modeling approach. In this article, we show that MAXENT is equivalent to a Poisson regression model and hence is related to a Poisson point process model, differing only in the intercept term, which is scale-dependent in MAXENT. We illustrate a number of improvements to MAXENT that follow from these relations. In particular, a point process model approach facilitates methods for choosing the appropriate spatial resolution, assessing model adequacy, and choosing the LASSO penalty parameter, all currently unavailable to MAXENT. The equivalence result represents a significant step in the unification of the species distribution modeling literature. Copyright © 2013, The International Biometric Society.
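
    The Poisson regression side of this equivalence is easy to sketch. A minimal pure-Python fit of y_i ~ Poisson(exp(b0 + b1*x_i)) by gradient ascent on the log-likelihood, with invented data rather than the article's species records:

```python
import math

def fit_poisson(xs, ys, lr=0.002, epochs=50000):
    """Poisson regression via gradient ascent on the log-likelihood.
    Gradient: sum_i (y_i - mu_i) * (1, x_i) with mu_i = exp(b0 + b1 * x_i)."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu
            g1 += (y - mu) * x
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

# Noiseless counts generated from b0 = 0.2, b1 = 0.8, rounded to integers.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [round(math.exp(0.2 + 0.8 * x)) for x in xs]
b0, b1 = fit_poisson(xs, ys)
```

    Per the equivalence result, a MAXENT fit on the same covariate would recover the same slope b1 up to the scale-dependent intercept.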

  20. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  1. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowlie, Andrew, E-mail: andrew.j.fowlie@googlemail.com

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.

  2. A measure of uncertainty regarding the interval constraint of normal mean elicited by two stages of a prior hierarchy.

    PubMed

    Kim, Hea-Jung

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty, regarding the interval constraint, accounted for by using the HSGM is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  3. Multi-Group Maximum Entropy Model for Translational Non-Equilibrium

    NASA Technical Reports Server (NTRS)

    Jayaraman, Vegnesh; Liu, Yen; Panesi, Marco

    2017-01-01

    The aim of the current work is to describe a new model for flows in translational non-equilibrium. Starting from the statistical description of a gas proposed by Boltzmann, the model relies on a domain decomposition technique in velocity space. Using the maximum entropy principle, the logarithm of the distribution function in each velocity sub-domain (group) is expressed with a power series in molecular velocity. New governing equations are obtained using the method of weighted residuals by taking the velocity moments of the Boltzmann equation. The model is applied to a spatially homogeneous Boltzmann equation with a Bhatnagar-Gross-Krook (BGK) model collision operator, and the relaxation of an initial non-equilibrium distribution to a Maxwellian is studied using the model. In addition, numerical results obtained using the model for a 1D shock tube problem are also reported.

  4. Maximum entropy perception-action space: a Bayesian model of eye movement selection

    NASA Astrophysics Data System (ADS)

    Colas, Francis; Bessière, Pierre; Girard, Benoît

    2011-03-01

    In this article, we investigate the issue of the selection of eye movements in a free-eye Multiple Object Tracking task. We propose a Bayesian model of retinotopic maps with a complex logarithmic mapping. This model is structured in two parts: a representation of the visual scene, and a decision model based on the representation. We compare different decision models based on different features of the representation and we show that taking into account uncertainty helps predict the eye movements of subjects recorded in a psychophysics experiment. Finally, based on experimental data, we postulate that the complex logarithmic mapping has a functional relevance, as the density of objects in this space is more uniform than expected. This may indicate that the representation space and control strategies are such that the object density is of maximum entropy.

  5. Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Pei-Jui, Wu; Hwa-Lung, Yu

    2016-04-01

    Heavy rainfall from typhoons is the main cause of natural disasters in Taiwan, resulting in significant losses of human life and property. On average, 3.5 typhoons strike Taiwan every year; typhoon Morakot in 2009 was among the most severe on record. Because the duration, path, and intensity of a typhoon affect the temporal and spatial rainfall pattern in a specific region, identifying the characteristics of typhoon rainfall types is advantageous when estimating rainfall quantities. This study developed a rainfall prediction model in three parts. First, we use the EEOF (extended empirical orthogonal function) method to classify typhoon events, decomposing the standard rainfall pattern of all stations for each event into EOFs and PCs (principal components), so that events that vary similarly in time and space are grouped into the same typhoon type. Next, based on this classification, we construct the PDF (probability density function) at different locations and times by means of the multivariate maximum entropy method applied to the first through fourth statistical moments, yielding a probability for each station at each time. Finally, we use BME (the Bayesian Maximum Entropy method) to construct the typhoon rainfall prediction model and estimate rainfall for the case of the GaoPing River in southern Taiwan. This study could be useful for future typhoon rainfall predictions and for government typhoon disaster prevention.
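
    The core maximum-entropy step, constructing a distribution from moment constraints, can be sketched for a single moment. With only a mean constraint, the maximum-entropy distribution over discrete values is p_i proportional to exp(lam * x_i), with lam chosen by bisection to hit the target mean (the study uses four moments and multivariate PDFs, which this toy omits):

```python
import math

def maxent_with_mean(values, target_mean):
    """Discrete maximum-entropy distribution over `values` with a given mean:
    p_i proportional to exp(lam * x_i), lam found by bisection (the mean is
    monotonically increasing in lam)."""
    def mean_for(lam):
        ws = [math.exp(lam * v) for v in values]
        z = sum(ws)
        return sum(w * v for w, v in zip(ws, values)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(lam * v) for v in values]
    z = sum(ws)
    return [w / z for w in ws]

# With the target mean at the midpoint of the support, maxent gives uniform.
p = maxent_with_mean([0, 1, 2, 3], 1.5)
```

    Adding higher moments generalizes the exponent to a polynomial in x, which is the form the study fits.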

  6. Normal Mode Analysis in Zeolites: Toward an Efficient Calculation of Adsorption Entropies.

    PubMed

    De Moor, Bart A; Ghysels, An; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B

    2011-04-12

    An efficient procedure for normal-mode analysis of extended systems, such as zeolites, is developed and illustrated for the physisorption and chemisorption of n-octane and isobutene in H-ZSM-22 and H-FAU using periodic DFT calculations employing the Vienna Ab Initio Simulation Package. Physisorption and chemisorption entropies resulting from partial Hessian vibrational analysis (PHVA) differ at most 10 J mol(-1) K(-1) from those resulting from full Hessian vibrational analysis, even for PHVA schemes in which only a very limited number of atoms are considered free. To acquire a well-conditioned Hessian, much tighter optimization criteria than commonly used for electronic energy calculations in zeolites are required, i.e., at least an energy cutoff of 400 eV, maximum force of 0.02 eV/Å, and self-consistent field loop convergence criteria of 10(-8) eV. For loosely bonded complexes the mobile adsorbate method is applied, in which frequency contributions originating from translational or rotational motions of the adsorbate are removed from the total partition function and replaced by free translational and/or rotational contributions. The frequencies corresponding with these translational and rotational modes can be selected unambiguously based on a mobile block Hessian-PHVA calculation, allowing the prediction of physisorption entropies within an accuracy of 10-15 J mol(-1) K(-1) as compared to experimental values. The approach presented in this study is useful for studies on other extended catalytic systems.
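
    The vibrational entropies discussed here follow from the harmonic-oscillator partition function: each mode of wavenumber v contributes S/R = theta/(e^theta - 1) - ln(1 - e^(-theta)) with theta = h*c*v/(k*T). A short sketch with illustrative frequencies, not values from the paper:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
K = 1.380649e-23     # Boltzmann constant, J/K
C = 2.99792458e10    # speed of light, cm/s
R = 8.31446          # gas constant, J/(mol K)

def vibrational_entropy(wavenumbers_cm, T=300.0):
    """Harmonic-oscillator vibrational entropy in J/(mol K).
    Per mode: S/R = theta/(e^theta - 1) - ln(1 - e^-theta),
    theta = h c v / (k T) for wavenumber v in cm^-1."""
    s = 0.0
    for v in wavenumbers_cm:
        theta = H * C * v / (K * T)
        s += theta / math.expm1(theta) - math.log1p(-math.exp(-theta))
    return R * s
```

    Low-frequency modes dominate the total, which is why treating soft adsorbate modes as free translations or rotations (the mobile adsorbate method) changes the computed entropy so strongly.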

  7. Activity-Based Approach for Teaching Aqueous Solubility, Energy, and Entropy

    ERIC Educational Resources Information Center

    Eisen, Laura; Marano, Nadia; Glazier, Samantha

    2014-01-01

    We describe an activity-based approach for teaching aqueous solubility to introductory chemistry students that provides a more balanced presentation of the roles of energy and entropy in dissolution than is found in most general chemistry textbooks. In the first few activities, students observe that polar substances dissolve in water, whereas…

  8. An entropy-based analysis of lane changing behavior: An interactive approach.

    PubMed

    Kosun, Caglar; Ozdemir, Serhan

    2017-05-19

    As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs the lane changing behavior in traffic flow in accordance with the long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach for the lane changing behavior of the drivers is presented in the traffic flow scenarios presented in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy; the rest are involved in the additive entropy domain. Driving behaviors are extracted and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. 
It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would match discretionary lane changing behavior. This article states that driver behaviors would be in the nonadditive entropy domain to provide a safe traffic stream and hence with vehicle accident prevention in mind.
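
    The nonadditive entropy invoked here is the Tsallis form S_q = (1 - sum_i p_i^q)/(q - 1), which recovers the Shannon entropy as q approaches 1 and obeys the pseudo-additivity rule S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B) for independent subsystems. A short numerical check of both properties:

```python
import math

def tsallis(p, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon(p):
    """Additive (Shannon) entropy, the q -> 1 limit of S_q."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

pa = [0.5, 0.5]
pb = [0.25, 0.25, 0.25, 0.25]
pab = [a * b for a in pa for b in pb]   # independent joint distribution
q = 1.5

# Pseudo-additivity: S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
lhs = tsallis(pab, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
```

    The cross term (1-q) S_q(A) S_q(B) is exactly what the article means by nonadditivity: it vanishes only in the additive (q = 1, Shannon) limit.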

  9. [Evaluation of a simplified index (spectral entropy) about sleep state of electrocardiogram recorded by a simplified polygraph, MemCalc-Makin2].

    PubMed

    Ohisa, Noriko; Ogawa, Hiromasa; Murayama, Nobuki; Yoshida, Katsumi

    2010-02-01

    Polysomnography (PSG) is the gold standard for the diagnosis of sleep apnea hypopnea syndrome (SAHS), but analyzing the PSG takes time, and PSG cannot be performed repeatedly because of the effort and cost involved. Therefore, simplified sleep respiratory disorder indices that reflect the PSG results are needed. The Memcalc method, a combination of the maximum entropy method for spectral analysis and the non-linear least squares method for fitting analysis (Makin2, Suwa Trust, Tokyo, Japan), has recently been developed. Spectral entropy derived by the Memcalc method might be useful for expressing the trend of time-series behavior. Spectral entropy of the ECG calculated with the Memcalc method was evaluated by comparison with PSG results. ECGs of obstructive SAHS (OSAHS) patients (n = 79) and control volunteers (n = 7) were recorded using MemCalc-Makin2 (GMS) together with PSG recording using Alice IV (Respironics) from 20:00 to 6:00. Spectral entropy of the ECG, calculated every 2 seconds using the Memcalc method, was compared to sleep stages analyzed manually from the PSG recordings. Spectral entropy values were significantly increased in the OSAHS group compared to the controls (-0.473 vs. -0.418, p < 0.05). For an entropy cutoff level of -0.423, sensitivity and specificity for OSAHS were 86.1% and 71.4%, respectively, resulting in a receiver operating characteristic curve with an area under the curve of 0.837. The absolute value of entropy was inversely correlated with stage 3 sleep. Spectral entropy calculated with the Memcalc method might thus be a useful index for evaluating the quality of sleep.
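
    Spectral entropy itself is simply the Shannon entropy of the normalized power spectrum. A toy sketch using a direct O(n^2) DFT; the Memcalc/MEM spectral estimation pipeline of the paper is not reproduced:

```python
import cmath
import math
import random

def spectral_entropy(x):
    """Shannon entropy of the normalized DFT power spectrum."""
    n = len(x)
    power = []
    for k in range(n):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(s) ** 2)
    total = sum(power)
    p = [v / total for v in power]
    return -sum(pi * math.log(pi) for pi in p if pi > 1e-15)

n = 64
sine = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]   # narrowband
random.seed(1)
noise = [random.random() - 0.5 for _ in range(n)]              # broadband
```

    A narrowband (regular) signal concentrates its spectrum and yields low spectral entropy, while broadband activity yields high spectral entropy, which is the behavior the index exploits.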

  10. Identifying Student Resources in Reasoning about Entropy and the Approach to Thermal Equilibrium

    ERIC Educational Resources Information Center

    Loverude, Michael

    2015-01-01

    As part of an ongoing project to examine student learning in upper-division courses in thermal and statistical physics, we have examined student reasoning about entropy and the second law of thermodynamics. We have examined reasoning in terms of heat transfer, entropy maximization, and statistical treatments of multiplicity and probability. In…

  11. Magnetocaloric effect in potassium doped lanthanum manganite perovskites prepared by a pyrophoric method

    NASA Astrophysics Data System (ADS)

    Das, Soma; Dey, T. K.

    2006-08-01

    The magnetocaloric effect (MCE) in fine grained perovskite manganites of the type La1-xKxMnO3 (0

  12. Entrofy: Participant Selection Made Easy

    NASA Astrophysics Data System (ADS)

    Huppenkothen, Daniela

    2016-03-01

    Selecting participants for a workshop out of a much larger applicant pool can be a difficult task, especially when the goal is to diversify over a range of criteria (e.g., academic seniority, research field, skill level, gender). In this talk I present our tool, Entrofy, aimed at aiding organizers in this task. Entrofy is an open-source tool using a maximum entropy-based algorithm that selects a set of participants out of the applicant pool such that a pre-defined range of criteria is globally maximized. This approach allows for a potentially more transparent and less biased selection process while encouraging organizers to think deeply about the goals and the process of their participant selection.
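
    The idea can be sketched with a greedy loop that, at each step, admits the candidate whose attribute most increases the entropy of the selected cohort's attribute mix. This is an illustrative simplification over a single attribute, not Entrofy's actual algorithm or API:

```python
import math
from collections import Counter

def attribute_entropy(attrs):
    """Shannon entropy of the attribute distribution in a selected set."""
    counts = Counter(attrs)
    n = len(attrs)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def greedy_select(pool, k):
    """Greedily add the candidate whose attribute most increases entropy."""
    selected, remaining = [], list(pool)
    for _ in range(k):
        best = max(remaining, key=lambda c: attribute_entropy(selected + [c[1]]))
        selected.append(best[1])
        remaining.remove(best)
    return selected

# Hypothetical pool: (name, seniority) pairs, 5 senior and 5 junior.
pool = [("p%d" % i, "senior" if i < 5 else "junior") for i in range(10)]
cohort = greedy_select(pool, 4)
```

    On this balanced pool, greedy entropy maximization returns an evenly mixed cohort; the real tool handles many attributes with per-category target fractions.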

  13. Entropy of gaseous boron monobromide

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Feng; Peng, Xiao-Long; Zhang, Lie-Hui; Wang, Chao-Wen; Jia, Chun-Sheng

    2017-10-01

    We present an explicit representation of the molar entropy of gaseous boron monobromide in terms of experimental values of only three molecular constants. Through comparison of theoretically calculated results and experimental data, we find that the molar entropy of gaseous boron monobromide can be well predicted by employing the improved Manning-Rosen oscillator to describe the internal vibration of the boron monobromide molecule. The present approach also provides opportunities for theoretical predictions of the molar entropy of other gases without the use of large amounts of experimental spectroscopic data.

  14. 1/ f noise from the laws of thermodynamics for finite-size fluctuations.

    PubMed

    Chamberlin, Ralph V; Nasir, Derek M

    2014-07-01

    Computer simulations of the Ising model exhibit white noise if thermal fluctuations are governed by Boltzmann's factor alone; whereas we find that the same model exhibits 1/f noise if Boltzmann's factor is extended to include local alignment entropy to all orders. We show that this nonlinear correction maintains maximum entropy during equilibrium fluctuations. Indeed, as with the usual way to resolve Gibbs' paradox that avoids entropy reduction during reversible processes, the correction yields the statistics of indistinguishable particles. The correction also ensures conservation of energy if an instantaneous contribution from local entropy is included. Thus, a common mechanism for 1/f noise comes from assuming that finite-size fluctuations strictly obey the laws of thermodynamics, even in small parts of a large system. Empirical evidence for the model comes from its ability to match the measured temperature dependence of the spectral-density exponents in several metals and to show non-Gaussian fluctuations characteristic of nanoscale systems.

  15. Entropy in an expanding universe.

    PubMed

    Frautschi, S

    1982-08-13

    The question of how the observed evolution of organized structures from initial chaos in the expanding universe can be reconciled with the laws of statistical mechanics is studied, with emphasis on effects of the expansion and gravity. Some major sources of entropy increase are listed. An expanding "causal" region is defined in which the entropy, though increasing, tends to fall further and further behind its maximum possible value, thus allowing for the development of order. The related questions of whether entropy will continue increasing without limit in the future, and whether such increase in the form of Hawking radiation or radiation from positronium might enable life to maintain itself permanently, are considered. Attempts to find a scheme for preserving life based on solid structures fail because events such as quantum tunneling recurrently disorganize matter on a very long but fixed time scale, whereas all energy sources slow down progressively in an expanding universe. However, there remains hope that other modes of life capable of maintaining themselves permanently can be found.

  16. Quantum and Ecosystem Entropies

    NASA Astrophysics Data System (ADS)

    Kirwan, A. D.

    2008-06-01

    Ecosystems and quantum gases share a number of superficial similarities including enormous numbers of interacting elements and the fundamental role of energy in such interactions. A theory for the synthesis of data and prediction of new phenomena is well established in quantum statistical mechanics. The premise of this paper is that the reason a comparable unifying theory has not emerged in ecology is that a proper role for entropy has yet to be assigned. To this end, a phase space entropy model of ecosystems is developed. Specification of an ecosystem phase space cell size based on microbial mass, length, and time scales gives an ecosystem uncertainty parameter only about three orders of magnitude larger than Planck's constant. Ecosystem equilibria are specified by conservation of biomass and total metabolic energy, along with the principle of maximum entropy at equilibrium. Both Bose-Einstein and Fermi-Dirac equilibrium conditions arise in ecosystem applications. The paper concludes with a discussion of some broader aspects of an ecosystem phase space.

  17. Application of digital image processing techniques to astronomical imagery 1980

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1981-01-01

    Topics include: (1) polar coordinate transformations (M83); (2) multispectral ratios (M82); (3) maximum entropy restoration (M87); (4) automated computation of stellar magnitudes in nebulosity; (5) color and polarization; (6) aliasing.

  18. Nonextensivity in a Dark Maximum Entropy Landscape

    NASA Astrophysics Data System (ADS)

    Leubner, M. P.

    2011-03-01

    Nonextensive statistics along with network science, an emerging branch of graph theory, are increasingly recognized as potential interdisciplinary frameworks whenever systems are subject to long-range interactions and memory. Such settings are characterized by non-local interactions evolving in a non-Euclidean fractal/multi-fractal space-time making their behavior nonextensive. After summarizing the theoretical foundations from first principles, along with a discussion of entropy bifurcation and duality in nonextensive systems, we focus on selected significant astrophysical consequences. Those include the gravitational equilibria of dark matter (DM) and hot gas in clustered structures, the dark energy(DE) negative pressure landscape governed by the highest degree of mutual correlations and the hierarchy of discrete cosmic structure scales, available upon extremizing the generalized nonextensive link entropy in a homogeneous growing network.

  19. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are extended to the fractional cases: weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in these two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. Effectiveness analysis of the proposed methods on synthetic and real-world data reveals that tuning the fractional order allows higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, nonlinear complexity behaviors are compared between the return series of the Potts financial model and actual stock markets, and the empirical results confirm the feasibility of the proposed model.
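
    Plain (integer-order, unweighted) permutation entropy, on which these fractional generalizations build, is straightforward to compute: count the ordinal patterns of length m and take the normalized Shannon entropy of their frequencies. A minimal sketch; the weighted and fractional variants of the paper are not reproduced:

```python
import math
import random
from collections import Counter

def permutation_entropy(x, m=3):
    """Shannon entropy of ordinal patterns of length m, normalized to [0, 1]
    by ln(m!)."""
    patterns = Counter()
    for i in range(len(x) - m + 1):
        window = x[i:i + m]
        patterns[tuple(sorted(range(m), key=lambda k: window[k]))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))

trend = list(range(50))                        # monotone: a single pattern
random.seed(2)
noise = [random.random() for _ in range(50)]   # noise: many patterns
```

    A monotone series produces a single ordinal pattern (entropy 0), while noise populates all m! patterns (entropy near 1), which is the contrast the financial analyses exploit.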

  20. Entropy, Ergodicity, and Stem Cell Multipotency

    NASA Astrophysics Data System (ADS)

    Ridden, Sonya J.; Chang, Hannah H.; Zygalakis, Konstantinos C.; MacArthur, Ben D.

    2015-11-01

    Populations of mammalian stem cells commonly exhibit considerable cell-cell variability. However, the functional role of this diversity is unclear. Here, we analyze expression fluctuations of the stem cell surface marker Sca1 in mouse hematopoietic progenitor cells using a simple stochastic model and find that the observed dynamics naturally lie close to a critical state, thereby producing a diverse population that is able to respond rapidly to environmental changes. We propose an information-theoretic interpretation of these results that views cellular multipotency as an instance of maximum entropy statistical inference.

  1. Third law of thermodynamics in the presence of a heat flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camacho, J.

    1995-01-01

    Following a maximum entropy formalism, we study a one-dimensional crystal under a heat flux. We obtain the phonon distribution function and evaluate the nonequilibrium temperature, the specific heat, and the entropy as functions of the internal energy and the heat flux, in both the quantum and the classical limits. Some analogies between the behavior of equilibrium systems at low absolute temperature and nonequilibrium steady states under high values of the heat flux are shown, which point to a possible generalization of the third law in nonequilibrium situations.

  2. The existence of negative absolute temperatures in Axelrod’s social influence model

    NASA Astrophysics Data System (ADS)

    Villegas-Febres, J. C.; Olivares-Rivas, W.

    2008-06-01

    We introduce the concept of temperature as an order parameter in the standard Axelrod's social influence model. It is defined as the relation between suitably defined entropy and energy functions, T = (∂S/∂E)(-1). We show that at the critical point, where the order/disorder transition occurs, this absolute temperature changes sign. At this point, which corresponds to the homogeneous/heterogeneous culture transition, the entropy of the system shows a maximum. We discuss the relationship between the temperature and other properties of the model in terms of cultural traits.

  3. Application of a Real-Time, Calculable Limiting Form of the Renyi Entropy for Molecular Imaging of Tumors

    PubMed Central

    Marsh, J. N.; Wallace, K. D.; McCarthy, J. E.; Wickerhauser, M. V.; Maurizi, B. N.; Lanza, G. M.; Wickline, S. A.; Hughes, M. S.

    2011-01-01

    Previously, we reported new methods for ultrasound signal characterization using entropy, Hf; a generalized entropy, the Renyi entropy, If(r); and a limiting form of Renyi entropy suitable for real-time calculation, If,∞. All of these quantities demonstrated significantly more sensitivity to subtle changes in scattering architecture than energy-based methods in certain settings. In this study, the real-time calculable limit of the Renyi entropy, If,∞, is applied for the imaging of angiogenic murine neovasculature in a breast cancer xenograft using a targeted contrast agent. It is shown that this approach may be used to detect reliably the accumulation of targeted nanoparticles at five minutes post-injection in this in vivo model. PMID:20679020

  4. A synergistic approach to protein crystallization: Combination of a fixed-arm carrier with surface entropy reduction

    PubMed Central

    Moon, Andrea F; Mueller, Geoffrey A; Zhong, Xuejun; Pedersen, Lars C

    2010-01-01

    Protein crystallographers are often confronted with recalcitrant proteins not readily crystallizable, or which crystallize in problematic forms. A variety of techniques have been used to surmount such obstacles: crystallization using carrier proteins or antibody complexes, chemical modification, surface entropy reduction, proteolytic digestion, and additive screening. Here we present a synergistic approach for successful crystallization of proteins that do not form diffraction quality crystals using conventional methods. This approach combines favorable aspects of carrier-driven crystallization with surface entropy reduction. We have generated a series of maltose binding protein (MBP) fusion constructs containing different surface mutations designed to reduce surface entropy and encourage crystal lattice formation. The MBP advantageously increases protein expression and solubility, and provides a streamlined purification protocol. Using this technique, we have successfully solved the structures of three unrelated proteins that were previously unattainable. This crystallization technique represents a valuable rescue strategy for protein structure solution when conventional methods fail. PMID:20196072

  5. Entropy of spatial network ensembles

    NASA Astrophysics Data System (ADS)

    Coon, Justin P.; Dettmann, Carl P.; Georgiou, Orestis

    2018-04-01

    We analyze complexity in spatial network ensembles through the lens of graph entropy. Mathematically, we model a spatial network as a soft random geometric graph, i.e., a graph with two sources of randomness, namely nodes located randomly in space and links formed independently between pairs of nodes with probability given by a specified function (the "pair connection function") of their mutual distance. We consider the general case where randomness arises in node positions as well as pairwise connections (i.e., for a given pair distance, the corresponding edge state is a random variable). Classical random geometric graph and exponential graph models can be recovered in certain limits. We derive a simple bound for the entropy of a spatial network ensemble and calculate the conditional entropy of an ensemble given the node location distribution for hard and soft (probabilistic) pair connection functions. Under this formalism, we derive the connection function that yields maximum entropy under general constraints. Finally, we apply our analytical framework to study two practical examples: ad hoc wireless networks and the US flight network. Through the study of these examples, we illustrate that both exhibit properties that are indicative of nearly maximally entropic ensembles.
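
    Because edges in such an ensemble are conditionally independent given the node positions, the conditional entropy of the graph reduces to a sum of binary entropies of the pair connection function over all node pairs. A minimal sketch of that computation (the exponential-decay connection function and the point set below are illustrative assumptions, not taken from the paper):

```python
import math

def binary_entropy(p):
    """Shannon entropy (nats) of a Bernoulli(p) edge state."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def conditional_graph_entropy(points, connect_prob):
    """Entropy of the edge ensemble given fixed node positions.

    Edges are independent Bernoulli variables, so the conditional
    entropy is the sum of binary entropies over all node pairs.
    """
    n = len(points)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            total += binary_entropy(connect_prob(d))
    return total

# Soft (probabilistic) connection function -- an illustrative choice.
soft = lambda d: math.exp(-d * d)
# Hard unit-disk connection function: deterministic edges, zero entropy.
hard = lambda d: 1.0 if d < 1.0 else 0.0

pts = [(0.0, 0.0), (0.5, 0.0), (0.0, 2.0)]
h_soft = conditional_graph_entropy(pts, soft)
h_hard = conditional_graph_entropy(pts, hard)
```

    The hard connection function makes every edge deterministic once positions are fixed, so its conditional entropy vanishes; only soft connection functions contribute edge-state uncertainty.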

  6. The Dynameomics Entropy Dictionary: A Large-Scale Assessment of Conformational Entropy across Protein Fold Space.

    PubMed

    Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie

    2017-04-27

    Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.

  7. Demystifying Introductory Chemistry. Part 4: An Approach to Reaction Thermodynamics through Enthalpies, Entropies, and Free Energies of Atomization.

    ERIC Educational Resources Information Center

    Spencer, James N.; And Others

    1996-01-01

    Presents an alternative approach to teaching reaction thermodynamics in introductory chemistry courses using calculations of enthalpies, entropies, and free energies of atomization. Uses a consistent concept, that of decomposition of a compound to its gaseous atoms, to discuss not only thermodynamic parameters but also equilibrium and…

  8. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin.

    PubMed

    Hacisuleyman, Aysima; Erman, Burak

    2017-01-01

    It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other via entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota to evaluate entropy transfer between all pairs of residues of Ubiquitin and to quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach agree with the results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the concepts required to explain allosteric communication in proteins.
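
    Schreiber's transfer entropy, on which the above formulation builds, can be estimated directly from counts for discrete series. A toy sketch with history length 1 and a hypothetical driver-follower pair (this is a plug-in estimator for illustration, not the residue-fluctuation pipeline of the paper):

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of Schreiber transfer entropy T_{Y->X} (bits),
    history length 1:  sum p(x',x,y) * log2[ p(x'|x,y) / p(x'|x) ]."""
    triples, pairs_xy, pairs_xx, singles = Counter(), Counter(), Counter(), Counter()
    n = len(x) - 1
    for t in range(n):
        triples[(x[t + 1], x[t], y[t])] += 1
        pairs_xy[(x[t], y[t])] += 1
        pairs_xx[(x[t + 1], x[t])] += 1
        singles[x[t]] += 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]              # p(x'|x,y)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]     # p(x'|x)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# Toy coupled system: x copies y with one step of delay, so information
# flows Y -> X but not X -> Y.
rng = random.Random(0)
y = [rng.randint(0, 1) for _ in range(2000)]
x = [0] + y[:-1]
te_y_to_x = transfer_entropy(x, y)   # close to 1 bit
te_x_to_y = transfer_entropy(y, x)   # close to 0
```

    The asymmetry te_y_to_x >> te_x_to_y is what identifies which series drives the other, the same directionality argument the abstract makes for interacting residues.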

  9. Quench action and Rényi entropies in integrable systems

    NASA Astrophysics Data System (ADS)

    Alba, Vincenzo; Calabrese, Pasquale

    2017-09-01

    Entropy is a fundamental concept in equilibrium statistical mechanics, yet its origin in the nonequilibrium dynamics of isolated quantum systems is not fully understood. A strong consensus is emerging around the idea that the stationary thermodynamic entropy is the von Neumann entanglement entropy of a large subsystem embedded in an infinite system. Also motivated by cold-atom experiments, here we consider the generalization to Rényi entropies. We develop a new technique to calculate the diagonal Rényi entropy in the quench action formalism. In the spirit of the replica treatment for the entanglement entropy, the diagonal Rényi entropies are generalized free energies evaluated over a thermodynamic macrostate which depends on the Rényi index and, in particular, is not the same state describing von Neumann entropy. The technical reason for this perhaps surprising result is that the evaluation of the moments of the diagonal density matrix shifts the saddle point of the quench action. An interesting consequence is that different Rényi entropies encode information about different regions of the spectrum of the postquench Hamiltonian. Our approach provides a very simple proof of the long-standing issue that, for integrable systems, the diagonal entropy is half of the thermodynamic one and it allows us to generalize this result to the case of arbitrary Rényi entropy.

  10. Enthalpy-entropy compensation for the solubility of drugs in solvent mixtures: paracetamol, acetanilide, and nalidixic acid in dioxane-water.

    PubMed

    Bustamante, P; Romero, S; Pena, A; Escalera, B; Reillo, A

    1998-12-01

    In earlier work, a nonlinear enthalpy-entropy compensation was observed for the solubility of phenacetin in dioxane-water mixtures. This effect had not previously been reported for the solubility of drugs in solvent mixtures. To gain insight into the compensation effect, the behavior of the apparent thermodynamic magnitudes for the solubility of paracetamol, acetanilide, and nalidixic acid is studied in this work. The solubility of these drugs was measured at several temperatures in dioxane-water mixtures. DSC analysis was performed on the original powders and on the solid phases after equilibration with the solvent mixture. The thermal properties of the solid phases did not show significant changes. The three drugs display a solubility maximum against the cosolvent ratio. The solubility peaks of acetanilide and nalidixic acid shift to a more polar region at higher temperatures. Nonlinear van't Hoff plots were observed for nalidixic acid, whereas acetanilide and paracetamol show linear behavior over the temperature range studied. The apparent enthalpies of solution are endothermic, passing through a maximum at 50% dioxane. Two different mechanisms, entropy and enthalpy, are suggested to be the driving forces that increase the solubility of the three drugs. Solubility is entropy controlled in the water-rich region (0-50% dioxane) and enthalpy controlled in the dioxane-rich region (50-100% dioxane). The enthalpy-entropy compensation analysis also suggests that two different mechanisms, dependent on cosolvent ratio, are involved in the solubility enhancement of the three drugs. The plots of ΔH versus ΔG are nonlinear, and the slope changes from positive to negative above 50% dioxane. The compensation effect for the thermodynamic magnitudes of transfer from water to the aqueous mixtures can be described by a common empirical nonlinear relationship, with the exception of paracetamol, which follows a separate linear relationship at dioxane ratios above 50%. The results corroborate earlier findings with phenacetin. The similar pattern shown by the drugs studied suggests that the nonlinear enthalpy-entropy compensation effect may be characteristic of the solubility of semipolar drugs in dioxane-water mixtures.

  11. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    NASA Astrophysics Data System (ADS)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
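
    The maximum entropy step itself, recovering a PDF from a handful of moment constraints, can be sketched with a plain dual-ascent solver on a grid. This is not the authors' RQ-SPM or fractional-moment machinery; the single integer-moment constraint below is an illustrative assumption (fractional moments would simply use `lambda x: x**0.5`-style constraint functions):

```python
import numpy as np

def maxent_density(grid, moment_fns, targets, lr=0.05, steps=5000):
    """Maximum-entropy density on a discrete grid subject to moment
    constraints E[f_k(x)] = targets[k].  The solution has the Gibbs form
    p(x) ∝ exp(-sum_k lam_k f_k(x)); the multipliers lam are found by
    gradient steps on the (convex) dual problem."""
    F = np.array([f(grid) for f in moment_fns])      # shape (K, n)
    lam = np.zeros(len(moment_fns))
    dx = grid[1] - grid[0]
    targets = np.asarray(targets, float)
    for _ in range(steps):
        w = np.exp(-lam @ F)
        p = w / (w.sum() * dx)                       # normalized PDF
        moments = (F * p).sum(axis=1) * dx
        lam += lr * (moments - targets)              # dual gradient step
    return p, lam

# One constraint E[x] = 2 on [0, 30]: the maximum entropy density is the
# exponential distribution with rate 1/2, so lam should come out near 0.5.
grid = np.linspace(0.0, 30.0, 3001)
p, lam = maxent_density(grid, [lambda x: x], [2.0])
```

    With the multipliers in hand, the failure probability in the paper's setting would then be a simple integral of `p` over the failure domain of the performance function.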

  12. Application of the maximum entropy principle to determine ensembles of intrinsically disordered proteins from residual dipolar couplings.

    PubMed

    Sanchez-Martinez, M; Crehuet, R

    2014-12-21

    We present a method based on the maximum entropy principle that can re-weight an ensemble of protein structures based on data from residual dipolar couplings (RDCs). The RDCs of intrinsically disordered proteins (IDPs) provide information on the secondary structure elements present in an ensemble; however, even two sets of RDCs are not enough to fully determine the distribution of conformations, and the force field used to generate the structures has a pervasive influence on the refined ensemble. Two physics-based coarse-grained force fields, Profasi and Campari, are able to predict the secondary structure elements present in an IDP, but even after including the RDC data, the re-weighted ensembles differ between the two force fields. This spread of IDP ensembles highlights the need for better force fields. We distribute our algorithm as open-source Python code.
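
    The re-weighting idea can be illustrated with a single averaged observable: keep the ensemble as close as possible (in relative entropy) to its prior weights while matching the measured average, which yields exponential weights with one Lagrange multiplier. A stand-alone sketch (the per-conformer observable values and target are hypothetical, not RDC data, and real use involves many coupled constraints):

```python
import numpy as np

def reweight_ensemble(prior, obs, target, tol=1e-12):
    """Maximum (relative) entropy re-weighting: find weights
    w_i ∝ prior_i * exp(-lam * f_i) whose weighted average of the
    observable f equals `target`, via bisection on the multiplier lam.
    The average is monotone decreasing in lam, so bisection applies."""
    f = np.asarray(obs, float)
    p0 = np.asarray(prior, float)
    p0 = p0 / p0.sum()

    def weights(lam):
        w = p0 * np.exp(-lam * (f - f.mean()))   # centering for stability
        return w / w.sum()

    lo, hi = -200.0, 200.0                       # bracket for lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (weights(mid) * f).sum() > target:
            lo = mid                             # average too high -> raise lam
        else:
            hi = mid
    return weights(0.5 * (lo + hi))

# Four hypothetical conformers with back-calculated observable values;
# re-weight so the ensemble average drops from 2.5 to the "measured" 2.0.
w = reweight_ensemble([0.25, 0.25, 0.25, 0.25], [1.0, 2.0, 3.0, 4.0], 2.0)
```

    Because the correction is exponential in the observable, conformers that over-predict the target are down-weighted smoothly rather than discarded, which is what keeps the refined ensemble as close as possible to the force-field prior.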

  13. Energy and maximum norm estimates for nonlinear conservation laws

    NASA Technical Reports Server (NTRS)

    Olsson, Pelle; Oliger, Joseph

    1994-01-01

    We have devised a technique that makes it possible to obtain energy estimates for initial-boundary value problems for nonlinear conservation laws. The two major tools to achieve the energy estimates are a certain splitting of the flux vector derivative f(u)(sub x), and a structural hypothesis, referred to as a cone condition, on the flux vector f(u). These hypotheses are fulfilled for many equations that occur in practice, such as the Euler equations of gas dynamics. It should be noted that the energy estimates are obtained without any assumptions on the gradient of the solution u. The results extend to weak solutions that are obtained as pointwise limits of vanishing viscosity solutions. As a byproduct we obtain explicit expressions for the entropy function and the entropy flux of symmetrizable systems of conservation laws. Under certain circumstances the proposed technique can be applied repeatedly so as to yield estimates in the maximum norm.

  14. Test images for the maximum entropy image restoration method

    NASA Technical Reports Server (NTRS)

    Mackey, James E.

    1990-01-01

    One of the major activities of any experimentalist is data analysis and reduction. In solar physics, remote observations are made of the sun in a variety of wavelengths and circumstances. In no case is the data collected free from the influence of the design and operation of the data gathering instrument as well as the ever present problem of noise. The presence of significant noise invalidates the simple inversion procedure regardless of the range of known correlation functions. The Maximum Entropy Method (MEM) attempts to perform this inversion by making minimal assumptions about the data. To provide a means of testing the MEM and characterizing its sensitivity to noise, choice of point spread function, type of data, etc., one would like to have test images of known characteristics that can represent the type of data being analyzed. A means of reconstructing these images is presented.

  15. Two dissimilar approaches to dynamical systems on hyper MV-algebras and their information entropy

    NASA Astrophysics Data System (ADS)

    Mehrpooya, Adel; Ebrahimi, Mohammad; Davvaz, Bijan

    2017-09-01

    Measuring the flow of information associated with the evolution of a system modeled by a mathematical structure is of central significance for science, and often for mathematics itself. In this regard, a major issue concerning hyperstructures is their dynamics and the complexity of the varied possible dynamics that exist over them. Notably, the dynamics and uncertainty of hyper MV-algebras, hyperstructures extending a central tool of infinite-valued Lukasiewicz propositional calculus that models many-valued logics, are of primary concern. To tackle this problem, in this paper we focus on dynamical systems on hyper MV-algebras and their entropy. We adopt two distinct approaches. One is a set-based approach, in which hyper MV-algebra dynamical systems are developed using set functions and set partitions. In the other, based on points and point partitions, we establish the concept of hyper injective dynamical systems on hyper MV-algebras. Next, we study the notion of entropy for both kinds of systems. Furthermore, we consider essential ergodic characteristics of these systems and their entropy. In particular, we introduce the concept of isomorphic hyper injective and hyper MV-algebra dynamical systems, and we demonstrate that isomorphic systems have the same entropy. We present several theorems that help calculate entropy; in particular, we prove contemporary versions of the addition and Kolmogorov-Sinai theorems. Furthermore, we compare the essential properties of hyper injective and semi-independent dynamical systems, presenting and proving theorems that compare the entropies of such systems. Lastly, we discuss possible relationships between the theories of hyper MV-algebra and MV-algebra dynamical systems.

  16. Increased resting-state brain entropy in Alzheimer's disease.

    PubMed

    Xue, Shao-Wei; Guo, Yonghu

    2018-03-07

    Entropy analysis of resting-state functional MRI (R-fMRI) is a novel approach to characterize brain temporal dynamics and facilitates the identification of abnormal brain activity caused by several disease conditions. However, Alzheimer's disease (AD)-related brain entropy mapping based on R-fMRI has not been assessed. Here, we measured the sample entropy and voxel-wise connectivity of the network degree centrality (DC) of the intrinsic brain activity acquired by R-fMRI in 26 patients with AD and 26 healthy controls. Compared with the controls, AD patients showed increased entropy in the middle temporal gyrus and the precentral gyrus and also showed decreased DC in the precuneus. Moreover, the magnitude of the negative correlation between local brain activity (entropy) and network connectivity (DC) was increased in AD patients in comparison with healthy controls. These findings provide new evidence on AD-related brain entropy alterations.
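
    Sample entropy itself is straightforward to compute for a single time series; the R-fMRI analysis applies it voxel-wise. A plain O(n²) reference sketch (the parameter choices m = 2 and tolerance r are conventional, commonly 0.2 times the series standard deviation; the toy signals below are hypothetical):

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    runs matching for m points (Chebyshev distance <= r) also match
    for m + 1 points.  Self-matches are excluded."""
    n = len(x)

    def matches(length):
        tpl = [x[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(tpl)):
            for j in range(i + 1, len(tpl)):
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)

rng = random.Random(1)
regular = [float(i % 2) for i in range(120)]   # perfectly periodic signal
noise = [rng.random() for _ in range(120)]     # irregular signal
se_regular = sample_entropy(regular)           # 0.0: fully predictable
se_noise = sample_entropy(noise)
```

    A fully periodic signal gives SampEn of exactly zero, while irregular signals score higher, which is the sense in which "increased entropy" in the abstract reflects less regular regional brain dynamics.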

  17. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
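
    The 'min-entropy' used here is the conservative entropy measure for randomness distillation: it depends only on the most probable outcome, not on the full distribution. A minimal illustration (the bias value is hypothetical):

```python
from math import log2

def min_entropy_per_sample(probs):
    """Min-entropy H_min = -log2(max p): the number of nearly uniform
    random bits extractable per raw sample in the worst case."""
    return -log2(max(probs))

h_biased = min_entropy_per_sample([0.6, 0.4])   # biased raw source
h_fair = min_entropy_per_sample([0.5, 0.5])     # ideal source: 1 bit
```

    From N samples of the biased source, a randomness extractor can distill roughly N times h_biased nearly uniform bits, which is why a measured lower bound on min-entropy suffices to size the distillation step.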

  18. Diffusion profiling of tumor volumes using a histogram approach can predict proliferation and further microarchitectural features in medulloblastoma.

    PubMed

    Schob, Stefan; Beeskow, Anne; Dieckow, Julia; Meyer, Hans-Jonas; Krause, Matthias; Frydrychowicz, Clara; Hirsch, Franz-Wolfgang; Surov, Alexey

    2018-05-31

    Medulloblastomas are the most common central nervous system tumors in childhood. Treatment and prognosis strongly depend on histology and transcriptomic profiling; however, the proliferative potential also has prognostic value. Our study aimed to investigate correlations between histogram profiling of diffusion-weighted images and further microarchitectural features. Seven patients (median age 14.6 years, range 2-20 years; 5 male, 2 female) were included in this retrospective study. Using a Matlab-based analysis tool, histogram analysis of whole apparent diffusion coefficient (ADC) volumes was performed. ADC entropy showed a strong inverse correlation with the expression of the proliferation marker Ki67 (r = -0.962, p = 0.009) and with total nuclear area (r = -0.888, p = 0.044). Furthermore, ADC percentiles, above all ADCp90, showed significant correlations with Ki67 expression (r = 0.902, p = 0.036). Diffusion histogram profiling of medulloblastomas provides valuable in vivo information that can potentially be used for risk stratification and prognostication. Entropy in particular emerged as the most promising imaging biomarker. However, further studies are warranted.
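
    The two first-order histogram features the study leans on, entropy and upper percentiles, are easy to state precisely. A sketch over a flat list of voxel values (the bin count and the sample values are arbitrary choices, not the study's settings):

```python
import math

def histogram_entropy(values, bins=64):
    """Shannon entropy (bits) of the value histogram, as used in
    first-order texture analysis of ADC maps."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0            # guard: constant input
    counts = [0] * bins
    for v in values:
        idx = min(int((v - lo) / width), bins - 1)
        counts[idx] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def percentile(values, q):
    """q-th percentile by linear interpolation between closest ranks
    (the convention behind features like ADCp90)."""
    s = sorted(values)
    pos = (len(s) - 1) * q / 100.0
    i, frac = int(pos), pos % 1
    return s[i] if frac == 0 else s[i] * (1 - frac) + s[i + 1] * frac
```

    A constant ADC volume has histogram entropy zero; the more evenly the voxel values spread across bins, the closer the entropy gets to log2(bins).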

  19. Uncertainty vs. Information (Invited)

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact yields essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty; instead it should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  20. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
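
    The local-order analysis this builds on reduces, at its core, to block (word) entropies of a symbolized series: the entropy of overlapping length-n words, whose growth with n gauges predictability. A compact sketch (the price series is hypothetical; real work would use intraday currency returns):

```python
from collections import Counter
from math import log2

def word_entropy(symbols, n):
    """Shannon entropy (bits) of overlapping length-n words: the block
    entropy H_n whose growth rate measures predictability."""
    words = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    total = len(words)
    counts = Counter(words)
    return -sum(c / total * log2(c / total) for c in counts.values())

# Binarize a (hypothetical) price series into up/down moves, then
# compare block entropies of the symbol sequence.
prices = [100, 101, 103, 102, 104, 103, 105, 104, 106, 105]
moves = [1 if b > a else 0 for a, b in zip(prices, prices[1:])]
h1 = word_entropy(moves, 1)
h2 = word_entropy(moves, 2)
```

    When the incremental entropy h2 - h1 falls well below h1, successive symbols carry mutual information, i.e. the series has exploitable local order, which is the premise of the entropy-based prediction scheme.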

  1. Shifting Distributions of Adult Atlantic Sturgeon Amidst Post-Industrialization and Future Impacts in the Delaware River: a Maximum Entropy Approach

    PubMed Central

    Breece, Matthew W.; Oliver, Matthew J.; Cimino, Megan A.; Fox, Dewayne A.

    2013-01-01

    Atlantic sturgeon (Acipenser oxyrinchus oxyrinchus) experienced severe declines due to habitat destruction and overfishing beginning in the late 19th century. Subsequent to the boom and bust period of exploitation, there has been minimal fishing pressure and improving habitats. However, lack of recovery led to the 2012 listing of Atlantic sturgeon under the Endangered Species Act. Although habitats may be improving, the availability of high quality spawning habitat, essential for the survival and development of eggs and larvae, may still be a limiting factor in the recovery of Atlantic sturgeon. To estimate adult Atlantic sturgeon spatial distributions during riverine occupancy in the Delaware River, we utilized a maximum entropy (MaxEnt) approach along with passive biotelemetry during the likely spawning season. We found that substrate composition and distance from the salt front significantly influenced the locations of adult Atlantic sturgeon in the Delaware River. To broaden the scope of this study, we projected our model onto four scenarios depicting varying locations of the salt front in the Delaware River: the contemporary location of the salt front during the likely spawning season, the location of the salt front during the historic fishery in the late 19th century, an estimated shift in the salt front by the year 2100 due to climate change, and an extreme drought scenario, similar to that which occurred in the 1960s. The movement of the salt front upstream as a result of dredging and climate change likely eliminated historic spawning habitats and currently threatens areas where Atlantic sturgeon spawning may be taking place. Identifying where suitable spawning substrate and water chemistry intersect with the likely occurrence of adult Atlantic sturgeon in the Delaware River highlights essential spawning habitats, enhancing recovery prospects for this imperiled species. PMID:24260570

  2. Bose-Einstein condensation of light: general theory.

    PubMed

    Sob'yanin, Denis Nikolaevich

    2013-08-01

    A theory of Bose-Einstein condensation of light in a dye-filled optical microcavity is presented. The theory is based on the hierarchical maximum entropy principle and allows one to investigate the fluctuating behavior of the photon gas in the microcavity for all numbers of photons, dye molecules, and excitations at all temperatures, including the whole critical region. The master equation describing the interaction between photons and dye molecules in the microcavity is derived and the equivalence between the hierarchical maximum entropy principle and the master equation approach is shown. The cases of a fixed mean total photon number and a fixed total excitation number are considered, and a much sharper, nonparabolic onset of a macroscopic Bose-Einstein condensation of light in the latter case is demonstrated. The theory does not use the grand canonical approximation, takes into account the photon polarization degeneracy, and exactly describes the microscopic, mesoscopic, and macroscopic Bose-Einstein condensation of light. Under certain conditions, it predicts sub-Poissonian statistics of the photon condensate and the polarized photon condensate, and a universal relation takes place between the degrees of second-order coherence for these condensates. In the macroscopic case, there appear a sharp jump in the degrees of second-order coherence, a sharp jump and kink in the reduced standard deviations of the fluctuating numbers of photons in the polarized and whole condensates, and a sharp peak, a cusp, of the Mandel parameter for the whole condensate in the critical region. The possibility of nonclassical light generation in the microcavity with the photon Bose-Einstein condensate is predicted.

  3. Mapping distribution of Rastrelliger kanagurta in the exclusive economic zone (EEZ) of Malaysia using maximum entropy modeling approach

    NASA Astrophysics Data System (ADS)

    Yusop, Syazwani Mohd; Mustapha, Muzzneena Ahmad

    2018-04-01

    Fishing locations for R. kanagurta obtained from SEAFDEC were coupled with multi-sensor satellite imagery of oceanographic variables, namely sea surface temperature (SST), sea surface height (SSH) and chl-a concentration (chl-a), to evaluate the performance of maximum entropy (MaxEnt) models for predicting R. kanagurta fishing grounds. In addition, this study identified the relative percentage contribution of each environmental variable considered, in order to describe the effects of the oceanographic factors on the species distribution in the study area. The potential fishing grounds during the intermonsoon periods, April and October 2008-2009, were simulated separately and covered the near-coast of Kelantan, Terengganu, Pahang and Johor. The oceanographic conditions differed between regions owing to inherent seasonal variability. The seasonal and spatial extents of potential fishing grounds were largely explained by chl-a concentration (0.21-0.99 mg/m3 in April and 0.28-1.00 mg/m3 in October), SSH (77.37-85.90 cm in April and 107.60-108.97 cm in October) and SST (30.43-33.70 °C in April and 30.48-30.97 °C in October). The constructed models were applicable and suitable for predicting the potential fishing zones of R. kanagurta in the EEZ. The results from this study revealed MaxEnt's potential for predicting the spatial distribution of R. kanagurta and highlighted the use of multispectral satellite images for describing the seasonal potential fishing grounds.

  4. Using maximum entropy to predict suitable habitat for the endangered dwarf wedgemussel in the Maryland Coastal Plain

    USGS Publications Warehouse

    Campbell, Cara; Hilderbrand, Robert H.

    2017-01-01

    Species distribution modelling can be useful for the conservation of rare and endangered species. Freshwater mussel declines have thinned species ranges producing spatially fragmented distributions across large areas. Spatial fragmentation in combination with a complex life history and heterogeneous environment makes predictive modelling difficult. A machine learning approach (maximum entropy) was used to model occurrences and suitable habitat for the federally endangered dwarf wedgemussel, Alasmidonta heterodon, in Maryland's Coastal Plain catchments. Landscape-scale predictors (e.g. land cover, land use, soil characteristics, geology, flow characteristics, and climate) were used to predict the suitability of individual stream segments for A. heterodon. The best model contained variables at three scales: minimum elevation (segment scale), percentage Tertiary deposits, low intensity development, and woody wetlands (sub-catchment), and percentage low intensity development, pasture/hay agriculture, and average depth to the water table (catchment). Despite a very small sample size owing to the rarity of A. heterodon, cross-validated prediction accuracy was 91%. Most predicted suitable segments occur in catchments not known to contain A. heterodon, which provides opportunities for new discoveries or population restoration. These model predictions can guide surveys toward the streams with the best chance of containing the species or, alternatively, away from those streams with little chance of containing A. heterodon. Developed reaches had low predicted suitability for A. heterodon in the Coastal Plain. Urban and exurban sprawl continues to modify stream ecosystems in the region, underscoring the need to preserve existing populations and to discover and protect new populations.

  5. Spatiotemporal modeling of ozone levels in Quebec (Canada): a comparison of kriging, land-use regression (LUR), and combined Bayesian maximum entropy-LUR approaches.

    PubMed

    Adam-Poupart, Ariane; Brand, Allan; Fournier, Michel; Jerrett, Michael; Smargiassi, Audrey

    2014-09-01

    Ambient air ozone (O3) is a pulmonary irritant that has been associated with respiratory health effects including increased lung inflammation and permeability, airway hyperreactivity, respiratory symptoms, and decreased lung function. Estimation of O3 exposure is a complex task because the pollutant exhibits complex spatiotemporal patterns. To refine the quality of exposure estimation, various spatiotemporal methods have been developed worldwide. We sought to compare the accuracy of three spatiotemporal models to predict summer ground-level O3 in Quebec, Canada. We developed a land-use mixed-effects regression (LUR) model based on readily available data (air quality and meteorological monitoring data, road networks information, latitude), a Bayesian maximum entropy (BME) model incorporating both O3 monitoring station data and the land-use mixed model outputs (BME-LUR), and a kriging method model based only on available O3 monitoring station data (BME kriging). We performed leave-one-station-out cross-validation and visually assessed the predictive capability of each model by examining the mean temporal and spatial distributions of the average estimated errors. The BME-LUR was the best predictive model (R2 = 0.653), with the lowest root mean-square error (RMSE = 7.06 ppb), followed by the LUR model (R2 = 0.466, RMSE = 8.747 ppb) and the BME kriging model (R2 = 0.414, RMSE = 9.164 ppb). Our findings suggest that errors of estimation in the interpolation of O3 concentrations with BME can be greatly reduced by incorporating outputs from a LUR model developed with readily available data.
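
    Leave-one-station-out cross-validation is model-agnostic and worth seeing in miniature: refit with each station held out, predict its value, and pool the errors into an RMSE. A toy sketch with inverse-distance weighting standing in for the paper's kriging/LUR models (the stations and ozone values are hypothetical):

```python
from math import sqrt

def leave_one_out_rmse(observations, fit, predict):
    """Leave-one-station-out CV: refit without each station, predict
    its value, and pool the prediction errors into an RMSE."""
    errs = []
    for i, (site, value) in enumerate(observations):
        train = observations[:i] + observations[i + 1:]
        model = fit(train)
        errs.append(predict(model, site) - value)
    return sqrt(sum(e * e for e in errs) / len(errs))

# Hypothetical 1-D stations: position -> ozone (ppb)
obs = [(0.0, 30.0), (1.0, 34.0), (2.0, 38.0), (3.0, 42.0)]

def fit(train):
    return train                     # IDW "training" just stores the data

def idw(train, site):
    """Inverse-distance-weighted interpolation at a held-out site."""
    ws = [(1.0 / (abs(site - x) + 1e-9), v) for x, v in train]
    total = sum(w for w, _ in ws)
    return sum(w * v for w, v in ws) / total

rmse = leave_one_out_rmse(obs, fit, idw)
```

    Swapping `fit`/`idw` for a kriging or LUR implementation reproduces the comparison design of the study: the same held-out stations score every candidate model.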

  6. Application of a real-time, calculable limiting form of the Renyi entropy for molecular imaging of tumors.

    PubMed

    Marsh, Jon N; Wallace, Kirk D; McCarthy, John E; Wickerhauser, Mladen V; Maurizi, Brian N; Lanza, Gregory M; Wickline, Samuel A; Hughes, Michael S

    2010-08-01

    Previously, we reported new methods for ultrasound signal characterization using entropy, H_f; a generalized entropy, the Renyi entropy, I_f,r; and a limiting form of the Renyi entropy suitable for real-time calculation, I_f,∞. All of these quantities demonstrated significantly more sensitivity to subtle changes in scattering architecture than energy-based methods in certain settings. In this study, the real-time calculable limit of the Renyi entropy, I_f,∞, is applied for the imaging of angiogenic murine neovasculature in a breast cancer xenograft using a targeted contrast agent. It is shown that this approach may be used to reliably detect the accumulation of targeted nanoparticles at five minutes post-injection in this in vivo model.
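For context, the Rényi entropy of order r and its r → ∞ limit (which depends only on the largest probability, part of what makes an infinite-order form cheap to evaluate) can be computed for a discrete distribution as below. This is the generic textbook definition, not the authors' ultrasound-specific signal estimator:

```python
import numpy as np

def renyi_entropy(p, r):
    """Rényi entropy H_r(p) = log(sum_i p_i^r) / (1 - r) for a discrete pmf,
    with the Shannon (r -> 1) and min-entropy (r -> inf) limits handled."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isinf(r):
        return -np.log(p.max())          # r -> inf limit: min-entropy
    if r == 1:
        return -np.sum(p * np.log(p))    # Shannon limit
    return np.log(np.sum(p ** r)) / (1 - r)

p = np.array([0.5, 0.25, 0.125, 0.125])
# H_r is non-increasing in r; the infinite-order limit is set entirely by
# the largest probability mass.
assert renyi_entropy(p, 0.5) >= renyi_entropy(p, 2) >= renyi_entropy(p, np.inf)
print(renyi_entropy(p, np.inf))
```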

  7. Moisture sorption isotherms and thermodynamic properties of mexican mennonite-style cheese.

    PubMed

    Martinez-Monteagudo, Sergio I; Salais-Fierro, Fabiola

    2014-10-01

    Moisture adsorption isotherms of fresh and ripened Mexican Mennonite-style cheese were investigated using the static gravimetric method at 4, 8, and 12 °C over a water activity (aw) range of 0.08-0.96. These isotherms were modeled using the GAB, BET, Oswin, and Halsey equations through weighted non-linear regression. All isotherms were sigmoid in shape, showing a type II BET isotherm, and the data were best described by the GAB model. The GAB model coefficients revealed that water adsorption by the cheese matrix is a multilayer process characterized by molecules that are strongly bound in the monolayer and molecules that are slightly structured in a multilayer. Using the GAB model, it was possible to estimate thermodynamic functions (net isosteric heat, differential entropy, integral enthalpy and entropy, and enthalpy-entropy compensation) as a function of moisture content. For both samples, the isosteric heat and differential entropy decreased exponentially with moisture content. The integral enthalpy gradually decreased with increasing moisture content after reaching a maximum value, while the integral entropy decreased with increasing moisture content after reaching a minimum value. A linear compensation was found between integral enthalpy and entropy, suggesting enthalpy-controlled adsorption. Determination of the relationship between moisture content and aw yields important information for controlling ripening, drying, and storage operations, as well as for understanding the state of water within the cheese matrix.
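Fitting the GAB equation to sorption data is typically done by non-linear regression. A minimal sketch with scipy.optimize.curve_fit on illustrative data (the numbers are invented, not taken from the cheese study, and the fit is unweighted for brevity):

```python
import numpy as np
from scipy.optimize import curve_fit

def gab(aw, m0, C, K):
    """GAB sorption model: m0 is the monolayer moisture content,
    C and K are energy constants."""
    kaw = K * aw
    return m0 * C * kaw / ((1 - kaw) * (1 - kaw + C * kaw))

# Hypothetical isotherm: (water activity, g water / g dry solid) pairs
# generated from known parameters plus 2% multiplicative noise.
aw = np.linspace(0.08, 0.92, 12)
rng = np.random.default_rng(2)
m_obs = gab(aw, 0.06, 10.0, 0.85) * (1 + 0.02 * rng.normal(size=aw.size))

popt, _ = curve_fit(gab, aw, m_obs, p0=[0.05, 5.0, 0.8])
m0_hat, C_hat, K_hat = popt
print(m0_hat, C_hat, K_hat)
```

A weighted fit, as in the paper, would pass per-point uncertainties via curve_fit's `sigma` argument.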

  8. Entanglement entropy and mutual information production rates in acoustic black holes.

    PubMed

    Giovanazzi, Stefano

    2011-01-07

    A method to investigate acoustic Hawking radiation is proposed, where entanglement entropy and mutual information are measured from the fluctuations of the number of particles. The rate of entropy radiated per one-dimensional (1D) channel is given by dS/dt = κ/12, where κ is the sound acceleration on the sonic horizon. This entropy production is accompanied by a corresponding formation of mutual information to ensure the overall conservation of information. The predictions are confirmed using an ab initio analytical approach in transonic flows of 1D degenerate ideal Fermi fluids.

  9. Entropy/information flux in Hawking radiation

    NASA Astrophysics Data System (ADS)

    Alonso-Serrano, Ana; Visser, Matt

    2018-01-01

    Blackbody radiation contains (on average) an entropy of 3.9 ± 2.5 bits per photon. If the emission process is unitary, then this entropy is exactly compensated by "hidden information" in the correlations. We extend this argument to the Hawking radiation from GR black holes, demonstrating that the assumption of unitarity leads to a perfectly reasonable entropy/information budget. The key technical aspect of our calculation is a variant of the "average subsystem" approach developed by Page, which we extend beyond bipartite pure systems, to a tripartite pure system that considers the influence of the environment.

  10. Shannon entropy and avoided crossings in closed and open quantum billiards

    NASA Astrophysics Data System (ADS)

    Park, Kyu-Won; Moon, Songky; Shin, Younghoon; Kim, Jinuk; Jeong, Kabgyun; An, Kyungwon

    2018-06-01

    The relation between Shannon entropy and avoided crossings is investigated in dielectric microcavities. The Shannon entropy of the probability density for eigenfunctions in an open elliptic billiard as well as a closed quadrupole billiard increases as the center of the avoided crossing is approached. These results are opposite to those of atomic physics for electrons. It is found that the collective Lamb shift of the open quantum system and the symmetry breaking in the closed chaotic quantum system have equivalent effects on the Shannon entropy.

  11. The predictive power of singular value decomposition entropy for stock market dynamics

    NASA Astrophysics Data System (ADS)

    Caraiani, Petre

    2014-01-01

    We use a correlation-based approach to analyze financial data from the US stock market, both daily and monthly observations from the Dow Jones. We compute the entropy based on the singular value decomposition of the correlation matrix for the components of the Dow Jones Industrial Index. Based on a moving window, we derive time varying measures of entropy for both daily and monthly data. We find that the entropy has a predictive ability with respect to stock market dynamics as indicated by the Granger causality tests.
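The SVD-entropy construction described above is compact enough to sketch directly: compute the correlation matrix over a window of returns, take its singular-value spectrum, normalize it to a distribution, and take the Shannon entropy. The returns below are synthetic, not Dow Jones data:

```python
import numpy as np

def svd_entropy(returns):
    """Entropy of the normalized singular-value spectrum of the correlation
    matrix of `returns` (T observations x N assets, e.g. one moving window).
    Low entropy = variance concentrated in few modes (strong co-movement)."""
    corr = np.corrcoef(returns, rowvar=False)    # N x N correlation matrix
    s = np.linalg.svd(corr, compute_uv=False)    # singular values
    p = s / s.sum()                              # normalize to a distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(3)
T, N = 250, 30
idiosyncratic = rng.normal(size=(T, N))
market = rng.normal(size=(T, 1))

# A common market factor couples the series and lowers the SVD entropy.
h_indep = svd_entropy(idiosyncratic)
h_coupled = svd_entropy(idiosyncratic + 3 * market)
assert h_coupled < h_indep
print(h_indep, h_coupled)
```

The time-varying measure in the paper is obtained by evaluating this on a moving window and studying the resulting entropy series (e.g. with Granger causality tests).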

  12. Entropy and generalized least square methods in assessment of the regional value of streamgages

    USGS Publications Warehouse

    Markus, M.; Knapp, H. Vernon; Tasker, Gary D.

    2003-01-01

    The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One of the important aspects of the study was to assess the regional value of each station through an assessment of the information transfer among gaging records for low, average, and high flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method, developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.
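One of the entropy quantities underlying such network analyses is the transinformation (mutual information) shared by two gauge records. A minimal histogram-based sketch on synthetic flows, illustrating the quantity rather than the paper's full total-net-information ranking:

```python
import numpy as np

def transinformation(x, y, bins=12):
    """Histogram estimate of the transinformation (mutual information)
    T(X;Y) = H(X) + H(Y) - H(X,Y) between two flow records, in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return H(px) + H(py) - H(pxy.ravel())

rng = np.random.default_rng(4)
base = rng.lognormal(size=2000)                       # shared regional signal
gauge_a = base * rng.lognormal(sigma=0.3, size=2000)  # two linked gauges
gauge_b = base * rng.lognormal(sigma=0.3, size=2000)
gauge_c = rng.lognormal(size=2000)                    # unrelated catchment

# Hydrologically linked gauges share far more information than unrelated ones;
# a station whose record is largely reproducible from its neighbors carries
# little net information for the network.
print(transinformation(gauge_a, gauge_b), transinformation(gauge_a, gauge_c))
```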

  13. Investigating dynamical complexity in the magnetosphere using various entropy measures

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open, spatially extended, nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) to the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet-time to the storm-time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: a significant decrease in complexity and an increase in persistency in the Dst time series can be confirmed as the magnetic storm approaches, and these can be used as diagnostic tools for magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can prove useful for space weather applications.
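Of the measures compared above, approximate entropy is the most straightforward to sketch. The implementation below follows Pincus' standard ApEn(m, r) definition and uses synthetic signals rather than Dst data:

```python
import numpy as np

def approx_entropy(x, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series, with tolerance
    r = r_frac * std(x). Lower values indicate a more regular signal."""
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # embedding vectors
        # Chebyshev distance between all pairs of embedding vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)                        # match fractions
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(5)
t = np.arange(600)
regular = np.sin(2 * np.pi * t / 50)       # organized, storm-like regularity
irregular = rng.normal(size=600)           # disorganized, quiet-time stand-in
print(approx_entropy(regular), approx_entropy(irregular))
```

The regular signal scores markedly lower, which is the direction of the complexity drop the paper reports as a storm approaches.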

  14. Ab initio-informed maximum entropy modeling of rovibrational relaxation and state-specific dissociation with application to the O2 + O system

    NASA Astrophysics Data System (ADS)

    Kulakhmetov, Marat; Gallis, Michael; Alexeenko, Alina

    2016-05-01

    Quasi-classical trajectory (QCT) calculations are used to study state-specific ro-vibrational energy exchange and dissociation in the O2 + O system. Atom-diatom collisions with energy between 0.1 and 20 eV are calculated with a double many body expansion potential energy surface by Varandas and Pais [Mol. Phys. 65, 843 (1988)]. Inelastic collisions favor mono-quantum vibrational transitions at translational energies above 1.3 eV, although multi-quantum transitions are also important. Post-collision vibrational favoring decreases first exponentially and then linearly as Δv increases. Vibrationally elastic collisions (Δv = 0) favor small ΔJ transitions, while vibrationally inelastic collisions have equilibrium post-collision rotational distributions. Dissociation exhibits both vibrational and rotational favoring. New vibrational-translational (VT) and vibrational-rotational-translational (VRT) energy exchange models and a dissociation model are developed based on QCT observations and maximum entropy considerations. A full set of parameters for state-to-state modeling of oxygen is presented. The VT energy exchange model describes 22 000 state-to-state vibrational cross sections using 11 parameters and reproduces vibrational relaxation rates within 30% in the 2500-20 000 K temperature range. The VRT model captures 80 × 106 state-to-state ro-vibrational cross sections using 19 parameters and reproduces vibrational relaxation rates within 60% in the 5000-15 000 K temperature range. The developed dissociation model reproduces state-specific and equilibrium dissociation rates within 25% using just 48 parameters. The maximum entropy framework makes it feasible to upscale ab initio simulations to full nonequilibrium flow calculations.

  15. Secondary structural entropy in RNA switch (Riboswitch) identification.

    PubMed

    Manzourolajdad, Amirhossein; Arnold, Jonathan

    2015-04-28

    RNA regulatory elements play a significant role in gene regulation. Riboswitches, a widespread group of regulatory RNAs, are vital components of many bacterial genomes. These regulatory elements generally function by forming a ligand-induced alternative fold that controls access to ribosome binding sites or other regulatory sites in RNA. Riboswitch-mediated mechanisms are ubiquitous across bacterial genomes. Each class of riboswitch has its own unique structural and biological complexity, making de novo riboswitch identification a formidable task. Traditionally, riboswitches have been identified through comparative genomics based on sequence and structural homology. The limitations of structural-homology-based approaches, coupled with the assumption that there is a great diversity of undiscovered riboswitches, suggest the need for alternative methods for riboswitch identification, possibly based on features intrinsic to their structure. As of yet, no such reliable method has been proposed. We used the structural entropy of riboswitch sequences as a measure of their secondary structural dynamics. Entropy values of a diverse set of riboswitches were compared to those of their mutants, their dinucleotide shuffles, and their reverse complement sequences under different stochastic context-free grammar folding models. The significance of our results was evaluated by comparison to other approaches, such as base-pairing entropy and energy landscape dynamics. Classifiers based on structural entropy, optimized via sequence and structural features, were devised as riboswitch identifiers and tested on Bacillus subtilis, Escherichia coli, and Synechococcus elongatus as an exploration of structural-entropy-based approaches. The unusually long untranslated region of cotH in Bacillus subtilis, as well as upstream regions of certain genes, such as the sucC genes, were associated with significant structural entropy values in genome-wide examinations. Various tests show that there is in fact a relationship between higher structural entropy and the potential for the RNA sequence to have alternative structures, within the limitations of our methodology. This relationship, though modest, is consistent across various tests. Understanding the behavior of structural entropy as a fairly new feature for RNA conformational dynamics, however, may require extensive exploratory investigation both across RNA sequences and folding models.

  16. A comparison of image restoration approaches applied to three-dimensional confocal and wide-field fluorescence microscopy.

    PubMed

    Verveer, P. J; Gemkow, M. J; Jovin, T. M

    1999-01-01

    We have compared different image restoration approaches for fluorescence microscopy. The most widely used algorithms were classified within a Bayesian framework according to the assumed noise model and the type of regularization imposed. We considered both Gaussian and Poisson models for the noise in combination with Tikhonov regularization, entropy regularization, Good's roughness, and no regularization (maximum likelihood estimation). Simulations of fluorescence confocal imaging were used to examine the different noise models and regularization approaches using the mean squared error criterion. The assumption of a Gaussian noise model yielded only slightly higher errors than the Poisson model. Good's roughness was the best choice for the regularization. Furthermore, we compared simulated confocal and wide-field data. In general, restored confocal data are superior to restored wide-field data, but given a sufficiently higher signal level in the wide-field data, the restoration result may rival restored confocal data in quality. Finally, a visual comparison of experimental confocal and wide-field data is presented.

  17. Absolute Equilibrium Entropy

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1997-01-01

    The entropy associated with absolute equilibrium ensemble theories of ideal, homogeneous, fluid and magneto-fluid turbulence is discussed, and the three-dimensional fluid case is examined in detail. A sigma-function is defined, whose minimum value with respect to global parameters is the entropy. A comparison is made between the use of global functions sigma and phase functions H (associated with the development of various H-theorems of ideal turbulence). It is shown that the two approaches are complementary though conceptually different: H-theorems show that an isolated system tends to equilibrium, while sigma-functions allow the demonstration that entropy never decreases when two previously isolated systems are combined. This provides a more complete picture of entropy in the statistical mechanics of ideal fluids.

  18. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We present evidence for the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings, and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of the atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  19. Magnetic and thermodynamic properties of the Pr-based ferromagnet PrGe2-δ

    NASA Astrophysics Data System (ADS)

    Matsumoto, Keisuke T.; Morioka, Naoya; Hiraoka, Koichi

    2018-03-01

    We investigated the magnetization, M, and specific heat, C, of ThSi2-type PrGe2-δ. A polycrystalline sample of PrGe2-δ was prepared by arc-melting. Magnetization divided by magnetic field, M / B, increased sharply and C showed a clear jump at the Curie temperature, TC, of 14.6 K; these results indicate that PrGe2-δ orders ferromagnetically. The magnetic entropy at TC reached R ln 3, indicating a quasi-triplet crystalline electric field (CEF) ground state. The maximum magnetic entropy change was 11.5 J/(kg K) for a field change of 7 T, which is comparable to those of other light rare-earth-based magnetocaloric materials. This large magnetic entropy change was attributed to the quasi-triplet ground state of the CEF.
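The magnetic entropy change quoted here is conventionally extracted from magnetization isotherms via the Maxwell relation (a standard magnetocaloric formula, not reproduced from this paper):

```latex
\Delta S_M(T,\, 0 \to B) \;=\; \int_0^{B} \left( \frac{\partial M}{\partial T} \right)_{B'} \mathrm{d}B' ,
```

evaluated in practice by finite differences between neighbouring M(B) isotherms measured at closely spaced temperatures.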

  20. Entropy uncertainty relations and stability of phase-temporal quantum cryptography with finite-length transmitted strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com

    2012-12-15

    Any key-generation session contains a finite number of quantum-state messages, and it is therefore important to understand the fundamental restrictions imposed on the minimal length of a string required to obtain a secret key of a specified length. The entropy uncertainty relations for smooth min- and max-entropies considerably simplify and shorten the proof of security. A proof of security of quantum key distribution with phase-temporal encryption is presented. Compared to other protocols, this protocol provides the maximum critical error up to which secure key distribution is guaranteed. In addition, unlike other basic protocols (of the BB84 type), which are vulnerable to an attack that 'blinds' the avalanche photodetectors, this protocol is stable with respect to such an attack and guarantees key security.

  1. Negative specific heat of a magnetically self-confined plasma torus

    PubMed Central

    Kiessling, Michael K.-H.; Neukirch, Thomas

    2003-01-01

    It is shown that the thermodynamic maximum-entropy principle predicts negative specific heat for a stationary, magnetically self-confined current-carrying plasma torus. Implications for the magnetic self-confinement of fusion plasma are considered. PMID:12576553

  2. Entropy in molecular recognition by proteins

    PubMed Central

    Caro, José A.; Harpole, Kyle W.; Kasinath, Vignesh; Lim, Jackwee; Granja, Jeffrey; Valentine, Kathleen G.; Sharp, Kim A.

    2017-01-01

    Molecular recognition by proteins is fundamental to molecular biology. Dissection of the thermodynamic energy terms governing protein–ligand interactions has proven difficult, with determination of entropic contributions being particularly elusive. NMR relaxation measurements have suggested that changes in protein conformational entropy can be quantitatively obtained through a dynamical proxy, but the generality of this relationship has not been shown. Twenty-eight protein–ligand complexes are used to show a quantitative relationship between measures of fast side-chain motion and the underlying conformational entropy. We find that the contribution of conformational entropy can range from favorable to unfavorable, which demonstrates the potential of this thermodynamic variable to modulate protein–ligand interactions. For about one-quarter of these complexes, the absence of conformational entropy would render the resulting affinity biologically meaningless. The dynamical proxy for conformational entropy or “entropy meter” also allows for refinement of the contributions of solvent entropy and the loss in rotational-translational entropy accompanying formation of high-affinity complexes. Furthermore, structure-based application of the approach can also provide insight into long-lived specific water–protein interactions that escape the generic treatments of solvent entropy based simply on changes in accessible surface area. These results provide a comprehensive and unified view of the general role of entropy in high-affinity molecular recognition by proteins. PMID:28584100

  3. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.

  4. Entropy in molecular recognition by proteins.

    PubMed

    Caro, José A; Harpole, Kyle W; Kasinath, Vignesh; Lim, Jackwee; Granja, Jeffrey; Valentine, Kathleen G; Sharp, Kim A; Wand, A Joshua

    2017-06-20

    Molecular recognition by proteins is fundamental to molecular biology. Dissection of the thermodynamic energy terms governing protein-ligand interactions has proven difficult, with determination of entropic contributions being particularly elusive. NMR relaxation measurements have suggested that changes in protein conformational entropy can be quantitatively obtained through a dynamical proxy, but the generality of this relationship has not been shown. Twenty-eight protein-ligand complexes are used to show a quantitative relationship between measures of fast side-chain motion and the underlying conformational entropy. We find that the contribution of conformational entropy can range from favorable to unfavorable, which demonstrates the potential of this thermodynamic variable to modulate protein-ligand interactions. For about one-quarter of these complexes, the absence of conformational entropy would render the resulting affinity biologically meaningless. The dynamical proxy for conformational entropy or "entropy meter" also allows for refinement of the contributions of solvent entropy and the loss in rotational-translational entropy accompanying formation of high-affinity complexes. Furthermore, structure-based application of the approach can also provide insight into long-lived specific water-protein interactions that escape the generic treatments of solvent entropy based simply on changes in accessible surface area. These results provide a comprehensive and unified view of the general role of entropy in high-affinity molecular recognition by proteins.

  5. Improved relocatable over-the-horizon radar detection and tracking using the maximum likelihood adaptive neural system algorithm

    NASA Astrophysics Data System (ADS)

    Perlovsky, Leonid I.; Webb, Virgil H.; Bradley, Scott R.; Hansen, Christopher A.

    1998-07-01

    An advanced detection and tracking system is being developed for the U.S. Navy's Relocatable Over-the-Horizon Radar (ROTHR) to provide improved tracking performance against small aircraft typically used in drug-smuggling activities. The development is based on the Maximum Likelihood Adaptive Neural System (MLANS), a model-based neural network that combines advantages of neural network and model-based algorithmic approaches. The objective of the MLANS tracker development effort is to address user requirements for increased detection and tracking capability in clutter and improved track position, heading, and speed accuracy. The MLANS tracker is expected to outperform other approaches to detection and tracking for the following reasons. It incorporates adaptive internal models of target return signals, target tracks and maneuvers, and clutter signals, which leads to concurrent clutter suppression, detection, and tracking (track-before-detect). It is not combinatorial and thus does not require any thresholding or peak picking and can track in low signal-to-noise conditions. It incorporates superresolution spectrum estimation techniques exceeding the performance of conventional maximum likelihood and maximum entropy methods. The unique spectrum estimation method is based on the Einsteinian interpretation of the ROTHR received energy spectrum as a probability density of signal frequency. The MLANS neural architecture and learning mechanism are founded on spectrum models and maximization of the "Einsteinian" likelihood, allowing knowledge of the physical behavior of both targets and clutter to be injected into the tracker algorithms. The paper describes the addressed requirements and expected improvements, theoretical foundations, engineering methodology, and results of the development effort to date.

  6. Estimating parameter of Rayleigh distribution by using Maximum Likelihood method and Bayes method

    NASA Astrophysics Data System (ADS)

    Ardianti, Fitri; Sutarman

    2018-01-01

    In this paper, we use maximum likelihood estimation and the Bayes method under several loss functions to estimate the parameter of the Rayleigh distribution and to determine which method performs best. The prior used in the Bayes method is Jeffreys' non-informative prior. Maximum likelihood estimation and the Bayes method under the precautionary loss function, the entropy loss function, and the L1 loss function are compared. We compare these methods by bias and MSE values computed using the R program. The results are then displayed in tables to facilitate the comparisons.
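For the Rayleigh distribution both estimators have simple closed forms, which makes this kind of bias/MSE comparison easy to reproduce. A sketch in Python rather than R, showing only the squared-error Bayes rule under Jeffreys' prior (the paper's precautionary, entropy, and L1 loss variants are omitted):

```python
import numpy as np

rng = np.random.default_rng(6)

# Rayleigh pdf: f(x; sigma) = (x / sigma^2) exp(-x^2 / (2 sigma^2)).
# With theta = sigma^2 and T = sum(x_i^2) / 2, the likelihood is
# theta^{-n} exp(-T/theta), so the MLE is T/n; under the Jeffreys prior
# pi(theta) ∝ 1/theta the posterior is Inverse-Gamma(n, T), whose mean
# (the squared-error Bayes estimate) is T/(n-1).
def mle(x):
    return np.sum(x ** 2) / (2 * len(x))

def bayes_jeffreys(x):
    n = len(x)
    return np.sum(x ** 2) / (2 * (n - 1))

theta_true = 4.0                     # sigma = 2
n, reps = 20, 5000
est_mle = np.empty(reps)
est_bay = np.empty(reps)
for i in range(reps):
    x = rng.rayleigh(scale=np.sqrt(theta_true), size=n)
    est_mle[i], est_bay[i] = mle(x), bayes_jeffreys(x)

for name, est in [("MLE", est_mle), ("Bayes/Jeffreys", est_bay)]:
    bias = est.mean() - theta_true
    mse = np.mean((est - theta_true) ** 2)
    print(f"{name}: bias={bias:+.3f}  MSE={mse:.3f}")
```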

  7. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin

    PubMed Central

    2017-01-01

    It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600 nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota and evaluate entropy transfer between all pairs of residues of Ubiquitin and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins. PMID:28095404

  8. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis

    PubMed Central

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-01-01

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate the rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposed the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculated the statistical permutation entropy from the product functions to extract the nonlinear features to assess and classify the condition of the healthy and damaged REB system. The comparative experimental results of the conventional LMD-based multi-scale entropy and MPE were presented to verify the authenticity of the proposed technique. The study found that LMD-MPE’s integrated approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results of REB experimental datasets show that the proposed approach yields more vigorous outcomes than existing methods. PMID:29690526
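The LMD step is involved, but the MPE half of the pipeline is compact: permutation entropy of ordinal patterns, computed on coarse-grained copies of the signal. A sketch on synthetic signals (not bearing vibration data; the "healthy"/"faulty" labels are illustrative stand-ins):

```python
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe): Shannon entropy of the
    distribution of ordinal patterns of length `order`, scaled into [0, 1]."""
    n = len(x) - (order - 1) * delay
    counts = {pat: 0 for pat in permutations(range(order))}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1     # ordinal pattern of window
    p = np.array(list(counts.values()), float) / n
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale`: the 'multi-scale' step."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(7)
t = np.arange(4000)
healthy = rng.normal(size=4000)                          # broadband-like
faulty = np.sin(2 * np.pi * t / 25) + 0.1 * rng.normal(size=4000)  # periodic

# MPE profile: permutation entropy of the coarse-grained series per scale.
mpe = lambda x: [permutation_entropy(coarse_grain(x, s)) for s in (1, 2, 4)]
print(mpe(healthy), mpe(faulty))
```

In a fault-diagnosis pipeline of this shape, such per-scale entropy vectors (computed on the LMD product functions) would be the features handed to a classifier.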

  9. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis.

    PubMed

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-04-21

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate the rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposed the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculated the statistical permutation entropy from the product functions to extract the nonlinear features to assess and classify the condition of the healthy and damaged REB system. The comparative experimental results of the conventional LMD-based multi-scale entropy and MPE were presented to verify the authenticity of the proposed technique. The study found that LMD-MPE’s integrated approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results of REB experimental datasets show that the proposed approach yields more vigorous outcomes than existing methods.

  10. The many faces of the second law

    NASA Astrophysics Data System (ADS)

    Van den Broeck, C.

    2010-10-01

    There exists no perpetuum mobile of the second kind. We review the implications of this observation for the second law, the efficiency of thermal machines, Onsager symmetry, Brownian motors and Brownian refrigerators, and the universality of the efficiency of thermal machines at maximum power. We derive a microscopic expression for the stochastic entropy production and obtain from it the detailed and integral fluctuation theorems. We close with the remarkable observation that the second law can be split in two: the total entropy production is the sum of two contributions, each of which grows independently in time.

  11. Shock melting and vaporization of metals.

    NASA Technical Reports Server (NTRS)

    Ahrens, T. J.

    1972-01-01

    The effect of initial porosity on shock induction of melting and vaporization is investigated for Ba, Sr, Li, Fe, Al, U, and Th. For the less compressible of these metals, it is found that for a given strong shock-generation system (explosive in contact, or flyer-plate impact) an optimum initial specific volume exists such that the total entropy production, and hence the amount of metal liquid or vapor, is a maximum. Initial volumes from 1.4 to 2.0 times crystal volumes, depending on the metal sample and shock-inducing system, will result in optimum post-shock entropies.

  12. The adiabatic piston: a perpetuum mobile in the mesoscopic realm

    NASA Astrophysics Data System (ADS)

    Crosignani, Bruno; Porto, Paolo; Conti, Claudio

    2004-03-01

    A detailed analysis of the adiabatic-piston problem reveals, for a finely tuned choice of the spatial dimensions of the system, peculiar dynamical features that challenge the statement that an isolated system necessarily reaches a time-independent equilibrium state. In particular, the piston behaves like a perpetuum mobile: it never comes to a stop but keeps wandering, undergoing sizeable oscillations around the position corresponding to maximum entropy. This has remarkable implications for the entropy changes of a mesoscopic isolated system and for the limits of validity of the second law of thermodynamics in the mesoscopic realm.

  13. Coarse-graining errors and numerical optimization using a relative entropy framework

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2011-03-01

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, Srel, that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework.
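    The variational principle described above can be illustrated on a discrete toy system: minimize S_rel between a reference distribution and a one-parameter coarse-grained model. The distribution and energy levels below are invented for illustration and are not from the paper.

```python
from math import exp, log

def rel_entropy(p, q):
    """Discrete relative entropy S_rel = sum_i p_i ln(p_i / q_i) >= 0."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Reference ("fully atomic") distribution over four microstates.
p_ref = [0.4, 0.3, 0.2, 0.1]

# One-parameter coarse-grained model: Boltzmann weights exp(-beta * E_i).
energies = [0.0, 1.0, 2.0, 3.0]

def cg_model(beta):
    w = [exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# Variational minimization of S_rel by a crude grid search over beta.
best_beta = min((b / 100 for b in range(1, 300)),
                key=lambda b: rel_entropy(p_ref, cg_model(b)))
```

    In realistic coarse-graining settings the grid search is replaced by gradient-based minimization over many force-field parameters, but the variational objective is the same.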

  14. A Stationary Wavelet Entropy-Based Clustering Approach Accurately Predicts Gene Expression

    PubMed Central

    Nguyen, Nha; Vo, An; Choi, Inchan

    2015-01-01

    Studying epigenetic landscapes is important for understanding the conditions for gene regulation. Clustering is a useful approach to studying epigenetic landscapes by grouping genes based on their epigenetic conditions. However, classical clustering approaches, which often use a representative value of the signals in a fixed-size window, do not fully use the information written in the epigenetic landscapes. Clustering approaches that maximize the information of the epigenetic signals are necessary for a better understanding of gene regulatory environments. For effective clustering of multidimensional epigenetic signals, we developed a method called Dewer, which uses the entropy of the stationary wavelet transform of epigenetic signals inside enriched regions for gene clustering. Interestingly, gene expression levels were highly correlated with the entropy levels of epigenetic signals. Dewer separates genes better than a window-based approach in an assessment using gene expression, and achieved a correlation coefficient above 0.9 without any training procedure. Our results show that the changes of epigenetic signals are useful for studying gene regulation. PMID:25383910

  15. Origin and Characteristics of High Shannon Entropy at the Pivot of Locally Stable Rotors: Insights from Computational Simulation

    PubMed Central

    Gharaviri, Ali; Brooks, Anthony; Chapman, Darius; Lau, Dennis H.; Roberts-Thomson, Kurt C.; Sanders, Prashanthan

    2014-01-01

    Background Rotors are postulated to maintain cardiac fibrillation. Despite the importance of bipolar electrograms in clinical electrophysiology, few data exist on the properties of bipolar electrograms at rotor sites. The pivot of a spiral wave is characterized by relative uncertainty of wavefront propagation direction compared to the periphery. The bipolar electrograms used in electrophysiology recording encode information on both the direction and timing of approaching wavefronts. Objective To test the hypothesis that bipolar electrograms from the pivot of rotors have higher Shannon entropy (ShEn) than electrograms recorded at the periphery, due to the spatial dynamics of spiral waves. Methods and Results We studied spiral wave propagation in 2-dimensional sheets constructed using a simple cellular automaton (FitzHugh-Nagumo), atrial (Courtemanche-Ramirez-Nattel) and ventricular (Luo-Rudy) myocyte cell models, and in a geometric model of a spiral wave. In each system, bipolar electrogram recordings were simulated, and Shannon entropy maps were constructed as a measure of electrogram information content. ShEn was consistently highest in the pivoting region associated with the phase singularity of the spiral wave. This property was consistently preserved across (i) variation of the model system, (ii) alterations in bipolar electrode spacing, (iii) alternative bipolar electrode orientations, (iv) bipolar electrogram filtering and (v) the presence of rotor meander. Directional activation plots demonstrated that the origin of high ShEn at the pivot was the directional diversity of wavefront propagation observed at this location. Conclusions The pivot of the rotor is consistently associated with high Shannon entropy of bipolar electrograms despite differences in action potential model, bipolar electrode spacing, signal filtering and rotor meander. Maximum ShEn is co-located with the pivot for rotors observed in the bipolar electrogram recording mode, and may be an intrinsic property of spiral wave dynamics. PMID:25401331
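    A Shannon entropy map of the kind described is built by computing ShEn per recording site; for a single electrogram, a common recipe is the entropy of its amplitude histogram. The sketch below assumes that construction (the bin count is an illustrative choice, not the authors' exact implementation).

```python
from collections import Counter
from math import log2

def shannon_entropy(samples, n_bins=10):
    """Shannon entropy (in bits) of a signal's amplitude histogram."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant signal
    counts = Counter(min(int((x - lo) / width), n_bins - 1) for x in samples)
    total = len(samples)
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

    Intuitively, a site near the pivot sees wavefronts arriving from many directions, producing a broader amplitude distribution and hence higher ShEn than a site at the periphery.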

  16. Numerical estimation of the relative entropy of entanglement

    NASA Astrophysics Data System (ADS)

    Zinchenko, Yuriy; Friedland, Shmuel; Gour, Gilad

    2010-11-01

    We propose a practical algorithm for the calculation of the relative entropy of entanglement (REE), defined as the minimum relative entropy between a state and the set of states with positive partial transpose. Our algorithm is based on a practical semidefinite cutting plane approach. In low dimensions, the implementation of the algorithm in MATLAB provides an estimate of the REE with an absolute error smaller than 10^-3.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Sen; Luo, Sheng-Nian

    Polychromatic X-ray sources can be useful for photon-starved small-angle X-ray scattering given their high spectral fluxes. Their bandwidths, however, are 10–100 times larger than those obtained with monochromators. To explore the feasibility, ideal scattering curves of homogeneous spherical particles for polychromatic X-rays are calculated and analyzed using the Guinier approach, maximum entropy and regularization methods. Monodisperse and polydisperse systems are explored. The influence of bandwidth and asymmetric spectral shape is explored via Gaussian and half-Gaussian spectra. Synchrotron undulator spectra, represented by two undulator sources of the Advanced Photon Source, are examined as an example with regard to the influence of asymmetric harmonic shape, fundamental harmonic bandwidth and high harmonics. The effects of bandwidth, spectral shape and high harmonics on particle size determination are evaluated quantitatively.
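    Of the three analysis methods named above, the Guinier approach is the simplest to sketch: for small q, ln I(q) = ln I(0) - (Rg^2/3) q^2, so the radius of gyration Rg follows from a straight-line fit of ln I against q^2. The code below uses synthetic monochromatic data; it ignores the polychromatic smearing that is the paper's actual subject.

```python
from math import exp, log

def guinier_radius(q_vals, intensities):
    """Estimate the radius of gyration Rg from the Guinier law
    ln I(q) = ln I(0) - (Rg^2 / 3) q^2, via least squares on (q^2, ln I)."""
    xs = [q * q for q in q_vals]
    ys = [log(i) for i in intensities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return (-3.0 * slope) ** 0.5

# Synthetic monochromatic Guinier data with Rg = 2 (illustrative only).
q_grid = [0.01 * i for i in range(1, 21)]
i_grid = [exp(-(2.0 ** 2) * q * q / 3.0) for q in q_grid]
rg_est = guinier_radius(q_grid, i_grid)
```

    With a polychromatic beam, the measured curve is a bandwidth-weighted average over wavelengths, which is what biases such fits and motivates the maximum entropy and regularization treatments studied in the paper.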

  18. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems

    NASA Astrophysics Data System (ADS)

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  19. Use of the maximum entropy method to retrieve the vertical atmospheric ozone profile and predict atmospheric ozone content

    NASA Technical Reports Server (NTRS)

    Turner, B. Curtis

    1992-01-01

    A method is developed for the prediction of ozone levels in planetary atmospheres. The method is formulated in terms of error covariance matrices associated with the direct measurements, an a priori first-guess profile, and a weighting-function matrix, and is described by the linearized equation y = Ax + eta, where y is the measurement vector, A is the weighting-function matrix, x is the ozone profile and eta is noise. The problems with this approach are: (1) the matrix A is nearly singular; (2) the number of unknowns in the profile exceeds the number of data points, so the solution may not be unique; and (3) even if a unique solution exists, eta may cause the solution to be ill-conditioned.
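    Problems (1)-(3) are the classic symptoms of an ill-posed linear inverse problem, and a standard remedy is regularization. Below is a sketch of a Tikhonov (ridge) solution of y = Ax + eta via the normal equations, x = (A^T A + lam I)^-1 A^T y; the matrices are toy values, not real weighting functions, and an operational retrieval would weight the terms by the error covariances mentioned above.

```python
def matmul(A, B):
    """Dense matrix product of two lists-of-rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ridge_solve(A, y, lam=0.0):
    """Tikhonov-regularized least squares: x = (A^T A + lam I)^-1 A^T y.
    lam > 0 tames a nearly singular A at the cost of a small bias."""
    At = [list(r) for r in zip(*A)]
    AtA = matmul(At, A)
    for i in range(len(AtA)):
        AtA[i][i] += lam
    Aty = [sum(a * yi for a, yi in zip(row, y)) for row in At]
    return solve(AtA, Aty)
```

    Choosing lam trades fidelity to the data against stability of the retrieved profile, which is exactly the tension the error-covariance formulation is designed to balance.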

  20. Prediction of Spatiotemporal Patterns of Neural Activity from Pairwise Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marre, O.; El Boustani, S.; Fregnac, Y.

    We designed a model-based analysis to predict the occurrence of population patterns in distributed spiking activity. Using a maximum entropy principle with a Markovian assumption, we obtain a model that accounts for both spatial and temporal pairwise correlations among neurons. This model is tested on data generated with a Glauber spin-glass system and is shown to predict the occurrence probabilities of spatiotemporal patterns significantly better than Ising models based only on spatial correlations. This increase in predictability was also observed in experimental data recorded in parietal cortex during slow-wave sleep. The approach can also be used to generate surrogates that reproduce the spatial and temporal correlations of a given data set.
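    For intuition, the purely spatial part of such a model is the pairwise maximum entropy (Ising) distribution P(s) proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) over binary spike patterns. The sketch below evaluates it by exhaustive enumeration, which is feasible only for a handful of neurons; the fields and couplings are invented, and the Markovian temporal extension of the paper is not included.

```python
from itertools import product
from math import exp

def ising_probs(h, J):
    """Exact pattern probabilities of a pairwise maximum entropy (Ising)
    model over states s_i in {-1, +1}, by brute-force enumeration."""
    n = len(h)
    weights = {}
    for s in product((-1, 1), repeat=n):
        energy = sum(h[i] * s[i] for i in range(n))
        energy += sum(J[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(i + 1, n))
        weights[s] = exp(energy)
    z = sum(weights.values())  # partition function
    return {s: w / z for s, w in weights.items()}
```

    Fitting h and J to measured firing rates and pairwise correlations, and adding one-step temporal couplings, would yield a spatiotemporal model of the kind described above.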

Top