Sample records for minimum entropy generation

  1. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    PubMed

    Haseli, Y

    2016-05-01

    The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, Novikov's engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output, and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, maximum thermal efficiency, and maximum power may become equivalent under the condition of fixed heat input.
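
The maximum-power efficiency of the Curzon-Ahlborn engine has the closed form η_CA = 1 − √(Tc/Th), which can be compared directly with the Carnot limit η_C = 1 − Tc/Th; a minimal sketch (the reservoir temperatures are illustrative):

```python
# Carnot and Curzon-Ahlborn (maximum-power) efficiencies for an endoreversible engine
def eta_carnot(t_hot, t_cold):
    return 1.0 - t_cold / t_hot

def eta_curzon_ahlborn(t_hot, t_cold):
    # efficiency at maximum power output: eta = 1 - sqrt(Tc/Th)
    return 1.0 - (t_cold / t_hot) ** 0.5

th, tc = 600.0, 300.0                 # illustrative reservoir temperatures, kelvin
eta_c = eta_carnot(th, tc)            # Carnot limit
eta_ca = eta_curzon_ahlborn(th, tc)   # maximum-power efficiency, always below the limit
```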

  2. Optimization of a Circular Microchannel With Entropy Generation Minimization Method

    NASA Astrophysics Data System (ADS)

    Jafari, Arash; Ghazali, Normah Mohd

    2010-06-01

    Advances at the micro- and nanoscales are being realized, and micro- and nanoscale heat-dissipation devices are of high importance in this technology development. Past studies showed that microchannel design depends on thermal resistance and pressure drop. However, entropy generation minimization (EGM), as an optimization theory, states that the rate of entropy generation should also be optimized. The application of EGM to microchannel heat sink design is reviewed and discussed in this paper. The latest principles for deriving the entropy generation relations are discussed to show how this approach can be achieved. An optimization procedure using the EGM method is derived for a circular microchannel heat sink based upon thermal resistance and pressure drop. The equations are solved using MATLAB and the results are compared to similar past studies. The effects of channel diameter, number of channels, heat flux, and pumping power on the entropy generation rate and Reynolds number are investigated. Analytical correlations are utilized for the heat transfer and friction coefficients. A minimum entropy generation is observed for N = 40 and a channel hydraulic diameter of 90 μm, at which the circular microchannel heat sink operates at its optimum point based on the second law of thermodynamics.

  3. A minimum entropy principle in the gas dynamics equations

    NASA Technical Reports Server (NTRS)

    Tadmor, E.

    1986-01-01

    Let u(x̄, t) be a weak solution of the Euler equations governing inviscid polytropic gas dynamics; in addition, u(x̄, t) is assumed to respect the usual entropy conditions connected with the conservative Euler equations. We show that such entropy solutions of the gas dynamics equations satisfy a minimum entropy principle, namely, that the spatial minimum of their specific entropy, ess inf_x s(u(x, t)), is an increasing function of time. This principle equally applies to discrete approximations of the Euler equations such as the Godunov-type and Lax-Friedrichs schemes. Our derivation of this minimum principle makes use of the fact that there is a family of generalized entropy functions connected with the conservative Euler equations.
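
The discrete version of this principle can be checked numerically. Below is a minimal sketch of Lax-Friedrichs steps for the 1D Euler equations, tracking the spatial minimum of the specific entropy s = ln(p/ρ^γ); the Sod-type initial data, grid size, and fixed ratio λ = Δt/Δx = 0.4 (chosen to satisfy the CFL condition here) are illustrative assumptions:

```python
import numpy as np

gamma = 1.4

def flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1.0) * (E - 0.5 * rho * u ** 2)
    return np.array([mom, mom * u + p, u * (E + p)])

def specific_entropy(U):
    rho, mom, E = U
    p = (gamma - 1.0) * (E - 0.5 * mom ** 2 / rho)
    return np.log(p / rho ** gamma)

def lax_friedrichs_step(U, lam):
    Up = np.pad(U, ((0, 0), (1, 1)), mode='edge')   # outflow ghost cells
    F = flux(Up)
    return 0.5 * (Up[:, 2:] + Up[:, :-2]) - 0.5 * lam * (F[:, 2:] - F[:, :-2])

# Sod-type initial data: min specific entropy is 0 (left state)
N = 200
left = np.arange(N) < N // 2
rho = np.where(left, 1.0, 0.125)
p = np.where(left, 1.0, 0.1)
U = np.array([rho, np.zeros(N), p / (gamma - 1.0)])

lam = 0.4
mins = [specific_entropy(U).min()]
for _ in range(50):
    U = lax_friedrichs_step(U, lam)
    mins.append(specific_entropy(U).min())
# per the minimum entropy principle, mins should be non-decreasing
```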

  4. Increased temperature and entropy production in cancer: the role of anti-inflammatory drugs.

    PubMed

    Pitt, Michael A

    2015-02-01

    Some cancers have been shown to have a higher temperature than surrounding normal tissue. This higher temperature is due to heat generated internally in the cancer. The higher temperature of cancer (compared to surrounding tissue) enables a thermodynamic analysis to be carried out. Here I show that there is increased entropy production in cancer compared with surrounding tissue. This is termed excess entropy production. The excess entropy production is expressed in terms of heat flow from the cancer to surrounding tissue and enzymic reactions in the cancer and surrounding tissue. The excess entropy production in cancer drives it away from the stationary state that is characterised by minimum entropy production. Treatments that reduce inflammation (and therefore temperature) should drive a cancer towards the stationary state. Anti-inflammatory agents, such as aspirin, other non-steroidal anti-inflammatory drugs, corticosteroids and also thyroxine analogues have been shown (using various criteria) to reduce the progress of cancer.

  5. Application of an improved minimum entropy deconvolution method for railway rolling element bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Cheng, Yao; Zhou, Ning; Zhang, Weihua; Wang, Zhiwei

    2018-07-01

    Minimum entropy deconvolution is a widely used tool in machinery fault diagnosis because it enhances the impulsive component of the signal. The filter coefficients that largely determine the performance of minimum entropy deconvolution are calculated by an iterative procedure. This paper proposes an improved deconvolution method for the fault detection of rolling element bearings. The proposed method solves for the filter coefficients with the standard particle swarm optimization algorithm, assisted by a generalized spherical coordinate transformation. In enhancing the impulses for fault diagnosis of rolling element bearings, the proposed method outperformed the classical minimum entropy deconvolution method. The method was validated on simulated and experimental signals from railway bearings. In both the simulation and experimental studies, it delivered better deconvolution performance than the classical minimum entropy deconvolution method, especially at low signal-to-noise ratio.
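
The classical iterative procedure the paper improves on (Wiggins-style MED, which maximizes the kurtosis of the filtered output by repeatedly solving a Toeplitz system) can be sketched as follows; the filter length, iteration count, and synthetic spiky signal are illustrative assumptions, and this is not the paper's PSO variant:

```python
import numpy as np

def kurtosis(z):
    return np.mean(z ** 4) / np.mean(z ** 2) ** 2

def med(x, L=30, iters=10):
    """Wiggins-style minimum entropy deconvolution (kurtosis maximization)."""
    N = len(x)
    # autocorrelation of x at lags 0..L-1 forms the Toeplitz system matrix R
    r = np.array([np.dot(x[:N - l], x[l:]) for l in range(L)])
    r[0] *= 1.0 + 1e-6                      # small ridge for numerical stability
    R = r[np.abs(np.arange(L)[:, None] - np.arange(L)[None, :])]
    f = np.zeros(L)
    f[L // 2] = 1.0                          # start from a delayed delta filter
    for _ in range(iters):
        y = np.convolve(x, f)                # length N + L - 1
        y3 = y ** 3
        b = np.array([np.dot(y3[l:l + N], x) for l in range(L)])
        f = np.linalg.solve(R, b)
        f /= np.linalg.norm(f)
    return f, np.convolve(x, f)

# synthetic test: sparse spikes smeared by a decaying wavelet
rng = np.random.default_rng(0)
spikes = np.zeros(600)
idx = rng.choice(600, size=12, replace=False)
spikes[idx] = rng.choice([-1.0, 1.0], size=12)
wavelet = np.exp(-np.arange(30) / 6.0)
x = np.convolve(spikes, wavelet)[:600]
f, y = med(x)                                # y should be spikier than x
```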

  6. Towards the minimization of thermodynamic irreversibility in an electrically actuated microflow of a viscoelastic fluid under electrical double layer phenomenon

    NASA Astrophysics Data System (ADS)

    Sarma, Rajkumar; Jain, Manish; Mondal, Pranab Kumar

    2017-10-01

    We discuss entropy generation minimization for the electro-osmotic flow of a viscoelastic fluid through a parallel plate microchannel under the combined influences of interfacial slip and conjugate heat transport. We use the simplified Phan-Thien-Tanner model to describe the rheological behavior of the viscoelastic fluid. Using Navier's slip law and thermal boundary conditions of the third kind, we solve the transport equations analytically and evaluate the global entropy generation rate of the system. We examine the influential role of the following parameters on the entropy generation rate of the system: the viscoelastic parameter (εDe²), the Debye-Hückel parameter (κ̄), the channel wall thickness (δ), the thermal conductivity of the wall (γ), the Biot number (Bi), the Peclet number (Pe), and the axial temperature gradient (B). The investigation finally establishes the optimum values of these parameters, leading to minimum entropy generation of the system. We believe the results of this analysis could help optimize the second-law performance of microscale thermal management devices, including micro heat exchangers, micro-reactors, and micro heat pipes.

  7. Minimum entropy deconvolution and blind equalisation

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Mulligan, J. J.

    1992-01-01

    Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.

  8. Application of genetic algorithms in nonlinear heat conduction problems.

    PubMed

    Kadri, Muhammad Bilal; Khan, Waqar A

    2014-01-01

    Genetic algorithms are employed to optimize the dimensionless temperature in nonlinear heat conduction problems. Three common geometries are selected for the analysis, and the concept of minimum entropy generation is used to determine the optimum temperatures under the same constraints. The thermal conductivity is assumed to vary linearly with temperature, while internal heat generation is assumed to be uniform. The dimensionless governing equations are obtained for each selected geometry and the dimensionless temperature distributions are obtained using MATLAB. It is observed that the GA gives the minimum dimensionless temperature in each selected geometry.
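
A real-coded GA of the kind employed here can be sketched generically; the toy quadratic objective below merely stands in for the dimensionless-temperature objective, and the population size, mutation rate, and bounds are illustrative assumptions:

```python
import numpy as np

def ga_minimize(f, dim, bounds=(0.0, 1.0), pop_size=60, gens=150, seed=1):
    """Minimal real-coded GA: tournament selection, uniform crossover,
    Gaussian mutation, and one-elite preservation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(gens):
        # binary tournament selection
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])
        # uniform crossover with a reversed-order partner
        mask = rng.random((pop_size, dim)) < 0.5
        children = np.where(mask, parents, parents[::-1])
        # Gaussian mutation on ~20% of genes
        children = children + rng.normal(0, 0.05, children.shape) * (
            rng.random(children.shape) < 0.2)
        children = np.clip(children, lo, hi)
        child_fit = np.apply_along_axis(f, 1, children)
        # elitism: best parent replaces worst child
        best, worst = np.argmin(fit), np.argmax(child_fit)
        children[worst], child_fit[worst] = pop[best], fit[best]
        pop, fit = children, child_fit
    b = np.argmin(fit)
    return pop[b], fit[b]

# toy objective standing in for the dimensionless temperature
best_x, best_f = ga_minimize(lambda t: np.sum((t - 0.37) ** 2), dim=3)
```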

  9. Shock wave induced vaporization of porous solids

    NASA Astrophysics Data System (ADS)

    Shen, Andy H.; Ahrens, Thomas J.; O'Keefe, John D.

    2003-05-01

    Strong shock waves generated by hypervelocity impact can induce vaporization in solid materials. To pursue knowledge of the chemical species in the shock-induced vapors, one needs to design experiments that will drive the system to such thermodynamic states that sufficient vapor can be generated for investigation. It is common to use porous media to reach high entropy, vaporized states in impact experiments. We extended calculations by Ahrens [J. Appl. Phys. 43, 2443 (1972)] and Ahrens and O'Keefe [The Moon 4, 214 (1972)] to higher distentions (up to five) and improved their method with a different impedance match calculation scheme and augmented their model with recent thermodynamic and Hugoniot data of metals, minerals, and polymers. Although we reconfirmed the competing effects reported in the previous studies when impacting materials of increasing distention, namely (1) an increase in entropy production and (2) a decrease in impedance match, our calculations did not exhibit an optimal entropy-generating distention. For different materials, very different impact velocities are needed to initiate vaporization. For aluminum at distention (m) < 2.2, a minimum impact velocity of 2.7 km/s is required using a tungsten projectile. For ionic solids such as NaCl at distention < 2.2, 2.5 km/s is needed. For carbonate and sulfate minerals, the minimum impact velocities are much lower, ranging from less than 1 to 1.5 km/s.

  10. Low Streamflow Forecasting Using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different priors, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low-streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low-streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared with predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
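
Burg's MESA, used above as the comparison method, fits an autoregressive model by minimizing forward and backward prediction errors with a Levinson-style recursion; a minimal sketch (the AR(1) test series and model order are illustrative assumptions):

```python
import numpy as np

def burg(x, order):
    """Burg's maximum entropy (autoregressive) estimation.
    Returns AR polynomial a (a[0] = 1) and the prediction error power E."""
    f = x.astype(float).copy()   # forward prediction errors
    b = x.astype(float).copy()   # backward prediction errors
    a = np.array([1.0])
    E = np.mean(x ** 2)
    for _ in range(order):
        fk, bk = f[1:], b[:-1]
        # reflection coefficient minimizing sum of squared errors
        k = -2.0 * np.dot(fk, bk) / (np.dot(fk, fk) + np.dot(bk, bk))
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]                  # Levinson update
        f, b = fk + k * bk, bk + k * fk
        E *= 1.0 - k ** 2
    return a, E

# AR(1) test series: x[n] = 0.8 x[n-1] + e[n]
rng = np.random.default_rng(7)
e = rng.normal(size=5000)
x = np.empty(5000)
x[0] = e[0]
for n in range(1, 5000):
    x[n] = 0.8 * x[n - 1] + e[n]
a, E = burg(x, 1)   # a[1] should be close to -0.8
```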

  11. The constructal law of design and evolution in nature

    PubMed Central

    Bejan, Adrian; Lorente, Sylvie

    2010-01-01

    Constructal theory is the view that (i) the generation of images of design (pattern, rhythm) in nature is a phenomenon of physics and (ii) this phenomenon is covered by a principle (the constructal law): ‘for a finite-size flow system to persist in time (to live) it must evolve such that it provides greater and greater access to the currents that flow through it’. This law is about the necessity of design to occur, and about the time direction of the phenomenon: the tape of the design evolution ‘movie’ runs such that existing configurations are replaced by globally easier flowing configurations. The constructal law has two useful sides: the prediction of natural phenomena and the strategic engineering of novel architectures, based on the constructal law, i.e. not by mimicking nature. We show that the emergence of scaling laws in inanimate (geophysical) flow systems is the same phenomenon as the emergence of allometric laws in animate (biological) flow systems. Examples are lung design, animal locomotion, vegetation, river basins, turbulent flow structure, self-lubrication and natural multi-scale porous media. This article outlines the place of the constructal law as a self-standing law in physics, which covers all the ad hoc (and contradictory) statements of optimality such as minimum entropy generation, maximum entropy generation, minimum flow resistance, maximum flow resistance, minimum time, minimum weight, uniform maximum stresses and characteristic organ sizes. Nature is configured to flow and move as a conglomerate of ‘engine and brake’ designs. PMID:20368252

  13. Uncertainty relations with quantum memory for the Wehrl entropy

    NASA Astrophysics Data System (ADS)

    De Palma, Giacomo

    2018-03-01

    We prove two new fundamental uncertainty relations with quantum memory for the Wehrl entropy. The first relation applies to the bipartite memory scenario. It determines the minimum conditional Wehrl entropy among all the quantum states with a given conditional von Neumann entropy and proves that this minimum is asymptotically achieved by a suitable sequence of quantum Gaussian states. The second relation applies to the tripartite memory scenario. It determines the minimum of the sum of the Wehrl entropy of a quantum state conditioned on the first memory quantum system with the Wehrl entropy of the same state conditioned on the second memory quantum system and proves that also this minimum is asymptotically achieved by a suitable sequence of quantum Gaussian states. The Wehrl entropy of a quantum state is the Shannon differential entropy of the outcome of a heterodyne measurement performed on the state. The heterodyne measurement is one of the main measurements in quantum optics and lies at the basis of one of the most promising protocols for quantum key distribution. These fundamental entropic uncertainty relations will be a valuable tool in quantum information and will, for example, find application in security proofs of quantum key distribution protocols in the asymptotic regime and in entanglement witnessing in quantum optics.
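
For reference, the Wehrl entropy discussed here is the Shannon differential entropy of the Husimi function obtained from the heterodyne measurement; in standard notation (with |α⟩ the coherent states):

```latex
S_W(\rho) \;=\; -\int \frac{d^2\alpha}{\pi}\, Q(\alpha)\,\ln Q(\alpha),
\qquad
Q(\alpha) \;=\; \langle \alpha |\, \rho \,| \alpha \rangle .
```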

  14. Maximum Relative Entropy of Coherence: An Operational Coherence Measure.

    PubMed

    Bu, Kaifeng; Singh, Uttam; Fei, Shao-Ming; Pati, Arun Kumar; Wu, Junde

    2017-10-13

    The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on maximum relative entropy. We prove that the maximum relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of the maximum relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the maximum relative entropy of coherence. By introducing a suitable smooth maximum relative entropy of coherence, we prove that the smooth maximum relative entropy of coherence provides a lower bound of one-shot coherence cost, and the maximum relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similar to the maximum relative entropy of coherence, the minimum relative entropy of coherence has also been investigated. We show that the minimum relative entropy of coherence provides an upper bound of one-shot coherence distillation, and in the asymptotic limit the minimum relative entropy of coherence is equivalent to the relative entropy of coherence.
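
In the standard notation of the resource theory of coherence, the quantifier introduced here minimizes the max-relative entropy over the set 𝓘 of incoherent states:

```latex
C_{\max}(\rho) \;=\; \min_{\sigma \in \mathcal{I}} D_{\max}(\rho \,\|\, \sigma),
\qquad
D_{\max}(\rho \,\|\, \sigma) \;=\; \log \min \{\, \lambda : \rho \le \lambda \sigma \,\}.
```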

  15. Irreversibility and entropy production in transport phenomena, III—Principle of minimum integrated entropy production including nonlinear responses

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-01-01

    A new variational principle of steady states is found by introducing an integrated type of energy dissipation (or entropy production) instead of the instantaneous energy dissipation. This new principle is valid in both linear and nonlinear transport phenomena. Prigogine’s dream has now been realized by this new general principle of minimum “integrated” entropy production (or energy dissipation). This new principle does not contradict the Onsager-Prigogine principle of minimum instantaneous entropy production in the linear regime, but it is conceptually different from the latter, which does not hold in the nonlinear regime. Applications of this theory to electric conduction, heat conduction, particle diffusion and chemical reactions are presented. The irreversibility (or positive entropy production) and the long-time-tail problem in Kubo’s formula are also discussed in the Introduction and the last section. This constitutes a complementary explanation of our theory of entropy production given in the previous papers (M. Suzuki, Physica A 390 (2011) 1904 and M. Suzuki, Physica A 391 (2012) 1074) and provided the motivation for the present investigation of the variational principle.

  16. A survey of the role of thermodynamic stability in viscous flow

    NASA Technical Reports Server (NTRS)

    Horne, W. C.; Smith, C. A.; Karamcheti, K.

    1991-01-01

    The stability of near-equilibrium states has been studied as a branch of the general field of nonequilibrium thermodynamics. By treating steady viscous flow as an open thermodynamic system, nonequilibrium principles such as the condition of minimum entropy-production rate for steady, near-equilibrium processes can be used to generate flow distributions from variational analyses. Examples considered in this paper are steady heat conduction, channel flow, and unconstrained three-dimensional flow. The entropy-production-rate condition has also been used for hydrodynamic stability criteria, and calculations of the stability of a laminar wall jet support this interpretation.

  17. Maximum and minimum entropy states yielding local continuity bounds

    NASA Astrophysics Data System (ADS)

    Hanson, Eric P.; Datta, Nilanjana

    2018-04-01

    Given an arbitrary quantum state σ, we obtain an explicit construction of a state ρ*_ε(σ) [respectively, ρ_{*,ε}(σ)] which has the maximum (respectively, minimum) entropy among all states which lie in a specified neighborhood (ε-ball) of σ. Computing the entropy of these states leads to a local strengthening of the continuity bound of the von Neumann entropy, i.e., the Audenaert-Fannes inequality. Our bound is local in the sense that it depends on the spectrum of σ. The states ρ*_ε(σ) and ρ_{*,ε}(σ) depend only on the geometry of the ε-ball and are in fact optimizers for a larger class of entropies. These include the Rényi entropy and the min- and max-entropies, providing explicit formulas for certain smoothed quantities. This allows us to obtain local continuity bounds for these quantities as well. In obtaining this bound, we first derive a more general result which may be of independent interest, namely, a necessary and sufficient condition under which a state maximizes a concave and Gâteaux-differentiable function in an ε-ball around a given state σ. Examples of such a function include the von Neumann entropy and the conditional entropy of bipartite states. Our proofs employ tools from the theory of convex optimization under non-differentiable constraints, in particular Fermat's rule, and majorization theory.

  18. Entropy-Based Registration of Point Clouds Using Terrestrial Laser Scanning and Smartphone GPS.

    PubMed

    Chen, Maolin; Wang, Siying; Wang, Mingwei; Wan, Youchuan; He, Peipei

    2017-01-20

    Automatic registration of terrestrial laser scanning point clouds is a crucial but unresolved topic of great interest in many domains. This study combines a terrestrial laser scanner with a smartphone for the coarse registration of leveled point clouds with small roll and pitch angles and height differences, a novel sensor combination mode for terrestrial laser scanning. The approximate distance between two neighboring scan positions is first calculated from smartphone GPS coordinates. Then, 2D distribution entropy is used to measure the distribution coherence between the two scans and to search for the optimal initial transformation parameters. To this end, we propose a method called Iterative Minimum Entropy (IME) to correct the initial transformation parameters based on two criteria: the difference between the average and minimum entropy, and the deviation from the minimum entropy to the expected entropy. Finally, the presented method is evaluated on two data sets containing tens of millions of points, covering panoramic and non-panoramic, vegetation-dominated and building-dominated cases, and achieves high accuracy and efficiency.
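
The core idea, scoring candidate transformations by the 2D distribution entropy of the overlaid clouds and keeping the minimum, can be sketched as follows; the grid clouds, histogram binning, and candidate shifts below are illustrative assumptions, not the paper's IME procedure:

```python
import numpy as np

def overlay_entropy(A, B, shift, bins=20):
    """Shannon entropy (bits) of the 2D histogram of A and shifted B overlaid.
    A correct alignment concentrates the merged distribution, lowering entropy."""
    merged = np.vstack([A, B + shift])
    H, _, _ = np.histogram2d(merged[:, 0], merged[:, 1], bins=bins,
                             range=[[-1, 11], [-1, 11]])
    p = H.ravel() / H.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# toy clouds: B is A translated by (2, 0), so the correct correction is (-2, 0)
g = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
A = g
B = g + np.array([2.0, 0.0])
candidates = [np.array([dx, 0.0]) for dx in np.arange(-4.0, 4.5, 0.5)]
best = min(candidates, key=lambda s: overlay_entropy(A, B, s))
```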

  19. Unbiased All-Optical Random-Number Generator

    NASA Astrophysics Data System (ADS)

    Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja

    2017-10-01

    The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.
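
The entropy estimation mentioned here can be illustrated by estimating the marginal and lag-one conditional entropy of a bit stream; for an unbiased, memoryless source both should be close to 1 bit per bit. The pseudorandom source below is an illustrative stand-in for the optical generator:

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, 200_000)   # stand-in for the generator's bit stream

# marginal entropy H(X)
H1 = shannon_entropy(np.bincount(bits, minlength=2) / len(bits))

# conditional entropy H(X_n | X_{n-1}) = H(X_{n-1}, X_n) - H(X_{n-1})
pairs = bits[:-1] * 2 + bits[1:]
H_pair = shannon_entropy(np.bincount(pairs, minlength=4) / len(pairs))
H_prev = shannon_entropy(np.bincount(bits[:-1], minlength=2) / len(pairs))
Hcond = H_pair - H_prev
```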

  20. New paradigm for task switching strategies while performing multiple tasks: entropy and symbolic dynamics analysis of voluntary patterns.

    PubMed

    Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten

    2012-10-01

    It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs a cost in the response time to complete the next task. Conditions are also known that exaggerate or lessen the switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks, producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes in fatigue.
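
The two entropy metrics used in the analysis can be sketched for symbol sequences of task choices; a rigid round-robin regime should show low topological entropy, a random switching regime a high one. The sequences below are illustrative, not the study's data:

```python
import numpy as np
from collections import Counter

def shannon(seq, n=1):
    """Shannon entropy (bits) of the n-gram distribution of a symbol sequence."""
    grams = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = np.array(list(Counter(grams).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def topological(seq, n):
    """Topological entropy estimate: log of the distinct n-gram count, per symbol."""
    grams = {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}
    return np.log2(len(grams)) / n

rng = np.random.default_rng(3)
random_tasks = list(rng.integers(0, 7, 490))   # 7 tasks, random switching regime
periodic_tasks = list(range(7)) * 70           # rigid round-robin regime
```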

  1. The Conditional Entropy Power Inequality for Bosonic Quantum Systems

    NASA Astrophysics Data System (ADS)

    De Palma, Giacomo; Trevisan, Dario

    2018-06-01

    We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under the heat semigroup evolution. The beam-splitter and the squeezing are the central elements of quantum optics, and can model the attenuation, the amplification and the noise of electromagnetic signals. This conditional Entropy Power Inequality will have a strong impact in quantum information and quantum cryptography. Among its many possible applications there is the proof of a new uncertainty relation for the conditional Wehrl entropy.
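
Schematically, for n-mode Gaussian quantum systems, a beam-splitter of transmissivity λ with inputs A, B, output C, and memory M, the inequality proved here takes the form (notation assumed from the paper):

```latex
\exp\!\left(\frac{S(C\,|\,M)}{n}\right)
\;\ge\;
\lambda\,\exp\!\left(\frac{S(A\,|\,M)}{n}\right)
\;+\;
(1-\lambda)\,\exp\!\left(\frac{S(B\,|\,M)}{n}\right).
```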

  3. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

    Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess its state and performance. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulation tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
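
The second-law bookkeeping described here rests on the Gouy-Stodola relation: the work lost to an irreversibility equals T0 times the entropy generated. A minimal sketch for a single heat-transfer irreversibility (the numbers are illustrative):

```python
def entropy_generation(q, t_hot, t_cold):
    """Entropy generated (W/K) by heat flow q (W) across a finite temperature gap."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

def lost_work(q, t_hot, t_cold, t0):
    """Gouy-Stodola theorem: irreversibility (W) = T0 * S_gen."""
    return t0 * entropy_generation(q, t_hot, t_cold)

# 1 kW flowing from 600 K to 300 K, dead state at 300 K
s_gen = entropy_generation(1000.0, 600.0, 300.0)   # 5/3 W/K
w_lost = lost_work(1000.0, 600.0, 300.0, t0=300.0) # 500 W of potential work destroyed
```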

  4. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  5. Joint Entropy for Space and Spatial Frequency Domains Estimated from Psychometric Functions of Achromatic Discrimination

    PubMed Central

    Silveira, Vladímir de Aquino; Souza, Givago da Silva; Gomes, Bruno Duarte; Rodrigues, Anderson Raiol; Silveira, Luiz Carlos de Lima

    2014-01-01

    We used psychometric functions to estimate the joint entropy for space discrimination and spatial frequency discrimination. Space discrimination was taken as discrimination of spatial extent. Seven subjects were tested. Gábor functions comprising unidimensional sinusoidal gratings (0.4, 2, and 10 cpd) and bidimensional Gaussian envelopes (1°) were used as reference stimuli. The experiment comprised the comparison between reference and test stimuli that differed in grating's spatial frequency or envelope's standard deviation. We tested 21 different envelope's standard deviations around the reference standard deviation to study spatial extent discrimination and 19 different grating's spatial frequencies around the reference spatial frequency to study spatial frequency discrimination. Two series of psychometric functions were obtained for 2%, 5%, 10%, and 100% stimulus contrast. The psychometric function data points for spatial extent discrimination or spatial frequency discrimination were fitted with Gaussian functions using the least square method, and the spatial extent and spatial frequency entropies were estimated from the standard deviation of these Gaussian functions. Then, joint entropy was obtained by multiplying the square root of space extent entropy times the spatial frequency entropy. We compared our results to the theoretical minimum for unidimensional Gábor functions, 1/4π or 0.0796. At low and intermediate spatial frequencies and high contrasts, joint entropy reached levels below the theoretical minimum, suggesting non-linear interactions between two or more visual mechanisms. We concluded that non-linear interactions of visual pathways, such as the M and P pathways, could explain joint entropy values below the theoretical minimum at low and intermediate spatial frequencies and high contrasts.
These non-linear interactions might be at work at intermediate and high contrasts at all spatial frequencies, since there was a substantial decrease in joint entropy for these stimulus conditions when contrast was raised. PMID:24466158
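
The final computation step described above can be sketched numerically. This is a minimal illustration with noise-free synthetic psychometric data, not the authors' code: it assumes the entropy on each dimension is taken proportional to the fitted Gaussian standard deviation, and that the joint entropy is the square root of the product of the two entropies.

```python
import numpy as np

def fit_gaussian_sigma(x, y, grid):
    """Least-squares fit of a unit-amplitude Gaussian centred on the
    reference (deviation 0); returns the grid sigma minimizing the error."""
    errs = [np.sum((y - np.exp(-x**2 / (2 * s**2)))**2) for s in grid]
    return grid[int(np.argmin(errs))]

# Hypothetical psychometric data: discrimination performance vs. deviation
# of the test stimulus from the reference (arbitrary units).
x = np.linspace(-3, 3, 21)
y_space = np.exp(-x**2 / (2 * 0.8**2))   # spatial-extent discrimination
y_freq = np.exp(-x**2 / (2 * 1.4**2))    # spatial-frequency discrimination

grid = np.linspace(0.1, 3.0, 291)
e_space = fit_gaussian_sigma(x, y_space, grid)  # entropy ~ fitted sigma
e_freq = fit_gaussian_sigma(x, y_freq, grid)

joint = np.sqrt(e_space * e_freq)        # assumed joint-entropy form
print(joint, 1 / (4 * np.pi))            # compare against the Gabor bound
```

With real data the fit would run over noisy response proportions, and the entropies would need to be expressed in the same units as the 1/4π bound before the comparison is meaningful.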

  7. [Specific features in realization of the principle of minimum energy dissipation during individual development].

    PubMed

    Zotin, A A

    2012-01-01

    Realization of the principle of minimum energy dissipation (Prigogine's theorem) during individual development has been analyzed. This analysis has suggested the following reformulation of this principle for living objects: when environmental conditions are constant, the living system evolves to a current steady state in such a way that the difference between entropy production and entropy flow (psi(u) function) is positive and constantly decreases near the steady state, approaching zero. In turn, the current steady state tends to a final steady state in such a way that the difference between the specific entropy productions in an organism and its environment tends to be minimal. In general, individual development completely agrees with the law of entropy increase (second law of thermodynamics).

  8. Entropy considerations applied to shock unsteadiness in hypersonic inlets

    NASA Astrophysics Data System (ADS)

    Bussey, Gillian Mary Harding

    The stability of curved or rectangular shocks in hypersonic inlets in response to flow perturbations can be determined analytically from the principle of minimum entropy. Unsteady shock wave motion can have a significant effect on the flow in a hypersonic inlet or combustor. According to the principle of minimum entropy, a stable thermodynamic state is one with the lowest entropy gain. A model based on piston theory and its limits has been developed for applying the principle of minimum entropy to quasi-steady flow. Relations are derived for analyzing the time-averaged entropy gain flux across a shock for quasi-steady perturbations in atmospheric conditions and angle, treated as a perturbation in entropy gain flux from the steady state. Initial results from sweeping a wedge at Mach 10 through several degrees in AEDC's Tunnel 9 indicate that the bow shock becomes unsteady near the predicted normal Mach number. Several curved shocks of varying curvature are compared to a straight shock with the same mean normal Mach number, pressure ratio, or temperature ratio. The present work provides analysis and guidelines for designing an inlet robust to off-design flight or to the perturbations in flow conditions an inlet is likely to face. It also suggests that inlets with curved shocks are less robust to off-design flight than those with straight shocks, such as rectangular inlets. Relations for evaluating entropy perturbations for highly unsteady flow across a shock, and limits on their use, were also developed. The normal Mach number at which a shock can be stable to high-frequency upstream perturbations increases as the speed of the shock motion increases and decreases slightly as the perturbation size increases. The present work advances the principle of minimum entropy by providing additional validity for applying the theory to time-varying flows and to shocks, specifically those in inlets.
While this analytic tool is applied in the present work for evaluating the stability of shocks in hypersonic inlets, it can be used for an arbitrary application with a shock.

  9. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor’s 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
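
A toy version of the scale scan can illustrate the idea. This is a schematic reading of the method, not the authors' implementation: it takes the "entropy density" at a scale to be the Shannon entropy of the binned increments at that lag, and returns the lag at which it is minimal.

```python
import numpy as np

def increment_entropy(inc, edges):
    """Shannon entropy (bits) of increments binned on fixed edges."""
    sym = np.clip(np.digitize(inc, edges) - 1, 0, len(edges) - 2)
    p = np.bincount(sym, minlength=len(edges) - 1) / len(sym)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def structure_scale(series, scales, n_bins=15):
    """Scale whose increments carry minimum entropy, i.e. the scale at
    which the series is most predictable ("pattern revealed most")."""
    all_inc = np.concatenate([series[s:] - series[:-s] for s in scales])
    edges = np.linspace(all_inc.min(), all_inc.max(), n_bins + 1)
    ent = {s: increment_entropy(series[s:] - series[:-s], edges)
           for s in scales}
    return min(ent, key=ent.get)

# A series with a built-in 20-step cycle: increments at lag 20 nearly
# vanish, so the entropy is minimized exactly at the structure scale.
t = np.arange(2000)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * t / 20) + 0.01 * rng.standard_normal(t.size)
scale = structure_scale(x, range(2, 31))
print(scale)
```

Using one common set of bin edges across all scales is deliberate: it makes the entropies comparable, so a scale at which the increments collapse into a few bins stands out as the minimum.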

  10. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

    DTIC Science & Technology

    2017-08-21

    distributions, and we discuss some applications for engineered and biological information transmission systems. Keywords: information theory; minimum...of its interpretation as a measure of the amount of information communicable by a neural system to groups of downstream neurons. Previous authors...of the maximum entropy approach. Our results also have relevance for engineered information transmission systems. We show that empirically measured

  11. Entropy generation of nanofluid flow in a microchannel heat sink

    NASA Astrophysics Data System (ADS)

    Manay, Eyuphan; Akyürek, Eda Feyza; Sahin, Bayram

    2018-06-01

    The present study investigates the effects of nano-sized TiO2 particles suspended in the base fluid on the entropy generation rate in a microchannel heat sink. Pure water was chosen as the base fluid, and TiO2 particles were suspended in it at five particle volume fractions: 0.25%, 0.5%, 1.0%, 1.5% and 2.0%. Under laminar, steady-state flow and constant-heat-flux boundary conditions, the thermal, frictional and total entropy generation rates and the entropy generation number ratios of the nanofluids were experimentally analyzed in microchannel flow for channel heights of 200 μm, 300 μm, 400 μm and 500 μm. The frictional and total entropy generation rates increased, while the thermal entropy generation rate decreased, with increasing particle volume fraction. In microchannel flows, thermal entropy generation could be neglected because its rate, smaller than 1.10e-07, is a negligible share of the total. Larger channel heights produced higher thermal entropy generation rates: increasing the channel height yielded an increase of 30% to 52% in thermal entropy generation, while decreasing it produced an increase of 66%-98% in frictional entropy generation. Adding TiO2 nanoparticles to the base fluid decreased thermal entropy generation by about 1.8%-32.4% and increased frictional entropy generation by about 3.3%-21.6%.
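
The split between thermal and frictional irreversibility can be estimated with textbook relations. The sketch below is an order-of-magnitude illustration for a plain-water circular channel using standard laminar correlations (Hagen-Poiseuille pressure drop, Nu = 4.36 for constant heat flux); the property values and operating point are assumptions, not the experimental conditions of this study.

```python
import numpy as np

# Assumed water properties (SI units) and a hypothetical operating point.
mu, rho, k, T = 8.9e-4, 997.0, 0.61, 300.0   # viscosity, density, conductivity, K
L, D = 0.01, 90e-6                            # channel length and diameter, m
Q_v = 1e-8                                    # volumetric flow rate, m^3/s
q = 0.05                                      # heat input per channel, W

dp = 128 * mu * L * Q_v / (np.pi * D**4)      # Hagen-Poiseuille pressure drop
h = 4.36 * k / D                              # laminar Nu = 4.36, constant flux
dT = q / (h * np.pi * D * L)                  # wall-to-fluid temperature rise
S_thermal = q * dT / T**2                     # heat-transfer irreversibility, W/K
S_friction = Q_v * dp / T                     # fluid-friction irreversibility, W/K
print(S_thermal, S_friction)
```

At this operating point the frictional term dominates, mirroring the trend reported above: shrinking the channel drives the pressure-drop (D^-4) term up much faster than it changes the convective term.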

  12. Comment on: The cancer Warburg effect may be a testable example of the minimum entropy production rate principle.

    PubMed

    Sadeghi Ghuchani, Mostafa

    2018-02-08

    This comment argues against the view that cancer cells produce less entropy than normal cells as stated in a recent paper by Marín and Sabater. The basic principle of estimation of entropy production rate in a living cell is discussed, emphasizing the fact that entropy production depends on both the amount of heat exchange during the metabolism and the entropy difference between products and substrates.

  14. Radiative entropy generation in a gray absorbing, emitting, and scattering planar medium at radiative equilibrium

    NASA Astrophysics Data System (ADS)

    Sadeghi, Pegah; Safavinejad, Ali

    2017-11-01

    Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: both walls at a prescribed temperature, and mixed boundary conditions in which one wall is at a prescribed temperature and the other at a prescribed heat flux. The effects of wall emissivities, optical thickness, single-scattering albedo, and the anisotropic-scattering factor on the entropy generation are carefully investigated. The results reveal that entropy generation in the system arises mainly from irreversible radiative transfer at the wall with the lower temperature. The total entropy generation rate for the system with prescribed wall temperatures increases remarkably as wall emissivity increases; conversely, for the system with mixed boundary conditions, it decreases slightly. Furthermore, as the optical thickness increases, the total entropy generation rate decreases remarkably for the system with prescribed wall temperatures, but increases for the system with mixed boundary conditions. Variation of the single-scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and wall emissivities have a significant effect on the entropy generation in a system at radiative equilibrium. Identifying the parameters that most affect radiative entropy generation provides an opportunity to optimize the design, or to increase overall performance and efficiency, by applying entropy minimization techniques to systems at radiative equilibrium.

  15. Viscous regularization of the full set of nonequilibrium-diffusion Grey Radiation-Hydrodynamic equations

    DOE PAGES

    Delchini, Marc O.; Ragusa, Jean C.; Ferguson, Jim

    2017-02-17

    A viscous regularization technique, based on the local entropy residual, was proposed by Delchini et al. (2015) to stabilize the nonequilibrium-diffusion Grey Radiation-Hydrodynamic equations using an artificial viscosity technique. This viscous regularization is modulated by the local entropy production and is consistent with the entropy minimum principle. However, Delchini et al. (2015) based their work only on the hyperbolic parts of the Grey Radiation-Hydrodynamic equations and thus omitted the relaxation and diffusion terms present in the material energy and radiation energy equations. In this paper, we extend the theoretical grounds for the method and derive an entropy minimum principle for the full set of nonequilibrium-diffusion Grey Radiation-Hydrodynamic equations. This further strengthens the applicability of the entropy viscosity method as a stabilization technique for radiation-hydrodynamic shock simulations. Radiative shock calculations using constant and temperature-dependent opacities are compared against semi-analytical reference solutions, and we present a procedure to perform spatial convergence studies of such simulations.

  16. System Mass Variation and Entropy Generation in 100-kWe Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Reid, Bryan M.

    2004-01-01

    State-of-the-art closed-Brayton-cycle (CBC) space power systems were modeled to study performance trends in a trade space characteristic of interplanetary orbiters. For working-fluid molar masses of 48.6, 39.9, and 11.9 kg/kmol, peak system pressures of 1.38 and 3.0 MPa and compressor pressure ratios ranging from 1.6 to 2.4, total system masses were estimated. System mass increased as peak operating pressure increased for all compressor pressure ratios and molar mass values examined. Minimum mass point comparison between 72 percent He at 1.38 MPa peak and 94 percent He at 3.0 MPa peak showed an increase in system mass of 14 percent. Converter flow loop entropy generation rates were calculated for 1.38 and 3.0 MPa peak pressure cases. Physical system behavior was approximated using a pedigreed NASA Glenn modeling code, Closed Cycle Engine Program (CCEP), which included realistic performance prediction for heat exchangers, radiators and turbomachinery.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giovannetti, Vittorio; Maccone, Lorenzo; Shapiro, Jeffrey H.

    The minimum Renyi and Wehrl output entropies are found for bosonic channels in which the signal photons are either randomly displaced by a Gaussian distribution (classical-noise channel) or coupled to a thermal environment through lossy propagation (thermal-noise channel). It is shown that the Renyi output entropies of integer orders z ≥ 2 and the Wehrl output entropy are minimized when the channel input is a coherent state.

  19. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    NASA Astrophysics Data System (ADS)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of the Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since maximization of the KSE is analytical and in general easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum-mixing-time dynamics. It could be of interest in computer science and statistical physics for computations that use random walks on graphs that can be represented as Markov chains.
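
Both sides of the link are cheap to compute for a small chain. The sketch below is illustrative, not the authors' construction: it evaluates the KSE entropy rate h = -Σ_i π_i Σ_j P_ij log P_ij and the second-largest eigenvalue modulus (SLEM), whose spectral gap 1 − SLEM governs the mixing time, for a slow and a fast two-state chain.

```python
import numpy as np

def kse_and_slem(P):
    """Entropy rate and second-largest eigenvalue modulus of a Markov chain."""
    evals, evecs = np.linalg.eig(P.T)
    i = int(np.argmin(np.abs(evals - 1.0)))
    pi = np.real(evecs[:, i])
    pi /= pi.sum()                             # stationary distribution
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log(P[mask])               # 0 log 0 treated as 0
    h = -np.sum(pi[:, None] * P * logP)        # Kolmogorov-Sinai entropy rate
    slem = np.sort(np.abs(evals))[-2]          # controls the mixing time
    return h, slem

lazy = np.array([[0.9, 0.1], [0.1, 0.9]])      # slow, low-entropy chain
fast = np.array([[0.5, 0.5], [0.5, 0.5]])      # maximum-entropy chain
h1, s1 = kse_and_slem(lazy)
h2, s2 = kse_and_slem(fast)
print(h1, s1, h2, s2)
```

The fast chain attains the maximal entropy rate log 2 and SLEM 0 (it mixes in one step), while the lazy chain has lower entropy rate and SLEM 0.8: higher KSE goes with a smaller SLEM, i.e. faster mixing, which is the correspondence the paper exploits.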

  20. Free Energy in Introductory Physics

    NASA Astrophysics Data System (ADS)

    Prentis, Jeffrey J.; Obsniuk, Michael J.

    2016-02-01

    Energy and entropy are two of the most important concepts in science. For all natural processes where a system exchanges energy with its environment, the energy of the system tends to decrease and the entropy of the system tends to increase. Free energy is the special concept that specifies how to balance the opposing tendencies to minimize energy and maximize entropy. There are many pedagogical articles on energy and entropy. Here we present a simple model to illustrate the concept of free energy and the principle of minimum free energy.
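
The minimum free energy principle can be demonstrated with the simplest possible model, a two-level system: minimizing F = U − TS over the excited-state occupation probability recovers the Boltzmann distribution. A minimal sketch (units with k_B = 1; the level spacing and temperature are arbitrary choices):

```python
import numpy as np

eps, kT = 1.0, 0.5                        # level spacing and temperature
p = np.linspace(1e-6, 1 - 1e-6, 100001)   # excited-state probability
U = p * eps                               # mean energy
S = -(p * np.log(p) + (1 - p) * np.log(1 - p))  # two-state mixing entropy
F = U - kT * S                            # Helmholtz free energy
p_min = p[np.argmin(F)]                   # numerical minimizer of F
p_boltzmann = 1 / (1 + np.exp(eps / kT))  # exact Boltzmann occupation
print(p_min, p_boltzmann)
```

Neither U nor S alone picks the equilibrium: minimizing U would give p = 0 and maximizing S would give p = 1/2. The free energy balances the two tendencies, and its minimum lands exactly on the Boltzmann value.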

  1. Force-Time Entropy of Isometric Impulse.

    PubMed

    Hsieh, Tsung-Yu; Newell, Karl M

    2016-01-01

    The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993). Two experiments in an isometric single-finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that peak force variability increased either with the force level or with a shorter time to peak force, which also reduced timing error variability. The entropy of peak force and the entropy of time to peak force each increased on its respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework, with the joint force-time entropy at a minimum in the middle of the parameter range of discrete impulse production.

  2. Ratio of shear viscosity to entropy density in multifragmentation of Au + Au

    NASA Astrophysics Data System (ADS)

    Zhou, C. L.; Ma, Y. G.; Fang, D. Q.; Li, S. X.; Zhang, G. Q.

    2012-06-01

    The ratio of shear viscosity (η) to entropy density (s) for intermediate-energy heavy-ion collisions has been calculated using the Green-Kubo method in the framework of the quantum molecular dynamics model. The theoretical curve of η/s as a function of incident energy for head-on Au + Au collisions shows that a minimum region of η/s is approached at higher incident energies, where the minimum η/s value is about 7 times the Kovtun-Son-Starinets (KSS) bound (1/4π). We argue that the onset of the minimum η/s region at higher incident energies corresponds to the nuclear liquid-gas phase transition in nuclear multifragmentation.
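
The Green-Kubo route used above obtains a transport coefficient as the time integral of an equilibrium autocorrelation function. The sketch below checks that machinery on a synthetic signal with a known answer, an Ornstein-Uhlenbeck process whose autocorrelation integral is σ²τ; it is a numerical illustration of the estimator, not the QMD calculation of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, tau, sigma, n = 0.01, 1.0, 1.0, 200_000
a = np.exp(-dt / tau)                          # OU decay per step
kick = sigma * np.sqrt(1 - a ** 2) * rng.standard_normal(n)
p = np.empty(n)
p[0] = 0.0
for i in range(1, n):                          # synthetic "stress" signal
    p[i] = a * p[i - 1] + kick[i]

lags = 1000                                    # integrate out to ~10 tau
acf = np.array([np.mean(p[:n - k] * p[k:]) for k in range(lags)])
gk = float((acf.sum() - 0.5 * (acf[0] + acf[-1])) * dt)  # trapezoid rule
print(gk)                                      # theoretical value: sigma^2 * tau = 1
```

The few-percent scatter around the exact value is the usual statistical price of Green-Kubo estimates: the error shrinks only as the square root of the simulated time divided by the correlation time.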

  3. Quantum entropy and uncertainty for two-mode squeezed, coherent and intelligent spin states

    NASA Technical Reports Server (NTRS)

    Aragone, C.; Mundarain, D.

    1993-01-01

    We compute the quantum entropy for monomode and two-mode systems set in squeezed states. Thereafter, the quantum entropy is also calculated for angular momentum algebra when the system is either in a coherent or in an intelligent spin state. These values are compared with the corresponding values of the respective uncertainties. In general, quantum entropies and uncertainties have the same minimum and maximum points. However, for coherent and intelligent spin states, it is found that some minima for the quantum entropy turn out to be uncertainty maxima. We feel that the quantum entropy we use provides the right answer, since it is given in an essentially unique way.

  4. Numerical estimation of the relative entropy of entanglement

    NASA Astrophysics Data System (ADS)

    Zinchenko, Yuriy; Friedland, Shmuel; Gour, Gilad

    2010-11-01

    We propose a practical algorithm for the calculation of the relative entropy of entanglement (REE), defined as the minimum relative entropy between a state and the set of states with positive partial transpose. Our algorithm is based on a practical semidefinite cutting-plane approach. In low dimensions, the implementation of the algorithm in MATLAB provides an estimation of the REE with an absolute error smaller than 10^-3.
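
The objective being minimized can be written down directly. Below is a minimal numpy sketch of the quantum relative entropy S(ρ||σ) = Tr[ρ(log ρ − log σ)], evaluated for an isotropic state against the maximally mixed state; since the maximally mixed state has positive partial transpose, the value is an upper bound on the REE. This is only the objective function, not the cutting-plane solver of the paper.

```python
import numpy as np

def herm_log(M):
    """Matrix logarithm of a positive-definite Hermitian matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.log(w)) @ V.conj().T

def rel_entropy(rho, sigma):
    """Quantum relative entropy S(rho||sigma), in nats."""
    return float(np.real(np.trace(rho @ (herm_log(rho) - herm_log(sigma)))))

# Isotropic state: a Bell state mixed with white noise (full rank, so the
# matrix logarithm is well defined).
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
bell = np.outer(phi, phi)
rho = 0.9 * bell + 0.1 * np.eye(4) / 4
sigma = np.eye(4) / 4                      # maximally mixed, hence PPT
print(rel_entropy(rho, sigma))             # an upper bound on REE(rho)
```

The REE algorithm tightens such upper bounds by searching over the whole PPT set instead of fixing one candidate σ.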

  5. Application of SNODAS and hydrologic models to enhance entropy-based snow monitoring network design

    NASA Astrophysics Data System (ADS)

    Keum, Jongho; Coulibaly, Paulin; Razavi, Tara; Tapsoba, Dominique; Gobena, Adam; Weber, Frank; Pietroniro, Alain

    2018-06-01

    Snow has a unique characteristic in the water cycle: snow falls throughout the winter season, but the discharge from snowmelt is typically delayed until the melting period and occurs over a relatively short time. Therefore, reliable observations from an optimal snow monitoring network are necessary for efficient management of snowmelt water for flood prevention and hydropower generation. The Dual Entropy and Multiobjective Optimization method is applied to design snow monitoring networks in the La Grande River Basin in Québec and the Columbia River Basin in British Columbia. While the networks are optimized to carry the maximum amount of information with minimum redundancy based on entropy concepts, this study extends traditional entropy applications to hydrometric network design with several improvements. First, several data quantization cases and their effects on the snow network design problem were explored. Second, the applicability of the Snow Data Assimilation System (SNODAS) products as synthetic datasets for potential stations was demonstrated in the design of the snow monitoring network of the Columbia River Basin. Third, beyond finding the Pareto-optimal networks from entropy with multi-objective optimization, the networks obtained for the La Grande River Basin were further evaluated by applying three hydrologic models. The calibrated hydrologic models simulated discharges using the updated snow water equivalent data from the Pareto-optimal networks, and the model performances for high flows were compared to determine the best optimal network for enhanced spring runoff forecasting.
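
The information-theoretic core of such a design can be shown in a few lines: discretize the station records, then prefer station sets with the largest joint entropy (most information, least redundancy). This is a schematic toy with synthetic stations, not the Dual Entropy and Multiobjective Optimization machinery used in the study.

```python
import numpy as np
from itertools import combinations

def entropy_bits(sym):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    _, counts = np.unique(sym, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(a, b):
    """Joint entropy of two quantized records, via paired symbols."""
    return entropy_bits([f"{u},{v}" for u, v in zip(a, b)])

rng = np.random.default_rng(3)
a = rng.integers(0, 4, 500)                # quantized SWE record, station A
stations = {"A": a, "B": a.copy(), "C": rng.integers(0, 4, 500)}
# Station B duplicates A, so any pair containing C carries more information.
best = max(combinations(stations, 2),
           key=lambda pr: joint_entropy(stations[pr[0]], stations[pr[1]]))
print(best)
```

The quantization step matters, which is exactly why the study examines several quantization cases: coarser bins lower all entropies and can change which stations look redundant.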

  6. Recurrence plots of discrete-time Gaussian stochastic processes

    NASA Astrophysics Data System (ADS)

    Ramdani, Sofiane; Bouchara, Frédéric; Lagarde, Julien; Lesne, Annick

    2016-09-01

    We investigate the statistical properties of recurrence plots (RPs) of data generated by discrete-time stationary Gaussian random processes. We analytically derive the theoretical values of the probabilities of occurrence of recurrence points and of consecutive recurrence points forming diagonals in the RP, with an embedding dimension equal to 1. These results allow us to obtain theoretical values of three measures: (i) the recurrence rate (REC), (ii) the percent determinism (DET), and (iii) an RP-based estimation of the ε-entropy κ(ε) in the sense of correlation entropy. We apply these results to two Gaussian processes, namely first-order autoregressive processes and fractional Gaussian noise. For these processes, we simulate a number of realizations and compare the RP-based estimations of the three selected measures to their theoretical values. These comparisons provide useful information on the quality of the estimations, such as the minimum required data length and the threshold radius used to construct the RP.

  7. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    NASA Astrophysics Data System (ADS)

    Silva, Carlos; Annamalai, Kalyan

    2008-06-01

    The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using databases from the U.S. Food and Nutrition Board (FNB) and the Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as functions of age, weight and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts lifespans of 73.78 and 81.61 years for the average U.S. male and female, respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). The analysis of different activity levels shows that entropy generation increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giovannetti, Vittorio; Lloyd, Seth; Department of Mechanical Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139

    The Amosov-Holevo-Werner conjecture implies the additivity of the minimum Renyi entropies at the output of a channel. The conjecture is proven true for all Renyi entropies of integer order greater than two in a class of Gaussian bosonic channels where the input signal is randomly displaced or coupled linearly to an external environment.

  9. Minimum entropy deconvolution optimized sinusoidal synthesis and its application to vibration based fault detection

    NASA Astrophysics Data System (ADS)

    Li, Gang; Zhao, Qing

    2017-03-01

    In this paper, a minimum entropy deconvolution based sinusoidal synthesis (MEDSS) filter is proposed to improve the fault detection performance of the regular sinusoidal synthesis (SS) method. The SS filter is an efficient linear predictor that exploits frequency properties during model construction. The phase information of the harmonic components is not used in the regular SS filter; however, phase relationships are important in differentiating noise from characteristic impulsive fault signatures. Therefore, in this work, the minimum entropy deconvolution (MED) technique is used to optimize the SS filter during model construction. A time-weighted-error Kalman filter is used to estimate the MEDSS model parameters adaptively. Three simulation examples and a practical application case study are provided to illustrate the effectiveness of the proposed method. The regular SS method and the autoregressive MED (ARMED) method are also implemented for comparison. The MEDSS model demonstrates superior performance compared to the regular SS method, and it shows comparable or better performance than the ARMED method at much lower computational cost.
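
For readers unfamiliar with the MED step itself, a compact sketch of the classic Wiggins MED iteration (not the authors' MEDSS filter or its Kalman-filter implementation) shows how a filter is driven toward maximally impulsive, high-kurtosis output:

```python
import numpy as np

def med_filter(x, L=20, iters=30):
    """Wiggins-style minimum entropy deconvolution: iterate toward the
    FIR filter f maximizing the kurtosis (varimax) of y = f * x."""
    n = len(x)
    r = np.correlate(x, x, "full")[n - 1:n - 1 + L]   # autocorrelation, lags 0..L-1
    R = np.array([[r[abs(i - j)] for j in range(L)] for i in range(L)])
    f = np.zeros(L)
    f[L // 2] = 1.0                                   # delayed-spike start
    for _ in range(iters):
        y = np.convolve(x, f)[:n]
        b = np.array([np.dot(y ** 3,
                             np.concatenate([np.zeros(i), x[:n - i]]))
                      for i in range(L)])              # cross-corr of y^3 with x
        f = np.linalg.solve(R, b)                      # fixed-point update
        f /= np.linalg.norm(f)
    return f

# Hypothetical vibration record: an impulsive fault train blurred by a
# smoothing response and buried in light noise.
rng = np.random.default_rng(1)
impulses = np.zeros(1000)
impulses[::100] = 1.0
x = np.convolve(impulses, [0.3, 0.5, 0.3, 0.2, 0.1], "same")
x += 0.01 * rng.standard_normal(1000)
f = med_filter(x)
y = np.convolve(x, f)[:1000]       # MED output: sharper, more impulsive
```

The output's kurtosis exceeds the input's, which is the property that makes MED-style preprocessing useful for exposing impulsive fault signatures.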

  10. Entropy generation in magnetohydrodynamic radiative flow due to rotating disk in presence of viscous dissipation and Joule heating

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Qayyum, Sumaira; Khan, Muhammad Ijaz; Alsaedi, Ahmed

    2018-01-01

    The simultaneous effects of viscous dissipation and Joule heating on flow due to a rotating disk of variable thickness are examined. Radiative flow saturating a porous space is considered, with particular attention to the entropy generation outcome. The resulting nonlinear ordinary differential systems are solved for convergent series solutions. Specifically, the results for velocity, temperature, entropy generation, Bejan number, skin friction coefficient, and local Nusselt number are discussed. The entropy generation rate depends on the velocity and temperature distributions. Moreover, the entropy generation rate is a decreasing function of the Hartmann, Eckert, and Reynolds numbers, while the Bejan number shows the opposite behavior.

  11. Aeroacoustic and aerodynamic applications of the theory of nonequilibrium thermodynamics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Smith, Charles A.; Karamcheti, Krishnamurty

    1991-01-01

    Recent developments in the field of nonequilibrium thermodynamics associated with viscous flows are examined and related to developments in the understanding of specific phenomena in aerodynamics and aeroacoustics. A key element of the nonequilibrium theory is the principle of minimum entropy production rate for steady dissipative processes near equilibrium, and variational calculus is used to apply this principle to several examples of viscous flow. A review of nonequilibrium thermodynamics and its role in fluid motion is presented. Several formulations are presented of the local entropy production rate and the local energy dissipation rate, two quantities of central importance to the theory. These expressions and the principle of minimum entropy production rate for steady viscous flows are used to identify parallel-wall channel flow and irrotational flow as having minimally dissipative velocity distributions. Features of irrotational, steady, viscous flow near an airfoil, such as the effect of trailing-edge radius on circulation, are also found to be compatible with the minimum principle. Finally, the minimum principle is used to interpret the stability of infinitesimal and finite-amplitude disturbances in an initially laminar, parallel shear flow, with results consistent with experiment and linearized hydrodynamic stability theory. These results suggest that a thermodynamic approach may be useful in unifying the understanding of many diverse phenomena in aerodynamics and aeroacoustics.
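
The channel-flow result cited above is easy to verify numerically: among no-slip velocity profiles carrying the same flow rate, the parabolic Poiseuille profile minimizes the integral of (du/dy)², which is proportional to the viscous dissipation (and, at uniform temperature, to the entropy production). A minimal check against a few competitor profiles:

```python
import numpy as np

y = np.linspace(0.0, 1.0, 2001)               # wall-to-wall coordinate

def integral(f):
    """Trapezoid rule on the y grid."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(y)))

def dissipation(u):
    """Integral of (du/dy)^2, proportional to viscous dissipation."""
    return integral(np.gradient(u, y) ** 2)

def unit_flow(u):
    """Rescale a no-slip profile to unit volumetric flow rate."""
    return u / integral(u)

poiseuille = unit_flow(y * (1.0 - y))         # the parabolic profile
trials = [unit_flow(np.sin(np.pi * y)),
          unit_flow((y * (1.0 - y)) ** 2),
          unit_flow(y * (1.0 - y) * (1 + 0.5 * np.sin(3 * np.pi * y)))]

print(dissipation(poiseuille))                # analytic minimum is 12
for u in trials:                              # every competitor dissipates more
    assert dissipation(u) > dissipation(poiseuille)
```

This is exactly the variational statement: minimizing the dissipation functional subject to fixed flow rate and no-slip walls forces u'' to be constant, i.e. the parabolic profile.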

  12. Absolute Equilibrium Entropy

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1997-01-01

    The entropy associated with absolute equilibrium ensemble theories of ideal, homogeneous, fluid and magneto-fluid turbulence is discussed and the three-dimensional fluid case is examined in detail. A sigma-function is defined, whose minimum value with respect to global parameters is the entropy. A comparison is made between the use of global functions sigma and phase functions H (associated with the development of various H-theorems of ideal turbulence). It is shown that the two approaches are complementary though conceptually different: H-theorems show that an isolated system tends to equilibrium while sigma-functions allow the demonstration that entropy never decreases when two previously isolated systems are combined. This provides a more complete picture of entropy in the statistical mechanics of ideal fluids.

  13. Efficient optimization of the quantum relative entropy

    NASA Astrophysics Data System (ADS)

    Fawzi, Hamza; Fawzi, Omar

    2018-04-01

    Many quantum information measures can be written as an optimization of the quantum relative entropy between sets of states. For example, the relative entropy of entanglement of a state is the minimum relative entropy to the set of separable states. The various capacities of quantum channels can also be written in this way. We propose a unified framework to numerically compute these quantities using off-the-shelf semidefinite programming solvers, exploiting the approximation method proposed in Fawzi, Saunderson and Parrilo (2017 arXiv: 1705.00812). As a notable application, this method allows us to provide numerical counterexamples for a proposed lower bound on the quantum conditional mutual information in terms of the relative entropy of recovery.
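    For small problems the quantity being optimized can be evaluated directly, without the semidefinite-programming approximation of Fawzi, Saunderson and Parrilo. The sketch below, a minimal illustration with hypothetical single-qubit states, computes the quantum relative entropy S(ρ‖σ) = Tr[ρ(log ρ − log σ)] via the matrix logarithm:

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """Quantum relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)],
    in nats. Assumes supp(rho) is contained in supp(sigma)."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Illustrative single-qubit density matrices (not from the paper)
rho = np.array([[0.75, 0.0], [0.0, 0.25]])
sigma = np.eye(2) / 2  # maximally mixed state

print(relative_entropy(rho, rho))    # 0: S(rho||rho) vanishes
print(relative_entropy(rho, sigma))  # positive, by Klein's inequality
```

    An optimization such as the relative entropy of entanglement would minimize this quantity over a constrained set of states σ, which is where the SDP machinery of the paper becomes necessary.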

  14. Entropy Generation in Regenerative Systems

    NASA Technical Reports Server (NTRS)

    Kittel, Peter

    1995-01-01

    Heat exchange to the oscillating flows in regenerative coolers generates entropy. These flows are characterized by oscillating mass flows and oscillating temperatures. Heat is transferred between the flow and heat exchangers and regenerators. In the former case, there is a steady temperature difference between the flow and the heat exchangers. In the latter case, there is no mean temperature difference. In this paper a mathematical model of the entropy generated is developed for both cases. Estimates of the entropy generated by this process are given for oscillating flows in heat exchangers and in regenerators. The practical significance of this entropy is also discussed.
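    The heat-exchanger case with a steady temperature difference has a simple closed form: heat Q crossing from temperature T_hot to T_cold generates entropy S_gen = Q(1/T_cold − 1/T_hot) ≥ 0. A minimal worked example with illustrative numbers (not taken from the paper):

```python
def entropy_generated(Q, T_hot, T_cold):
    """Entropy generated (J/K) by heat Q (J) crossing a steady temperature
    difference from T_hot to T_cold (K). Non-negative by the second law."""
    return Q * (1.0 / T_cold - 1.0 / T_hot)

# Hypothetical: 100 J transferred from a 320 K flow to a 300 K exchanger wall
S_gen = entropy_generated(100.0, 320.0, 300.0)
print(f"{S_gen:.6f} J/K")  # ≈ 0.020833 J/K
```

    The regenerator case (zero mean temperature difference) has no such closed form; the entropy there comes from the oscillating instantaneous differences, which is what the paper's model captures.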

  15. Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows

    NASA Astrophysics Data System (ADS)

    Safari, Mehdi

    Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, termed the entropy FDF (En-FDF), is developed and utilized in the form of the joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in a closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreement with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.

  16. Large Eddy Simulation of Entropy Generation in a Turbulent Mixing Layer

    NASA Astrophysics Data System (ADS)

    Sheikhi, Reza H.; Safari, Mehdi; Hadi, Fatemeh

    2013-11-01

    The entropy transport equation is considered in large eddy simulation (LES) of turbulent flows. The irreversible entropy generation in this equation provides a more general description of subgrid scale (SGS) dissipation due to heat conduction, mass diffusion and viscosity effects. A new methodology is developed, termed the entropy filtered density function (En-FDF), to account for all individual entropy generation effects in turbulent flows. The En-FDF represents the joint probability density function of entropy, frequency, velocity and scalar fields within the SGS. An exact transport equation is developed for the En-FDF, which is modeled by a system of stochastic differential equations, incorporating the second law of thermodynamics. The modeled En-FDF transport equation is solved by a Lagrangian Monte Carlo method. The methodology is employed to simulate a turbulent mixing layer involving transport of passive scalars and entropy. Various modes of entropy generation are obtained from the En-FDF and analyzed. Predictions are assessed against data generated by direct numerical simulation (DNS). The En-FDF predictions are in good agreement with the DNS data.

  17. [Study on once sampling quantitation based on information entropy of ISSR amplified bands of Houttuynia cordata].

    PubMed

    Wang, Haiqin; Liu, Wenlong; He, Fuyuan; Chen, Zuohong; Zhang, Xili; Xie, Xianggui; Zeng, Jiaoli; Duan, Xiaopeng

    2012-02-01

    To explore once-only sampling quantitation of Houttuynia cordata through the information entropy carried by its DNA polymorphic bands, as another expression of the polymorphism, namely genetic polymorphism, of traditional Chinese medicine. The inter simple sequence repeat (ISSR) technique was applied to analyze the genetic polymorphism of H. cordata samples from the same GAP producing area; the DNA genetic bands were transformed into information entropy, and the minimum once-sampling quantitation was determined with a mathematical model. One hundred and thirty-four DNA bands were obtained by using 9 screened ISSR primers to amplify DNA samples from 46 strains of H. cordata from the same GAP area; the information entropy was H = 0.365 6-0.978 6, and the RSD was 14.75%. The once-sampling quantitation was W = 11.22 kg (863 strains). The "minimum once-sampling quantitation" was thus calculated from the perspective of the genetic polymorphism of H. cordata, and a great difference was found between this amount and that obtained from the fingerprint perspective.
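    The information entropy of an individual ISSR band follows from its presence/absence frequency across strains. A minimal sketch with made-up band data (the actual 46-strain dataset is not reproduced here):

```python
import math

def band_entropy(presence):
    """Shannon entropy (bits) of a binary presence/absence pattern
    of one DNA band across the sampled strains."""
    n = len(presence)
    p = sum(presence) / n  # frequency of band presence
    h = 0.0
    for q in (p, 1.0 - p):
        if q > 0.0:
            h -= q * math.log2(q)
    return h

# Hypothetical band observed in 30 of 46 strains
pattern = [1] * 30 + [0] * 16
print(f"H = {band_entropy(pattern):.4f} bits")
```

    Entropies near 1 bit indicate highly polymorphic bands; values near 0 indicate bands that are nearly fixed in the population, which is consistent with the reported range H = 0.365 6-0.978 6.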

  18. Rényi-Fisher entropy product as a marker of topological phase transitions

    NASA Astrophysics Data System (ADS)

    Bolívar, J. C.; Nagy, Ágnes; Romera, Elvira

    2018-05-01

    The combined Rényi-Fisher entropy product of electrons plus holes displays a minimum at the charge neutrality points. The Stam-Rényi difference and the Stam-Rényi uncertainty product of the electrons plus holes show maxima at the charge neutrality points. Topological quantum numbers capable of detecting the topological insulator and the band insulator phases are defined. Upper and lower bounds for the position and momentum space Rényi-Fisher entropy products are derived.

  19. Heat Transfer and Entropy Generation Analysis of an Intermediate Heat Exchanger in ADS

    NASA Astrophysics Data System (ADS)

    Wang, Yongwei; Huai, Xiulan

    2018-04-01

    The intermediate heat exchanger (IHX), which enhances heat transfer, is an important piece of equipment in the utilization of nuclear energy. In the present work, heat transfer and entropy generation of an IHX in the accelerator driven subcritical system (ADS) are investigated experimentally. The variation of the entropy generation number with the performance parameters of the IHX is analyzed, and the effects of the IHX inlet conditions on the entropy generation number and heat transfer are discussed. Comparing the results at two working conditions of constant mass flow rate of liquid lead-bismuth eutectic (LBE) and of helium gas, the total pumping power tends to decrease with decreasing entropy generation number in both cases, but the variations of the effectiveness, number of transfer units and thermal capacity rate ratio are inconsistent and must be analyzed individually. With increasing inlet mass flow rate or LBE inlet temperature, the entropy generation number increases and the heat transfer is enhanced, while the opposite trend occurs with increasing helium gas inlet temperature. Further study is necessary to obtain the optimized operation parameters of the IHX that minimize entropy generation and enhance heat transfer.
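    For a two-stream exchanger, the entropy generation rate that underlies the entropy generation number can be computed from the inlet and outlet temperatures alone when pressure-drop contributions are neglected. A sketch with illustrative capacity rates and temperatures (not the LBE/helium data of the paper):

```python
import math

def hx_entropy_generation(C_hot, Th_in, Th_out, C_cold, Tc_in, Tc_out):
    """Entropy generation rate (W/K) of a two-stream heat exchanger.
    Each stream is treated as an ideal fluid with capacity rate
    C = m_dot * c_p (W/K); pressure-drop terms are neglected."""
    return C_hot * math.log(Th_out / Th_in) + C_cold * math.log(Tc_out / Tc_in)

# Hypothetical balanced counterflow exchanger: C = 2 kW/K on both sides,
# hot stream 600 -> 500 K, cold stream 400 -> 500 K (energy balanced).
S_gen = hx_entropy_generation(2000.0, 600.0, 500.0, 2000.0, 400.0, 500.0)
print(f"{S_gen:.2f} W/K")
```

    Dividing S_gen by a reference capacity rate gives a dimensionless entropy generation number of the kind analyzed in the paper.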

  20. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

    Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pisin; Hsin, Po-Shen; Niu, Yuezhen, E-mail: pisinchen@phys.ntu.edu.tw, E-mail: r01222031@ntu.edu.tw, E-mail: yuezhenniu@gmail.com

    We investigate the entropy evolution in the early universe by computing the change of the entanglement entropy in Friedmann-Robertson-Walker quantum cosmology in the presence of a particle horizon. The matter is modeled by a Chaplygin gas so as to provide a smooth interpolation between the inflationary and radiation epochs, rendering the evolution of entropy from early time to late time trackable. We found that soon after the onset of inflation, the total entanglement entropy rapidly decreases to a minimum. It then rises monotonically in the remainder of the inflation epoch as well as the radiation epoch. Our result is in qualitative agreement with the area law of Ryu and Takayanagi including the logarithmic correction. We comment on the possible implication of our finding to the cosmological entropy problem.

  2. Entropy of adsorption of mixed surfactants from solutions onto the air/water interface

    USGS Publications Warehouse

    Chen, L.-W.; Chen, J.-H.; Zhou, N.-F.

    1995-01-01

    The partial molar entropy change for mixed surfactant molecules adsorbed from solution at the air/water interface has been investigated by surface thermodynamics based upon the experimental surface tension isotherms at various temperatures. Results for different surfactant mixtures of sodium dodecyl sulfate and sodium tetradecyl sulfate, and of decylpyridinium chloride and sodium alkylsulfonates, have shown that the partial molar entropy changes for adsorption of the mixed surfactants were generally negative and decreased with increasing adsorption to a minimum near the maximum adsorption, then increased abruptly. The entropy decrease can be explained by the adsorption-orientation of surfactant molecules in the adsorbed monolayer, and the abrupt entropy increase at the maximum adsorption is possibly due to the strong repulsion between the adsorbed molecules.
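    The temperature dependence of surface tension is the raw input here: the surface excess entropy, a closely related surface-thermodynamic quantity, is s = −(∂γ/∂T). A finite-difference sketch with made-up isotherm data (the actual measurements are not reproduced in this abstract):

```python
def surface_excess_entropy(gamma_by_T):
    """Surface excess entropy s = -(d gamma / d T), estimated by central
    differences. gamma_by_T: list of (T in K, surface tension in mN/m)
    measured at fixed adsorption. Returns (T, s) pairs in mN/(m*K)."""
    out = []
    for i in range(1, len(gamma_by_T) - 1):
        (T0, g0), (T2, g2) = gamma_by_T[i - 1], gamma_by_T[i + 1]
        out.append((gamma_by_T[i][0], -(g2 - g0) / (T2 - T0)))
    return out

# Hypothetical surface tension data (mN/m) for a surfactant solution
data = [(288.0, 40.2), (293.0, 39.5), (298.0, 38.7), (303.0, 37.8)]
print(surface_excess_entropy(data))
```

    Converting such surface quantities into the partial molar entropy of the adsorbed surfactant additionally requires the adsorption isotherm via the Gibbs equation, which is the surface-thermodynamic analysis the paper carries out.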

  3. Recommendations and illustrations for the evaluation of photonic random number generators

    NASA Astrophysics Data System (ADS)

    Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi

    2017-09-01

    The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.

  4. Harvesting Entropy for Random Number Generation for Internet of Things Constrained Devices Using On-Board Sensors

    PubMed Central

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-01-01

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things. PMID:26506357
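    The least-significant-bits concatenation step described above can be sketched as: read raw integer sensor samples, keep only the k lowest bits of each (where quantization and thermal noise dominate), and pack them into bytes. The sensor readings below are hypothetical, and real generators would follow this with the statistical fine-tuning the paper describes:

```python
def harvest_lsb(readings, k=2):
    """Concatenate the k least-significant bits of each raw integer sensor
    reading into a byte string. This is only the harvesting step; whitening
    / statistical fine-tuning would follow in a complete generator."""
    bits = []
    for r in readings:
        for i in range(k - 1, -1, -1):  # MSB-first within the k kept bits
            bits.append((r >> i) & 1)
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):  # pack complete bytes only
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

# Hypothetical 10-bit ADC readings from a temperature sensor
raw = [513, 518, 512, 515, 519, 514, 517, 512]
print(harvest_lsb(raw, k=2).hex())
```

    Feeding the harvested bytes into a Shannon entropy estimate (as in the paper's evaluation) then quantifies how close the source comes to the ideal 8 bits per byte.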

  5. Harvesting entropy for random number generation for internet of things constrained devices using on-board sensors.

    PubMed

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-10-22

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things.

  6. Bimodal behavior of post-measured entropy and one-way quantum deficit for two-qubit X states

    NASA Astrophysics Data System (ADS)

    Yurischev, Mikhail A.

    2018-01-01

    A method for calculating the one-way quantum deficit is developed. It involves a careful study of post-measured entropy shapes. We discovered that in some regions of X-state space the post-measured entropy \\tilde{S} as a function of measurement angle θ \\in [0,π /2] exhibits a bimodal behavior inside the open interval (0,π /2), i.e., it has two interior extrema: one minimum and one maximum. Furthermore, cases are found when the interior minimum of such a bimodal function \\tilde{S}(θ ) is less than that one at the endpoint θ =0 or π /2. This leads to the formation of a boundary between the phases of one-way quantum deficit via finite jumps of optimal measured angle from the endpoint to the interior minimum. Phase diagram is built up for a two-parameter family of X states. The subregions with variable optimal measured angle are around 1% of the total region, with their relative linear sizes achieving 17.5%, and the fidelity between the states of those subregions can be reduced to F=0.968. In addition, a correction to the one-way deficit due to the interior minimum can achieve 2.3%. Such conditions are favorable to detect the subregions with variable optimal measured angle of one-way quantum deficit in an experiment.

  7. 16QAM Blind Equalization via Maximum Entropy Density Approximation Technique and Nonlinear Lagrange Multipliers

    PubMed Central

    Mauda, R.; Pinchas, M.

    2014-01-01

    Recently a new blind equalization method was proposed for the 16QAM constellation input, inspired by the maximum entropy density approximation technique, with improved equalization performance compared to the maximum entropy approach, Godard's algorithm, and others. In addition, an approximated expression for the minimum mean square error (MSE) was obtained. The idea was to find those Lagrange multipliers that bring the approximated MSE to a minimum. Since differentiating the obtained MSE with respect to the Lagrange multipliers leads to a nonlinear equation for the Lagrange multipliers, the part of the MSE expression that caused the nonlinearity in the equation for the Lagrange multipliers was ignored. Thus, the obtained Lagrange multipliers were not those that bring the approximated MSE to a minimum. In this paper, we derive a new set of Lagrange multipliers based on the nonlinear expression obtained from minimizing the approximated MSE with respect to the Lagrange multipliers. Simulation results indicate that for the high signal to noise ratio (SNR) case, a faster convergence rate is obtained for a channel causing high initial intersymbol interference (ISI), while the same equalization performance is obtained for an easy channel (low initial ISI). PMID:24723813

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guha, Saikat; Shapiro, Jeffrey H.; Erkmen, Baris I.

    Previous work on the classical information capacities of bosonic channels has established the capacity of the single-user pure-loss channel, bounded the capacity of the single-user thermal-noise channel, and bounded the capacity region of the multiple-access channel. The latter is a multiple-user scenario in which several transmitters seek to simultaneously and independently communicate to a single receiver. We study the capacity region of the bosonic broadcast channel, in which a single transmitter seeks to simultaneously and independently communicate to two different receivers. It is known that the tightest available lower bound on the capacity of the single-user thermal-noise channel is that channel's capacity if, as conjectured, the minimum von Neumann entropy at the output of a bosonic channel with additive thermal noise occurs for coherent-state inputs. Evidence in support of this minimum output entropy conjecture has been accumulated, but a rigorous proof has not been obtained. We propose a minimum output entropy conjecture that, if proved to be correct, will establish that the capacity region of the bosonic broadcast channel equals the inner bound achieved using a coherent-state encoding and optimum detection. We provide some evidence that supports this conjecture, but again a full proof is not available.

  9. Nonlinear radiative heat flux and heat source/sink on entropy generation minimization rate

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Waleed Ahmed; Khan, M. Ijaz; Alsaedi, A.

    2018-06-01

    Entropy generation minimization in nonlinear radiative mixed convective flow past a surface of variable thickness is addressed. Entropy generation associated with the momentum and temperature fields is analyzed. The flow is driven by the stretching velocity of the sheet. Transformations are used to reduce the system of partial differential equations to ordinary ones. The total entropy generation rate is determined. Series solutions for the zeroth- and mth-order deformation systems are computed. The domain of convergence of the obtained solutions is identified. Velocity, temperature and concentration fields are plotted and interpreted. The entropy equation is studied through nonlinear mixed convection and radiative heat flux. Velocity and temperature gradients are discussed through graphs. Meaningful results are summarized in the final remarks.

  10. Connectivity in the human brain dissociates entropy and complexity of auditory inputs☆

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493

  11. Effect of Entropy Generation on Wear Mechanics and System Reliability

    NASA Astrophysics Data System (ADS)

    Gidwani, Akshay; James, Siddanth; Jagtap, Sagar; Karthikeyan, Ram; Vincent, S.

    2018-04-01

    Wear is an irreversible phenomenon. Processes such as mutual sliding and rolling between materials involve entropy generation, and these processes are monotonic with respect to time. The concept of entropy generation is further quantified using the Degradation Entropy Generation (DEG) theorem formulated by Michael D. Bryant. The sliding-wear model can be extrapolated to different settings to support machine prognostics as well as system and process reliability analysis, even beyond purely mechanical processes. In other words, using the concepts of entropy generation and wear, one can quantify the reliability of a system with respect to time using a thermodynamic variable, which is the basis of this paper. Thus, in the present investigation, a unique attempt has been made to establish a correlation between entropy, wear and reliability, which can be a useful technique in preventive maintenance.
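    The DEG theorem relates a degradation measure (here, wear volume) linearly to the entropy generated by the driving dissipative process, w = B·S; for dry sliding, the entropy generation rate is the frictional power divided by the contact temperature. A sketch with illustrative coefficients (B and all operating values are hypothetical, not from the paper):

```python
def sliding_entropy_rate(mu, normal_force, speed, temperature):
    """Entropy generation rate (W/K) of dry sliding friction:
    frictional power mu * F * v dissipated at contact temperature T."""
    return mu * normal_force * speed / temperature

def wear_volume(B, entropy_generated):
    """DEG relation: wear volume w = B * S, where B is an empirical
    degradation coefficient (illustrative value used below)."""
    return B * entropy_generated

# Illustrative: mu = 0.3, 100 N load, 0.5 m/s, 300 K contact, 1 h of sliding
S_dot = sliding_entropy_rate(0.3, 100.0, 0.5, 300.0)  # 0.05 W/K
S = S_dot * 3600.0                                    # 180 J/K after 1 hour
print(wear_volume(2.0e-9, S))                         # m^3, hypothetical B
```

    Because accumulated entropy is monotonic in time, tracking S gives a thermodynamic clock against which remaining life and reliability can be assessed, which is the correlation the paper pursues.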

  12. Nonequilibrium Thermodynamics in Biological Systems

    NASA Astrophysics Data System (ADS)

    Aoki, I.

    2005-12-01

    1. Respiration. Oxygen uptake by respiration in organisms decomposes macromolecules such as carbohydrates, proteins and lipids and liberates high-quality chemical energy, which is then used to drive chemical reactions and the motion of matter in organisms, supporting the lively order in their structure and function. Finally, this chemical energy becomes low-quality heat energy and is discarded to the outside (dissipation function). Along with this heat energy, the entropy production that inevitably accompanies irreversibility is also discarded to the outside. Dissipation function and entropy production are estimated from respiration data.

    2. Human body. From observed respiration (oxygen absorption) data, the entropy production in the human body can be estimated. Entropy production has been obtained for humans from 0 to 75 years old, and extrapolated to the fertilized egg (the beginning of human life) and to 120 years old (the maximum human life span). Entropy production shows characteristic behavior over the human life span: an early rapid increase in the short growing phase and a later slow decrease in the long aging phase. It is proposed that this tendency is ubiquitous and constitutes a Principle of Organization in complex biotic systems.

    3. Ecological communities. From the respiration data of eighteen aquatic communities, specific (i.e., per biomass) entropy productions are obtained. They show a two-phase character with respect to trophic diversity: an early increase and a later decrease as trophic diversity increases. The trophic diversity in these aquatic ecosystems is shown to be positively correlated with the degree of eutrophication, and the degree of eutrophication is an "arrow of time" in the hierarchy of aquatic ecosystems. Hence specific entropy production has two phases: an early increase and a later decrease with time.

    4. Entropy principle for living systems. The Second Law of Thermodynamics has been expressed as follows. 1) In isolated systems, entropy increases with time and approaches a maximum value; this is the well-known classical Clausius principle. 2) In open systems near equilibrium, entropy production always decreases with time, approaching a minimum stationary level; this is the minimum entropy production principle of Prigogine. These two principles are well established. However, living systems are neither isolated nor near equilibrium, so neither principle applies to them. What, then, is the entropy principle for living systems? Answer: entropy production in living systems consists of multiple stages in time: an early increasing stage, a later decreasing stage, and/or intermediate stages. This tendency is supported by various living systems.

  13. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  14. Numerical investigation for entropy generation in hydromagnetic flow of fluid with variable properties and slip

    NASA Astrophysics Data System (ADS)

    Khan, M. Ijaz; Hayat, Tasawar; Alsaedi, Ahmed

    2018-02-01

    This work models and computes viscous fluid flow with variable properties due to a rotating stretchable disk. The rotating flow is generated by a nonlinearly stretching rotating surface. Nonlinear thermal radiation and heat generation/absorption are studied. The fluid is electrically conducting under a constant applied magnetic field; electric polarization is neglected and the induced magnetic field is not taken into account. Attention is focused on the entropy generation rate and the Bejan number, both of which clearly depend on the velocity and thermal fields. The von Kármán approach is utilized to convert the partial differential expressions into ordinary ones. These expressions are non-dimensionalized, and numerical results are obtained for the flow variables. The effects of the magnetic parameter, Prandtl number, radiative parameter, heat generation/absorption parameter, and slip parameter on the velocity and temperature fields as well as on the entropy generation rate and Bejan number are discussed. Drag forces (radial and tangential) and heat transfer rates are calculated and discussed. Furthermore, the entropy generation rate is a decreasing function of the magnetic variable and the Reynolds number, while the magnetic variable affects the Bejan number in the opposite way to the entropy generation rate. Opposite behaviors of the heat transfer rate are also observed for varying values of the radiative and slip parameters.

  15. Heat engine by exorcism of Maxwell Demon using spin angular momentum reservoir

    NASA Astrophysics Data System (ADS)

    Bedkihal, Salil; Wright, Jackson; Vaccaro, Joan; Gould, Tim

    Landauer's erasure principle is a hallmark in thermodynamics and information theory. According to this principle, erasing one bit of information incurs a minimum energy cost. Recently, Vaccaro and Barnett (VB) have explored the role of multiple conserved quantities in memory erasure. They further illustrated that for energy-degenerate spin reservoirs, the cost of erasure can be paid solely in spin angular momentum, with no energy cost. Motivated by VB erasure, in this work we propose a novel optical heat engine that operates with a single thermal reservoir and a spin angular momentum reservoir. The engine exploits ultrafast phonon absorption processes to convert thermal phonon energy into coherent light. The entropy generated in this process then corresponds to a mixture of spin-up and spin-down populations of energy-degenerate electronic ground states, which acts as the demon's memory. This information is then erased using a polarised spin reservoir that acts as an entropy sink. The proposed heat engine goes beyond the traditional Carnot engine.

  16. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models its environmental generators. Copyright © 2014. Published by Elsevier Inc.

  17. Minimum entropy principle-based solar cell operation without a pn-junction and a thin CdS layer to extract the holes from the emitter

    NASA Astrophysics Data System (ADS)

    Böer, Karl W.

    2016-10-01

    The solar cell does not use a pn-junction to separate electrons from holes; instead it uses an undoped CdS layer that becomes p-type inverted when attached to a p-type collector, collecting the holes while rejecting the backflow of electrons and thereby preventing junction leakage. The operation of the solar cell is governed by the minimum entropy principle of the cell and its external circuit, which sets the electrochemical potential, i.e., the Fermi level of the base electrode, to the operating (maximum power point) voltage. The Fermi level of the metal electrode on the CdS is left unchanged, since CdS does not participate in the photo-emf. All photoelectric action is generated by holes excited by the light, which shifts the quasi-Fermi levels in the generator and supports the diffusion current under operating conditions; this current is responsible for the measured solar maximum power current. The open circuit voltage (Voc) can approach its theoretical limit, the band gap of the collector at 0 K, and the cell reaches an efficiency at AM1 of 21% for the thin-film CdS/CdTe example given here. However, the series resistance of the CdS limits its thickness, preferably to below 200 Å, to avoid unnecessary reduction in efficiency or Voc. The operation of the CdS solar cell does not involve heated carriers. It is initiated by the field at the CdS/CdTe interface, which exceeds 20 kV/cm and is sufficient to cause extraction of holes by the p-type inverted CdS. Here a strong doubly charged intrinsic donor can cause a negative differential conductivity that switches on a high-field domain, stabilized by the minimum entropy principle, permitting efficient transport of the holes from the CdTe to the base electrode.
Experimental results on the band model of CdS/CdTe solar cells show that the conduction bands are connected in the dark, where the electron current must be continuous, and the valence bands are connected under illumination, where the hole currents dominate and must be continuous through the junction. The major shifts of the bands under operating conditions are self-adjusting via a change in the junction dipole moment.

  18. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
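As a minimal illustration of the min-entropy bound used in this entry (not the authors' quantum tomographic method), the number of truly random bits distillable per raw sample is bounded by H_min = -log2(max_i p_i), the entropy of the most likely outcome:

```python
import math

def min_entropy(probs):
    """Min-entropy H_min = -log2(max_i p_i) of a discrete distribution."""
    return -math.log2(max(probs))

# A biased bit source with P(0) = 0.6, P(1) = 0.4: H_min < 1 bit, so fewer
# than one truly random bit can be distilled per raw bit.
h = min_entropy([0.6, 0.4])
```

A randomness extractor keyed to this lower bound then compresses the raw sequence down to roughly `h` output bits per input sample.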

  19. Reply to Comment on ‘The cancer Warburg effect may be a testable example of the minimum entropy production rate principle’

    NASA Astrophysics Data System (ADS)

    Sabater, Bartolomé; Marín, Dolores

    2018-03-01

    The minimum entropy production rate principle is applied to the chemical reactions in a steady-state open cell system where, under a constant supply of the glucose precursor, reference to time or to glucose consumption does not affect the conclusions.

  20. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.

  1. Entropy generation in a mixed convection Poiseuille flow of molybdenum disulphide Jeffrey nanofluid

    NASA Astrophysics Data System (ADS)

    Gul, Aaiza; Khan, Ilyas; Makhanov, Stanislav S.

    2018-06-01

    Entropy analysis in a mixed convection Poiseuille flow of a Molybdenum Disulphide Jeffrey Nanofluid (MDJN) is presented. Mixed convection is caused by buoyancy force and an external pressure gradient. The problem is formulated as a boundary value problem for a system of partial differential equations. An analytical solution for the velocity and the temperature is obtained using the perturbation technique. Entropy generation is derived as a function of the velocity and temperature gradients. The solutions are displayed graphically and the relative importance of the input parameters is discussed. A Jeffrey nanofluid (JN) is compared with a second grade nanofluid (SGN) and a Newtonian nanofluid (NN). It is found that the entropy generation decreases when the temperature increases, whereas increasing the Brinkman number increases entropy generation.

  2. Dirac dispersion generates unusually large Nernst effect in Weyl semimetals

    NASA Astrophysics Data System (ADS)

    Watzman, Sarah J.; McCormick, Timothy M.; Shekhar, Chandra; Wu, Shu-Chun; Sun, Yan; Prakash, Arati; Felser, Claudia; Trivedi, Nandini; Heremans, Joseph P.

    2018-04-01

    Weyl semimetals contain linearly dispersing electronic states, offering interesting features in transport yet to be thoroughly explored thermally. Here we show how the Nernst effect, combining entropy with charge transport, gives a unique signature for the presence of Dirac bands and offers a diagnostic to determine if trivial pockets play a role in this transport. The Nernst thermopower of NbP exceeds its conventional thermopower 100-fold, and the temperature dependence of the Nernst effect has a pronounced maximum. The charge-neutrality condition dictates that the Fermi level shifts with increasing temperature toward the energy that has the minimum density of states (DOS). In NbP, the agreement of the Nernst and Seebeck data with a model that assumes this minimum DOS resides at the Dirac points is taken as strong experimental evidence that the trivial (non-Dirac) bands play no role in high-temperature transport.

  3. Two faces of entropy and information in biological systems.

    PubMed

    Mitrokhin, Yuriy

    2014-10-21

    The article attempts to resolve the well-known paradox of the contradiction between emerging biological organization and entropy production in biological systems. It is assumed that a qualitative, speculative correlation between entropy and antientropy processes, taking place both in the past and today in metabolic and genetic cellular systems, may be adequate for describing the evolution of biological organization. Since thermodynamic entropy itself cannot account for the high degree of organization that exists in the cell, we discuss how positive entropy events (mutations) in the genetic systems of past generations are conjoined with the formation of organized structures in current cells. We argue that only the information generated under conditions of information entropy production (mutations and other genome reorganizations) in the genetic systems of past generations provides the physical conjunction of entropy and antientropy processes separated from each other across generations. This follows readily from the requirements of the Second Law of thermodynamics. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Query construction, entropy, and generalization in neural-network models

    NASA Astrophysics Data System (ADS)

    Sollich, Peter

    1994-05-01

    We study query construction algorithms, which aim at improving the generalization ability of systems that learn from examples by choosing optimal, nonredundant training sets. We set up a general probabilistic framework for deriving such algorithms from the requirement of optimizing a suitable objective function; specifically, we consider the objective functions entropy (or information gain) and generalization error. For two learning scenarios, the high-low game and the linear perceptron, we evaluate the generalization performance obtained by applying the corresponding query construction algorithms and compare it to training on random examples. We find qualitative differences between the two scenarios due to the different structure of the underlying rules (nonlinear and ``noninvertible'' versus linear); in particular, for the linear perceptron, random examples lead to the same generalization ability as a sequence of queries in the limit of an infinite number of examples. We also investigate learning algorithms which are ill matched to the learning environment and find that, in this case, minimum entropy queries can in fact yield a lower generalization ability than random examples. Finally, we study the efficiency of single queries and its dependence on the learning history, i.e., on whether the previous training examples were generated randomly or by querying, and the difference between globally and locally optimal query construction.

  5. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on grayscale image encryption using dynamic harmony search (DHS). In this research, first, a chaotic map is used to create cipher images, and then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, the diffusion of a plain image using DHS to maximize the entropy as a fitness function is performed. In the second step, a horizontal and vertical permutation is applied to the best cipher image obtained in the previous step, with DHS used to minimize the correlation coefficient as the fitness function. The simulation results show that by using the proposed method, a maximum entropy of approximately 7.9998 and a minimum correlation coefficient of approximately 0.0001 are obtained.
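The two fitness functions described in this entry can be sketched as follows, assuming 8-bit grayscale images and the standard adjacent-pixel correlation measure; the DHS search itself is omitted:

```python
import numpy as np

def shannon_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-(p * np.log2(p)).sum())

def adjacent_correlation(img):
    """Correlation coefficient between horizontally adjacent pixel pairs."""
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    return float(np.corrcoef(x, y)[0, 1])

# A uniformly random image approximates an ideal cipher image:
# entropy approaches 8 bits/pixel and adjacent correlation approaches 0.
rng = np.random.default_rng(0)
cipher_like = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
```

A DHS (or any metaheuristic) would maximize the first quantity in the diffusion step and minimize the absolute value of the second in the permutation step.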

  6. Quantum Entanglement and the Topological Order of Fractional Hall States

    NASA Astrophysics Data System (ADS)

    Rezayi, Edward

    2015-03-01

    Fractional quantum Hall states or, more generally, topological phases of matter defy Landau classification based on order parameter and broken symmetry. Instead they have been characterized by their topological order. Quantum information concepts, such as quantum entanglement, appear to provide the most efficient method of detecting topological order solely from the knowledge of the ground state wave function. This talk will focus on real-space bi-partitioning of quantum Hall states and will present both exact diagonalization and quantum Monte Carlo studies of topological entanglement entropy in various geometries. Results on the torus for non-contractible cuts are quite rich and, through the use of minimum entropy states, yield the modular S-matrix and hence uniquely determine the topological order, as shown in recent literature. Concrete examples of minimum entropy states from known quantum Hall wave functions and their corresponding quantum numbers, used in exact diagonalizations, will be given. In collaboration with Clare Abreu and Raul Herrera. Supported by DOE Grant DE-SC0002140.

  7. Investigating Friction as a Main Source of Entropy Generation in the Expansion of Confined Gas in a Piston-and-Cylinder Device

    ERIC Educational Resources Information Center

    Kang, Dun-Yen; Liou, Kai-Hsin; Chang, Wei-Lun

    2015-01-01

    The expansion or compression of gas confined in a piston-and-cylinder device is a classic working example used for illustrating the First and Second Laws of Thermodynamics. The balance of energy and entropy enables the estimation of a number of thermodynamic properties. The entropy generation (also called entropy production) resulting from this…

  8. Minimum energy dissipation required for a logically irreversible operation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yoshikawa, Nobuyuki

    2018-01-01

    According to Landauer's principle, the minimum heat emission required for computing is linked to logical entropy, or logical reversibility. The validity of Landauer's principle has been investigated for several decades and was finally demonstrated in recent experiments by showing that the minimum heat emission is associated with the reduction in logical entropy during a logically irreversible operation. Although the relationship between minimum heat emission and logical reversibility is being revealed, it is not clear how much free energy is required to be dissipated for a logically irreversible operation. In the present study, in order to reveal the connection between logical reversibility and free energy dissipation, we numerically demonstrated logically irreversible protocols using adiabatic superconductor logic. The calculation results of work during the protocol showed that, while the minimum heat emission conforms to Landauer's principle, the free energy dissipation can be arbitrarily reduced by performing the protocol quasistatically. The above results show that logical reversibility is not associated with thermodynamic reversibility, and that heat is not only emitted from logic devices but also absorbed by logic devices. We also formulated the heat emission from adiabatic superconductor logic during a logically irreversible operation at a finite operation speed.
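The Landauer bound referenced in this entry can be evaluated directly; a minimal sketch follows (the 4.2 K figure is only an assumed liquid-helium operating point for superconductor logic, not a value from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound(temperature_k, bits=1.0):
    """Minimum heat emission (J) for erasing `bits` of logical entropy:
    Q_min = N * k_B * T * ln(2)."""
    return bits * K_B * temperature_k * math.log(2)

q_room = landauer_bound(300.0)  # ~2.87e-21 J per bit at room temperature
q_cryo = landauer_bound(4.2)    # far smaller at liquid-helium temperature
```

The bound scales linearly with temperature, which is one reason cryogenic logic families such as adiabatic superconductor logic are studied for ultra-low-dissipation computing.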

  9. Effective Techniques for Augmenting Heat Transfer: An Application of Entropy Generation Minimization Principles.

    DTIC Science & Technology

    1980-12-01

    Keywords: augmentation techniques, entropy generation, irreversibility, exergy. [Abstract garbled in extraction; recoverable fragments list report sections on internally finned tubes, internally roughened tubes, and other heat transfer augmentation techniques, evaluated using irreversibility and entropy generation as the fundamental criterion for minimizing the waste of usable energy (exergy) in energy systems.]

  10. MHD effects on heat transfer and entropy generation of nanofluid flow in an open cavity

    NASA Astrophysics Data System (ADS)

    Mehrez, Zouhaier; El Cafsi, Afif; Belghith, Ali; Le Quéré, Patrick

    2015-01-01

    The present numerical work investigates the effect of an external oriented magnetic field on heat transfer and entropy generation of Cu-water nanofluid flow in an open cavity heated from below. The governing equations are solved numerically by the finite-volume method. The study has been carried out for a wide range of solid volume fraction 0≤φ≤0.06, Hartmann number 0≤Ha≤100, Reynolds number 100≤Re≤500 and Richardson number 0.001≤Ri≤1 at three inclination angles of magnetic field γ: 0°, 45° and 90°. The numerical results are given by streamlines, isotherms, average Nusselt number, average entropy generation and Bejan number. The results show that flow behavior, temperature distribution, heat transfer and entropy generation are strongly affected by the presence of a magnetic field. The average Nusselt number and entropy generation, which increase by increasing volume fraction of nanoparticles, depend mainly on the Hartmann number and inclination angle of the magnetic field. The variation rates of heat transfer and entropy generation while adding nanoparticles or applying a magnetic field depend on the Richardson and Reynolds numbers.

  11. Entropy Generation in a Chemical Reaction

    ERIC Educational Resources Information Center

    Miranda, E. N.

    2010-01-01

    Entropy generation in a chemical reaction is analysed without using the general formalism of non-equilibrium thermodynamics at a level adequate for advanced undergraduates. In a first approach to the problem, the phenomenological kinetic equation of an elementary first-order reaction is used to show that entropy production is always positive. A…

  12. Prediction of pKa Values for Neutral and Basic Drugs based on Hybrid Artificial Intelligence Methods.

    PubMed

    Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin

    2018-03-05

    The pKa value of drugs is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm was proposed based on population entropy diversity. In the improved algorithm, when the population entropy was higher than the set maximum threshold, the convergence strategy was adopted; when the population entropy was lower than the set minimum threshold, the divergence strategy was adopted; when the population entropy was between the maximum and minimum thresholds, the self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied to the training of a radial basis function artificial neural network (RBF ANN) model and the selection of molecular descriptors. A quantitative structure-activity relationship model based on an RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 kinds of neutral and basic drugs and then validated by another database containing 20 molecules. The validation results showed that the model had a good prediction performance. The absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can be used as a reference for exploring other quantitative structure-activity relationships.
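The three-threshold strategy switch described in this entry can be sketched as follows; the histogram-based diversity measure and the threshold values are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def population_entropy(positions, bins=10):
    """Average per-dimension Shannon entropy of particle positions,
    used as a diversity measure for the swarm."""
    h = 0.0
    for d in range(positions.shape[1]):
        counts, _ = np.histogram(positions[:, d], bins=bins)
        p = counts[counts > 0] / counts.sum()
        h += float(-(p * np.log(p)).sum())
    return h / positions.shape[1]

def choose_strategy(h, h_min=0.5, h_max=1.8):
    """Map population entropy to a search strategy (thresholds illustrative)."""
    if h > h_max:
        return "converge"    # diversity high: contract toward the best particle
    if h < h_min:
        return "diverge"     # diversity low: re-scatter to escape local optima
    return "self-adapt"      # in between: keep the adaptive inertia weight
```

A widely scattered swarm has near-uniform histograms (high entropy), while a swarm collapsed onto one point concentrates all mass in one bin (entropy near zero).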

  13. Developing Soil Moisture Profiles Utilizing Remotely Sensed MW and TIR Based SM Estimates Through Principle of Maximum Entropy

    NASA Astrophysics Data System (ADS)

    Mishra, V.; Cruise, J. F.; Mecikalski, J. R.

    2015-12-01

    Developing accurate vertical soil moisture profiles with minimum input requirements is important to agricultural as well as land surface modeling. Earlier studies show that the principle of maximum entropy (POME) can be utilized to develop vertical soil moisture profiles with accuracy (MAE of about 1% for a monotonically dry profile; nearly 2% for monotonically wet profiles and 3.8% for mixed profiles) with minimum constraints (surface, mean and bottom soil moisture contents). In this study, the constraints for the vertical soil moisture profiles were obtained from remotely sensed data. Low resolution (25 km) MW soil moisture estimates (AMSR-E) were downscaled to 4 km using a soil evaporation efficiency index based disaggregation approach. The downscaled MW soil moisture estimates served as a surface boundary condition, while 4 km resolution TIR based Atmospheric Land Exchange Inverse (ALEXI) estimates provided the required mean root-zone soil moisture content. Bottom soil moisture content is assumed to be a soil dependent constant. Multi-year (2002-2011) gridded profiles were developed for the southeastern United States using the POME method. The soil moisture profiles were compared to those generated in land surface models (the Land Information System (LIS) and the agricultural model DSSAT) along with available NRCS SCAN sites in the study region. The end product, spatial soil moisture profiles, can be assimilated into agricultural and hydrologic models in lieu of precipitation for data scarce regions.
Previous studies have shown that the principle of maximum entropy (POME) can be utilized with minimal constraints to develop vertical soil moisture profiles with accuracy (MAE = 1% for monotonically dry profiles; MAE = 2% for monotonically wet profiles and MAE = 3.8% for mixed profiles) when compared to laboratory and field data. In this study, vertical soil moisture profiles were developed using the POME model to evaluate an irrigation schedule over a maize field in north central Alabama (USA). The model was validated using both field data and a physically based mathematical model. The results demonstrate that a simple two-constraint entropy model under the assumption of a uniform initial soil moisture distribution can simulate most soil moisture profiles within the field area for 6 different soil types. The irrigation simulation demonstrated that the POME model produced a very efficient irrigation strategy, losing only about 1.9% of the total applied irrigation water. However, areas of fine-textured soil (i.e. silty clay) resulted in plant stress of nearly 30% of the available moisture content due to insufficient water supply on the last day of the drying phase of the irrigation cycle. Overall, the POME approach showed promise as a general strategy to guide irrigation in humid environments, with minimum input requirements.

  14. New thermodynamics of entropy generation minimization with nonlinear thermal radiation and nanomaterials

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Ijaz; Qayyum, Sumaira; Alsaedi, A.; Khan, M. Imran

    2018-03-01

    This research addresses entropy generation for MHD stagnation point flow of a viscous nanofluid over a stretching surface. Characteristics of heat transport are analyzed through nonlinear radiation and heat generation/absorption. Nanoliquid features of Brownian motion and thermophoresis are considered, and the fluid is subject to a constant applied inclined magnetic field. The flow problem is mathematically modeled and the governing expressions are changed into nonlinear ordinary ones by utilizing appropriate transformations. The effects of pertinent variables on velocity, nanoparticle concentration and temperature are discussed graphically. Furthermore, Brownian motion and thermophoresis effects on entropy generation and the Bejan number are examined. Total entropy generation is inspected through various flow variables. Consideration is mainly given to the convergence process. Velocity, temperature and mass gradients at the surface of the sheet are calculated numerically.

  15. Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián César

    2018-04-01

    Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70% and that, by changing only the inlet composition, it is possible to cut it by nearly 40%, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54% when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.

  16. Entropy Generation Across Earth's Bow Shock

    NASA Technical Reports Server (NTRS)

    Parks, George K.; McCarthy, Michael; Fu, Suiyan; Lee, E. S.; Cao, Jinbin; Goldstein, Melvyn L.; Canu, Patrick; Dandouras, Iannis S.; Reme, Henri; Fazakerley, Andrew

    2011-01-01

    Earth's bow shock is a transition layer that causes an irreversible change in the state of plasma that is stationary in time. Theories predict an entropy increase across the bow shock, but entropy has never been directly measured. The Cluster and Double Star plasma experiments measure 3D plasma distributions upstream and downstream of the bow shock that allow calculation of Boltzmann's entropy function H and his famous H-theorem, dH/dt ≤ 0. We present the first direct measurements of entropy density changes across Earth's bow shock. We will show that this entropy generation may be part of the processes that produce the non-thermal plasma distributions, and that it is consistent with a kinetic entropy flux model derived from the collisionless Boltzmann equation, giving strong support that the solar wind's total entropy across the bow shock remains unchanged. As far as we know, our results are not explained by any existing shock models and should be of interest to theorists.

  17. [Application of an Adaptive Inertia Weight Particle Swarm Algorithm in the Magnetic Resonance Bias Field Correction].

    PubMed

    Wang, Chang; Qin, Xin; Liu, Yan; Zhang, Wenchao

    2016-06-01

    An adaptive inertia weight particle swarm algorithm is proposed in this study to solve the local optimum problem of traditional particle swarm optimization in estimating the magnetic resonance (MR) image bias field. An indicator measuring the degree of premature convergence was designed to address this defect of the traditional particle swarm optimization algorithm. The inertia weight was adjusted adaptively based on this indicator to ensure that the particle swarm is optimized globally and does not fall into a local optimum. A Legendre polynomial was used to fit the bias field, the polynomial parameters were optimized globally, and finally the bias field was estimated and corrected. Compared to the improved entropy minimum algorithm, the entropy of the corrected image was smaller and the estimated bias field was more accurate. The corrected image was then segmented, and the segmentation accuracy obtained in this research was 10% higher than that of the improved entropy minimum algorithm. This algorithm can be applied to the correction of MR image bias fields.

  18. Numerical study of entropy generation in MHD water-based carbon nanotubes along an inclined permeable surface

    NASA Astrophysics Data System (ADS)

    Soomro, Feroz Ahmed; Rizwan-ul-Haq; Khan, Z. H.; Zhang, Qiang

    2017-10-01

    The main theme of the article is to examine the entropy generation analysis for the magneto-hydrodynamic mixed convection flow of water-functionalized carbon nanotubes along an inclined stretching surface. Thermophysical properties of both the particles and the working fluid are incorporated in the system of governing partial differential equations. The nonlinear system of equations is reduced via similarity transformations, and the solutions are further utilized to determine the volumetric entropy and characteristic entropy generation. Solutions of the governing boundary layer equations are obtained numerically using the finite difference method. Effects of two types of carbon nanotubes, namely single-wall carbon nanotubes (SWCNTs) and multi-wall carbon nanotubes (MWCNTs) with water as the base fluid, are analyzed for the physical quantities of interest, namely surface skin friction, heat transfer rate and entropy generation coefficients. Results for velocities, temperature, entropy generation and isotherms are plotted against the emerging parameters, namely nanoparticle fraction 0≤φ ≤ 0.2, thermal convective parameter 0≤ λ ≤ 5, Hartmann number 0≤ M≤ 2, suction/injection parameter -1≤ S≤ 1, and Eckert number 0≤ Ec ≤ 2. It is finally concluded that skin friction increases with the magnetic parameter, suction/injection and nanoparticle volume fraction, whereas the Nusselt number shows an increasing trend with the suction parameter, mixed convection parameter and nanoparticle volume fraction. Entropy generation shows opposite behaviors with respect to the Hartmann number and the mixed convection parameter for both single-wall and multi-wall carbon nanotubes.

  19. Generating Multivariate Ordinal Data via Entropy Principles.

    PubMed

    Lee, Yen; Kaplan, David

    2018-03-01

    When conducting robustness research where the focus of attention is on the impact of non-normality, the marginal skewness and kurtosis are often used to set the degree of non-normality. Monte Carlo methods are commonly applied to conduct this type of research by simulating data from distributions with skewness and kurtosis constrained to pre-specified values. Although several procedures have been proposed to simulate data from distributions with these constraints, no corresponding procedures have been applied for discrete distributions. In this paper, we present two procedures based on the principles of maximum entropy and minimum cross-entropy to estimate the multivariate observed ordinal distributions with constraints on skewness and kurtosis. For these procedures, the correlation matrix of the observed variables is not specified but depends on the relationships between the latent response variables. With the estimated distributions, researchers can study robustness not only focusing on the levels of non-normality but also on the variations in the distribution shapes. A simulation study demonstrates that these procedures yield excellent agreement between specified parameters and those of estimated distributions. A robustness study concerning the effect of distribution shape in the context of confirmatory factor analysis shows that shape can affect the robust [Formula: see text] and robust fit indices, especially when the sample size is small, the data are severely non-normal, and the fitted model is complex.
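As a simplified illustration of the maximum entropy principle underlying these procedures (using a mean constraint only, rather than the paper's skewness and kurtosis constraints), the maximum-entropy pmf over ordinal categories takes the Gibbs form p_i ∝ exp(λ·x_i), with λ found by bisection on the mean; the five-point scale below is a hypothetical example:

```python
import math

def maxent_ordinal(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy pmf over `values` with a fixed mean: p_i ∝ exp(lam*x_i).
    The mean is strictly increasing in lam, so bisection converges."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# A 5-point ordinal scale with mean pulled below the midpoint yields a
# right-skewed pmf; a mean at the midpoint recovers the uniform distribution.
p = maxent_ordinal([1, 2, 3, 4, 5], target_mean=2.2)
```

Adding skewness and kurtosis constraints extends the exponent to a cubic/quartic in x, at which point the multipliers must be solved jointly rather than by one-dimensional bisection.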

  20. Beyond the classical theory of heat conduction: a perspective view of future from entropy

    PubMed Central

    Lai, Xiang; Zhu, Pingan

    2016-01-01

    Energy is conserved by the first law of thermodynamics; its quality degrades constantly through entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation with regard to how its magnitude can be reduced, and to ask whether the entropy generated as time tends to infinity is bounded or not. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also builds up the relation between the second law of thermodynamics and mathematical inequalities, by developing such inequalities, some new and some classical. A concise review of entropy is also included, in the interest of performing the analysis in this work and similar analyses for other processes in the future. PMID:27843400
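
    The quantity being analyzed is the standard local entropy generation rate for conduction, σ(x) = k(dT/dx)²/T². A minimal numerical sketch (with illustrative numbers, not taken from the paper) integrates σ across a slab with a linear steady-state profile and checks it against the closed form q″(1/T_cold − 1/T_hot):

```python
import numpy as np

# Illustrative slab: these numbers are assumptions, not from the paper.
k = 1.0                        # W/(m K), thermal conductivity
L = 0.1                        # m, slab thickness
T_hot, T_cold = 400.0, 300.0   # K, boundary temperatures

x = np.linspace(0.0, L, 1001)
T = T_hot + (T_cold - T_hot) * x / L      # linear steady-state profile
dTdx = (T_cold - T_hot) / L
sigma = k * dTdx**2 / T**2                # local entropy generation rate, W/(m^3 K)

# Integrate sigma across the slab (trapezoid rule): entropy generation per unit area.
S_gen = float(np.sum((sigma[:-1] + sigma[1:]) / 2 * np.diff(x)))

# Closed form for a linear profile: q'' * (1/T_cold - 1/T_hot), with q'' = k * dT / L.
S_closed = k * (T_hot - T_cold) / L * (1.0 / T_cold - 1.0 / T_hot)
```

    For these numbers the rate is about 0.83 W/(m² K); it shrinks as the boundary temperature difference shrinks, which is the lever the paper's bounding analysis works with.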

  1. Integrated design of multivariable hydrometric networks using entropy theory with a multiobjective optimization approach

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Hwang, T.; Vose, J. M.; Martin, K. L.; Band, L. E.

    2016-12-01

    Obtaining quality hydrologic observations is the first step towards successful water resources management. While remote sensing techniques have made it possible to convert satellite images of the Earth's surface into hydrologic data, the importance of ground-based observations has never diminished, because in-situ data are often highly accurate and can be used to validate remote measurements. Efficient hydrometric networks are becoming more important for obtaining as much information as possible with minimum redundancy. The World Meteorological Organization (WMO) has recommended a guideline for the minimum hydrometric network density based on physiography; however, this guideline is not for optimum network design but for avoiding serious deficiency in a network. Moreover, all hydrologic variables are interconnected within the hydrologic cycle, yet monitoring networks have been designed individually. This study proposes an integrated network design method using entropy theory with a multiobjective optimization approach. Specifically, a precipitation network and a streamflow network in a semi-urban watershed in Ontario, Canada were designed simultaneously by maximizing joint entropy, minimizing total correlation, and maximizing the conditional entropy of the streamflow network given the precipitation network. Compared with typical individual network designs, the proposed method can determine more efficient optimal networks by avoiding redundant stations whose hydrologic information is transferable. Additionally, four quantization cases were applied in the entropy calculations to assess their implications for the station rankings and the optimal networks. The results showed that the selection of quantization method should be considered carefully, because the rankings and optimal networks are subject to change accordingly.
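
    The entropy quantities involved are easy to compute once the series are quantized. The following toy sketch (a greedy single-objective selection on synthetic integer-binned series, not the authors' multiobjective optimization) shows joint entropy, total correlation, and why a duplicate station is skipped in favor of an independent one:

```python
import numpy as np

def entropy(col):
    """Shannon entropy (bits) of one quantized series."""
    _, counts = np.unique(col, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def joint_entropy(data):
    """Joint entropy of the selected stations; rows are time steps."""
    _, counts = np.unique(data, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def total_correlation(data):
    """Redundancy among stations: sum of marginal entropies minus joint entropy."""
    return sum(entropy(data[:, j]) for j in range(data.shape[1])) - joint_entropy(data)

def greedy_select(data, k):
    """Greedily add the station that most increases joint entropy."""
    chosen, remaining = [], list(range(data.shape[1]))
    while len(chosen) < k:
        best = max(remaining, key=lambda j: joint_entropy(data[:, chosen + [j]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Synthetic example: station 1 duplicates station 0; station 2 is independent.
rng = np.random.default_rng(0)
s0 = rng.integers(0, 4, 500)
data = np.column_stack([s0, s0, rng.integers(0, 4, 500)])
picked = greedy_select(data, 2)
```

    The greedy pick takes the independent station over the duplicate because the duplicate adds no joint entropy; the quantization (here, four integer bins) changes these numbers, which is the sensitivity the abstract warns about.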

  3. Enhancement of heat transfer and entropy generation analysis of nanofluids turbulent convection flow in square section tubes

    NASA Astrophysics Data System (ADS)

    Bianco, Vincenzo; Nardini, Sergio; Manca, Oronzio

    2011-12-01

    In this article, the developing turbulent forced convection flow of a water-Al2O3 nanofluid in a square tube, subjected to constant and uniform wall heat flux, is numerically investigated. The mixture model is employed to simulate the nanofluid flow, and the investigation is carried out for a particle size of 38 nm. An entropy generation analysis is also proposed in order to find the optimal working condition for the given geometry under the given boundary conditions. A simple analytical procedure is proposed to evaluate the entropy generation, and its results are compared with the numerical calculations, showing very good agreement. A comparison of the resulting Nusselt numbers with experimental correlations available in the literature is also carried out. To minimize entropy generation, the optimal Reynolds number is determined.

  4. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle

    NASA Astrophysics Data System (ADS)

    Marín, Dolores; Sabater, Bartolomé

    2017-04-01

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated at around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize their rates of entropy production, a state that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.

  5. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle.

    PubMed

    Marín, Dolores; Sabater, Bartolomé

    2017-04-28

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated at around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize their rates of entropy production, a state that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.

  6. Minimum relative entropy distributions with a large mean are Gaussian

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo

    2016-12-01

    Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q, find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that mean(p) is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H-type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
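
    The minimum-relative-entropy posterior under a mean constraint is the classical exponential tilt of the prior, p(x) ∝ q(x)e^(λx). A quick numerical check (with a Poisson prior of our choosing, not an example from the paper) shows the standardized skewness shrinking toward the Gaussian value of zero as the constrained mean grows:

```python
import numpy as np
from scipy.stats import poisson

def tilted(logq, x, lam):
    """Exponential tilt p(x) ∝ q(x) exp(lam * x), computed in log space."""
    logw = logq + lam * x
    logw -= logw.max()          # stabilize before exponentiating
    w = np.exp(logw)
    return w / w.sum()

def skewness(p, x):
    m = np.sum(p * x)
    v = np.sum(p * (x - m) ** 2)
    return float(np.sum(p * (x - m) ** 3) / v ** 1.5)

x = np.arange(400)
logq = poisson.logpmf(x, 1.0)                 # prior q: Poisson with mean 1
s_small = skewness(tilted(logq, x, 0.0), x)   # untilted: mean stays 1, skewness 1
s_large = skewness(tilted(logq, x, 5.0), x)   # mean grows to ~e^5; skewness ~0.08
```

    Tilting a Poisson(1) prior by λ yields a Poisson(e^λ) posterior, so the skewness falls as 1/√mean, consistent with the approximately Gaussian limit.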

  7. Use and validity of principles of extremum of entropy production in the study of complex systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitor Reis, A., E-mail: ahr@uevora.pt

    2014-07-15

    It is shown how both the principles of extremum of entropy production, which are often used in the study of complex systems, follow from the maximization of overall system conductivities, under appropriate constraints. In this way, the maximum rate of entropy production (MEP) occurs when all the forces in the system are kept constant. On the other hand, the minimum rate of entropy production (mEP) occurs when all the currents that cross the system are kept constant. A brief discussion on the validity of the application of the mEP and MEP principles in several cases, and in particular to the Earth's climate, is also presented. -- Highlights: •The principles of extremum of entropy production are not first principles. •They result from the maximization of conductivities under appropriate constraints. •The conditions of their validity are set explicitly. •Some long-standing controversies are discussed and clarified.

  8. Evaluation of spectral entropy to measure anaesthetic depth and antinociception in sevoflurane-anaesthetised Beagle dogs.

    PubMed

    Morgaz, Juan; Granados, María del Mar; Domínguez, Juan Manuel; Navarrete, Rocío; Fernández, Andrés; Galán, Alba; Muñoz, Pilar; Gómez-Villamandos, Rafael J

    2011-06-01

    The use of spectral entropy to determine anaesthetic depth and antinociception was evaluated in sevoflurane-anaesthetised Beagle dogs. Dogs were anaesthetised at each of five multiples of their individual minimum alveolar concentrations (MAC; 0.75, 1, 1.25, 1.5 and 1.75 MAC), and response entropy (RE), state entropy (SE), RE-SE difference, burst suppression rate (BSR) and cardiorespiratory parameters were recorded before and after a painful stimulus. RE, SE and RE-SE difference did not change significantly after the stimuli. The correlation between MAC-entropy parameters was weak, but these values increased when 1.75 MAC results were excluded from the analysis. BSR was different to zero at 1.5 and 1.75 MAC. It was concluded that RE and RE-SE differences were not adequate indicators of antinociception and SE and RE were unable to detect deep planes of anaesthesia in dogs, although they both distinguished the awake and unconscious states. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Entropy-based artificial viscosity stabilization for non-equilibrium Grey Radiation-Hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc O., E-mail: delchinm@email.tamu.edu; Ragusa, Jean C., E-mail: jean.ragusa@tamu.edu; Morel, Jim, E-mail: jim.morel@tamu.edu

    2015-09-01

    The entropy viscosity method is extended to the non-equilibrium Grey Radiation-Hydrodynamic equations. The method employs a viscous regularization to stabilize the numerical solution. The artificial viscosity coefficient is modulated by the entropy production and peaks at shock locations. The added dissipative terms are consistent with the entropy minimum principle. A new functional form of the entropy residual, suitable for the Radiation-Hydrodynamic equations, is derived. We demonstrate that the viscous regularization preserves the equilibrium diffusion limit. The equations are discretized with a standard Continuous Galerkin Finite Element Method and a fully implicit temporal integrator within the MOOSE multiphysics framework. The method of manufactured solutions is employed to demonstrate second-order accuracy in both the equilibrium diffusion and streaming limits. Several typical 1-D radiation-hydrodynamic test cases with shocks (from Mach 1.05 to Mach 50) are presented to establish the ability of the technique to capture and resolve shocks.

  10. New Insights into the Fractional Order Diffusion Equation Using Entropy and Kurtosis.

    PubMed

    Ingo, Carson; Magin, Richard L; Parrish, Todd B

    2014-11-01

    Fractional order derivative operators offer a concise description to model multi-scale, heterogeneous and non-local systems. Specifically, in magnetic resonance imaging, there has been recent work to apply fractional order derivatives to model the non-Gaussian diffusion signal, which is ubiquitous in the movement of water protons within biological tissue. To provide a new perspective for establishing the utility of fractional order models, we apply entropy for the case of anomalous diffusion governed by a fractional order diffusion equation generalized in space and in time. This fractional order representation, in the form of the Mittag-Leffler function, gives an entropy minimum for the integer case of Gaussian diffusion and greater values of spectral entropy for non-integer values of the space and time derivatives. Furthermore, we consider kurtosis, defined as the normalized fourth moment, as another probabilistic description of the fractional time derivative. Finally, we demonstrate the implementation of anomalous diffusion, entropy and kurtosis measurements in diffusion weighted magnetic resonance imaging in the brain of a chronic ischemic stroke patient.

  11. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  12. Bilayer graphene phonovoltaic-FET: In situ phonon recycling

    NASA Astrophysics Data System (ADS)

    Melnick, Corey; Kaviany, Massoud

    2017-11-01

    A new heat harvester, the phonovoltaic (pV) cell, was recently proposed. The device converts optical phonons into power before they become heat. Due to the low entropy of a typical hot optical phonon population, the phonovoltaic can operate at high fractions of the Carnot limit and harvest heat more efficiently than conventional heat harvesting technologies such as the thermoelectric generator. Previously, the optical phonon source was presumed to produce optical phonons with a single polarization and momentum. Here, we examine a realistic optical phonon source in a potential pV application and the effects this has on pV operation. Supplementing this work is our investigation of bilayer graphene as a new pV material. Our ab initio calculations show that bilayer graphene has a figure of merit exceeding 0.9, well above previously investigated materials. This allows a room-temperature pV to recycle 65% of a highly nonequilibrium, minimum entropy population of phonons. However, full-band Monte Carlo simulations of the electron and phonon dynamics in a bilayer graphene field-effect transistor (FET) show that the optical phonons emitted by field-accelerated electrons can only be recycled in situ with an efficiency of 50%, and this efficiency falls as the field strength grows. Still, an appropriately designed FET-pV can recycle the phonons produced therein in situ with a much higher efficiency than a thermoelectric generator can harvest heat produced by a FET ex situ.

  13. Irreversible entropy model for damage diagnosis in resistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuadras, Angel, E-mail: angel.cuadras@upc.edu; Crisóstomo, Javier; Ovejas, Victoria J.

    2015-10-28

    We propose a method to characterize electrical resistor damage based on entropy measurements. Irreversible entropy and the rate at which it is generated are more convenient parameters than resistance for describing damage because they are essentially positive in virtue of the second law of thermodynamics, whereas resistance may increase or decrease depending on the degradation mechanism. Commercial resistors were tested in order to characterize the damage induced by power surges. Resistors were biased with constant and pulsed voltage signals, leading to power dissipation in the range of 4–8 W, which is well above the 0.25 W nominal power to initiate failure. Entropy was inferred from the added power and temperature evolution. A model is proposed to understand the relationship among resistance, entropy, and damage. The power surge dissipates into heat (Joule effect) and damages the resistor. The results show a correlation between entropy generation rate and resistor failure. We conclude that damage can be conveniently assessed from irreversible entropy generation. Our results for resistors can be easily extrapolated to other systems or machines that can be modeled based on their resistance.
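
    The inferred quantity is the Clausius-style irreversible entropy ΔS_i = ∫ P(t)/T(t) dt from the added power and temperature evolution. A minimal sketch follows; the 4 W level matches the abstract's lower bound, but the temperature curve is hypothetical:

```python
import numpy as np

P = 4.0                          # W, dissipated power (abstract's lower bound)
t = np.linspace(0.0, 60.0, 601)  # s
# Hypothetical first-order heating curve, 300 K ambient rising toward 380 K.
T = 300.0 + 80.0 * (1.0 - np.exp(-t / 10.0))

# Irreversible entropy generated: dS_i = P dt / T(t), integrated by trapezoid rule.
rate = P / T
S_irr = float(np.sum((rate[:-1] + rate[1:]) / 2 * np.diff(t)))
```

    The result is positive by construction (P > 0, T > 0), which is the property the authors exploit: unlike resistance, the accumulated entropy can only grow as damage accrues.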

  14. Necessary conditions for the optimality of variable rate residual vector quantizers

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Smith, Mark J. T.; Barnes, Christopher F.

    1993-01-01

    Residual vector quantization (RVQ), or multistage VQ, as it is also called, has recently been shown to be a competitive technique for data compression. The competitive performance reported for RVQ results from the joint optimization of variable rate encoding and RVQ direct-sum code books. In this paper, necessary conditions for the optimality of variable rate RVQ's are derived, and an iterative descent algorithm based on a Lagrangian formulation is introduced for designing RVQ's having minimum average distortion subject to an entropy constraint. Simulation results for these entropy-constrained RVQ's (EC-RVQ's) are presented for memoryless Gaussian, Laplacian, and uniform sources. A Gauss-Markov source is also considered. The performance is superior to that of entropy-constrained scalar quantizers (EC-SQ's) and practical entropy-constrained vector quantizers (EC-VQ's), and is competitive with that of some of the best source coding techniques that have appeared in the literature.
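
    The Lagrangian formulation can be illustrated in its simplest setting: an entropy-constrained scalar quantizer that descends on J = D + λR. This is a toy sketch of the general ECVQ-style descent idea (not the paper's residual-VQ design), run on a memoryless Gaussian source like the one in the simulations:

```python
import numpy as np

def ecsq(x, k=8, lam=0.1, iters=50):
    """Toy entropy-constrained scalar quantizer minimizing J = D + lam * R.

    A sketch of the Lagrangian descent idea specialized to scalars,
    not the paper's residual-VQ design."""
    rng = np.random.default_rng(1)
    c = np.sort(rng.choice(x, size=k, replace=False))  # initial codebook
    p = np.full(k, 1.0 / k)                            # codeword probabilities
    for _ in range(iters):
        # Encode: squared error plus lam times the codeword length -log2(p).
        length = -np.log2(np.maximum(p, 1e-12))
        cost = (x[:, None] - c[None, :]) ** 2 + lam * length
        idx = np.argmin(cost, axis=1)
        # Update probabilities, and centroids of non-empty cells.
        for j in range(k):
            sel = idx == j
            p[j] = sel.mean()
            if sel.any():
                c[j] = x[sel].mean()
    D = float(np.mean((x - c[idx]) ** 2))              # distortion
    R = float(-np.sum(p[p > 0] * np.log2(p[p > 0])))   # entropy (rate), bits
    return D, R

x = np.random.default_rng(0).normal(size=2000)  # memoryless Gaussian source
D_lo, R_lo = ecsq(x, lam=0.001)  # weak rate penalty: high rate, low distortion
D_hi, R_hi = ecsq(x, lam=2.0)    # strong rate penalty: low rate, high distortion
```

    Sweeping λ traces out the operational rate-distortion curve; the paper applies the same Lagrangian trade-off jointly across the stages of a residual VQ.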

  15. The Nature of Grand Minima and Maxima from Fully Nonlinear Flux Transport Dynamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inceoglu, Fadil; Arlt, Rainer; Rempel, Matthias, E-mail: finceoglu@aip.de

    We aim to investigate the nature and occurrence characteristics of grand solar minimum and maximum periods, which are observed in solar proxy records such as ¹⁰Be and ¹⁴C, using a fully nonlinear Babcock–Leighton type flux transport dynamo including momentum and entropy equations. The differential rotation and meridional circulation are generated from the effect of turbulent Reynolds stress and are subjected to back-reaction from the magnetic field. To generate grand minimum- and maximum-like periods in our simulations, we used random fluctuations in the angular momentum transport process, namely the Λ-mechanism, and in the Babcock–Leighton mechanism. To characterize the nature and occurrences of the identified grand minima and maxima in our simulations, we used the waiting time distribution analyses, which reflect whether the underlying distribution arises from a random or a memory-bearing process. The results show that, in the majority of the cases, the distributions of grand minima and maxima reveal that the nature of these events originates from memoryless processes. We also found that in our simulations the meridional circulation speed tends to be smaller during grand maximum, while it is faster during grand minimum periods. The radial differential rotation tends to be larger during grand maxima, while it is smaller during grand minima. The latitudinal differential rotation, on the other hand, is found to be larger during grand minima.

  16. Entropy Production in Chemical Reactors

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián C.

    2017-06-01

    We have analyzed entropy production in chemically reacting systems and extended previous results to the two limiting cases of ideal reactors, namely the continuous stirred tank reactor (CSTR) and the plug flow reactor (PFR). We have found upper and lower bounds for the entropy production in isothermal systems, given expressions for non-isothermal operation, and analyzed the influence of pressure and temperature on entropy generation minimization in reactors with a fixed volume and production. We also give a graphical picture of entropy production in chemical reactions subject to constant volume, which allows us to easily assess different options. We show that by dividing a reactor into two smaller ones operating at different temperatures, the entropy production is lowered, by as much as 48% in the case of a CSTR and a PFR in series, and by 58% with two CSTRs. Finally, we study the optimal pressure and temperature for a single isothermal PFR, taking into account the irreversibility introduced by a compressor and a heat exchanger, decreasing the entropy generation by as much as 30%.

  17. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services, and the types of ecosystem services flow were reclassified. Using entropy theory, the disorder degree and developing trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a large disorder degree, near the verge of non-health. The system reached maximum values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of total permanent population in Beijing and urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.

  18. ECOSYSTEM GROWTH AND DEVELOPMENT

    EPA Science Inventory

    Thermodynamically, ecosystem growth and development is the process by which energy throughflow and stored biomass increase. Several proposed hypotheses describe the natural tendencies that occur as an ecosystem matures, and here, we consider five: minimum entropy production, maxi...

  19. Heat capacities and thermodynamic properties of annite (aluminous iron biotite)

    USGS Publications Warehouse

    Hemingway, B.S.; Robie, R.A.

    1990-01-01

    The heat capacities have been measured between 7 and 650 K by quasi-adiabatic calorimetry and differential scanning calorimetry. At 298.15 K and 1 bar, the calorimetric entropy for our sample is 354.9±0.7 J/(mol·K). A minimum configurational entropy of 18.7 J/(mol·K) for full disorder of Al/Si in the tetrahedral sites should be added to the calorimetric entropy for third-law calculations. The heat capacity equation [Cp in units of J/(mol·K)] Cp° = 583.586 + 0.075246T - 3420.60T^-0.5 - (4.4551 × 10^6)T^-2 fits the experimental and estimated heat capacities for our sample (valid range 250 to 1000 K) with an average deviation of 0.37%. -from Authors
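
    The fitted heat capacity polynomial is directly usable as stated; a transcription of the abstract's equation, with the configurational-entropy addition for third-law work alongside:

```python
def cp_annite(T):
    """Heat capacity of annite in J/(mol*K); fit valid for 250-1000 K.
    Coefficients transcribed from the abstract."""
    return 583.586 + 0.075246 * T - 3420.60 * T**-0.5 - 4.4551e6 * T**-2

S_calorimetric = 354.9      # J/(mol*K) at 298.15 K
S_configurational = 18.7    # J/(mol*K), full Al/Si tetrahedral disorder
S_third_law = S_calorimetric + S_configurational   # entropy for third-law calculations
```

    Evaluating the fit at 298.15 K gives a Cp of roughly 358 J/(mol·K), and Cp rises monotonically toward the upper end of the fit's validity range.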

  20. Temperature lapse rates at restricted thermodynamic equilibrium. Part II: Saturated air and further discussions

    NASA Astrophysics Data System (ADS)

    Björnbom, Pehr

    2016-03-01

    In the first part of this work equilibrium temperature profiles in fluid columns with ideal gas or ideal liquid were obtained by numerically minimizing the column energy at constant entropy, equivalent to maximizing column entropy at constant energy. A minimum in internal plus potential energy for an isothermal temperature profile was obtained in line with Gibbs' classical equilibrium criterion. However, a minimum in internal energy alone for adiabatic temperature profiles was also obtained. This led to a hypothesis that the adiabatic lapse rate corresponds to a restricted equilibrium state, a type of state in fact discussed already by Gibbs. In this paper similar numerical results for a fluid column with saturated air suggest that also the saturated adiabatic lapse rate corresponds to a restricted equilibrium state. The proposed hypothesis is further discussed and amended based on the previous and the present numerical results and a theoretical analysis based on Gibbs' equilibrium theory.

  1. Optimal Binarization of Gray-Scaled Digital Images via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A. (Inventor); Klinko, Steven J. (Inventor)

    2007-01-01

    A technique for finding an optimal threshold for binarization of a gray scale image employs fuzzy reasoning. A triangular membership function is employed which is dependent on the degree to which the pixels in the image belong to either the foreground class or the background class. Use of a simplified linear fuzzy entropy factor function facilitates short execution times and use of membership values between 0.0 and 1.0 for improved accuracy. To improve accuracy further, the membership function employs lower and upper bound gray level limits that can vary from image to image and are selected to be equal to the minimum and the maximum gray levels, respectively, that are present in the image to be converted. To identify the optimal binarization threshold, an iterative process is employed in which different possible thresholds are tested and the one providing the minimum fuzzy entropy measure is selected.
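
    The search loop is straightforward to sketch. The membership and fuzziness measures below are a simplified stand-in (a linear index of fuzziness around per-class means, bounded by the image's own min/max gray levels), not the patent's exact triangular membership function:

```python
import numpy as np

def fuzzy_entropy_threshold(img):
    """Pick the binarization threshold minimizing a linear fuzzy-entropy measure.

    A sketch of the general approach (iterate candidate thresholds, keep the one
    with minimum fuzziness), not the patent's exact membership math."""
    g = img.ravel().astype(float)
    gmin, gmax = g.min(), g.max()          # image-specific bounds, as in the patent
    span = max(gmax - gmin, 1.0)
    best_t, best_h = None, np.inf
    for t in range(int(gmin) + 1, int(gmax) + 1):
        lo, hi = g < t, g >= t
        m0 = g[lo].mean() if lo.any() else gmin   # background class mean
        m1 = g[hi].mean() if hi.any() else gmax   # foreground class mean
        # Membership of each pixel in its own class, in [0, 1].
        mu = np.where(lo, 1 - np.abs(g - m0) / span, 1 - np.abs(g - m1) / span)
        h = float(np.mean(2 * np.minimum(mu, 1 - mu)))  # linear fuzziness index
        if h < best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal test image: one cluster near 10, one near 190.
rng = np.random.default_rng(0)
img = np.concatenate([rng.integers(0, 21, 200), rng.integers(180, 201, 200)])
t_opt = fuzzy_entropy_threshold(img)
```

    On a cleanly bimodal image the minimum-fuzziness threshold lands in the gap between the two clusters, where every pixel is close to its class mean.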

  2. A MATLAB implementation of the minimum relative entropy method for linear inverse problems

    NASA Astrophysics Data System (ADS)

    Neupauer, Roseanna M.; Borchers, Brian

    2001-08-01

    The minimum relative entropy (MRE) method can be used to solve linear inverse problems of the form Gm = d, where m is a vector of unknown model parameters and d is a vector of measured data. The MRE method treats the elements of m as random variables, and obtains a multivariate probability density function for m. The probability density function is constrained by prior information about the upper and lower bounds of m, a prior expected value of m, and the measured data. The solution of the inverse problem is the expected value of m, based on the derived probability density function. We present a MATLAB implementation of the MRE method. Several numerical issues arise in the implementation of the MRE method and are discussed here. We present the source history reconstruction problem from groundwater hydrology as an example of the MRE implementation.
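
    The same three ingredients (bounds on m, a prior expected value, and the data) can be wired together in a much simpler stand-in: bounded, prior-anchored least squares. This is not the MRE machinery, which derives a full multivariate pdf, and the blurred source-history problem below is hypothetical, but it shows the structure of such an inverse problem:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical source-history problem: d = G m with a Gaussian blurring kernel.
n = 20
xg = np.linspace(0.0, 1.0, n)
G = np.exp(-((xg[:, None] - xg[None, :]) ** 2) / 0.01)   # blurring kernel
m_true = np.exp(-((xg - 0.5) ** 2) / 0.02)               # "true" source history
d = G @ m_true                                           # noiseless data

m_prior = np.full(n, 0.5)   # prior expected value of m
lam = 1e-3                  # small weight pulling the solution toward the prior
# Augment the system so the regularizer is part of one least-squares solve.
G_aug = np.vstack([G, lam * np.eye(n)])
d_aug = np.concatenate([d, lam * m_prior])
res = lsq_linear(G_aug, d_aug, bounds=(0.0, 1.0))        # bounds on m enforced here
m_hat = res.x
```

    The bounds play the same stabilizing role they do in MRE; the difference is that MRE returns an expected value under a derived probability density rather than a penalized point estimate.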

  3. Heat transfer enhancement and entropy generation analysis of Al2O3-water nanofluid in an alternating oval cross-section tube using two-phase mixture model under turbulent flow

    NASA Astrophysics Data System (ADS)

    Najafi Khaboshan, Hasan; Nazif, Hamid Reza

    2018-04-01

    Heat transfer and turbulent flow of an Al2O3-water nanofluid within an alternating oval cross-section tube are numerically simulated using the Eulerian-Eulerian two-phase mixture model. The primary goal of the present study is to investigate the effects of nanoparticle volume fraction, nanoparticle diameter, and different inlet velocities on the heat transfer, pressure drop, and entropy generation characteristics of the alternating oval cross-section tube. For validation, the numerical results were compared with experimental data. A constant wall temperature boundary condition was considered on the tube wall. In addition, the thermal-hydraulic performance and entropy generation characteristics of the alternating oval cross-section tube and a circular tube were compared for the same fluids. The results show that the heat transfer coefficient and pressure drop of the alternating oval cross-section tube are higher than those of the base tube for the same fluids. Both parameters also increase when Al2O3 nanoparticles are added to the water, at any inlet velocity, for both tubes. Furthermore, compared to the base fluid, the heat transfer enhancement of the nanofluid exceeds the increase in its friction factor at the same inlet boundary conditions. The entropy generation analysis shows that the total entropy generation increases with increasing nanoparticle volume fraction and decreasing nanoparticle diameter. Thermal entropy generation is the main part of the irreversibility, and the Bejan number increases slightly with the nanoparticle diameter. Finally, at any given inlet velocity, the frictional irreversibility grows with an increase in the nanoparticle volume fraction.

  4. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    PubMed

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase-sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  5. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

    Background We introduce approximate entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene-expression patterns of a clinical sample set. Since entropy is a measure of disorder in a system, we believe that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set, we will be able to distinguish those genes which display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify the genes of interest with the highest probability of producing an accurate, predictive classification model from our data set. Results In the development of a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells. This may be measured in terms of the ApSE when compared to normal tissue. We have validated 10 of these genes on 10 normal and 20 cancer and CIN3 samples. We report that the predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity in a biological system from such a relatively small sample set, and in extracting biologically relevant genes of interest, holds great promise. PMID:19232110
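
    Approximate entropy scores the regularity of a series: low for stable, repetitive patterns (the normal controls above) and higher for irregular ones (the heterogeneous malignant samples). A minimal sketch of the standard Pincus-style calculation; the embedding dimension m and tolerance fraction are the usual defaults, not values from the paper:

```python
import numpy as np

def approximate_entropy(x, m=2, r_frac=0.2):
    """Approximate entropy: near zero for regular series, larger for irregular ones."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)  # tolerance scaled by the series' spread

    def phi(m):
        # All length-m templates and, for each, the fraction of templates
        # within Chebyshev distance r (self-matches included, per ApEn).
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        counts = [np.mean(np.max(np.abs(emb - v), axis=1) <= r) for v in emb]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

regular = [1.0, 2.0] * 50                      # perfectly periodic -> ApEn near 0
noisy = np.random.default_rng(0).random(100)   # irregular -> larger ApEn
```

    A perfectly alternating series scores near zero while an irregular one scores higher, which is the property exploited to separate unchanging normal expression from heterogeneous malignant expression.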

  6. Entropy Generation Analysis in Convective Ferromagnetic Nano Blood Flow Through a Composite Stenosed Arteries with Permeable Wall

    NASA Astrophysics Data System (ADS)

    Sher Akbar, Noreen; Wahid Butt, Adil

    2017-05-01

    The study of heat transfer is of significant importance in many biological and biomedical industry problems. This investigation comprises an entropy generation analysis of blood flow in arteries with permeable walls. Convection through the flow is studied alongside the entropy generation. The governing problem is formulated and solved under the low Reynolds number and long wavelength approximations. Exact analytical solutions have been obtained and are analyzed graphically. It is seen that the temperature for pure water is lower than that for copper-water nanofluid, and it gains magnitude with an increase in the slip parameter.

  7. A Maximum Entropy Method for Particle Filtering

    NASA Astrophysics Data System (ADS)

    Eyink, Gregory L.; Kim, Sangil

    2006-06-01

    Standard ensemble or particle filtering schemes do not properly represent states of low prior probability when the number of available samples is too small, as is often the case in practical applications. We introduce here a set of parametric resampling methods to solve this problem. Motivated by a general H-theorem for relative entropy, we construct parametric models for the filter distributions as maximum-entropy/minimum-information models consistent with moments of the particle ensemble. When the prior distributions are modeled as mixtures of Gaussians, our method naturally generalizes the ensemble Kalman filter to systems with highly non-Gaussian statistics. We apply the new particle filters presented here to two simple test cases: a one-dimensional diffusion process in a double-well potential and the three-dimensional chaotic dynamical system of Lorenz.
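
    For a fixed mean and covariance, the Gaussian is the maximum-entropy distribution, which is the simplest instance of the moment-matching idea described above. A minimal sketch of fitting that model to a weighted particle ensemble and resampling from it (the paper's mixture-of-Gaussians construction is more general):

```python
import numpy as np

def maxent_gaussian_resample(particles, weights, n_out, rng):
    """Fit the maximum-entropy (Gaussian) model matching the ensemble's first
    two moments, then draw a fresh, larger sample from it."""
    weights = weights / weights.sum()
    mu = weights @ particles                           # weighted ensemble mean
    centered = particles - mu
    cov = (weights[:, None] * centered).T @ centered   # weighted covariance
    return rng.multivariate_normal(mu, cov, size=n_out)

rng = np.random.default_rng(1)
particles = rng.normal(loc=[1.0, -2.0], scale=1.0, size=(500, 2))
weights = np.ones(500)
resampled = maxent_gaussian_resample(particles, weights, n_out=4000, rng=rng)
```

    The resampled ensemble reproduces the original moments while repopulating low-probability regions that a small particle set undersamples.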

  8. Thermodynamic geometry for a non-extensive ideal gas

    NASA Astrophysics Data System (ADS)

    López, J. L.; Obregón, O.; Torres-Arenas, J.

    2018-05-01

    A generalized entropy arising in the context of superstatistics is applied to an ideal gas. The curvature scalar associated with the thermodynamic space generated by this modified entropy is calculated using two formalisms of the geometric approach to thermodynamics. By means of the curvature/interaction hypothesis of thermodynamic geometry, it is found that, as a consequence of considering a generalized statistics, an effective interaction arises, but this interaction is not enough to generate a phase transition. This generalized entropy seems to be relevant in confinement or in systems with few degrees of freedom, so it could be interesting to use such entropies to characterize the thermodynamics of small systems.

  9. Measurement of entropy generation within bypass transitional flow

    NASA Astrophysics Data System (ADS)

    Skifton, Richard; Budwig, Ralph; McEligot, Donald; Crepeau, John

    2012-11-01

    A flat plate made from quartz was submersed in the Idaho National Laboratory's Matched Index of Refraction (MIR) flow facility. PIV was utilized to capture spatial vector maps at near-wall locations, with five to ten points within the viscous sublayer. Entropy generation was calculated directly from measured velocity fluctuation derivatives. Two flows were studied: a zero pressure gradient and an adverse pressure gradient (β = -0.039). The free stream turbulence intensity used to drive bypass transition ranged between 3% (near the trailing edge) and 8% (near the leading edge). The pointwise entropy generation rate will be utilized as a design parameter to systematically reduce losses. As a second observation, the pointwise entropy can be shown to predict the onset of transitional flow. This research was partially supported by the DOE EPSCoR program, grant DE-SC0004751, by the Idaho National Laboratory, and by the Center for Advanced Energy Studies.

  10. Effect of entropy change of lithium intercalation in cathodes and anodes on Li-ion battery thermal management

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vilayanur V.; Choi, Daiwon; Wang, Donghai; Xu, Wu; Towne, Silas; Williford, Ralph E.; Zhang, Ji-Guang; Liu, Jun; Yang, Zhenguo

    The entropy changes (ΔS) in various cathode and anode materials, as well as in complete Li-ion batteries, were measured using an electrochemical thermodynamic measurement system (ETMS). LiCoO2 has a much larger entropy change than electrodes based on LiNixCoyMnzO2 and LiFePO4, while lithium titanate based anodes have a lower entropy change compared to graphite anodes. The reversible heat generation rate was found to be a significant portion of the total heat generation rate. The appropriate combinations of cathode and anode were investigated to minimize the reversible heat generation rate across the 0-100% state of charge (SOC) range. In addition to screening for battery electrode materials with low reversible heat, the techniques described in this paper can be a useful engineering tool for battery thermal management in stationary and transportation applications.

  11. Zero entropy continuous interval maps and MMLS-MMA property

    NASA Astrophysics Data System (ADS)

    Jiang, Yunping

    2018-06-01

    We prove that the flow generated by any continuous interval map with zero topological entropy is minimally mean-attractable and minimally mean-L-stable. One of the consequences is that any oscillating sequence is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy. In particular, the Möbius function is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy (Sarnak’s conjecture for continuous interval maps). Another consequence is a non-trivial example of a flow having discrete spectrum. We also define a log-uniform oscillating sequence and show a result in ergodic theory for comparison. This material is based upon work supported by the National Science Foundation. It is also partially supported by a collaboration grant from the Simons Foundation (grant number 523341) and PSC-CUNY awards and a grant from NSFC (grant number 11571122).

  12. Adjusting protein graphs based on graph entropy.

    PubMed

    Peng, Sheng-Lung; Tsay, Yu-Wei

    2014-01-01

    Measuring protein structural similarity attempts to establish a relationship of equivalence between polymer structures based on their conformations. In several recent studies, researchers have explored protein-graph remodeling instead of looking for a minimum superimposition of pairwise proteins. When graphs are used to represent structured objects, the problem of measuring object similarity becomes one of computing the similarity between graphs. Graph theory provides an alternative perspective as well as efficiency. Once a protein graph has been created, its structural stability must be verified. Therefore, a criterion is needed to determine whether a protein graph can be used for structural comparison. In this paper, we propose a measurement for protein graph remodeling based on graph entropy. We extend the concept of graph entropy to determine whether a graph is suitable for representing a protein. The experimental results suggest that, when applied, graph entropy aids conformational assessment in protein graph modeling. Furthermore, it indirectly contributes to protein structural comparison when a protein graph is sound.
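
    "Graph entropy" has several inequivalent definitions in the literature; as an illustration only (not necessarily the authors' measure), one simple structural variant is the Shannon entropy of a graph's normalized degree distribution:

```python
import math

def degree_entropy(adjacency):
    """Shannon entropy (bits) of a graph's normalized degree sequence.
    One of several 'graph entropy' definitions; an illustrative proxy only."""
    degrees = [sum(row) for row in adjacency]
    total = sum(degrees)
    return -sum((d / total) * math.log2(d / total) for d in degrees if d > 0)

# Complete graph K4: all degrees equal -> maximal entropy, log2(4) = 2 bits.
k4 = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
# Star graph: one hub, three leaves -> less uniform degrees, lower entropy.
star = [[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]
```

    Under this variant, a more regular graph scores higher entropy than a hub-dominated one, giving a single scalar that can flag structurally degenerate graph models.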

  13. Adjusting protein graphs based on graph entropy

    PubMed Central

    2014-01-01

    Measuring protein structural similarity attempts to establish a relationship of equivalence between polymer structures based on their conformations. In several recent studies, researchers have explored protein-graph remodeling instead of looking for a minimum superimposition of pairwise proteins. When graphs are used to represent structured objects, the problem of measuring object similarity becomes one of computing the similarity between graphs. Graph theory provides an alternative perspective as well as efficiency. Once a protein graph has been created, its structural stability must be verified. Therefore, a criterion is needed to determine whether a protein graph can be used for structural comparison. In this paper, we propose a measurement for protein graph remodeling based on graph entropy. We extend the concept of graph entropy to determine whether a graph is suitable for representing a protein. The experimental results suggest that, when applied, graph entropy aids conformational assessment in protein graph modeling. Furthermore, it indirectly contributes to protein structural comparison when a protein graph is sound. PMID:25474347

  14. A globally convergent Lagrange and barrier function iterative algorithm for the traveling salesman problem.

    PubMed

    Dang, C; Xu, L

    2001-03-01

    In this paper a globally convergent Lagrange and barrier function iterative algorithm is proposed for approximating a solution of the traveling salesman problem. The algorithm employs an entropy-type barrier function to deal with nonnegativity constraints and Lagrange multipliers to handle linear equality constraints, and attempts to produce a solution of high quality by generating a minimum point of a barrier problem for a sequence of descending values of the barrier parameter. For any given value of the barrier parameter, the algorithm searches for a minimum point of the barrier problem in a feasible descent direction, which has a desired property that the nonnegativity constraints are always satisfied automatically if the step length is a number between zero and one. At each iteration the feasible descent direction is found by updating Lagrange multipliers with a globally convergent iterative procedure. For any given value of the barrier parameter, the algorithm converges to a stationary point of the barrier problem without any condition on the objective function. Theoretical and numerical results show that the algorithm seems more effective and efficient than the softassign algorithm.

  15. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  16. Using heteroclinic orbits to quantify topological entropy in fluid flows

    NASA Astrophysics Data System (ADS)

    Sattari, Sulimon; Chen, Qianting; Mitchell, Kevin A.

    2016-03-01

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or "ghost," rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  17. Entropy generation method to quantify thermal comfort.

    PubMed

    Boregowda, S C; Tiwari, S N; Chaturvedi, S K

    2001-12-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and to quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The simulation output, which includes the human thermal responses, and the input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated from the regression equation using the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
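
    The entropy generation term above follows from the second law applied to heat exchanged between the body and its surroundings. In the simplest textbook form, heat Q flowing from a warm body at T_body to an environment at T_env generates entropy Q(1/T_env - 1/T_body). A minimal sketch with hypothetical values (the paper's combined human-environment balance includes additional physiological terms):

```python
def entropy_generation_rate(q, t_body, t_env):
    """Entropy generation rate (W/K) for heat flow q (W) from a body at
    t_body (K) to an environment at t_env (K); positive whenever t_body > t_env."""
    return q * (1.0 / t_env - 1.0 / t_body)

# Hypothetical conditions: 100 W of heat loss, 33 C skin, 23 C ambient air.
s_gen = entropy_generation_rate(q=100.0, t_body=306.15, t_env=296.15)
```

    Larger temperature mismatches between the body and the environment produce more entropy generation, which is the quantity the OTCI maps onto discomfort.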

  18. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and to quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The simulation output, which includes the human thermal responses, and the input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated from the regression equation using the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

  19. Entropy as a Gene-Like Performance Indicator Promoting Thermoelectric Materials.

    PubMed

    Liu, Ruiheng; Chen, Hongyi; Zhao, Kunpeng; Qin, Yuting; Jiang, Binbin; Zhang, Tiansong; Sha, Gang; Shi, Xun; Uher, Ctirad; Zhang, Wenqing; Chen, Lidong

    2017-10-01

    High-throughput explorations of novel thermoelectric materials based on the Materials Genome Initiative paradigm have focused on digging into the structure-property space using nonglobal indicators to design materials with tunable electrical and thermal transport properties. As the genomic units, following the biogene tradition, such indicators include localized crystal structural blocks in real space or band degeneracy at certain points in reciprocal space. However, this nonglobal approach does not consider how real materials differentiate from one another. Here, this study successfully develops a strategy of using entropy as a global, gene-like performance indicator and shows how multicomponent thermoelectric materials with high entropy can be designed via a high-throughput screening method. Optimizing entropy works as an effective guide to greatly improve thermoelectric performance, either by depressing the lattice thermal conductivity down to its theoretical minimum value or by enhancing the crystal structure symmetry to yield large Seebeck coefficients. Entropy engineering using multicomponent crystal structures or other possible techniques provides a new avenue for improving thermoelectric performance beyond current methods and approaches. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Generating intrinsically disordered protein conformational ensembles from a Markov chain

    NASA Astrophysics Data System (ADS)

    Cukier, Robert I.

    2018-03-01

    Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI, which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and a not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without the very large entropy that might lead to too high an entropy penalty.
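
    The pair mutual information in this abstract depends only on the chain's stationary distribution and transition matrix. A minimal sketch for a dichotomic (two-state) chain; the transition probabilities below are illustrative, not taken from the paper:

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of a transition matrix P (rows sum to 1)."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # eigenvector for eigenvalue 1
    return v / v.sum()

def pair_mutual_information(P):
    """Mutual information (nats) between neighbor states of a stationary chain."""
    pi = stationary_distribution(P)
    joint = pi[:, None] * P                   # p(i, j) for adjacent residues
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mask = joint > 0
    indep = (px[:, None] * py[None, :])[mask]
    return float(np.sum(joint[mask] * np.log(joint[mask] / indep)))

independent = np.array([[0.3, 0.7], [0.3, 0.7]])     # identical rows -> MI = 0
persistent = np.array([[0.95, 0.05], [0.05, 0.95]])  # strong neighbor dependence
```

    A chain with identical rows has independent neighbors (MI = 0, the not-MoRF limit), while a strongly persistent chain has large MI, the dependence that characterizes a MoRF in the abstract's terminology.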

  1. Multipoint Optimal Minimum Entropy Deconvolution and Convolution Fix: Application to vibration fault detection

    NASA Astrophysics Data System (ADS)

    McDonald, Geoff L.; Zhao, Qing

    2017-01-01

    Minimum Entropy Deconvolution (MED) has been applied successfully to rotating machine fault detection from vibration data; however, this method has limitations. A convolution adjustment to the MED definition and solution is proposed in this paper to address the discontinuity at the start of the signal, which in some cases causes spurious impulses to be erroneously deconvolved. A problem with the MED solution is that it is an iterative selection process and will not necessarily design an optimal filter for the posed problem. Additionally, the problem goal in MED prefers to deconvolve a single impulse, while in rotating machine faults we expect one impulse-like vibration source per rotational period of the faulty element. Maximum Correlated Kurtosis Deconvolution was proposed to address some of these problems, and although it solves the target goal of multiple periodic impulses, it is still an iterative, non-optimal solution to the posed problem and only solves for a limited set of impulses in a row. Ideally, the problem should target an impulse train as the output goal and should directly solve for the optimal filter in a non-iterative manner. To meet these goals, we propose a non-iterative deconvolution approach called Multipoint Optimal Minimum Entropy Deconvolution Adjusted (MOMEDA). MOMEDA poses a deconvolution problem with an infinite impulse train as the goal, and the optimal filter solution can be solved for directly. From experimental data on a gearbox with and without a gear tooth chip, we show that MOMEDA and its deconvolution spectra, computed according to the period between the impulses, can be used to detect faults and study the health of rotating machine elements effectively.
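
    The classical iterative MED that MOMEDA improves on can be sketched as a Wiggins-style update: seek the FIR filter maximizing the kurtosis of its output, whose stationary point satisfies a normal-equation relation between the input and the cubed output. A hedged sketch on synthetic data (filter length, iteration count, and the test signal are illustrative choices, not from the paper):

```python
import numpy as np

def med_filter(x, filt_len=20, iters=30):
    """Iterative Minimum Entropy Deconvolution (Wiggins-style sketch):
    repeatedly solve (X^T X) f = X^T y^3, driving the output y = X f
    toward maximal kurtosis, i.e. a spiky, impulse-like output."""
    n = len(x)
    # Design matrix: row k holds the reversed input window so that y = X @ f.
    X = np.array([x[k - filt_len + 1:k + 1][::-1] for k in range(filt_len - 1, n)])
    R = X.T @ X
    f = np.zeros(filt_len)
    f[0] = 1.0                       # start from the identity filter
    for _ in range(iters):
        y = X @ f
        f = np.linalg.solve(R, X.T @ y**3)
        f /= np.linalg.norm(f)       # kurtosis is scale-invariant
    return f, X @ f

def kurtosis(v):
    return len(v) * np.sum(v**4) / np.sum(v**2) ** 2

# Synthetic "fault": a periodic impulse train smeared by an exponential decay.
spikes = np.zeros(500)
spikes[::50] = 1.0
x = np.convolve(spikes, np.exp(-np.arange(20) / 5.0))[:500]
f, y = med_filter(x)
```

    On this signal the deconvolved output is markedly spikier (higher kurtosis) than the input; MOMEDA replaces this iteration with a direct, non-iterative solution whose goal is a periodic impulse train rather than arbitrary spikes.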

  2. Numerical study of magnetic field on mixed convection and entropy generation of nanofluid in a trapezoidal enclosure

    NASA Astrophysics Data System (ADS)

    Aghaei, Alireza; Khorasanizadeh, Hossein; Sheikhzadeh, Ghanbarali; Abbaszadeh, Mahmoud

    2016-04-01

    Flows under the influence of a magnetic field are encountered in the cooling of electronic devices and voltage transformers, in nuclear reactors, in biochemistry, and in geological phenomena. In this study, the effects of a magnetic field on the flow field, heat transfer, and entropy generation of Cu-water nanofluid mixed convection in a trapezoidal enclosure have been investigated. The top lid is cold and moves toward the right or left, the bottom wall is hot, and the side walls are insulated, with angles from the horizontal of 15°, 30°, 45° and 60°. Simulations have been carried out for a constant Grashof number of 10^4, Reynolds numbers of 30, 100, 300 and 1000, Hartmann numbers of 25, 50, 75 and 100, and nanoparticle volume fractions from zero up to 0.04. The finite volume method and the SIMPLER algorithm have been utilized to solve the governing equations numerically. The results showed that imposing and strengthening the magnetic field weakens the nanofluid convection and the strength of the flow, so that the flow tends toward natural convection and finally toward pure conduction. For this reason, for all of the considered Reynolds numbers and volume fractions, increasing the Hartmann number decreases the average Nusselt number. Furthermore, for any case with constant Reynolds and Hartmann numbers, increasing the volume fraction of nanoparticles decreases the maximum stream function. For all of the studied cases, entropy generation due to friction is negligible; the total entropy generation is mainly due to irreversibility associated with heat transfer, and its variation with Hartmann number is similar to that of the average Nusselt number. A change in lid movement direction at a Reynolds number of 30 changes the average Nusselt number and total entropy generation, but at a Reynolds number of 1000 it has a negligible effect.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattari, Sulimon, E-mail: ssattari2@ucmerced.edu; Chen, Qianting, E-mail: qchen2@ucmerced.edu; Mitchell, Kevin A., E-mail: kmitchell@ucmerced.edu

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or "ghost," rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  4. Dynamic approximate entropy electroanatomic maps detect rotors in a simulated atrial fibrillation model.

    PubMed

    Ugarte, Juan P; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping.

  5. Experimental heat capacities, excess entropies, and magnetic properties of bulk and nano Fe3O4-Co3O4 and Fe3O4-Mn3O4 spinel solid solutions

    NASA Astrophysics Data System (ADS)

    Schliesser, Jacob M.; Huang, Baiyu; Sahu, Sulata K.; Asplund, Megan; Navrotsky, Alexandra; Woodfield, Brian F.

    2018-03-01

    We have measured the heat capacities of several well-characterized bulk and nanophase Fe3O4-Co3O4 and Fe3O4-Mn3O4 spinel solid solution samples from which magnetic properties of transitions and third-law entropies have been determined. The magnetic transitions show several features common to effects of particle and magnetic domain sizes. From the standard molar entropies, excess entropies of mixing have been generated for these solid solutions and compared with configurational entropies determined previously by assuming appropriate cation and valence distributions. The vibrational and magnetic excess entropies for bulk materials are comparable in magnitude to the respective configurational entropies indicating that excess entropies of mixing must be included when analyzing entropies of mixing. The excess entropies for nanophase materials are even larger than the configurational entropies. Changes in valence, cation distribution, bonding and microstructure between the mixing ions are the likely sources of the positive excess entropies of mixing.

  6. Mathematical model for thermal and entropy analysis of thermal solar collectors by using Maxwell nanofluids with slip conditions, thermal radiation and variable thermal conductivity

    NASA Astrophysics Data System (ADS)

    Aziz, Asim; Jamshed, Wasim; Aziz, Taha

    2018-04-01

    In the present research a simplified mathematical model for solar thermal collectors is considered in the form of a non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid along with slip and convective boundary conditions, and a comprehensive analysis of entropy generation in the system is carried out. The effects of thermal radiation and variable thermal conductivity are also included in the present model. The mathematical formulation is carried out through a boundary layer approach and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, the skin friction coefficient and the Nusselt number. The discussion concludes with the effect of various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and the rate of heat transfer at the boundary.

  7. Moisture sorption isotherms and thermodynamic properties of mexican mennonite-style cheese.

    PubMed

    Martinez-Monteagudo, Sergio I; Salais-Fierro, Fabiola

    2014-10-01

    Moisture adsorption isotherms of fresh and ripened Mexican Mennonite-style cheese were investigated using the static gravimetric method at 4, 8, and 12 °C in a water activity (aw) range of 0.08-0.96. These isotherms were modeled using the GAB, BET, Oswin and Halsey equations through weighted non-linear regression. All isotherms were sigmoid in shape, showing a type II BET isotherm, and the data were best described by the GAB model. The GAB model coefficients revealed that water adsorption by the cheese matrix is a multilayer process characterized by molecules that are strongly bound in the monolayer and molecules that are slightly structured in a multilayer. Using the GAB model, it was possible to estimate thermodynamic functions (net isosteric heat, differential entropy, integral enthalpy and entropy, and enthalpy-entropy compensation) as a function of moisture content. For both samples, the isosteric heat and differential entropy decreased exponentially with moisture content. The integral enthalpy gradually decreased with increasing moisture content after reaching a maximum value, while the integral entropy decreased with increasing moisture content after reaching a minimum value. A linear compensation was found between integral enthalpy and entropy, suggesting enthalpy-controlled adsorption. Determination of the moisture content-aw relationship yields important information for controlling the ripening, drying and storage operations, as well as for understanding the state of water within a cheese matrix.
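
    The GAB isotherm and the Clausius-Clapeyron estimate of net isosteric heat mentioned above take simple closed forms. The sketch below uses the standard GAB equation and a two-temperature Clausius-Clapeyron approximation; the parameter values in the test are illustrative, not the paper's fitted coefficients.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def gab_moisture(aw, m0, c, k):
    """Equilibrium moisture content from the GAB isotherm.

    m0: monolayer moisture content; c, k: GAB energy constants."""
    return m0 * c * k * aw / ((1 - k * aw) * (1 - k * aw + c * k * aw))

def isosteric_heat(aw1, t1, aw2, t2):
    """Net isosteric heat of sorption (J/mol) via Clausius-Clapeyron,
    from water activities aw1, aw2 at temperatures t1, t2 (K) taken at
    the same moisture content."""
    return R * math.log(aw2 / aw1) / (1.0 / t1 - 1.0 / t2)
```

    With two isotherm points at the same moisture content, `isosteric_heat` recovers the slope of ln(aw) versus 1/T, which is how the abstract's exponential decrease of isosteric heat with moisture would be computed point by point.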

  8. Quantum thermodynamics and quantum entanglement entropies in an expanding universe

    NASA Astrophysics Data System (ADS)

    Farahmand, Mehrnoosh; Mohammadzadeh, Hosein; Mehri-Dehnavi, Hossein

    2017-05-01

    We investigate an asymptotically spatially flat Robertson-Walker space-time from two different perspectives. First, using von Neumann entropy, we evaluate the entanglement generation due to the encoded information in space-time. Then, we work out the entropy of particle creation based on the quantum thermodynamics of the scalar field on the underlying space-time. We show that the general behavior of both entropies is the same. Therefore, entanglement can be applied to the customary quantum thermodynamics of the universe. Also, using these entropies, we can recover some information about the parameters of the space-time.
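
    The von Neumann entropy used here as an entanglement measure is S(ρ) = -Tr ρ ln ρ, evaluated over the eigenvalues of a reduced density matrix. A minimal sketch for a 2×2 density matrix with real entries (a full field-mode calculation would proceed analogously on larger matrices):

```python
import math

def von_neumann_entropy_2x2(rho):
    """S(rho) = -Tr rho ln rho for a 2x2 density matrix with real entries,
    via the closed-form eigenvalues of a symmetric 2x2 matrix."""
    a, b = rho[0][0], rho[1][1]
    c = rho[0][1]  # assumed real for this sketch
    tr, det = a + b, a * b - c * c
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]
    # 0 * ln 0 is taken as 0 (pure-state eigenvalues contribute nothing).
    return -sum(l * math.log(l) for l in eigs if l > 1e-12)
```

    A pure state gives S = 0 and the maximally mixed qubit gives S = ln 2, the two extremes between which the entanglement generated by the expanding space-time interpolates.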

  9. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co...Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and...Keywords: Chemical sensing; Information theory; Spectral data; Information entropy; Information divergence; Mass spectrometry; Infrared spectroscopy; Multisensor

  10. Campbell's Rule for Estimating Entropy Changes

    ERIC Educational Resources Information Center

    Jensen, William B.

    2004-01-01

    Campbell's rule for estimating entropy changes is discussed in relation to an earlier article by Norman Craig, where it was proposed that the approximate value of the entropy of reaction was related to net moles of gas consumed or generated. It was seen that the average for Campbell's data set was lower than that for Craig's data set and…

  11. An Integrated Theory of Everything (TOE)

    NASA Astrophysics Data System (ADS)

    Colella, Antonio

    2014-03-01

    An Integrated TOE unifies all known physical phenomena from the Planck cube to the Super Universe (multiverse). Each matter/force particle is represented by a Planck cube string. Any Super Universe object is a volume of contiguous Planck cubes. Super force Planck cube string singularities existed at the start of all universes. The foundations of an Integrated TOE are twenty independent existing theories, which, without sacrificing their integrities, are replaced by twenty interrelated amplified theories. Amplifications of Higgs force theory are key to an Integrated TOE and include: 64 supersymmetric Higgs particles; super force condensations to 17 matter particles/associated Higgs forces; bidirectional spontaneous symmetry breaking; and the sum of 8 permanent Higgs force energies being dark energy. Stellar black hole theory was amplified to include a quark star (matter) with mass, volume, near zero temperature, and maximum entropy. A black hole (energy) has energy, minimal volume (singularity), near infinite temperature, and minimum entropy. Our precursor universe's super supermassive quark star (matter) evaporated to a super supermassive black hole (energy). This transferred total conserved energy/mass and transformed entropy from maximum to minimum. Integrated Theory of Everything Book Video: https://www.youtube.com/watch?v=4a1c9IvdoGY Research Article Video: http://www.youtube.com/watch?v=CD-QoLeVbSY Research Article: http://toncolella.files.wordpress.com/2012/07/m080112.pdf.

  12. Entropy production and optimization of geothermal power plants

    NASA Astrophysics Data System (ADS)

    Michaelides, Efstathios E.

    2012-09-01

    Geothermal power plants are currently producing reliable and low-cost, base load electricity. Three basic types of geothermal power plants are currently in operation: single-flashing, dual-flashing, and binary power plants. Typically, the single-flashing and dual-flashing geothermal power plants utilize geothermal water (brine) at temperatures in the range of 550-430 K. Binary units utilize geothermal resources at lower temperatures, typically 450-380 K. The entropy production in the various components of the three types of geothermal power plants determines the efficiency of the plants. It is axiomatic that a lower entropy production would improve significantly the energy utilization factor of the corresponding power plant. For this reason, the entropy production in the major components of the three types of geothermal power plants has been calculated. It was observed that binary power plants generate the lowest amount of entropy and, thus, convert the highest rate of geothermal energy into mechanical energy. The single-flashing units generate the highest amount of entropy, primarily because they re-inject fluid at relatively high temperature. The calculations for entropy production provide information on the equipment where the highest irreversibilities occur, and may be used to optimize the design of geothermal processes in future geothermal power plants and thermal cycles used for the harnessing of geothermal energy.
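
    The component-wise entropy production described above reduces, for heat transferred across a finite temperature difference, to S_gen = Q(1/T_cold - 1/T_hot), and the Gouy-Stodola theorem converts that to destroyed work. A minimal sketch with illustrative numbers (not plant data):

```python
def entropy_generation(q_dot, t_hot, t_cold):
    """Entropy generation rate (W/K) for heat flow q_dot (W) across a finite
    temperature difference, the basic irreversibility in flash/binary plants."""
    return q_dot * (1.0 / t_cold - 1.0 / t_hot)

def lost_work(t_dead, s_gen):
    """Gouy-Stodola theorem: mechanical work destroyed (W) equals the
    dead-state temperature T0 (K) times the entropy generation rate."""
    return t_dead * s_gen
```

    Shrinking the temperature gap across a component (e.g. re-injecting brine at a lower temperature relative to the resource) reduces S_gen and thus the lost work, which is the optimization lever the abstract describes.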

  13. Maximum-Entropy Inference with a Programmable Annealer

    PubMed Central

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-01-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition. PMID:26936311
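
    The finite-temperature maximum-entropy decoding described above amounts to taking the sign of each bit's Boltzmann-weighted magnetization rather than reading off the ground state alone. A brute-force sketch on a toy Ising chain, where exact enumeration stands in for the annealer's sampling; the coupling and field values are made up for illustration:

```python
import itertools
import math

def ising_energy(spins, J, h):
    """Energy of a 1-D Ising chain with uniform coupling J and local fields h."""
    e = -sum(hi * s for hi, s in zip(h, spins))
    e -= J * sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))
    return e

def marginal_decode(J, h, T):
    """Maximum-entropy (finite-temperature) decoding: exact Boltzmann
    magnetizations <s_i> by enumeration, then the sign of each marginal."""
    n = len(h)
    Z, m = 0.0, [0.0] * n
    for spins in itertools.product((-1, 1), repeat=n):
        w = math.exp(-ising_energy(spins, J, h) / T)
        Z += w
        for i, s in enumerate(spins):
            m[i] += w * s
    mags = [mi / Z for mi in m]
    return [1 if mi >= 0 else -1 for mi in mags], mags
```

    At low temperature the marginals saturate and the decoding coincides with the maximum-likelihood (ground-state) answer; at high temperature the excited states wash the marginals toward zero, which is the regime in which the paper finds extra information in the annealer's samples.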

  14. Sorption isotherms, thermodynamic properties and glass transition temperature of mucilage extracted from chia seeds (Salvia hispanica L.).

    PubMed

    Velázquez-Gutiérrez, Sandra Karina; Figueira, Ana Cristina; Rodríguez-Huezo, María Eva; Román-Guerrero, Angélica; Carrillo-Navas, Hector; Pérez-Alonso, César

    2015-05-05

    Freeze-dried chia mucilage adsorption isotherms were determined at 25, 35 and 40°C and fitted with the Guggenheim-Anderson-de Boer model. The integral thermodynamic properties (enthalpy and entropy) were estimated with the Clausius-Clapeyron equation. The pore radius of the mucilage, calculated with the Kelvin equation, varied from 0.87 to 6.44 nm in the temperature range studied. The point of maximum stability (minimum integral entropy) ranged between 7.56 and 7.63 kg H2O per 100 kg of dry solids (d.s.) (water activity of 0.34-0.53). Enthalpy-entropy compensation for the mucilage showed two isokinetic temperatures: (i) one occurring at low moisture contents (0-7.56 kg H2O per 100 kg d.s.), controlled by changes in water entropy; and (ii) another in the moisture interval of 7.56-24 kg H2O per 100 kg d.s. that was enthalpy driven. The glass transition temperature Tg of the mucilage fluctuated between 42.93 and 57.93°C.

  16. Study of thermodynamic properties of liquid binary alloys by a pseudopotential method

    NASA Astrophysics Data System (ADS)

    Vora, Aditya M.

    2010-11-01

    On the basis of the Percus-Yevick hard-sphere model as a reference system and the Gibbs-Bogoliubov inequality, a thermodynamic perturbation method is applied with the use of the well-known model potential. By applying a variational method, the hard-core diameters are found which correspond to a minimum free energy. With this procedure, the thermodynamic properties such as the internal energy, entropy, Helmholtz free energy, entropy of mixing, and heat of mixing are computed for liquid NaK binary systems. The influence of the local-field correction functions of Hartree, Taylor, Ichimaru-Utsumi, Farid-Heine-Engel-Robertson, and Sarkar-Sen-Haldar-Roy is also investigated. The computed excess entropy is in agreement with available experimental data in the case of liquid alloys, whereas the agreement for the heat of mixing is poor. This may be due to the sensitivity of the latter to the potential parameters and dielectric function.

  17. a Maximum Entropy Model of the Bearded Capuchin Monkey Habitat Incorporating Topography and Spectral Unmixing Analysis

    NASA Astrophysics Data System (ADS)

    Howard, A. M.; Bernardes, S.; Nibbelink, N.; Biondi, L.; Presotto, A.; Fragaszy, D. M.; Madden, M.

    2012-07-01

    Movement patterns of bearded capuchin monkeys (Cebus (Sapajus) libidinosus) in northeastern Brazil are likely impacted by environmental features such as elevation, vegetation density, or vegetation type. Habitat preferences of these monkeys provide insights regarding the impact of environmental features on species ecology and the degree to which they incorporate these features in movement decisions. In order to evaluate environmental features influencing movement patterns and predict areas suitable for movement, we employed a maximum entropy modelling approach, using observation points along capuchin monkey daily routes as species presence points. We combined these presence points with spatial data on important environmental features from remotely sensed data on land cover and topography. A spectral mixing analysis procedure was used to generate fraction images that represent green vegetation, shade and soil of the study area. A Landsat Thematic Mapper scene of the area of study was geometrically and atmospherically corrected and used as input in a Minimum Noise Fraction (MNF) procedure and a linear spectral unmixing approach was used to generate the fraction images. These fraction images and elevation were the environmental layer inputs for our logistic MaxEnt model of capuchin movement. Our models' predictive power (test AUC) was 0.775. Areas of high elevation (>450 m) showed low probabilities of presence, and percent green vegetation was the greatest overall contributor to model AUC. This work has implications for predicting daily movement patterns of capuchins in our field site, as suitability values from our model may relate to habitat preference and facility of movement.

  18. Entropy generation in a second grade magnetohydrodynamic nanofluid flow over a convectively heated stretching sheet with nonlinear thermal radiation and viscous dissipation

    NASA Astrophysics Data System (ADS)

    Sithole, Hloniphile; Mondal, Hiranmoy; Sibanda, Precious

    2018-06-01

    This study addresses entropy generation in the magnetohydrodynamic flow of a second grade nanofluid over a convectively heated stretching sheet with nonlinear thermal radiation and viscous dissipation. The second grade fluid is assumed to be electrically conducting and is permeated by an applied non-uniform magnetic field. We further consider the impact of homogeneous-heterogeneous reactions and a convective boundary condition on the fluid properties and the Nusselt number. The mathematical equations are solved using the spectral local linearization method. Computations for the skin-friction coefficient and local Nusselt number are carried out and displayed in a table. It is observed that the effect of the thermophoresis parameter is to increase the temperature distribution throughout the boundary layer. The entropy generation is enhanced by larger magnetic parameters and increasing Reynolds numbers. The aim of this manuscript is to give closer attention to entropy generation analysis of heat and fluid flow in second grade nanofluids so as to improve system performance. The fluid velocity and temperature in the boundary layer region also rise significantly with increasing values of the second grade nanofluid parameter.

  19. Operational safety assessment of turbo generators with wavelet Rényi entropy from sensor-dependent vibration signals.

    PubMed

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-04-16

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects time-varying operational characteristic of individual machinery. Derived from the sensor-dependent signals' wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied in a 50 MW turbo generator, whereupon it is proved to be reasonable and effective for operation and maintenance.
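
    The Rényi entropy step of this method, applied to a wavelet energy distribution, is straightforward once the band energies are in hand. The sketch below assumes the second generation wavelet packet decomposition has already produced per-band energies (the decomposition itself is omitted); only the normalization and entropy computation are shown.

```python
import math

def normalized_energy(band_energies):
    """Normalize per-band wavelet energies into a probability distribution."""
    total = sum(band_energies)
    return [e / total for e in band_energies]

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha for a normalized distribution p.
    alpha -> 1 recovers the Shannon entropy as a limit."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)
```

    Energy spread evenly across bands (high operational uncertainty) gives entropy near log(N), while energy concentrated in one band gives a value near zero; the safety degree in the paper is derived from this uncertainty measure.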

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luis, Alfredo

    The use of Renyi entropy as an uncertainty measure alternative to variance leads to the study of states with quantum fluctuations below the levels established by Gaussian states, which are the position-momentum minimum uncertainty states according to variance. We examine the quantum properties of states with exponential wave functions, which combine reduced fluctuations with practical feasibility.

  1. Entropy production of a Brownian ellipsoid in the overdamped limit.

    PubMed

    Marino, Raffaele; Eichhorn, Ralf; Aurell, Erik

    2016-01-01

    We analyze the translational and rotational motion of an ellipsoidal Brownian particle from the viewpoint of stochastic thermodynamics. The particle's Brownian motion is driven by external forces and torques and takes place in a heterogeneous thermal environment where friction coefficients and (local) temperature depend on space and time. Our analysis of the particle's stochastic thermodynamics is based on the entropy production associated with single particle trajectories. It is motivated by the recent discovery that the overdamped limit of vanishing inertia effects (as compared to viscous friction) produces a so-called "anomalous" contribution to the entropy production, which has no counterpart in the overdamped approximation, when inertia effects are simply discarded. Here we show that rotational Brownian motion in the overdamped limit generates an additional contribution to the "anomalous" entropy. We calculate its specific form by performing a systematic singular perturbation analysis for the generating function of the entropy production. As a side result, we also obtain the (well-known) equations of motion in the overdamped limit. We furthermore investigate the effects of particle shape and give explicit expressions of the "anomalous entropy" for prolate and oblate spheroids and for near-spherical Brownian particles.

  2. Information dynamics in living systems: prokaryotes, eukaryotes, and cancer.

    PubMed

    Frieden, B Roy; Gatenby, Robert A

    2011-01-01

    Living systems use information and energy to maintain stable entropy while far from thermodynamic equilibrium. The underlying first principles have not been established. We propose that stable entropy in living systems, in the absence of thermodynamic equilibrium, requires an information extremum (maximum or minimum), which is invariant to first order perturbations. Proliferation and death represent key feedback mechanisms that promote stability even in a non-equilibrium state. A system moves to low or high information depending on its energy status, as the benefit of information in maintaining and increasing order is balanced against its energy cost. Prokaryotes, which lack specialized energy-producing organelles (mitochondria), are energy-limited and constrained to an information minimum. Acquisition of mitochondria is viewed as a critical evolutionary step that, by allowing eukaryotes to achieve a sufficiently high energy state, permitted a phase transition to an information maximum. This state, in contrast to the prokaryote minima, allowed evolution of complex, multicellular organisms. A special case is a malignant cell, which is modeled as a phase transition from a maximum to minimum information state. The minimum leads to a predicted power-law governing the in situ growth that is confirmed by studies measuring growth of small breast cancers. We find living systems achieve a stable entropic state by maintaining an extreme level of information. The evolutionary divergence of prokaryotes and eukaryotes resulted from acquisition of specialized energy organelles that allowed transition from information minima to maxima, respectively. Carcinogenesis represents a reverse transition: from an information maximum to minimum. The progressive information loss is evident in the accumulating mutations, disordered morphology, and functional decline characteristic of human cancers. The findings suggest energy restriction is a critical first step that triggers the genetic mutations that drive somatic evolution of the malignant phenotype.

  3. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S.

    1989-05-15

    Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value.

  4. On Use of Multi-Chambered Fission Detectors for In-Core, Neutron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Roberts, Jeremy A.

    2018-01-01

    Presented is a short, computational study on the potential use of multichambered fission detectors for in-core, neutron spectroscopy. Motivated by the development of very small fission chambers at CEA in France and at Kansas State University in the U.S., it was assumed in this preliminary analysis that devices can be made small enough to avoid flux perturbations and that uncertainties related to measurements can be ignored. It was hypothesized that a sufficient number of chambers with unique reactants can act as a real-time, foil-activation experiment. An unfolding scheme based on maximizing (Shannon) entropy was used to produce a flux spectrum from detector signals that requires no prior information. To test the method, integral detector responses were generated for single-isotope detectors of various Th, U, Np, Pu, Am, and Cs isotopes using a simplified, pressurized-water reactor spectrum and flux-weighted, microscopic, fission cross sections in the WIMS-69 multigroup format. An unfolded spectrum was found from subsets of these responses that had a maximum entropy while reproducing the responses considered and summing to one (that is, they were normalized). Several nuclide subsets were studied, and, as expected, the results indicate inclusion of more nuclides leads to better spectra but with diminishing improvements, with the best-case spectrum having an average, relative, group-wise error of approximately 51%. Furthermore, spectra found from minimum-norm and Tikhonov-regularization inversion were of lower quality than the maximum entropy solutions. Finally, the addition of thermal-neutron filters (here, Cd and Gd) provided substantial improvement over unshielded responses alone. The results, as a whole, suggest that in-core, neutron spectroscopy is at least marginally feasible.
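
    The unfolding scheme can be sketched as constrained entropy maximization: the entropy-maximizing normalized flux takes an exponential form in the detector responses, phi_g proportional to exp(-sum_i lam_i R[i][g]), and the multipliers lam can be found by gradient iteration on the convex dual. The toy example below uses a made-up 2-detector, 4-group response matrix, not the WIMS-69 structure or real fission cross sections.

```python
import math

def maxent_unfold(R, d, lr=0.3, iters=2000):
    """Maximum-entropy unfolding: normalized group fluxes phi maximizing
    Shannon entropy subject to the detector constraints R @ phi = d.
    Solved via gradient iteration on the Lagrange multipliers."""
    n_det, n_grp = len(R), len(R[0])
    lam = [0.0] * n_det
    phi = [1.0 / n_grp] * n_grp
    for _ in range(iters):
        # Exponential-family flux implied by the current multipliers.
        logits = [-sum(lam[i] * R[i][g] for i in range(n_det))
                  for g in range(n_grp)]
        mx = max(logits)
        w = [math.exp(l - mx) for l in logits]
        z = sum(w)
        phi = [wi / z for wi in w]
        # Dual gradient step: push predicted responses toward the data d.
        pred = [sum(R[i][g] * phi[g] for g in range(n_grp))
                for i in range(n_det)]
        lam = [lam[i] + lr * (pred[i] - d[i]) for i in range(n_det)]
    return phi
```

    The returned spectrum reproduces the measured responses and sums to one, exactly the two conditions the abstract imposes; with only a few responses it is the flattest (highest-entropy) spectrum consistent with them, not necessarily the true one.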

  5. Magnetization dynamics of single-domain nanodots and minimum energy dissipation during either irreversible or reversible switching

    NASA Astrophysics Data System (ADS)

    Madami, Marco; Gubbiotti, Gianluca; Tacchi, Silvia; Carlotti, Giovanni

    2017-11-01

    Single- or multi-layered planar magnetic dots, with lateral dimensions ranging from tens to hundreds of nanometers, are used as elemental switches in current and forthcoming devices for information and communication technology (ICT), including magnetic memories, spin-torque oscillators and nano-magnetic logic gates. In this review article, we will first discuss energy dissipation during irreversible switching protocols of dots of different dimensions, ranging from a few tens of nanometers to the micrometric range. Then we will focus on the fundamental energy limits of adiabatic (slow) erasure and reversal of a magnetic nanodot, showing that dissipationless operation is achievable, provided that both dynamic reversibility (arbitrarily slow application of external fields) and entropic reversibility (no free entropy increase) are ensured. However, recent theoretical and experimental tests of magnetic-dot erasure reveal that intrinsic defects related to material imperfections, such as roughness or polycrystallinity, may cause an excess of dissipation compared to the minimum theoretical limit. We will conclude by providing an outlook on the most promising strategies to achieve a new generation of power-saving nanomagnetic logic devices based on clusters of interacting dots and on straintronics.

  6. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography.

    PubMed

    Vassilev, Apostol; Staples, Robert

    2016-09-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised.

  7. Texture descriptions of lunar surface derived from LOLA data: Kilometer-scale roughness and entropy maps

    NASA Astrophysics Data System (ADS)

    Li, Bo; Ling, Zongcheng; Zhang, Jiang; Chen, Jian; Wu, Zhongchen; Ni, Yuheng; Zhao, Haowei

    2015-11-01

    The lunar global texture maps of roughness and entropy are derived at kilometer scales from Digital Elevation Model (DEM) data obtained by the Lunar Orbiter Laser Altimeter (LOLA) aboard the Lunar Reconnaissance Orbiter (LRO) spacecraft. We use statistical moments of a gray-level histogram of elevations in a neighborhood to compute the roughness and entropy values. Our texture descriptor measurements are shown in global maps at multi-sized square neighborhoods, whose side lengths are 3, 5, 10, 20, 40 and 80 pixels, respectively. We found that large-scale topographical changes can only be displayed in maps with a longer neighborhood side, while the small-scale global texture maps are more disorderly and unsystematic because of more complicated texture details. The frequency curves of the texture maps were then computed; their shapes and distributions change as the spatial scale increases. The entropy frequency curve at the minimum 3-pixel scale has large fluctuations and six peaks. According to this entropy curve we can preliminarily classify the lunar surface into maria, highlands, and different parts of craters. The most obvious textures in the middle-scale roughness and entropy maps are the two typical morphological units, smooth maria and rough highlands. For an impact crater, the roughness and entropy values are clearly characterized by a multiple-ring structure, and its different parts have different texture results. Finally, we made a 2D scatter plot between the two texture results of typical lunar maria and highlands. There are two clusters with the largest dot density, corresponding to the lunar highlands and maria separately. In the lunar mare regions (cluster A), there is a high correlation between roughness and entropy, but in the highlands (cluster B), the entropy shows little change. This could be attributed to the different geological processes of maria and highlands forming different landforms.
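
    The "statistical moments of a gray-level histogram" computation described above can be sketched directly: roughness as the standard deviation (second moment) of elevations in a neighborhood and entropy as the Shannon entropy of their histogram. A minimal sketch for a small, already-quantized DEM; the window size and quantization are illustrative, not the paper's processing chain.

```python
import math

def local_texture(dem, i, j, half=1):
    """Roughness (standard deviation) and Shannon entropy (bits) of the
    gray-level histogram in a (2*half+1)^2 neighborhood of pixel (i, j),
    clipped at the DEM borders."""
    rows, cols = len(dem), len(dem[0])
    vals = [dem[r][c]
            for r in range(max(0, i - half), min(rows, i + half + 1))
            for c in range(max(0, j - half), min(cols, j + half + 1))]
    n = len(vals)
    mean = sum(vals) / n
    roughness = math.sqrt(sum((v - mean) ** 2 for v in vals) / n)
    hist = {}
    for v in vals:
        hist[v] = hist.get(v, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    return roughness, entropy

def texture_maps(dem, half=1):
    """Apply local_texture at every pixel to build roughness/entropy maps."""
    return [[local_texture(dem, i, j, half) for j in range(len(dem[0]))]
            for i in range(len(dem))]
```

    A flat patch yields zero roughness and zero entropy, while a checkerboard patch yields high values of both, which is the smooth-maria versus rough-highlands contrast the global maps display.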

  8. Entropy Generation/Availability Energy Loss Analysis Inside MIT Gas Spring and "Two Space" Test Rigs

    NASA Technical Reports Server (NTRS)

    Ebiana, Asuquo B.; Savadekar, Rupesh T.; Patel, Kaushal V.

    2006-01-01

    The results of the entropy generation and availability energy loss analysis under conditions of oscillating pressure and oscillating helium gas flow in two Massachusetts Institute of Technology (MIT) test rigs, piston-cylinder and piston-cylinder-heat exchanger, are presented. Two solution domains, the gas spring (single-space) in the piston-cylinder test rig and the gas spring + heat exchanger (two-space) in the piston-cylinder-heat exchanger test rig, are of interest. Sage and CFD-ACE+ commercial numerical codes are used to obtain 1-D and 2-D computer models, respectively, of each of the two solution domains and to simulate the oscillating gas flow and heat transfer effects in these domains. Second law analysis is used to characterize the entropy generation and availability energy losses inside the two solution domains. Internal and external entropy generation and availability energy loss results predicted by Sage and CFD-ACE+ are compared. Thermodynamic loss analysis of simple systems such as the MIT test rigs is often useful for understanding some important features of complex pattern-forming processes in more complex systems like the Stirling engine. This study is aimed at improving numerical codes for the prediction of thermodynamic losses via the development of a loss post-processor. The incorporation of loss post-processors in Stirling engine numerical codes will facilitate Stirling engine performance optimization. Loss analysis using entropy-generation rates due to heat and fluid flow is a relatively new technique for assessing component performance. It offers deep insight into the flow phenomena, allows a more exact calculation of losses than is possible with traditional means involving the application of loss correlations, and provides an effective tool for improving component and overall system performance.

  9. A computational study of entropy generation in magnetohydrodynamic flow and heat transfer over an unsteady stretching permeable sheet

    NASA Astrophysics Data System (ADS)

    Saeed Butt, Adnan; Ali, Asif

    2014-01-01

    The present article aims to investigate the entropy effects in magnetohydrodynamic flow and heat transfer over an unsteady permeable stretching surface. The time-dependent partial differential equations are converted into non-linear ordinary differential equations by suitable similarity transformations. The solutions of these equations are computed analytically by the Homotopy Analysis Method (HAM) and then numerically by a MATLAB built-in routine. The obtained results are compared with the existing literature under limiting cases to validate our study. The effects of the unsteadiness parameter, magnetic field parameter, suction/injection parameter, Prandtl number, group parameter and Reynolds number on flow and heat transfer characteristics are examined and analysed with the aid of graphs and tables. Moreover, the effects of these parameters on the entropy generation number and Bejan number are also shown graphically. It is found that unsteadiness and the presence of a magnetic field augment the entropy production.

  10. Double strand breaks may be a missing link between entropy and aging.

    PubMed

    Lenart, Peter; Bienertová-Vašků, Julie

    2016-07-01

    It has been previously suggested that an increase in entropy production leads to aging. However, the mechanisms linking increased entropy production in living mass to aging are currently unclear. Even though entropy cannot be easily associated with any specific molecular damage, the increase of entropy in structural mass may be connected with heat stress, which is known to generate double strand breaks. Double strand breaks, which are in turn known to play an important role in the process of aging, are thus connected to both aging and an increase of entropy. In view of these associations, we propose a new model in which the increase of entropy leads to the formation of double strand breaks, resulting in an aging phenotype. This not only offers a new perspective on aging research and facilitates experimental validation, but could also serve as a useful explanatory tool. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Noisy EEG signals classification based on entropy metrics. Performance assessment using first and second generation statistics.

    PubMed

    Cuesta-Frau, David; Miró-Martínez, Pau; Jordán Núñez, Jorge; Oltra-Crespo, Sandra; Molina Picó, Antonio

    2017-08-01

    This paper evaluates the performance of first generation entropy metrics, featured by the well-known and widely used Approximate Entropy (ApEn) and Sample Entropy (SampEn) metrics, and what can be considered an evolution of these, Fuzzy Entropy (FuzzyEn), in the Electroencephalogram (EEG) signal classification context. The study uses the commonest artifacts found in real EEGs, such as white noise, and muscular, cardiac, and ocular artifacts. Using two different sets of publicly available EEG records, and a realistic range of amplitudes for interfering artifacts, this work optimises and assesses the robustness of these metrics against artifacts in terms of class segmentation probability. The results show that the qualitative behaviour of the two datasets is similar, with SampEn and FuzzyEn performing best, and that noise and muscular artifacts are the most confounding factors. By contrast, there is wide variability with regard to initialization parameters. The poor performance achieved by ApEn suggests that this metric should not be used in these contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.
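
    For reference, a minimal Sample Entropy sketch: SampEn(m, r) = -ln(A/B), where B counts template pairs of length m within Chebyshev tolerance r (self-matches excluded) and A counts pairs of length m+1. The defaults m=2, r=0.2 below are common conventions, not the values tuned in this study.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B) for a scalar series x.
    B: matching pairs of length-m templates (Chebyshev distance <= r,
    no self-matches); A: same count for templates of length m+1."""
    n = len(x)

    def matches(length):
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    c += 1
        return c

    b = matches(m)
    a = matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

    Even a perfectly regular series yields a small positive value at finite length (ln(4/3) for ten equal samples with these defaults), which is one reason parameter initialization matters in the comparison above.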

  12. Determination of LEDs degradation with entropy generation rate

    NASA Astrophysics Data System (ADS)

    Cuadras, Angel; Yao, Jiaqiang; Quilez, Marcos

    2017-10-01

    We propose a method to assess the degradation and aging of light emitting diodes (LEDs) based on the irreversible entropy generation rate. We degraded several LEDs and monitored their entropy generation rate (Ṡ) in accelerated tests. We compared the thermoelectrical results with the evolution of optical light emission during degradation. We find a good relationship between aging and Ṡ(t), because Ṡ is related both to device parameters and to optical performance. We propose a threshold of Ṡ(t) as a reliable damage indicator of LED end-of-life that can avoid the need to perform optical measurements to assess optical aging. The method goes beyond the typical statistical laws for lifetime prediction provided by manufacturers. We tested different LED colors and electrical stresses to validate the electrical LED model, and we analyzed the degradation mechanisms of the devices.

  13. An information-theoretical perspective on weighted ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Weijs, Steven V.; van de Giesen, Nick

    2013-08-01

    This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
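
    As an illustration of the minimum relative entropy idea, the sketch below tilts uniform ensemble weights so the weighted mean matches a single forecast value: the minimizer of relative entropy under a linear constraint is an exponential tilt of the prior weights. The bisection solver and the one-moment constraint are simplifying assumptions; the paper's forecast constraints are more general.

```python
import math

def mre_update(values, target_mean, tol=1e-12):
    """Minimum relative entropy update of uniform weights on an ensemble
    so its weighted mean equals a forecast value.  The minimizer is an
    exponential tilt w_i proportional to exp(lam * x_i); lam is found
    by bisection on the (monotone) tilted mean."""
    if not (min(values) < target_mean < max(values)):
        raise ValueError("target mean must lie inside the ensemble range")

    def tilted(lam):
        shift = max(lam * v for v in values)       # numerical stability
        ws = [math.exp(lam * v - shift) for v in values]
        s = sum(ws)
        return [w / s for w in ws]

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        w = tilted(mid)
        if sum(wi * vi for wi, vi in zip(w, values)) < target_mean:
            lo = mid                               # tilt harder upward
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))
```

    Shifting the mean of a five-member ensemble {1,...,5} up to 4 up-weights the larger members while keeping the weights positive and normalized, adding no information beyond the imposed constraint.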

  14. Spatial chaos of Wang tiles with two symbols

    NASA Astrophysics Data System (ADS)

    Chen, Jin-Yu; Chen, Yu-Jie; Hu, Wen-Guei; Lin, Song-Sun

    2016-02-01

    This investigation completely classifies the spatial chaos problem in plane edge coloring (Wang tiles) with two symbols. For a set of Wang tiles B, spatial chaos occurs when the spatial entropy h(B) is positive. B is called a minimal cycle generator if P(B) ≠ ∅ and P(B′) = ∅ whenever B′ ⊊ B, where P(B) is the set of all periodic patterns on ℤ² generated by B. Given a set of Wang tiles B, write B = C1 ∪ C2 ∪ ⋯ ∪ Ck ∪ N, where the Cj, 1 ≤ j ≤ k, are minimal cycle generators and B contains no minimal cycle generator except those contained in C1 ∪ C2 ∪ ⋯ ∪ Ck. Then, the positivity of the spatial entropy h(B) is completely determined by C1 ∪ C2 ∪ ⋯ ∪ Ck. Furthermore, there are 39 equivalence classes of marginal positive-entropy (MPE) sets of Wang tiles and 18 equivalence classes of saturated zero-entropy (SZE) sets of Wang tiles. For a set of Wang tiles B, h(B) is positive if and only if B contains an MPE set, and h(B) is zero if and only if B is a subset of an SZE set.

  15. MHD nanofluid free convection and entropy generation in porous enclosures with different conductivity ratios

    NASA Astrophysics Data System (ADS)

    Ghasemi, Kasra; Siavashi, Majid

    2017-11-01

    MHD natural convection of Cu-water nanofluid in a square porous enclosure is investigated using a parallel LBM code, considering temperature dependence of viscosity and viscous dissipation. Effects of nanofluid concentration (φ = 0-0.12), Rayleigh number (Ra = 10³-10⁶), Hartmann number (Ha = 0-20) and porous-fluid thermal conductivity ratio (K* = 1-70) on heat transfer and entropy generation are investigated. It is shown that K* is a very important parameter: porous media with low K* can confine convection effects, but by increasing K* both conduction and convection effects can substantially improve. Also, the magnetic field always has a negative impact on Nu; however, this impact can be controlled by φ and K*. A magnetic instability has also been observed at Ra = 10⁴, where Nu exhibits a sinusoidal variation with Ha. It is shown that, depending on the K*, Ra and Ha values, the use of nanofluid with porous media to enhance heat transfer can be either beneficial or detrimental. Also, for given K*, Ra and Ha numbers an optimal φ exists to improve heat transfer. Finally, an entropy generation study is performed; the results indicate that at low and high Ra values the thermal and frictional entropy generation are dominant, respectively, while for moderate Ra they have the same order of magnitude.

  16. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography

    PubMed Central

    Vassilev, Apostol; Staples, Robert

    2016-01-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised. PMID:28003687

  17. Free Energy Landscape of Protein-Protein Encounter Resulting from Brownian Dynamics Simulations of Barnase:Barstar.

    PubMed

    Spaar, Alexander; Helms, Volkhard

    2005-07-01

    Over the past years, Brownian dynamics (BD) simulations have proven to be a suitable tool for the analysis of protein-protein association. The computed rates and relative trends for protein mutants and different ionic strengths are generally in good agreement with experimental results, e.g. see ref 1. By design, BD simulations correspond to an intensive sampling over energetically favorable states, rather than to a systematic sampling over all possible states, which is feasible only at rather low resolution. Using the example of barnase and barstar, a well characterized model system of electrostatically steered diffusional encounter, we report here the computation of the 6-dimensional free energy landscape for the encounter process of two proteins by a novel, careful analysis of the trajectories from BD simulations. The aim of these studies was the clarification of the encounter state. Along the trajectories, the individual positions and orientations of one protein (relative to the other) are recorded and stored in so-called occupancy maps. Since the number of simulated trajectories is sufficiently high, these occupancy maps can be interpreted as a probability distribution, which allows the calculation of the entropy landscape by the use of a locally defined entropy function. Additionally, the configuration-dependent electrostatic and desolvation energies are recorded in separate maps. The free energy landscape of protein-protein encounter is finally obtained by summing the energy and entropy contributions. In the free energy profile along the reaction path, which is defined as the path along the minima in the free energy landscape, a minimum shows up, suggesting that it be used as the definition of the encounter state. This minimum describes a state of reduced diffusion velocity where the electrostatic attraction is compensated by the repulsion due to the unfavorable desolvation of the charged residues and the entropy loss due to the increasing restriction of the motional freedom. In the simulations, the orientational degrees of freedom at the encounter state are found to be less restricted than the translational degrees of freedom. Therefore, the orientational alignment of the two binding partners seems to take place beyond this free energy minimum. The free energy profiles along the reaction pathway are compared for different ionic strengths and temperatures. This novel analysis technique facilitates mechanistic interpretation of protein-protein encounter pathways, which should be useful for the interpretation of experimental results as well.

  18. Efficient Computation of Small-Molecule Configurational Binding Entropy and Free Energy Changes by Ensemble Enumeration

    PubMed Central

    2013-01-01

    Here we present a novel, end-point method using the dead-end-elimination and A* algorithms to efficiently and accurately calculate the change in free energy, enthalpy, and configurational entropy of binding for ligand–receptor association reactions. We apply the new approach to the binding of a series of human immunodeficiency virus (HIV-1) protease inhibitors to examine the effect ensemble reranking has on relative accuracy as well as to evaluate the role of the absolute and relative ligand configurational entropy losses upon binding in affinity differences for structurally related inhibitors. Our results suggest that most thermodynamic parameters can be estimated using only a small fraction of the full configurational space, and we see significant improvement in relative accuracy when using an ensemble versus single-conformer approach to ligand ranking. We also find that using approximate metrics based on the single-conformation enthalpy differences between the global minimum energy configuration in the bound as well as unbound states also correlates well with experiment. Using a novel, additive entropy expansion based on conditional mutual information, we also analyze the source of ligand configurational entropy loss upon binding in terms of both uncoupled per degree of freedom losses as well as changes in coupling between inhibitor degrees of freedom. We estimate entropic free energy losses of approximately +24 kcal/mol, 12 kcal/mol of which stems from loss of translational and rotational entropy. Coupling effects contribute only a small fraction to the overall entropy change (1–2 kcal/mol) but suggest differences in how inhibitor dihedral angles couple to each other in the bound versus unbound states. The importance of accounting for flexibility in drug optimization and design is also discussed. PMID:24250277

  19. Statistical physics of self-replication.

    PubMed

    England, Jeremy L

    2013-09-28

    Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.

  20. Benford's law and the FSD distribution of economic behavioral micro data

    NASA Astrophysics Data System (ADS)

    Villas-Boas, Sofia B.; Fu, Qiuzi; Judge, George

    2017-11-01

    In this paper, we focus on the first significant digit (FSD) distribution of European micro income data and use information-theoretic, entropy-based methods to investigate the degree to which Benford's FSD law is consistent with the nature of these economic behavioral systems. We demonstrate that Benford's law is not an empirical phenomenon that occurs only in important distributions in physical statistics, but that it also arises in self-organizing dynamic economic behavioral systems. The empirical likelihood member of the minimum divergence-entropy family is used to recover country-based income FSD probability density functions and to demonstrate the implications of using a Benford prior reference distribution in economic behavioral system information recovery.
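
    A minimal sketch of the FSD comparison: extract first significant digits, form the empirical distribution, and measure its divergence from the Benford reference P(d) = log10(1 + 1/d). KL divergence is used below as an illustrative stand-in for the paper's empirical likelihood criterion.

```python
import math

# Benford reference distribution over leading digits 1-9
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_significant_digit(x):
    """Leading nonzero digit of a nonzero number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def fsd_divergence(data):
    """KL divergence of the empirical FSD distribution from Benford's
    law; zero means a perfect Benford fit."""
    counts = {d: 0 for d in range(1, 10)}
    for x in data:
        counts[first_significant_digit(x)] += 1
    n = len(data)
    return sum((counts[d] / n) * math.log((counts[d] / n) / BENFORD[d])
               for d in range(1, 10) if counts[d] > 0)
```

    Powers of 2 are a classic Benford-conforming sequence and give a divergence near zero, while uniformly distributed leading digits give a clearly positive divergence.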

  1. Scaling laws for ignition at the National Ignition Facility from first principles.

    PubMed

    Cheng, Baolian; Kwan, Thomas J T; Wang, Yi-Ming; Batha, Steven H

    2013-10-01

    We have developed an analytical physics model from fundamental physics principles and used the reduced one-dimensional model to derive a thermonuclear ignition criterion and implosion energy scaling laws applicable to inertial confinement fusion capsules. The scaling laws relate the fuel pressure and the minimum implosion energy required for ignition to the peak implosion velocity and the equation of state of the pusher and the hot fuel. When a specific low-entropy adiabat path is used for the cold fuel, our scaling laws recover the ignition threshold factor dependence on the implosion velocity, but when a high-entropy adiabat path is chosen, the model agrees with recent measurements.

  2. Minimax Quantum Tomography: Estimators and Relative Entropy Bounds.

    PubMed

    Ferrie, Christopher; Blume-Kohout, Robin

    2016-03-04

    A minimax estimator has the minimum possible error ("risk") in the worst case. We construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case, but superior accuracy on most states.

  3. Consistent description of kinetic equation with triangle anomaly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pu Shi; Gao Jianhua; Wang Qun

    2011-05-01

    We provide a consistent description of the kinetic equation with a triangle anomaly which is compatible with the entropy principle of the second law of thermodynamics and the charge/energy-momentum conservation equations. In general an anomalous source term is necessary to ensure that the equations for charge and energy-momentum conservation are satisfied and that the correction terms of the distribution functions are compatible with these equations. The constraining equations from the entropy principle are derived for the anomaly-induced leading order corrections to the particle distribution functions. The correction terms can be determined for the minimum number of unknown coefficients in the one-charge and two-charge cases by solving the constraining equations.

  4. Operational Safety Assessment of Turbo Generators with Wavelet Rényi Entropy from Sensor-Dependent Vibration Signals

    PubMed Central

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-01-01

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which plays an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects the time-varying operational characteristics of individual machinery. Derived from the sensor-dependent signals' wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied to a 50 MW turbo generator, whereupon it is shown to be reasonable and effective for operation and maintenance. PMID:25894934

  5. Entropy generation across Earth's collisionless bow shock.

    PubMed

    Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A

    2012-02-10

    Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulences.
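
    The quantity being measured can be illustrated with a discretized Boltzmann H for a 1D velocity distribution, H = Σ f ln f Δv, so that the entropy density is s = -k_B H. The Maxwellian form, grid, and parameter values below are illustrative assumptions, not Cluster data: broadening the downstream distribution (shock heating) lowers H and thus raises the entropy density.

```python
import math

def boltzmann_H(f, dv):
    """Discretized Boltzmann H = sum f ln f dv for a sampled 1D velocity
    distribution f(v); the entropy density is s = -k_B * H."""
    return sum(fi * math.log(fi) for fi in f if fi > 0.0) * dv

def maxwellian(v, n=1.0, vth=1.0):
    """1D Maxwellian with density n and thermal speed vth (illustrative)."""
    return n / (math.sqrt(math.pi) * vth) * math.exp(-(v / vth) ** 2)

# upstream vs. hotter downstream distributions on the same velocity grid
dv = 0.01
grid = [-8.0 + dv * i for i in range(1601)]
H_up = boltzmann_H([maxwellian(v, vth=1.0) for v in grid], dv)
H_dn = boltzmann_H([maxwellian(v, vth=2.0) for v in grid], dv)
```

    For a normalized Maxwellian the analytic value is H = -ln(√π vth) - 1/2, so H_dn < H_up < 0: heating lowers H, i.e. increases the entropy density.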

  6. Investigation of Heat and Mass Transfer and Irreversibility Phenomena Within a Three-Dimensional Tilted Enclosure for Different Shapes

    NASA Astrophysics Data System (ADS)

    Oueslati, F.; Ben-Beya, B.

    2018-01-01

    Three-dimensional thermosolutal natural convection and entropy generation within an inclined enclosure is investigated in the current study. A numerical method based on the finite volume method and a full multigrid technique is implemented to solve the governing equations. Effects of various parameters, namely, the aspect ratio, buoyancy ratio, and tilt angle on the flow patterns and entropy generation are predicted and discussed.

  7. Analysis of natural convection in nanofluid-filled H-shaped cavity by entropy generation and heatline visualization using lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Rahimi, Alireza; Sepehr, Mohammad; Lariche, Milad Janghorban; Mesbah, Mohammad; Kasaeipoor, Abbas; Malekshah, Emad Hasani

    2018-03-01

    The lattice Boltzmann simulation of natural convection in an H-shaped cavity filled with nanofluid is performed. Entropy generation analysis and heatline visualization are employed to analyze the considered problem comprehensively. The produced nanofluid is a SiO2-TiO2/Water-EG (60:40) hybrid nanofluid, and the thermal conductivity and dynamic viscosity of the nanofluid are measured experimentally. To use the experimental data for thermal conductivity and dynamic viscosity, two sets of temperature-based correlations for six different solid volume fractions of 0.5, 1, 1.5, 2, 2.5 and 3 vol% are derived. The influences of different governing parameters, such as aspect ratio, solid volume fraction of the nanofluid and Rayleigh number, on the fluid flow, temperature field, average/local Nusselt number, total/local entropy generation and heatlines are presented.

  8. Application of exergetic sustainability index to a nano-scale irreversible Brayton cycle operating with ideal Bose and Fermi gasses

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-09-01

    In this study, a nano-scale irreversible Brayton cycle operating with quantum gasses, including Bose and Fermi gasses, is investigated. Developments in nano-technology make the study of nano-scale machines, including thermal systems, unavoidable. Thermodynamic analysis of a nano-scale irreversible Brayton cycle operating with Bose and Fermi gasses was performed, with particular attention to the exergetic sustainability index. In addition, a thermodynamic analysis involving classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies was conducted. Results are presented numerically, and some useful recommendations are made. Some important results are: entropy generation and the exergetic sustainability index are affected most for the Bose gas, and power output and exergy output are affected most for the Fermi gas, by x. Under high temperature conditions, work output and entropy generation have high values compared with other degeneracy conditions.

  9. Entropy generation minimization (EGM) of nanofluid flow by a thin moving needle with nonlinear thermal radiation

    NASA Astrophysics Data System (ADS)

    Waleed Ahmed Khan, M.; Ijaz Khan, M.; Hayat, T.; Alsaedi, A.

    2018-04-01

    Entropy generation minimization (EGM) and heat transport in nonlinear radiative flow of nanomaterials over a thin moving needle are discussed. Nonlinear thermal radiation and viscous dissipation terms are included in the energy expression. Water is treated as the ordinary fluid, while the nanomaterials comprise titanium dioxide, copper and aluminum oxide. The nonlinear governing expressions of the flow problems are transformed into ordinary differential equations and then solved numerically by a built-in shooting technique. In the first section of this investigation, the entropy expression is derived as a function of temperature and velocity gradients. Geometrical and physical flow field variables are utilized to nondimensionalize it. An entropy generation analysis is carried out through the second law of thermodynamics. The results for temperature, velocity, concentration, surface drag force and heat transfer rate are explored. Our outcomes reveal that surface drag force and Nusselt number (heat transfer) increase linearly with higher nanoparticle volume fraction. Furthermore, the drag force decays for aluminum oxide while it increases for copper nanoparticles. In addition, the lowest heat transfer rate is obtained for a higher radiative parameter. The temperature field is enhanced with an increase in the temperature ratio parameter.

  10. Perspective: Maximum caliber is a general variational principle for dynamical systems

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A.

    2018-01-01

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics—such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production—are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.

  11. Divalent cation shrinks DNA but inhibits its compaction with trivalent cation.

    PubMed

    Tongu, Chika; Kenmotsu, Takahiro; Yoshikawa, Yuko; Zinchenko, Anatoly; Chen, Ning; Yoshikawa, Kenichi

    2016-05-28

    Our observations reveal the effects of divalent and trivalent cations on the higher-order structure of giant DNA (T4 DNA, 166 kbp) by fluorescence microscopy. It was found that divalent cations, Mg(2+) and Ca(2+), inhibit DNA compaction induced by a trivalent cation, spermidine (SPD(3+)). On the other hand, in the absence of SPD(3+), divalent cations cause the shrinkage of DNA. As a control experiment, we confirmed the minimal effect of the monovalent cation Na(+) on the DNA higher-order structure. We interpret the competition between 2+ and 3+ cations in terms of the change in the translational entropy of the counterions. For the compaction with SPD(3+), we consider the increase in translational entropy due to the ion-exchange of the intrinsic monovalent cations condensed on a highly charged polyelectrolyte, double-stranded DNA, by the 3+ cations. In contrast, the presence of 2+ cations decreases the entropy gain from the ion-exchange between monovalent and 3+ ions.

  12. Perspective: Maximum caliber is a general variational principle for dynamical systems.

    PubMed

    Dixit, Purushottam D; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A

    2018-01-07

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics-such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production-are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.

  13. Covariance hypotheses for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Decell, H. P.; Peters, C.

    1983-01-01

    Two covariance hypotheses are considered for LANDSAT data acquired by sampling fields, one an autoregressive covariance structure and the other the hypothesis of exchangeability. A minimum entropy approximation of the first structure by the second is derived and shown to have desirable properties for incorporation into a mixture density estimation procedure. Results of a rough test of the exchangeability hypothesis are presented.

  14. Minimum Entropy Autofocus Correction of Residual Range Cell Migration

    DTIC Science & Technology

    2017-03-02

    reduced the residual to effectively a slowly varying bias on the order of a wavelength (~3 cm) which has negligible impact on the image focus. Fig...Fitzgerrell, and J. Beaver, "Two-dimensional phase gradient autofocus," Proc. SPIE, vol. 4123, pp. 162–173, 2000. [6] D. H. Brandwood, "A complex gradient

  15. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series carrying plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions than MDSC.
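
The dissimilarity step can be made concrete with a minimal sketch of a Kronecker-delta cross-sample entropy between two symbolized series, in which template pairs count as matches only when their symbols agree exactly. The normalization details here are our assumption and may differ from the authors' definition; the matrix of pairwise values computed this way is what would then be fed to MDS.

```python
import math

# Hedged sketch of a Kronecker-delta cross-sample entropy between two
# symbolized series; exact definition in the paper may differ in details.
def kcse(u, v, m=2):
    def matches(k):
        # count pairs of length-k templates (one from each series)
        # that agree exactly (the Kronecker-delta criterion)
        return sum(u[i:i + k] == v[j:j + k]
                   for i in range(len(u) - k + 1)
                   for j in range(len(v) - k + 1))
    b, a = matches(m), matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # undefined when no templates match
    return -math.log(a / b)

u = [0, 1, 0, 1, 0, 1, 0, 1]   # periodic series
v = [0, 1, 0, 1, 0, 1, 0, 1]   # same pattern: low cross-entropy vs. u
w = [1, 1, 0, 0, 1, 0, 1, 1]   # unrelated pattern: higher vs. u
```

Evaluating kcse over all pairs of symbolized index series yields the dissimilarity matrix that MDS then embeds in three dimensions.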

  16. Entropy, matter, and cosmology.

    PubMed

    Prigogine, I; Géhéniau, J

    1986-09-01

    The role of irreversible processes corresponding to creation of matter in general relativity is investigated. The use of Landau-Lifshitz pseudotensors together with conformal (Minkowski) coordinates suggests that this creation took place in the early universe at the stage of the variation of the conformal factor. The entropy production in this creation process is calculated. It is shown that these dissipative processes lead to the possibility of cosmological models that start from empty conditions and gradually build up matter and entropy. Gravitational entropy takes on a simple meaning, being associated with the entropy that is necessary to produce matter. This leads to an extension of the third law of thermodynamics, as the zero point of entropy now becomes the space-time structure out of which matter is generated. The theory can be put into a convenient form using a supplementary "C" field in Einstein's field equations. The role of the C field is to express the coupling between gravitation and matter leading to irreversible entropy production.

  17. Wavelet Packet Entropy for Heart Murmurs Classification

    PubMed Central

    Safara, Fatemeh; Doraisamy, Shyamala; Azman, Azreen; Jantan, Azrul; Ranga, Sri

    2012-01-01

    Heart murmurs are the first signs of cardiac valve disorders. Several studies have been conducted in recent years to automatically differentiate normal heart sounds from heart sounds with murmurs using various types of audio features. Entropy has been used successfully as a feature to distinguish different heart sounds. In this paper, a new entropy, previously introduced to analyze mammograms, is applied to heart sounds, and the feasibility of using it to classify five types of heart sounds and murmurs is shown. Four common murmurs were considered: aortic regurgitation, mitral regurgitation, aortic stenosis, and mitral stenosis. The wavelet packet transform was employed for heart sound analysis, and the entropy was calculated to derive feature vectors. Five types of classification were performed to evaluate the discriminatory power of the generated features. The best results were achieved by BayesNet, with 96.94% accuracy. These promising results substantiate the effectiveness of the proposed wavelet packet entropy for heart sound classification. PMID:23227043
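
The feature-extraction pipeline (wavelet packet decomposition, then an entropy computed on the coefficients) can be sketched as follows. This is purely illustrative: it uses a hand-rolled two-level Haar wavelet packet and a Shannon entropy of the normalized node energies, not the mammogram-derived entropy the paper actually proposes.

```python
import math

# Illustrative only: two-level Haar wavelet packet decomposition followed
# by a Shannon entropy of the normalized energy across the four terminal
# nodes; the paper's entropy is a different functional.
def haar_step(x):
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def wavelet_packet_entropy(x):
    a, d = haar_step(x)                      # level 1: two sub-bands
    nodes = [*haar_step(a), *haar_step(d)]   # level 2: four sub-bands
    energies = [sum(c * c for c in node) for node in nodes]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

# a constant signal concentrates all energy in one node (entropy ~ 0);
# an irregular one spreads energy across nodes (entropy > 0)
smooth = [1.0] * 16
spiky = [1.0, -1.0, 2.0, 0.5, -2.0, 1.5, -0.5, 2.0] * 2
```

In the classification setting, one such entropy value per packet node (or per frequency band) forms the feature vector passed to the classifier.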

  18. Constrained signal reconstruction from wavelet transform coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brislawn, C.M.

    1991-12-31

    A new method is introduced for reconstructing a signal from an incomplete sampling of its Discrete Wavelet Transform (DWT). The algorithm yields a minimum-norm estimate satisfying a priori upper and lower bounds on the signal. The method is based on a finite-dimensional representation theory for minimum-norm estimates of bounded signals developed by R.E. Cole. Cole's work has its origins in earlier techniques of maximum-entropy spectral estimation due to Lang and McClellan, which were adapted by Steinhardt, Goodrich and Roberts for minimum-norm spectral estimation. Cole's extension of their work provides a representation for minimum-norm estimates of a class of generalized transforms in terms of general correlation data (not just DFTs of autocorrelation lags, as in spectral estimation). One virtue of this great generality is that it includes the inverse DWT. 20 refs.

  19. Experimental and analytical investigation of direct and indirect noise generated from non-isentropic boundaries

    NASA Astrophysics Data System (ADS)

    de Domenico, Francesca; Rolland, Erwan; Hochgreb, Simone

    2017-11-01

    Pressure fluctuations in combustors arise either directly from the heat release rate perturbations of the flame (direct noise), or indirectly from the acceleration of entropy, vorticity or compositional perturbations through nozzles or turbine guide vanes (indirect noise). In this work, the second mechanism is experimentally investigated in a simplified rig. Synthetic entropy spots are generated via the Joule effect or helium injection and then accelerated through orifice plates of different area contractions and thicknesses. The objective of the study is to parametrically analyse the entropy-to-sound conversion in non-isentropic contractions (e.g. with pressure losses), represented by the orifice plates. Acoustic measurements are performed to reconstruct the acoustic and entropic transfer functions of the orifices and to compare experimental data with analytical predictions, in order to investigate the effect of orifice thickness and area ratio on the transfer functions. PIV measurements are performed to study the stretching and dispersion of the entropy waves due to mean flow effects. In addition, PIV images taken in the jet exiting downstream of the orifices are used to investigate the coupling of the acoustic and entropy fields with the hydrodynamic field. EPSRC, Qualcomm.

  20. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix, and as such its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement simplifies to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This raises the upper limit on the capacity for information transfer and its conversion to energy in next-generation mesoscopic devices.

  1. An Equation for Moist Entropy in a Precipitating and Icy Atmosphere

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Simpson, Joanne; Zeng, Xiping

    2003-01-01

    Moist entropy is nearly conserved in adiabatic motion. It is redistributed rather than created by moist convection. Thus moist entropy and its governing equation provide a sound basis for constructing analytical and numerical models of the interaction between tropical convective clouds and large-scale circulations. Hence, an accurate equation for moist entropy is needed for the analysis and modeling of atmospheric convective clouds. On the basis of consistency between the energy and entropy equations, a complete equation for moist entropy is derived from the energy equation. The equation expresses explicitly the internal and external sources of moist entropy, including those related to the microphysics of clouds and precipitation. In addition, an accurate formula for the surface flux of moist entropy from the underlying surface into the air above is derived. Because moist entropy deals "easily" with transitions among the three water phases, it will be used as a prognostic variable in the next generation of cloud-resolving models (e.g., a global cloud-resolving model) for low computational noise. The equation derived in this paper is accurate and complete, providing a theoretical basis for using moist entropy as a prognostic variable in the long-term modeling of clouds and large-scale circulations.

  2. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    PubMed

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiological recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified on simulated EEG series generated by the alpha rhythm model, including their stronger relative consistency and robustness. Furthermore, in order to detect abnormal irregularity and chaotic behavior in the AD brain, complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, owing to the introduction of fuzzy set theory, the fuzzy entropies distinguish the EEG signals of AD from those of normal subjects better than approximate entropy and sample entropy do. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, for example at electrodes T3 and T4. In addition, fuzzy sample entropy achieved larger group differences across brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. These results indicate that fuzzy sample entropy may be a powerful tool for characterizing the complexity abnormalities of AD, which could be helpful in further understanding the disease.
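
A minimal sketch conveys the key idea behind fuzzy sample entropy: template similarity is graded by an exponential membership function rather than a hard tolerance threshold, which is what gives the fuzzy entropies their relative consistency. The parameter defaults below (m = 2, r = 0.2 x standard deviation, exponent p = 2) are common choices, not necessarily the paper's.

```python
import math
import random

# Illustrative fuzzy sample entropy (FuzzyEn): similarity between
# baseline-removed templates is graded by exp(-(d/r)^p) rather than a
# hard threshold. Parameter defaults are common choices, assumed here.
def fuzzy_sample_entropy(x, m=2, r=None, p=2):
    n = len(x)
    if r is None:
        mu = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / n)

    def phi(k):
        tpl = []
        for i in range(n - k):
            seg = x[i:i + k]
            base = sum(seg) / k
            tpl.append([s - base for s in seg])  # remove local baseline
        total, count = 0.0, 0
        for i in range(len(tpl)):
            for j in range(len(tpl)):
                if i == j:
                    continue
                d = max(abs(a - b) for a, b in zip(tpl[i], tpl[j]))
                total += math.exp(-((d / r) ** p))
                count += 1
        return total / count

    return -math.log(phi(m + 1) / phi(m))

# a smooth periodic signal is more regular (lower entropy) than noise
regular = [math.sin(0.5 * i) for i in range(100)]
random.seed(2)
noise = [random.uniform(-1.0, 1.0) for _ in range(100)]
```

Applied band by band to EEG segments, values computed this way form the complexity features used for classification.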

  4. On the Application of Information Theory to Sustainability

    EPA Science Inventory

    According to the 2nd Law of Thermodynamics, entropy must be an increasing function of time for the whole universe, system plus surroundings. This gives rise to conjectures regarding the loss of work with entropy generation in general processes. It can be shown that under cond...

  5. Molecular mechanism of direct proflavine-DNA intercalation: evidence for drug-induced minimum base-stacking penalty pathway.

    PubMed

    Sasikala, Wilbee D; Mukherjee, Arnab

    2012-10-11

    DNA intercalation, a biophysical process of enormous clinical significance, has surprisingly eluded molecular understanding for several decades. With an appropriate configurational restraint (to prevent dissociation) in all-atom metadynamics simulations, we capture, for the first time, the free energy surface of direct intercalation from the minor-groove-bound state, using the anticancer agent proflavine. The mechanism along the minimum free energy path reveals that intercalation happens through a minimum base-stacking-penalty pathway in which non-stacking parameters (Twist→Slide/Shift) change first, followed by base-stacking parameters (Buckle/Roll→Rise). This mechanism defies the natural fluctuation hypothesis and provides molecular evidence for the drug-induced cavity formation hypothesis. The thermodynamic origin of the barrier is found to be a combination of entropy and desolvation energy.

  6. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    PubMed

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    A chaotic external-cavity semiconductor laser (ECL) is a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using physical broadband white chaos, generated by optical heterodyning of two ECLs, as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum, free of the signatures of relaxation oscillation and external-cavity resonance, but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
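
The post-processing-free extraction step can be sketched as below. The samples here are synthetic stand-ins for the white-chaos waveform; the ADC full-scale mapping is our assumption.

```python
import random

# Sketch of the extraction step: quantize analog samples with an 8-bit
# ADC and keep only the 4 least significant bits of each code word.
# Gaussian samples stand in for the white-chaos waveform.
random.seed(1)
samples = [random.gauss(0.0, 1.0) for _ in range(1000)]

def adc_8bit(x, full_scale=4.0):
    # map [-fs/2, fs/2] onto integer codes 0..255, clipping at the rails
    code = int((x / full_scale + 0.5) * 256)
    return max(0, min(255, code))

bits = []
for s in samples:
    code = adc_8bit(s)
    for k in range(4):            # extract the 4 LSBs of each code word
        bits.append((code >> k) & 1)
# 4 bits per sample at an 80-GHz sampling rate is the paper's
# 320-Gbps figure (80e9 samples/s * 4 bits/sample)
```

In the experiment the raw bit stream would then be subjected to statistical randomness test suites; none of that is modeled here.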

  7. On the Application of Information Theory to Regime Changes and Sustainability

    EPA Science Inventory

    According to the 2nd Law of Thermodynamics, entropy must be an increasing function of time for the whole universe, system plus surroundings. This gives rise to conjectures regarding the loss of work with entropy generation in general processes. It can be shown that under cond...

  8. Systematic investigation of NLTE phenomena in the limit of small departures from LTE

    NASA Astrophysics Data System (ADS)

    Libby, S. B.; Graziani, F. R.; More, R. M.; Kato, T.

    1997-04-01

    In this paper, we begin a systematic study of Non-Local Thermal Equilibrium (NLTE) phenomena in near-equilibrium (LTE) high-energy-density, highly radiative plasmas. It is shown that the principle of minimum entropy production rate characterizes NLTE steady states for average-atom rate equations in the case of small departures from LTE. With the aid of a novel hohlraum-reaction-box thought experiment, we use the principles of minimum entropy production and detailed balance to derive Onsager reciprocity relations for the NLTE responses of a near-equilibrium sample to non-Planckian perturbations in different frequency groups. This result is a significant symmetry constraint on the linear corrections to Kirchhoff's law. We envisage applying our strategy to a number of test problems, which include: the NLTE corrections to the ionization state of an ion located near the edge of an otherwise LTE medium; the effect of a monochromatic radiation field perturbation on an LTE medium; the deviation of Rydberg state populations from LTE in recombining or ionizing plasmas; multi-electron-temperature models such as that of Busquet; and finally, the effect of NLTE population shifts on opacity models.
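
The near-equilibrium structure the argument relies on can be stated compactly. In linear response (generic notation, not the paper's), the entropy production rate is a bilinear form in thermodynamic fluxes J_i and forces X_i:

```latex
\sigma = \sum_i J_i X_i \ge 0, \qquad J_i = \sum_j L_{ij} X_j, \qquad L_{ij} = L_{ji},
```

so that minimizing sigma over an unconstrained force X_k gives d(sigma)/dX_k = 2 * sum_j L_{kj} X_j = 2 J_k = 0: the stationary state carries no flux conjugate to any free force. The Onsager symmetry L_ij = L_ji is the reciprocity that the hohlraum-reaction-box argument extends to perturbations in different frequency groups.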

  9. Characterization of separability and entanglement in (2xD)- and (3xD)-dimensional systems by single-qubit and single-qutrit unitary transformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giampaolo, Salvatore M.; CNR-INFM Coherentia, Naples; CNISM Unita di Salerno and INFN Sezione di Napoli, Gruppo collegato di Salerno, Baronissi

    2007-10-15

    We investigate the geometric characterization of pure-state bipartite entanglement of (2xD)- and (3xD)-dimensional composite quantum systems. To this aim, we analyze the relationship between states and their images under the action of particular classes of local unitary operations. We find that invariance of states under the action of single-qubit and single-qutrit transformations is a necessary and sufficient condition for separability. We demonstrate that in the (2xD)-dimensional case the von Neumann entropy of entanglement is a monotonic function of the minimum squared Euclidean distance between states and their images over the set of single-qubit unitary transformations. Moreover, both in the (2xD)- and in the (3xD)-dimensional cases the minimum squared Euclidean distance exactly coincides with the linear entropy [and thus as well with the tangle measure of entanglement in the (2xD)-dimensional case]. These results provide a geometric characterization of entanglement measures originally established in informational frameworks. Consequences and applications of the formalism to quantum critical phenomena in spin systems are discussed.

  10. Thermodynamics of an ideal generalized gas: II. Means of order alpha.

    PubMed

    Lavenda, B H

    2005-11-01

    The property that power means are monotonically increasing functions of their order is shown to be the basis of the second laws not only for processes involving heat conduction, but also for processes involving deformations. This generalizes earlier work involving only pure heat conduction and underlines the incomparability of the internal energy and adiabatic potentials when expressed as powers of the adiabatic variable. In an L-potential equilibration, the final state will be one of maximum entropy, whereas in an entropy equilibration, the final state will be one of minimum L. Unlike classical equilibrium thermodynamic phase space, which lacks an intrinsic metric structure insofar as distances and other geometrical concepts do not have an intrinsic thermodynamic significance in such spaces, a metric space can be constructed for the power means: the distance between means of different order is related to the Carnot efficiency. In the ideal classical gas limit, the average change in the entropy is shown to be proportional to the difference between the Shannon and Rényi entropies for nonextensive systems that are multifractal in nature. The L potential, like the internal energy, is a Schur convex function of the empirical temperature, which satisfies Jensen's inequality, and serves as a measure of the tendency to uniformity in processes involving pure thermal conduction.
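
The monotonicity property the argument rests on is easy to check numerically. Below is a small sketch of the weighted power mean M_alpha and a verification, for one example rather than a proof, that it is nondecreasing in the order alpha.

```python
import math

# Weighted power mean M_alpha(x, w) = (sum_i w_i x_i^alpha)^(1/alpha),
# with the alpha -> 0 limit given by the weighted geometric mean.
def power_mean(xs, ws, alpha):
    if alpha == 0:
        return math.exp(sum(w * math.log(x) for x, w in zip(xs, ws)))
    return sum(w * x ** alpha for x, w in zip(xs, ws)) ** (1.0 / alpha)

xs = [1.0, 2.0, 4.0]          # sample values (not all equal)
ws = [0.2, 0.5, 0.3]          # weights summing to 1
orders = [-2, -1, 0, 1, 2, 3]
means = [power_mean(xs, ws, a) for a in orders]
# means is nondecreasing in the order: harmonic <= geometric <= arithmetic <= ...
```

The familiar harmonic-geometric-arithmetic mean inequalities are the special cases alpha = -1, 0, 1 of this ordering.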

  11. Characterizing Protease Specificity: How Many Substrates Do We Need?

    PubMed Central

    Schauperl, Michael; Fuchs, Julian E.; Waldner, Birgit J.; Huber, Roland G.; Kramer, Christian; Liedl, Klaus R.

    2015-01-01

    Calculation of cleavage entropies allows one to quantify, map, and compare protease substrate specificity via an information-entropy-based approach. The metric intrinsically depends on the number of experimentally determined substrates (data points). Thus, a statistical analysis of its numerical stability is crucial to estimate the systematic error made by estimating specificity from a limited number of substrates. In this contribution, we show the mathematical basis for estimating the uncertainty in cleavage entropies. Sets of cleavage entropies are calculated using experimental cleavage data and modeled extreme cases. By analyzing the underlying mathematics and applying statistical tools, a linear dependence of the metric with respect to 1/n was found. This allows us to extrapolate the values to an infinite number of samples and to estimate the errors. Analyzing the errors, a minimum of 30 substrates was found to be necessary to characterize substrate specificity, in terms of amino acid variability, for a protease (S4-S4') with an uncertainty of 5 percent. Therefore, we encourage experimental researchers in the protease field to record specificity profiles of novel proteases, aiming to identify at least 30 peptide substrates of maximum sequence diversity. We expect a full characterization of protease specificity to be helpful for rationalizing the biological functions of proteases and for assisting rational drug design. PMID:26559682
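
The 1/n extrapolation can be sketched in a few lines. The entropy values below are hypothetical placeholders, not the paper's data; the point is the mechanics of fitting against 1/n and reading off the intercept as the infinite-sample estimate.

```python
# Hedged sketch of the 1/n extrapolation with hypothetical numbers: fit
# cleavage-entropy estimates obtained from n substrates against 1/n and
# take the intercept as the infinite-sample value.
ns = [10, 20, 30, 50, 100]                  # substrate counts (synthetic)
entropies = [2.10, 2.35, 2.43, 2.50, 2.55]  # hypothetical estimates

xs = [1.0 / n for n in ns]
k = len(xs)
xbar = sum(xs) / k
ybar = sum(entropies) / k
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, entropies))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar   # extrapolated entropy at n -> infinity
```

The distance between the fitted line at a given 1/n and the intercept is then the systematic error incurred by using only n substrates.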

  12. Entropy studies on beam distortion by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-09-01

    When a beam propagates through atmospheric turbulence over a known distance, the target beam profile deviates from the projected profile of the beam on the receiver. Intuitively, the unwanted distortion provides information about the atmospheric turbulence. This information is crucial for guiding adaptive optics (AO) systems and improving beam propagation results. In this paper, we propose an entropy study based on the image from a plenoptic sensor to provide a measure of the information content of atmospheric turbulence. In general, lower levels of atmospheric turbulence produce a smaller information size, while higher levels cause significant expansion of the information size, which may exceed the maximum capacity of a sensing system and jeopardize the reliability of an AO system. Therefore, the entropy function can be used to analyze the turbulence distortion and evaluate the performance of AO systems; in fact, it serves as a metric that indicates the improvement in beam correction at each iteration step. In addition, it points out the limitation of an AO system at optimized correction, as well as the minimum information needed for wavefront sensing to achieve certain levels of correction. In this paper, we will demonstrate the definition of the entropy function and how it is related to evaluating the information (randomness) carried by atmospheric turbulence.

  13. Rényi entropy of the totally asymmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Wood, Anthony J.; Blythe, Richard A.; Evans, Martin R.

    2017-11-01

    The Rényi entropy is a generalisation of the Shannon entropy that is sensitive to the fine details of a probability distribution. We present results for the Rényi entropy of the totally asymmetric exclusion process (TASEP). We calculate explicitly an entropy whereby the squares of configuration probabilities are summed, using the matrix product formalism to map the problem to one involving a six direction lattice walk in the upper quarter plane. We derive the generating function across the whole phase diagram, using an obstinate kernel method. This gives the leading behaviour of the Rényi entropy and corrections in all phases of the TASEP. The leading behaviour is given by the result for a Bernoulli measure and we conjecture that this holds for all Rényi entropies. Within the maximal current phase the correction to the leading behaviour is logarithmic in the system size. Finally, we remark upon a special property of equilibrium systems whereby discontinuities in the Rényi entropy arise away from phase transitions, which we refer to as secondary transitions. We find no such secondary transition for this nonequilibrium system, supporting the notion that these are specific to equilibrium cases.
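
The Bernoulli leading-order behaviour mentioned above is easy to state: for a product measure with density rho on L sites, the Rényi entropy of order q is extensive, and it reduces to the Shannon entropy as q → 1. A sketch of both formulas (standard results, with notation assumed here rather than taken from the paper):

```python
import math

# Renyi entropy H_q = (1/(1-q)) * ln(sum_i p_i^q); for a product
# (Bernoulli) measure with density rho on L sites, the sum factorizes,
# giving an extensive entropy of L * ln(rho^q + (1-rho)^q) / (1 - q).
def renyi_bernoulli(rho, q, L=1):
    return L * math.log(rho ** q + (1 - rho) ** q) / (1 - q)

def shannon_bernoulli(rho, L=1):
    return -L * (rho * math.log(rho) + (1 - rho) * math.log(1 - rho))

# taking q close to 1 recovers the Shannon entropy
```

The q = 2 case, log of the summed squared configuration probabilities, is the entropy the matrix product calculation evaluates exactly for the TASEP.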

  14. Intracranial EEG potentials estimated from MEG sources: A new approach to correlate MEG and iEEG data in epilepsy.

    PubMed

    Grova, Christophe; Aiguabella, Maria; Zelmann, Rina; Lina, Jean-Marc; Hall, Jeffery A; Kobayashi, Eliane

    2016-05-01

    Detection of epileptic spikes in magnetoencephalography (MEG) requires synchronized neuronal activity over a minimum of 4 cm2. We previously validated the Maximum Entropy on the Mean (MEM) as a source localization method able to recover the spatial extent of the epileptic spike generators. The purpose of this study was to evaluate quantitatively, using intracranial EEG (iEEG), the spatial extent recovered from MEG sources by estimating the iEEG potentials generated by these MEG sources. We evaluated five patients with focal epilepsy who had a pre-operative MEG acquisition and iEEG with MRI-compatible electrodes. Individual MEG epileptic spikes were localized along the cortical surface segmented from a pre-operative MRI, which was co-registered with the MRI obtained with iEEG electrodes in place for identification of iEEG contacts. An iEEG forward model estimated the influence of every dipolar source of the cortical surface on each iEEG contact. This iEEG forward model was applied to the MEG sources to estimate the iEEG potentials that would have been generated by these sources. MEG-estimated iEEG potentials were compared with measured iEEG potentials using four source localization methods: two variants of MEM and two standard methods equivalent to minimum norm and LORETA estimates. Our results demonstrated an excellent MEG/iEEG correspondence in the presumed focus for four out of five patients. In one patient, the deep generator identified in iEEG could not be localized in MEG. Estimating iEEG potentials from MEG sources is a promising way to evaluate which MEG sources can be retrieved and validated with iEEG data, providing accurate results especially when applied to MEM localizations. Hum Brain Mapp 37:1661-1683, 2016. © 2016 Wiley Periodicals, Inc.
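
The core step, applying the iEEG forward model to MEG-derived sources, is a single linear operation: with a gain matrix G (contacts x sources) and source amplitudes j, the estimated contact potentials are v = G j. The matrix and amplitudes below are toy numbers, not a real head model.

```python
# Toy forward-model application: 3 iEEG contacts, 4 cortical sources.
# G[c][s] is the (hypothetical) gain of source s at contact c.
G = [[0.8, 0.1, 0.0, 0.05],
     [0.2, 0.7, 0.1, 0.00],
     [0.0, 0.2, 0.9, 0.10]]
j = [1.0, 0.5, 0.0, 2.0]   # MEG-estimated source amplitudes (toy values)

# estimated iEEG potentials: one value per contact
v = [sum(g * s for g, s in zip(row, j)) for row in G]
```

Comparing v with the measured contact potentials, spike by spike, is what quantifies how well each MEG source localization reproduces the iEEG data.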

  15. Thermodynamical transcription of density functional theory with minimum Fisher information

    NASA Astrophysics Data System (ADS)

    Nagy, Á.

    2018-03-01

    Ghosh, Berkowitz and Parr designed a thermodynamical transcription of the ground-state density functional theory and introduced a local temperature that varies from point to point. The theory, however, is not unique because the kinetic energy density is not uniquely defined. Here we derive the expression of the phase-space Fisher information in the GBP theory taking the inverse temperature as the Fisher parameter. It is proved that this Fisher information takes its minimum for the case of constant temperature. This result is consistent with the recently proven theorem that the phase-space Shannon information entropy attains its maximum at constant temperature.

  16. Thermodynamic constraints on a varying cosmological-constant-like term from the holographic equipartition law with a power-law corrected entropy

    NASA Astrophysics Data System (ADS)

    Komatsu, Nobuyoshi

    2017-11-01

    A power-law corrected entropy based on quantum entanglement is considered to be a viable black-hole entropy. In this study, as an alternative to the Bekenstein-Hawking entropy, a power-law corrected entropy is applied to Padmanabhan's holographic equipartition law to thermodynamically examine an extra driving term in the cosmological equations for a flat Friedmann-Robertson-Walker universe at late times. Deviations from the Bekenstein-Hawking entropy generate an extra driving term (proportional to the αth power of the Hubble parameter, where α is a dimensionless constant for the power-law correction) in the acceleration equation, which can be derived from the holographic equipartition law. Interestingly, the value of the extra driving term in the present model is constrained by the second law of thermodynamics. From the thermodynamic constraint, the order of the driving term is found to be consistent with the order of the cosmological constant measured by observations. In addition, the driving term tends to be constant-like when α is small, i.e., when the deviation from the Bekenstein-Hawking entropy is small.

  17. Calculation of Cyclodextrin Binding Affinities: Energy, Entropy, and Implications for Drug Design

    PubMed Central

    Chen, Wei; Chang, Chia-En; Gilson, Michael K.

    2004-01-01

    The second generation Mining Minima method yields binding affinities accurate to within 0.8 kcal/mol for the associations of α-, β-, and γ-cyclodextrin with benzene, resorcinol, flurbiprofen, naproxen, and nabumetone. These calculations require hours to a day on a commodity computer. The calculations also indicate that the changes in configurational entropy upon binding oppose association by as much as 24 kcal/mol and result primarily from a narrowing of energy wells in the bound versus the free state, rather than from a drop in the number of distinct low-energy conformations on binding. Also, the configurational entropy is found to vary substantially among the bound conformations of a given cyclodextrin-guest complex. This result suggests that the configurational entropy must be accounted for to reliably rank docked conformations in both host-guest and ligand-protein complexes. In close analogy with the common experimental observation of entropy-enthalpy compensation, the computed entropy changes show a near-linear relationship with the changes in mean potential plus solvation energy. PMID:15339804

  18. Numerical study focusing on the entropy analysis of MHD squeezing flow of a nanofluid model using Cattaneo–Christov theory

    NASA Astrophysics Data System (ADS)

    Akmal, N.; Sagheer, M.; Hussain, S.

    2018-05-01

    The present study gives an account of the heat transfer characteristics of the squeezing flow of a nanofluid between two flat plates, with the upper plate moving vertically and the lower one horizontally. The Tiwari and Das nanofluid model is utilized to give a comparative analysis of the heat transfer in Cu-water and Al2O3-water nanofluids with entropy generation. The modeling takes Lorentz forces into account to observe the effect of the magnetic field on the flow. The Joule heating effect is included to discuss the heat dissipation in the fluid and its effect on the entropy of the system. The nondimensional ordinary differential equations are solved using the Keller box method, and the numerical results are presented in graphs and tables. An interesting observation is that more entropy is generated near the lower plate than near the upper plate. Also, the heat transfer rate is found to be higher for the Cu nanoparticles than for the Al2O3 nanoparticles.

  19. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced: two new LZ-based estimators and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
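
    The plug-in method discussed in conclusions (iii) and (iv) can be sketched in a few lines (an illustrative reimplementation, not the authors' code; the word length and sample size below are arbitrary choices):

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in (empirical-distribution) entropy-rate estimate in bits per
    symbol: count overlapping words of length word_len, take the Shannon
    entropy of their empirical distribution, and divide by word_len."""
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in Counter(words).values()) / word_len

random.seed(0)
fair = [random.randint(0, 1) for _ in range(20000)]
print(plugin_entropy_rate(fair, 4))       # close to 1 bit/symbol for fair coin flips
print(plugin_entropy_rate([0] * 500, 4))  # 0.0 for a constant sequence
```

    The undersampling drawback noted in (iv) appears when `word_len` grows: with 2^word_len possible words and a fixed sample, each word is seen too rarely for the empirical distribution to be reliable.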

  20. Pareto versus lognormal: A maximum entropy test

    NASA Astrophysics Data System (ADS)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results support the theory that distributions with a lognormal body and a Pareto tail can be generated as mixtures of lognormally distributed units.
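
    The maximum entropy test itself is not reproduced here, but the classical baseline for measuring a power-law tail, the Hill estimator, illustrates the estimation problem the paper addresses (a sketch; sample size, tail fraction, and the value of alpha are arbitrary choices):

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the tail exponent from the k largest observations;
    for an exact Pareto tail P(X > x) ~ x^(-alpha) its expectation is 1/alpha."""
    xs = sorted(data, reverse=True)
    return sum(math.log(xs[i] / xs[k]) for i in range(k)) / k

random.seed(1)
alpha = 2.0
pareto_sample = [random.paretovariate(alpha) for _ in range(50000)]
print(hill_estimator(pareto_sample, 1000))  # near 1/alpha = 0.5
```

    For data whose tail is actually lognormal, the Hill estimate drifts with the choice of k, which is one motivation for a principled test like the one proposed above.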

  1. Detection of cracks in shafts with the Approximated Entropy algorithm

    NASA Astrophysics Data System (ADS)

    Sampaio, Diego Luchesi; Nicoletti, Rodrigo

    2016-05-01

    The Approximate Entropy is a statistical measure used primarily in the fields of Medicine, Biology, and Telecommunication for classifying and identifying complex signal data. In this work, an Approximate Entropy algorithm is used to detect cracks in a rotating shaft. The signals of the cracked shaft are obtained from numerical simulations of a de Laval rotor with breathing cracks modelled by Fracture Mechanics, analysing the vertical displacements of the rotor during run-up transients. The results show the feasibility of detecting cracks from 5% depth, irrespective of the unbalance of the rotating system and of the crack orientation in the shaft. The results also show that the algorithm can differentiate between the occurrence of a crack only, misalignment only, and crack + misalignment in the system. However, the algorithm is sensitive to its intrinsic parameters p (number of data points in a sample vector) and f (fraction of the standard deviation that defines the minimum distance between two sample vectors), and good results are obtained only by choosing their values appropriately according to the sampling rate of the signal.
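
    A minimal sketch of Approximate Entropy using the two intrinsic parameters the abstract names (p and f); the windowing follows the standard Pincus definition, which may differ in detail from the authors' variant:

```python
import math
import random

def approximate_entropy(x, p, f):
    """Approximate Entropy ApEn with sample-vector length p and tolerance
    r = f * std(x): the difference phi(p) - phi(p+1), where phi(m) averages
    the log fraction of length-m vectors within Chebyshev distance r."""
    n = len(x)
    mean = sum(x) / n
    std = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    r = f * std

    def phi(m):
        vecs = [x[i:i + m] for i in range(n - m + 1)]
        logs = []
        for v in vecs:
            # count vectors within Chebyshev distance r (self-match included)
            c = sum(1 for w in vecs if max(abs(a - b) for a, b in zip(v, w)) <= r)
            logs.append(math.log(c / len(vecs)))
        return sum(logs) / len(logs)

    return phi(p) - phi(p + 1)

random.seed(2)
regular = [0.0, 1.0] * 100                     # perfectly periodic signal
noisy = [random.random() for _ in range(200)]  # irregular signal
print(approximate_entropy(regular, 2, 0.2), approximate_entropy(noisy, 2, 0.2))
```

    The periodic signal yields a value near zero while the irregular one is clearly positive, which is the property the crack detector exploits; the sensitivity to p and f noted above is visible if these arguments are varied.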

  2. Minimax Quantum Tomography: Estimators and Relative Entropy Bounds

    DOE PAGES

    Ferrie, Christopher; Blume-Kohout, Robin

    2016-03-04

    A minimax estimator has the minimum possible error (“risk”) in the worst case. Here we construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This mismatch makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case, but superior accuracy on most states.

  3. Thermal performance of plate fin heat sink cooled by air slot impinging jet with different cross-sectional area

    NASA Astrophysics Data System (ADS)

    Mesalhy, O. M.; El-Sayed, Mostafa M.

    2015-06-01

    Flow and heat transfer characteristics of a plate-fin heat sink cooled by a rectangular impinging jet with different cross-sectional areas were studied experimentally and numerically. The study concentrated on investigating the effect of jet width, fin numbers, and fin heights on thermal performance. The entropy generation minimization method was used to define the optimum design and operating conditions. It is found that the jet width that minimizes entropy generation changes with the heat sink height and the number of fins.

  4. A thermodynamic approach to the 'mitosis/apoptosis' ratio in cancer

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto; Ponzetto, Antonio; Deisboeck, Thomas S.

    2015-10-01

    Cancer can be considered as an open, complex, (bio-thermo)dynamic and self-organizing system. Consequently, an entropy generation approach has been employed to analyze its mitosis/apoptosis ratio. Specifically, a novel thermodynamic anticancer strategy is suggested, based on the variation of entropy generation caused by the application of external fields, for example electro-magnetic fields, for therapeutic purposes. Ultimately, this innovative approach could support conventional therapies, particularly for inoperable tumors or advanced stages of cancer, when a larger tumor burden is diagnosed and therapeutic options are often limited.

  5. Entropy of level-cut random Gaussian structures at different volume fractions

    NASA Astrophysics Data System (ADS)

    Marčelja, Stjepan

    2017-10-01

    Cutting random Gaussian fields at a given level can create a variety of morphologically different two- or several-phase structures that have often been used to describe physical systems. The entropy of such structures depends on the covariance function of the generating Gaussian random field, which in turn depends on its spectral density. But the entropy of level-cut structures also depends on the volume fractions of the different phases, which are determined by the selection of the cutting level. This dependence has been neglected in earlier work. We evaluate the entropy of several lattice models to show that, even in the cases of strongly coupled systems, the dependence of the entropy of level-cut structures on molar fractions of the constituents scales with the simple ideal noninteracting system formula. In the last section, we discuss the application of the results to binary or ternary fluids and microemulsions.
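
    The "simple ideal noninteracting system formula" referred to above is the ideal mixing entropy over molar fractions; a minimal sketch:

```python
import math

def ideal_mixing_entropy(fractions):
    """Ideal (noninteracting) mixing entropy per site in units of k_B:
    S = -sum_i x_i ln x_i over the molar fractions x_i."""
    return sum(-x * math.log(x) for x in fractions if x > 0)

print(ideal_mixing_entropy([0.5, 0.5]))  # ln 2 ~ 0.693, the binary maximum
print(ideal_mixing_entropy([1.0]))       # zero for a pure phase
```

    The scaling result above says the entropy of level-cut structures follows this dependence on the fractions (set by the cutting level), up to a prefactor fixed by the field's covariance.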

  6. Landauer-Büttiker Approach to Strongly Coupled Quantum Thermodynamics: Inside-Outside Duality of Entropy Evolution

    NASA Astrophysics Data System (ADS)

    Bruch, Anton; Lewenkopf, Caio; von Oppen, Felix

    2018-03-01

    We develop a Landauer-Büttiker theory of entropy evolution in time-dependent, strongly coupled electron systems. The formalism naturally avoids the problem of the system-bath distinction by defining the entropy current in the attached leads. This current can then be used to infer changes of the entropy of the system which we refer to as the inside-outside duality. We carry out this program in an adiabatic expansion up to first order beyond the quasistatic limit. When combined with particle and energy currents, as well as the work required to change an external potential, our formalism provides a full thermodynamic description, applicable to arbitrary noninteracting electron systems in contact with reservoirs. This provides a clear understanding of the relation between heat and entropy currents generated by time-dependent potentials and their connection to the occurring dissipation.

  7. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

    The phenomenon of data loss always occurs in the analysis of large databases, so maintaining the stability of analysis results in the event of data loss is very important. In this paper, we use a segmentation approach to generate synthetic signals in which data segments are randomly wiped out from the original signal according to Gaussian and exponential distributions. The logistic map is then used for verification. Finally, two entropy measures, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to the logistic map tests. (2) The calculation results are notably stable for base-scale entropy analysis, which is not sensitive to data loss. (3) The loss percentage of HRV signals should be kept below p = 30%, within which the analysis can still provide useful information in clinical applications.
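
    The segment-wiping procedure described above can be sketched as follows (illustrative only: here segment lengths are drawn from an exponential distribution and positions are uniform; the paper's exact distributions and parameters may differ):

```python
import random

def remove_segments(signal, loss_fraction, mean_seg_len):
    """Randomly delete whole segments from a signal until roughly
    loss_fraction of the samples are gone, with segment lengths drawn
    from an exponential distribution of the given mean."""
    out = list(signal)
    target = int(loss_fraction * len(signal))
    removed = 0
    while removed < target and len(out) > 1:
        seg = max(1, int(random.expovariate(1.0 / mean_seg_len)))
        seg = min(seg, len(out) - 1, target - removed)
        start = random.randrange(len(out) - seg)
        del out[start:start + seg]
        removed += seg
    return out

random.seed(0)
sig = list(range(10000))
kept = remove_segments(sig, 0.3, 20)
print(len(kept))  # 7000: exactly 30% of the samples removed
```

    Feeding such degraded signals to an entropy measure and checking how far its value drifts from the intact-signal value is the stability test the study performs.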

  8. Capacities of quantum amplifier channels

    NASA Astrophysics Data System (ADS)

    Qi, Haoyu; Wilde, Mark M.

    2017-01-01

    Quantum amplifier channels are at the core of several physical processes. Not only do they model the optical process of spontaneous parametric down-conversion, but the transformation corresponding to an amplifier channel also describes the physics of the dynamical Casimir effect in superconducting circuits, the Unruh effect, and Hawking radiation. Here we study the communication capabilities of quantum amplifier channels. Invoking a recently established minimum output-entropy theorem for single-mode phase-insensitive Gaussian channels, we determine capacities of quantum-limited amplifier channels in three different scenarios. First, we establish the capacities of quantum-limited amplifier channels for one of the most general communication tasks, characterized by the trade-off between classical communication, quantum communication, and entanglement generation or consumption. Second, we establish capacities of quantum-limited amplifier channels for the trade-off between public classical communication, private classical communication, and secret key generation. Third, we determine the capacity region for a broadcast channel induced by the quantum-limited amplifier channel, and we also show that a fully quantum strategy outperforms those achieved by classical coherent-detection strategies. In all three scenarios, we find that the capacities significantly outperform communication rates achieved with a naive time-sharing strategy.

  9. Entanglement of a quantum field with a dispersive medium.

    PubMed

    Klich, Israel

    2012-08-10

    In this Letter we study the entanglement of a quantum radiation field interacting with a dielectric medium. In particular, we describe the quantum mixed state of a field interacting with a dielectric through plasma and Drude models and show that these generate very different entanglement behavior, as manifested in the entanglement entropy of the field. We also present a formula for a "Casimir" entanglement entropy, i.e., the distance dependence of the field entropy. Finally, we study a toy model of the interaction between two plates. In this model, the field entanglement entropy is divergent; however, as in the Casimir effect, its distance-dependent part is finite, and the field matter entanglement is reduced when the objects are far.

  10. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  11. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
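
    The principle behind jitter-based entropy extraction can be illustrated with a toy software model (an illustration only, not the paper's FPGA circuit; the sampling ratio and jitter magnitude are made-up values):

```python
import math
import random

def jitter_bits(n, period=1.0, jitter=0.3):
    """Toy model of a jitter-harvesting TRNG: each sampling interval the
    accumulated oscillator phase advances by an incommensurate nominal
    amount plus Gaussian jitter, and the oscillator's current logic level
    (phase in the first half of the period) is output as one raw bit."""
    phase, out = 0.0, []
    for _ in range(n):
        phase += 7.37 * period + random.gauss(0.0, jitter)
        out.append(int((phase % period) < period / 2))
    return out

random.seed(42)
bits = jitter_bits(100000)
p1 = sum(bits) / len(bits)
shannon = -p1 * math.log2(p1) - (1 - p1) * math.log2(1 - p1)
print(p1, shannon)  # bias near 0.5, Shannon entropy near 1 bit per raw bit
```

    When the accumulated jitter per sample is large relative to the period, the sampled phase decorrelates quickly and the raw bits approach full entropy, which is what the multi-phase sampling in the design above is engineered to guarantee.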

  12. Optimization of rainfall networks using information entropy and temporal variability analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy can not only represent the uncertainty of the rainfall distribution but can also reflect the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability. Through an easy-to-implement greedy ranking algorithm based on a criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability in network evaluation. We propose a dynamic network evaluation framework that accounts for temporal variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). In this way, rainfall stations that are temporarily important or redundant can be identified, providing useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, during different periods (wet season or dry season) the optimal network from MIMR exhibits differences in entropy values, and the optimal network for the wet season tends to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions is recommended.
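
    A simplified greedy station ranking in the spirit of MIMR can be sketched as follows (a simplification: the exact MIMR objective trades joint information against an explicit redundancy term, whereas this sketch only maximizes joint entropy, i.e., new information given the stations already selected; station names and data are made up):

```python
import math
import random
from collections import Counter

def entropy(*cols):
    """Shannon entropy (bits) of the joint empirical distribution of one
    or more discretized series."""
    joint = list(zip(*cols))
    n = len(joint)
    return -sum((c / n) * math.log2(c / n) for c in Counter(joint).values())

def greedy_rank(stations):
    """Greedily rank stations: start from the highest-entropy station, then
    repeatedly add the station maximizing the joint entropy of the set."""
    names = list(stations)
    order = [max(names, key=lambda s: entropy(stations[s]))]
    names.remove(order[0])
    while names:
        best = max(names, key=lambda s: entropy(*[stations[t] for t in order], stations[s]))
        order.append(best)
        names.remove(best)
    return order

# Toy network: station B duplicates A exactly, C is independent, so the
# fully redundant twin should always be ranked last.
random.seed(3)
a = [random.randint(0, 3) for _ in range(5000)]
c = [random.randint(0, 3) for _ in range(5000)]
stations = {"A": a, "B": a[:], "C": c}
print(greedy_rank(stations))
```

    Re-running such a ranking over sliding windows with different starting days is the essence of the dynamic evaluation framework proposed above.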

  13. Ovarian Cancer Differential Interactome and Network Entropy Analysis Reveal New Candidate Biomarkers.

    PubMed

    Ayyildiz, Dilara; Gov, Esra; Sinha, Raghu; Arga, Kazim Yalcin

    2017-05-01

    Ovarian cancer is one of the most common cancers and has a high mortality rate due to insidious symptoms and lack of robust diagnostics. A hitherto understudied concept in cancer pathogenesis may offer new avenues for innovation in ovarian cancer biomarker development. Cancer cells are characterized by an increase in network entropy, and several studies have exploited this concept to identify disease-associated gene and protein modules. We report in this study the changes in protein-protein interactions (PPIs) in ovarian cancer within a differential network (interactome) analysis framework utilizing the entropy concept and gene expression data. A compendium of six transcriptome datasets that included 140 samples from laser microdissected epithelial cells of ovarian cancer patients and 51 samples from a healthy population was obtained from Gene Expression Omnibus, and the high confidence human protein interactome (31,465 interactions among 10,681 proteins) was used. The uncertainties of the up- or downregulation of PPIs in ovarian cancer were estimated through an entropy formulation utilizing the combined expression levels of genes, and the interacting protein pairs with minimum uncertainty were identified. We identified 105 proteins with differential PPI patterns scattered in 11 modules, each indicating significantly affected biological pathways in ovarian cancer such as DNA repair, cell proliferation-related mechanisms, nucleoplasmic translocation of estrogen receptor, extracellular matrix degradation, and inflammation response. In conclusion, we suggest several PPIs as biomarker candidates for ovarian cancer and discuss their future biological implications as potential molecular targets for pharmaceutical development as well. In addition, network entropy analysis is a concept that deserves greater research attention for diagnostic innovation in oncology and tumor pathogenesis.
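
    The abstract does not spell out the entropy formulation; one plausible reading, offered here purely as an illustration (the function, its inputs, and the thresholding rule are all assumptions), scores each interaction by how consistently its combined expression shifts in one direction across tumor samples:

```python
import math

def interaction_entropy(expr_a, expr_b, ref_a, ref_b):
    """Illustrative uncertainty score for a protein-protein interaction's
    direction of change: p is the fraction of tumor samples where the
    combined expression of the two partners exceeds the healthy-reference
    level, and H(p) is the binary Shannon entropy. Low entropy means the
    interaction is consistently up- or downregulated."""
    ref = sum(ref_a) / len(ref_a) + sum(ref_b) / len(ref_b)
    ups = sum(1 for a, b in zip(expr_a, expr_b) if a + b > ref)
    p = ups / len(expr_a)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Consistently upregulated pair -> zero uncertainty; mixed pair -> maximal.
consistent = interaction_entropy([5, 6, 7, 8], [5, 6, 7, 8], [1, 1], [1, 1])
mixed = interaction_entropy([0, 2, 0, 2], [0, 2, 0, 2], [1, 1], [1, 1])
print(consistent, mixed)  # 0.0 1.0
```

    Selecting the pairs with minimum entropy under a scheme of this kind corresponds to the "interacting protein pairs with minimum uncertainty" that the study reports as biomarker candidates.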

  14. Entanglement entropy between real and virtual particles in ϕ4 quantum field theory

    NASA Astrophysics Data System (ADS)

    Ardenghi, Juan Sebastián

    2015-04-01

    The aim of this work is to compute the entanglement entropy of real and virtual particles by rewriting the generating functional of ϕ4 theory as a mean value between states and observables defined through the correlation functions. Then the von Neumann definition of entropy can be applied to these quantum states and, in particular, to the partial traces taken over the internal or external degrees of freedom. This procedure can be done for each order in the perturbation expansion, showing that the entanglement entropy for real and virtual particles behaves as ln(m0). In particular, the entanglement entropy is computed at first order for the correlation function of two external points, showing that the mutual information is identical to the external entropy and that the conditional entropies are negative over the whole domain of m0. In turn, from the definition of the quantum states, it is possible to obtain general relations between total traces between different quantum states of a ϕ^r theory. Finally, the possibility of taking partial traces over external degrees of freedom is discussed, which implies the introduction of some observables that measure the space-time points where an interaction occurs.

  15. Studies of Entanglement Entropy, and Relativistic Fluids for Thermal Field Theories

    NASA Astrophysics Data System (ADS)

    Spillane, Michael

    In this dissertation we consider physical consequences of adding a finite temperature to quantum field theories. At small length scales entanglement is a critically important feature, so it is unsurprising that entanglement entropy and Renyi entropy are useful tools in studying quantum phase transitions and quantum information. In this thesis we consider the corrections to entanglement and Renyi entropies due to the addition of a finite temperature. More specifically, we investigate the entanglement entropy of a massive scalar field in 1+1 dimensions at nonzero temperature. In the small mass (m) and temperature (T) limit, we put upper and lower bounds on the two largest eigenvalues of the covariance matrix used to compute the entanglement entropy. We argue that the entanglement entropy has e^(-m/T) scaling in the limit T << m. Additionally, we calculate thermal corrections to Renyi entropies for free massless fermions on R × S^(d-1). By expanding the density matrix in a Boltzmann sum, the problem of finding the Renyi entropies can be mapped to the problem of calculating a two-point function on an n-sheeted cover of the sphere. We map the problem on the sphere to a conical region in Euclidean space and, by using the method of images, calculate the two-point function and recover the Renyi entropies. At large length scales hydrodynamics is a useful way to study quantum field theories. We review recent interest in the Riemann problem as a method for generating a non-equilibrium steady state. The initial conditions consist of a planar interface between two halves of a system held at different temperatures in a hydrodynamic regime. The resulting fluid flow contains a fixed-temperature region with a nonzero flux. We briefly discuss the effects of a conserved charge. Next we discuss deforming the relativistic equations with a nonlinear term and how that deformation affects the temperature and velocity in the region connecting the asymptotic fluids. Finally, we study properties of a non-equilibrium steady state generated when two heat baths are initially in contact with one another. The dynamics of the system in question are governed by holographic duality to a black hole. We discuss the "phase diagram" associated with the steady state of the dual, dynamical black hole and its relation to the fluid/gravity correspondence.

  16. Real topological entropy versus metric entropy for birational measure-preserving transformations

    NASA Astrophysics Data System (ADS)

    Abarenkova, N.; Anglès d'Auriac, J.-Ch.; Boukraa, S.; Maillard, J.-M.

    2000-10-01

    We consider a family of birational measure-preserving transformations of two complex variables, depending on one parameter, for which simple rational expressions for the dynamical zeta function have been conjectured, together with an equality between the topological entropy and the logarithm of the Arnold complexity (divided by the number of iterations). Similar results have been obtained for the adaptation of these two concepts to dynamical systems of real variables, leading to the introduction of a “real topological entropy” and a “real Arnold complexity”. Here we compare the Kolmogorov-Sinai metric entropy and this real Arnold complexity, or real topological entropy, on this particular example of a one-parameter-dependent birational transformation of two variables. More precisely, we analyze, using an infinite-precision calculation, the Lyapunov characteristic exponents for various values of the parameter of the birational transformation, in order to compare these results with the ones for the real Arnold complexity. We find a quite surprising result: for this very birational example and, in fact, for a large set of birational measure-preserving mappings generated by involutions, the Lyapunov characteristic exponents seem to be equal to zero or, at least, extremely small, for all the orbits we have considered and for all values of the parameter. Birational measure-preserving transformations generated by involutions could thus help to better understand the difference between the topological description and the probabilistic description of discrete dynamical systems. Many birational measure-preserving transformations generated by involutions seem to provide examples of discrete dynamical systems which can be topologically chaotic while being metrically almost quasi-periodic. Heuristically, this can be understood as a consequence of the fact that their orbits seem to form some kind of “transcendental foliation” of the two-dimensional space of variables.
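
    The Lyapunov-exponent computation at the heart of this comparison can be sketched with the standard tangent-map method; the example below uses the Chirikov standard map as a generic measure-preserving 2D map (not the paper's birational family, and in ordinary floating point rather than the infinite-precision arithmetic used above):

```python
import math

def largest_lyapunov(k, steps=100000):
    """Largest Lyapunov characteristic exponent of the Chirikov standard
    map: iterate a tangent vector along the orbit with the Jacobian,
    renormalizing at each step and averaging the log stretch factors."""
    x, p = 0.1, 0.2
    vx, vp = 1.0, 0.0
    total = 0.0
    for _ in range(steps):
        c = k * math.cos(x)  # Jacobian entry, evaluated at the old point
        # map on the torus: p' = p + k sin(x), x' = x + p'  (mod 2*pi)
        p = (p + k * math.sin(x)) % (2 * math.pi)
        x = (x + p) % (2 * math.pi)
        vx, vp = (1 + c) * vx + vp, c * vx + vp  # tangent map
        norm = math.hypot(vx, vp)
        total += math.log(norm)
        vx, vp = vx / norm, vp / norm
    return total / steps

print(largest_lyapunov(0.0))  # integrable case: exponent is zero
print(largest_lyapunov(5.0))  # strongly chaotic case: clearly positive
```

    For the birational maps above, the surprise is that this exponent comes out essentially zero even when the topological entropy is positive, which is exactly the topological-versus-metric gap the paper highlights.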

  17. Effects of heat sink and source and entropy generation on MHD mixed convection of a Cu-water nanofluid in a lid-driven square porous enclosure with partial slip

    NASA Astrophysics Data System (ADS)

    Chamkha, A. J.; Rashad, A. M.; Mansour, M. A.; Armaghani, T.; Ghalambaz, M.

    2017-05-01

    In this work, the effects of the presence of a heat sink and a heat source, their lengths and locations, and the entropy generation on MHD mixed convection flow and heat transfer in a porous enclosure filled with a Cu-water nanofluid in the presence of a partial slip effect are investigated numerically. Both lid-driven vertical walls of the cavity are thermally insulated and move with constant and equal speeds in their own plane, and the effect of partial slip is imposed on these walls. A segment of the bottom wall is considered as a heat source, while a heat sink is placed on the upper wall of the cavity. There are heated and cold parts placed on the bottom and upper walls, respectively, while the remaining parts are thermally insulated. Entropy generation and local heat transfer for different values of the governing parameters are presented in detail. It is found that the addition of nanoparticles decreases the convective heat transfer inside the porous cavity at all ranges of the heat sink and source lengths. The results for the effects of the magnetic field show that the average Nusselt number decreases considerably upon the enhancement of the Hartmann number. Also, adding nanoparticles to a pure fluid leads to increasing the entropy generation for all values of D when λl = -λr = 1.

  18. HMM for hyperspectral spectrum representation and classification with endmember entropy vectors

    NASA Astrophysics Data System (ADS)

    Arabi, Samir Y. W.; Fernandes, David; Pizarro, Marco A.

    2015-10-01

    Hyperspectral images, thanks to their good spectral resolution, are extensively used for classification, but their high number of bands demands higher transmission bandwidth, larger data storage capacity, and greater computational power in processing systems. This work presents a new methodology for hyperspectral data classification that can work with a reduced number of spectral bands and achieve good results, comparable with processing methods that require all hyperspectral bands. The proposed method is based on the Hidden Markov Model (HMM) associated with each Endmember (EM) of a scene and the conditional probabilities of each EM belonging to each other EM. The EM conditional probabilities are transformed into EM entropy vectors, and those vectors are used as reference vectors for the classes in the scene. The conditional probabilities of a spectrum to be classified are also transformed into a spectrum entropy vector, which is assigned to a given class by the minimum ED (Euclidean Distance) between it and the EM entropy vectors. The methodology was tested with good results using AVIRIS spectra of a scene with 13 EMs, considering the full 209 bands and reduced sets of 128, 64, and 32 bands. For the test area, it is shown that only 32 spectral bands can be used instead of the original 209, without significant loss in the classification process.
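
    The minimum-Euclidean-distance rule over entropy vectors can be sketched as below. This is an illustrative reading only: the exact probability-to-entropy transform is not specified in the abstract, and the class names and probability values are made up:

```python
import math

def entropy_transform(probs):
    """Map a vector of conditional probabilities to its per-component
    Shannon terms -p*log2(p) (one plausible 'entropy vector')."""
    return [-p * math.log2(p) if p > 0 else 0.0 for p in probs]

def classify(probs, em_refs):
    """Assign a spectrum to the endmember class whose reference entropy
    vector is nearest in Euclidean distance (the minimum-ED rule)."""
    v = entropy_transform(probs)
    return min(em_refs, key=lambda name: math.dist(v, em_refs[name]))

# Hypothetical two-endmember example.
refs = {"vegetation": entropy_transform([0.8, 0.1, 0.1]),
        "water": entropy_transform([0.1, 0.1, 0.8])}
print(classify([0.75, 0.15, 0.10], refs))  # vegetation
```

    In the actual method the reference vectors come from per-EM HMMs over many bands; the sketch only shows the final nearest-vector assignment step.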

  19. Temperature and composition dependence of short-range order and entropy, and statistics of bond length: the semiconductor alloy (GaN)(1-x)(ZnO)(x).

    PubMed

    Liu, Jian; Pedroza, Luana S; Misch, Carissa; Fernández-Serra, Maria V; Allen, Philip B

    2014-07-09

    We present total energy and force calculations for the (GaN)1-x(ZnO)x alloy. Site-occupancy configurations are generated from Monte Carlo (MC) simulations, on the basis of a cluster expansion model proposed in a previous study. Local atomic coordinate relaxations of surprisingly large magnitude are found via density-functional calculations using a 432-atom periodic supercell, for three representative configurations at x = 0.5. These are used to generate bond-length distributions. The configurationally averaged composition- and temperature-dependent short-range order (SRO) parameters of the alloys are discussed. The entropy is approximated in terms of pair distribution statistics and thus related to SRO parameters. This approximate entropy is compared with accurate numerical values from MC simulations. An empirical model for the dependence of the bond length on the local chemical environments is proposed.

  20. Near horizon symmetry and entropy formula for Kerr-Newman (A)dS black holes

    NASA Astrophysics Data System (ADS)

    Setare, Mohammad Reza; Adami, Hamed

    2018-04-01

    In this paper we provide the first non-trivial evidence for the universality of the entropy formula 4π J_0^+ J_0^- beyond pure Einstein gravity in four dimensions. We consider Einstein-Maxwell theory in the presence of a cosmological constant, then write the near horizon metric of the Kerr-Newman (A)dS black hole in the Gaussian null coordinate system. We impose near horizon fall-off conditions on the metric and the U(1) gauge field, and find the asymptotic combined symmetry generator, consisting of a diffeomorphism and a U(1) gauge transformation, that preserves these fall-off conditions. Consequently, we find supertranslation, superrotation and multiple-charge modes, and we show that the entropy formula holds for the Kerr-Newman (A)dS black hole. The superrotation modes suffer from a problem; by introducing a new combined symmetry generator, we cure that problem.

  1. Magnetorheological rotational flow with viscous dissipation

    NASA Astrophysics Data System (ADS)

    Ashrafi, Nariman

    2017-11-01

    Effects of a magnetic field and fluid nonlinearity are investigated for the rotational flow of a Carreau-type fluid with viscous dissipation taken into account. The governing motion and energy balance equations are coupled, adding complexity to an already highly correlated set of differential equations. The numerical solution is obtained for the narrow-gap limit and a steady-state base flow. The effect of the magnetic field on local entropy generation in steady two-dimensional laminar forced convection flow was investigated, with a focus on the entropy generation characteristics and their dependence on various dimensionless parameters. The effects of the Hartmann number, the Brinkman number, and the Deborah number on the stability of the flow were examined. The introduction of the magnetic field induces a resistive force acting in the direction opposite to the flow, decelerating the fluid motion while increasing the fluid temperature. Moreover, the total entropy generation number decreases as the Hartmann number and fluid elasticity increase, and increases with increasing Brinkman number.

  2. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
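The filtering view of MSE described above can be sketched in code. The piecewise-constant filter below is the ordinary coarse-graining step, and the entropy routine is a plain textbook sample entropy, not the authors' FME/PLFME implementation:

```python
import numpy as np

def coarse_grain(x, scale):
    """MSE's averaging step seen as a piecewise-constant filter:
    non-overlapping means of length `scale`."""
    x = np.asarray(x, float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Textbook sample entropy: -ln(A/B), where B counts template pairs of
    length m within tolerance r*std and A does the same for length m+1."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def pair_count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        return (d <= tol).sum() - len(templ)   # exclude self-matches
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Entropy of a test signal at a few scales (a piecewise-constant filter bank)
x = np.sin(np.linspace(0, 20, 300))
print([round(sample_entropy(coarse_grain(x, s)), 3) for s in (1, 2, 4)])
```

Replacing `coarse_grain` with a filter adapted to the series (e.g. a piecewise-linear one, as in PLFME) is the generalization the paper proposes.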

  3. Entropy information of heart rate variability and its power spectrum during day and night

    NASA Astrophysics Data System (ADS)

    Jin, Li; Jun, Wang

    2013-07-01

    Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and power spectral distribution. Moreover, the base-scale entropy changes reflect the corresponding changes in autonomic nerve outflow. With the suppression of vagal tone and the dominance of sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the higher base-scale entropy belongs to CHF subjects. With the decrease of sympathetic tone and respiratory sinus arrhythmia (RSA) becoming more pronounced with slower breathing during sleep, the base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups have the same diurnal/nocturnal trend as the CHF series. The fluctuation dynamics of the data in the three groups can be described as an "HF effect".

  4. Highly Entangled, Non-random Subspaces of Tensor Products from Quantum Groups

    NASA Astrophysics Data System (ADS)

    Brannan, Michael; Collins, Benoît

    2018-03-01

    In this paper we describe a class of highly entangled subspaces of a tensor product of finite-dimensional Hilbert spaces arising from the representation theory of free orthogonal quantum groups. We determine their largest singular values and obtain lower bounds for the minimum output entropy of the corresponding quantum channels. An application to the construction of d-positive maps on matrix algebras is also presented.

  5. Enthalpy-entropy compensation: the role of solvation.

    PubMed

    Dragan, Anatoliy I; Read, Christopher M; Crane-Robinson, Colyn

    2017-05-01

    Structural modifications to interacting systems frequently lead to changes in both the enthalpy (heat) and entropy of the process that compensate each other, so that the Gibbs free energy is little changed: a major barrier to the development of lead compounds in drug discovery. The conventional explanation for such enthalpy-entropy compensation (EEC) is that tighter contacts lead to a more negative enthalpy but increased molecular constraints, i.e., a compensating conformational entropy reduction. Changes in solvation can also contribute to EEC but this contribution is infrequently discussed. We review long-established and recent cases of EEC and conclude that the large fluctuations in enthalpy and entropy observed are too great to be a result of only conformational changes and must result, to a considerable degree, from variations in the amounts of water immobilized or released on forming complexes. Two systems exhibiting EEC show a correlation between calorimetric entropies and local mobilities, interpreted to mean conformational control of the binding entropy/free energy. However, a substantial contribution from solvation gives the same effect, as a consequence of a structural link between the amount of bound water and the protein flexibility. Only by assuming substantial changes in solvation-an intrinsically compensatory process-can a more complete understanding of EEC be obtained. Faced with such large, and compensating, changes in the enthalpies and entropies of binding, the best approach to engineering elevated affinities must be through the addition of ionic links, as they generate increased entropy without affecting the enthalpy.

  6. Heavy fields and gravity

    NASA Astrophysics Data System (ADS)

    Goon, Garrett

    2017-01-01

    We study the effects of heavy fields on 4D spacetimes with flat, de Sitter and anti-de Sitter asymptotics. At low energies, matter generates specific, calculable higher derivative corrections to the GR action which perturbatively alter the Schwarzschild-(A)dS family of solutions. The effects of massive scalars, Dirac spinors and gauge fields are each considered. The six-derivative operators they produce, such as ∼R³ terms, generate the leading corrections. The induced changes to horizon radii, Hawking temperatures and entropies are found. Modifications to the energy of large AdS black holes are derived by imposing the first law. An explicit demonstration of the replica trick is provided, as it is used to derive black hole and cosmological horizon entropies. Considering entropy bounds, it is found that scalars and fermions increase the entropy one can store inside a region bounded by a sphere of fixed size, whereas vectors, oddly, lead to a decrease. We also demonstrate, however, that many of the corrections fall below the resolving power of the effective field theory and are therefore untrustworthy. Defining properties of black holes, such as the horizon area and Hawking temperature, prove to be remarkably robust against higher derivative gravitational corrections.

  7. Entropy Analysis in Mixed Convection MHD flow of Nanofluid over a Non-linear Stretching Sheet

    NASA Astrophysics Data System (ADS)

    Matin, Meisam Habibi; Nobari, Mohammad Reza Heirani; Jahangiri, Pouyan

    This article deals with a numerical study of entropy analysis in mixed convection MHD flow of a nanofluid over a non-linear stretching sheet, taking into account the effects of viscous dissipation and a variable magnetic field. The nanofluid consists of SiO2 nanoparticles with pure water as the base fluid. To analyze the problem, the boundary layer equations are first transformed into non-linear ordinary differential equations using a similarity transformation. The resulting equations are then solved numerically using the Keller-Box scheme based on the implicit finite-difference method. The effects of different non-dimensional governing parameters, such as the magnetic parameter, the nanoparticle volume fraction, and the Nusselt, Richardson, Eckert, Hartmann, Brinkman, Reynolds and entropy generation numbers, are investigated in detail. The results indicate that adding nanoparticles to the base fluid reduces the shear forces and decreases the stretching sheet heat transfer coefficient. Also, decreasing the magnetic parameter and increasing the Eckert number improve the heat transfer rate. Furthermore, the surface acts as a strong source of irreversibility due to the higher entropy generation number near the surface.

  8. Causality, transfer entropy, and allosteric communication landscapes in proteins with harmonic interactions.

    PubMed

    Hacisuleyman, Aysima; Erman, Burak

    2017-06-01

    A fast and approximate method of generating allosteric communication landscapes in proteins is presented, using Schreiber's entropy transfer concept in combination with the Gaussian Network Model of proteins. Predictions of the model and the allosteric communication landscapes generated show that information transfer in proteins does not necessarily take place along a single path; rather, an ensemble of pathways is possible. The model emphasizes that knowledge of entropy alone is not sufficient for determining allosteric communication: additional information based on time-delayed correlations should be introduced, which leads to the presence of causality in proteins. The model provides a simple tool for mapping entropy sink-source relations onto pairs of residues. By this approach, residues that should be manipulated to control protein activity may be determined. This should be of great importance for allosteric drug design and for understanding the effects of mutations on function. The model is applied to determine allosteric communication in three proteins: Ubiquitin, Pyruvate Kinase, and the PDZ domain. Predictions are in agreement with molecular dynamics simulations and experimental evidence. Proteins 2017; 85:1056-1064. © 2017 Wiley Periodicals, Inc.
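Schreiber's transfer entropy has a simple closed form under a Gaussian (linear) assumption, which is in the spirit of the harmonic (Gaussian Network Model) setting used above, though not the authors' actual implementation. The toy example below shows the asymmetry that encodes causality: a driven variable receives more information than it sends:

```python
import numpy as np

def gaussian_te(x, y, lag=1):
    """Transfer entropy Y -> X for jointly Gaussian processes:
    0.5 * ln( Var(x_{t+1} | x_t) / Var(x_{t+1} | x_t, y_t) )."""
    xf, xp, yp = x[lag:], x[:-lag], y[:-lag]
    def resid_var(target, *preds):
        A = np.column_stack([np.ones_like(target)] + list(preds))
        beta, *_ = np.linalg.lstsq(A, target, rcond=None)
        return (target - A @ beta).var()
    return 0.5 * np.log(resid_var(xf, xp) / resid_var(xf, xp, yp))

rng = np.random.default_rng(0)
y = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(1, 5000):           # x is driven by past y
    x[t] = 0.5 * x[t-1] + 0.8 * y[t-1] + 0.1 * rng.standard_normal()
print(gaussian_te(x, y) > gaussian_te(y, x))  # True: y is the entropy source
```

The sign of the difference TE(Y→X) − TE(X→Y) plays the role of the entropy sink-source relation mentioned in the abstract.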

  9. Maximum nonlocality and minimum uncertainty using magic states

    NASA Astrophysics Data System (ADS)

    Howard, Mark

    2015-04-01

    We prove that magic states from the Clifford hierarchy give optimal solutions for tasks involving nonlocality and entropic uncertainty with respect to Pauli measurements. For both the nonlocality and uncertainty tasks, stabilizer states are the worst possible pure states, so our solutions have an operational interpretation as being highly nonstabilizer. The optimal strategy for a qudit version of the Clauser-Horne-Shimony-Holt game in prime dimensions is achieved by measuring maximally entangled states that are isomorphic to single-qudit magic states. These magic states have an appealingly simple form, and our proof shows that they are "balanced" with respect to all but one of the mutually unbiased stabilizer bases. Of all equatorial qudit states, magic states minimize the average entropic uncertainties for collision entropy and also, for small prime dimensions, min-entropy, a fact that may have implications for cryptography.

  10. Optimal behavior of viscoelastic flow at resonant frequencies.

    PubMed

    Lambert, A A; Ibáñez, G; Cuevas, S; del Río, J A

    2004-11-01

    The global entropy generation rate in the zero-mean oscillatory flow of a Maxwell fluid in a pipe is analyzed with the aim of determining its behavior at resonant flow conditions. This quantity is calculated explicitly using the analytic expression for the velocity field and assuming isothermal conditions. The global entropy generation rate shows well-defined peaks at the resonant frequencies where the flow displays maximum velocities. It was found that resonant frequencies can be considered optimal in the sense that they maximize the power transmitted to the pulsating flow at the expense of maximum dissipation.

  11. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and the physical and chemical interactions between groundwater and the porous media make solute transport in the medium more complicated still. An appropriate method to describe this complexity is essential when studying solute transport and transformation in porous media. Information entropy can measure uncertainty and disorder; we therefore used information entropy theory to investigate the complexity of solute transport in heterogeneous porous media and to explore the connection between the two. Based on Markov theory, two-dimensional stochastic fields of hydraulic conductivity (K) were generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increases with the complexity of the solute transport process. For the point source, the one-dimensional entropy of solute concentration first increased and then decreased along the X and Y directions. As time increased, the entropy peak value remained essentially unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of the solute concentration increase, which raises the second-order spatial moment and the two-dimensional entropy. The information entropy of line sources was higher than that of point sources, and the entropy for continuous input was higher than for instantaneous input. 
As the average lithofacies length increases, media continuity increases, the complexity of flow and solute transport weakens, and the corresponding information entropy decreases. Longitudinal macrodispersivity declined slightly at early times and then rose. The spatial and temporal distribution of the solute has a significant impact on the information entropy, which in turn reflects changes in the solute distribution. Information entropy thus appears to be a tool for characterizing the spatial and temporal complexity of solute migration, and provides a reference for future research.
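The entropy measure itself can be sketched simply: normalize the concentration field into a discrete probability distribution and apply Shannon's formula. This is a minimal illustration under that standard normalization, not necessarily the authors' exact estimator:

```python
import numpy as np

def concentration_entropy(c):
    """Shannon entropy of a solute plume: normalize the concentration field
    to a discrete probability distribution, then H = -sum(p * ln p)."""
    p = np.asarray(c, float).ravel()
    p = p / p.sum()
    p = p[p > 0]                     # 0 * ln(0) contributes nothing
    return float(-(p * np.log(p)).sum())

# A plume spread over more cells carries higher entropy (more disorder)
compact = np.array([0.0, 1.0, 0.0, 0.0])
spread  = np.array([0.25, 0.25, 0.25, 0.25])
print(concentration_entropy(compact), concentration_entropy(spread))  # 0.0 1.386...
```

The same function applied to a 2D concentration array gives the two-dimensional entropy discussed above.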

  12. Hidden disorder in the α '→δ transformation of Pu-1.9 at.% Ga

    DOE PAGES

    Jeffries, J. R.; Manley, M. E.; Wall, M. A.; ...

    2012-06-06

    Enthalpy and entropy are thermodynamic quantities critical to determining how and at what temperature a phase transition occurs. At a phase transition, the enthalpy and temperature-weighted entropy differences between two phases are equal (ΔH = TΔS), but there are materials where this balance has not been experimentally or theoretically realized, leading to the idea of hidden order and disorder. In a Pu-1.9 at. % Ga alloy, the δ phase is retained as a metastable state at room temperature, but at low temperatures, the δ phase yields to a mixed-phase microstructure of δ- and α'-Pu. The previously measured sources of entropy associated with the α'→δ transformation fail to sum to the entropy predicted theoretically. We report an experimental measurement of the entropy of the α'→δ transformation that corroborates the theoretical prediction, and implies that only about 65% of the entropy stabilizing the δ phase is accounted for, leaving a missing entropy of about 0.5 k_B/atom. Some previously proposed mechanisms for generating entropy are discussed, but none seem capable of providing the necessary disorder to stabilize the δ phase. This hidden disorder represents multiple accessible states per atom within the δ phase of Pu that may not be included in our current understanding of the properties and phase stability of δ-Pu.

  13. Thermodynamics of a third-generation poly(phenylene-pyridyl) dendron decorated with dodecyl groups in the range of T → 0 to 480 K

    NASA Astrophysics Data System (ADS)

    Smirnova, N. N.; Markin, A. V.; Tsvetkova, L. Ya.; Kuchkina, N. V.; Yuzik-Klimova, E. Yu.; Shifrina, Z. B.

    2016-05-01

    The heat capacity of a glassy third-generation poly(phenylene-pyridyl) dendron decorated with dodecyl groups is studied for the first time via high-precision adiabatic vacuum and differential scanning calorimetry in the temperature range of 6 to 520 K. The standard thermodynamic functions (molar heat capacity C_p°, enthalpy H°(T), entropy S°(T), and Gibbs energy G°(T) − H°(0)) in the range of T → 0 to 480 K, and the entropy of formation at 298.15 K, are calculated on the basis of the obtained data. The thermodynamic properties of the dendron and the corresponding third-generation poly(phenylene-pyridyl) dendrimer studied earlier are compared.

  14. Entanglement entropy between virtual and real excitations in quantum electrodynamics

    NASA Astrophysics Data System (ADS)

    Ardenghi, Juan Sebastián

    2018-05-01

    The aim of this work is to introduce the entanglement entropy of real and virtual excitations of fermion and photon fields. By rewriting the generating functional of quantum electrodynamics theory as an inner product between quantum operators, it is possible to obtain quantum density operators representing the propagation of real and virtual particles. These operators are partial traces, where the degrees of freedom traced out are unobserved excitations. Then the von Neumann definition of entropy can be applied to these quantum operators and in particular, for the partial traces taken over by the internal or external degrees of freedom. A universal behavior is obtained for the entanglement entropy for different quantum fields at zeroth order in the coupling constant. In order to obtain numerical results at different orders in the perturbation expansion, the Bloch-Nordsieck model is considered, where it is shown that for some particular values of the electric charge, the von Neumann entropy increases or decreases with respect to the noninteracting case.

  15. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
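As a concrete example of a randomness-extraction scheme of the kind reviewed, here is the classic von Neumann extractor. It is a textbook method and not necessarily one of the schemes analyzed in the paper, but it illustrates the trade-off the abstract mentions between extraction efficiency and robustness:

```python
def von_neumann_extract(bits):
    """Classic von Neumann extractor: map 01 -> 0, 10 -> 1, drop 00 and 11.
    Removes bias from independent-but-biased raw bits at the cost of rate
    (at most 1 output bit per 4 input bits for a fair source)."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann_extract([0, 1, 1, 1, 1, 0, 0, 0]))  # [0, 1]
```

Extractors applied to photon arrival times typically compare successive time intervals in an analogous pairwise fashion.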

  16. Investigating weaknesses in Android certificate security

    NASA Astrophysics Data System (ADS)

    Krych, Daniel E.; Lange-Maney, Stephen; McDaniel, Patrick; Glodek, William

    2015-05-01

    Android's application market relies on secure certificate generation to establish trust between applications and their users; yet cryptography is often not a priority for application developers, and many fail to take the necessary security precautions. Indeed, there is cause for concern: several recent high-profile studies have observed a pervasive lack of entropy on Web systems leading to the factorization of private keys.1 Sufficient entropy, or randomness, is essential to generate secure key pairs and combat predictable key generation. In this paper, we analyze the security of Android certificates. We investigate the entropy present in 550,000 Android application certificates using the Quasilinear GCD finding algorithm.1 Our results show that while the lack of entropy does not appear to be as ubiquitous in the mobile markets as on Web systems, there is substantial reuse of certificates: only one third of the certificates in our dataset were unique. In other words, we find that organizations frequently reuse certificates for different applications. While such a practice is acceptable under Google's specifications for a single developer, we find that in some cases the same certificates are used by a myriad of developers, potentially compromising Android's intended trust relationships. Further, we observed duplicate certificates being used by both malicious and non-malicious applications. The top 3 repeated certificates present in our dataset accounted for a total of 11,438 separate APKs. Of these applications, 451, or roughly 4%, were identified as malicious by antivirus services.
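The GCD-based key analysis mentioned above rests on a simple fact: two RSA moduli generated with insufficient entropy may share a prime factor, which a GCD immediately exposes. A naive pairwise scan sketches the idea (the paper uses a quasilinear batch-GCD variant; the toy moduli below are illustrative, not real keys):

```python
from math import gcd

def shared_factors(moduli):
    """Pairwise GCD scan: any gcd > 1 between two RSA moduli reveals a
    shared prime factor, i.e. keys generated with insufficient entropy."""
    hits = []
    for i in range(len(moduli)):
        for j in range(i + 1, len(moduli)):
            g = gcd(moduli[i], moduli[j])
            if g > 1:
                hits.append((i, j, g))
    return hits

# Toy moduli sharing the prime 101 (real keys use ~1024-bit primes)
n1, n2, n3 = 101 * 103, 101 * 107, 109 * 113
print(shared_factors([n1, n2, n3]))  # [(0, 1, 101)]
```

Once a shared factor is known, both moduli factor completely and the corresponding private keys are compromised.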

  17. Dissipated energy and entropy production for an unconventional heat engine: the stepwise `circular cycle'

    NASA Astrophysics Data System (ADS)

    di Liberto, Francesco; Pastore, Raffaele; Peruggi, Fulvio

    2011-05-01

    When some entropy is transferred, by means of a reversible engine, from a hot heat source to a colder one, the maximum efficiency occurs, i.e. the maximum available work is obtained. Similarly, a reversible heat pump transfers entropy from a cold heat source to a hotter one with the minimum expense of energy. In contrast, if we are faced with non-reversible devices, there is some lost work for heat engines, and some extra work for heat pumps. These quantities are both related to entropy production. The lost work is also called 'degraded energy' or 'energy unavailable to do work'. The extra work is the excess of work performed on the system in the irreversible process with respect to the reversible one (or the excess of heat given to the hotter source in the irreversible process). Both quantities are analysed in detail and are evaluated for a complex process, the stepwise circular cycle, which is similar to the stepwise Carnot cycle. The stepwise circular cycle is a cycle performed by means of N small weights, dw, which are first added to and then removed from the piston of the vessel containing the gas, or vice versa. The work performed by the gas can be found as the increase of the potential energy of the dw's. Each single dw is identified and its increase in potential energy evaluated; in this way one finds how the energy output of the cycle is distributed among the dw's. The size of the dw's affects the entropy production and therefore the lost and extra work. The distribution of the increases depends on the chosen removal process.
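The effect of the size of the dw's on the lost work can be illustrated with the simplest analogue, a stepwise isothermal ideal-gas expansion against a sequence of constant external pressures. This is a sketch of the general idea, not the paper's circular cycle:

```python
import numpy as np

def stepwise_work(v0, v1, n_steps, nRT=1.0):
    """Isothermal ideal-gas expansion from v0 to v1 in n_steps equal-volume
    steps, each against the constant pressure reached at the end of the step
    (removing one weight per step). Finer steps bring the extracted work
    closer to the reversible bound nRT * ln(v1 / v0)."""
    v = np.linspace(v0, v1, n_steps + 1)
    p_ext = nRT / v[1:]                  # external pressure after each weight removal
    return float((p_ext * np.diff(v)).sum())

w_rev = np.log(2.0)                      # reversible work for v0=1 -> v1=2, nRT=1
lost = [w_rev - stepwise_work(1.0, 2.0, n) for n in (1, 10, 100)]
print(lost)                              # lost work shrinks as the weights get smaller
```

The lost work (degraded energy) is strictly positive for any finite number of weights and vanishes only in the quasi-static limit of infinitely many infinitesimal dw's.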

  18. Irreversibility and entropy production in transport phenomena, IV: Symmetry, integrated intermediate processes and separated variational principles for multi-currents

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-10-01

    The mechanism of entropy production in transport phenomena is discussed again by emphasizing the role of symmetry of non-equilibrium states, and by reformulating Einstein's theory of Brownian motion to derive entropy production from it. This yields conceptual reviews of the previous papers [M. Suzuki, Physica A 390 (2011) 1904; 391 (2012) 1074; 392 (2013) 314]. Separated variational principles of steady states for multiple external fields {X_i} and induced currents {J_i} are proposed by extending the principle of minimum integrated entropy production found by the present author for a single external field. The basic strategy of our theory of steady states is to take in all the intermediate processes from the equilibrium state to the final possible steady states, in order to study the irreversible physics even in the steady states. As an application of this principle, the Glansdorff-Prigogine evolution criterion inequality (or stability condition) d_X P ≡ ∫ dr Σ_i J_i dX_i ≤ 0 is derived in the stronger form dQ_i ≡ ∫ dr J_i dX_i ≤ 0 for each individual force X_i and current J_i, even for nonlinear responses which depend nonlinearly on all the external forces {X_k}. This is called the "separated evolution criterion". Some explicit demonstrations of the present general theory for simple electric circuits with multiple external fields are given, in order to clarify the physical essence of the new theory and to identify the condition of its validity concerning the existence of solutions of the simultaneous equations obtained from the separated variational principles. It is also instructive to compare the two results obtained by the new variational theory and by the old scheme based on the instantaneous entropy production. This seems suggestive even for the energy problem in the world.

  19. Entropy generation minimization for the sloshing phenomenon in half-full elliptical storage tanks

    NASA Astrophysics Data System (ADS)

    Saghi, Hassan

    2018-02-01

    In this paper, the entropy generation in the sloshing phenomenon in elliptical storage tanks was obtained and the optimum tank geometry was suggested. To do this, a numerical model was developed to simulate the sloshing phenomenon using a coupled Reynolds-Averaged Navier-Stokes (RANS) solver and the Volume-of-Fluid (VOF) method. The RANS equations were discretized and solved using the staggered grid finite difference and SMAC methods, and available data were used for model validation. Several parameters, consisting of the maximum free surface displacement (MFSD), the maximum horizontal force exerted on the tank perimeter (MHF), the tank perimeter (TP), and the total entropy generation (Sgen), were introduced as design criteria for elliptical storage tanks. The entropy generation distribution provides designers with useful information about the causes of energy loss. Horizontal periodic sway motions of the form X = a_m sin(ωt) were applied to elliptical storage tanks with different aspect ratios AR, defined as the ratio of the large to the small diameter of the elliptical tank, and the effects of a_m and ω on the results were studied. The results show that the relation between MFSD and MHF is almost linear with respect to the sway motion amplitude, and that an increase in the AR causes a decrease in the MFSD and MHF. With respect to the sway motion angular frequency the relation between MFSD and MHF is nonlinear, but an increase in the AR makes this relation linear. In addition, MFSD and MHF were minimized for a sway motion with a 7 rad/s angular frequency. Finally, the results show that an elliptical storage tank with AR = 1.2-1.4 is the optimum section.

  20. Detection of direct and indirect noise generated by synthetic hot spots in a duct

    NASA Astrophysics Data System (ADS)

    De Domenico, Francesca; Rolland, Erwan O.; Hochgreb, Simone

    2017-04-01

    Sound waves in a combustor are generated by fluctuations in the heat release rate (direct noise) or by the acceleration of entropy, vorticity or compositional perturbations through nozzles or turbine guide vanes (indirect or entropy noise). These sound waves are transmitted downstream as well as reflected upstream of the acceleration point, contributing to the overall noise emissions or triggering combustion instabilities. Previous experiments attempted to isolate indirect noise by generating thermoacoustic hot spots electrically and measuring the transmitted acoustic waves, yet there are no measurements of the backward-propagating entropy and acoustic waves. This work presents the first measurements that clearly separate the direct and indirect noise contributions to pressure fluctuations upstream of the acceleration point. Synthetic entropy spots are produced by unsteady electrical heating of a grid of thin wires located in a tube. Compression waves (direct noise) are generated by this heating process. The hot spots are then advected with the mean flow and finally accelerated through an orifice plate located at the end of the tube, producing a strong acoustic signature which propagates upstream (indirect noise). The convective time is selected to be longer than the heating pulse length, in order to obtain a clear time separation between direct and indirect noise in the overall pressure trace. The contribution of indirect noise to the overall noise is shown to be non-negligible in both subsonic and sonic throat conditions. However, the absolute amplitude of direct noise is larger than the corresponding fraction of indirect noise, explaining the difficulty in clearly identifying the two contributions when they are merged. 
Further, the work shows the importance of using appropriate pressure transducer instrumentation and correcting for the respective transfer functions in order to account for low frequency effects in the determination of pressure fluctuations.

  1. Extension of Murray's law using a non-Newtonian model of blood flow.

    PubMed

    Revellin, Rémi; Rousset, François; Baud, David; Bonjour, Jocelyn

    2009-05-15

    So far, none of the existing treatments of Murray's law deal with the non-Newtonian behavior of blood flow, although the non-Newtonian approach to blood flow modelling is more accurate. MODELING: In the present paper, Murray's law, which applies to an arterial bifurcation, is generalized to a non-Newtonian blood flow model (power-law model). When the vessel size reaches the capillary limit, blood can be modeled using a non-Newtonian constitutive equation. Two different constraints are assumed in addition to the pumping power: the volume constraint or the surface constraint (related to the internal surface of the vessel). For the sake of generality, the relationships are given for an arbitrary number of daughter vessels. It is shown that for a cost function including the volume constraint, the classical Murray's law remains valid (i.e. Σ R^c = const with c = 3 is verified and is independent of n, the dimensionless index in the viscosity equation; R being the radius of the vessel). In contrast, for a cost function including the surface constraint, different values of c may be obtained depending on the value of n. We find that for blood, c varies from 2.42 to 3 depending on the constraint and the fluid properties. For the Newtonian model, the surface constraint leads to c = 2.5. The cost function (based on the surface constraint) can be related to entropy generation by dividing it by the temperature. It is demonstrated that the entropy generated in all the daughter vessels is greater than the entropy generated in the parent vessel. Furthermore, it is shown that the difference in entropy generation between the parent and daughter vessels is smaller for a non-Newtonian fluid than for a Newtonian fluid.
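The generalized Murray relation Σ R^c = const can be sketched numerically; c = 3 recovers the classical law, while the surface-constraint analysis above gives c between 2.42 and 3. The function name is ours, purely for illustration:

```python
def murray_parent_radius(daughter_radii, c=3.0):
    """Murray's law with exponent c: R_parent^c = sum of R_daughter^c.
    c = 3 is the classical (volume-constraint) case; the non-Newtonian
    surface-constraint analysis yields c in roughly [2.42, 3] for blood."""
    return sum(r ** c for r in daughter_radii) ** (1.0 / c)

# Symmetric bifurcation with c = 3: the parent radius is 2^(1/3) times
# each daughter radius, the familiar ~1.26 scaling factor
r_parent = murray_parent_radius([1.0, 1.0])
print(round(r_parent, 4))  # 1.2599
```

Lowering c toward 2.42 (the surface-constrained, strongly non-Newtonian case) makes the parent vessel relatively smaller for the same daughters.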

  2. Self-growing neural network architecture using crisp and fuzzy entropy

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.

    1992-01-01

    The paper briefly describes the self-growing neural network algorithm, CID3, which makes decision trees equivalent to hidden layers of a neural network. The algorithm generates a feedforward architecture using crisp and fuzzy entropy measures. Results are shown and discussed for a real-life recognition problem of distinguishing defects in a glass ribbon and for the benchmark problem of telling two spirals apart.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khosla, D.; Singh, M.

    The estimation of three-dimensional dipole current sources on the cortical surface from the measured magnetoencephalogram (MEG) is a highly underdetermined inverse problem, as there are many "feasible" images consistent with the MEG data. Previous approaches to this problem have concentrated on weighted minimum-norm inverse methods. While these methods ensure a unique solution, they often produce overly smoothed solutions and exhibit severe sensitivity to noise. In this paper we explore the maximum entropy approach to obtain better solutions. This estimation technique selects, from the set of feasible images, the image with the maximum entropy permitted by the information available to us. To account for the presence of noise in the data, we have also incorporated a noise rejection or likelihood term into our maximum entropy method, which makes our approach mirror a Bayesian maximum a posteriori (MAP) formulation. Additional information from other functional techniques, such as functional magnetic resonance imaging (fMRI), can be incorporated into the proposed method as a prior bias function to improve solutions. We demonstrate the method with experimental phantom data from a clinical 122-channel MEG system.

  5. Spectral Entropy Can Predict Changes of Working Memory Performance Reduced by Short-Time Training in the Delayed-Match-to-Sample Task

    PubMed Central

    Tian, Yin; Zhang, Huiling; Xu, Wei; Zhang, Haiyong; Yang, Li; Zheng, Shuxing; Shi, Yupan

    2017-01-01

    Spectral entropy, obtained by applying the Shannon entropy concept to the power distribution of the Fourier-transformed electroencephalograph (EEG), was utilized to measure the uniformity of the power spectral density underlying the EEG when subjects performed working memory tasks twice, i.e., before and after training. According to Signed Residual Time (SRT) scores, which trade off response speed against accuracy, 20 subjects were divided into two groups, namely high-performance and low-performance groups, to undertake working memory (WM) tasks. We found that spectral entropy derived from the retention period of WM on channel FC4 exhibited a high correlation with SRT scores. To this end, spectral entropy was used in a support vector machine classifier with linear kernel to differentiate the two groups. Receiver operating characteristic analysis and leave-one-out cross-validation (LOOCV) demonstrated that the averaged classification accuracy (CA) was 90.0 and 92.5% for intra-session and inter-session, respectively, indicating that spectral entropy could successfully distinguish the two WM performance groups. Furthermore, a support vector regression prediction model with radial basis function kernel, evaluated by the root-mean-square error of prediction, showed that spectral entropy could be utilized to predict SRT scores of individual WM performance. After testing the changes in SRT scores and spectral entropy for each subject following short-time training, we found that the SRT scores of 16 of 20 subjects clearly improved after training, and that for 15 of 20 subjects the SRT scores changed consistently with spectral entropy before and after training. The findings reveal that spectral entropy could be a promising indicator for predicting an individual's training-induced WM changes and further suggest a novel WM application for brain-computer interfaces. PMID:28912701
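
    The spectral entropy described here, i.e., Shannon entropy applied to the normalized power spectral density, can be sketched as follows (an illustrative implementation, not the authors' exact pipeline):

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized power spectral density of x."""
    psd = np.abs(np.fft.rfft(x)) ** 2   # power distribution over frequency
    p = psd / psd.sum()                 # normalize to a probability distribution
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A narrowband signal concentrates spectral power, so its entropy is low;
# broadband noise spreads power across frequencies, so its entropy is high.
fs = 256
t = np.arange(fs) / fs
sine = np.sin(2 * np.pi * 8 * t)
noise = np.random.default_rng(0).standard_normal(fs)
assert spectral_entropy(sine) < spectral_entropy(noise)
```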

  6. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
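
    Under the joint-Gaussian assumption, the entropy of a set of stations is monotone in the log-determinant of their covariance submatrix, so the design idea can be sketched greedily (hypothetical code, far simpler than the paper's hierarchical Bayesian framework):

```python
import numpy as np

def greedy_max_entropy_stations(cov, k):
    """Greedily retain k stations maximizing the joint Gaussian entropy,
    i.e. the log-determinant of the selected covariance submatrix."""
    chosen, remaining = [], list(range(cov.shape[0]))
    for _ in range(k):
        def gain(j):
            idx = np.ix_(chosen + [j], chosen + [j])
            return np.linalg.slogdet(cov[idx])[1]
        best = max(remaining, key=gain)
        chosen.append(best)
        remaining.remove(best)
    return chosen

# The high-variance, non-redundant station is picked first
cov = np.array([[1.0, 0.9, 0.0],
                [0.9, 1.0, 0.0],
                [0.0, 0.0, 4.0]])
assert greedy_max_entropy_stations(cov, 1) == [2]
```

    Stations highly correlated with ones already retained add little log-determinant, which is exactly the redundancy-reduction behavior the abstract describes.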

  7. Measurement-induced randomness and state-merging

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Indranil; Deshpande, Abhishek; Chatterjee, Sourav

    In this work we introduce a notion of randomness that is truly quantum mechanical in nature, arising from the act of measurement. For a composite classical system, the joint entropy quantifies the randomness present in the total system, and it equals the sum of the entropy of one subsystem and the conditional entropy of the other subsystem given the first. The same relation carries over to the quantum setting when the Shannon entropy is replaced by the von Neumann entropy. However, if we replace the conditional von Neumann entropy by the average conditional entropy due to measurement, the result differs from the joint entropy of the system. We call this difference Measurement-Induced Randomness (MIR) and argue that it is unique to quantum mechanical systems, with no classical counterpart. In other words, the joint von Neumann entropy captures only the randomness arising from the heterogeneity of the mixture, and we show that it is not the total randomness that can be generated in the composite system. We generalize this quantity to N-qubit systems and show that it reduces to quantum discord for two-qubit systems. Further, we show that it is exactly equal to the change in the cost of quantum state merging caused by the measurement. We argue that for quantum information processing tasks like state merging, the change in cost resulting from discarding prior information can also be viewed as randomness generated by measurement.

  8. Potential of mean force between two hydrophobic solutes in water.

    PubMed

    Southall, Noel T; Dill, Ken A

    2002-12-10

    We study the potential of mean force between two nonpolar solutes in the Mercedes Benz model of water. Using NPT Monte Carlo simulations, we find that the solute size determines the relative preference of two solute molecules to come into contact ('contact minimum') or to be separated by a single layer of water ('solvent-separated minimum'). Larger solutes more strongly prefer the contacting state, while smaller solutes have more tendency to become solvent-separated, particularly in cold water. The thermal driving forces oscillate with solute separation. Contacts are stabilized by entropy, whereas solvent-separated solute pairing is stabilized by enthalpy. The free energy of interaction for small solutes is well-approximated by scaled-particle theory. Copyright 2002 Elsevier Science B.V.

  9. Sample entropy applied to the analysis of synthetic time series and tachograms

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.

    2017-01-01

    Entropy is a non-linear analysis method that estimates the irregularity of a system. Several computational entropy variants were considered and tested in order to select one that yields an index of signal complexity while taking into account the length of the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm generating fractal time series with a prescribed spectral exponent β was used to characterize the different entropy algorithms. Most of the algorithms showed a significant dependence on series length, which would be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which is largely independent of series length. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and of patients with congestive heart failure were analysed. Sample entropy was calculated for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patients are asleep.
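
    A minimal sample entropy implementation in the spirit described above (a sketch using the customary defaults m = 2 and r = 0.2·std, not the authors' code):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -log of the ratio of (m+1)-point to m-point template
    matches within tolerance r * std(x), excluding self-matches."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        total = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            total += int(np.sum(d <= tol))
        return total

    a, b = matches(m + 1), matches(m)
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")

# A regular (periodic) series is more predictable than noise,
# so its sample entropy is lower.
rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
assert sample_entropy(regular) < sample_entropy(rng.standard_normal(400))
```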

  10. An entropy-assisted musculoskeletal shoulder model.

    PubMed

    Xu, Xu; Lin, Jia-Hua; McGorry, Raymond W

    2017-04-01

    Optimization combined with a musculoskeletal shoulder model has been used to estimate the mechanical loading of musculoskeletal elements around the shoulder. Traditionally, the objective function minimizes the summation of the total muscle activities subject to force, moment, and stability constraints. Such an objective function, however, tends to neglect antagonist muscle co-contraction. In this study, an objective function including an entropy term is proposed to address muscle co-contraction, and a musculoskeletal shoulder model is developed to apply it. To find the optimal weight for the entropy term, an experiment was conducted in which participants generated various 3-D shoulder moments in six shoulder postures. The surface EMG of 8 shoulder muscles was measured and compared with the muscle activities predicted by the proposed objective function, using the Bhattacharyya distance and concordance ratio under different weights of the entropy term. The results show that a small weight on the entropy term can improve the predictability of the model in terms of muscle activities. This result suggests that the concept of entropy could be helpful for further understanding the mechanism of muscle co-contraction as well as for developing a shoulder biomechanical model with greater validity. Copyright © 2017 Elsevier Ltd. All rights reserved.
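
    The idea of augmenting a load-sharing objective with an entropy term can be sketched as below. The moment arms, objective form, and weight `w` are all illustrative assumptions, not the paper's model:

```python
import numpy as np
from scipy.optimize import minimize

def solve_activations(moment_arms, target_moment, w=0.05):
    """Minimize sum(a^2) - w * H(a) subject to moment equilibrium, 0 <= a <= 1.
    H(a) is the Shannon entropy of the activation shares; a small w > 0
    rewards spreading load across muscles (co-contraction-like behavior)."""
    def cost(a):
        p = a / (a.sum() + 1e-12)
        h = -np.sum(p * np.log(p + 1e-12))
        return np.sum(a ** 2) - w * h

    cons = {"type": "eq", "fun": lambda a: moment_arms @ a - target_moment}
    res = minimize(cost, np.full(len(moment_arms), 0.5),
                   bounds=[(0.0, 1.0)] * len(moment_arms), constraints=cons)
    return res.x

arms = np.array([1.0, 1.5, 2.0])     # hypothetical moment arms
act = solve_activations(arms, 1.0)
assert abs(arms @ act - 1.0) < 1e-4  # equilibrium constraint is met
```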

  11. Generation of skeletal mechanism by means of projected entropy participation indices

    NASA Astrophysics Data System (ADS)

    Paolucci, Samuel; Valorani, Mauro; Ciottoli, Pietro Paolo; Galassi, Riccardo Malpica

    2017-11-01

    When the dynamics of reactive systems develop very-slow and very-fast time scales separated by a range of active time scales, with gaps in the fast/active and slow/active time scales, then it is possible to achieve multi-scale adaptive model reduction along with the integration of the ODEs using the G-Scheme. The scheme assumes that the dynamics is decomposed into active, slow, fast, and invariant subspaces. We derive expressions that establish a direct link between time scales and entropy production by using estimates provided by the G-Scheme. To calculate the contribution to entropy production, we resort to a standard model of a constant-pressure, adiabatic batch reactor, where the mixture temperature of the reactants is initially set above the auto-ignition temperature. Numerical experiments show that the contribution to entropy production of the fast subspace is of the same magnitude as the error threshold chosen for the identification of the decomposition of the tangent space, and the contribution of the slow subspace is generally much smaller than that of the active subspace. The information on entropy production associated with reactions within each subspace is used to define an entropy participation index that is subsequently utilized for model reduction.

  12. Noise, chaos, and (ɛ, τ)-entropy per unit time

    NASA Astrophysics Data System (ADS)

    Gaspard, Pierre; Wang, Xiao-Jing

    1993-12-01

    The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per unit time, at different scales τ of time and ε of the observables. This quantity generalizes the Kolmogorov-Sinai entropy per unit time from deterministic chaotic processes, to stochastic processes such as fluctuations in mesoscopic physico-chemical phenomena or strong turbulence in macroscopic spacetime dynamics. The random processes that are characterized include chaotic systems, Bernoulli and Markov chains, Poisson and birth-and-death processes, Ornstein-Uhlenbeck and Yaglom noises, fractional Brownian motions, different regimes of hydrodynamical turbulence, and the Lorentz-Boltzmann process of nonequilibrium statistical mechanics. We also extend the (ε, τ)-entropy to spacetime processes like cellular automata, Conway's game of life, lattice gas automata, coupled maps, spacetime chaos in partial differential equations, as well as the ideal, the Lorentz, and the hard sphere gases. Through these examples it is demonstrated that the (ε, τ)-entropy provides a unified quantitative measure of dynamical randomness to both chaos and noises, and a method to detect transitions between dynamical states of different degrees of randomness as a parameter of the system is varied.

  13. Entropy generation and momentum transfer in the superconductor-normal and normal-superconductor phase transformations and the consistency of the conventional theory of superconductivity

    NASA Astrophysics Data System (ADS)

    Hirsch, J. E.

    2018-05-01

    Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.

  14. Geometric optimization of an active magnetic regenerative refrigerator via second-law analysis

    NASA Astrophysics Data System (ADS)

    Li, Peng; Gong, Maoqiong; Wu, Jianfeng

    2008-11-01

    Previous analyses [Z. Yan and J. Chen, J. Appl. Phys. 72, 1 (1992); J. Chen and Z. Yan, ibid., 84, 1791 (1998); Lin et al., Physica B 344, 147 (2004); Yang et al., ibid., 364, 33 (2005); Xia et al., ibid., 381, 246 (2006).] of irreversibilities in magnetic refrigerators overlooked several important losses that could be dominant in a real active magnetic regenerative refrigerator (AMRR). No quantitative expressions have yet been provided to estimate the corresponding entropy generation in real AMRRs, and important geometric parameters of AMRRs, such as the aspect ratio of the active magnetic regenerator and the refrigerant diameter, are still chosen arbitrarily. Expressions for calculating the different types of entropy generation in the AMRR were derived and used to optimize the aspect ratio and the refrigerant diameter. An optimal coefficient of performance (15.54) was achieved at an aspect ratio of 6.39 and a refrigerant diameter of 1.1 mm for our current system. Further study showed that the dissipative sources (e.g., the fluid friction and the unbalanced magnetic forces) in AMRRs, which were overlooked by previous investigations, could contribute significantly to entropy generation.

  15. Entropy generation analysis for film boiling: A simple model of quenching

    NASA Astrophysics Data System (ADS)

    Lotfi, Ali; Lakzian, Esmail

    2016-04-01

    In this paper, quenching in high-temperature materials processing is modeled as flow over a superheated isothermal flat plate. In such processes, a liquid flows over a highly superheated surface for cooling, and the surface and the liquid are separated by a vapor layer formed where the liquid contacts the superheated surface. This is known as forced film boiling. As an objective, the distribution of entropy generation in laminar forced film boiling is obtained by a similarity solution, for the first time for quenching processes. The governing partial differential equations of laminar film boiling, comprising continuity, momentum, and energy, are reduced to ordinary differential equations, and a dimensionless equation for entropy generation inside the liquid boundary layer and vapor layer is obtained. The ODEs are then solved by a fourth-order Runge-Kutta method with a shooting procedure. Moreover, the Bejan number is used as a design criterion for a qualitative study of the cooling rate, and the effects of plate speed on the quenching process are studied. It is observed that higher plate speeds yield a greater rate of cooling (heat transfer).
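
    The Runge-Kutta-with-shooting procedure can be illustrated on a toy boundary-value problem (not the film-boiling equations themselves): solve y'' = -y with y(0) = 0, y(1) = 1 by root-finding on the unknown initial slope.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def shoot(slope):
    """Integrate y'' = -y from x = 0 with y(0) = 0, y'(0) = slope; return y(1)."""
    sol = solve_ivp(lambda x, s: [s[1], -s[0]], (0.0, 1.0), [0.0, slope],
                    rtol=1e-10, atol=1e-12)
    return sol.y[0, -1]

# Adjust the missing initial condition until the far boundary y(1) = 1 is hit.
slope = brentq(lambda s: shoot(s) - 1.0, 0.1, 5.0)
assert abs(slope - 1.0 / np.sin(1.0)) < 1e-6  # exact solution: y = sin(x)/sin(1)
```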

  16. A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Yang, Yuantao; Li, Guoyan; Xu, Minqiang; Huang, Wenhu

    2017-07-01

    Health condition identification of planetary gearboxes is crucial to reduce the downtime and maximize productivity. This paper aims to develop a novel fault diagnosis method based on modified multi-scale symbolic dynamic entropy (MMSDE) and minimum redundancy maximum relevance (mRMR) to identify the different health conditions of planetary gearbox. MMSDE is proposed to quantify the regularity of time series, which can assess the dynamical characteristics over a range of scales. MMSDE has obvious advantages in the detection of dynamical changes and computation efficiency. Then, the mRMR approach is introduced to refine the fault features. Lastly, the obtained new features are fed into the least square support vector machine (LSSVM) to complete the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault types of planetary gearboxes.

  17. A measurement of disorder in binary sequences

    NASA Astrophysics Data System (ADS)

    Gong, Longyan; Wang, Haihong; Cheng, Weiwen; Zhao, Shengmei

    2015-03-01

    We propose a complex quantity, A_L, to characterize the degree of disorder of binary symbolic sequences of length L. As examples, we apply it to typical random and deterministic sequences: one kind of random sequence is generated from a periodic binary sequence and the other from the logistic map, while the deterministic sequences are the Fibonacci and Thue-Morse sequences. For these sequences, we find that the modulus of A_L, denoted |A_L|, is a (statistically) equivalent quantity to the Boltzmann entropy, the metric entropy, the conditional block entropy, and/or other quantities, so it is a useful quantitative measure of disorder. It can serve as a fruitful index for discerning which sequence is more disordered, and a single value of |A_L| characterizes the overall disorder. It requires extremely low computational cost and can easily be realized experimentally. For all these reasons, we believe the proposed measure of disorder is a valuable complement to existing ones for symbolic sequences.
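
    While A_L itself is specific to the paper, the block-entropy baseline it is compared against is easy to reproduce; a sketch comparing the deterministic Thue-Morse sequence with random bits:

```python
import numpy as np

def block_entropy(bits, k):
    """Shannon entropy (bits) of the empirical distribution of length-k blocks."""
    blocks = ["".join(map(str, bits[i:i + k])) for i in range(len(bits) - k + 1)]
    _, counts = np.unique(blocks, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def thue_morse(n):
    """First n Thue-Morse terms: parity of the number of 1-bits in the index."""
    return [bin(i).count("1") % 2 for i in range(n)]

# Thue-Morse admits only 10 of the 16 possible length-4 factors, so its
# block entropy sits below that of a random bit sequence.
rng = np.random.default_rng(0)
random_bits = rng.integers(0, 2, 4096).tolist()
assert block_entropy(thue_morse(4096), 4) < block_entropy(random_bits, 4)
```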

  18. Numerical study of entropy generation and melting heat transfer on MHD generalised non-Newtonian fluid (GNF): Application to optimal energy

    NASA Astrophysics Data System (ADS)

    Iqbal, Z.; Mehmood, Zaffar; Ahmad, Bilal

    2018-05-01

    This paper concerns an application to optimal energy by incorporating thermal equilibrium on MHD-generalised non-Newtonian fluid model with melting heat effect. Highly nonlinear system of partial differential equations is simplified to a nonlinear system using boundary layer approach and similarity transformations. Numerical solutions of velocity and temperature profile are obtained by using shooting method. The contribution of entropy generation is appraised on thermal and fluid velocities. Physical features of relevant parameters have been discussed by plotting graphs and tables. Some noteworthy findings are: Prandtl number, power law index and Weissenberg number contribute in lowering mass boundary layer thickness and entropy effect and enlarging thermal boundary layer thickness. However, an increasing mass boundary layer effect is only due to melting heat parameter. Moreover, thermal boundary layers have same trend for all parameters, i.e., temperature enhances with increase in values of significant parameters. Similarly, Hartman and Weissenberg numbers enhance Bejan number.

  19. On the design of a Hénon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements of a cryptosystem, and True Random Number Generators (TRNGs) are one approach to generating it. TRNG randomness sources divide into three main groups: electrical-noise based, jitter based and chaos based. Chaos-based generators use a non-linear dynamical system (continuous-time or discrete-time) as the entropy source. In this study, a new TRNG design based on a discrete-time chaotic system is proposed and simulated in LabVIEW. The principle of the design is to combine a 2D and a 1D chaotic system, with a mathematical model implemented for numerical simulation. A comparator process is used as the harvester to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy that passed all NIST SP 800-22 statistical tests.
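
    A drastically simplified software sketch of the idea: everything below, including the threshold-XOR harvester standing in for the paper's comparator, is an illustrative assumption, with classic parameter choices for the two maps:

```python
import numpy as np

def henon_logistic_bits(n, x=0.1, y=0.3, z=0.4):
    """Generate n bits by XOR-ing threshold bits from two independent
    chaotic maps (Henon with a=1.4, b=0.3; logistic with r=3.99)."""
    a, b, r = 1.4, 0.3, 3.99
    out = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x   # Henon map step
        z = r * z * (1.0 - z)               # logistic map step
        out.append((1 if x > 0.0 else 0) ^ (1 if z > 0.5 else 0))
    return out

bits = henon_logistic_bits(20000)
assert 0.4 < np.mean(bits) < 0.6  # roughly balanced without post-processing
```

    A real design would of course be validated with the NIST SP 800-22 suite rather than a simple balance check.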

  20. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    PubMed

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
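
    For Gaussians the quantities involved are closed-form, so the second-order MIE can be checked directly (a sketch of the comparison, with an arbitrary small covariance matrix as the example):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a multivariate Gaussian with covariance cov."""
    m = cov.shape[0]
    return 0.5 * (m * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def mie2_entropy(cov):
    """Second-order MIE: marginal entropies minus pairwise mutual informations,
    using the Gaussian identity I_ij = -0.5 * log(1 - rho_ij^2)."""
    sig = np.sqrt(np.diag(cov))
    h1 = np.sum(0.5 * np.log(2 * np.pi * np.e * sig ** 2))
    rho = cov / np.outer(sig, sig)
    i, j = np.triu_indices(cov.shape[0], k=1)
    return h1 + 0.5 * np.sum(np.log(1.0 - rho[i, j] ** 2))

# With weak, short-range correlations the second-order truncation is
# much closer to the exact entropy than the first-order (marginals-only) one.
cov = np.array([[1.0, 0.2, 0.0],
                [0.2, 1.0, 0.2],
                [0.0, 0.2, 1.0]])
exact = gaussian_entropy(cov)
h1 = np.sum(0.5 * np.log(2 * np.pi * np.e * np.diag(cov)))
assert abs(mie2_entropy(cov) - exact) < abs(h1 - exact)
```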

  1. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes.
The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  2. A method for the fast estimation of a battery entropy-variation high-resolution curve - Application on a commercial LiFePO4/graphite cell

    NASA Astrophysics Data System (ADS)

    Damay, Nicolas; Forgez, Christophe; Bichat, Marie-Pierre; Friedrich, Guy

    2016-11-01

    The entropy-variation of a battery is responsible for heat generation or consumption during operation and its prior measurement is mandatory for developing a thermal model. It is generally done through the potentiometric method which is considered as a reference. However, it requires several days or weeks to get a look-up table with a 5 or 10% SoC (State of Charge) resolution. In this study, a calorimetric method based on the inversion of a thermal model is proposed for the fast estimation of a nearly continuous curve of entropy-variation. This is achieved by separating the heats produced while charging and discharging the battery. The entropy-variation is then deduced from the extracted entropic heat. The proposed method is validated by comparing the results obtained with several current rates to measurements made with the potentiometric method.
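
    The entropic (reversible) heat underlying both the potentiometric and the calorimetric approach follows the standard relation Q_rev = I·T·(dOCV/dT); a minimal sketch (the sign convention and parameter names are assumptions, and conventions vary between references):

```python
def reversible_heat_w(current_a, temp_k, docv_dt_v_per_k):
    """Entropic heat power (W) of a cell: Q_rev = I * T * dOCV/dT."""
    return current_a * temp_k * docv_dt_v_per_k

# Example: a 2 A current at 300 K with dOCV/dT = -0.1 mV/K
q = reversible_heat_w(2.0, 300.0, -1e-4)
assert abs(q - (-0.06)) < 1e-12  # 60 mW of reversible heat consumption
```

    Because this term flips sign with the current while irreversible (Joule) heat does not, separating the heats measured in charge and discharge, as the paper proposes, isolates the entropic contribution.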

  3. Analyzing the financial crisis using the entropy density function

    NASA Astrophysics Data System (ADS)

    Oh, Gabjin; Kim, Ho-yong; Ahn, Seok-Won; Kwak, Wooseop

    2015-02-01

    The risk created by nonlinear interactions among agents in economic systems is assumed to increase during abnormal states of a financial market. Nevertheless, systemic risk in financial markets following the global financial crisis has not been sufficiently investigated. In this paper, we analyze the entropy density function of the return time series for several financial markets, including the S&P500, KOSPI, and DAX indices, from October 2002 to December 2011, and examine the variability of the entropy value over time. We find that the entropy density function of the S&P500 index exhibits a significant decrease during the subprime crisis compared to other periods, whereas the other markets, such as those in Germany and Korea, exhibit no significant decrease during the market crisis. These findings demonstrate that the S&P500 index generated a regular pattern in the return time series during the financial crisis.
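
    The entropy of a return series can be sketched as the Shannon entropy of the empirical return distribution (illustrative binning choices, not the authors' estimator); more regular, concentrated returns give a lower value:

```python
import numpy as np

def return_entropy(prices, bins=30):
    """Shannon entropy (nats) of the histogram of log returns of a price series."""
    r = np.diff(np.log(prices))
    counts, _ = np.histogram(r, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
# Mostly flat prices with rare jumps -> concentrated returns -> low entropy;
# Gaussian returns spread across many bins -> higher entropy.
r_spiky = np.zeros(999)
r_spiky[::100] = 0.05
spiky = 100.0 * np.exp(np.concatenate([[0.0], np.cumsum(r_spiky)]))
gauss = 100.0 * np.exp(np.concatenate([[0.0], np.cumsum(0.01 * rng.standard_normal(999))]))
assert return_entropy(spiky) < return_entropy(gauss)
```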

  4. Entropy production of doubly stochastic quantum channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller-Hermes, Alexander, E-mail: muellerh@posteo.net; Department of Mathematical Sciences, University of Copenhagen, 2100 Copenhagen; Stilck França, Daniel, E-mail: dsfranca@mytum.de

    2016-02-15

    We study the entropy increase of quantum systems evolving under primitive, doubly stochastic Markovian noise and thus converging to the maximally mixed state. This entropy increase can be quantified by a logarithmic-Sobolev constant of the Liouvillian generating the noise. We prove a universal lower bound on this constant that stays invariant under taking tensor-powers. Our methods involve a new comparison method to relate logarithmic-Sobolev constants of different Liouvillians and a technique to compute logarithmic-Sobolev inequalities of Liouvillians with eigenvectors forming a projective representation of a finite abelian group. Our bounds improve upon similar results established before, and as an application we prove an upper bound on continuous-time quantum capacities. In the last part of this work we study entropy production estimates of discrete-time doubly stochastic quantum channels by extending the framework of discrete-time logarithmic-Sobolev inequalities to the quantum case.

  5. Kinetic analysis of hyperpolarized data with minimum a priori knowledge: Hybrid maximum entropy and nonlinear least squares method (MEM/NLS).

    PubMed

    Mariotti, Erika; Veronese, Mattia; Dunn, Joel T; Southworth, Richard; Eykyn, Thomas R

    2015-06-01

    To assess the feasibility of using a hybrid Maximum-Entropy/Nonlinear Least Squares (MEM/NLS) method for analyzing the kinetics of hyperpolarized dynamic data with minimum a priori knowledge. A continuous distribution of rates obtained through the Laplace inversion of the data is used as a constraint on the NLS fitting to derive a discrete spectrum of rates. Performance of the MEM/NLS algorithm was assessed through Monte Carlo simulations and validated by fitting the longitudinal relaxation time curves of hyperpolarized [1-(13)C]pyruvate acquired at 9.4 Tesla and at three different flip angles. The method was further used to assess the kinetics of hyperpolarized pyruvate-lactate exchange acquired in vitro in whole blood and to re-analyze the previously published in vitro reaction of hyperpolarized (15)N choline with choline kinase. The MEM/NLS method was found to be adequate for the kinetic characterization of hyperpolarized in vitro time series. Additional insights were obtained from experimental data in blood as well as from previously published (15)N choline experimental data. The proposed method informs on the compartmental model that best approximates the biological system observed using hyperpolarized (13)C MR, especially when the metabolic pathway assessed is complex or a new hyperpolarized probe is used. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc.

  6. Entanglement entropy in causal set theory

    NASA Astrophysics Data System (ADS)

    Sorkin, Rafael D.; Yazdi, Yasaman K.

    2018-04-01

    Entanglement entropy is now widely accepted as having deep connections with quantum gravity. It is therefore desirable to understand it in the context of causal sets, especially since they provide in a natural manner the UV cutoff needed to render entanglement entropy finite. Formulating a notion of entanglement entropy in a causal set is not straightforward because the type of canonical hypersurface-data on which its definition typically relies is not available. Instead, we appeal to the more global expression given in Sorkin (2012, arXiv:1205.2953) which, for a Gaussian scalar field, expresses the entropy of a spacetime region in terms of the field’s correlation function within that region (its ‘Wightman function’ W(x, x')). Carrying this formula over to the causal set, one obtains an entropy which is both finite and of a Lorentz invariant nature. We evaluate this global entropy-expression numerically for certain regions (primarily order-intervals or ‘causal diamonds’) within causal sets of 1+1 dimensions. For the causal-set counterpart of the entanglement entropy, we obtain, in the first instance, a result that follows a (spacetime) volume law instead of the expected (spatial) area law. We find, however, that one obtains an area law if one truncates the commutator function (‘Pauli–Jordan operator’) and the Wightman function by projecting out the eigenmodes of the Pauli–Jordan operator whose eigenvalues are too close to zero according to a geometrical criterion which we describe more fully below. In connection with these results and the questions they raise, we also study the ‘entropy of coarse-graining’ generated by thinning out the causal set, and we compare it with what one obtains by similarly thinning out a chain of harmonic oscillators, finding the same, ‘universal’ behaviour in both cases.

  7. Evaluation of the entropy consistent euler flux on 1D and 2D test problems

    NASA Astrophysics Data System (ADS)

    Roslan, Nur Khairunnisa Hanisah; Ismail, Farzad

    2012-06-01

    Most CFD simulations yield good predictions of pressure and velocity when compared with experimental data. Unfortunately, these results will most likely not adhere to the second law of thermodynamics, hence compromising the authenticity of the predicted data. Currently, the test of a good CFD code is to check how much entropy is generated in a smooth flow and to hope that the numerical entropy produced is of the correct sign when a shock is encountered. Herein, a shock-capturing code written in C++ based on a recent entropy consistent Euler flux is developed to simulate 1D and 2D flows. Unlike other finite volume schemes in commercial CFD codes, this entropy consistent (EC) flux function precisely satisfies the discrete second law of thermodynamics. The EC flux has an entropy-conserved part, preserving entropy for smooth flows, and a numerical diffusion part that will accurately produce the proper amount of entropy, consistent with the second law. Several numerical simulations of the entropy consistent flux have been tested on two-dimensional test cases. The first case is a Mach 3 flow over a forward-facing step. The second case is a flow over a NACA 0012 airfoil, while the third case is a hypersonic flow passing over a 2D cylinder. Local flow quantities such as velocity and pressure are analyzed and then compared mainly with the Roe flux. The results herein show that the EC flux does not capture the unphysical rarefaction shock, unlike the Roe flux, and does not easily succumb to the carbuncle phenomenon. In addition, the EC flux maintains good performance in cases where the Roe flux is known to be superior.

  8. Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.

    PubMed

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, and naval ships, as well as in oil drilling platforms. However, in most cases they are monitored without personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas path of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select the informative features for a certain recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. The experiments on some real-world data show the effectiveness of the proposed algorithms.
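
    The idea of measuring exhaust-temperature uniformity with Shannon entropy can be sketched as follows; this is a simplified illustration (the readings below are invented), not the paper's kernelized model:

    ```python
    import math

    def temperature_uniformity(temps):
        """Normalized Shannon entropy of exhaust-gas temperature readings.
        Values near 1.0 indicate a uniform gas path (healthy); lower values
        flag a skewed temperature pattern. A simplified illustration, not
        the paper's kernelized model."""
        total = sum(temps)
        probs = [t / total for t in temps]
        h = -sum(p * math.log(p) for p in probs if p > 0)
        return h / math.log(len(temps))  # divide by max entropy to get [0, 1]

    healthy = [510, 512, 509, 511, 510, 508]  # degrees C, invented readings
    faulty = [510, 512, 380, 511, 510, 508]   # one cold spot in the gas path
    u_healthy = temperature_uniformity(healthy)
    u_faulty = temperature_uniformity(faulty)
    ```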

  9. Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model

    PubMed Central

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, and naval ships, as well as in oil drilling platforms. However, in most cases they are monitored without personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas path of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select the informative features for a certain recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. The experiments on some real-world data show the effectiveness of the proposed algorithms. PMID:25258726

  10. Secure uniform random-number extraction via incoherent strategies

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Zhu, Huangjun

    2018-01-01

    To guarantee the security of uniform random numbers generated by a quantum random-number generator, we study secure extraction of uniform random numbers when the environment of a given quantum state is controlled by the third party, the eavesdropper. Here we restrict our operations to incoherent strategies that are composed of the measurement on the computational basis and incoherent operations (or incoherence-preserving operations). We show that the maximum secure extraction rate is equal to the relative entropy of coherence. By contrast, the coherence of formation gives the extraction rate when a certain constraint is imposed on the eavesdropper's operations. The condition under which the two extraction rates coincide is then determined. Furthermore, we find that the exponential decreasing rate of the leaked information is characterized by Rényi relative entropies of coherence. These results clarify the power of incoherent strategies in random-number generation, and can be applied to guarantee the quality of random numbers generated by a quantum random-number generator.
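
    The relative entropy of coherence that sets the secure extraction rate, C_r(ρ) = S(Δ(ρ)) − S(ρ) with Δ the dephasing map in the computational basis, can be computed directly for small density matrices; a sketch using NumPy:

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr[rho log2 rho], computed from the eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return float(-np.sum(evals * np.log2(evals)))

    def relative_entropy_of_coherence(rho):
        """C_r(rho) = S(Delta(rho)) - S(rho), where Delta keeps only the
        computational-basis populations (full dephasing)."""
        dephased = np.diag(np.diag(rho))
        return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

    plus = np.array([[0.5, 0.5], [0.5, 0.5]])  # |+><+|: maximally coherent qubit
    mixed = np.eye(2) / 2                      # maximally mixed: no coherence
    c_plus = relative_entropy_of_coherence(plus)    # 1 bit per copy
    c_mixed = relative_entropy_of_coherence(mixed)  # 0
    ```

    Consistent with the abstract, the maximally coherent state yields the largest extraction rate while an incoherent state yields none.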

  11. An entropy correction method for unsteady full potential flows with strong shocks

    NASA Technical Reports Server (NTRS)

    Whitlow, W., Jr.; Hafez, M. M.; Osher, S. J.

    1986-01-01

    An entropy correction method for the unsteady full potential equation is presented. The unsteady potential equation is modified to account for entropy jumps across shock waves. The conservative form of the modified equation is solved in generalized coordinates using an implicit, approximate factorization method. A flux-biasing differencing method, which generates the proper amounts of artificial viscosity in supersonic regions, is used to discretize the flow equations in space. Comparisons between the present method and solutions of the Euler equations and between the present method and experimental data are presented. The comparisons show that the present method more accurately models solutions of the Euler equations and experiment than does the isentropic potential formulation.

  12. Maximum-entropy description of animal movement.

    PubMed

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall within this class of maximum-entropy distributions when the constraints are purely kinematic.

  13. Gravitational baryogenesis in running vacuum models

    NASA Astrophysics Data System (ADS)

    Oikonomou, V. K.; Pan, Supriya; Nunes, Rafael C.

    2017-08-01

    We study the gravitational baryogenesis mechanism for generating baryon asymmetry in the context of running vacuum models. Regardless of whether these models can produce a viable cosmological evolution, we demonstrate that they produce a nonzero baryon-to-entropy ratio even if the universe is filled with conformal matter. This is a sound difference between the running vacuum gravitational baryogenesis and the Einstein-Hilbert one, since in the latter case, the predicted baryon-to-entropy ratio is zero. We consider two well known and most used running vacuum models and show that the resulting baryon-to-entropy ratio is compatible with the observational data. Moreover, we also show that the mechanism of gravitational baryogenesis may constrain the running vacuum models.

  14. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    to the query graph, or subgraph permutations with the same mismatch cost (often the case for homogeneous and/or symmetrical data/query). To avoid...decisions are generated in a bottom-up manner using the metric of entropy at the cluster level (Figure 9c). Using the definition of belief messages...for a cluster and a set of data nodes in this cluster, we compute the entropy for forward and backward messages.

  15. Acoustic firearm discharge detection and classification in an enclosed environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luzi, Lorenzo; Gonzalez, Eric; Bruillard, Paul

    2016-05-01

    Two different signal processing algorithms are described for detection and classification of acoustic signals generated by firearm discharges in small enclosed spaces. The first is based on the logarithm of the signal energy. The second is a joint entropy. The current study indicates that a system using both signal energy and joint entropy would be able to both detect weapon discharges and classify weapon type, in small spaces, with high statistical certainty.
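
    The two features the study combines, log signal energy and an entropy measure, can be sketched as below; the entropy here is a simple stand-in (the abstract does not define the joint entropy precisely), and the signals are synthetic:

    ```python
    import math

    def log_energy(signal):
        """Logarithm of total signal energy, the first detection feature."""
        return math.log10(sum(x * x for x in signal))

    def energy_entropy(signal):
        """Shannon entropy of the normalized per-sample power distribution,
        a simple stand-in for the paper's joint entropy (whose exact
        definition the abstract does not give)."""
        powers = [x * x for x in signal]
        total = sum(powers)
        return -sum(p / total * math.log2(p / total) for p in powers if p > 0)

    impulse = [0.0] * 63 + [1.0]  # sharp, discharge-like transient
    hiss = [0.1] * 64             # diffuse background noise
    h_impulse = energy_entropy(impulse)  # concentrated energy: low entropy
    h_hiss = energy_entropy(hiss)        # spread-out energy: high entropy
    ```

    An impulsive discharge concentrates its energy in a few samples and so scores low on the entropy feature, which is what makes the pair of features discriminative.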

  16. Intrinsic measures of field entropy in cosmological particle creation

    NASA Astrophysics Data System (ADS)

    Hu, B. L.; Pavon, D.

    1986-11-01

    Using the properties of quantum parametric oscillators, two quantities are identified which increase monotonically in time in the process of parametric amplification. The use of these quantities as possible measures of entropy generation in vacuum cosmological particle creation is suggested. These quantities which are of complementary nature are both related to the number of particles spontaneously created.

  17. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram Language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses...the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function...In addition, the SD based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.

  18. Complexity Measures in Magnetoencephalography: Measuring "Disorder" in Schizophrenia

    PubMed Central

    Brookes, Matthew J.; Hall, Emma L.; Robson, Siân E.; Price, Darren; Palaniyappan, Lena; Liddle, Elizabeth B.; Liddle, Peter F.; Robinson, Stephen E.; Morris, Peter G.

    2015-01-01

    This paper details a methodology which, when applied to magnetoencephalography (MEG) data, is capable of measuring the spatio-temporal dynamics of ‘disorder’ in the human brain. Our method, which is based upon signal entropy, shows that spatially separate brain regions (or networks) generate temporally independent entropy time-courses. These time-courses are modulated by cognitive tasks, with an increase in local neural processing characterised by localised and transient increases in entropy in the neural signal. We explore the relationship between entropy and the more established time-frequency decomposition methods, which elucidate the temporal evolution of neural oscillations. We observe a direct but complex relationship between entropy and oscillatory amplitude, which suggests that these metrics are complementary. Finally, we provide a demonstration of the clinical utility of our method, using it to shed light on aberrant neurophysiological processing in schizophrenia. We demonstrate significantly increased task induced entropy change in patients (compared to controls) in multiple brain regions, including a cingulo-insula network, bilateral insula cortices and a right fronto-parietal network. These findings demonstrate potential clinical utility for our method and support a recent hypothesis that schizophrenia can be characterised by abnormalities in the salience network (a well characterised distributed network comprising bilateral insula and cingulate cortices). PMID:25886553

  19. Galilei group with multiple central extension, vorticity, and entropy generation: Exotic fluid in 3 +1 dimensions

    NASA Astrophysics Data System (ADS)

    Das, Praloy; Ghosh, Subir

    2017-12-01

    A noncommutative extension of an ideal (Hamiltonian) fluid model in 3 +1 dimensions is proposed. The model enjoys several interesting features: it allows a multiparameter central extension in Galilean boost algebra (which is significant being contrary to the existing belief that a similar feature can appear only in 2 +1 -dimensions); noncommutativity generates vorticity in a canonically irrotational fluid; it induces a nonbarotropic pressure leading to a nonisentropic system. (Barotropic fluids are entropy preserving as the pressure depends only on the matter density.) Our fluid model is termed "exotic" since it has a close resemblance with the extensively studied planar (2 +1 dimensions) exotic models and exotic (noncommutative) field theories.

  20. Energy landscapes and properties of biomolecules.

    PubMed

    Wales, David J

    2005-11-09

    Thermodynamic and dynamic properties of biomolecules can be calculated using a coarse-grained approach based upon sampling stationary points of the underlying potential energy surface. The superposition approximation provides an overall partition function as a sum of contributions from the local minima, and hence functions such as internal energy, entropy, free energy and the heat capacity. To obtain rates we must also sample transition states that link the local minima, and the discrete path sampling method provides a systematic means to achieve this goal. A coarse-grained picture is also helpful in locating the global minimum using the basin-hopping approach. Here we can exploit a fictitious dynamics between the basins of attraction of local minima, since the objective is to find the lowest minimum, rather than to reproduce the thermodynamics or dynamics.
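
    The basin-hopping idea described above, perturb, locally minimize, and keep the lower basin, can be sketched in a few lines; this toy 1-D version (with a hypothetical tilted double-well objective) is illustrative, not the authors' implementation:

    ```python
    import random

    def basin_hopping(f, x0, steps=200, hop=2.0, seed=1):
        """Toy 1-D basin-hopping: perturb the current minimum, re-minimize
        locally, and keep the lower basin. Illustrative only, not the
        authors' implementation."""
        rng = random.Random(seed)

        def local_min(x, lr=0.01, iters=500, eps=1e-6):
            # plain gradient descent with a numerical derivative
            for _ in range(iters):
                grad = (f(x + eps) - f(x - eps)) / (2 * eps)
                x -= lr * grad
            return x

        best = local_min(x0)
        for _ in range(steps):
            trial = local_min(best + rng.uniform(-hop, hop))
            if f(trial) < f(best):
                best = trial
        return best

    # Hypothetical tilted double well: local minimum near x = +0.96,
    # global minimum near x = -1.04.
    f = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
    x_star = basin_hopping(f, x0=1.0)  # escapes the starting basin
    ```

    The "fictitious dynamics between basins of attraction" mentioned in the abstract is exactly this hop-then-minimize loop: only the identity of the basin matters, not the path between them.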

  1. Principles of time evolution in classical physics

    NASA Astrophysics Data System (ADS)

    Güémez, J.; Fiolhais, M.

    2018-07-01

    We address principles of time evolution in classical mechanical/thermodynamical systems in translational and rotational motion, in three cases: when there is conservation of mechanical energy, when there is energy dissipation and when there is mechanical energy production. In the first case, the time derivative of the Hamiltonian vanishes. In the second one, when dissipative forces are present, the time evolution is governed by the minimum potential energy principle, or, equivalently, maximum increase of the entropy of the universe. Finally, in the third situation, when internal sources of work are available to the system, it evolves in time according to the principle of minimum Gibbs function. We apply the Lagrangian formulation to the systems, dealing with the non-conservative forces using restriction functions such as the Rayleigh dissipative function.

  2. Parameter Estimation as a Problem in Statistical Thermodynamics.

    PubMed

    Earle, Keith A; Schneider, David J

    2011-03-14

    In this work, we explore the connections between parameter fitting and statistical thermodynamics using the maxent principle of Jaynes as a starting point. In particular, we show how signal averaging may be described by a suitable one particle partition function, modified for the case of a variable number of particles. These modifications lead to an entropy that is extensive in the number of measurements in the average. Systematic error may be interpreted as a departure from ideal gas behavior. In addition, we show how to combine measurements from different experiments in an unbiased way in order to maximize the entropy of simultaneous parameter fitting. We suggest that fit parameters may be interpreted as generalized coordinates and the forces conjugate to them may be derived from the system partition function. From this perspective, the parameter fitting problem may be interpreted as a process where the system (spectrum) does work against internal stresses (non-optimum model parameters) to achieve a state of minimum free energy/maximum entropy. Finally, we show how the distribution function allows us to define a geometry on parameter space, building on previous work[1, 2]. This geometry has implications for error estimation and we outline a program for incorporating these geometrical insights into an automated parameter fitting algorithm.

  3. Upper Limits for Power Yield in Thermal, Chemical, and Electrochemical Systems

    NASA Astrophysics Data System (ADS)

    Sieniutycz, Stanislaw

    2010-03-01

    We consider modeling and power optimization of energy converters, such as thermal, solar and chemical engines and fuel cells. Thermodynamic principles lead to expressions for converter's efficiency and generated power. Efficiency equations serve to solve the problems of upgrading or downgrading a resource. Power yield is a cumulative effect in a system consisting of a resource, engines, and an infinite bath. While optimization of steady state systems requires using the differential calculus and Lagrange multipliers, dynamic optimization involves variational calculus and dynamic programming. The primary result of static optimization is the upper limit of power, whereas that of dynamic optimization is a finite-rate counterpart of classical reversible work (exergy). The latter quantity depends on the end state coordinates and a dissipation index, h, which is the Hamiltonian of the problem of minimum entropy production. In reacting systems, an active part of chemical affinity constitutes a major component of the overall efficiency. The theory is also applied to fuel cells regarded as electrochemical flow engines. Enhanced bounds on power yield follow, which are stronger than those predicted by the reversible work potential.

  4. Does horizon entropy satisfy a quantum null energy conjecture?

    NASA Astrophysics Data System (ADS)

    Fu, Zicao; Marolf, Donald

    2016-12-01

    A modern version of the idea that the area of event horizons gives 4G times an entropy is the Hubeny-Rangamani causal holographic information (CHI) proposal for holographic field theories. Given a region R of a holographic QFT, CHI computes A/4G on a certain cut of an event horizon in the gravitational dual. The result is naturally interpreted as a coarse-grained entropy for the QFT. CHI is known to be finitely greater than the fine-grained Hubeny-Rangamani-Takayanagi (HRT) entropy when \partial R lies on a Killing horizon of the QFT spacetime, and in this context satisfies other non-trivial properties expected of an entropy. Here we present evidence that it also satisfies the quantum null energy condition (QNEC), which bounds the second derivative of the entropy of a quantum field theory on one side of a non-expanding null surface by the flux of stress-energy across the surface. In particular, we show CHI to satisfy the QNEC in 1+1 holographic CFTs when evaluated in states dual to conical defects in AdS3. This surprising result further supports the idea that CHI defines a useful notion of coarse-grained holographic entropy, and suggests unprecedented bounds on the rate at which bulk horizon generators emerge from a caustic. To supplement our motivation, we include an appendix deriving a corresponding coarse-grained generalized second law for 1+1 holographic CFTs perturbatively coupled to dilaton gravity.

  5. Identification of breathing cracks in a beam structure with entropy

    NASA Astrophysics Data System (ADS)

    Wimarshana, Buddhi; Wu, Nan; Wu, Christine

    2016-04-01

    A cantilever beam with a breathing crack is studied to detect and evaluate the crack using entropy measures. Closed cracks in engineering structures add proportional complexity to their vibration responses due to the weak bi-linearity imposed by the crack breathing phenomenon. Entropy is a measure of system complexity and has the potential to quantify this complexity. The weak bi-linearity in vibration signals can be amplified using wavelet transformation to increase the sensitivity of the measurements. A mathematical model of a harmonically excited unit-length steel cantilever beam with a breathing crack located near the fixed end is established, and an iterative numerical method is applied to generate accurate time domain dynamic responses. The bi-linearity in the time domain signals due to the crack breathing is first amplified by wavelet transformation, and then the complexity due to the bi-linearity is quantified using sample entropy to detect the possible crack and estimate the crack depth. It is observed that the method is capable of identifying crack depths even at very early stages of 3%, with an increase in the entropy values of more than 10% compared with the healthy beam. The current study extends entropy based damage detection from rotary machines to structural analysis and takes a step further in high-sensitivity structural health monitoring by combining wavelet transformation with entropy calculations. The proposed technique can also be applied to other types of structures, such as plates and shells.
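
    Sample entropy, the complexity measure used here, can be sketched directly from its definition; in this sketch r is taken as an absolute tolerance for simplicity (standard practice scales it by the signal's standard deviation), and the two test signals are synthetic:

    ```python
    import math

    def sample_entropy(series, m=2, r=0.2):
        """SampEn(m, r) = -ln(A/B): B counts template pairs of length m and
        A pairs of length m+1 that match within Chebyshev tolerance r.
        Higher values indicate a more irregular signal. Note: r is an
        absolute tolerance here for simplicity."""
        def matches(k):
            templ = [series[i:i + k] for i in range(len(series) - k + 1)]
            return sum(
                1
                for i in range(len(templ))
                for j in range(i + 1, len(templ))
                if max(abs(a - b) for a, b in zip(templ[i], templ[j])) < r
            )
        b, a = matches(m), matches(m + 1)
        return float('inf') if a == 0 else -math.log(a / b)

    periodic = [math.sin(0.4 * i) for i in range(120)]            # smooth
    irregular = [math.sin(0.4 * i) + 0.4 * math.sin(7.1 * i * i)  # + jitter
                 for i in range(120)]
    s_periodic = sample_entropy(periodic)
    s_irregular = sample_entropy(irregular)  # larger: more complex signal
    ```

    A breathing crack adds weak irregularity to an otherwise periodic response, which is why the entropy value rises with crack depth.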

  6. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
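
    For an ordinary finite Markov chain the entropy rate is computable in closed form; as a hedged illustration, h = −Σ_i π_i Σ_j P_ij log2 P_ij with π the stationary distribution (the unifilar hidden semi-Markov case treated in the paper requires the ɛ-machine machinery and does not reduce to this formula):

    ```python
    import math

    def entropy_rate(P, iters=200):
        """Shannon entropy rate (bits per symbol) of a finite Markov chain:
        h = -sum_i pi_i sum_j P[i][j] * log2(P[i][j]), with the stationary
        distribution pi obtained by power iteration."""
        n = len(P)
        pi = [1.0 / n] * n
        for _ in range(iters):
            pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                    for i in range(n) for j in range(n) if P[i][j] > 0)

    fair = [[0.5, 0.5], [0.5, 0.5]]    # i.i.d. fair coin: 1 bit per symbol
    sticky = [[0.9, 0.1], [0.1, 0.9]]  # persistent chain: more predictable
    h_fair = entropy_rate(fair)
    h_sticky = entropy_rate(sticky)
    ```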

  7. Thermodynamics and evolution.

    PubMed

    Demetrius, L

    2000-09-07

    The science of thermodynamics is concerned with understanding the properties of inanimate matter in so far as they are determined by changes in temperature. The Second Law asserts that in irreversible processes there is a uni-directional increase in thermodynamic entropy, a measure of the degree of uncertainty in the thermal energy state of a randomly chosen particle in the aggregate. The science of evolution is concerned with understanding the properties of populations of living matter in so far as they are regulated by changes in generation time. Directionality theory, a mathematical model of the evolutionary process, establishes that in populations subject to bounded growth constraints, there is a uni-directional increase in evolutionary entropy, a measure of the degree of uncertainty in the age of the immediate ancestor of a randomly chosen newborn. This article reviews the mathematical basis of directionality theory and analyses the relation between directionality theory and statistical thermodynamics. We exploit an analytic relation between temperature, and generation time, to show that the directionality principle for evolutionary entropy is a non-equilibrium extension of the principle of a uni-directional increase of thermodynamic entropy. The analytic relation between these directionality principles is consistent with the hypothesis of the equivalence of fundamental laws as one moves up the hierarchy, from a molecular ensemble where the thermodynamic laws apply, to a population of replicating entities (molecules, cells, higher organisms), where evolutionary principles prevail. Copyright 2000 Academic Press.

  8. He-Ne laser-induced changes in germination, thermodynamic parameters, internal energy, enzyme activities and physiological attributes of wheat during germination and early growth

    NASA Astrophysics Data System (ADS)

    Jamil, Yasir; Perveen, Rashida; Ashraf, Muhammad; Ali, Qasim; Iqbal, Munawar; Ahmad, Muhammad Raza

    2013-04-01

    Using low power continuous wave He-Ne laser irradiation of seeds, the germination characteristics, thermodynamic changes and enzyme activities, as well as changes in morphological attributes, were explored for the wheat (Triticum aestivum L. cv. S-24) cultivar. The changes in thermodynamic properties, such as the change in enthalpy (ΔH), entropy generation (ΔSe), entropy flux (ΔSc), entropy generation ratio (ΔSe/Δt), and entropy flux ratio (ΔSc/Δt), were significant (P < 0.05) at an energy level of 500 mJ. The germination energy (GE), germination percentage (G%), and germination index (GI), as well as the α-amylase and protease activities, were also found to be higher at 500 mJ, while the mean emergence time (MET) and time for 50% germination (E50) decreased for 300 mJ irradiance. The internal energy of the seeds increased significantly at all laser energy levels, but was highest for 500 mJ 72 h after sowing. The enzyme activities increased up to 24 h after sowing and then declined. The activities of α-amylase and protease were found to be positively correlated with the plant physiological attributes. These results indicate that low power continuous wave He-Ne laser (632 nm) treatment has considerable biological effects on seed metabolism during germination as well as on later vegetative growth.

  9. Theory and Normal Mode Analysis of Change in Protein Vibrational Dynamics on Ligand Binding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mortisugu, Kei; Njunda, Brigitte; Smith, Jeremy C

    2009-12-01

    The change of protein vibrations on ligand binding is of functional and thermodynamic importance. Here, this process is characterized using a simple analytical 'ball-and-spring' model and all-atom normal-mode analysis (NMA) of the binding of the cancer drug, methotrexate (MTX), to its target, dihydrofolate reductase (DHFR). The analytical model predicts that the coupling between protein vibrations and ligand external motion generates entropy-rich, low-frequency vibrations in the complex. This is consistent with the atomistic NMA which reveals vibrational softening in forming the DHFR-MTX complex, a result also in qualitative agreement with neutron-scattering experiments. Energy minimization of the atomistic bound-state (B) structure while gradually decreasing the ligand interaction to zero allows the generation of a hypothetical 'intermediate' (I) state, without the ligand force field but with a structure similar to that of B. In going from I to B, it is found that the vibrational entropies of both the protein and MTX decrease while the complex structure becomes enthalpically stabilized. However, the relatively weak DHFR:MTX interaction energy results in the net entropy gain arising from coupling between the protein and MTX external motion being larger than the loss of vibrational entropy on complex formation. This, together with the I structure being more flexible than the unbound structure, results in the observed vibrational softening on ligand binding.

  10. MHD mixed convection and entropy generation of water-alumina nanofluid flow in a double lid driven cavity with discrete heating

    NASA Astrophysics Data System (ADS)

    Hussain, S.; Mehmood, K.; Sagheer, M.

    2016-12-01

    In the present study, entropy generation due to mixed convection in a partially heated square double lid driven cavity filled with Al2O3-water nanofluid under the influence of an inclined magnetic field is numerically investigated. Two heat sources are fixed at the lower wall of the cavity, with the remaining part of the bottom wall kept insulated. The top wall and the vertically moving walls are maintained at a constant cold temperature. The buoyant force drives the flow together with the two moving vertical walls. The governing equations are discretized in space using the LBB-stable finite element pair Q2/P1disc, which leads to 3rd and 2nd order accuracy in the L2-norm for the velocity/temperature and pressure, respectively, and the fully implicit Crank-Nicolson scheme of 2nd order accuracy is utilized for the temporal discretization. The discretized systems of nonlinear equations are treated using the Newton method, and the associated linear subproblems are solved by Gaussian elimination. Numerical results are presented and analyzed by means of streamlines, isotherms, tables and some useful plots. The impacts of the emerging parameters on the flow are investigated over specific ranges, namely the Reynolds number (1 ≤ Re ≤ 100), Richardson number (1 ≤ Ri ≤ 50), Hartmann number (0 ≤ Ha ≤ 100), solid volume fraction (0 ≤ ϕ ≤ 0.2), and the angle of the inclined magnetic field (0° ≤ γ ≤ 90°); the findings agree closely with previously performed analyses. The calculation of the average Nusselt number, entropy generation due to heat transfer, fluid friction and magnetic field, total entropy generation, Bejan number and kinetic energy is the main focus of our study.
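    The three entropy-generation contributions and the Bejan number mentioned above can be sketched in the standard dimensional textbook form for 2-D MHD convection. This is not the paper's nondimensional formulation; the material properties (k, mu, sigma, T0) and the field angle gamma are illustrative placeholders:

    ```python
    import numpy as np

    def entropy_generation_fields(T, u, v, B, dx, dy,
                                  k=0.6, mu=1e-3, sigma=0.05,
                                  gamma=0.0, T0=300.0):
        """Local volumetric entropy generation for 2-D MHD convection, split
        into heat-transfer, fluid-friction and magnetic contributions
        (textbook dimensional form, assumed here)."""
        dTdy, dTdx = np.gradient(T, dy, dx)   # rows vary in y, columns in x
        dudy, dudx = np.gradient(u, dy, dx)
        dvdy, dvdx = np.gradient(v, dy, dx)
        s_heat = k / T0**2 * (dTdx**2 + dTdy**2)
        phi = 2.0 * (dudx**2 + dvdy**2) + (dudy + dvdx)**2  # viscous dissipation
        s_fric = mu / T0 * phi
        # Lorentz contribution for a uniform field B inclined at angle gamma
        s_mag = sigma * B**2 * (u * np.sin(gamma) - v * np.cos(gamma))**2 / T0
        return s_heat, s_fric, s_mag

    def bejan_number(s_heat, s_fric, s_mag):
        """Ratio of heat-transfer entropy generation to the total."""
        total = s_heat + s_fric + s_mag
        return float(s_heat.sum() / total.sum())
    ```

    Integrating the three fields over the cavity gives the total entropy generation, and the Bejan number indicates whether heat transfer or the friction and magnetic-field irreversibilities dominate.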

  11. Develop and Test a Solvent Accessible Surface Area-Based Model in Conformational Entropy Calculations

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2012-01-01

    It is of great interest in modern drug design to accurately calculate the free energies of protein-ligand or nucleic acid-ligand binding. MM-PBSA (Molecular Mechanics-Poisson Boltzmann Surface Area) and MM-GBSA (Molecular Mechanics-Generalized Born Surface Area) have gained popularity in this field. For both methods, the conformational entropy, which is usually calculated through normal mode analysis (NMA), is needed to calculate the absolute binding free energies. Unfortunately, NMA is computationally demanding and becomes a bottleneck of the MM-PB/GBSA-NMA methods. In this work, we have developed a fast approach to estimate the conformational entropy based upon solvent accessible surface area calculations. In our approach, the conformational entropy of a molecule, S, can be obtained by summing up the contributions of all atoms, whether they are buried or exposed. Each atom has two types of surface areas, solvent accessible surface area (SAS) and buried SAS (BSAS). The two types of surface areas are weighted to estimate the contribution of an atom to S. Atoms of the same atom type share the same weight, and a general parameter k is applied to balance the contributions of the two types of surface areas. This entropy model was parameterized using a large set of small molecules whose conformational entropies were calculated at the B3LYP/6-31G* level, taking the solvent effect into account. The weighted solvent accessible surface area (WSAS) model was extensively evaluated in three tests. For convenience, TS, the product of temperature T and conformational entropy S, was calculated in those tests; T was always set to 298.15 K throughout the text. First, good correlations were achieved between WSAS TS and NMA TS for 44 protein or nucleic acid systems sampled with molecular dynamics simulations (10 snapshots were collected for post-entropy calculations): the mean squared correlation coefficient (R²) was 0.56. 
As to the 20 complexes, the TS changes upon binding, TΔS, were also calculated, and the mean R² between NMA and WSAS was 0.67. In the second test, TS was calculated for 12 protein decoy sets (each set has 31 conformations) generated by the Rosetta software package. Again, good correlations were achieved for all decoy sets: the mean, maximum, and minimum R² were 0.73, 0.89 and 0.55, respectively. Finally, binding free energies were calculated for 6 protein systems (the numbers of inhibitors range from 4 to 18) using four scoring functions. Compared to the measured binding free energies, the mean R² over the six protein systems was 0.51, 0.47, 0.40 and 0.43 for MM-GBSA-WSAS, MM-GBSA-NMA, MM-PBSA-WSAS and MM-PBSA-NMA, respectively. The mean RMS errors of prediction were 1.19, 1.24, 1.41 and 1.29 kcal/mol for the four scoring functions, correspondingly. Therefore, the two scoring functions employing WSAS achieved a prediction performance comparable to that of the scoring functions using NMA. It should be emphasized that no minimization was performed prior to the WSAS calculation in the last test. Although WSAS is not as rigorous as physical models such as quasi-harmonic analysis and thermodynamic integration (TI), it is computationally very efficient, as only a surface area calculation is involved and no structural minimization is required. Moreover, WSAS has achieved a performance comparable to normal mode analysis. We expect that this model could find applications in fields like high throughput screening (HTS), molecular docking and rational protein design, where efficiency is crucial since there are a large number of compounds, docking poses or protein models to be evaluated. A list of acronyms and abbreviations used in this work is provided for quick reference. PMID:22497310
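    The WSAS ansatz can be sketched as a per-atom weighted sum. The functional form below (each atom contributes its weighted SAS plus k times its BSAS) is one plausible reading of the description, and the atom-type weights, areas, and k are invented placeholders, not the paper's fitted parameters:

    ```python
    def wsas_entropy(atoms, weights, k):
        """Weighted solvent-accessible surface area (WSAS) estimate of the
        conformational entropy S. Atoms of the same type share a weight,
        and k balances the exposed (sas) and buried (bsas) area types.
        Placeholder parameterization; not the published fit."""
        return sum(weights[a["type"]] * (a["sas"] + k * a["bsas"]) for a in atoms)

    # Toy molecule: two carbons and an oxygen with made-up areas (A^2)
    atoms = [
        {"type": "C.3", "sas": 12.0, "bsas": 3.0},
        {"type": "C.3", "sas": 0.5, "bsas": 14.0},
        {"type": "O.2", "sas": 8.0, "bsas": 1.0},
    ]
    weights = {"C.3": 0.05, "O.2": 0.03}  # placeholder per-type weights
    s = wsas_entropy(atoms, weights, k=0.4)
    ```

    Because the estimate is a single pass over precomputed surface areas, it avoids the structural minimization and Hessian diagonalization that make NMA the bottleneck.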

  12. An uncertainty principle for unimodular quantum groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crann, Jason; Université Lille 1 - Sciences et Technologies, UFR de Mathématiques, Laboratoire de Mathématiques Paul Painlevé - UMR CNRS 8524, 59655 Villeneuve d'Ascq Cédex; Kalantar, Mehrdad, E-mail: jason-crann@carleton.ca, E-mail: mkalanta@math.carleton.ca

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.

  13. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium, psychology (short-time numeral and pattern human memory and the effect of stress on the dynamical tapping-test), random dynamics of RR-intervals in human ECG (the problem of diagnosis of various diseases of the human cardiovascular system), and chaotic dynamics of the parameters of financial markets and ecological systems.

  14. Convex foundations for generalized MaxEnt models

    NASA Astrophysics Data System (ADS)

    Frongillo, Rafael; Reid, Mark D.

    2014-12-01

    We present an approach to maximum entropy models that highlights the convex geometry and duality of generalized exponential families (GEFs) and their connection to Bregman divergences. Using our framework, we are able to resolve a puzzling aspect of the bijection of Banerjee and coauthors between classical exponential families and what they call regular Bregman divergences. Their regularity condition rules out all but Bregman divergences generated from log-convex generators. We recover their bijection and show that a much broader class of divergences correspond to GEFs via two key observations: 1) Like classical exponential families, GEFs have a "cumulant" C whose subdifferential contains the mean: E_{o∼p_θ}[φ(o)] ∈ ∂C(θ); 2) Generalized relative entropy is a C-Bregman divergence between parameters: D_F(p_θ, p_θ′) = D_C(θ, θ′), where D_F becomes the KL divergence for F = -H. We also show that every incomplete market with cost function C can be expressed as a complete market, where the prices are constrained to be a GEF with cumulant C. This provides an entirely new interpretation of prediction markets, relating their design back to the principle of maximum entropy.
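    The special case F = -H can be checked numerically: the Bregman divergence generated by the negative Shannon entropy reduces to the KL divergence on the probability simplex. A minimal sketch, illustrative rather than the authors' framework:

    ```python
    import numpy as np

    def bregman(F, gradF, p, q):
        """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
        return F(p) - F(q) - np.dot(gradF(q), p - q)

    def neg_entropy(p):
        """F = -H, the negative Shannon entropy (natural log)."""
        return float(np.sum(p * np.log(p)))

    def grad_neg_entropy(p):
        return np.log(p) + 1.0

    def kl(p, q):
        """Kullback-Leibler divergence between probability vectors."""
        return float(np.sum(p * np.log(p / q)))

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.4, 0.4, 0.2])
    # On the simplex, D_{-H}(p, q) equals KL(p, q).
    assert abs(bregman(neg_entropy, grad_neg_entropy, p, q) - kl(p, q)) < 1e-12
    ```

    The identity holds because the linear correction terms cancel exactly when both arguments sum to one, which is the sense in which relative entropy is the Bregman divergence of the (negative) entropy generator.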

  15. From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2014-12-01

    The entropy-maximization paradigm of statistical physics is well known to generate the omnipresent Gauss law. In this paper we establish an analogous socioeconomic model which maximizes social equality, rather than physical disorder, in the context of the distributions of income and wealth in human societies. We show that, on a logarithmic scale, the Laplace law is the socioeconomic equality-maximizing counterpart of the physical entropy-maximizing Gauss law, and that this law manifests an optimized balance between two opposing forces: (i) the rich and powerful, striving to amass ever more wealth, and thus to increase social inequality; and (ii) the masses, struggling to form more egalitarian societies, and thus to increase social equality. Our results lead from log-Gauss statistics to log-Laplace statistics, yield Paretian power-law tails of income and wealth distributions, and show how the emergence of a middle class depends on the underlying levels of socioeconomic inequality and variability. Also, in the context of asset prices with Laplace-distributed returns, our results imply that financial markets generate an optimized balance between risk and predictability.
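    The passage from log-Laplace statistics to Paretian power-law tails can be illustrated numerically: if log X is Laplace distributed with scale b, then P(X > t) = 0.5·t^(-1/b) for t > 1. A minimal sketch with illustrative parameters (not taken from the paper):

    ```python
    import math
    import random

    def sample_log_laplace(b, n, seed=7):
        """Draw n samples of X where log X ~ Laplace(0, b), via inverse CDF."""
        rng = random.Random(seed)
        out = []
        for _ in range(n):
            u = rng.random() - 0.5          # uniform on (-0.5, 0.5)
            lap = -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
            out.append(math.exp(lap))
        return out

    xs = sample_log_laplace(b=0.5, n=50000)
    xs.sort()
    median = xs[len(xs) // 2]               # log-median 0 => median of X is 1
    # Paretian upper tail with exponent 1/b = 2: P(X > 4) = 0.5 * 4**-2 = 1/32
    tail_frac = sum(1 for x in xs if x > 4.0) / len(xs)
    ```

    The heavier the Laplace scale b, the smaller the tail exponent 1/b, i.e. the fatter the Paretian tail of income or wealth.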

  16. Unresolved Problems by Shock Capturing: Taming the Overheating Problem

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2012-01-01

    The overheating problem, first observed by von Neumann [1] and later studied extensively by Noh [2] using both Eulerian and Lagrangian formulations, remains one of the unsolved problems of shock capturing. It is historically well known to occur when a flow is under compression, such as when a shock wave hits and reflects from a wall or when two streams collide with each other. The overheating phenomenon is also found numerically in a smooth flow undergoing rarefaction created by two streams receding from each other. This is contrary to one's intuition, which expects a decrease in internal energy. The excessive temperature increase is not reduced by refining the mesh size or increasing the order of accuracy. This study finds that the overheating in the receding flow correlates with the entropy generation. By requiring entropy preservation, the overheating is eliminated and the solution is grid convergent. The shock-capturing scheme, as practiced today, gives rise to the entropy generation, which in turn causes the overheating. This assertion stands up to the convergence test.

  17. Structure and Dynamics of Fluid Planets

    NASA Astrophysics Data System (ADS)

    Houben, H.

    2014-12-01

    Attention to conservation laws gives a comprehensive picture of the structure and dynamics of gas giants: Atmospheric differential rotation is generated by tidal torques (dependent on tropospheric static stability) and is dragged into the interior by turbulent viscosity. The consequent heat dissipation generates baroclinicity and approximate thermal wind balance, not Taylor-Proudman conditions. Magnetic Lorentz forces have no effect on the zonal wind, but generate a meridional wind approximately parallel to field lines. Thus, magnetic field generation in the interior is dominated by the ω-effect (zonal field wound up by differential rotation), with the α-effect (meridional field generated by turbulence) severely limited by the β-effect (turbulence-enhanced resistivity). The meridional circulation quenches the ω-effect so that a steady state is reached and also limits the magnitude of the non-axisymmetric field under certain circumstances. The stability of the steady state requires further study. The magnetic field travels with the E X B drift, rather than the fluid velocity. Work by the fluid on the magnetic field balances work by the magnetic field on the fluid, so the global heat flux is little changed. In conducting regions the meridional density distribution (and gravity field) is most sensitive to the total pressure (gas + magnetic) and the ω-effect. In nonconducting regions, the gas pressure, centrifugal force, and differential rotation dominate. The differential rotation varies at least as fast as r³, so the gravitational signal is small compared to that for differential rotation on cylinders. The entropy minimum near the tropopause allows meteorology to be dominated by (relatively) long-lived, closed potential temperature surfaces, usually called spots, which conserve potential vorticity. All of the above must be taken into account to properly assimilate any available observational data to further specify the interior properties of fluid planets.

  18. Shallow water equations: viscous solutions and inviscid limit

    NASA Astrophysics Data System (ADS)

    Chen, Gui-Qiang; Perepelitsa, Mikhail

    2012-12-01

    We establish the inviscid limit of the viscous shallow water equations to the Saint-Venant system. For the viscous equations, the viscosity terms are more degenerate when the shallow water is close to the bottom, in comparison with the classical Navier-Stokes equations for barotropic gases; thus, the analysis in our earlier work for the classical Navier-Stokes equations does not apply directly, which requires new estimates to deal with the additional degeneracy. We first introduce a notion of entropy solutions to the viscous shallow water equations and develop an approach to establish the global existence of such solutions and their uniform energy-type estimates with respect to the viscosity coefficient. These uniform estimates yield the existence of measure-valued solutions to the Saint-Venant system generated by the viscous solutions. Based on the uniform energy-type estimates and the features of the Saint-Venant system, we further establish that the entropy dissipation measures of the viscous solutions for weak entropy-entropy flux pairs, generated by compactly supported C^2 test-functions, are confined in a compact set in H^{-1}, which yields that the measure-valued solutions are confined by the Tartar-Murat commutator relation. Then, the reduction theorem established in Chen and Perepelitsa [5] for the measure-valued solutions with unbounded support leads to the convergence of the viscous solutions to a finite-energy entropy solution of the Saint-Venant system with finite-energy initial data, which is relative to the different end-states of the bottom topography of the shallow water at infinity. The analysis also applies to the inviscid limit problem for the Saint-Venant system in the presence of friction.

  19. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations, the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted for the state in which the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
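    A Boltzmann-type weighting over discrete states, as invoked above for elementary-mode usage, can be sketched generically. The weighted quantity `values` and the parameter `beta` are placeholders; the paper's specific statistical-mechanical setup is not reproduced here:

    ```python
    import math

    def boltzmann_usage(values, beta):
        """Boltzmann-type usage probabilities p_i proportional to
        exp(beta * v_i) over a discrete set of states (here standing in
        for elementary modes); `values` and `beta` are generic placeholders."""
        ws = [math.exp(beta * v) for v in values]
        z = sum(ws)                    # partition function
        return [w / z for w in ws]

    def shannon_entropy(p):
        """Shannon entropy (nats) of a discrete distribution."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

    probs = boltzmann_usage([1.0, 2.0, 3.0], beta=0.5)
    ```

    At beta = 0 the usage is uniform (maximum Shannon entropy); increasing beta concentrates usage on the highest-valued modes, which is the generic trade-off such a distribution encodes.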

  20. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariable dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective process by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas is made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
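    Gibbs sampling draws each variable in turn from its conditional distribution given the others, which is why it sidesteps direct construction of the full trivariate density. The sketch below shows the mechanics on a bivariate standard normal with correlation rho, a stand-in example; the entropy-copula conditionals themselves are not reproduced here:

    ```python
    import math
    import random

    def gibbs_bivariate_normal(rho, n, burn=1000, seed=42):
        """Gibbs sampler for a standard bivariate normal with correlation rho:
        each coordinate is drawn from its exact conditional given the other."""
        rng = random.Random(seed)
        x = y = 0.0
        cond_sd = math.sqrt(1.0 - rho * rho)
        samples = []
        for i in range(n + burn):
            x = rng.gauss(rho * y, cond_sd)   # x | y ~ N(rho*y, 1 - rho^2)
            y = rng.gauss(rho * x, cond_sd)   # y | x ~ N(rho*x, 1 - rho^2)
            if i >= burn:                     # discard burn-in iterations
                samples.append((x, y))
        return samples

    pairs = gibbs_bivariate_normal(rho=0.8, n=20000)
    ```

    For the flood application, the same loop would cycle through peak, volume, and duration, each drawn from its one-dimensional conditional under the fitted entropy copula.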

  1. Entangled de Sitter from stringy axionic Bell pair I: an analysis using Bunch-Davies vacuum

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayantan; Panda, Sudhakar

    2018-01-01

    In this work, we study quantum entanglement and compute the entanglement entropy in de Sitter space for a bipartite quantum field theory driven by an axion originating from Type IIB string compactification on a Calabi-Yau threefold (CY^3) in the presence of an NS5 brane. For this computation, we consider a spherical surface S^2, which divides the spatial slice of de Sitter (dS_4) into exterior and interior sub-regions. We take the initial vacuum to be the Bunch-Davies state. First we derive the solution of the wave function of the axion in a hyperbolic open chart by constructing a suitable basis for the Bunch-Davies vacuum state using a Bogoliubov transformation. We then derive the expression for the density matrix by tracing over the exterior region. This allows us to compute the entanglement entropy and Rényi entropy in 3+1 dimensions. Furthermore, we quantify the UV-finite contribution of the entanglement entropy, which contains the physics of long range quantum correlations of our expanding universe. Finally, our analysis complements the necessary condition for generating non-vanishing entanglement entropy in primordial cosmology due to the axion.

  2. Seebeck Effects in N-Type and P-Type Polymers Driven Simultaneously by Surface Polarization and Entropy Differences Based on Conductor/Polymer/Conductor Thin-Film Devices

    DOE PAGES

    Hu, Dehua; Liu, Qing; Tisdale, Jeremy; ...

    2015-04-15

    This paper reports Seebeck effects driven by both surface polarization difference and entropy difference using intramolecular charge-transfer states in n-type and p-type conjugated polymers, namely IIDT and IIDDT, based on vertical conductor/polymer/conductor thin-film devices. Large Seebeck coefficients of -898 V/K and 1300 V/K are observed from n-type IIDT and p-type IIDDT, respectively, when the charge-transfer states are generated by white light illumination of 100 mW/cm2. Simultaneously, electrical conductivities are increased from almost insulating states in the dark to conducting states under photoexcitation in both n-type IIDT and p-type IIDDT devices. We find that the intramolecular charge-transfer states can largely enhance Seebeck effects in the n-type IIDT and p-type IIDDT devices, driven by both surface polarization difference and entropy difference. Furthermore, the Seebeck effects can be shifted between polarization and entropy regimes when the electrical conductivities are changed. This reveals a new concept for developing Seebeck effects by controlling polarization and entropy regimes based on charge-transfer states in vertical conductor/polymer/conductor thin-film devices.

  3. Ladar imaging detection of salient map based on PWVD and Rényi entropy

    NASA Astrophysics Data System (ADS)

    Xu, Yuannan; Zhao, Yuan; Deng, Rong; Dong, Yanbing

    2013-10-01

    Spatial-frequency information of a given image can be extracted by associating the grey-level spatial data with one of the well-known spatial/spatial-frequency distributions. The Wigner-Ville distribution (WVD) has the useful characteristic that images can be represented in spatial/spatial-frequency domains. For intensity and range images of ladar, the statistical properties of the Rényi entropy are studied through the pseudo Wigner-Ville distribution (PWVD) using one- or two-dimensional windows. We also analyze how the statistical properties of the Rényi entropy change in the ladar intensity and range images when man-made objects appear. On this foundation, a novel method for generating a saliency map based on PWVD and Rényi entropy is proposed. Target detection is then completed by segmenting the saliency map with a simple and convenient threshold method. For ladar intensity and range images, experimental results show that the proposed method can effectively detect military vehicles against a complex earth background with a low false alarm rate.
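    The Rényi entropy underlying the saliency statistic has the closed form H_α(p) = log(Σ_i p_i^α)/(1 - α), recovering the Shannon entropy as α → 1. A minimal sketch (the paper's window sizes and choice of α are not reproduced; the patch distributions below are illustrative):

    ```python
    import math

    def renyi_entropy(p, alpha):
        """Rényi entropy (nats) of a discrete distribution p;
        alpha = 1 falls back to the Shannon entropy limit."""
        if alpha <= 0:
            raise ValueError("alpha must be positive")
        if abs(alpha - 1.0) < 1e-12:
            return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
        return math.log(sum(pi ** alpha for pi in p if pi > 0.0)) / (1.0 - alpha)

    # A flat (noisy-background) patch distribution has maximal entropy; a
    # peaked one (e.g. a regular man-made return) has much lower entropy,
    # which is what makes the entropy map useful for saliency.
    flat = renyi_entropy([0.25, 0.25, 0.25, 0.25], alpha=2.0)
    peaked = renyi_entropy([0.91, 0.03, 0.03, 0.03], alpha=2.0)
    assert flat > peaked
    ```

    In the saliency map, each pixel's local PWVD coefficients are normalized into such a distribution before the entropy is evaluated.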

  4. EEG based topography analysis in string recognition task

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Vision perception and recognition is a complex process, during which different parts of the brain are involved depending on the specific modality of the vision target, e.g. face, character, or word. In this study, brain activities in a string recognition task, compared with an idle control state, are analyzed through topographies based on multiple measures, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in the string recognition task are significantly higher than those in the idle state, especially at locations P4, O2, T6 and C4. This implies that these regions are highly involved in the string recognition task. Since symbolic sample entropy measures complexity from the perspective of new information generation, and normalized rhythm power reveals the power distribution in the frequency domain, the two types of indices provide complementary information about the underlying dynamics.
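    Sample entropy, one of the measures above, is -ln(A/B), where B counts pairs of length-m templates that match within a tolerance and A the corresponding length-(m+1) matches. A sketch with common defaults (m = 2, r = 0.2 times the standard deviation); the study's actual parameters and its symbolic variant are not reproduced:

    ```python
    import math
    import statistics

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r) of a 1-D series under the Chebyshev distance,
        with tolerance r times the series' standard deviation."""
        tol = r * statistics.pstdev(x)
        n = len(x)

        def matches(mm):
            count = 0
            for i in range(n - mm):
                for j in range(i + 1, n - mm):
                    if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= tol:
                        count += 1
            return count

        b = matches(m)
        a = matches(m + 1)
        if a == 0 or b == 0:
            return float("inf")   # no matches: treat as maximal irregularity
        return -math.log(a / b)

    # A strictly periodic series is highly predictable (low SampEn)
    # compared with an irregular one.
    periodic = [0.0, 1.0] * 50
    irregular = [math.sin(1.7 * i * i) for i in range(100)]
    ```

    Higher values indicate that knowing m past samples helps little in predicting the next one, which is the "new information generation" reading used in the study.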

  5. A synergistic approach to protein crystallization: Combination of a fixed-arm carrier with surface entropy reduction

    PubMed Central

    Moon, Andrea F; Mueller, Geoffrey A; Zhong, Xuejun; Pedersen, Lars C

    2010-01-01

    Protein crystallographers are often confronted with recalcitrant proteins not readily crystallizable, or which crystallize in problematic forms. A variety of techniques have been used to surmount such obstacles: crystallization using carrier proteins or antibody complexes, chemical modification, surface entropy reduction, proteolytic digestion, and additive screening. Here we present a synergistic approach for successful crystallization of proteins that do not form diffraction quality crystals using conventional methods. This approach combines favorable aspects of carrier-driven crystallization with surface entropy reduction. We have generated a series of maltose binding protein (MBP) fusion constructs containing different surface mutations designed to reduce surface entropy and encourage crystal lattice formation. The MBP advantageously increases protein expression and solubility, and provides a streamlined purification protocol. Using this technique, we have successfully solved the structures of three unrelated proteins that were previously unattainable. This crystallization technique represents a valuable rescue strategy for protein structure solution when conventional methods fail. PMID:20196072

  6. Megahertz-Rate Semi-Device-Independent Quantum Random Number Generators Based on Unambiguous State Discrimination

    NASA Astrophysics Data System (ADS)

    Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas

    2017-05-01

    An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation achieving rates of 16.5 Mbit/s. Combining ease of implementation, a high rate, and real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.
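    The scale of the randomness source can be sketched from textbook results: for two equiprobable pure states with overlap |⟨ψ0|ψ1⟩|, the optimal unambiguous-discrimination measurement is inconclusive with probability equal to that overlap (the Ivanovic-Dieks-Peres bound), and the binary entropy of the conclusive/inconclusive flag is a crude per-round randomness figure, not the paper's certified semi-device-independent bound. Illustrative numbers only:

    ```python
    import math

    def usd_inconclusive_prob(overlap):
        """Minimal inconclusive probability for unambiguous discrimination of
        two equiprobable pure states with |<psi0|psi1>| = overlap (IDP bound)."""
        if not 0.0 <= overlap <= 1.0:
            raise ValueError("overlap must be in [0, 1]")
        return overlap

    def binary_entropy(p):
        """Shannon entropy (bits) of a Bernoulli(p) flag."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    # Example: overlap 0.5 -> half the rounds are inconclusive, and the
    # conclusive/inconclusive flag carries up to 1 bit per round.
    p_inc = usd_inconclusive_prob(0.5)
    h = binary_entropy(p_inc)
    ```

    Orthogonal states (overlap 0) never yield inconclusive events and hence no randomness from this source, which is why nonorthogonality is essential to the protocol.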

  7. Rényi Entropies from Random Quenches in Atomic Hubbard and Spin Models.

    PubMed

    Elben, A; Vermersch, B; Dalmonte, M; Cirac, J I; Zoller, P

    2018-02-02

    We present a scheme for measuring Rényi entropies in generic atomic Hubbard and spin models using single copies of a quantum state and for partitions in arbitrary spatial dimensions. Our approach is based on the generation of random unitaries from random quenches, implemented using engineered time-dependent disorder potentials, and standard projective measurements, as realized by quantum gas microscopes. By analyzing the properties of the generated unitaries and the role of statistical errors, with respect to the size of the partition, we show that the protocol can be realized in existing quantum simulators and used to measure, for instance, area law scaling of entanglement in two-dimensional spin models or the entanglement growth in many-body localized systems.

  8. Rényi Entropies from Random Quenches in Atomic Hubbard and Spin Models

    NASA Astrophysics Data System (ADS)

    Elben, A.; Vermersch, B.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a scheme for measuring Rényi entropies in generic atomic Hubbard and spin models using single copies of a quantum state and for partitions in arbitrary spatial dimensions. Our approach is based on the generation of random unitaries from random quenches, implemented using engineered time-dependent disorder potentials, and standard projective measurements, as realized by quantum gas microscopes. By analyzing the properties of the generated unitaries and the role of statistical errors, with respect to the size of the partition, we show that the protocol can be realized in existing quantum simulators and used to measure, for instance, area law scaling of entanglement in two-dimensional spin models or the entanglement growth in many-body localized systems.

  9. On Entropy Trail

    NASA Astrophysics Data System (ADS)

    Farokhi, Saeed; Taghavi, Ray; Keshmiri, Shawn

    2015-11-01

    Stealth technology is developed for military aircraft to minimize their signatures. The primary attention was focused on the radar signature, followed by the thermal and noise signatures of the vehicle. For radar evasion, advanced configuration designs and the extensive use of carbon composites and radar-absorbing material have been developed. For the thermal signature, mainly in the infra-red (IR) bandwidth, the solution was found in blended rectangular nozzles of high aspect ratio that are shielded from ground detectors. For noise, quiet and calm jets are integrated into vehicles with low-turbulence configuration designs. However, these technologies are incapable of detecting a new generation of revolutionary aircraft, which will use all-electric, distributed propulsion systems that are thermally transparent. In addition, composite skin and non-emitting sensors onboard the aircraft will lead to a low signature. However, based on the second law of thermodynamics, no air vehicle can escape leaving an entropy trail. Entropy is thus the only inevitable signature of any system that, once measured, can reveal the source. By characterizing the entropy field based on its statistical properties, the source may be recognized, akin to face recognition technology. Direct measurement of entropy is cumbersome; however, as a derived property, it can be computed from easily measured quantities. The measurement accuracy depends on the probe design and the onboard sensors. A novel air data sensor suite is introduced with promising potential to capture the entropy trail.

  10. The Root Cause of the Overheating Problem

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2017-01-01

    Previously we identified the receding flow, in which two fluid streams recede from each other, as an open numerical problem, because all well-known numerical fluxes give an anomalous temperature rise, hence called the overheating problem. This phenomenon, although presented in several textbooks and many previous publications, has scarcely been satisfactorily addressed, and the root cause of the overheating problem is not well understood. We found that this temperature rise was solely connected to the entropy rise and proposed to use the method of characteristics to eradicate the problem. However, the root cause of the entropy production was still unclear. In the present study, we identify the cause of this problem: the entropy rise is rooted in the pressure flux in a finite volume formulation and is implanted at the first time step. It is found to be theoretically inevitable for all existing numerical flux schemes used in the finite volume setting, as confirmed by numerical tests. This difficulty cannot be eliminated by manipulating the time step, grid size, spatial accuracy, etc., although the rate of overheating depends on the flux scheme used. Finally, we incorporate the entropy transport equation, in place of the energy equation, to ensure preservation of entropy, thus correcting this temperature anomaly. Its applicability is demonstrated for some relevant 1D and 2D problems. Thus, the present study validates that the entropy generated ab initio is the genesis of the overheating problem.

  11. Reactive Power Compensation Method Considering Minimum Effective Reactive Power Reserve

    NASA Astrophysics Data System (ADS)

    Gong, Yiyu; Zhang, Kai; Pu, Zhang; Li, Xuenan; Zuo, Xianghong; Zhen, Jiao; Sudan, Teng

    2017-05-01

    Starting from a calculation model for the minimum generator reactive power reserve that guarantees power system voltage stability, the generator reactive power reserve and reactive power compensation are combined into a multi-objective optimization problem, and a reactive power compensation optimization method that considers the minimum generator reactive power reserve is proposed. By improving the objective function and the constraint conditions, the method determines the minimum reactive power compensation at load nodes when, as the system load grows, the generators' reactive power alone can no longer meet the requirements of secure operation, thereby increasing the effective reactive power reserve.

  12. Variation principle in calculating the flow of a two-phase mixture in the pipes of the cooling systems in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Aksenov, Andrey; Malysheva, Anna

    2018-03-01

    This paper gives an analytical solution to one of the urgent problems of modern hydromechanics and heat engineering: the distribution of the gas and liquid phases over the channel cross-section, the thickness of the annular layer, and their connection with the mass content of the gas phase in a gas-liquid flow. The analytical method is based on the fundamental principles of theoretical mechanics and thermophysics, namely the minimum of energy dissipation and the minimum rate of increase of the system entropy, which determine the stability of stationary states and processes. The dependencies obtained disclose the physical laws of the motion of two-phase media and can be used in hydraulic calculations during the design and operation of refrigeration and air-conditioning systems.

  13. Study of Thermodynamics of Liquid Noble-Metals Alloys Through a Pseudopotential Theory

    NASA Astrophysics Data System (ADS)

    Vora, Aditya M.

    2010-09-01

    The Gibbs-Bogoliubov (GB) inequality is applied to investigate the thermodynamic properties of some equiatomic noble-metal alloys in the liquid phase, such as Au-Cu, Ag-Cu, and Ag-Au, using the well-recognized pseudopotential formalism. For the description of the structure, the well-known Percus-Yevick (PY) hard-sphere model is used as a reference system. By applying a variational method, the hard-core diameters corresponding to minimum free energy have been found. With this procedure the thermodynamic properties, such as the entropy and heat of mixing, have been computed. The influence of the local field correction functions, viz. Hartree (H), Taylor (T), Ichimaru-Utsumi (IU), Farid et al. (F), and Sarkar et al. (S), is also investigated. The computed results for the excess entropy compare favourably for the liquid alloys, while the agreement with experiment is poor for the heats of mixing. This may be due to the sensitivity of the heats of mixing to the potential parameters and the dielectric function.

  14. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    NASA Technical Reports Server (NTRS)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  15. Secondary electric power generation with minimum engine bleed

    NASA Technical Reports Server (NTRS)

    Tagge, G. E.

    1983-01-01

    Secondary electric power generation with minimum engine bleed is discussed. Present and future jet engine systems are compared. The role of auxiliary power units is evaluated. Details of secondary electric power generation systems with and without auxiliary power units are given. Advanced bleed systems are compared with minimum bleed systems. A cost model of ownership is given. The difference in the cost of ownership between a minimum bleed system and an advanced bleed system is given.

  16. Inference of gene regulatory networks from time series by Tsallis entropy

    PubMed Central

    2011-01-01

    Background The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems of Systems Biology nowadays. Many techniques and models have been proposed for this task. However, it is not generally possible to recover the original topology with great accuracy, mainly because the time series are short in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is proposed here. Results In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are obtained by random drawing from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, on the other hand, vary in network size, and their topologies are based on real networks. The dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions A remarkable improvement of accuracy was observed in the experimental results: the non-Shannon entropy reduced the number of false connections in the inferred topology.
The obtained best free parameter of the Tsallis entropy was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRNs inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/. PMID:21545720
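    As a sketch of the quantity behind the proposed criterion function, the Tsallis entropy of a discrete distribution can be computed in a few lines (an illustration only; the inference method above applies a conditional form of this entropy within a feature selection loop):

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    Recovers the Shannon entropy (in nats) in the limit q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# Uniform distribution over four symbols, evaluated in the subextensive
# range 2.5 <= q <= 3.5 reported above (q = 1 gives the Shannon value).
p = [0.25] * 4
for q in (1.0, 2.5, 3.5):
    print(q, tsallis_entropy(p, q))
```

Note that for q > 1 the entropy of the uniform distribution decreases with q, which is why the best-fitting q range is informative about the nonextensivity of the networks.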

  17. Coherent entropy induced and acoustic noise separation in compact nozzles

    NASA Astrophysics Data System (ADS)

    Tao, Wenjie; Schuller, Thierry; Huet, Maxime; Richecoeur, Franck

    2017-04-01

    A method to separate entropy-induced noise from an acoustic pressure wave in a harmonically perturbed flow through a nozzle is presented. It is tested on an original experimental setup that simultaneously generates acoustic and temperature fluctuations in an air flow accelerated by a convergent nozzle. The setup mimics the direct and indirect noise contributions to the acoustic pressure field in a confined combustion chamber by producing synchronized acoustic and temperature fluctuations, without dealing with the complexity of the combustion process. It allows generating temperature fluctuations with amplitudes up to 10 K in the frequency range from 10 to 100 Hz. The noise separation technique uses experiments with and without temperature fluctuations to determine the relative levels of acoustic and entropy fluctuations in the system and to identify the nozzle response to these forcing waves. It requires multi-point measurements of acoustic pressure and temperature. The separation method is first validated with direct numerical simulations of the nonlinear Euler equations. These simulations are used to investigate the conditions under which the separation technique is valid and yield trends similar to the experiments for the investigated flow operating conditions. The separation method then successfully recovers the acoustic reflection coefficient, but it does not recover the entropy reflection coefficient predicted by compact nozzle theory, owing to the sensitivity of the method to signal noise under the explored experimental conditions. This methodology provides a framework for the experimental investigation of direct and indirect combustion noise originating from synchronized perturbations.

  18. Study of water based nanofluid flows in annular tubes using numerical simulation and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Siadaty, Moein; Kazazi, Mohsen

    2018-04-01

    The convective heat transfer, entropy generation and pressure drop of two water-based nanofluids (Cu-water and Al2O3-water) in horizontal annular tubes are scrutinized by means of computational fluid dynamics (CFD), response surface methodology (RSM) and sensitivity analysis. First, central composite design is used to plan a series of experiments over the diameter ratio, length-to-diameter ratio, Reynolds number and solid volume fraction. Then, CFD is used to calculate the Nusselt number, Euler number and entropy generation. After that, RSM is applied to fit second-order polynomials to the responses. Finally, sensitivity analysis is conducted to assess the influence of the above-mentioned parameters inside the tube. In total, 62 different cases are examined. The CFD results show that Cu-water and Al2O3-water have the highest and lowest heat transfer rates, respectively. In addition, analysis of variance indicates that an increase in solid volume fraction increases the dimensionless pressure drop for Al2O3-water; moreover, it has a significant negative effect on the Cu-water Nusselt number and an insignificant effect on its Euler number. Analysis of the Bejan number indicates that frictional and thermal entropy generation are the dominant irreversibilities in the Al2O3-water and Cu-water flows, respectively. The sensitivity analysis indicates that the sensitivity of the dimensionless pressure drop to tube length for Cu-water is independent of its diameter ratio at different Reynolds numbers.
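    The Bejan number used in the irreversibility analysis above is simply the thermal fraction of the total entropy generation; a minimal sketch (the numerical values below are illustrative, not taken from the study):

```python
def bejan_number(s_gen_thermal, s_gen_frictional):
    """Bejan number Be = S_gen,thermal / (S_gen,thermal + S_gen,frictional).

    Be close to 1 means heat-transfer irreversibility dominates (as reported
    for Cu-water); Be close to 0 means fluid friction dominates (Al2O3-water).
    """
    return s_gen_thermal / (s_gen_thermal + s_gen_frictional)

# Hypothetical entropy generation rates in W/K:
print(bejan_number(0.8, 0.2))  # thermal-dominated flow
print(bejan_number(0.1, 0.9))  # friction-dominated flow
```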

  19. Entropy measures detect increased movement variability in resistance training when elite rugby players use the ball.

    PubMed

    Moras, Gerard; Fernández-Valdés, Bruno; Vázquez-Guerrero, Jairo; Tous-Fajardo, Julio; Exel, Juliana; Sampaio, Jaime

    2018-05-24

    This study described the variability in acceleration during a resistance training task performed on horizontal inertial flywheels without (NOBALL) or with (BALL) the constraint of catching and throwing a rugby ball. Twelve elite rugby players (mean ± SD: age 25.6 ± 3.0 years, height 1.82 ± 0.07 m, weight 94.0 ± 9.9 kg) performed a resistance training task in both conditions (NOBALL and BALL). Players had five minutes of a standardized warm-up, followed by two series of six repetitions of both conditions: in the first three repetitions the intensity was progressively increased, while the last three were performed at maximal voluntary effort. Thereafter, the participants performed two series of eight repetitions of each condition over two days in a random order, with a minimum of 10 min between series. The structure of variability was analysed using non-linear measures of entropy. Mean changes (%; ±90% CL) of 4.64; ±3.1 g for mean acceleration and 39.48; ±36.63 a.u. for sample entropy indicated likely and very likely increases in the BALL condition. Multiscale entropy also showed higher unpredictability of acceleration under the BALL condition, especially at higher time scales. The application of match-specific constraints in resistance training for rugby players elicits different amounts of variability of body acceleration across multiple physiological time scales. Understanding the non-linear processes inherent to the manipulation of resistance training variables with constraints and the associated motor adaptations may help coaches and trainers to enhance the effectiveness of physical training and, ultimately, better understand and maximize sports performance. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
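    Sample entropy, one of the non-linear measures used above, can be sketched as follows (a simplified implementation: published definitions differ slightly in how template counts are taken, and the tolerance r is normally set relative to the signal's standard deviation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn = -ln(A / B), where B is the number of pairs of
    length-m templates within tolerance r (Chebyshev distance) and A is the
    same count for templates of length m + 1. Self-matches are excluded by
    counting only pairs with i < j."""
    n = len(x)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

# A perfectly regular signal yields a low sample entropy.
print(sample_entropy([0, 1] * 5))
```

A noisier, less predictable acceleration trace yields fewer repeated templates at length m + 1 relative to length m, and hence a higher SampEn, which is the sense in which the BALL condition "increased" entropy.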

  20. Analysing generator matrices G of similar state but varying minimum determinants

    NASA Astrophysics Data System (ADS)

    Harun, H.; Razali, M. F.; Rahman, N. A. Abdul

    2016-10-01

    Since Tarokh discovered Space-Time Trellis Codes (STTC) in 1998, considerable effort has been made to improve the performance of the original STTC. One way of achieving enhancement is by focusing on the generator matrix G, which represents the encoder structure for the STTC. Until now, researchers have only compared STTCs of different states when analyzing the performance of the generator matrix G. No effort has been made on different generator matrices G of the same state, the reason being that it is difficult to produce a wide variety of generator matrices G with diverse minimum determinants. In this paper a number of generator matrices G with minimum determinants of four (4), eight (8) and sixteen (16) for the same state (i.e., 4-PSK) have been successfully produced. The performance of the different generator matrices G in terms of bit error rate (BER) and signal-to-noise ratio (SNR) in a Rayleigh fading environment is compared and evaluated. It is found from the MATLAB simulation that at low SNR (<8), the BER of generator matrices G with a smaller minimum determinant is comparatively lower than that of those with a higher minimum determinant. However, at high SNR (>14) there is no significant difference between the BERs of these generator matrices G.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donnelly, William; Freidel, Laurent

    We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.

  2. Structure-Activity Relationship and Molecular Mechanics Reveal the Importance of Ring Entropy in the Biosynthesis and Activity of a Natural Product.

    PubMed

    Tran, Hai L; Lexa, Katrina W; Julien, Olivier; Young, Travis S; Walsh, Christopher T; Jacobson, Matthew P; Wells, James A

    2017-02-22

    Macrocycles are appealing drug candidates due to their high affinity, specificity, and favorable pharmacological properties. In this study, we explored the effects of chemical modifications to a natural product macrocycle upon its activity, 3D geometry, and conformational entropy. We chose thiocillin as a model system, a thiopeptide in the ribosomally encoded family of natural products that exhibits potent antimicrobial effects against Gram-positive bacteria. Since thiocillin is derived from a genetically encoded peptide scaffold, site-directed mutagenesis allows for rapid generation of analogues. To understand thiocillin's structure-activity relationship, we generated a site-saturation mutagenesis library covering each position along thiocillin's macrocyclic ring. We report the identification of eight unique compounds more potent than wild-type thiocillin, the best having an 8-fold improvement in potency. Computational modeling of thiocillin's macrocyclic structure revealed a striking requirement for a low-entropy macrocycle for activity. The populated ensembles of the active mutants showed a rigid structure with few adoptable conformations while inactive mutants showed a more flexible macrocycle which is unfavorable for binding. This finding highlights the importance of macrocyclization in combination with rigidifying post-translational modifications to achieve high-potency binding.

  3. Three perspectives on complexity: entropy, compression, subsymmetry

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi

    2017-12-01

    There is no single universally accepted definition of `Complexity'. There are several perspectives on complexity and what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. In this paper, we explore the following perspectives on complexity: effort-to-describe (Shannon entropy H, Lempel-Ziv complexity LZ), effort-to-compress (ETC complexity) and degree-of-order (Subsymmetry or SubSym). While Shannon entropy and LZ are very popular and widely used, ETC is a relatively new complexity measure. In this paper, we also propose a novel normalized complexity measure SubSym based on the existing idea of counting the number of subsymmetries or palindromes within a sequence. We compare the performance of these complexity measures on the following tasks: (A) characterizing the complexity of short binary sequences of lengths 4 to 16, (B) distinguishing periodic and chaotic time series from the 1D logistic map and the 2D Hénon map, (C) analyzing the complexity of stochastic time series generated from 2-state Markov chains, and (D) distinguishing between tonic and irregular spiking patterns generated from the `Adaptive exponential integrate-and-fire' neuron model. Our study reveals that each perspective has its own advantages and uniqueness while also having an overlap with each other.
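    Two of the effort-to-describe measures can be sketched in a few lines (the LZ count below is a simple distinct-phrase parse in the LZ78 style; the paper's LZ variant may differ in detail):

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy H of the symbol distribution, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def lz_complexity(seq):
    """LZ78-style complexity: scan left to right, cutting a new phrase each
    time the current substring has not been seen as a phrase before."""
    phrases, phrase = set(), ""
    for ch in seq:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

print(shannon_entropy("ababab"), lz_complexity("ababab"))
```

The contrast between the two perspectives shows up already on this toy input: a strictly alternating string has maximal per-symbol Shannon entropy (1 bit) yet a small phrase count, because its structure is easy to describe.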

  4. Cattaneo-Christov based study of {TiO}_2 -CuO/EG Casson hybrid nanofluid flow over a stretching surface with entropy generation

    NASA Astrophysics Data System (ADS)

    Jamshed, Wasim; Aziz, Asim

    2018-06-01

    In the present research, a simplified mathematical model is presented to study the heat transfer and entropy generation of a thermal system containing a hybrid nanofluid. The nanofluid occupies the space over an infinite horizontal surface, and the flow is induced by the non-linear stretching of the surface. A uniform transverse magnetic field, the Cattaneo-Christov heat flux model and thermal radiation effects are also included in the present study. The similarity technique is employed to reduce the governing non-linear partial differential equations to a set of ordinary differential equations. The Keller box numerical scheme is then used to approximate the solutions for the thermal analysis. Results are presented for conventional copper oxide-ethylene glycol (CuO-EG) and hybrid titanium-copper oxide/ethylene glycol ({TiO}_2 -CuO/EG) nanofluids. Spherical, hexahedron, tetrahedron, cylindrical, and lamina-shaped nanoparticles are considered in the present analysis. The significant findings of the study are the enhanced heat transfer capability of hybrid nanofluids over conventional nanofluids, the greatest heat transfer rate for the smallest value of the shape factor parameter, and the increase in the overall entropy of the system with increasing Reynolds and Brinkman numbers.

  5. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany forecasts, and they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling the seasonality, periodicity, and correlation structure, and assessing the uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, since spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves the determination of the spectral density, the determination of parameters, and the extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.

  6. Essential equivalence of the general equation for the nonequilibrium reversible-irreversible coupling (GENERIC) and steepest-entropy-ascent models of dissipation for nonequilibrium thermodynamics.

    PubMed

    Montefusco, Alberto; Consonni, Francesco; Beretta, Gian Paolo

    2015-04-01

    By reformulating the steepest-entropy-ascent (SEA) dynamical model for nonequilibrium thermodynamics in the mathematical language of differential geometry, we compare it with the primitive formulation of the general equation for the nonequilibrium reversible-irreversible coupling (GENERIC) model and discuss the main technical differences of the two approaches. In both dynamical models the description of dissipation is of the "entropy-gradient" type. SEA focuses only on the dissipative, i.e., entropy generating, component of the time evolution, chooses a sub-Riemannian metric tensor as dissipative structure, and uses the local entropy density field as potential. GENERIC emphasizes the coupling between the dissipative and nondissipative components of the time evolution, chooses two compatible degenerate structures (Poisson and degenerate co-Riemannian), and uses the global energy and entropy functionals as potentials. As an illustration, we rewrite the known GENERIC formulation of the Boltzmann equation in terms of the square root of the distribution function adopted by the SEA formulation. We then provide a formal proof that in more general frameworks, whenever all degeneracies in the GENERIC framework are related to conservation laws, the SEA and GENERIC models of the dissipative component of the dynamics are essentially interchangeable, provided of course they assume the same kinematics. As part of the discussion, we note that equipping the dissipative structure of GENERIC with the Leibniz identity makes it automatically SEA on metric leaves.

  7. Swellix: a computational tool to explore RNA conformational space.

    PubMed

    Sloat, Nathan; Liu, Jui-Wen; Schroeder, Susan J

    2017-11-21

    The sequence of nucleotides in an RNA determines the possible base pairs for an RNA fold and thus also determines the overall shape and function of an RNA. The Swellix program presented here combines a helix abstraction with a combinatorial approach to the RNA folding problem in order to compute all possible non-pseudoknotted RNA structures for RNA sequences. The Swellix program builds on the Crumple program and can include experimental constraints on global RNA structures such as the minimum number and lengths of helices from crystallography, cryoelectron microscopy, or in vivo crosslinking and chemical probing methods. The conceptual advance in Swellix is to count helices and generate all possible combinations of helices rather than counting and combining base pairs. Swellix bundles similar helices and includes improvements in memory use and efficient parallelization. Biological applications of Swellix are demonstrated by computing the reduction in conformational space and entropy due to naturally modified nucleotides in tRNA sequences and by motif searches in Human Endogenous Retroviral (HERV) RNA sequences. The Swellix motif search reveals occurrences of protein and drug binding motifs in the HERV RNA ensemble that do not occur in minimum free energy or centroid predicted structures. Swellix presents significant improvements over Crumple in terms of efficiency and memory use. The efficient parallelization of Swellix enables the computation of sequences as long as 418 nucleotides with sufficient experimental constraints. Thus, Swellix provides a practical alternative to free energy minimization tools when multiple structures, kinetically determined structures, or complex RNA-RNA and RNA-protein interactions are present in an RNA folding problem.

  8. Natural approach to quantum dissipation

    NASA Astrophysics Data System (ADS)

    Taj, David; Öttinger, Hans Christian

    2015-12-01

    The dissipative dynamics of a quantum system weakly coupled to one or several reservoirs is usually described in terms of a Lindblad generator. The popularity of this approach is certainly due to the linear character of the latter. However, while such linearity finds justification from an underlying Hamiltonian evolution in some scaling limit, it does not rely on solid physical motivations at small but finite values of the coupling constants, where the generator is typically used for applications. The Markovian quantum master equations we propose are instead supported by very natural thermodynamic arguments. They themselves arise from Markovian master equations for the system and the environment which preserve factorized states and mean energy and generate entropy at a non-negative rate. The dissipative structure is driven by an entropic map, called modular, which introduces nonlinearity. The generated modular dynamical semigroup (MDS) guarantees for the positivity of the time evolved state the correct steady state properties, the positivity of the entropy production, and a positive Onsager matrix with symmetry relations arising from Green-Kubo formulas. We show that the celebrated Davies Lindblad generator, obtained through the Born and the secular approximations, generates a MDS. In doing so we also provide a nonlinear MDS which is supported by a weak coupling argument and is free from the limitations of the Davies generator.

  9. Bacterial protease uses distinct thermodynamic signatures for substrate recognition.

    PubMed

    Bezerra, Gustavo Arruda; Ohara-Nemoto, Yuko; Cornaciu, Irina; Fedosyuk, Sofiya; Hoffmann, Guillaume; Round, Adam; Márquez, José A; Nemoto, Takayuki K; Djinović-Carugo, Kristina

    2017-06-06

    Porphyromonas gingivalis and Porphyromonas endodontalis are important bacteria related to periodontitis, the most common chronic inflammatory disease in humans worldwide. Its comorbidity with systemic diseases, such as type 2 diabetes, oral cancers and cardiovascular diseases, continues to generate considerable interest. Surprisingly, these two microorganisms do not ferment carbohydrates; rather they use proteinaceous substrates as carbon and energy sources. However, the underlying biochemical mechanisms of their energy metabolism remain unknown. Here, we show that dipeptidyl peptidase 11 (DPP11), a central metabolic enzyme in these bacteria, undergoes a conformational change upon peptide binding to distinguish substrates from end products. It binds substrates through an entropy-driven process and end products in an enthalpy-driven fashion. We show that an increase in protein conformational entropy is the main driving force for substrate binding via the unfolding of specific regions of the enzyme ("entropy reservoirs"). The relationship between our structural and thermodynamics data yields a distinct model for protein-protein interactions where protein conformational entropy modulates the binding free energy. Further, our findings provide a framework for the structure-based design of specific DPP11 inhibitors.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Dehua; Liu, Qing; Tisdale, Jeremy

    This paper reports Seebeck effects driven by both surface polarization difference and entropy difference, using intramolecular charge-transfer states in n-type and p-type conjugated polymers, namely IIDT and IIDDT, in vertical conductor/polymer/conductor thin-film devices. Large Seebeck coefficients of -898 V/K and 1300 V/K are observed from n-type IIDT and p-type IIDDT, respectively, when the charge-transfer states are generated by white light illumination of 100 mW/cm2. Simultaneously, electrical conductivities increase from almost insulating states in the dark to conducting states under photoexcitation in both n-type IIDT and p-type IIDDT devices. We find that the intramolecular charge-transfer states can largely enhance the Seebeck effects in the n-type IIDT and p-type IIDDT devices, driven by both surface polarization difference and entropy difference. Furthermore, the Seebeck effects can be shifted between polarization and entropy regimes when the electrical conductivities are changed. This reveals a new concept for developing Seebeck effects by controlling the polarization and entropy regimes based on charge-transfer states in vertical conductor/polymer/conductor thin-film devices.

  11. Coarse-graining errors and numerical optimization using a relative entropy framework

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2011-03-01

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, Srel, that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework.
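    The variational idea can be illustrated on a toy discrete system: compute the relative entropy S_rel between a reference distribution and a one-parameter coarse-grained trial family, then pick the parameter that minimizes it (the distributions and trial family below are hypothetical, chosen only to show the mechanics):

```python
import math

def relative_entropy(p, q):
    """Relative entropy (KL divergence) S_rel = sum_i p_i ln(p_i / q_i):
    the information lost when the coarse model q replaces the reference p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical reference (fine-grained) distribution over four states.
p_ref = [0.4, 0.3, 0.2, 0.1]

# One-parameter trial family: interpolation between uniform and peaked.
def q_trial(t):
    uniform = [0.25, 0.25, 0.25, 0.25]
    peaked = [0.70, 0.10, 0.10, 0.10]
    return [(1 - t) * u + t * k for u, k in zip(uniform, peaked)]

# Grid search for the parameter minimizing S_rel (a stand-in for the
# gradient-based numerical strategies discussed in the paper).
best_t = min((i / 100 for i in range(101)),
             key=lambda t: relative_entropy(p_ref, q_trial(t)))
print(best_t, relative_entropy(p_ref, q_trial(best_t)))
```

The residual S_rel at the optimum quantifies the information irrecoverably lost by restricting to this trial family, mirroring the paper's point that the relative entropy controls coarse-graining errors.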

  12. Entropy uncertainty relations and stability of phase-temporal quantum cryptography with finite-length transmitted strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com

    2012-12-15

    Any key-generation session contains a finite number of quantum-state messages, and it is therefore important to understand the fundamental restrictions imposed on the minimal length of a string required to obtain a secret key with a specified length. The entropy uncertainty relations for smooth min and max entropies considerably simplify and shorten the proof of security. A proof of security of quantum key distribution with phase-temporal encryption is presented. Compared to other protocols, this protocol guarantees secure key distribution up to the maximum critical error. In addition, unlike other basic protocols (of the BB84 type), which are vulnerable with respect to an attack by 'blinding' of avalanche photodetectors, this protocol is stable with respect to such an attack and guarantees key security.
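
    The role of the critical error can be illustrated with the standard asymptotic BB84 key-rate formula r = 1 - 2h(Q) (a textbook relation, not the finite-length phase-temporal analysis of the paper), whose zero reproduces the well-known ~11% threshold:

```python
import math

def h2(q):
    """Binary Shannon entropy h(q) in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bb84_rate(q):
    """Asymptotic BB84 secret-key fraction r = 1 - 2 h(q)."""
    return 1.0 - 2.0 * h2(q)

# The rate changes sign between 11% and 12% error, the textbook threshold.
```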

  13. Anosov C-systems and random number generators

    NASA Astrophysics Data System (ADS)

    Savvidy, G. K.

    2016-08-01

    We further develop our previous proposal to use hyperbolic Anosov C-systems to generate pseudorandom numbers and to use them for efficient Monte Carlo calculations in high energy particle physics. All trajectories of hyperbolic dynamical systems are exponentially unstable, and C-systems therefore have mixing of all orders, a countable Lebesgue spectrum, and a positive Kolmogorov entropy. These exceptional ergodic properties follow from the C-condition introduced by Anosov. This condition defines a rich class of dynamical systems forming an open set in the space of all dynamical systems. An important property of C-systems is that they have a countable set of everywhere dense periodic trajectories and their density increases exponentially with entropy. Of special interest are the C-systems defined on higher-dimensional tori. Such C-systems are excellent candidates for generating pseudorandom numbers that can be used in Monte Carlo calculations. An efficient algorithm was recently constructed that allows generating long C-system trajectories very rapidly. These trajectories have good statistical properties and can be used for calculations in quantum chromodynamics and in high energy particle physics.
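
    The simplest Anosov C-system is the Arnold cat map on the 2-torus; the toy stream below (the paper's generators act on much higher-dimensional tori, so this is only a 2-d illustration) iterates it to produce pseudorandom values in [0, 1):

```python
# Arnold cat map: (x, y) -> (2x + y, x + y) mod 1, a hyperbolic (Anosov)
# automorphism of the 2-torus with eigenvalues (3 +/- sqrt(5)) / 2.
def cat_map_stream(x, y, n):
    out = []
    for _ in range(n):
        x, y = (2.0 * x + y) % 1.0, (x + y) % 1.0
        out.append(x)
    return out

vals = cat_map_stream(0.3, 0.7, 1000)
mean = sum(vals) / len(vals)  # hovers near 0.5 for an equidistributed stream
```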

  14. Geometric characterization of separability and entanglement in pure Gaussian states by single-mode unitary operations

    NASA Astrophysics Data System (ADS)

    Adesso, Gerardo; Giampaolo, Salvatore M.; Illuminati, Fabrizio

    2007-10-01

    We present a geometric approach to the characterization of separability and entanglement in pure Gaussian states of an arbitrary number of modes. The analysis is performed adapting to continuous variables a formalism based on single subsystem unitary transformations that has been recently introduced to characterize separability and entanglement in pure states of qubits and qutrits [S. M. Giampaolo and F. Illuminati, Phys. Rev. A 76, 042301 (2007)]. In analogy with the finite-dimensional case, we demonstrate that the 1×M bipartite entanglement of a multimode pure Gaussian state can be quantified by the minimum squared Euclidean distance between the state itself and the set of states obtained by transforming it via suitable local symplectic (unitary) operations. This minimum distance, corresponding to a uniquely determined extremal local operation, defines an entanglement monotone equivalent to the entropy of entanglement, and amenable to direct experimental measurement with linear optical schemes.

  15. Generative complexity of Gray-Scott model

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2018-03-01

    In the Gray-Scott reaction-diffusion system one reactant is constantly fed in the system, another reactant is reproduced by consuming the supplied reactant and also converted to an inert product. The rate of feeding one reactant in the system and the rate of removing another reactant from the system determine configurations of concentration profiles: stripes, spots, waves. We calculate the generative complexity (a morphological complexity of concentration profiles grown from a point-wise perturbation of the medium) of the Gray-Scott system for a range of the feeding and removal rates. The morphological complexity is evaluated using Shannon entropy, Simpson diversity, approximation of Lempel-Ziv complexity, and expressivity (Shannon entropy divided by space-filling). We analyse the behaviour of the systems with the highest values of the generative morphological complexity and show that the Gray-Scott systems expressing the highest levels of the complexity are composed of wave-fragments (similar to wave-fragments in sub-excitable media) and travelling localisations (similar to quasi-dissipative solitons and gliders in Conway's Game of Life).
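
    A minimal sketch of the pipeline, under assumed grid size, time step and feed/removal rates rather than the paper's parameter scan: integrate a 1-d Gray-Scott system from a point-wise perturbation, then take the Shannon-entropy part of the morphological complexity from the binned concentration profile:

```python
import numpy as np

# Assumed, illustrative parameters: grid n, feed f, removal k, diffusivities.
n, f, k, du, dv, dt = 200, 0.04, 0.06, 0.16, 0.08, 1.0
u, v = np.ones(n), np.zeros(n)
u[n // 2 - 5:n // 2 + 5] = 0.5   # point-wise perturbation of the medium
v[n // 2 - 5:n // 2 + 5] = 0.5

def lap(a):
    """Periodic 1-d Laplacian with unit grid spacing."""
    return np.roll(a, 1) - 2.0 * a + np.roll(a, -1)

for _ in range(2000):            # explicit Euler time stepping
    uvv = u * v * v
    u = u + dt * (du * lap(u) - uvv + f * (1.0 - u))
    v = v + dt * (dv * lap(v) + uvv - (f + k) * v)

# Shannon entropy of the grown concentration profile (16 bins).
hist, _ = np.histogram(v, bins=16, range=(0.0, 1.0))
p = hist[hist > 0] / hist.sum()
shannon = float(-(p * np.log2(p)).sum())
```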

  16. First-order irreversible thermodynamic approach to a simple energy converter

    NASA Astrophysics Data System (ADS)

    Arias-Hernandez, L. A.; Angulo-Brown, F.; Paez-Hernandez, R. T.

    2008-01-01

    Several authors have shown that dissipative thermal cycle models based on finite-time thermodynamics exhibit loop-shaped curves of power output versus efficiency, as occurs with actual dissipative thermal engines. Within the context of first-order irreversible thermodynamics (FOIT), in this work we show that for an energy converter consisting of two coupled fluxes it is also possible to find loop-shaped curves of both power output and the so-called ecological function versus efficiency. In a previous work Stucki [J. W. Stucki, Eur. J. Biochem. 109, 269 (1980)] used a FOIT approach to describe the modes of thermodynamic performance of oxidative phosphorylation involved in adenosine triphosphate (ATP) synthesis within mitochondria. In that work the author did not use the mentioned loop-shaped curves, and he proposed that oxidative phosphorylation operates in a steady state at both minimum entropy production and maximum efficiency simultaneously, by means of a conductance matching condition between extreme states of zero and infinite conductances, respectively. In the present work we show that all of Stucki’s results about the oxidative phosphorylation energetics can be obtained without the so-called conductance matching condition. On the other hand, we also show that the minimum entropy production state implies both null power output and null efficiency, and therefore this state is not fulfilled by the oxidative phosphorylation performance. Our results suggest that actual efficiency values of oxidative phosphorylation performance are better described by a mode of operation consisting of the simultaneous maximization of both the so-called ecological function and the efficiency.
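
    The curves discussed here can be reproduced from the textbook linear (FOIT) two-flux relations, with force ratio x and coupling strength q; the sketch below checks the scanned maximum of the efficiency eta(x) = -x(x + q)/(qx + 1) against its closed form (standard FOIT formulas with an assumed coupling q, not the paper's specific system):

```python
import math

# Linear two-flux converter: x = (X1/X2) * sqrt(L11/L22) is the reduced
# force ratio, q = L12 / sqrt(L11 * L22) the degree of coupling.
q = 0.95  # assumed, illustrative coupling

def eta(x):
    """Efficiency of the linear energy converter."""
    return -x * (x + q) / (q * x + 1.0)

# Scan the working branch x in (-0.9, -0.1) and compare with the
# closed-form maximum eta_max = q^2 / (1 + sqrt(1 - q^2))^2.
xs = [-0.9 + 0.8 * i / 10000 for i in range(10001)]
eta_num = max(eta(x) for x in xs)
eta_max = q * q / (1.0 + math.sqrt(1.0 - q * q)) ** 2
```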

  17. Information-theoretic measures of hydrogen-like ions in weakly coupled Debye plasmas

    NASA Astrophysics Data System (ADS)

    Zan, Li Rong; Jiao, Li Guang; Ma, Jia; Ho, Yew Kam

    2017-12-01

    Recent development of information theory provides researchers an alternative and useful tool to quantitatively investigate the variation of the electronic structure when atoms interact with the external environment. In this work, we make systematic studies on the information-theoretic measures for hydrogen-like ions immersed in weakly coupled plasmas modeled by Debye-Hückel potential. Shannon entropy, Fisher information, and Fisher-Shannon complexity in both position and momentum spaces are quantified in high accuracy for the hydrogen atom in a large number of stationary states. The plasma screening effect on embedded atoms can significantly affect the electronic density distributions, in both conjugate spaces, and it is quantified by the variation of information quantities. It is shown that the composite quantities (the Shannon entropy sum and the Fisher information product in combined spaces and Fisher-Shannon complexity in individual space) give a more comprehensive description of the atomic structure information than single ones. The nodes of wave functions play a significant role in the changes of composite information quantities caused by plasmas. With the continuously increasing screening strength, all composite quantities in circular states increase monotonously, while in higher-lying excited states where nodal structures exist, they first decrease to a minimum and then increase rapidly before the bound state approaches the continuum limit. The minimum represents the most reduction of uncertainty properties of the atom in plasmas. The lower bounds for the uncertainty product of the system based on composite information quantities are discussed. Our research presents a comprehensive survey in the investigation of information-theoretic measures for simple atoms embedded in Debye model plasmas.
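
    For the unscreened hydrogen 1s state the position-space Shannon entropy has the closed form S_r = 3 + ln(pi) in atomic units; a simple radial quadrature (illustrative, far cruder than the high-accuracy methods of the paper) recovers it:

```python
import math

# S_r = -int rho ln(rho) d^3r for rho(r) = exp(-2r)/pi (hydrogen 1s, a.u.),
# integrated over the radial coordinate with weight 4*pi*r^2.
n, rmax = 20000, 40.0
dr = rmax / n
s = 0.0
for i in range(1, n + 1):
    r = i * dr
    rho = math.exp(-2.0 * r) / math.pi
    s += -rho * math.log(rho) * 4.0 * math.pi * r * r * dr

exact = 3.0 + math.log(math.pi)   # closed-form value, ~4.1447
```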

  18. Altered Enthalpy-Entropy Compensation in Picomolar Transition State Analogues of Human Purine Nucleoside Phosphorylase†

    PubMed Central

    Edwards, Achelle A.; Mason, Jennifer M.; Clinch, Keith; Tyler, Peter C.; Evans, Gary B.; Schramm, Vern L.

    2009-01-01

    Human purine nucleoside phosphorylase (PNP) belongs to the trimeric class of PNPs and is essential for catabolism of deoxyguanosine. Genetic deficiency of PNP in humans causes a specific T-cell immune deficiency and transition state analogue inhibitors of PNP are in development for treatment of T-cell cancers and autoimmune disorders. Four generations of Immucillins have been developed, each of which contains inhibitors binding with picomolar affinity to human PNP. Full inhibition of PNP occurs upon binding to the first of three subunits and binding to subsequent sites occurs with negative cooperativity. In contrast, substrate analogue and product bind without cooperativity. Titrations of human PNP using isothermal calorimetry indicate that binding of a structurally rigid first-generation Immucillin (Kd = 56 pM) is driven by large negative enthalpy values (ΔH = −21.2 kcal/mol) with a substantial entropic (-TΔS) penalty. The tightest-binding inhibitors (Kd = 5 to 9 pM) have increased conformational flexibility. Despite their conformational freedom in solution, flexible inhibitors bind with high affinity because of reduced entropic penalties. Entropic penalties are proposed to arise from conformational freezing of the PNP·inhibitor complex with the entropy term dominated by protein dynamics. The conformationally flexible Immucillins reduce the system entropic penalty. Disrupting the ribosyl 5’-hydroxyl interaction of transition state analogues with PNP causes favorable entropy of binding. Tight binding of the seventeen Immucillins is characterized by large enthalpic contributions, emphasizing their similarity to the transition state. By introducing flexibility into the inhibitor structure, the enthalpy-entropy compensation pattern is altered to permit tighter binding. PMID:19425594
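
    The quoted numbers are mutually consistent, as a quick check shows: ΔG from the 56 pM dissociation constant at an assumed T = 298 K, and the entropic penalty -TΔS as the gap to the measured enthalpy:

```python
import math

R = 1.987e-3            # gas constant, kcal/(mol K)
T = 298.0               # assumed temperature, K
kd = 56e-12             # dissociation constant, M

dG = R * T * math.log(kd)    # binding free energy, about -14 kcal/mol
dH = -21.2                   # measured enthalpy, kcal/mol
tds_penalty = dG - dH        # -T*DeltaS > 0 is the entropic penalty
```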

  19. Discriminative power of visual attributes in dermatology.

    PubMed

    Giotis, Ioannis; Visser, Margaretha; Jonkman, Marcel; Petkov, Nicolai

    2013-02-01

    Visual characteristics such as color and shape of skin lesions play an important role in the diagnostic process. In this contribution, we quantify the discriminative power of such attributes using an information theoretical approach. We estimate the probability of occurrence of each attribute as a function of the skin diseases. We use the distribution of this probability across the studied diseases and its entropy to define the discriminative power of the attribute. The discriminative power has a maximum value for attributes that occur (or do not occur) for only one disease and a minimum value for those which are equally likely to be observed among all diseases. Verrucous surface, red and brown colors, and the presence of more than 10 lesions are among the most informative attributes. A ranking of attributes is also carried out and used together with a naive Bayesian classifier, yielding results that confirm the soundness of the proposed method. The proposed measure is proven to be a reliable way of assessing the discriminative power of dermatological attributes, and it also helps generate a condensed dermatological lexicon. Therefore, it can be of added value to the manual or computer-aided diagnostic process. © 2012 John Wiley & Sons A/S.
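
    A minimal sketch of the entropy-based measure as paraphrased above (with hypothetical attribute-by-disease counts): discriminative power is maximal when an attribute occurs for a single disease only, and zero when it is spread evenly across all diseases:

```python
import math

def discriminative_power(counts):
    """Max entropy minus the entropy of the attribute's disease distribution."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    h = -sum(p * math.log2(p) for p in probs)
    return math.log2(len(counts)) - h   # 0 = uninformative, log2(N) = specific

specific = discriminative_power([30, 0, 0, 0])    # seen in one disease only
uniform = discriminative_power([10, 10, 10, 10])  # spread evenly, no signal
```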

  20. Numerical analysis of single and multiple jets

    NASA Astrophysics Data System (ADS)

    Boussoufi, Mustapha; Sabeur-Bendehina, Amina; Ouadha, Ahmed; Morsli, Souad; El Ganaoui, Mohammed

    2017-05-01

    The present study aims to use the concept of entropy generation in order to study numerically the flow and the interaction of multiple jets. Several configurations of a single jet surrounded by equidistant 3, 5, 7 and 9 circumferential jets have been studied. The turbulent incompressible Navier-Stokes equations have been solved numerically using the commercial computational fluid dynamics code Fluent. The standard k-ɛ model has been selected to assess the eddy viscosity. The domain has been reduced to a quarter of the geometry due to symmetry. Results for axial and radial velocities have been compared with experimental measurements from the literature. Furthermore, additional results involving entropy generation rate have been presented and discussed. Contribution to the topical issue "Materials for Energy harvesting, conversion and storage II (ICOME 2016)", edited by Jean-Michel Nunzi, Rachid Bennacer and Mohammed El Ganaoui

  1. Alloy design for intrinsically ductile refractory high-entropy alloys

    NASA Astrophysics Data System (ADS)

    Sheikh, Saad; Shafeie, Samrand; Hu, Qiang; Ahlström, Johan; Persson, Christer; Veselý, Jaroslav; Zýka, Jiří; Klement, Uta; Guo, Sheng

    2016-10-01

    Refractory high-entropy alloys (RHEAs), comprising group IV (Ti, Zr, Hf), V (V, Nb, Ta), and VI (Cr, Mo, W) refractory elements, can be potentially new generation high-temperature materials. However, most existing RHEAs lack room-temperature ductility, similar to conventional refractory metals and alloys. Here, we propose an alloy design strategy to intrinsically ductilize RHEAs based on the electron theory and more specifically to decrease the number of valence electrons through controlled alloying. A new ductile RHEA, Hf0.5Nb0.5Ta0.5Ti1.5Zr, was developed as a proof of concept, with a fracture stress of close to 1 GPa and an elongation of near 20%. The findings here will shed light on the development of ductile RHEAs for ultrahigh-temperature applications in aerospace and power-generation industries.
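
    The design rule of decreasing the number of valence electrons can be made concrete with a valence-electron-count (VEC) calculation for the new alloy, assuming the usual values of 4 for group IV and 5 for group V elements:

```python
# Assumed per-element valence electron counts: group IV -> 4, group V -> 5.
vec_per_element = {"Hf": 4, "Nb": 5, "Ta": 5, "Ti": 4, "Zr": 4}
composition = {"Hf": 0.5, "Nb": 0.5, "Ta": 0.5, "Ti": 1.5, "Zr": 1.0}

total_atoms = sum(composition.values())
vec = sum(vec_per_element[e] * n for e, n in composition.items()) / total_atoms
# vec = 4.25, below the 4.4 of equimolar HfNbTaTiZr: Ti-rich alloying
# lowers the valence electron concentration, the proposed route to ductility.
```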

  2. Universal bounds on current fluctuations.

    PubMed

    Pietzonka, Patrick; Barato, Andre C; Seifert, Udo

    2016-05-01

    For current fluctuations in nonequilibrium steady states of Markovian processes, we derive four different universal bounds valid beyond the Gaussian regime. Different variants of these bounds apply to either the entropy change or any individual current, e.g., the rate of substrate consumption in a chemical reaction or the electron current in an electronic device. The bounds vary with respect to their degree of universality and tightness. A universal parabolic bound on the generating function of an arbitrary current depends solely on the average entropy production. A second, stronger bound requires knowledge both of the thermodynamic forces that drive the system and of the topology of the network of states. These two bounds are conjectures based on extensive numerics. An exponential bound that depends only on the average entropy production and the average number of transitions per time is rigorously proved. This bound has no obvious relation to the parabolic bound but it is typically tighter further away from equilibrium. An asymptotic bound that depends on the specific transition rates and becomes tight for large fluctuations is also derived. This bound allows for the prediction of the asymptotic growth of the generating function. Even though our results are restricted to networks with a finite number of states, we show that the parabolic bound is also valid for three paradigmatic examples of driven diffusive systems for which the generating function can be calculated using the additivity principle. Our bounds provide a general class of constraints for nonequilibrium systems.
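
    The average entropy production that controls the parabolic bound is easy to compute for a minimal driven model; the sketch below uses a biased ring-hopping system with assumed forward/backward rates p and q, where sigma = (p - q) ln(p/q) vanishes exactly at equilibrium:

```python
import math

def entropy_production(p, q):
    """Steady-state entropy production rate per link of a uniform biased ring:
    net current (p - q) times the thermodynamic force ln(p/q)."""
    return (p - q) * math.log(p / q)

driven = entropy_production(2.0, 0.5)       # far from equilibrium, sigma > 0
equilibrium = entropy_production(1.0, 1.0)  # detailed balance, sigma = 0
```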

  3. Parabolic replicator dynamics and the principle of minimum Tsallis information gain

    PubMed Central

    2013-01-01

    Background Non-linear, parabolic (sub-exponential) and hyperbolic (super-exponential) models of prebiological evolution of molecular replicators have been proposed and extensively studied. The parabolic models appear to be the most realistic approximations of real-life replicator systems due primarily to product inhibition. Unlike the more traditional exponential models, the distribution of individual frequencies in an evolving parabolic population is not described by the Maximum Entropy (MaxEnt) Principle in its traditional form, whereby the distribution with the maximum Shannon entropy is chosen among all the distributions that are possible under the given constraints. We sought to identify a more general form of the MaxEnt principle that would be applicable to parabolic growth. Results We consider a model of a population that reproduces according to the parabolic growth law and show that the frequencies of individuals in the population minimize the Tsallis relative entropy (non-additive information gain) at each time moment. Next, we consider a model of a parabolically growing population that maintains a constant total size and provide an “implicit” solution for this system. We show that in this case, the frequencies of the individuals in the population also minimize the Tsallis information gain at each moment of the “internal time” of the population. Conclusions The results of this analysis show that the general MaxEnt principle is the underlying law for the evolution of a broad class of replicator systems including not only exponential but also parabolic and hyperbolic systems. The choice of the appropriate entropy (information) function depends on the growth dynamics of a particular class of systems. The Tsallis entropy is non-additive for independent subsystems, i.e. the information on the subsystems is insufficient to describe the system as a whole. In the context of prebiotic evolution, this “non-reductionist” nature of parabolic replicator systems might reflect the importance of group selection and competition between ensembles of cooperating replicators. Reviewers This article was reviewed by Viswanadham Sridhara (nominated by Claus Wilke), Purushottam Dixit (nominated by Sergei Maslov), and Nick Grishin. For the complete reviews, see the Reviewers’ Reports section. PMID:23937956
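
    The non-additivity highlighted in the conclusions can be verified directly from the definition S_q = (1 - Σ p_i^q)/(q - 1), which for independent subsystems satisfies S_q(AB) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B) (toy distributions below):

```python
def tsallis(p, q):
    """Tsallis entropy S_q of a discrete distribution p."""
    return (1.0 - sum(pi**q for pi in p)) / (q - 1.0)

q = 2.0
a = [0.5, 0.5]                              # subsystem A
b = [0.25, 0.75]                            # subsystem B
ab = [pa * pb for pa in a for pb in b]      # joint distribution, independent

lhs = tsallis(ab, q)
rhs = tsallis(a, q) + tsallis(b, q) + (1.0 - q) * tsallis(a, q) * tsallis(b, q)
# lhs == rhs: the cross term quantifies the non-additivity for q != 1.
```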

  4. Thermodynamic perspectives on genetic instructions, the laws of biology and diseased states.

    PubMed

    Trevors, Jack T; Saier, Milton H

    2011-01-01

    This article examines in a broad perspective entropy and some examples of its relationship to evolution, genetic instructions and how we view diseases. Living organisms are programmed by functional genetic instructions (FGI), through cellular communication pathways, to grow and reproduce by maintaining a variety of hemistable, ordered structures (low entropy). Living organisms are far from equilibrium with their surrounding environmental systems, which tends towards increasing disorder (increasing entropy). Organisms free themselves from high entropy (high disorder) to maintain their cellular structures for a period of time sufficient to allow reproduction and the resultant offspring to reach reproductive ages. This time interval varies for different species. Bacteria, for example, need no sexual parents; dividing cells are nearly identical to the previous generation of cells, and can begin a new cell cycle without delay under appropriate conditions. By contrast, human infants require years of care before they can reproduce. Living organisms maintain order in spite of their changing surrounding environment that decreases order according to the second law of thermodynamics. These events actually work together since living organisms create ordered biological structures while increasing the entropy of their surroundings. From a disease perspective, viruses and other disease agents interrupt the normal functioning of cells. The pressure for survival may result in mechanisms that allow organisms to resist attacks by viruses, other pathogens, destructive chemicals and physical agents such as radiation. However, when the attack is successful, the organism can be damaged until the cell, tissue, organ or entire organism is no longer functional and entropy increases. Copyright © 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  5. On relativistic generalization of Perelman's W-entropy and thermodynamic description of gravitational fields and cosmology

    NASA Astrophysics Data System (ADS)

    Ruchin, Vyacheslav; Vacaru, Olivia; Vacaru, Sergiu I.

    2017-03-01

    Using double 2+2 and 3+1 nonholonomic fibrations on Lorentz manifolds, we extend the concept of W-entropy for gravitational fields in general relativity (GR). Such F- and W-functionals were introduced in the Ricci flow theory of three dimensional (3-d) Riemannian metrics by Perelman (the entropy formula for the Ricci flow and its geometric applications. arXiv:math.DG/0211159). Non-relativistic 3-d Ricci flows are characterized by associated statistical thermodynamical values determined by W-entropy. Generalizations for geometric flows of 4-d pseudo-Riemannian metrics are considered for models with local thermodynamical equilibrium and separation of dissipative and non-dissipative processes in relativistic hydrodynamics. The approach is elaborated in the framework of classical field theories (relativistic continuum and hydrodynamic models) without an underlying kinetic description, which will be elaborated in other work. The 3+1 splitting allows us to provide a general relativistic definition of gravitational entropy in the Lyapunov-Perelman sense. It increases monotonically as structure forms in the Universe. We can formulate a thermodynamic description of exact solutions in GR depending, in general, on all spacetime coordinates. A corresponding 2+2 splitting with nonholonomic deformation of linear connection and frame structures is necessary for generating in very general form various classes of exact solutions of the Einstein and general relativistic geometric flow equations. Finally, we speculate on physical macrostates and microstate interpretations of the W-entropy in GR, geometric flow theories and possible connections to string theory (a second unsolved problem also contained in Perelman's work) in Polyakov's approach.

  6. Analytical approach to entropy generation and heat transfer in CNT-nanofluid dynamics through a ciliated porous medium

    NASA Astrophysics Data System (ADS)

    Akbar, Noreen Sher; Shoaib, M.; Tripathi, Dharmendra; Bhushan, Shashi; Bég, O. Anwar

    2018-04-01

    The transportation of biological and industrial nanofluids by natural propulsion like cilia movement and self-generated contraction-relaxation of flexible walls has significant applications in numerous emerging technologies. Inspired by multi-disciplinary progress and innovation in this direction, a thermo-fluid mechanical model is proposed to study the entropy generation and convective heat transfer of nanofluids fabricated by the dispersion of single-wall carbon nanotubes (SWCNT) nanoparticles in water as the base fluid. The regime studied comprises heat transfer and steady, viscous, incompressible flow, induced by metachronal wave propulsion due to beating cilia, through a cylindrical tube containing a sparse (i.e., high permeability) homogenous porous medium. The flow is of the creeping type and is restricted under the low Reynolds number and long wavelength approximations. Slip effects at the wall are incorporated and the generalized Darcy drag-force model is utilized to mimic porous media effects. Cilia boundary conditions for velocity components are employed to determine analytical solutions to the resulting non-dimensionalized boundary value problem. The influence of pertinent physical parameters on temperature, axial velocity, pressure rise and pressure gradient, entropy generation function, Bejan number and stream-line distributions are computed numerically. A comparative study between SWCNT-nanofluids and pure water is also computed. The computations demonstrate that axial flow is accelerated with increasing slip parameter and Darcy number and is greater for SWCNT-nanofluids than for pure water. Furthermore the size of the bolus for SWCNT-nanofluids is larger than that of the pure water. The study is applicable in designing and fabricating nanoscale and microfluidics devices, artificial cilia and biomimetic micro-pumps.
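
    One of the computed quantities, the Bejan number, has a simple definition worth recalling: the fraction of total entropy generation due to heat transfer, Be = S_heat/(S_heat + S_friction). A sketch with hypothetical magnitudes:

```python
def bejan(s_heat, s_friction):
    """Bejan number: heat-transfer share of the total entropy generation."""
    return s_heat / (s_heat + s_friction)

# Hypothetical entropy-generation contributions (arbitrary units).
be = bejan(s_heat=0.8, s_friction=0.2)  # heat-transfer irreversibility dominates
```

    Be close to 1 indicates heat-transfer-dominated irreversibility; Be close to 0 indicates fluid-friction-dominated irreversibility.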

  8. Path-integral Monte Carlo method for Rényi entanglement entropies.

    PubMed

    Herdman, C M; Inglis, Stephen; Roy, P-N; Melko, R G; Del Maestro, A

    2014-07-01

    We introduce a quantum Monte Carlo algorithm to measure the Rényi entanglement entropies in systems of interacting bosons in the continuum. This approach is based on a path-integral ground state method that can be applied to interacting itinerant bosons in any spatial dimension with direct relevance to experimental systems of quantum fluids. We demonstrate how it may be used to compute spatial mode entanglement, particle partitioned entanglement, and the entanglement of particles, providing insights into quantum correlations generated by fluctuations, indistinguishability, and interactions. We present proof-of-principle calculations and benchmark against an exactly soluble model of interacting bosons in one spatial dimension. As this algorithm retains the fundamental polynomial scaling of quantum Monte Carlo when applied to sign-problem-free models, future applications should allow for the study of entanglement entropy in large-scale many-body systems of interacting bosons.
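
    The target quantity can be checked on a toy state: the Rényi entropy S_α = ln Tr(ρ_A^α)/(1 - α) of one qubit of a Bell pair (not the paper's itinerant bosons) equals ln 2 for α = 2:

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) on two qubits.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)

# Full density matrix reshaped to indices (a, b, a', b'); partial trace
# over subsystem B pairs axes 1 and 3.
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)
rho_a = np.trace(rho, axis1=1, axis2=3)

alpha = 2
s2 = np.log(np.trace(np.linalg.matrix_power(rho_a, alpha))) / (1 - alpha)
# Maximally entangled pair: rho_a = I/2, so S_2 = ln 2.
```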

  9. Novel quantum phase transition from bounded to extensive entanglement

    PubMed Central

    Zhang, Zhao; Ahmadain, Amr

    2017-01-01

    The nature of entanglement in many-body systems is a focus of intense research with the observation that entanglement holds interesting information about quantum correlations in large systems and their relation to phase transitions. In particular, it is well known that although generic, many-body states have large, extensive entropy, ground states of reasonable local Hamiltonians carry much smaller entropy, often associated with the boundary length through the so-called area law. Here we introduce a continuous family of frustration-free Hamiltonians with exactly solvable ground states and uncover a remarkable quantum phase transition whereby the entanglement scaling changes from area law into extensively large entropy. This transition shows that entanglement in many-body systems may be enhanced under special circumstances with a potential for generating “useful” entanglement for the purpose of quantum computing and that the full implications of locality and its restrictions on possible ground states may hold further surprises. PMID:28461464

  10. Novel quantum phase transition from bounded to extensive entanglement.

    PubMed

    Zhang, Zhao; Ahmadain, Amr; Klich, Israel

    2017-05-16

    The nature of entanglement in many-body systems is a focus of intense research with the observation that entanglement holds interesting information about quantum correlations in large systems and their relation to phase transitions. In particular, it is well known that although generic, many-body states have large, extensive entropy, ground states of reasonable local Hamiltonians carry much smaller entropy, often associated with the boundary length through the so-called area law. Here we introduce a continuous family of frustration-free Hamiltonians with exactly solvable ground states and uncover a remarkable quantum phase transition whereby the entanglement scaling changes from area law into extensively large entropy. This transition shows that entanglement in many-body systems may be enhanced under special circumstances with a potential for generating "useful" entanglement for the purpose of quantum computing and that the full implications of locality and its restrictions on possible ground states may hold further surprises.

  11. Nonlinear dynamic analysis of voices before and after surgical excision of vocal polyps

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; McGilligan, Clancy; Zhou, Liang; Vig, Mark; Jiang, Jack J.

    2004-05-01

    Phase space reconstruction, correlation dimension, and second-order entropy, methods from nonlinear dynamics, are used to analyze sustained vowels generated by patients before and after surgical excision of vocal polyps. Two conventional acoustic perturbation parameters, jitter and shimmer, are also employed to analyze voices before and after surgery. Presurgical and postsurgical analyses of jitter, shimmer, correlation dimension, and second-order entropy are statistically compared. Correlation dimension and second-order entropy show a statistically significant decrease after surgery, indicating reduced complexity and higher predictability of postsurgical voice dynamics. There is not a significant postsurgical difference in shimmer, although jitter shows a significant postsurgical decrease. The results suggest that jitter and shimmer should be applied to analyze disordered voices with caution; however, nonlinear dynamic methods may be useful for analyzing abnormal vocal function and quantitatively evaluating the effects of surgical excision of vocal polyps.
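
    The two conventional perturbation measures have simple definitions (local jitter: mean absolute difference of consecutive periods over the mean period; shimmer: the same ratio on peak amplitudes), sketched here on hypothetical cycle data:

```python
def local_perturbation(x):
    """Mean absolute cycle-to-cycle difference, normalized by the mean."""
    diffs = [abs(a - b) for a, b in zip(x[1:], x[:-1])]
    return (sum(diffs) / len(diffs)) / (sum(x) / len(x))

periods = [8.0, 8.2, 7.9, 8.1, 8.0]    # hypothetical cycle lengths (ms)
amps = [1.00, 0.97, 1.02, 0.99, 1.01]  # hypothetical peak amplitudes

jitter = local_perturbation(periods)   # frequency perturbation
shimmer = local_perturbation(amps)     # amplitude perturbation
```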

  12. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. Unlike for continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macrostructural characteristics. Comparison of entropy estimates indicated that observed signals had greater regularity than surrogates and were the result of not only stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.

  13. ATR applications of minimax entropy models of texture and shape

    NASA Astrophysics Data System (ADS)

    Zhu, Song-Chun; Yuille, Alan L.; Lanterman, Aaron D.

    2001-10-01

    Concepts from information theory have recently found favor in both the mainstream computer vision community and the military automatic target recognition community. In the computer vision literature, the principles of minimax entropy learning theory have been used to generate rich probabilistic models of texture and shape. In addition, the method of types and large deviation theory has permitted the difficulty of various texture and shape recognition tasks to be characterized by 'order parameters' that determine how fundamentally vexing a task is, independent of the particular algorithm used. These information-theoretic techniques have been demonstrated using traditional visual imagery in applications such as simulating cheetah skin textures and finding roads in aerial imagery. We discuss their application to problems in the specific application domain of automatic target recognition using infrared imagery. We also review recent theoretical and algorithmic developments which permit learning minimax entropy texture models for infrared textures in reasonable timeframes.

  14. Mixing entropy in Dean flows

    NASA Astrophysics Data System (ADS)

    Fodor, Petru; Vyhnalek, Brian; Kaufman, Miron

    2013-03-01

    We investigate mixing in Dean flows by solving numerically the Navier-Stokes equation for a circular channel. Tracers of two chemical species are carried by the fluid. The centrifugal forces, experienced as the fluid travels along a curved trajectory, coupled with the fluid incompressibility induce cross-sectional rotating flows (Dean vortices). These transversal flows promote the mixing of the chemical species. We generate images for different cross sections along the trajectory. The mixing efficiency is evaluated using the Shannon entropy. Previously we have found, P. S. Fodor and M. Kaufman, Modern Physics Letters B 25, 1111 (2011), this measure to be useful in understanding mixing in the staggered herringbone mixer. The mixing entropy is determined as function of the Reynolds number, the angle of the cross section and the observation scale (number of bins). Quantitative comparison of the mixing in the Dean micromixer and in the staggered herringbone mixer is attempted.
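
    The bin-based Shannon mixing entropy used as the efficiency measure can be sketched as follows. This minimal sketch assumes tracer counts of the two species per cross-sectional bin; the count-weighted convention is one common choice and need not match the authors' exact definition:

```python
import math

def mixing_entropy(bin_counts_a, bin_counts_b):
    """Shannon mixing entropy over spatial bins: for each bin, the
    two-species entropy -x*ln(x) - (1-x)*ln(1-x) of the local fraction
    x of species A, weighted by that bin's share of all tracers.
    Fully mixed (x = 0.5 everywhere) gives ln(2); fully segregated, 0."""
    total = sum(bin_counts_a) + sum(bin_counts_b)
    s = 0.0
    for na, nb in zip(bin_counts_a, bin_counts_b):
        n = na + nb
        if n == 0:
            continue  # empty bin contributes nothing
        for frac in (na / n, nb / n):
            if frac > 0:
                s -= (n / total) * frac * math.log(frac)
    return s
```

Varying the number of bins reproduces the observation-scale dependence mentioned in the abstract.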

  15. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    PubMed

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploring the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficients sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little computation increase, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.

  16. Using ordinal partition transition networks to analyze ECG data

    NASA Astrophysics Data System (ADS)

    Kulp, Christopher W.; Chobot, Jeremy M.; Freitas, Helena R.; Sprechini, Gene D.

    2016-07-01

    Electrocardiogram (ECG) data from patients with a variety of heart conditions are studied using ordinal pattern partition networks. The ordinal pattern partition networks are formed from the ECG time series by symbolizing the data into ordinal patterns. The ordinal patterns form the nodes of the network and edges are defined through the time ordering of the ordinal patterns in the symbolized time series. A network measure, called the mean degree, is computed from each time series-generated network. In addition, the entropy and the number of non-occurring ordinal patterns (NFP) are computed for each series. The distribution of mean degrees, entropies, and NFPs for each heart condition studied is compared. A statistically significant difference between healthy patients and several groups of unhealthy patients with varying heart conditions is found for the distributions of the mean degrees, unlike for any of the distributions of the entropies or NFPs.
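
    The symbolization and mean-degree measure can be sketched as follows. This is a simplified version (self-transitions are dropped and edges are treated as undirected, which need not match the authors' conventions; the embedding delay is fixed at 1):

```python
def ordinal_patterns(series, d):
    """Symbolize: map each length-d window to the permutation
    that sorts it (its ordinal pattern)."""
    pats = []
    for i in range(len(series) - d + 1):
        w = series[i:i + d]
        pats.append(tuple(sorted(range(d), key=lambda k: w[k])))
    return pats

def mean_degree(series, d=3):
    """Nodes are the observed ordinal patterns; an edge links patterns
    that occur consecutively in the symbolized series. Mean degree is
    2E/N over the unique undirected edges."""
    pats = ordinal_patterns(series, d)
    edges = {frozenset((a, b)) for a, b in zip(pats, pats[1:]) if a != b}
    return 2 * len(edges) / len(set(pats))
```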

  17. Dynamic Cross-Entropy.

    PubMed

    Aur, Dorian; Vila-Rodriguez, Fidel

    2017-01-01

    Complexity measures for time series have been used in many applications to quantify the regularity of one dimensional time series; however, many dynamical systems are spatially distributed multidimensional systems. We introduce Dynamic Cross-Entropy (DCE), a novel multidimensional complexity measure that quantifies the degree of regularity of EEG signals in selected frequency bands. Time series generated by discrete logistic equations with varying control parameter r are used to test DCE measures. Sliding window DCE analyses are able to reveal specific period doubling bifurcations that lead to chaos. A similar behavior can be observed in seizures triggered by electroconvulsive therapy (ECT). Sample entropy data show the level of signal complexity in different phases of the ictal ECT. The transition to irregular activity is preceded by the occurrence of cyclic regular behavior. A significant increase of DCE values in successive order from high frequencies in gamma to low frequencies in delta band reveals several phase transitions into less ordered states, possible chaos in the human brain. To our knowledge there are no reliable techniques able to reveal the transition to chaos in the case of multidimensional time series. In addition, DCE based on sample entropy appears to be robust to EEG artifacts compared to DCE based on Shannon entropy. The applied technique may offer new approaches to better understand nonlinear brain activity. Copyright © 2016 Elsevier B.V. All rights reserved.
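
    Since DCE builds on sample entropy, a simplified sample-entropy sketch may help. This variant counts all template pairs of lengths m and m+1 (the canonical definition excludes the final length-m template, so values differ slightly), and the function name is illustrative:

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Simplified sample entropy: -ln(A/B), where B counts template
    pairs of length m within Chebyshev tolerance r, and A counts the
    same for length m + 1. Lower values indicate more regularity."""
    def pairs(length):
        t = [series[i:i + length] for i in range(len(series) - length + 1)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
        )
    b, a = pairs(m), pairs(m + 1)
    return -math.log(a / b) if a and b else float("inf")
```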

  18. Dynamical noise filter and conditional entropy analysis in chaos synchronization.

    PubMed

    Wang, Jiao; Lai, C-H

    2006-06-01

    It is shown that, in a chaotic synchronization system whose driving signal is exposed to channel noise, the estimation of the drive system states can be greatly improved by applying the dynamical noise filtering to the response system states. If the noise is bounded in a certain range, the estimation errors, i.e., the difference between the filtered responding states and the driving states, can be made arbitrarily small. This property can be used in designing an alternative digital communication scheme. An analysis based on the conditional entropy justifies the application of dynamical noise filtering in generating quality synchronization.

  19. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    PubMed

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b.
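
    The two-box MEP estimate can be sketched as follows, assuming a linear infrared cooling law O(T) = a + b*T (W/m^2, T in kelvin). The coefficients and the coarse scan are purely illustrative, not the paper's values or method:

```python
def mep_heat_flux(i1, i2, a=-343.0, b=2.17):
    """Two-box MEP sketch: boxes absorb i1 > i2 (W/m^2), emit via
    O(T) = a + b*T, and exchange a flux F. Return the F maximizing
    entropy production sigma(F) = F*(1/T2 - 1/T1), by coarse scan."""
    def temps(f):
        # steady-state energy balance of each box for a given flux f
        return (i1 - f - a) / b, (i2 + f - a) / b
    best_f, best_s, f = 0.0, 0.0, 0.0
    while f < (i1 - i2) / 2:  # beyond this the boxes are isothermal
        t1, t2 = temps(f)
        s = f * (1.0 / t2 - 1.0 / t1)
        if s > best_s:
            best_f, best_s = f, s
        f += 0.01
    return best_f
```

Entropy production vanishes both at F = 0 and when the boxes equilibrate, so the MEP flux lies strictly between.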

  20. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies

    PubMed Central

    Lorenz, Ralph D.

    2010-01-01

    The ‘two-box model’ of planetary climate is discussed. This model has been used to demonstrate consistency of the equator–pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b. PMID:20368253

  1. Shock melting and vaporization of metals.

    NASA Technical Reports Server (NTRS)

    Ahrens, T. J.

    1972-01-01

    The effect of initial porosity on shock induction of melting and vaporization is investigated for Ba, Sr, Li, Fe, Al, U, and Th. For the less compressible of these metals, it is found that for a given strong shock-generation system (explosive in contact, or flyer-plate impact) an optimum initial specific volume exists such that the total entropy production, and hence the amount of metal liquid or vapor, is a maximum. Initial volumes from 1.4 to 2.0 times crystal volumes, depending on the metal sample and shock-inducing system, will result in optimum post-shock entropies.

  2. A Novel Method to Increase LinLog CMOS Sensors’ Performance in High Dynamic Range Scenarios

    PubMed Central

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J.; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor’s maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method. PMID:22164083

  3. A novel method to increase LinLog CMOS sensors' performance in high dynamic range scenarios.

    PubMed

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor's maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method.
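
    The entropy term in such a control loop is typically the Shannon entropy of the gray-level histogram; a minimal sketch follows (the function name is illustrative, and the sensor-specific PID control logic is omitted):

```python
import math
from collections import Counter

def histogram_entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram. In an
    exposure-control loop, the parameters are adjusted so this
    information content stays high."""
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```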

  4. Nature of phase transitions in crystalline and amorphous GeTe-Sb2Te3 phase change materials.

    PubMed

    Kalkan, B; Sen, S; Clark, S M

    2011-09-28

    The thermodynamic nature of phase stabilities and transformations is investigated in crystalline and amorphous Ge(1)Sb(2)Te(4) (GST124) phase change materials as a function of pressure and temperature using high-resolution synchrotron x-ray diffraction in a diamond anvil cell. The phase transformation sequences upon compression, for cubic and hexagonal GST124 phases, are found to be: cubic → amorphous → orthorhombic → bcc and hexagonal → orthorhombic → bcc. The Clapeyron slopes for melting of the hexagonal and bcc phases are negative and positive, respectively, resulting in a pressure-dependent minimum in the liquidus. When taken together, the phase equilibria relations are consistent with the presence of polyamorphism in this system, with the as-deposited amorphous GST phase being the low-entropy, low-density amorphous phase and the laser melt-quenched and high-pressure amorphized GST being the high-entropy, high-density amorphous phase. The metastable phase boundary between these two polyamorphic phases is expected to have a negative Clapeyron slope. © 2011 American Institute of Physics

  5. Simulations of dissociation constants in low pressure supercritical water

    NASA Astrophysics Data System (ADS)

    Halstead, S. J.; An, P.; Zhang, S.

    2014-09-01

    This article reports molecular dynamics simulations of the dissociation of hydrochloric acid and sodium hydroxide in water from ambient to supercritical temperatures at a fixed pressure of 250 atm. Corrosion of reaction vessels is known to be a serious problem of supercritical water, and acid/base dissociation can be a significant contributing factor to this. The SPC/e model was used in conjunction with solute models determined from density functional calculations and OPLSAA Lennard-Jones parameters. Radial distribution functions were calculated, and these show a significant increase in solute-solvent ordering upon forming the product ions at all temperatures. For both dissociations, rapidly decreasing entropy of reaction was found to be the controlling thermodynamic factor, and this is thought to arise due to the ions produced from dissociation maintaining a relatively high density and ordered solvation shell compared to the reactants. The change in entropy of reaction reaches a minimum at the critical temperature. The values of pKa and pKb were calculated and both increased with temperature, in qualitative agreement with other work, until a maximum value at 748 K, after which there was a slight decrease.

  6. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling for product quality prediction of mineral processing which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role and the establishment of its predictive model is a key issue for the plantwide optimization. For this purpose, a hybrid modeling approach of the mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both the PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.

  7. Conserved charges of minimal massive gravity coupled to scalar field

    NASA Astrophysics Data System (ADS)

    Setare, M. R.; Adami, H.

    2018-02-01

    Recently, the theory of topologically massive gravity non-minimally coupled to a scalar field has been proposed, which comes from the Lorentz-Chern-Simons theory (JHEP 06, 113, 2015), a torsion-free theory. We extend this theory by adding an extra term which makes the torsion non-zero. We show that the BTZ spacetime is a particular solution to this theory in the case where the scalar field is constant. The quasi-local conserved charge is defined by the concept of the generalized off-shell ADT current. Also a general formula is found for the entropy of the stationary black hole solution in the context of the considered theory. The obtained formulas are applied to the BTZ black hole solution in order to obtain the energy, the angular momentum and the entropy of this solution. The central extension term, the central charges and the eigenvalues of the Virasoro algebra generators for the BTZ black hole solution are thus obtained. The energy and the angular momentum of the BTZ black hole using the eigenvalues of the Virasoro algebra generators are calculated. Also, using the Cardy formula, the entropy of the BTZ black hole is found. It is found that the results obtained in two different ways exactly match, just as expected.

  8. Local subsystems in gauge theory and gravity

    DOE PAGES

    Donnelly, William; Freidel, Laurent

    2016-09-16

    We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.

  9. Quantifying control effort of biological and technical movements: an information-entropy-based approach.

    PubMed

    Haeufle, D F B; Günther, M; Wunner, G; Schmitt, S

    2014-01-01

    In biomechanics and biorobotics, muscles are often associated with reduced movement control effort and simplified control compared to technical actuators. This is based on evidence that the nonlinear muscle properties positively influence movement control. It remains open, however, how to quantify the simplicity aspect of control effort and compare it between systems. Physical measures, such as energy consumption, stability, or jerk, have already been applied to compare biological and technical systems. Here a physical measure of control effort based on information entropy is presented. The idea is that control is simpler if a specific movement is generated with less processed sensor information, depending on the control scheme and the physical properties of the systems being compared. By calculating the Shannon information entropy of all sensor signals required for control, an information cost function can be formulated allowing the comparison of models of biological and technical control systems. Exemplarily applied to (bio-)mechanical models of hopping, the method reveals that the required information for generating hopping with a muscle driven by a simple reflex control scheme is only I=32 bits versus I=660 bits with a DC motor and a proportional differential controller. This approach to quantifying control effort captures the simplicity of a control scheme and can be used to compare completely different actuators and control approaches.

  10. Transitions in eigenvalue and wavefunction structure in (1+2) -body random matrix ensembles with spin.

    PubMed

    Vyas, Manan; Kota, V K B; Chavda, N D

    2010-03-01

    Finite interacting Fermi systems with a mean-field and a chaos-generating two-body interaction are modeled by the one plus two-body embedded Gaussian orthogonal ensemble of random matrices with spin degree of freedom [called EGOE(1+2)-s]. Numerical calculations are used to demonstrate that, as λ, the strength of the interaction (measured in units of the average spacing of the single-particle levels defining the mean-field), increases, generically there is a Poisson to GOE transition in level fluctuations, a Breit-Wigner to Gaussian transition in strength functions (also called local density of states), and also a duality region where the information entropy is the same in both the mean-field and interaction-defined bases. The spin dependence of the transition points λ_c, λ_F, and λ_d, respectively, is described using the propagator for the spectral variances, and the formula for the propagator is derived. We further establish that the duality region corresponds to a region of thermalization. For this purpose we compared the single-particle entropy defined by the occupancies of the single-particle orbitals with the thermodynamic entropy and the information entropy for various λ values, and they are very close to each other at λ = λ_d.

  11. Gravitational entropy and the cosmological no-hair conjecture

    NASA Astrophysics Data System (ADS)

    Bolejko, Krzysztof

    2018-04-01

    The gravitational entropy and no-hair conjectures seem to predict contradictory future states of our Universe. The growth of the gravitational entropy is associated with the growth of inhomogeneity, while the no-hair conjecture argues that a universe dominated by dark energy should asymptotically approach a homogeneous and isotropic de Sitter state. The aim of this paper is to study these two conjectures. The investigation is based on the Simsilun simulation, which simulates the universe using the approximation of the Silent Universe. The Silent Universe is a solution to the Einstein equations that assumes irrotational, nonviscous, and insulated dust, with vanishing magnetic part of the Weyl curvature. The initial conditions for the Simsilun simulation are sourced from the Millennium simulation, which results in a realistic-appearing but relativistic-at-origin simulation of a universe. The Simsilun simulation is evolved from the early universe (t =25 Myr ) until far future (t =1000 Gyr ). The results of this investigation show that both conjectures are correct. On global scales, a universe with a positive cosmological constant and nonpositive spatial curvature does indeed approach the de Sitter state. At the same time it keeps generating the gravitational entropy.

  12. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition and authorship attribution, etc. Different complexity measures proposed in the literature like Shannon entropy, Relative entropy, Lempel-Ziv, Kolmogorov and Algorithmic complexity are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with Lyapunov exponent than Shannon entropy even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
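
    The ETC measure as described admits a direct sketch: repeatedly replace the most frequent adjacent symbol pair with a fresh symbol and count the iterations until the sequence is constant. One convention detail is assumed here (pair frequencies are counted with overlap but replaced left-to-right without overlap), which may differ from the authors' exact implementation:

```python
from collections import Counter

def etc(seq):
    """'Effort To Compress': number of NSRPS iterations needed to
    reduce seq to a constant (or single-symbol) sequence. Each
    iteration replaces occurrences of the most frequent adjacent
    pair with a fresh symbol."""
    # re-encode symbols as integers so fresh symbols are easy to mint
    alphabet = {s: i for i, s in enumerate(dict.fromkeys(seq))}
    seq = [alphabet[s] for s in seq]
    fresh, steps = len(alphabet), 0
    while len(seq) > 1 and len(set(seq)) > 1:
        pair = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
        out, i = [], 0
        while i < len(seq):  # left-to-right, non-overlapping substitution
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(fresh)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq, fresh, steps = out, fresh + 1, steps + 1
    return steps
```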

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Wei; Wang, Jin, E-mail: jin.wang.1@stonybrook.edu; State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, 130022 Changchun, China and College of Physics, Jilin University, 130021 Changchun

    We have established a general non-equilibrium thermodynamic formalism consistently applicable to both spatially homogeneous and, more importantly, spatially inhomogeneous systems, governed by the Langevin and Fokker-Planck stochastic dynamics with multiple state transition mechanisms, using the potential-flux landscape framework as a bridge connecting stochastic dynamics with non-equilibrium thermodynamics. A set of non-equilibrium thermodynamic equations, quantifying the relations of the non-equilibrium entropy, entropy flow, entropy production, and other thermodynamic quantities, together with their specific expressions, is constructed from a set of dynamical decomposition equations associated with the potential-flux landscape framework. The flux velocity plays a pivotal role on both the dynamic and thermodynamic levels. On the dynamic level, it represents a dynamic force breaking detailed balance, entailing the dynamical decomposition equations. On the thermodynamic level, it represents a thermodynamic force generating entropy production, manifested in the non-equilibrium thermodynamic equations. The Ornstein-Uhlenbeck process and more specific examples, the spatial stochastic neuronal model, in particular, are studied to test and illustrate the general theory. This theoretical framework is particularly suitable to study the non-equilibrium (thermo)dynamics of spatially inhomogeneous systems abundant in nature. This paper is the second of a series.

  14. Temporal and Spatial Evolution Characteristics of Disturbance Wave in a Hypersonic Boundary Layer due to Single-Frequency Entropy Disturbance

    PubMed Central

    Lv, Hongqing; Shi, Jianqiang

    2014-01-01

    By using a high-order accurate finite difference scheme, direct numerical simulation of hypersonic flow over an 8° half-wedge-angle blunt wedge under freestream single-frequency entropy disturbance is conducted; the generation and the temporal and spatial nonlinear evolution of boundary layer disturbance waves are investigated. Results show that, under the freestream single-frequency entropy disturbance, the entropy state of boundary layer is changed sharply and the disturbance waves within a certain frequency range are induced in the boundary layer. Furthermore, the amplitudes of disturbance waves in the period phase are larger than that in the response phase and ablation phase and the frequency range in the boundary layer in the period phase is narrower than that in these two phases. In addition, the mode competition, dominant mode transformation, and disturbance energy transfer exist among different modes both in temporal and in spatial evolution. The mode competition changes the characteristics of nonlinear evolution of the unstable waves in the boundary layer. The streamwise development of the most unstable mode depends more strongly on the upstream excitation of disturbance waves than that of the other modes does. PMID:25143983

  15. Temporal and spatial evolution characteristics of disturbance wave in a hypersonic boundary layer due to single-frequency entropy disturbance.

    PubMed

    Wang, Zhenqing; Tang, Xiaojun; Lv, Hongqing; Shi, Jianqiang

    2014-01-01

    By using a high-order accurate finite difference scheme, direct numerical simulation of hypersonic flow over an 8° half-wedge-angle blunt wedge under freestream single-frequency entropy disturbance is conducted; the generation and the temporal and spatial nonlinear evolution of boundary layer disturbance waves are investigated. Results show that, under the freestream single-frequency entropy disturbance, the entropy state of boundary layer is changed sharply and the disturbance waves within a certain frequency range are induced in the boundary layer. Furthermore, the amplitudes of disturbance waves in the period phase are larger than that in the response phase and ablation phase and the frequency range in the boundary layer in the period phase is narrower than that in these two phases. In addition, the mode competition, dominant mode transformation, and disturbance energy transfer exist among different modes both in temporal and in spatial evolution. The mode competition changes the characteristics of nonlinear evolution of the unstable waves in the boundary layer. The streamwise development of the most unstable mode depends more strongly on the upstream excitation of disturbance waves than that of the other modes does.

  16. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.
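
    Of the two symbolic quantifiers, the permutation entropy admits a compact sketch (normalized to [0, 1] by log(d!); the embedding delay is fixed at 1 here for simplicity):

```python
import math
from collections import Counter

def permutation_entropy(series, d=3):
    """Normalized permutation entropy: Shannon entropy of the
    ordinal-pattern distribution, divided by its maximum log(d!)."""
    pats = Counter()
    for i in range(len(series) - d + 1):
        w = series[i:i + d]
        pats[tuple(sorted(range(d), key=lambda k: w[k]))] += 1
    n = sum(pats.values())
    h = -sum((c / n) * math.log(c / n) for c in pats.values())
    return h / math.log(math.factorial(d))
```

A fully ordered signal gives 0, a signal visiting all d! patterns equally gives 1; the statistical complexity pairs this entropy with a disequilibrium term to form the complexity-entropy causality plane.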

  17. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  18. Design of Novel Precipitate-Strengthened Al-Co-Cr-Fe-Nb-Ni High-Entropy Superalloys

    NASA Astrophysics Data System (ADS)

    Antonov, Stoichko; Detrois, Martin; Tin, Sammy

    2018-01-01

    A series of non-equiatomic Al-Co-Cr-Fe-Nb-Ni high-entropy alloys, with varying levels of Co, Nb and Fe, were investigated in an effort to obtain microstructures similar to those of conventional Ni-based superalloys. Elevated levels of Co were observed to significantly decrease the solvus temperature of the γ' precipitates. In excessive concentrations, both Nb and Co promoted the formation of Laves and NiAl phases, which formed either during solidification (remaining undissolved during homogenization) or upon high-temperature aging. Lowering the content of Nb, Co, or Fe prevented the formation of the eutectic-type Laves phase. In addition, lowering the Co content resulted in a higher number density and volume fraction of the γ' precipitates, while increasing the Fe content led to the destabilization of the γ' precipitates. Various aging treatments were performed, which led to different size distributions of the strengthening phase. Results from the microstructural characterization and hardness property assessments of these high-entropy alloys were compared to those of a commercial high-strength Ni-based superalloy, RR1000. Potentially, precipitation-strengthened high-entropy alloys could find applications replacing Ni-based superalloys as structural materials in power generation applications.

  19. Information entropy of humpback whale songs.

    PubMed

    Suzuki, Ryuji; Buck, John R; Tyack, Peter L

    2006-03-01

    The structure of humpback whale (Megaptera novaeangliae) songs was examined using information theory techniques. The song is an ordered sequence of individual sound elements separated by gaps of silence. Song samples were converted into sequences of discrete symbols by both human and automated classifiers. This paper analyzes the song structure in these symbol sequences using information entropy estimators and autocorrelation estimators. Both parametric and nonparametric entropy estimators are applied to the symbol sequences representing the songs. The results provide quantitative evidence consistent with the hierarchical structure proposed for these songs by Payne and McVay [Science 173, 587-597 (1971)]. Specifically, this analysis demonstrates that: (1) There is a strong structural constraint, or syntax, in the generation of the songs, and (2) the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units. This implies that no empirical Markov model is capable of representing the songs' structure. The results are robust to the choice of either human or automated song-to-symbol classifiers. In addition, the entropy estimates indicate that the maximum amount of information that could be communicated by the sequence of sounds made is less than 1 bit per second.
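
    The entropy estimators applied to the symbol sequences can be illustrated with simple empirical plug-in estimates: the zeroth-order Shannon entropy and the first-order (Markov) conditional entropy. This is a hedged sketch of the general technique, not the parametric estimators used in the paper:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Plug-in estimate of H(X) in bits from symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def conditional_entropy(seq, order=1):
    """H(X_t | previous `order` symbols): entropy rate of an empirical Markov model."""
    n = len(seq) - order
    ctx = Counter(tuple(seq[i:i + order]) for i in range(n))
    joint = Counter(tuple(seq[i:i + order + 1]) for i in range(n))
    # p(x | context) = joint count / context count
    return -sum(c / n * math.log2(c / ctx[k[:-1]]) for k, c in joint.items())
```

    A strictly alternating sequence such as "ababab…" has 1 bit of Shannon entropy but zero conditional entropy. Structure at periods of 6-8 or 180-400 units, as found for the songs, shows up as low-order Markov estimates failing to reach the true entropy rate.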

  20. Coarse-graining errors and numerical optimization using a relative entropy framework.

    PubMed

    Chaimovich, Aviel; Shell, M Scott

    2011-03-07

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, S(rel), that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework. © 2011 American Institute of Physics.
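
    The variational idea can be illustrated on a toy discrete system: pick the member of a coarse-grained model family that minimizes the relative entropy to a reference distribution. This is a schematic sketch (the toy states, energies, and one-parameter Boltzmann family are invented for illustration), not the molecular implementation in the paper:

```python
import math

def relative_entropy(p, q):
    """S_rel = sum_i p_i ln(p_i / q_i): information lost modeling p by q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# reference ("fully atomic") distribution over four microstates
p_ref = [0.4, 0.3, 0.2, 0.1]

# one-parameter coarse-grained family: Boltzmann weights over toy energies
energies = [0.0, 1.0, 2.0, 3.0]
def cg_model(theta):
    w = [math.exp(-theta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# crude variational minimization: scan theta and keep the smallest S_rel
best_srel, best_theta = min(
    (relative_entropy(p_ref, cg_model(t / 100)), t / 100) for t in range(1, 200)
)
```

    In practice the minimization runs over force-field parameters with gradient or Newton schemes, and S_rel reaches zero only if the coarse-grained family can reproduce the reference ensemble exactly.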

  1. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In current efforts to quantify the dynamical properties of complex phenomena in financial market systems, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated with numerical simulations of two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the China stock markets is quantified through the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as the trading day progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing their multivariate returns. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.
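
    The multiscale procedure behind MMSE can be sketched in its univariate form: coarse-grain the series at each scale, then compute the sample entropy of the coarse-grained series. The code below is a plain univariate illustration (with the usual defaults m = 2 and tolerance r = 0.2·std), not the multivariate algorithm itself:

```python
import math
import random
import statistics

def coarse_grain(x, scale):
    """Non-overlapping window averages of length `scale` (multiscale step)."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B); A, B count template matches of length m+1, m."""
    tol = r * statistics.pstdev(x)
    def matches(mm):
        n = len(x) - mm
        return sum(
            1
            for i in range(n)
            for j in range(i + 1, n)
            if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= tol
        )
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(0)
noise = [random.uniform(-1, 1) for _ in range(300)]
regular = [math.sin(0.2 * i) for i in range(300)]
```

    A regular oscillation scores far lower than white noise at scale 1; the multiscale curve repeats this at each scale via sample_entropy(coarse_grain(x, s)), and the multivariate version additionally builds composite delay vectors across channels.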

  2. Nonlinear Complexity Analysis of Brain fMRI Signals in Schizophrenia

    PubMed Central

    Sokunbi, Moses O.; Gradin, Victoria B.; Waiter, Gordon D.; Cameron, George G.; Ahearn, Trevor S.; Murray, Alison D.; Steele, Douglas J.; Staff, Roger T.

    2014-01-01

    We investigated the differences in brain fMRI signal complexity in patients with schizophrenia while performing the Cyberball social exclusion task, using measures of sample entropy and the Hurst exponent (H). Thirteen patients meeting Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) criteria for schizophrenia and 16 healthy controls underwent fMRI scanning at 1.5 T. The fMRI data of both groups of participants were pre-processed, the entropy characterized and the Hurst exponent extracted. Whole-brain entropy and H maps of the groups were generated and analysed. After adjusting for age and sex differences, the results show that patients with schizophrenia exhibited higher complexity than healthy controls at mean whole-brain and regional levels. Both sample entropy and the Hurst exponent agree that patients with schizophrenia have more complex fMRI signals than healthy controls. These results suggest that schizophrenia is associated with more complex signal patterns when compared to healthy controls, supporting the increase-in-complexity hypothesis, where system complexity increases with age or disease, and are also consistent with the notion that schizophrenia is characterised by a dysregulation of the nonlinear dynamics of underlying neuronal systems. PMID:24824731
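
    The Hurst exponent H is commonly estimated with rescaled-range (R/S) analysis: the slope of log(R/S) against log(window size) estimates H. A minimal sketch of plain R/S (the abstract does not specify the authors' estimator, so this is an illustrative stand-in):

```python
import math
import random
import statistics

def hurst_rs(x, window_sizes=(16, 32, 64)):
    """Rescaled-range (R/S) estimate of the Hurst exponent of series x."""
    points = []
    for n in window_sizes:
        ratios = []
        for start in range(0, len(x) - n + 1, n):
            seg = x[start:start + n]
            mean = statistics.fmean(seg)
            cum, s = [], 0.0
            for v in seg:              # cumulative deviations from the mean
                s += v - mean
                cum.append(s)
            spread = statistics.pstdev(seg)
            if spread > 0:
                ratios.append((max(cum) - min(cum)) / spread)
        points.append((math.log(n), math.log(statistics.fmean(ratios))))
    # least-squares slope of log(R/S) versus log(n)
    mx = statistics.fmean(p[0] for p in points)
    my = statistics.fmean(p[1] for p in points)
    num = sum((px - mx) * (py - my) for px, py in points)
    den = sum((px - mx) ** 2 for px, py in points)
    return num / den

random.seed(1)
white = [random.gauss(0, 1) for _ in range(1024)]
```

    White noise gives H near 0.5 (up to small-sample bias); persistent, long-memory signals push H toward 1, which together with entropy characterizes signal complexity.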

  3. On Entropy Generation and the Effect of Heat and Mass Transfer Coupling in a Distillation Process

    NASA Astrophysics Data System (ADS)

    Burgos-Madrigal, Paulina; Mendoza, Diego F.; López de Haro, Mariano

    2018-01-01

    The entropy production rates as obtained from the exergy analysis, entropy balance and the nonequilibrium thermodynamics approach are compared for two distillation columns. The first case is a depropanizer column involving a mixture of ethane, propane, n-butane and n-pentane. The other is a weighed sample of Mexican crude oil distilled with a pilot scale fractionating column. The composition, temperature and flow profiles, for a given duty and operating conditions in each column, are obtained with the Aspen Plus V8.4 software by using the RateFrac model with a rate-based nonequilibrium column. For the depropanizer column the highest entropy production rate is found in the central trays where most of the mass transfer occurs, while in the second column the highest values correspond to the first three stages (where the vapor mixture is in contact with the cold liquid reflux), and to the last three stages (where the highest temperatures take place). The importance of the explicit inclusion of thermal diffusion in these processes is evaluated. In the depropanizer column, the effect of the coupling between heat and mass transfer is found to be negligible, while for the fractionating column it becomes appreciable.

  4. Entropy Generation Analysis through Helical Coil Heat Exchanger in an Agitated Vessel

    NASA Astrophysics Data System (ADS)

    Ashok Reddy, K.

    2018-03-01

    Entropy generation rates were obtained from experiments with Newtonian and non-Newtonian fluids at sodium carboxymethyl cellulose concentrations of 0.05%, 0.1%, 0.15% and 0.2%, with data collected by passing the test fluid at different flow rates through a helical coil in a mixing vessel agitated by a paddle impeller. Heating of the fluids depends on the operational parameters, the geometry of the mixing vessel and the type of impeller used. A new heating element was designed and fabricated by inserting Kanthal wire into a glove knitted from fiberglass yarn, since glass fabric is flexible, heat resistant and can accommodate small differences in vessel size. The knitted fabric is made to the shape of the vessel used in the experiment, and the heating elements are inserted so that they become embedded in, and form part of, the fiberglass glove.

  5. Thermodynamic properties of a liquid crystal carbosilane dendrimer

    NASA Astrophysics Data System (ADS)

    Samosudova, Ya. S.; Markin, A. V.; Smirnova, N. N.; Ogurtsov, T. G.; Boiko, N. I.; Shibaev, V. P.

    2016-11-01

    The temperature dependence of the heat capacity of a first-generation liquid crystal carbosilane dendrimer with methoxyphenyl benzoate end groups is studied for the first time in the region of 6-370 K by means of precision adiabatic vacuum calorimetry. Physical transformations are observed in this interval of temperatures, and their standard thermodynamic characteristics are determined and discussed. The standard thermodynamic functions C°p(T), H°(T) − H°(0), S°(T) − S°(0), and G°(T) − H°(0) are calculated from the experimental data over the range from T → 0 to 370 K. The standard entropy of formation of the dendrimer in the partially crystalline state at T = 298.15 K is calculated, and the standard entropy of the hypothetical reaction of its synthesis at this temperature is estimated. The thermodynamic properties of the studied dendrimer are compared to those of second- and fourth-generation liquid crystal carbosilane dendrimers with the same end groups studied earlier.
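
    The standard entropy S°(T) − S°(0) is obtained from the measured heat capacity by integrating C°p/T over temperature. A minimal numerical sketch of that step (trapezoid rule over illustrative data points, not the authors' smoothing and low-temperature extrapolation procedure):

```python
def entropy_from_cp(T, Cp):
    """S(T_max) - S(T_min) ≈ ∫ (Cp/T) dT by the trapezoid rule over data points."""
    s = 0.0
    for i in range(1, len(T)):
        s += 0.5 * (Cp[i] / T[i] + Cp[i - 1] / T[i - 1]) * (T[i] - T[i - 1])
    return s
```

    For Cp proportional to T the integrand is constant and the rule is exact; in calorimetric practice Cp is extrapolated to T → 0 (e.g., with a Debye law) before integrating.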

  6. Research and implementation of group animation based on normal cloud model

    NASA Astrophysics Data System (ADS)

    Li, Min; Wei, Bin; Peng, Bao

    2011-12-01

    Group animation is a difficult problem that has long remained unsolved in computer animation, and all current methods have limitations. This paper puts forward a method in which the motion coordinates and motion speeds of a real fish school are collected as sample data, and a reverse cloud generator is designed and run to obtain the expectation, entropy and hyper-entropy, the quantitative values of the qualitative concept. With these parameters as a basis, a forward cloud generator is designed and run to produce the motion coordinates and motion speeds of a two-dimensional animated fish school, and two internal state variables of the fish school, the feeling of hunger and the feeling of fear, are designed. An experiment is used to simulate the motion of the fish school under the internal and external influences above, and it shows that group animation designed by this method is highly realistic.
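
    The forward and reverse (backward) normal cloud generators have a standard form in the normal cloud model: each drop is drawn by first sampling a per-drop entropy En′ ~ N(En, He²) and then x ~ N(Ex, En′²), while the backward generator estimates (Ex, En, He) from samples. A self-contained sketch of this standard scheme (not the paper's exact implementation):

```python
import math
import random
import statistics

def forward_cloud(Ex, En, He, n):
    """Forward normal cloud generator: n cloud drops from (Ex, En, He)."""
    drops = []
    for _ in range(n):
        en_i = random.gauss(En, He)        # per-drop entropy sample
        drops.append(random.gauss(Ex, abs(en_i)))
    return drops

def backward_cloud(xs):
    """Backward (reverse) cloud generator: estimate (Ex, En, He) from drops."""
    ex = statistics.fmean(xs)
    # first-absolute-moment estimator of En for a normal cloud
    en = math.sqrt(math.pi / 2) * statistics.fmean(abs(x - ex) for x in xs)
    he_sq = statistics.pvariance(xs) - en * en
    return ex, en, math.sqrt(max(he_sq, 0.0))

random.seed(42)
drops = forward_cloud(0.0, 1.0, 0.1, 5000)
ex, en, he = backward_cloud(drops)
```

    In the paper's pipeline, the backward generator runs on tracked fish trajectories to learn (Ex, En, He) for coordinates and speeds, and the forward generator then synthesizes new, realistic group motion from those parameters.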

  7. A desalination battery.

    PubMed

    Pasta, Mauro; Wessells, Colin D; Cui, Yi; La Mantia, Fabio

    2012-02-08

    Water desalination is an important approach to providing fresh water around the world, although its high energy consumption, and thus high cost, call for new, efficient technology. Here, we demonstrate the novel concept of a "desalination battery", which operates by performing cycles in reverse on our previously reported mixing entropy battery. Rather than generating electricity from salinity differences, as in mixing entropy batteries, desalination batteries use an electrical energy input to extract sodium and chloride ions from seawater and to generate fresh water. The desalination battery consists of a Na(2-x)Mn(5)O(10) nanorod positive electrode and an Ag/AgCl negative electrode. Here, we demonstrate an energy consumption of 0.29 Wh l(-1) for the removal of 25% of the salt using this novel desalination battery, which is promising when compared to reverse osmosis (~0.2 Wh l(-1)), the most efficient technique presently available. © 2012 American Chemical Society

  8. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective Multiscale permutation entropy (MSPE) is becoming an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on the real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG were evaluated and compared. Methods Six MSPE algorithms—derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of coarse-graining (CG) method and moving average (MA) analysis—were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the abilities of the six measures were compared in terms of tracking the dynamical changes in EEG data and the performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. Results CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability and MA-RPE at scale 5 performed best in this aspect. MA-TPE outperformed other measures with a faster tracking speed of the loss of consciousness. Conclusions MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803
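
    The two decomposition procedures compared in the study differ only in how the scaled series is built: coarse-graining (CG) averages non-overlapping windows and shortens the series by a factor of the scale, while moving-average (MA) analysis averages overlapping windows and nearly preserves length. A minimal sketch of both:

```python
def cg_series(x, scale):
    """Coarse-graining: non-overlapping window means; length ~ len(x)/scale."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def ma_series(x, scale):
    """Moving average: overlapping window means; length = len(x) - scale + 1."""
    return [sum(x[i:i + scale]) / scale for i in range(len(x) - scale + 1)]
```

    The finding that CG-based MSPEs fail on-line while MA-based ones succeed is consistent with CG discarding most samples at high scales, which starves the permutation-pattern statistics in short EEG windows.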

  9. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) is becoming an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on the real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG were evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the abilities of the six measures were compared in terms of tracking the dynamical changes in EEG data and the performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability and MA-RPE at scale 5 performed best in this aspect. MA-TPE outperformed other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.

  10. Effect of rotation preference on spontaneous alternation behavior on Y maze and introduction of a new analytical method, entropy of spontaneous alternation.

    PubMed

    Bak, Jia; Pyeon, Hae-In; Seok, Jin-I; Choi, Yun-Sik

    2017-03-01

    The Y maze has been used to test spatial working memory in rodents, typically via the percentage of spontaneous alternation. Alternation indicates sequential entries into all three arms; e.g., when an animal visits all three arms clockwise or counterclockwise sequentially, alternation is achieved. Interestingly, animals have a tendency to rotate or turn to a preferred side, so an animal with a high rotation preference may have its alternation behavior influenced by that preference. Here, we have developed a new analytical method, termed entropy of spontaneous alternation, to offset the effect of rotation preference on the Y maze. To validate the entropy of spontaneous alternation, we employed a free rotation test using a cylinder and a spatial working memory test on the Y maze. We identified that mice showed 65.1% rotation preference on average. Importantly, the percentage of spontaneous alternation in the high-preference group (more than 70% rotation to a preferred side) was significantly higher than that in the no-preference group (<55%). In addition, there was a clear correlation between rotation preference on the cylinder and turning preference on the Y maze. On the other hand, this potential leverage effect arising from rotation preference disappeared when the animal behavior on the Y maze was analyzed with the entropy of spontaneous alternation. Further, the entropy of spontaneous alternation reliably detected the loss of spatial working memory caused by scopolamine administration. Combined, these data indicate that the entropy of spontaneous alternation provides higher credibility when spatial working memory is evaluated using the Y maze. Copyright © 2016 Elsevier B.V. All rights reserved.
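
    The abstract does not give the formula for the entropy of spontaneous alternation, but the idea of scoring sequence disorder rather than a raw alternation percentage can be illustrated with the Shannon entropy of arm-to-arm transitions. This is a hypothetical stand-in for the authors' definition, not their actual measure:

```python
import math
from collections import Counter

def transition_entropy(entries):
    """Shannon entropy (bits) of the arm-entry transition distribution.

    A pure rotation (A->B->C->A->...) uses only 3 of the 6 possible
    transitions, so it scores lower than unbiased alternation, which can
    use all 6; perseveration (A->B->A->B->...) scores lower still.
    """
    n = len(entries) - 1
    counts = Counter(zip(entries, entries[1:]))
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

    An entropy-style score of this kind is insensitive to a fixed rotation bias by construction, which mirrors the property the authors report for their measure.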

  11. Thermodynamic perspectives on genetic instructions, the laws of biology, diseased states and human population control

    PubMed Central

    Saier, M. H.

    2014-01-01

    This article examines in a broad perspective entropy and some examples of its relationship to evolution, genetic instructions and how we view diseases. Many knowledge gaps abound, hence our understanding is still fragmented and incomplete. Living organisms are programmed by functional genetic instructions (FGI), through cellular communication pathways, to grow and reproduce by maintaining a variety of hemistable, ordered structures (low entropy). Living organisms are far from equilibrium with their surrounding environmental systems, which tends towards increasing disorder (increasing entropy). Organisms must free themselves from high entropy (high disorder) to maintain their cellular structures for a period of time sufficient enough to allow reproduction and the resultant offspring to reach reproductive ages. This time interval varies for different species. Bacteria, for example need no sexual parents; dividing cells are nearly identical to the previous generation of cells, and can begin a new cell cycle without delay under appropriate conditions. By contrast, human infants require years of care before they can reproduce. Living organisms maintain order in spite of their changing surrounding environment, that decreases order according to the second law of thermodynamics. These events actually work together since living organisms create ordered biological structures by increasing local entropy. From a disease perspective, viruses and other disease agents interrupt the normal functioning of cells. The pressure for survival may result in mechanisms that allow organisms to resist attacks by viruses, other pathogens, destructive chemicals and physical agents such as radiation. However, when the attack is successful, the organism can be damaged until the cell, tissue, organ or entire organism is no longer functional and entropy increases. PMID:21262480

  12. Differentiating benign from malignant mediastinal lymph nodes visible at EBUS using grey-scale textural analysis.

    PubMed

    Edey, Anthony J; Pollentine, Adrian; Doody, Claire; Medford, Andrew R L

    2015-04-01

    Recent data suggest that grey-scale textural analysis of endobronchial ultrasound (EBUS) images can differentiate benign from malignant lymphadenopathy. The objective of this study was to evaluate grey-scale textural analysis and examine its clinical utility. Images from 135 consecutive clinically indicated EBUS procedures were evaluated retrospectively using MATLAB software (MathWorks, Natick, MA, USA). Manual node mapping was performed to obtain a region of interest, and grey-scale textural features (range of pixel values and entropy) were analysed. The initial analysis involved 94 subjects, and receiver operating characteristic (ROC) curves were generated. The ROC thresholds were then applied to a second cohort (41 subjects) to validate the earlier findings. A total of 371 images were evaluated. There was no difference in the proportions of malignant disease (56% vs 53%, P = 0.66) between the prediction (group 1) and validation (group 2) sets. There was no difference in the range of pixel values in group 1, but entropy was significantly higher in the malignant group (5.95 vs 5.77, P = 0.03). Higher entropy was seen in adenocarcinoma versus lymphoma (6.00 vs 5.50, P < 0.05). An ROC curve for entropy gave an area under the curve of 0.58, with 51% sensitivity and 71% specificity for entropy greater than 5.94 for malignancy. In group 2, the entropy threshold correctly classified only 47% of benign cases and 20% of malignant cases. These findings suggest that use of EBUS grey-scale textural analysis for differentiation of malignant from benign lymphadenopathy may not be accurate. Further studies are required. © 2015 Asian Pacific Society of Respirology.
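
    The two grey-scale features analysed, range of pixel values and entropy, are first-order statistics of the ROI's intensity histogram. A minimal re-implementation sketch in plain Python (the study used MATLAB; ROI extraction is omitted):

```python
import math
from collections import Counter

def texture_features(roi_pixels):
    """Range and Shannon entropy (bits) of an ROI's grey-level histogram."""
    n = len(roi_pixels)
    counts = Counter(roi_pixels)
    entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
    return max(roi_pixels) - min(roi_pixels), entropy
```

    A uniform ROI gives (0, 0.0), and more heterogeneous tissue yields higher entropy, which is the effect reported in the prediction set (5.95 vs 5.77) but not reproduced in the validation cohort.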

  13. Entropy, materials, and posterity

    USGS Publications Warehouse

    Cloud, P.

    1977-01-01

    Materials and energy are the interdependent feedstocks of economic systems, and thermodynamics is their moderator. It costs energy to transform the dispersed minerals of Earth's crust into ordered materials and structures. And it costs materials to collect and focus the energy to perform work - be it from solar, fossil fuel, nuclear, or other sources. The greater the dispersal of minerals sought, the more energy is required to collect them into ordered states. But available energy can be used once only. And the ordered materials of industrial economies become disordered with time. They may be partially reordered and recycled, but only at further costs in energy. Available energy everywhere degrades to bound states and order to disorder - for though entropy may be juggled it always increases. Yet industry is utterly dependent on low-entropy states of matter and energy, while decreasing grades of ore require ever higher inputs of energy to convert them to metals, with ever increasing growth both of entropy and environmental hazard. Except as we may prize a thing for its intrinsic qualities - beauty, leisure, love, or gold - low entropy is the only thing of real value. It is worth whatever the market will bear, and it becomes more valuable as entropy increases. It would be foolish of suppliers to sell it more cheaply or in larger amounts than their own enjoyment of life requires, whatever form it may take. For this reason, and because of physical constraints on the availability of all low-entropy states, the recent energy crisis is only the first of a sequence of crises to be expected in energy and materials as long as current trends continue. The apportioning of low-entropy states in a modern industrial society is achieved more or less according to the theory of competitive markets. But the rational powers of this theory suffer as the world grows increasingly polarized into rich, over-industrialized nations with diminishing resource bases and poor, supplier nations with little industry. The theory also discounts posterity, the more so as population density and per capita rates of consumption continue to grow. A new social, economic, and ecologic norm that leads to population control, conservation, and an apportionment of low-entropy states across the generations is needed to assure to posterity the options that properly belong to it as an important but voiceless constituency of the collectivity we call mankind. © 1977 Ferdinand Enke Verlag Stuttgart.

  14. Security and composability of randomness expansion from Bell inequalities

    NASA Astrophysics Data System (ADS)

    Fehr, Serge; Gelles, Ran; Schaffner, Christian

    2013-01-01

    The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness expansion. R. Colbeck and A. Kent [J. Phys. A 44, 095305 (2011)] proposed the first method for generating randomness from untrusted devices, but without providing a rigorous analysis. This was addressed subsequently by S. Pironio et al. [Nature (London) 464, 1021 (2010)], who aimed at deriving a lower bound on the min-entropy of the data extracted from an untrusted device based only on the observed nonlocal behavior of the device. Although that article succeeded in developing important tools for reaching the stated goal, the proof itself contained a bug, and the given formal claim on the guaranteed amount of min-entropy needs to be revisited. In this paper we build on the tools provided by Pironio et al. and obtain a meaningful lower bound on the min-entropy of the data produced by an untrusted device based on the observed nonlocal behavior of the device. Our main result confirms the essence of the (improperly formulated) claims of Pironio et al. and puts them on solid ground. We also address the question of composability and show that different untrusted devices can be composed in an alternating manner under the assumption that they are not entangled. This enables superpolynomial randomness expansion based on two untrusted yet unentangled devices.
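
    The quantity being bounded is the min-entropy, H_min = −log₂ max_x p(x), the relevant measure for randomness extraction because it counts the adversary's best guessing probability. An empirical sketch of the definition (the device-independent bound itself is derived from the observed Bell-inequality violation, which is beyond this snippet):

```python
import math
from collections import Counter

def empirical_min_entropy(outputs):
    """H_min = -log2(p_max) of the empirical output distribution."""
    counts = Counter(outputs)
    p_max = max(counts.values()) / len(outputs)
    return -math.log2(p_max)
```

    A uniform bit stream gives 1 bit per symbol and a constant stream gives 0; a randomness extractor can then distill roughly H_min near-uniform bits from the device's raw output.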

  15. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    NASA Astrophysics Data System (ADS)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

    A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we have retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either the sleep state or preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ(k)^(-1) ξ^(k-1) exp(-ξ). This probability function has well-known properties, and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
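
    The approximate entropy, and a gamma-distributed surrogate series, can both be sketched in a few lines. This is standard ApEn with the usual defaults m = 2 and r = 0.2·std; the shape parameter k below is illustrative, not taken from the study:

```python
import math
import random
import statistics

def approx_entropy(x, m=2, r_frac=0.2):
    """ApEn(m, r): lower values indicate a more regular time series."""
    r = r_frac * statistics.pstdev(x)
    def phi(mm):
        n = len(x) - mm + 1
        logs = []
        for i in range(n):
            c = sum(
                1
                for j in range(n)
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r
            )
            logs.append(math.log(c / n))  # c >= 1 because of the self-match
        return statistics.fmean(logs)
    return phi(m) - phi(m + 1)

random.seed(0)
gamma_series = [random.gammavariate(2.0, 1.0) for _ in range(300)]  # shape k = 2
periodic = [math.sin(0.3 * i) for i in range(300)]
```

    Varying the shape parameter k of the gamma generator yields a family of surrogate heart-rate series whose ApEn values can be compared against the patient groups, as described above.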

  16. Parallel Unstructured Grid Generation for Complex Real-World Aerodynamic Simulations

    NASA Astrophysics Data System (ADS)

    Zagaris, George

    We begin by defining the concept of 'open' Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain 'boundary' states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow of probability vanishes between all pairs of states. External couplings which fix the probabilities of boundary states can maintain such systems in non-equilibrium steady states in which non-zero probability currents flow. We show that these non-equilibrium steady states minimize a quadratic form which we call 'dissipation'. This is closely related to Prigogine's principle of minimum entropy production. We bound the rate of change of the entropy of a driven non-equilibrium steady state relative to the underlying equilibrium state in terms of the flow of probability through the boundary of the process. We then consider open Markov processes as morphisms in a symmetric monoidal category by splitting up their boundary states into certain sets of 'inputs' and 'outputs'. Composition corresponds to gluing the outputs of one such open Markov process onto the inputs of another, so that the probability flowing out of the first process is equal to the probability flowing into the second. Tensoring in this category corresponds to placing two such systems side by side. We construct a 'black-box' functor characterizing the behavior of an open Markov process in terms of the space of possible steady-state probabilities and probability currents along the boundary. The fact that this is a functor means that the behavior of a composite open Markov process can be computed by composing the behaviors of the open Markov processes from which it is composed. We prove a similar black-boxing theorem for reaction networks whose dynamics are given by the non-linear rate equation. Along the way we describe a more general category of open dynamical systems where composition corresponds to gluing together open dynamical systems.

  17. Rolling bearing fault diagnosis based on time-delayed feedback monostable stochastic resonance and adaptive minimum entropy deconvolution

    NASA Astrophysics Data System (ADS)

    Li, Jimeng; Li, Ming; Zhang, Jinfeng

    2017-08-01

    Rolling bearings are key components in modern machinery, and tough operating environments often make them prone to failure. However, due to the influence of the transmission path and background noise, the useful feature information relevant to the bearing fault contained in the vibration signals is weak, which makes it difficult to identify the fault symptoms of rolling bearings in time. Therefore, this paper proposes a novel weak signal detection method based on a time-delayed feedback monostable stochastic resonance (TFMSR) system and adaptive minimum entropy deconvolution (MED) to realize the fault diagnosis of rolling bearings. The MED method is employed to preprocess the vibration signals, which can deconvolve the effect of the transmission path and clarify the defect-induced impulses. A modified power spectrum kurtosis (MPSK) index is constructed to realize the adaptive selection of the filter length in the MED algorithm. By introducing a time-delayed feedback term into an over-damped monostable system, the TFMSR method can effectively utilize the historical information of the input signal to enhance the periodicity of the SR output, which is beneficial to the detection of periodic signals. Furthermore, the influence of time delay and feedback intensity on the SR phenomenon is analyzed, and by selecting appropriate time delay, feedback intensity and re-scaling ratio with a genetic algorithm, SR can be produced to realize resonance detection of the weak signal. The combination of the adaptive MED (AMED) method and the TFMSR method is conducive to extracting feature information from strong background noise and realizing the fault diagnosis of rolling bearings. Finally, experiments and an engineering application are performed to evaluate the effectiveness of the proposed AMED-TFMSR method in comparison with a traditional bistable SR method.

  18. Why Does the Human Body Maintain a Constant 37-Degree Temperature?: Thermodynamic Switch Controls Chemical Equilibrium in Biological Systems

    NASA Astrophysics Data System (ADS)

    Chun, Paul W.

    2005-01-01

    Applying the Planck-Benzinger methodology to biological systems, we have established that the negative Gibbs free energy minimum at a well-defined stable temperature, ⟨T_S⟩, where the bound unavailable energy TΔS° = 0, has its origin in the sequence-specific hydrophobic interactions. Each such system we have examined confirms the existence of a thermodynamic molecular switch wherein a change of sign in [ΔCp°]reaction leads to a true negative minimum in the Gibbs free energy change of reaction, and hence a maximum in the related equilibrium constant, Keq. At this temperature, ⟨T_S⟩, where ΔH°(T_S) = ΔG°(T_S)min < 0, the maximum work can be accomplished in transpiration, digestion, reproduction or locomotion. In the human body, this temperature is 37°C. The ⟨T_S⟩ values may vary from one living organism to another, but the fact that TΔS°(T) = 0 at this temperature will not. There is a lower cutoff point, ⟨T_h⟩, where enthalpy is unfavorable but entropy is favorable, i.e. ΔH°(T_h) = TΔS°(T_h) > 0, and an upper limit, ⟨T_m⟩, above which enthalpy is favorable but entropy is unfavorable, i.e. ΔH°(T_m) = TΔS°(T_m) < 0. Only between these two temperature limits, at each of which ΔG°(T) = 0, is the net chemical driving force favorable for such biological processes as protein folding, protein-protein, protein-nucleic acid or protein-membrane interactions, and protein self-assembly. All interacting biological systems examined using the Planck-Benzinger methodology have shown such a thermodynamic switch at the molecular level, suggesting that its existence may be universal.
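
    The switch can be sketched numerically. Since d(ΔG°)/dT = -ΔS°(T), the Gibbs energy extremum falls exactly where TΔS°(T) = 0, and it is a minimum when ΔCp° < 0. The parameters below are hypothetical round numbers chosen only to place ⟨T_S⟩ near 310 K; they are not taken from the paper.

```python
import numpy as np

# Hypothetical thermodynamic parameters for a folding-type reaction (not from
# the paper): reference T0, enthalpy/entropy at T0, and a constant, negative
# heat capacity change, as expected for hydrophobic burial.
T0 = 298.0      # K
dH0 = -20.0     # kJ/mol at T0
dS0 = 0.1974    # kJ/(mol K) at T0
dCp = -5.0      # kJ/(mol K)

T = np.linspace(280.0, 340.0, 6001)
dH = dH0 + dCp * (T - T0)            # Kirchhoff: dH(T) = dH0 + dCp (T - T0)
dS = dS0 + dCp * np.log(T / T0)      # dS(T) = dS0 + dCp ln(T/T0)
dG = dH - T * dS                     # d(dG)/dT = -dS, so extremum where dS = 0

T_min = T[np.argmin(dG)]             # temperature of the Gibbs energy minimum
T_S = T0 * np.exp(-dS0 / dCp)        # analytic root of dS(T) = 0
print(T_min, T_S)                    # both close to 310 K, i.e. 37 deg C
```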

  19. Measurement Uncertainty Relations for Discrete Observables: Relative Entropy Formulation

    NASA Astrophysics Data System (ADS)

    Barchielli, Alberto; Gregoratti, Matteo; Toigo, Alessandro

    2018-02-01

    We introduce a new information-theoretic formulation of quantum measurement uncertainty relations, based on the notion of relative entropy between measurement probabilities. In the case of a finite-dimensional system and for any approximate joint measurement of two target discrete observables, we define the entropic divergence as the maximal total loss of information occurring in the approximation at hand. For fixed target observables, we study the joint measurements minimizing the entropic divergence, and we prove the general properties of its minimum value. Such a minimum is our uncertainty lower bound: the total information lost by replacing the target observables with their optimal approximations, evaluated at the worst possible state. The bound turns out to be also an entropic incompatibility degree, that is, a good information-theoretic measure of incompatibility: indeed, it vanishes if and only if the target observables are compatible, it is state-independent, and it enjoys all the invariance properties which are desirable for such a measure. In this context, we point out the difference between general approximate joint measurements and sequential approximate joint measurements; to do this, we introduce a separate index for the tradeoff between the error of the first measurement and the disturbance of the second one. By exploiting the symmetry properties of the target observables, exact values, lower bounds and optimal approximations are evaluated in two different concrete examples: (1) a couple of spin-1/2 components (not necessarily orthogonal); (2) two Fourier conjugate mutually unbiased bases in prime power dimension. Finally, the entropic incompatibility degree straightforwardly generalizes to the case of many observables, still maintaining all its relevant properties; we explicitly compute it for three orthogonal spin-1/2 components.

  20. Using quantum erasure to exorcize Maxwell's demon: I. Concepts and context

    NASA Astrophysics Data System (ADS)

    Scully, Marlan O.; Rostovtsev, Yuri; Sariyanni, Zoe-Elizabeth; Suhail Zubairy, M.

    2005-10-01

    Szilard [L. Szilard, Zeitschrift für Physik, 53 (1929) 840] made a decisive step toward solving the Maxwell demon problem by introducing and analyzing the single atom heat engine. Bennett [Sci. Am. 255 (1987) 107] completed the solution by pointing out that there must be an entropy, ΔS = k ln 2, generated as the result of information erased on each cycle. Nevertheless, others have disagreed. For example, philosophers such as Popper “have found the literature surrounding Maxwell's demon deeply problematic.” We propose and analyze a new kind of single atom quantum heat engine which allows us to resolve the Maxwell demon paradox simply, and without invoking the notions of information or entropy. The energy source of the present quantum engine [Scully, Phys. Rev. Lett. 87 (2001) 22601] is a Stern-Gerlach apparatus acting as a demonesque heat sorter. An isothermal compressor acts as the entropy sink. In order to complete a thermodynamic cycle, an energy of ΔW = kT ln 2 must be expended. This energy is essentially a “reset” or “eraser” energy. Thus Bennett's entropy ΔS = ΔW/T emerges as a simple consequence of the quantum thermodynamics of our heat engine. It would seem that quantum mechanics contains the kernel of information entropy at its very core. That is, the concept of information erasure as it appears in quantum mechanics [Scully and Drühl, Phys. Rev. A 25 (1982) 2208] and the present quantum heat engine have a deep common origin.

  1. ELECTRIC CURRENT FILAMENTATION AT A NON-POTENTIAL MAGNETIC NULL-POINT DUE TO PRESSURE PERTURBATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelínek, P.; Karlický, M.; Murawski, K., E-mail: pjelinek@prf.jcu.cz

    2015-10-20

    An increase of electric current densities due to filamentation is an important process in any flare. We show that the pressure perturbation, followed by an entropy wave, triggers such a filamentation in the non-potential magnetic null-point. In the two-dimensional (2D), non-potential magnetic null-point, we generate the entropy wave by a negative or positive pressure pulse that is launched initially. Then, we study its evolution under the influence of the gravity field. We solve the full set of 2D time-dependent, ideal magnetohydrodynamic equations numerically, making use of the FLASH code. The negative pulse leads to an entropy wave with a plasma density greater than in the ambient atmosphere, and thus this wave falls down in the solar atmosphere, attracted by the gravity force. In the case of the positive pressure pulse, the plasma becomes evacuated and the entropy wave propagates upward. However, in both cases, owing to the Rayleigh–Taylor instability, the electric current in a non-potential magnetic null-point is rapidly filamented and at some locations the electric current density is strongly enhanced in comparison to its initial value. Using numerical simulations, we find that entropy waves initiated either by positive or negative pulses result in an increase of electric current densities close to the magnetic null-point, and thus the energy accumulated here can be released as nanoflares or even flares.

  2. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
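
    For reference, the Bandt and Pompe construction underlying these planes maps each length-d window of the series to its ordinal pattern and takes the Shannon entropy of the pattern frequencies. A minimal sketch (embedding dimension d = 3 and no embedding delay, both illustrative choices):

```python
import math
from itertools import permutations

def permutation_entropy(x, d=3):
    """Normalized Bandt-Pompe permutation entropy with embedding dimension d."""
    counts = {pat: 0 for pat in permutations(range(d))}
    for i in range(len(x) - d + 1):
        window = x[i:i + d]
        # ordinal pattern: argsort of the window (ties broken by position)
        pattern = tuple(sorted(range(d), key=lambda j: window[j]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = sum(-p * math.log(p) for p in probs)
    return h / math.log(math.factorial(d))   # normalized to [0, 1]

# A monotone series realizes a single ordinal pattern, so the entropy is 0
print(permutation_entropy(list(range(100))))   # 0.0
```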

  3. Universal features of fluctuations in globular proteins.

    PubMed

    Erman, Burak

    2016-06-01

    Using data from 2000 non-homologous protein crystal structures, we show that the distribution of residue B factors of proteins collapses onto a single master curve. We show by maximum entropy arguments that this curve is a Gamma function whose order and dispersion are obtained from experimental data. The distribution for any given specific protein can be generated from the master curve by a linear transformation. Any perturbation of the B factor distribution of a protein, imposed at constant energy, causes a decrease in the entropy of the protein relative to that of the reference state. Proteins 2016; 84:721-725. © 2016 Wiley Periodicals, Inc.
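
    A Gamma distribution's order (shape) and dispersion (scale) follow directly from the first two moments, mean = kθ and variance = kθ². A sketch of that method-of-moments fit, run on synthetic data standing in for real B factors (the generating shape 4.0 and scale 7.5 are arbitrary illustrative values, not the paper's estimates):

```python
import numpy as np

def gamma_moments_fit(b):
    """Method-of-moments fit of Gamma(k, theta): mean = k*theta, var = k*theta**2."""
    b = np.asarray(b, dtype=float)
    mean, var = b.mean(), b.var()
    theta = var / mean     # dispersion (scale)
    k = mean / theta       # order (shape)
    return k, theta

rng = np.random.default_rng(1)
b_factors = rng.gamma(shape=4.0, scale=7.5, size=20_000)  # synthetic "B factors"
k, theta = gamma_moments_fit(b_factors)
print(k, theta)   # close to the generating values (4.0, 7.5)
```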

  4. Geometric characterization of separability and entanglement in pure Gaussian states by single-mode unitary operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adesso, Gerardo; CNR-INFM Coherentia, Naples; CNISM, Unita di Salerno, Salerno

    2007-10-15

    We present a geometric approach to the characterization of separability and entanglement in pure Gaussian states of an arbitrary number of modes. The analysis is performed adapting to continuous variables a formalism based on single subsystem unitary transformations that has been recently introduced to characterize separability and entanglement in pure states of qubits and qutrits [S. M. Giampaolo and F. Illuminati, Phys. Rev. A 76, 042301 (2007)]. In analogy with the finite-dimensional case, we demonstrate that the 1×M bipartite entanglement of a multimode pure Gaussian state can be quantified by the minimum squared Euclidean distance between the state itself and the set of states obtained by transforming it via suitable local symplectic (unitary) operations. This minimum distance, corresponding to a uniquely determined extremal local operation, defines an entanglement monotone equivalent to the entropy of entanglement, and amenable to direct experimental measurement with linear optical schemes.

  5. Automated mango fruit assessment using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders, a method that is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on a minimum entropy formulation analyses the data and classifies the mango fruit. The proposed method is capable of differentiating three grades of mango fruit automatically, with 77.78% overall accuracy compared to sorting by human graders. This method was found to be helpful for application in the current agricultural industry.

  6. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  7. Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung

    2015-04-01

    In environmental and other scientific applications, a certain understanding of the lithological composition of the subsurface is required. Because of practical constraints, only a limited amount of data can be acquired. To determine the lithological distribution in a study area, spatial statistical methods are used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging approach in geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data and combine not only hard data but also soft data to improve estimation. Since lithological classification is discrete categorical data, this research applied categorical BME to establish a complete three-dimensional lithological estimation model, using the limited hard data from cores together with soft data generated from geological dating data and virtual wells to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting

  8. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
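
    As a toy illustration of the idea (not the QIS processing chain): photon arrivals in a fixed window are Poisson distributed, the parity of each count yields a slightly biased raw bit, and a simple von Neumann extractor removes the bias. A pseudo-random generator and an arbitrary rate stand in for the physical source.

```python
import numpy as np

def von_neumann_extract(bits):
    """Debias a bit stream: map pairs 01 -> 0 and 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

rng = np.random.default_rng(42)              # pseudo-random stand-in for the source
counts = rng.poisson(lam=1.2, size=100_000)  # photon counts per jot per exposure
raw_bits = (counts % 2).tolist()             # parity of each count: biased raw bits
bits = von_neumann_extract(raw_bits)
ones = sum(bits) / len(bits)
print(len(bits), round(ones, 3))             # fewer but debiased bits, ones near 0.5
```

The parity of a Poisson(λ) count is even with probability (1 + e^{-2λ})/2, so the raw stream is biased; the extractor trades throughput for uniformity.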

  9. Deformation bands, the LEDS theory, and their importance in texture development: Part II. Theoretical conclusions

    NASA Astrophysics Data System (ADS)

    Kuhlmann-Wilsdorf, D.

    1999-09-01

    The facts regarding “regular” deformation bands (DBs) outlined in Part I of this series of articles are related to the low-energy dislocation structure (LEDS) theory of dislocation-based plasticity. They prompt an expansion of the theory by including the stresses due to strain gradients on account of changing selections of slip systems to the previously known dislocation driving forces. This last and until now neglected driving force is much smaller than the components considered hitherto, principally due to the applied stress and to mutual stress-screening among neighbor dislocations. As a result, it permits a near-proof of the LEDS hypothesis, to wit that among all structures which, in principle, are accessible to the dislocations, that one is realized which has the lowest free energy. Specifically, the temperature rises that would result from annihilating the largest DBs amount to only several millidegrees Centigrade, meaning that they, and by implication the entire dislocation structures, are close to thermodynamical equilibrium. This is in stark contrast to the assumption of the presently widespread self-organizing dislocation structures (SODS) modeling that plastic deformation occurs far from equilibrium and is subject to Prigogine’s thermodynamics of energy-flow-through systems. It also holds out promise for future rapid advances in the construction of constitutive equations, since the LEDS hypothesis is the principal basis of the LEDS theory of plastic deformation and follows directly from the second law of thermodynamics in conjunction with Newton’s third law. By contrast, all other known models of metal plasticity are in conflict with the LEDS hypothesis. In regard to texture modeling, the present analysis shows that Taylor’s criterion of minimum plastic work is incorrect and should be replaced by the criterion of minimum free energy in the stressed state. 
Last, the LEDS hypothesis is but a special case of the more general low-energy structure (LES) hypothesis, applying to plastic deformation independent of the deformation mechanism. It is thus seen that plastic deformation is one of nature’s means to generate order, as a byproduct of the entropy generation when mechanical work is largely converted into heat.

  10. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons regardless of data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods, including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on simulated biophysically realistic synapses, an Izhikevich cortical network based on the neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and identify spurious causality in a network of three neurons exactly. We conclude that the proposed method can obtain a more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details of neural coding. PMID:23940662
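
    The abstract does not spell out NPTE's ordinal embedding or normalization, so the sketch below implements only the quantity it builds on: plain transfer entropy with history length 1 on binary (spike/no-spike) sequences, TE_{X→Y} = Σ p(y_{t+1}, y_t, x_t) log2 [p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)].

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy TE_{X->Y} in bits, history length 1, for binary lists."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y0, x0) contexts
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y1, y0) transitions
    singles = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]            # p(y1 | y0, x0)
        p_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * math.log2(p_full / p_self)
    return te

# y copies x with one step of delay, so information flows X -> Y
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 50
y = [0] + x[:-1]
print(transfer_entropy(x, y), transfer_entropy(y, x))  # first clearly larger
```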

  11. [The motive force of evolution based on the principle of organismal adjustment evolution.].

    PubMed

    Cao, Jia-Shu

    2010-08-01

    From an analysis of the existing problems of the prevalent theories of evolution, this paper discusses the motive force of evolution based on the principle of organismal adjustment evolution, to reach a new understanding of the evolutionary mechanism. Guided by Schrödinger's dictum that "life feeds on negative entropy", the author proposes that the "negative entropy flow" actually comprises material flow, energy flow and information flow, and that this negative entropy flow is the motive force for living and development. By modifying his own principle of organismal adjustment evolution (as opposed to adaptation evolution), a new theory of a "regulation system of organismal adjustment evolution involving DNA, RNA and protein interacting with the environment" is proposed. According to the view that phylogenetic development is the "integral" of individual development, the difference in negative entropy flow between organisms and environment is considered to be a motive force for evolution, which is a new understanding of the mechanism of evolution. On this basis, evolution is regarded as "a changing process in which one subsystem passes all or part of its genetic information to the next generation in a larger system, and during the adaptation process produces some new elements, stops some old ones, and thereby persists in the larger system". Some other controversial questions related to evolution are also discussed.

  12. Analysis of cardiac signals using spatial filling index and time-frequency domain

    PubMed Central

    Faust, Oliver; Acharya U, Rajendra; Krishnan, SM; Min, Lim Choo

    2004-01-01

    Background: Analysis of heart rate variation (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). HRV analysis is based on the concept that fast fluctuations may specifically reflect changes of sympathetic and vagal activity. It shows that the structure generating the signal is not simply linear, but also involves nonlinear contributions. These signals are essentially non-stationary and may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random on the time scale. However, studying and pinpointing abnormalities in voluminous data collected over several hours is strenuous and time consuming. Methods: This paper presents the spatial filling index and time-frequency analysis of the heart rate variability signal for disease identification. Renyi's entropy is evaluated for the signal in the Wigner-Ville and Continuous Wavelet Transformation (CWT) domains. Results: Renyi's entropy gives a lower 'p' value for the scalogram than for the Wigner-Ville distribution, and the contours of the scalogram visually show the features of the diseases; in the time-frequency analysis, Renyi's entropy likewise gives better results for the scalogram than for the Wigner-Ville distribution. Conclusion: The spatial filling index and Renyi's entropy have distinct regions for various diseases, with an accuracy of more than 95%. PMID:15361254
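
    Renyi's entropy of order α generalizes the Shannon entropy as H_α = (1 - α)^{-1} log Σ_i p_i^α, recovering Shannon in the limit α → 1. A minimal sketch on a discrete distribution (the example probabilities are arbitrary, not derived from HRV data):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy in nats: H_a = log(sum(p_i**a)) / (1 - a), a > 0, a != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]                  # arbitrary example distribution
shannon = -sum(pi * math.log(pi) for pi in p)  # the alpha -> 1 limit
# H_alpha is non-increasing in alpha: H_0.5 >= H_1 (Shannon) >= H_2
print(renyi_entropy(p, 0.5), shannon, renyi_entropy(p, 2.0))
```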

  13. Analysis of the GRNs Inference by Using Tsallis Entropy and a Feature Selection Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Fabrício M.; de Oliveira, Evaldo A.; Cesar, Roberto M.

    An important problem in the bioinformatics field is to understand how genes are regulated and interact through gene networks. This knowledge can be helpful for many applications, such as disease treatment design and drug creation. For this reason, it is very important to uncover the functional relationships among genes and then to construct the gene regulatory network (GRN) from temporal expression data. However, this task usually involves data with a large number of variables and a small number of observations, so there is strong motivation to use pattern recognition and dimensionality reduction approaches. In particular, feature selection is especially important in order to select the most important predictor genes that can explain some phenomena associated with the target genes. This work presents a first study of the sensitivity of entropy methods to the entropy functional form, applied to the problem of topology recovery of GRNs. The generalized entropy proposed by Tsallis is used to study this sensitivity. The inference process is based on a feature selection approach, which is applied to simulated temporal expression data generated by an artificial gene network (AGN) model. The inferred GRNs are validated in terms of global network measures. Some interesting conclusions can be drawn from the experimental results, as reported for the first time in the present paper.
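
    The generalized entropy in question has the closed form S_q = (1 - Σ_i p_i^q)/(q - 1), with the Shannon entropy recovered as q → 1. A minimal sketch (the example distribution is arbitrary, not gene-expression data):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1), q != 1;
    recovers the Shannon entropy (nats) in the limit q -> 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]                  # arbitrary example distribution
print(tsallis_entropy(p, 2.0))                 # 0.65625
shannon = -sum(pi * math.log(pi) for pi in p)
print(tsallis_entropy(p, 1.0 + 1e-6), shannon) # nearly equal near q = 1
```

Sweeping `q` is what a sensitivity study of the entropic functional form amounts to in practice: the same data, scored under a family of entropies.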

  14. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values fluctuate strongly under slight variation of the data locations and distinguish series significantly only when they differ in length. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series shows that the RCMWPE method not only retains the advantages inherited from MWPE but also exhibits lower sensitivity to data locations, greater stability, and much less dependence on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on daily price return series from Asian and European stock markets. There are significant differences between Asian and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method is supported by simulations on generated and real data. It could be applied in a variety of fields to quantify the complexity of systems over multiple scales more accurately.

  15. Determining size and dispersion of minimum viable populations for land management planning and species conservation

    NASA Astrophysics Data System (ADS)

    Lehmkuhl, John F.

    1984-03-01

    The concept of minimum populations of wildlife and plants has only recently been discussed in the literature. Population genetics has emerged as a basic underlying criterion for determining minimum population size. This paper presents a genetic framework and procedure for determining minimum viable population size and dispersion strategies in the context of multiple-use land management planning. A procedure is presented for determining minimum population size based on maintenance of genetic heterozygosity and reduction of inbreeding. A minimum effective population size (N_e) of 50 breeding animals is taken from the literature as the minimum short-term size to keep inbreeding below 1% per generation. Steps in the procedure adjust N_e to account for variance in progeny number, unequal sex ratios, overlapping generations, population fluctuations, and the period of habitat/population constraint. The result is an approximate census number that falls within a range of effective population size of 50-500 individuals. This population range defines the time range of short- to long-term population fitness and evolutionary potential, the length of the term being a relative function of the species' generation time. Two population dispersion strategies are proposed: core population and dispersed population.
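
    The classical corrections behind these adjustments are short formulas; a sketch of three of them (unequal sex ratio, fluctuating size, inbreeding rate), using the standard population-genetics expressions rather than anything specific to this paper:

```python
def ne_sex_ratio(n_males, n_females):
    """Effective size under unequal sex ratio: Ne = 4 Nm Nf / (Nm + Nf)."""
    return 4.0 * n_males * n_females / (n_males + n_females)

def ne_fluctuating(sizes):
    """Effective size over fluctuating generations: harmonic mean of sizes."""
    return len(sizes) / sum(1.0 / n for n in sizes)

def inbreeding_rate(ne):
    """Expected increase in inbreeding per generation: dF = 1 / (2 Ne)."""
    return 1.0 / (2.0 * ne)

# 25 breeding males and 25 females give Ne = 50, the short-term threshold,
# matching the cited 1% inbreeding per generation.
ne = ne_sex_ratio(25, 25)
print(ne, inbreeding_rate(ne))        # 50.0 0.01
# A single crash dominates the long-run effective size (harmonic mean):
print(ne_fluctuating([50, 10, 50]))   # about 21.4
```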

  16. Short pauses in thalamic deep brain stimulation promote tremor and neuronal bursting.

    PubMed

    Swan, Brandon D; Brocker, David T; Hilliard, Justin D; Tatter, Stephen B; Gross, Robert E; Turner, Dennis A; Grill, Warren M

    2016-02-01

    We conducted intraoperative measurements of tremor during deep brain stimulation (DBS) containing short pauses (⩽50 ms) to determine whether there is a minimum pause duration that preserves tremor suppression. Nine subjects with essential tremor (ET) and thalamic DBS participated during implantable pulse generator (IPG) replacement surgery. Patterns of DBS included regular 130 Hz stimulation interrupted by 0, 15, 25 or 50 ms pauses. The same patterns were applied to a model of the thalamic network to quantify the effects of pauses on the activity of model neurons. All patterns of DBS decreased tremor relative to 'off'. Patterns with pauses generated less tremor reduction than regular high frequency DBS. The model revealed that rhythmic burst-driver inputs to the thalamus were masked during DBS, but pauses in stimulation allowed propagation of bursting activity. The mean firing rate of bursting-type model neurons as well as the firing pattern entropy of model neurons were both strongly correlated with tremor power across stimulation conditions. The temporal pattern of stimulation influences the efficacy of thalamic DBS. Pauses in stimulation resulted in decreased tremor suppression, indicating that masking of pathological bursting is a mechanism of thalamic DBS for tremor. Pauses in stimulation decreased the efficacy of open-loop DBS for suppression of tremor. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Supertranslations and Superrotations at the Black Hole Horizon.

    PubMed

    Donnay, Laura; Giribet, Gaston; González, Hernán A; Pino, Miguel

    2016-03-04

    We show that the asymptotic symmetries close to nonextremal black hole horizons are generated by an extension of supertranslations. This group is generated by a semidirect sum of Virasoro and Abelian currents. The charges associated with the asymptotic Killing symmetries satisfy the same algebra. When considering the special case of a stationary black hole, the zero mode charges correspond to the angular momentum and the entropy at the horizon.

  18. Overall heat transfer coefficient and pressure drop in a typical tubular exchanger employing alumina nano-fluid as the tube side hot fluid

    NASA Astrophysics Data System (ADS)

    Kabeel, A. E.; Abdelgaied, Mohamed

    2016-08-01

    Nano-fluids are used to improve heat transfer rates in heat exchangers, especially in the shell-and-tube heat exchanger, which is considered one of the most important types of heat exchanger. In the present study, an experimental loop is constructed to study the thermal characteristics of a shell-and-tube heat exchanger at different concentrations of Al2O3 nonmetallic particles (0.0, 2, 4, and 6 % by volume in pure water as the base fluid). The effects of nano-fluid concentration on the performance of the shell-and-tube heat exchanger have been evaluated in terms of the overall heat transfer coefficient, the friction factor, the tube-side pressure drop, and the entropy generation rate. The experimental results show that the highest heat transfer coefficient is obtained at a shell-side nano-fluid concentration of 4 %, for which the maximum increase in the overall heat transfer coefficient reached 29.8 % relative to the base fluid (water) at the same tube-side Reynolds number. In the tube side, however, the maximum relative increase in pressure drop reached 12, 28 and 48 % for nano-material concentrations of 2, 4 and 6 %, respectively, relative to the case without nano-fluid, at a Reynolds number of approximately 56,000. The entropy generation rate decreases with increasing nonmetallic particle volume fraction at the same flow rates: increasing the volume fraction from 0.0 to 6 % reduces the entropy generation rate by 10 %.
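    The entropy generation rate measured above combines thermal and frictional irreversibilities. A minimal sketch of the standard control-volume balance for a two-stream exchanger, S_gen = Σ m·cp·ln(Tout/Tin) plus a pressure-drop term m·Δp/(ρ·T); all numbers below are illustrative, not the paper's data:

```python
import math

def entropy_generation_rate(m_hot, cp_hot, th_in, th_out,
                            m_cold, cp_cold, tc_in, tc_out,
                            m_tube=0.0, dp_tube=0.0, rho=998.0, t_mean=300.0):
    """Entropy generation rate [W/K] of a two-stream heat exchanger:
    thermal terms m*cp*ln(Tout/Tin) per stream (temperatures in kelvin)
    plus a tube-side pressure-drop term m*dp/(rho*T)."""
    s_thermal = (m_hot * cp_hot * math.log(th_out / th_in)
                 + m_cold * cp_cold * math.log(tc_out / tc_in))
    s_friction = m_tube * dp_tube / (rho * t_mean)
    return s_thermal + s_friction

# Illustrative water-water case; flow rates chosen so the energy balance
# m*cp*dT matches on both sides (cp = 4180 J/(kg K)).
s_gen = entropy_generation_rate(
    m_hot=0.5, cp_hot=4180.0, th_in=350.0, th_out=330.0,
    m_cold=0.538, cp_cold=4180.0, tc_in=300.0, tc_out=318.6,
    m_tube=0.5, dp_tube=5000.0, rho=998.0, t_mean=340.0)
```

    By the second law the result must be positive; narrowing the stream-to-stream temperature gap or cutting the pressure drop lowers it, which is the sense in which the reported 10 % entropy-generation reduction is an efficiency gain.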

  19. Entropy, pumped-storage and energy system finance

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Georgios

    2015-04-01

    Pumped-storage holds a key role for integrating renewable energy units with non-renewable fuel plants into large-scale energy systems of electricity output. An emerging issue is the development of financial engineering models with a physical basis to systematically fund energy system efficiency improvements across its operation. A fundamental physically-based economic concept is the Scarcity Rent, which concerns the pricing of a natural resource's scarcity. Specifically, the scarcity rent comprises a fraction of a depleting resource's full price and accumulates to fund its more efficient future use. In an integrated energy system, scarcity rents derive from various resources and can be deposited to a pooled fund to finance the energy system's overall efficiency increase, allowing it to benefit from economies of scale. With pumped-storage incorporated into the system, water is upgraded to a hub resource in which the scarcity rents of all connected energy sources are denominated. However, as the water available for electricity generation or storage is also limited, a scarcity rent is imposed upon it as well. It is suggested that scarcity rent generation is reducible to three main factors, incorporating uncertainty: (1) water's natural renewability, (2) the energy system's intermittent components and (3) base-load prediction deviations from actual loads. For that purpose, the concept of entropy is used to measure the energy system's overall uncertainty, and hence its pumped-storage intensity requirements and the water scarcity rents generated. Keywords: pumped-storage, integration, energy systems, financial engineering, physical basis, Scarcity Rent, pooled fund, economies of scale, hub resource, uncertainty, entropy. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
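    The entropy measure invoked here is, in its simplest reading, the Shannon entropy of a discretized uncertainty distribution, e.g. deviations of actual from predicted base load; a minimal sketch with hypothetical numbers, not the project's model:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (bits) of a discrete probability distribution, used
    here as a scalar measure of an energy system's overall uncertainty."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions of (binned) base-load prediction deviations.
sharp_forecast = [0.0, 0.9, 0.1, 0.0]       # deviations mostly near zero
vague_forecast = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain

h_sharp = shannon_entropy(sharp_forecast)
h_vague = shannon_entropy(vague_forecast)   # 2 bits, the 4-bin maximum
```

    In the paper's framing, higher entropy implies higher pumped-storage intensity requirements and correspondingly higher scarcity rents on the stored water.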

  20. Diffusion profiling of tumor volumes using a histogram approach can predict proliferation and further microarchitectural features in medulloblastoma.

    PubMed

    Schob, Stefan; Beeskow, Anne; Dieckow, Julia; Meyer, Hans-Jonas; Krause, Matthias; Frydrychowicz, Clara; Hirsch, Franz-Wolfgang; Surov, Alexey

    2018-05-31

    Medulloblastomas are the most common central nervous system tumors in childhood. Treatment and prognosis strongly depend on histology and transcriptomic profiling. However, the proliferative potential also has prognostic value. Our study aimed to investigate correlations between histogram profiling of diffusion-weighted images and further microarchitectural features. Seven patients (median age 14.6 years, range 2-20 years; 5 male, 2 female) were included in this retrospective study. Using a Matlab-based analysis tool, histogram analysis of whole apparent diffusion coefficient (ADC) volumes was performed. ADC entropy revealed a strong inverse correlation with the expression of the proliferation marker Ki67 (r = -0.962, p = 0.009) and with total nuclear area (r = -0.888, p = 0.044). Furthermore, ADC percentiles, above all ADCp90, showed significant correlations with Ki67 expression (r = 0.902, p = 0.036). Diffusion histogram profiling of medulloblastomas provides valuable in vivo information which can potentially be used for risk stratification and prognostication. Entropy in particular proved to be the most promising imaging biomarker. However, further studies are warranted.
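    The two reported parameter families, first-order entropy and percentiles such as ADCp90, can be sketched from a whole-lesion ADC sample as follows; the data are synthetic and the function illustrative, not the Matlab tool used in the study:

```python
import numpy as np

def adc_histogram_features(adc_values, n_bins=64):
    """First-order histogram features of a whole-lesion ADC volume:
    Shannon entropy (bits) of the binned distribution, plus ADCp90."""
    counts, _ = np.histogram(adc_values, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    entropy = float(-(p * np.log2(p)).sum())
    p90 = float(np.percentile(adc_values, 90))
    return entropy, p90

# Synthetic 'lesion': ADC values (x10^-3 mm^2/s) in a plausible range.
rng = np.random.default_rng(0)
adc = rng.normal(loc=0.8, scale=0.15, size=10_000)
adc_entropy, adc_p90 = adc_histogram_features(adc)
```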

  1. Free Volume, Energy, and Entropy at the Polymer Glass Transition: New Results and Connections with Widely Used Treatments

    NASA Astrophysics Data System (ADS)

    White, Ronald; Lipson, Jane

    Free volume has a storied history in polymer physics. To introduce our own results, we consider how free volume has been defined in the past, e.g. in the works of Fox and Flory, Doolittle, and the equation of Williams, Landel, and Ferry. We contrast these perspectives with our own analysis using our Locally Correlated Lattice (LCL) model, where we have found a striking connection between polymer free volume (analyzed using PVT data) and the polymer's corresponding glass transition temperature, Tg. The pattern, covering over 50 different polymers, is robust enough to be reasonably predictive based on melt properties alone; when a melt hits this T-dependent boundary of critical minimum free volume it becomes glassy. We will present a broad selection of results from our thermodynamic analysis, and make connections with historical treatments. We will discuss patterns that have emerged across the polymers in the energy and entropy when quantified as "per LCL theoretical segment". Finally we will relate the latter trend to the point of view popularized in the theory of Adam and Gibbs. The authors gratefully acknowledge support from NSF DMR-1403757.

  2. Quantization Of Temperature

    NASA Astrophysics Data System (ADS)

    O'Brien, Paul

    2017-01-01

    Max Planck did not quantize temperature. I will show that the Planck temperature violates the Planck scale. Planck stated that the Planck scale was Nature's scale and independent of human construct, adding that even aliens would derive the same values. He made a huge mistake, because temperature is based on the Kelvin scale, which is man-made just like the meter and kilogram. He did not discover Nature's scale for the quantization of temperature. His formula is flawed, and his value is incorrect. Planck's calculation is T_P = M_P c^2/k_B. The general form of this equation is T = E/k_B. Why is this wrong? The temperature for a fixed amount of energy is dependent upon the volume it occupies. Using the correct formula involves specifying the radius of the volume in the form of (RE). This leads to an inequality and a limit that is equivalent to the Bekenstein Bound, but using temperature instead of entropy. Rewriting this equation as a limit defines both the maximum temperature and Boltzmann's constant. This will saturate any space-time boundary with maximum temperature and information density, and also the minimum radius and entropy. The general form of the equation then becomes a limit in black hole thermodynamics: T <= (RE)/(λ k_B).

  3. Hot-start Giant Planets Form with Radiative Interiors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berardo, David; Cumming, Andrew, E-mail: david.berardo@mcgill.ca, E-mail: andrew.cumming@mcgill.ca

    In the hot-start core accretion formation model for gas giants, the interior of a planet is usually assumed to be fully convective. By calculating the detailed internal evolution of a planet assuming hot-start outer boundary conditions, we show that such a planet will in fact form with a radially increasing internal entropy profile, so that its interior will be radiative instead of convective. For a hot outer boundary, there is a minimum value for the entropy of the internal adiabat S_min below which the accreting envelope does not match smoothly onto the interior, but instead deposits high entropy material onto the growing interior. One implication of this would be to at least temporarily halt the mixing of heavy elements within the planet, which are deposited by planetesimals accreted during formation. The compositional gradient this would impose could subsequently disrupt convection during post-accretion cooling, which would alter the observed cooling curve of the planet. However, even with a homogeneous composition, for which convection develops as the planet cools, the difference in cooling timescale will change the inferred mass of directly imaged gas giants.

  4. Multicellular regulation of entropy, spatial order, and information

    NASA Astrophysics Data System (ADS)

    Youk, Hyun

    Many multicellular systems such as tissues and microbial biofilms consist of cells that secrete and sense signalling molecules. Understanding how the collective behaviours of secrete-and-sense cells arise is an important challenge. We combined experimental and theoretical approaches to understand multicellular coordination of gene expression and spatial pattern formation among secrete-and-sense cells. We engineered secrete-and-sense yeast cells to show that cells can collectively and permanently remember a past event by reminding each other with their secreted signalling molecule. If one cell "forgets", then another cell can remind it. Cell-cell communication ensures a long-term (permanent) memory by overcoming common limitations of intracellular memory. We also established a new theoretical framework inspired by statistical mechanics to understand how fields of secrete-and-sense cells form spatial patterns. We introduce new metrics - cellular entropy, cellular Hamiltonian, and spatial order index - for dynamics of cellular automata that form spatial patterns. Our theory predicts how fast any spatial patterns form and how ordered they are, and establishes a cellular Hamiltonian that, like energy for non-living systems, monotonically decreases towards a minimum over time. ERC Starting Grant (MultiCellSysBio), NWO VIDI, NWO NanoFront.

  5. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and the conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
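    The method of moments fits a density of exponential-family form p(x) ∝ exp(-Σk λk x^k) whose moments match the data. A toy one-moment version on a discrete grid, solving for the single Lagrange multiplier by bisection (our own illustration, not the algorithm of refs. [5, 6]):

```python
import math

def maxent_distribution(grid, target_mean, tol=1e-10):
    """Maximum-entropy distribution on a finite grid subject to one moment
    constraint: p_i proportional to exp(-lam * x_i), with the Lagrange
    multiplier lam found by bisection so that sum_i p_i * x_i = target_mean."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in grid]
        z = sum(w)
        return sum(x * wi for x, wi in zip(grid, w)) / z

    lo, hi = -50.0, 50.0              # mean_for() is decreasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid                  # mean too high -> need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in grid]
    z = sum(w)
    return [wi / z for wi in w]

grid = [0, 1, 2, 3, 4]
p = maxent_distribution(grid, target_mean=1.0)
fitted_mean = sum(x * pi for x, pi in zip(grid, p))
```

    With several moment constraints the same idea requires a multidimensional solve for the multipliers, which is exactly where the Bayesian treatment of the Lagrange multipliers described above enters.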

  6. Entropy and cosmology.

    NASA Astrophysics Data System (ADS)

    Zucker, M. H.

    This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole.
The universe is the only system that, by itself, can raise its own temperature and thus, by itself, reverse entropy. The vast encompassing gravitational forces that the universe has at its disposal, forces that dominate the phase of contraction, provide the compacting, compressive mechanism that regenerates heat in an expanded, cooled universe and decreases entropy. And this phenomenon takes place without diminishing or depleting the finite amount of mass/energy with which the universe began. The fact that the universe can reverse the entropic process leads to possibilities previously ignored when assessing which of the three models (open, closed, or flat) most probably represents the future of the universe. After analyzing the models, the conclusion reached here is that the open model is only an expanded version of the closed model and therefore is not open, and the closed model will never collapse to a big crunch and, therefore, is not closed. That leaves a modified model, oscillating forever between limited phases of expansion and contraction (a universe in "dynamic equilibrium"), as the only feasible choice.

  7. Research of MPPT for photovoltaic generation based on two-dimensional cloud model

    NASA Astrophysics Data System (ADS)

    Liu, Shuping; Fan, Wei

    2013-03-01

    The cloud model is a mathematical representation of fuzziness and randomness in linguistic concepts. It represents a qualitative concept with an expected value Ex, entropy En and hyper-entropy He, and integrates the fuzziness and randomness of a linguistic concept in a unified way. This model is a new method for transformation between the qualitative and the quantitative in knowledge. This paper introduces a maximum power point tracking (MPPT) controller based on a two-dimensional cloud model, derived from an analysis of auto-optimization MPPT control of photovoltaic power systems combined with cloud model theory. Simulation results show that the cloud controller is simple, intuitive, strongly robust, and achieves better control performance.
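    The (Ex, En, He) triple is typically realized with the forward normal cloud generator: draw a perturbed entropy En' ~ N(En, He²), then a drop x ~ N(Ex, En'²) with membership exp(-(x - Ex)²/(2En'²)). A minimal sketch of that standard construction (not this paper's controller logic):

```python
import math
import random

def normal_cloud(ex, en, he, n_drops=1000, seed=42):
    """Forward normal cloud generator: (drop, membership) pairs for a
    qualitative concept with expected value Ex, entropy En, hyper-entropy He."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n_drops):
        en_prime = rng.gauss(en, he)          # second-order randomness
        if en_prime == 0.0:
            continue                          # degenerate draw; skip
        x = rng.gauss(ex, abs(en_prime))      # drop position
        mu = math.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))
        drops.append((x, mu))
    return drops

drops = normal_cloud(ex=0.0, en=1.0, he=0.1)
mean_drop = sum(x for x, _ in drops) / len(drops)
memberships = [mu for _, mu in drops]
```

    A two-dimensional cloud controller pairs two such concepts (for instance a voltage step and the resulting power change) and fires rules weighted by the generated memberships.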

  8. Unitary n -designs via random quenches in atomic Hubbard and spin models: Application to the measurement of Rényi entropies

    NASA Astrophysics Data System (ADS)

    Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n -designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.

  9. MHD natural convection and entropy generation in an open cavity having different horizontal porous blocks saturated with a ferrofluid

    NASA Astrophysics Data System (ADS)

    Gibanov, Nikita S.; Sheremet, Mikhail A.; Oztop, Hakan F.; Al-Salem, Khaled

    2018-04-01

    In this study, natural convection combined with entropy generation of Fe3O4-water nanofluid within a square open cavity filled with two different porous blocks under the influence of a uniform horizontal magnetic field is studied numerically. Porous blocks of different thermal properties, permeability and porosity are located on the bottom wall. The bottom wall of the cavity is kept at the hot temperature Th, the upper open boundary is at the constant cold temperature Tc, and the other walls of the cavity are assumed adiabatic. The governing equations, formulated in terms of the dimensionless stream function and vorticity with the Brinkman-extended Darcy model for the porous blocks, have been solved numerically by the finite difference method with the corresponding boundary conditions. The numerical analysis covers wide ranges of Hartmann number, nanoparticle volume fraction and porous-block length. It has been found that the addition of spherical ferric oxide nanoparticles can order the flow structures inside the cavity.

  10. Cave spiders choose optimal environmental factors with respect to the generated entropy when laying their cocoon

    PubMed Central

    Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria

    2015-01-01

    For spiders, the choice of a suitable area in which to lay eggs directly affects Darwinian fitness. Despite its importance, the underlying factors behind this key decision are generally poorly understood. Here, we designed a multidisciplinary study based both on in-field data and laboratory experiments focusing on the European cave spider Meta menardi (Araneae, Tetragnathidae) and aiming at understanding the selective forces driving the female in the choice of the depositional area. Our in-field data analysis demonstrated a major role of air velocity and distance from the cave entrance within a particular cave in driving the female choice. This has been interpreted using a model based on the Entropy Generation Minimization (EGM) method, without invoking best-fit parameters and thanks to independent lab experiments, thus demonstrating that the female chooses the depositional area according to a minimal level of thermo-fluid-dynamic irreversibility. This methodology may pave the way to a novel approach in understanding evolutionary strategies for other living organisms. PMID:25556697

  11. First steps towards a constructal Microbial Fuel Cell.

    PubMed

    Lepage, Guillaume; Perrier, Gérard; Ramousse, Julien; Merlin, Gérard

    2014-06-01

    In order to reach real operating conditions with consequent organic charge flow, a multi-channel reactor for Microbial Fuel Cells is designed. The feed-through double chamber reactor is a two-dimensional system with four parallel channels and Reticulated Vitreous Carbon as electrodes. Based on thermodynamical calculations, the constructal-inspired distributor is optimized with the aim to reduce entropy generation along the distributing path. In the case of negligible singular pressure drops, the Hess-Murray law links the lengths and the hydraulic diameters of the successive reducing ducts leading to one given working channel. The determination of generated entropy in the channels of our constructal MFC is based on the global hydraulic resistance caused by both regular and singular pressure drops. Polarization, power and Electrochemical Impedance Spectroscopy show the robustness and the efficiency of the cell, and therefore the potential of the constructal approach. Routes towards improvements are suggested in terms of design evolutions. Copyright © 2014 Elsevier Ltd. All rights reserved.
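    For a symmetric bifurcation, the Hess-Murray law cited above fixes the daughter-to-parent ratio of both hydraulic diameter and length at 2^(-1/3); a sketch of the resulting duct dimensions (the inlet values are illustrative, not the reactor's drawings):

```python
# Hess-Murray scaling for a symmetric dichotomous flow distributor: each
# bifurcation multiplies duct diameter and length by 2**(-1/3) ~ 0.794.
RATIO = 2.0 ** (-1.0 / 3.0)

def duct_dimensions(d_inlet, l_inlet, n_levels):
    """(diameter, length) of the successive ducts leading to one channel."""
    return [(d_inlet * RATIO ** k, l_inlet * RATIO ** k)
            for k in range(n_levels + 1)]

# Two bifurcation levels feed each of four parallel channels (units: mm).
dims = duct_dimensions(d_inlet=8.0, l_inlet=40.0, n_levels=2)

# Murray's law check: parent d^3 equals the sum of the two daughters' d^3,
# the condition that minimizes flow resistance at fixed duct volume.
d_parent, d_child = dims[0][0], dims[1][0]
murray_residual = d_parent ** 3 - 2.0 * d_child ** 3
```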

  12. Using Maximum Entropy to Find Patterns in Genomes

    NASA Astrophysics Data System (ADS)

    Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs, which will lead to a better understanding of biological processes. National Institute of General Medical Science, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher Scholar Award.
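    In the simplest case of a GC constraint alone (no amino acid constraint), symmetry makes the maximum-entropy null model i.i.d. nucleotides with P(G) = P(C) = gc/2 and P(A) = P(T) = (1 - gc)/2. A sketch of that base case; the full method described above additionally fixes the amino acid sequence, which this toy does not:

```python
import random

def random_sequence_with_gc(length, gc=0.5, seed=1):
    """Maximum-entropy i.i.d. nucleotide sequence with expected GC content
    gc: P(G) = P(C) = gc/2 and P(A) = P(T) = (1 - gc)/2."""
    rng = random.Random(seed)
    bases = ("G", "C", "A", "T")
    probs = (gc / 2, gc / 2, (1 - gc) / 2, (1 - gc) / 2)
    return "".join(rng.choices(bases, weights=probs, k=length))

seq = random_sequence_with_gc(30_000, gc=0.6)
observed_gc = (seq.count("G") + seq.count("C")) / len(seq)
```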

  13. FPGA Implementation of Metastability-Based True Random Number Generator

    NASA Astrophysics Data System (ADS)

    Hata, Hisashi; Ichikawa, Shuichi

    True random number generators (TRNGs) are important as a basis for computer security. Though some TRNGs are composed of analog circuits, the use of digital circuits is desired for the application of TRNGs to logic LSIs. Some digital TRNGs utilize jitter in free-running ring oscillators as a source of entropy, which consumes substantial power. Another type of TRNG exploits the metastability of a latch to generate entropy. Although this kind of TRNG has mostly been implemented with full-custom LSI technology, this study presents an implementation based on common FPGA technology. Our TRNG is comprised of logic gates only, and can be integrated in any kind of logic LSI. The RS latch in our TRNG is implemented as a hard macro to guarantee the quality of randomness by minimizing the signal skew and load imbalance of internal nodes. To improve the quality and throughput, the outputs of 64-256 latches are XORed. The derived design was verified on a Xilinx Virtex-4 FPGA (XC4VFX20), and passed the NIST statistical test suite without post-processing. Our TRNG with 256 latches occupies 580 slices, while achieving 12.5 Mbps throughput.
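    The XOR stage can be understood through the piling-up lemma: XORing n independent bits of bias ε leaves a bias of 2^(n-1)·ε^n, so even a strong per-latch bias vanishes for n = 64. A software sketch of this effect (the per-latch bias below is an assumed figure, not a measurement from the paper):

```python
import random

def biased_bit(rng, p_one):
    """One output bit from a 'latch' with P(1) = p_one."""
    return 1 if rng.random() < p_one else 0

def xor_combined_bit(rng, n_latches, p_one):
    """XOR of the outputs of n independent, equally biased latches."""
    bit = 0
    for _ in range(n_latches):
        bit ^= biased_bit(rng, p_one)
    return bit

rng = random.Random(7)
P_ONE = 0.6          # assumed per-latch bias; an ideal latch would give 0.5
N = 20_000

raw_ones = sum(biased_bit(rng, P_ONE) for _ in range(N)) / N
xor_ones = sum(xor_combined_bit(rng, 64, P_ONE) for _ in range(N)) / N

# Piling-up lemma: residual bias after XORing 64 bits of bias 0.1.
predicted_bias = 2 ** 63 * 0.1 ** 64    # ~9e-46, negligible
```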

  14. XMM Observations of Low Mass Groups

    NASA Technical Reports Server (NTRS)

    Davis, David S.

    2005-01-01

    This report discusses the two-dimensional XMM-Newton group survey. Azimuthally averaged analysis of the NGC 2300 and Pavo observations indicates that temperature structure is minimal in the NGC 2300 system; however, the Pavo system shows signs of a merger in progress. XMM data are used to generate two-dimensional maps of temperature and abundance, which in turn are used to generate maps of pressure and entropy.
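    Given fitted temperature and gas-density maps, the derived maps follow pixel-by-pixel from the usual X-ray conventions P = n_e·kT and K = kT/n_e^(2/3). A minimal sketch on synthetic maps (the values are illustrative, not the survey's data):

```python
import numpy as np

def pressure_entropy_maps(kt_kev, n_e):
    """X-ray cluster conventions, applied pixel-by-pixel: pressure
    P = n_e * kT [keV cm^-3] and 'entropy' K = kT / n_e**(2/3) [keV cm^2]."""
    return n_e * kt_kev, kt_kev / n_e ** (2.0 / 3.0)

# Synthetic group: a hotter, denser core embedded in cooler, thinner gas.
kt = np.full((64, 64), 1.0)      # keV
ne = np.full((64, 64), 1e-4)     # cm^-3
kt[24:40, 24:40] = 1.5
ne[24:40, 24:40] = 5e-4

pressure_map, entropy_map = pressure_entropy_maps(kt, ne)
```

    In this toy case the core shows up as a low-entropy, high-pressure region, the kind of signature such maps are designed to reveal.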

  15. Whole-Lesion Apparent Diffusion Coefficient-Based Entropy-Related Parameters for Characterizing Cervical Cancers: Initial Findings.

    PubMed

    Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-12-01

    This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm² prospectively. ADC-based entropy-related parameters including first-order entropy and second-order entropies were derived from the whole tumor volume as well as adjacent normal cervical tissues. Intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean were significantly higher, whereas entropy(H)range and entropy(H)std were significantly lower in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). The Kruskal-Wallis test showed that there were no significant differences among the values of the various second-order entropies, including entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean. All second-order entropies had a larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean had the same largest area under the receiver operating characteristic curve of 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully, showing initial potential in characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues.
Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
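    The second-order entropies at 0°, 45°, 90° and 135° are entropies of gray-level co-occurrence matrices (GLCMs) computed along those directions. A minimal numpy sketch on a synthetic image (illustrative, not the study's pipeline):

```python
import numpy as np

def glcm_entropy(image, offset):
    """Entropy (bits) of the symmetric gray-level co-occurrence matrix for a
    pixel offset, e.g. (0, 1) for 0 degrees or (1, 1) for 135 degrees."""
    levels = int(image.max()) + 1
    glcm = np.zeros((levels, levels))
    dr, dc = offset
    rows, cols = image.shape
    for r in range(max(0, -dr), min(rows, rows - dr)):
        for c in range(max(0, -dc), min(cols, cols - dc)):
            i, j = image[r, c], image[r + dr, c + dc]
            glcm[i, j] += 1
            glcm[j, i] += 1        # symmetric GLCM
    p = glcm / glcm.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic 8-level 'ADC map'; the four conventional directions as offsets.
rng = np.random.default_rng(3)
img = rng.integers(0, 8, size=(32, 32))
offsets = {"0": (0, 1), "45": (-1, 1), "90": (1, 0), "135": (1, 1)}
entropies = {angle: glcm_entropy(img, off) for angle, off in offsets.items()}
```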

  16. Oxygen enhanced switching to combustion of lower rank fuels

    DOEpatents

    Kobayashi, Hisashi; Bool, III, Lawrence E.; Wu, Kuang Tsai

    2004-03-02

    A furnace that combusts fuel, such as coal, of a given minimum energy content to obtain a stated minimum amount of energy per unit of time is enabled to combust fuel having a lower energy content, while still obtaining at least the stated minimum energy generation rate, by replacing a small amount of the combustion air fed to the furnace with oxygen. Replacing combustion air with oxygen also reduces NOx generation.

  17. Apparent diffusion coefficient histogram shape analysis for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2016-10-22

    To explore the role of apparent diffusion coefficient (ADC) histogram shape related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion-weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of the 2nd and 4th week during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape related parameters including skewness, kurtosis, s-sDav, width, standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed a high early decline rate (43.10 %, 48.29 %) at the end of the 2nd week of CCRT. All entropies decreased significantly from the 2nd week of CCRT onward. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.
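    Skewness and kurtosis, the two parameters with the large early decline, are the standard third and fourth standardized moments of the ADC distribution; a sketch of those definitions on synthetic data (illustrative, not the patients' histograms):

```python
import numpy as np

def histogram_shape(values):
    """Moment-based shape descriptors: skewness E[z^3] and Fisher kurtosis
    E[z^4] - 3, where z is the standardized value (x - mean) / std."""
    z = (np.asarray(values, float) - np.mean(values)) / np.std(values)
    return float(np.mean(z ** 3)), float(np.mean(z ** 4) - 3.0)

rng = np.random.default_rng(5)
symmetric = rng.normal(1.0, 0.2, 50_000)        # skewness and kurtosis ~ 0
right_skewed = rng.lognormal(0.0, 0.5, 50_000)  # clearly positive skewness

skew_sym, kurt_sym = histogram_shape(symmetric)
skew_skw, _ = histogram_shape(right_skewed)
```

    A post-treatment decline in skewness and kurtosis, as reported above, corresponds to the histogram becoming more symmetric and less sharply peaked.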

  18. On quantum Rényi entropies: A new generalization and some properties

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco

    2013-12-01

    The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
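    On a commuting (classical) spectrum, i.e. the eigenvalues of ρ, all of these quantities reduce to the classical Rényi family S_α = (1/(1-α))·log₂ Σ p_i^α; a sketch checking the named special cases on one spectrum:

```python
import math

def renyi_entropy(probs, alpha):
    """Classical Rényi entropy (bits) of order alpha; on the eigenvalues of a
    density matrix this reproduces the quantum entropies in the commuting case."""
    if alpha == 1.0:                         # von Neumann / Shannon limit
        return -sum(p * math.log2(p) for p in probs if p > 0)
    if math.isinf(alpha):                    # min-entropy
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs if p > 0)) / (1.0 - alpha)

spectrum = [0.5, 0.25, 0.125, 0.125]         # eigenvalues of some state

s_von_neumann = renyi_entropy(spectrum, 1.0)      # 1.75 bits
s_collision = renyi_entropy(spectrum, 2.0)        # collision entropy
s_min = renyi_entropy(spectrum, math.inf)         # 1.0 bit
s_max = renyi_entropy(spectrum, 0.5)              # max-entropy
```

    The family is non-increasing in α, so S_max ≥ S_vN ≥ S_collision ≥ S_min, one of the ordering properties a satisfactory quantum generalization must preserve.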

  19. Precipitation behavior of AlxCoCrFeNi high entropy alloys under ion irradiation

    NASA Astrophysics Data System (ADS)

    Yang, Tengfei; Xia, Songqin; Liu, Shi; Wang, Chenxu; Liu, Shaoshuai; Fang, Yuan; Zhang, Yong; Xue, Jianming; Yan, Sha; Wang, Yugang

    2016-08-01

Materials performance is central to the satisfactory operation of current and future nuclear energy systems due to the severe irradiation environment in reactors. Searching for structural materials with excellent irradiation tolerance is crucial for developing the next generation of nuclear reactors. Here, we report the irradiation responses of a novel multi-component alloy system, the high entropy alloy (HEA) AlxCoCrFeNi (x = 0.1, 0.75 and 1.5), focusing on precipitation behavior. It is found that the single-phase system, Al0.1CoCrFeNi, exhibits great phase stability against ion irradiation: no precipitate is observed even at the highest fluence. In contrast, numerous coherent precipitates are present in both multi-phase HEAs. Based on irradiation-induced/enhanced precipitation theory, the excellent structural stability against precipitation of Al0.1CoCrFeNi is attributed to its high configurational entropy and low atomic diffusion, which reduce the thermodynamic driving force and kinetically restrain the formation of precipitates, respectively. For the multi-phase HEAs, phase separation and the formation of ordered phases reduce the system's configurational entropy, resulting in precipitation behavior similar to that of corresponding binary or ternary conventional alloys. This study demonstrates the structural stability of single-phase HEAs under irradiation and provides important implications for the search for HEAs with higher irradiation tolerance.
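The configurational-entropy argument can be made concrete with the ideal mixing-entropy formula for a solid solution. A minimal sketch (the 1.5R "high entropy" threshold is the commonly quoted convention, not a result of this paper):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def configurational_entropy(moles):
    """Ideal configurational (mixing) entropy per mole of a solid solution,
    S_conf = -R * sum_i x_i ln x_i, for mole fractions x_i."""
    x = np.asarray(moles, float)
    x = x / x.sum()
    return -R * np.sum(x * np.log(x))
```

For equimolar AlCoCrFeNi (x = 1) this gives R ln 5 ≈ 13.4 J/(mol K) ≈ 1.61R; separation into ordered phases lowers the effective value, consistent with the abstract's argument for the multi-phase alloys.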

  20. Noise reduction algorithm with the soft thresholding based on the Shannon entropy and bone-conduction speech cross- correlation bands.

    PubMed

    Na, Sung Dae; Wei, Qun; Seong, Ki Woong; Cho, Jin Ho; Kim, Myoung Nam

    2018-01-01

The conventional methods of speech enhancement, noise reduction, and voice activity detection are based on the suppression of noise or non-speech components of the target air-conduction signals. However, air-conducted speech is hard to differentiate from babble or white noise signals. To overcome this problem, the proposed algorithm uses bone-conduction speech signals and soft thresholding based on the Shannon entropy principle and the cross-correlation of air- and bone-conduction signals. A new algorithm for speech detection and noise reduction is proposed, which makes use of the Shannon entropy principle and cross-correlation with the bone-conduction speech signals to threshold the wavelet packet coefficients of the noisy speech. Each threshold is generated by the entropy and cross-correlation approaches in the bands obtained by wavelet packet decomposition. The efficiency of the proposed method is confirmed by objective quality measures, namely PESQ, RMSE, correlation, and SNR. As a result, the noise is reduced by the proposed method in MATLAB simulations. To verify the method's feasibility, we compared the air- and bone-conduction speech signals and their spectra processed by the proposed method. The results confirm the high performance of the proposed method, which makes it well suited to future applications in communication devices, noisy environments, construction, and military operations.
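Two building blocks of such an algorithm, soft thresholding and an entropy-derived band threshold, can be sketched as follows. This is a simplified illustration (function names and the entropy-to-threshold scaling are assumptions; the paper additionally weights each band by the air/bone-conduction cross-correlation):

```python
import numpy as np

def soft_threshold(coeffs, thr):
    """Soft thresholding: shrink coefficients toward zero by thr."""
    c = np.asarray(coeffs, float)
    return np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)

def entropy_band_threshold(coeffs):
    """Illustrative per-band threshold: Shannon entropy of the normalized
    coefficient energies, scaled so that a maximally noise-like band
    (flat energy distribution) gets the strongest shrinkage."""
    c = np.asarray(coeffs, float)
    p = c ** 2 / np.sum(c ** 2)
    p = p[p > 0]
    H = -np.sum(p * np.log2(p))
    return (H / np.log2(len(c))) * np.abs(c).max()
```

Speech-dominated bands concentrate energy in few coefficients (low entropy, weak shrinkage); noise-dominated bands spread energy evenly (high entropy, strong shrinkage).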

  1. Cross-entropy clustering framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Tongal, Hakan; Sivakumar, Bellie

    2017-09-01

    There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
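Of the three clustering features, sample entropy (SampEn) is the least standard to compute. A minimal O(N²) sketch under the usual conventions (m = 2, r = 0.2 × standard deviation, Chebyshev distance, self-matches excluded):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    template vectors matching for m points (within tolerance r) also
    match for m + 1 points."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    n_templ = len(x) - m          # same template count for both lengths

    def matches(length):
        templ = np.array([x[i:i + length] for i in range(n_templ)])
        hits = 0
        for i in range(n_templ):
            d = np.max(np.abs(templ - templ[i]), axis=1)
            hits += int(np.sum(d <= r)) - 1   # exclude the self-match
        return hits

    return -np.log(matches(m + 1) / matches(m))
```

Regular, predictable signals give values near zero; irregular signals give larger values, which is what makes SampEn useful as a complexity feature alongside m and CV.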

  2. Upper entropy axioms and lower entropy axioms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi

    2015-04-15

The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of Shannon–Khinchin axioms and Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.

  3. EEG entropy measures in anesthesia

    PubMed Central

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

Highlights: Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. Multifractal detrended fluctuation analysis (MDFA) was included for comparison as a non-entropy measure. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. The three PE measures outperformed the other entropy indices, with less baseline variability and higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, the entropy measures showed an advantage in computational efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, the RPE index was the superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
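The best-performing family, permutation entropy, reduces to counting ordinal patterns in the signal. A minimal Shannon-PE sketch (the Renyi and Tsallis variants that the study favors replace the final Shannon sum over the same pattern probabilities):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Shannon permutation entropy: Shannon entropy of the distribution of
    ordinal patterns of length `order` (with embedding delay `delay`)."""
    x = np.asarray(x, float)
    span = (order - 1) * delay
    n = len(x) - span
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + span + 1:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), float) / n
    H = -np.sum(p * np.log2(p))
    return H / np.log2(factorial(order)) if normalize else H
```

A monotone signal uses a single ordinal pattern (entropy 0); richer dynamics spread probability over more of the order! possible patterns.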

  4. Quantifying Non-Equilibrium in Hypersonic Flows Using Entropy Generation

    DTIC Science & Technology

    2007-03-01

do this, two experimental cases performed at the Calspan-University of Buffalo Research Center (CUBRC) were modeled using Navier-Stokes based CFD...data provided by the CUBRC hypersonic wind tunnel facility (Holden and Wadhams, 2004). The wall data in Figure 9 and Figure 10 reveals some difference

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crooks, Gavin; Sivak, David

    Many interesting divergence measures between conjugate ensembles of nonequilibrium trajectories can be experimentally determined from the work distribution of the process. Herein, we review the statistical and physical significance of several of these measures, in particular the relative entropy (dissipation), Jeffreys divergence (hysteresis), Jensen-Shannon divergence (time-asymmetry), Chernoff divergence (work cumulant generating function), and Renyi divergence.

  6. Communication: Introducing prescribed biases in out-of-equilibrium Markov models

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.

    2018-03-01

    Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems. However, many times their predictions do not agree with experiments. We need a systematic framework to update existing Markov models to make them consistent with constraints that are derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
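For a stationary-state constraint, the minimum Kullback-Leibler update has the familiar exponential-tilting form p_i ∝ q_i exp(λ f_i), with λ fixed by the constraint. A minimal sketch for a discrete distribution (the paper applies the same principle to trajectory ensembles; the bisection solver and names here are illustrative):

```python
import numpy as np

def min_kl_update(q, f, f_obs, tol=1e-12):
    """Closest distribution p to the prior q (minimum D(p||q)) satisfying
    the constraint sum_i p_i f_i = f_obs. Solution: p_i ∝ q_i exp(lam*f_i),
    with lam found by bisection (the tilted mean is increasing in lam)."""
    q = np.asarray(q, float)
    f = np.asarray(f, float)

    def tilted(lam):
        w = q * np.exp(lam * f)
        return w / w.sum()

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tilted(mid) @ f < f_obs:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))
```

If the observed mean already matches the prior's mean, the update leaves the prior unchanged, as the maximum relative entropy principle requires.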

  7. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    PubMed

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio and the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT, and text files containing recorded or generated time series of spike signal data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented that deals with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows accurate and rapid analysis of multiple spike trains from thousands of electrodes.
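The core of the connectivity analysis, transfer entropy with a candidate delay, can be sketched for binary spike trains as follows. This is an order-1 illustration of the idea, not code from the package; SPICODYN's implementation handles multiple delays and higher-order binary patterns:

```python
import numpy as np
from collections import Counter
from math import log2

def transfer_entropy(x, y, delay=1):
    """Delayed transfer entropy TE_{X->Y}(d): reduction in uncertainty of
    y[t] given y[t-1] when additionally conditioning on x[t-d].
    Plug-in estimate from empirical counts over binary signals."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    start = max(1, delay)
    triples = list(zip(y[start:], y[start - 1:len(y) - 1],
                       x[start - delay:len(x) - delay]))
    n = len(triples)
    cabc = Counter(triples)                           # (y_t, y_{t-1}, x_{t-d})
    cab = Counter((a, b) for a, b, c in triples)      # (y_t, y_{t-1})
    cbc = Counter((b, c) for a, b, c in triples)      # (y_{t-1}, x_{t-d})
    cb = Counter(b for a, b, c in triples)            # y_{t-1}
    te = 0.0
    for (a, b, c), nabc in cabc.items():
        te += (nabc / n) * log2(nabc * cb[b] / (cbc[(b, c)] * cab[(a, b)]))
    return te
```

When y simply copies x with a one-step lag, TE at delay 1 is large; when x is uninformative, the estimate is zero.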

  8. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis

    PubMed Central

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956

  9. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
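The underlying procedure, computing the entropy of the diffusion histogram as a function of window length and reading the scaling exponent δ off the slope against ln t, can be sketched as follows (the bin handling here is deliberately simple; choosing it well is exactly the paper's subject):

```python
import numpy as np

def diffusion_entropy(increments, window_sizes, bins=40):
    """Diffusion Entropy Analysis sketch: for each window length t, form the
    diffused displacements x(t) (sums over sliding windows), histogram them,
    and compute the Shannon entropy with a bin-width correction so that S(t)
    approximates the differential entropy of p(x, t). For self-similar
    processes S(t) = A + delta * ln t."""
    inc = np.asarray(increments, float)
    c = np.concatenate([[0.0], np.cumsum(inc)])
    S = []
    for t in window_sizes:
        disp = c[t:] - c[:-t]                      # all length-t window sums
        counts, edges = np.histogram(disp, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p)) + np.log(edges[1] - edges[0]))
    return np.array(S)
```

For uncorrelated Gaussian increments the fitted slope should approach δ = 0.5, the ordinary diffusion exponent.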

  10. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

Physiologic complexity is an important concept for characterizing time series from biological systems, and combined with multiscale analysis it can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the areas under the curves were computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.
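Two ingredients of such an analysis are the coarse-graining step of multiscale analysis and a nonadditive (Tsallis) entropy. Minimal sketches of both (the paper's SDiffqmax-type metrics additionally involve surrogate data, omitted here):

```python
import numpy as np

def coarse_grain(x, scale):
    """Coarse-graining step of multiscale analysis: averages of consecutive
    non-overlapping windows of length `scale`."""
    x = np.asarray(x, float)
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def tsallis_entropy(p, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum_i p_i^q) / (q - 1);
    q -> 1 recovers the Shannon entropy."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

A scale-by-scale metric is then obtained by coarse-graining the series at each scale and evaluating the entropic measure on the resulting distribution.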

  11. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    PubMed

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
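The grey-relational building block of the weighting scheme can be sketched as follows (sequences are assumed pre-normalized to [0, 1]; ρ = 0.5 is the customary distinguishing coefficient, and the function name is illustrative):

```python
import numpy as np

def grey_relational_coefficients(reference, comparison, rho=0.5):
    """Grey relational coefficients between a reference sequence and a
    comparison sequence: xi_k = (d_min + rho*d_max) / (d_k + rho*d_max),
    with d_k = |ref_k - cmp_k| and rho the distinguishing coefficient.
    Coefficients near 1 mark points where the sequences agree closely."""
    d = np.abs(np.asarray(reference, float) - np.asarray(comparison, float))
    return (d.min() + rho * d.max()) / (d + rho * d.max())
```

Averaging the coefficients gives the grey relational grade; the paper's grey entropy analysis then turns the distribution of such coefficients into decision-target weights.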

  12. Scale-specific effects: A report on multiscale analysis of acupunctured EEG in entropy and power

    NASA Astrophysics Data System (ADS)

    Song, Zhenxi; Deng, Bin; Wei, Xile; Cai, Lihui; Yu, Haitao; Wang, Jiang; Wang, Ruofan; Chen, Yingyuan

    2018-02-01

Investigating acupuncture effects contributes to improving clinical application and to understanding neuronal dynamics under external stimulation. In this report, we recorded electroencephalography (EEG) signals evoked by acupuncture at the ST36 acupoint with three stimulus frequencies of 50, 100 and 200 times per minute, and selected non-acupuncture EEGs as the control group. Multiscale analyses were introduced to investigate the possible acupuncture effects on complexity and power at multiple scales. Using multiscale weighted-permutation entropy, we found significant acupuncture-induced increases in EEG signal complexity. The comparison of the three stimulation manipulations showed that 100 times/min generated the most pronounced effects and affected the most cortical regions. By estimating the average power spectral density, we found decreased power induced by acupuncture. The joint distribution of entropy and power indicated an inverse correlation, and this relationship was weakened by acupuncture effects, especially under the manipulation at the 100 times/min frequency. The above findings are more evident and stable at large scales than at small scales, which suggests that multiscale analysis allows evaluating significant effects at specific scales and enables probing of the inherent characteristics underlying physiological signals.

  13. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models

    PubMed Central

    Grün, Sonja; Helias, Moritz

    2017-01-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition. PMID:28968396

  14. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models.

    PubMed

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-10-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.
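The Glauber dynamics discussed in point 3 can be sketched for a pairwise model over binary units. A minimal single-update sketch of the standard model the paper criticizes (uniform reference measure; parameter names are generic):

```python
import numpy as np

def glauber_step(s, J, h, rng, beta=1.0):
    """One Glauber update of a pairwise maximum-entropy (Ising) model over
    binary units s_i in {0, 1}, P(s) ∝ exp(h·s + s·J·s / 2): pick a unit
    uniformly at random and set it to 1 with probability
    sigmoid(beta * local field)."""
    i = rng.integers(len(s))
    field = h[i] + J[i] @ s - J[i, i] * s[i]   # exclude self-coupling
    p_on = 1.0 / (1.0 + np.exp(-beta * field))
    s[i] = 1 if rng.random() < p_on else 0
    return s
```

The paper's point is that for couplings fitted to cortical data this chain is bistable, lingering near an unrealistic high-activity mode; their non-uniform reference measure removes that mode.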

  15. Comparison of Analytic Hierarchy Process, Catastrophe and Entropy techniques for evaluating groundwater prospect of hard-rock aquifer systems

    NASA Astrophysics Data System (ADS)

    Jenifer, M. Annie; Jha, Madan K.

    2017-05-01

    Groundwater is a treasured underground resource, which plays a central role in sustainable water management. However, it being hidden and dynamic in nature, its sustainable development and management calls for precise quantification of this precious resource at an appropriate scale. This study demonstrates the efficacy of three GIS-based multi-criteria decision analysis (MCDA) techniques, viz., Analytic Hierarchy Process (AHP), Catastrophe and Entropy in evaluating groundwater potential through a case study in hard-rock aquifer systems. Using satellite imagery and relevant field data, eight thematic layers (rainfall, land slope, drainage density, soil, lineament density, geology, proximity to surface water bodies and elevation) of the factors having significant influence on groundwater occurrence were prepared. These thematic layers and their features were assigned suitable weights based on the conceptual frameworks of AHP, Catastrophe and Entropy techniques and then they were integrated in the GIS environment to generate an integrated raster layer depicting groundwater potential index of the study area. The three groundwater prospect maps thus yielded by these MCDA techniques were verified using a novel approach (concept of 'Dynamic Groundwater Potential'). The validation results revealed that the groundwater potential predicted by the AHP technique has a pronounced accuracy of 87% compared to the Catastrophe (46% accuracy) and Entropy techniques (51% accuracy). It is concluded that the AHP technique is the most reliable for the assessment of groundwater resources followed by the Entropy method. The developed groundwater potential maps can serve as a scientific guideline for the cost-effective siting of wells and the effective planning of groundwater development at a catchment or basin scale.
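The AHP weighting step can be sketched with the common geometric-mean approximation to the principal eigenvector of the pairwise comparison matrix. This is a generic AHP sketch, not the paper's exact implementation:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix
    (a_ij = relative importance of criterion i over j, a_ji = 1/a_ij),
    using the geometric-mean (row) approximation of the principal
    eigenvector."""
    A = np.asarray(pairwise, float)
    g = A.prod(axis=1) ** (1.0 / A.shape[1])
    return g / g.sum()
```

For the study this would yield one weight per thematic layer (rainfall, slope, drainage density, and so on), which are then combined in the GIS overlay to produce the groundwater potential index.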

  16. [Maximum entropy model versus remote sensing-based methods for extracting Oncomelania hupensis snail habitats].

    PubMed

    Cong-Cong, Xia; Cheng-Fang, Lu; Si, Li; Tie-Jun, Zhang; Sui-Heng, Lin; Yi, Hu; Ying, Liu; Zhi-Jie, Zhang

    2016-12-02

To explore the maximum entropy model technique for extracting Oncomelania hupensis snail habitats in the Poyang Lake zone. Information on snail habitats and related environmental factors collected in the Poyang Lake zone was integrated to set up a maximum entropy-based species model and generate a snail habitat distribution map. Two Landsat 7 ETM+ remote sensing images of the Poyang Lake zone, from the wet and drought seasons, were obtained, and two indices, the modified normalized difference water index (MNDWI) and the normalized difference vegetation index (NDVI), were applied to extract snail habitats. ROC curves, sensitivities and specificities were used to assess the results, and the importance of the variables for snail habitats was analyzed using the Jackknife approach. The evaluation showed that the area under the receiver operating characteristic curve (AUC) of the testing data for the remote sensing-based method was only 0.56, with sensitivity and specificity of 0.23 and 0.89, respectively, whereas the corresponding indices for the maximum entropy model were 0.876, 0.89 and 0.74. The main concentrations of snail habitats in the Poyang Lake zone covered the northeastern part of Yongxiu County, the northwest of Yugan County, the southwest of Poyang County and the middle of Xinjian County. Elevation was the most important environmental variable affecting the distribution of snails, followed by land surface temperature (LST). The maximum entropy model is more reliable and accurate than the remote sensing-based method for extracting snail habitats, which provides guidance for the relevant departments in carrying out measures to prevent and control high-risk snail habitats.

  17. Zn-metalloprotease sequences in extremophiles

    NASA Astrophysics Data System (ADS)

    Holden, T.; Dehipawala, S.; Golebiewska, U.; Cheung, E.; Tremberger, G., Jr.; Williams, E.; Schneider, P.; Gadura, N.; Lieberman, D.; Cheung, T.

    2010-09-01

The Zn-metalloprotease family contains conserved amino acid structures such that the nucleotide fluctuation at the DNA level would exhibit correlated randomness as described by fractal dimension. A nucleotide sequence fractal dimension can be calculated from a numerical series consisting of the atomic numbers of each nucleotide. The structure's vibration modes can also be studied using a Gaussian Network Model. The vibration measure and fractal dimension values form a two-dimensional plot with a standard vector metric that can be used for comparison of structures. The preference for amino acid usage in extremophiles may suppress nucleotide fluctuations that could be analyzed in terms of fractal dimension and Shannon entropy. A protein-level cold adaptation study of the thermolysin Zn-metalloprotease family using molecular dynamics simulation was reported recently, and our results show that the associated nucleotide fluctuation suppression is consistent with a regression pattern generated from the sequences' fractal dimension and entropy values (R-square = 0.98, N = 5). It was observed that cold adaptation selected for high entropy and low fractal dimension values. Extension to the Archaemetzincin M54 family in extremophiles reveals a similar regression pattern (R-square = 0.98, N = 6). It was observed that the metalloprotease sequences of extremely halophilic organisms possess high fractal dimension and low entropy values as compared with non-halophiles. The zinc atom is usually bonded to the histidine residue, which shows limited levels of vibration in the Gaussian Network Model. The variability of the fractal dimension and entropy for a given protein structure suggests that extremophiles would have evolved after mesophiles, consistent with the biased usage of non-prebiotic amino acids by extremophiles. It may be argued that extremophiles have the capacity to offer extinction protection during drastic changes in astrobiological environments.

  18. Exergy and the economic process

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Georgios

    2016-04-01

The Second Law of Thermodynamics (2nd Law) dictates that the introduction of physical work in a system requires the existence of a heat gradient, according to the universal notion of the Carnot Heat Engine. This is the cornerstone of the notion of exergy as well, as exergy is the potential of physical work generation across the process of equilibration of a number of unified systems with different thermodynamic states. However, although energy concerns the abstract ability of work generation, exergy concerns the specific ability of work generation, due to the requirement of specifying an environment of reference, in relation to which the thermodynamic equilibration takes place; this also determines heat engine efficiencies. Consequently, while energy is always conserved, exergy, deriving from heat gradient equilibration, is always consumed. According to this perspective, the availability of heat gradients is what fundamentally drives the evolution of econosystems, by enhancing, or even substituting, human labor (Boulding 1978; Chen 2005; Ayres and Warr 2009). In addition, exergy consumption is irreversible, via the gradual transformation of useful physical work to entropy; hence reducing its future economic availability. By extending Georgescu-Roegen's approach (1971), it could be postulated that this irreversible exhaustion of exergy comprises the fundamental cause of economic scarcity, which is the cornerstone for the development of economic science. Conclusively, scarcity consists in: (a) the difficulty of allocating, in the Earth System, very high heat gradients that would make humanity's heat engines very efficient; and (b) the irreversible depletion of existent heat gradients due to entropy production. In addition, the concept of exergy can be used to study natural resource degradation and pollution at the biogeochemical level and to understand why heat gradient scarcity in the Earth System was eventually inevitable.
All of these issues are analyzed both theoretically and quantitatively.

    Keywords: 2nd Law, physical work, heat gradient, Carnot Heat Engine, exergy, energy, reference environment, econosystems, irreversibility, entropy, scarcity, resource degradation, pollution

    References:
    1. Ayres, Robert U. and Benjamin Warr (2009), The Economic Growth Engine: How Energy and Work Drive Material Prosperity, Edward Elgar and IIASA
    2. Boulding, Kenneth E. (1978), Ecodynamics: A New Theory of Societal Evolution, Sage Publications
    3. Chen, Jing (2005), The Physical Foundations of Economics: An Analytic Thermodynamic Theory, World Scientific
    4. Georgescu-Roegen, Nicholas (1971), The Entropy Law and the Economic Process, Harvard University Press
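The two relations underlying the abstract's argument can be sketched as follows (standard thermodynamics; the symbols T_h, T_c, T_0, Q are generic and not defined in the record itself):

```latex
% Carnot efficiency of a heat engine operating between reservoirs T_h > T_c:
\eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h}

% Exergy (maximum extractable work) of heat Q available at temperature T,
% relative to a reference environment at temperature T_0:
E_x = Q \left( 1 - \frac{T_0}{T} \right)
```

As T approaches T_0 the gradient vanishes and E_x goes to zero, which is the "heat gradient scarcity" the record describes: work potential exists only relative to a reference environment.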

  19. Measure-valued solutions to the complete Euler system revisited

    NASA Astrophysics Data System (ADS)

    Březina, Jan; Feireisl, Eduard

    2018-06-01

We consider the complete Euler system describing the time evolution of a general inviscid compressible fluid. We introduce a new concept of measure-valued solution based on the total energy balance and entropy inequality for the physical entropy without any renormalization. This class of so-called dissipative measure-valued solutions is large enough to include the vanishing dissipation limits of the Navier-Stokes-Fourier system. Our main result states that any sequence of weak solutions to the Navier-Stokes-Fourier system with vanishing viscosity and heat conductivity coefficients generates a dissipative measure-valued solution of the Euler system under some physically grounded constitutive relations. Finally, we discuss the same asymptotic limit for the bi-velocity fluid model introduced by H. Brenner.

  20. SU-E-QI-17: Dependence of 3D/4D PET Quantitative Image Features On Noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliver, J; Budzevich, M; Zhang, G

    2014-06-15

Purpose: Quantitative imaging is a fast evolving discipline where a large number of features are extracted from images, i.e., radiomics. Some features have been shown to have diagnostic, prognostic and predictive value. However, they are sensitive to acquisition and processing factors, e.g., noise. In this study noise was added to positron emission tomography (PET) images to determine how features were affected by noise. Methods: Three levels of Gaussian noise were added to 8 lung cancer patients' PET images acquired in 3D mode (static) and using respiratory tracking (4D); for the latter, images from one of 10 phases were used. A total of 62 features: 14 shape, 19 intensity (1stO), 18 GLCM textures (2ndO; from grey level co-occurrence matrices) and 11 RLM textures (2ndO; from run-length matrices) were extracted from segmented tumors. Dimensions of GLCM were 256×256, calculated using 3D images with a step size of 1 voxel in 13 directions. Grey levels were binned into 256 levels for RLM and features were calculated in all 13 directions. Results: Feature variation generally increased with noise. Shape features were the most stable while RLM features were the most unstable. Intensity and GLCM features performed well, the latter being more robust. The most stable 1stO features were compactness, maximum and minimum length, standard deviation, root-mean-squared, I30, V10-V90, and entropy. The most stable 2ndO features were entropy, sum-average, sum-entropy, difference-average, difference-variance, difference-entropy, information-correlation-2, short-run-emphasis, long-run-emphasis, and run-percentage. In general, features computed from images from one of the phases of 4D scans were more stable than from 3D scans. Conclusion: This study shows the need to characterize image features carefully before they are used in research and medical applications. It also shows that the performance of features, and thereby feature selection, may be assessed in part by noise analysis.

  1. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

As a fundamental concept for describing complex systems, entropy has been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, permutation entropy, etc. This paper proposes a new two-index entropy Sq,δ, and we find that the new two-index entropy is applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexities of simulated series and effectively classifies several financial markets in various regions of the world.
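As an illustration of one of the entropy forms named above, here is a minimal permutation-entropy sketch in the Bandt-Pompe style (the function name and defaults are illustrative, not taken from the record):

```python
import math

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe style) of a 1-D series."""
    # Count the frequency of each ordinal pattern of length `order`.
    counts = {}
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = tuple(series[i + j * delay] for j in range(order))
        # Ordinal pattern: indices of the window sorted by value.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    # Shannon entropy of the pattern distribution, normalized by log(order!).
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A strictly monotonic series produces a single ordinal pattern and hence zero entropy, while an irregular series yields a value closer to 1.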

  2. Entropy criteria applied to pattern selection in systems with free boundaries

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-10-01

    The steady state differential or integral equations which describe patterned dissipative structures, typically to be identified with first order phase transformation morphologies like isothermal pearlites, are invariably degenerate in one or more order parameters (the lamellar spacing in the pearlite case). It is often observed that a different pattern is attained at the steady state for each initial condition (the hysteresis or metastable case). Alternatively, boundary perturbations and internal fluctuations during transition up to, or at the steady state, destroy the path coherence. In this case a statistical ensemble of imperfect patterns often emerges which represents a fluctuating but recognizably patterned and unique average steady state. It is cases like cellular, lamellar pearlite, involving an assembly of individual cell patterns which are regularly perturbed by local fluctuation and growth processes, which concern us here. Such weakly fluctuating nonlinear steady state ensembles can be arranged in a thought experiment so as to evolve as subsystems linking two very large mass-energy reservoirs in isolation. Operating on this discontinuous thermodynamic ideal, Onsager’s principle of maximum path probability for isolated systems, which we interpret as a minimal time correlation function connecting subsystem and baths, identifies the stable steady state at a parametric minimum or maximum (or both) in the dissipation rate. This nonlinear principle is independent of the Principle of Minimum Dissipation which is applicable in the linear regime of irreversible thermodynamics. The statistical argument is equivalent to the weak requirement that the isolated system entropy as a function of time be differentiable to the second order despite the macroscopic pattern fluctuations which occur in the subsystem. This differentiability condition is taken for granted in classical stability theory based on the 2nd Law. 
The optimal principle as applied to isothermal and forced velocity pearlites (in this case maximal) possesses a Le Chatelier (perturbation) Principle which can be formulated exactly via Langer’s conjecture that “each lamella must grow in a direction which is perpendicular to the solidification front”. This is the first example of such an equivalence to be experimentally and theoretically recognized in nonlinear irreversible thermodynamics. A further application to binary solidification cells is reviewed. In this case the optimum in the dissipation is a minimum and the closure between theory and experiment is excellent. Other applications in thermal-hydraulics, biology, and solid state physics are briefly described.

  3. Aging and cardiovascular complexity: effect of the length of RR tachograms

    PubMed Central

    Nagaraj, Nithin

    2016-01-01

    As we age, our hearts undergo changes that result in a reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, three complexity measures are used, namely Lempel–Ziv complexity (LZ), Sample Entropy (SampEn) and Effort-To-Compress (ETC). We determined the minimum length of RR tachogram required for characterizing complexity of healthy young and healthy old hearts. All the three measures indicated significantly lower complexity values for older subjects than younger ones. However, the minimum length of heart-beat interval data needed differs for the three measures, with LZ and ETC needing as low as 10 samples, whereas SampEn requires at least 80 samples. Our study indicates that complexity measures such as LZ and ETC are good candidates for the analysis of cardiovascular dynamics since they are able to work with very short RR tachograms. PMID:27957395
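A minimal sketch of the Lempel-Ziv (LZ76) complexity used above, which counts the phrases of an exhaustive parsing of a symbol string (the function name is illustrative, not from the record):

```python
def lempel_ziv_complexity(s):
    """LZ76 complexity: number of phrases in the exhaustive parsing of string s.

    Each new phrase is the shortest substring starting at the current position
    that has not already appeared in the preceding text (overlap allowed).
    """
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # Extend the candidate phrase while it already occurs earlier.
        while i + l <= n and s[i:i + l] in s[0:i + l - 1]:
            l += 1
        c += 1   # one more distinct phrase
        i += l
    return c
```

For an RR tachogram one would first quantize the intervals into symbols (e.g., above/below the median) before applying the measure; as the record notes, such compression-based measures remain informative even for very short sequences.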

  4. Comment on "Inference with minimal Gibbs free energy in information field theory".

    PubMed

    Iatsenko, D; Stefanovska, A; McClintock, P V E

    2012-03-01

    Enßlin and Weig [Phys. Rev. E 82, 051112 (2010)] have introduced a "minimum Gibbs free energy" (MGFE) approach for estimation of the mean signal and signal uncertainty in Bayesian inference problems: it aims to combine the maximum a posteriori (MAP) and maximum entropy (ME) principles. We point out, however, that there are some important questions to be clarified before the new approach can be considered fully justified, and therefore able to be used with confidence. In particular, after obtaining a Gaussian approximation to the posterior in terms of the MGFE at some temperature T, this approximation should always be raised to the power of T to yield a reliable estimate. In addition, we show explicitly that MGFE indeed incorporates the MAP principle, as well as the MDI (minimum discrimination information) approach, but not the well-known ME principle of Jaynes [E.T. Jaynes, Phys. Rev. 106, 620 (1957)]. We also illuminate some related issues and resolve apparent discrepancies. Finally, we investigate the performance of MGFE estimation for different values of T, and we discuss the advantages and shortcomings of the approach.

  5. Microcanonical entropy for classical systems

    NASA Astrophysics Data System (ADS)

    Franzosi, Roberto

    2018-03-01

The entropy definition in the microcanonical ensemble is revisited. We propose a novel definition for the microcanonical entropy that resolves the debate on the correct definition of the microcanonical entropy. In particular, we show that this entropy definition fixes the problem inherent in the exact extensivity of the caloric equation. Furthermore, this entropy reproduces results that agree with those predicted by the standard Boltzmann entropy when applied to macroscopic systems. By contrast, the predictions obtained with the standard Boltzmann entropy and with the entropy we propose differ for small system sizes. Thus, we conclude that the Boltzmann entropy provides a correct description for macroscopic systems, whereas extremely small systems are better described by the entropy that we propose here.

  6. On S-mixing entropy of quantum channels

    NASA Astrophysics Data System (ADS)

    Mukhamedov, Farrukh; Watanabe, Noboru

    2018-06-01

In this paper, an S-mixing entropy of quantum channels is introduced as a generalization of Ohya's S-mixing entropy, and several of its properties are investigated. Moreover, certain relations between the S-mixing entropy and the existing map and output entropies of quantum channels are established. These relations allow us to find connections between separable states and the introduced entropy, yielding a sufficient condition to detect entangled states. Finally, the entropies of qubit and phase-damping channels are calculated.

  7. Fermionic entanglement in superconducting systems

    NASA Astrophysics Data System (ADS)

    Di Tullio, M.; Gigena, N.; Rossignoli, R.

    2018-06-01

We examine distinct measures of fermionic entanglement in the exact ground state of a finite superconducting system. It is first shown that global measures such as the one-body entanglement entropy, which represents the minimum relative entropy between the exact ground state and the set of fermionic Gaussian states, exhibit a close correlation with the BCS gap, saturating in the strong superconducting regime. The same behavior is displayed by the bipartite entanglement between the set of all single-particle states k of positive quasimomenta and their time-reversed partners k̄. In contrast, the entanglement associated with the reduced density matrix of four single-particle modes k, k̄, k', k̄', which can be measured through a properly defined fermionic concurrence, exhibits a different behavior, showing a peak in the vicinity of the superconducting transition for states k, k' close to the Fermi level and becoming small in the strong coupling regime. In the latter regime, such a reduced state exhibits, instead, finite mutual information and quantum discord. While the first measures can be correctly estimated with the BCS approximation, the previous four-level concurrence lies strictly beyond it, requiring at least a particle-number projected BCS treatment for its description. Formal properties of all the previous entanglement measures are discussed as well.

  8. Application of maximum entropy to statistical inference for inversion of data from a single track segment.

    PubMed

    Stotts, Steven A; Koch, Robert A

    2017-08-01

    In this paper an approach is presented to estimate the constraint required to apply maximum entropy (ME) for statistical inference with underwater acoustic data from a single track segment. Previous algorithms for estimating the ME constraint require multiple source track segments to determine the constraint. The approach is relevant for addressing model mismatch effects, i.e., inaccuracies in parameter values determined from inversions because the propagation model does not account for all acoustic processes that contribute to the measured data. One effect of model mismatch is that the lowest cost inversion solution may be well outside a relatively well-known parameter value's uncertainty interval (prior), e.g., source speed from track reconstruction or towed source levels. The approach requires, for some particular parameter value, the ME constraint to produce an inferred uncertainty interval that encompasses the prior. Motivating this approach is the hypothesis that the proposed constraint determination procedure would produce a posterior probability density that accounts for the effect of model mismatch on inferred values of other inversion parameters for which the priors might be quite broad. Applications to both measured and simulated data are presented for model mismatch that produces minimum cost solutions either inside or outside some priors.

  9. Analytical design of intelligent machines

    NASA Technical Reports Server (NTRS)

    Saridis, George N.; Valavanis, Kimon P.

    1987-01-01

The problem of designing 'intelligent machines' to operate in uncertain environments with minimum supervision or interaction with a human operator is examined. The structure of an 'intelligent machine' is defined to be that of a Hierarchically Intelligent Control System, composed of three levels hierarchically ordered according to the principle of 'increasing precision with decreasing intelligence': the organizational level, performing general information processing tasks in association with a long-term memory; the coordination level, dealing with specific information processing tasks with a short-term memory; and the control level, which executes various tasks through hardware using feedback control methods. The behavior of such a machine may be managed by controls with special considerations. Its 'intelligence' is directly related to the derivation of a compatible measure that associates the intelligence of the higher levels with the concept of entropy, a sufficient analytic measure that unifies the treatment of all levels of an 'intelligent machine' as the mathematical problem of finding the right sequence of internal decisions and controls for a system structured in the order of intelligence and inverse order of precision, such that the total entropy is minimized. A case study on the automatic maintenance of a nuclear plant illustrates the proposed approach.

  10. Einstein-Podolsky-Rosen paradox implies a minimum achievable temperature

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    2017-01-01

    This work examines the thermodynamic consequences of the repeated partial projection model for coupling a quantum system to an arbitrary series of environments under feedback control. This paper provides observational definitions of heat and work that can be realized in current laboratory setups. In contrast to other definitions, it uses only properties of the environment and the measurement outcomes, avoiding references to the "measurement" of the central system's state in any basis. These definitions are consistent with the usual laws of thermodynamics at all temperatures, while never requiring complete projective measurement of the entire system. It is shown that the back action of measurement must be counted as work rather than heat to satisfy the second law. Comparisons are made to quantum jump (unravelling) and transition-probability based definitions, many of which appear as particular limits of the present model. These limits show that our total entropy production is a lower bound on traditional definitions of heat that trace out the measurement device. Examining the master equation approximation to the process at finite measurement rates, we show that most interactions with the environment make the system unable to reach absolute zero. We give an explicit formula for the minimum temperature achievable in repeatedly measured quantum systems. The phenomenon of minimum temperature offers an explanation of recent experiments aimed at testing fluctuation theorems in the quantum realm and places a fundamental purity limit on quantum computers.

  11. Criteria for scaling heat exchangers to miniature size

    NASA Technical Reports Server (NTRS)

    Rudolfvonrohr, P. B.; Smith, J. L., Jr.

    1985-01-01

The purpose of this work is to highlight the particular aspects of miniature heat exchanger performance and to determine an appropriate design approach. A thermodynamic analysis is performed to express the generated entropy as a function of material and geometric characteristics of the heat exchangers. This expression is then used to size miniature heat exchangers.

  12. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Proctor, D.

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.
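A minimal sketch of one of the algorithms compared above, plain (non-accelerated) Lucy-Richardson deconvolution, shown in 1-D for brevity; the function and variable names are illustrative, not from the record:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50, eps=1e-12):
    """Plain Richardson-Lucy deconvolution of a 1-D nonnegative signal."""
    # Start from a flat, positive estimate with the same total flux scale.
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_mirror = psf[::-1]  # correlation = convolution with the mirrored PSF
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)          # data / model
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate
```

The multiplicative update preserves nonnegativity, and on noiseless data the estimate progressively sharpens toward the maximum-likelihood solution; in practice the iteration count acts as a regularizer against noise amplification.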

  13. Towards a next generation open-source video codec

    NASA Astrophysics Data System (ADS)

    Bankoski, Jim; Bultje, Ronald S.; Grange, Adrian; Gu, Qunshan; Han, Jingning; Koleszar, John; Mukherjee, Debargha; Wilkins, Paul; Xu, Yaowu

    2013-02-01

Google has recently been developing a next generation open-source video codec called VP9, as part of the experimental branch of the libvpx repository included in the WebM project (http://www.webmproject.org/). Starting from the VP8 video codec released by Google in 2010 as the baseline, a number of enhancements and new tools have been added to improve the coding efficiency. This paper provides a technical overview of the current status of this project along with comparisons against other state-of-the-art video codecs, H.264/AVC and HEVC. The new tools that have been added so far include: larger prediction block sizes up to 64x64, various forms of compound INTER prediction, more modes for INTRA prediction, 1/8-pel motion vectors and 8-tap switchable sub-pel interpolation filters, improved motion reference generation and motion vector coding, improved entropy coding and frame-level entropy adaptation for various symbols, improved loop filtering, incorporation of Asymmetric Discrete Sine Transforms and larger 16x16 and 32x32 DCTs, frame-level segmentation to group similar areas together, etc. Other tools and various bitstream features are being actively worked on as well. The VP9 bitstream is expected to be finalized by early to mid-2013. Results show VP9 to be quite competitive in performance with mainstream state-of-the-art codecs.

  14. Novel pseudo-random number generator based on quantum random walks.

    PubMed

    Yang, Yu-Guang; Zhao, Qian-Qian

    2016-02-04

In this paper, we investigate the potential application of quantum computation for constructing pseudo-random number generators (PRNGs) and further construct a novel PRNG based on quantum random walks (QRWs), a well-known quantum computation model. The PRNG merely relies on the equations used in the QRWs, and thus the generation algorithm is simple and the computation speed is fast. The proposed PRNG is subjected to statistical test suites such as NIST and successfully passes them. Compared with the representative PRNG based on quantum chaotic maps (QCM), the present QRWs-based PRNG has some advantages, such as better statistical complexity and recurrence. For example, the normalized Shannon entropy and the statistical complexity of the QRWs-based PRNG are 0.999699456771172 and 1.799961178212329e-04, respectively, for 8-bit words over a sequence of, say, 16 Mbits. By contrast, the corresponding values of the QCM-based PRNG are 0.999448131481064 and 3.701210794388818e-04, respectively. Thus the statistical complexity and the normalized entropy of the QRWs-based PRNG are closer to 0 and 1, respectively, than those of the QCM-based PRNG as the number of words in the analyzed sequence increases. This provides a new clue for constructing PRNGs and also extends the applications of quantum computation.

  15. Novel pseudo-random number generator based on quantum random walks

    PubMed Central

    Yang, Yu-Guang; Zhao, Qian-Qian

    2016-01-01

In this paper, we investigate the potential application of quantum computation for constructing pseudo-random number generators (PRNGs) and further construct a novel PRNG based on quantum random walks (QRWs), a well-known quantum computation model. The PRNG merely relies on the equations used in the QRWs, and thus the generation algorithm is simple and the computation speed is fast. The proposed PRNG is subjected to statistical test suites such as NIST and successfully passes them. Compared with the representative PRNG based on quantum chaotic maps (QCM), the present QRWs-based PRNG has some advantages, such as better statistical complexity and recurrence. For example, the normalized Shannon entropy and the statistical complexity of the QRWs-based PRNG are 0.999699456771172 and 1.799961178212329e-04, respectively, for 8-bit words over a sequence of, say, 16 Mbits. By contrast, the corresponding values of the QCM-based PRNG are 0.999448131481064 and 3.701210794388818e-04, respectively. Thus the statistical complexity and the normalized entropy of the QRWs-based PRNG are closer to 0 and 1, respectively, than those of the QCM-based PRNG as the number of words in the analyzed sequence increases. This provides a new clue for constructing PRNGs and also extends the applications of quantum computation. PMID:26842402
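The normalized Shannon entropy reported in the two records above can be computed for 8-bit words along these lines (a sketch; the function name is illustrative, not from the records):

```python
import math
from collections import Counter

def normalized_shannon_entropy(words, alphabet_size=256):
    """Shannon entropy of a symbol sequence, normalized to [0, 1].

    For 8-bit words the maximum entropy is log2(256) = 8 bits/word,
    so a value near 1 indicates a near-uniform word distribution.
    """
    counts = Counter(words)
    n = len(words)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(alphabet_size)
```

A uniform byte stream scores 1.0 and a constant stream scores 0.0; a good PRNG output, like the values quoted in the records, should sit very close to 1.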

  16. Study of quantum correlation swapping with relative entropy methods

    NASA Astrophysics Data System (ADS)

    Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun

    2016-02-01

    To generate long-distance shared quantum correlations (QCs) for information processing in future quantum networks, recently we proposed the concept of QC repeater and its kernel technique named QC swapping. Besides, we extensively studied the QC swapping between two simple QC resources (i.e., a pair of Werner states) with four different methods to quantify QCs (Xie et al. in Quantum Inf Process 14:653-679, 2015). In this paper, we continue to treat the same issue by employing other three different methods associated with relative entropies, i.e., the MPSVW method (Modi et al. in Phys Rev Lett 104:080501, 2010), the Zhang method (arXiv:1011.4333 [quant-ph]) and the RS method (Rulli and Sarandy in Phys Rev A 84:042109, 2011). We first derive analytic expressions of all QCs which occur during the swapping process and then reveal their properties about monotonicity and threshold. Importantly, we find that a long-distance shared QC can be generated from two short-distance ones via QC swapping indeed. In addition, we simply compare our present results with our previous ones.

  17. Performance optimization of plate heat exchangers with chevron plates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muley, A.; Manglik, R.M.

    1999-07-01

The enhanced heat transfer performance of a chevron plate heat exchanger (PHE) is evaluated employing (1) energy-conservation based performance evaluation criteria (PECs), and (2) the second-law based minimization of entropy generation principle. Single-phase laminar and turbulent flow convection for three different chevron-plate arrangements is considered. The influence of plate surface corrugation characteristics and their stack arrangements on the heat exchanger's thermal-hydraulic performance is delineated. Based on the different figures of merit, the results show that the extent of heat transfer enhancement increases with flow Re and chevron angle β in laminar flow, but diminishes with increasing Re in turbulent flows. With up to 2.9 times higher Q, 48% lower A, and an entropy generation number N_s,a < 1 relative to an equivalent flat-plate pack, chevron plates are found to be especially suitable in the low to medium flow rate range (20 ≤ Re ≤ 2,000). Also, there appears to be no significant advantage of using a mixed-plate over a symmetric-plate arrangement.
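The entropy generation number quoted above can be sketched as a ratio in the style of Bejan's second-law augmentation criterion (a hedged reading; the subscripts are generic, not notation defined in the record):

```latex
N_{s,a} \;=\; \frac{\dot{S}_{\mathrm{gen,\,chevron}}}{\dot{S}_{\mathrm{gen,\,flat\ plate}}},
\qquad
N_{s,a} < 1 \;\Longrightarrow\; \text{the enhanced surface generates less entropy than the reference.}
```

On this reading, the record's finding of N_s,a < 1 means the chevron plates improve heat transfer without a net second-law penalty relative to the flat-plate pack.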

  18. Laguerre-polynomial-weighted squeezed vacuum: generation and its properties of entanglement

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Zhang, Kuizheng; Zhang, Haoliang; Xu, Xuexiang; Hu, Liyun

    2018-02-01

We theoretically prepare a kind of two-mode entangled non-Gaussian state generated by combining quantum catalysis and a parametric-down amplifier operated on the two-mode squeezed vacuum state. We then investigate the entanglement properties by examining the Von Neumann entropy, EPR correlation, squeezing effect and the fidelity of teleportation. It is shown that only the Von Neumann entropy can be enhanced by both single- and two-mode catalysis in a small squeezing region, while the other properties can be enhanced only by two-mode catalysis, including the symmetrical and asymmetrical cases. A comparison among these properties shows that the squeezing and the EPR correlation definitely lead to the improvement of both the entanglement and the fidelity, and the region of enhanced fidelity can be seen as a sub-region of the enhanced entanglement, which indicates that the entanglement is not always beneficial for the fidelity. In addition, the effect of photon loss after catalysis on the fidelity is considered, and the symmetrical two-photon catalysis may perform better than the symmetrical single-photon case against decoherence in a certain region.

  19. A Framework for Final Drive Simultaneous Failure Diagnosis Based on Fuzzy Entropy and Sparse Bayesian Extreme Learning Machine

    PubMed Central

    Ye, Qing; Pan, Hao; Liu, Changhua

    2015-01-01

This research proposes a novel framework for final drive simultaneous failure diagnosis comprising feature extraction, training of paired diagnostic models, decision-threshold generation, and recognition of simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Single-failure samples are used to construct probability classifiers based on paired sparse Bayesian extreme learning machines, which are trained only on single failure modes and inherit the high generalization and sparsity of the sparse Bayesian learning approach. To generate an optimal decision threshold that converts the probability outputs of the classifiers into final simultaneous failure modes, this research uses samples containing both single and simultaneous failure modes together with a grid search method, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of the existing approaches. PMID:25722717

  20. Entropy generation in a parallel-plate active magnetic regenerator with insulator layers

    NASA Astrophysics Data System (ADS)

    Mugica Guerrero, Ibai; Poncet, Sébastien; Bouchard, Jonathan

    2017-02-01

This paper proposes a feasible solution to diminish conduction losses in active magnetic regenerators. Higher performances of these machines are linked to a lower thermal conductivity of the Magneto-Caloric Material (MCM) in the streamwise direction. The concept presented here involves the insertion of insulator layers along the length of a parallel-plate magnetic regenerator in order to reduce the heat conduction within the MCM. This idea is investigated by means of a 1D numerical model. This model solves not only the energy equations for the fluid and solid domains but also the magnetic circuit that conforms to the experimental setup of reference. In conclusion, the addition of insulator layers within the MCM increases the temperature span, cooling load, and coefficient of performance by a combination of lower heat conduction losses and an increment of the global Magneto-Caloric Effect. The entropy generated by solid conduction, fluid convection and conduction, and viscous losses is calculated to help understand the implications of introducing insulator layers in magnetic regenerators. Finally, the optimal number of insulator layers is studied.
