Sample records for entropy generation minimization

  1. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  2. Effective Techniques for Augmenting Heat Transfer: An Application of Entropy Generation Minimization Principles.

    DTIC Science & Technology

    1980-12-01

    Keywords: augmentation techniques, entropy generation, irreversibility, exergy. The report treats irreversibility and entropy generation as the fundamental criterion for evaluating and, eventually, minimizing the waste of usable energy (exergy) in energy systems; its contents include internally finned tubes, internally roughened tubes, and other heat transfer augmentation techniques.

  3. Nonlinear radiative heat flux and heat source/sink on entropy generation minimization rate

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Waleed Ahmed; Khan, M. Ijaz; Alsaedi, A.

    2018-06-01

    Entropy generation minimization in nonlinear radiative mixed convective flow towards a variably thickened surface is addressed. Entropy generation due to momentum and thermal transport is analyzed. The flow is driven by the stretching velocity of the sheet. Transformations are used to reduce the system of partial differential equations to ordinary ones. The total entropy generation rate is determined. Series solutions for the zeroth- and mth-order deformation systems are computed. The domain of convergence for the obtained solutions is identified. Velocity, temperature and concentration fields are plotted and interpreted. The entropy equation is studied through nonlinear mixed convection and radiative heat flux. Velocity and temperature gradients are discussed through graphs. Meaningful results are concluded in the final remarks.

  4. Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián César

    2018-04-01

    Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70 % and that, by changing only the inlet composition, it is possible to cut it by nearly 40 %, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54 %, when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.

  5. Optimization of a Circular Microchannel With Entropy Generation Minimization Method

    NASA Astrophysics Data System (ADS)

    Jafari, Arash; Ghazali, Normah Mohd

    2010-06-01

    Advances at the micro and nano scales are being realized, and micro- and nano-scale heat dissipation devices are of high importance in this technology development. Past studies showed that microchannel design is governed by thermal resistance and pressure drop. However, entropy generation minimization (EGM), as an optimization approach, holds that the rate of entropy generation should also be minimized. The application of EGM in microchannel heat sink design is reviewed and discussed in this paper. The principles for deriving the entropy generation relations are discussed to show how this approach can be applied. An optimization procedure using the EGM method is derived for a circular microchannel heat sink based upon thermal resistance and pressure drop. The equations are solved using MATLAB and the obtained results are compared to similar past studies. The effects of channel diameter, number of channels, heat flux, and pumping power on the entropy generation rate and Reynolds number are investigated. Analytical correlations are utilized for the heat transfer and friction coefficients. A minimum entropy generation is observed for N = 40 and a channel diameter of 90 μm. It is concluded that for N = 40 and a channel hydraulic diameter of 90 μm, the circular microchannel heat sink operates at its optimum point according to the second law of thermodynamics.
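
    A minimal Python sketch of the kind of two-term EGM objective described above (a thermal-resistance contribution plus a pressure-drop contribution). The correlations (laminar Nu = 3.66, f = 64/Re), the property values and the dimensions are illustrative assumptions, not the models or data of the cited study.

      import numpy as np

      def entropy_generation(N, D, Q=100.0, T_a=300.0, L=0.01,
                             k_f=0.6, rho=998.0, mu=1e-3, cp=4180.0, Vdot=1e-6):
          """Entropy generation rate [W/K] for N parallel circular channels of diameter D [m]."""
          A_ch = np.pi * D**2 / 4.0
          v = Vdot / (N * A_ch)                      # mean velocity in one channel
          Re = rho * v * D / mu
          h = 3.66 * k_f / D                         # laminar Nusselt number Nu = 3.66 (assumed)
          R_th = 1.0 / (h * np.pi * D * L * N) + 1.0 / (rho * Vdot * cp)   # convective + caloric resistance
          T_s = T_a + Q * R_th                       # heat-sink temperature estimate
          dP = (64.0 / Re) * (L / D) * 0.5 * rho * v**2                    # laminar friction, f = 64/Re
          return Q**2 * R_th / (T_a * T_s) + dP * Vdot / T_a               # thermal + frictional terms

      # Sweep channel count and diameter; the cited study reports an optimum near
      # N = 40 and D = 90 um, which this toy model is not expected to reproduce exactly.
      grid = [(N, D) for N in range(10, 101, 10) for D in np.linspace(50e-6, 300e-6, 26)]
      N_opt, D_opt = min(grid, key=lambda nd: entropy_generation(*nd))
      print(f"minimum S_gen at N = {N_opt}, D = {D_opt * 1e6:.0f} um")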

  6. Zero entropy continuous interval maps and MMLS-MMA property

    NASA Astrophysics Data System (ADS)

    Jiang, Yunping

    2018-06-01

    We prove that the flow generated by any continuous interval map with zero topological entropy is minimally mean-attractable and minimally mean-L-stable. One of the consequences is that any oscillating sequence is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy. In particular, the Möbius function is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy (Sarnak’s conjecture for continuous interval maps). Another consequence is a non-trivial example of a flow having discrete spectrum. We also define a log-uniform oscillating sequence and show a result in ergodic theory for comparison. This material is based upon work supported by the National Science Foundation. It is also partially supported by a collaboration grant from the Simons Foundation (grant number 523341) and PSC-CUNY awards and a grant from NSFC (grant number 11571122).

  7. Radiative entropy generation in a gray absorbing, emitting, and scattering planar medium at radiative equilibrium

    NASA Astrophysics Data System (ADS)

    Sadeghi, Pegah; Safavinejad, Ali

    2017-11-01

    Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: both walls at a prescribed temperature, and mixed boundary conditions in which one wall is at a prescribed temperature and the other at a prescribed heat flux. The effects of wall emissivities, optical thickness, single scattering albedo, and anisotropic-scattering factor on the entropy generation are investigated in detail. The results reveal that entropy generation in the system mainly arises from irreversible radiative transfer at the wall with the lower temperature. The total entropy generation rate for the system with prescribed wall temperatures increases remarkably as wall emissivity increases; conversely, for the system with mixed boundary conditions, the total entropy generation rate decreases slightly. Furthermore, as the optical thickness increases, the total entropy generation rate decreases remarkably for the system with prescribed wall temperatures, whereas for the system with mixed boundary conditions it increases. Variation of the single scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and wall emissivities have a significant effect on the entropy generation in a system at radiative equilibrium. Accounting for the parameters that most affect radiative entropy generation provides an opportunity to optimize the design or improve overall performance and efficiency by applying entropy minimization techniques to systems at radiative equilibrium.

  8. Spatial chaos of Wang tiles with two symbols

    NASA Astrophysics Data System (ADS)

    Chen, Jin-Yu; Chen, Yu-Jie; Hu, Wen-Guei; Lin, Song-Sun

    2016-02-01

    This investigation completely classifies the spatial chaos problem in plane edge coloring (Wang tiles) with two symbols. For a set of Wang tiles B, spatial chaos occurs when the spatial entropy h(B) is positive. B is called a minimal cycle generator if P(B) ≠ ∅ and P(B′) = ∅ whenever B′ ⊊ B, where P(B) is the set of all periodic patterns on ℤ² generated by B. Given a set of Wang tiles B, write B = C1 ∪ C2 ∪ ⋯ ∪ Ck ∪ N, where the Cj, 1 ≤ j ≤ k, are minimal cycle generators and B contains no minimal cycle generator except those contained in C1 ∪ C2 ∪ ⋯ ∪ Ck. Then, the positivity of the spatial entropy h(B) is completely determined by C1 ∪ C2 ∪ ⋯ ∪ Ck. Furthermore, there are 39 equivalence classes of marginal positive-entropy (MPE) sets of Wang tiles and 18 equivalence classes of saturated zero-entropy (SZE) sets of Wang tiles. For a set of Wang tiles B, h(B) is positive if and only if B contains an MPE set, and h(B) is zero if and only if B is a subset of an SZE set.

  9. Thermal performance of plate fin heat sink cooled by air slot impinging jet with different cross-sectional area

    NASA Astrophysics Data System (ADS)

    Mesalhy, O. M.; El-Sayed, Mostafa M.

    2015-06-01

    Flow and heat transfer characteristics of a plate-fin heat sink cooled by a rectangular impinging jet with different cross-sectional areas were studied experimentally and numerically. The study concentrated on investigating the effect of jet width, fin number, and fin height on thermal performance. The entropy generation minimization method was used to define the optimum design and operating conditions. It is found that the jet width that minimizes entropy generation changes with heat sink height and fin number.

  10. New thermodynamics of entropy generation minimization with nonlinear thermal radiation and nanomaterials

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Ijaz; Qayyum, Sumaira; Alsaedi, A.; Khan, M. Imran

    2018-03-01

    This research addresses entropy generation for MHD stagnation point flow of a viscous nanofluid over a stretching surface. Characteristics of heat transport are analyzed through nonlinear radiation and heat generation/absorption. Nanoliquid features due to Brownian motion and thermophoresis are considered. The fluid is subject to a constant applied inclined magnetic field. The flow problem is mathematically modeled and the governing expressions are converted into nonlinear ordinary differential equations by appropriate transformations. The effects of pertinent variables on velocity, nanoparticle concentration and temperature are discussed graphically. Furthermore, Brownian motion and thermophoresis effects on entropy generation and the Bejan number are examined. Total entropy generation is inspected through various flow variables. Particular attention is given to the convergence of the solutions. Velocity, temperature and mass gradients at the surface of the sheet are calculated numerically.

  11. Heat Transfer and Entropy Generation Analysis of an Intermediate Heat Exchanger in ADS

    NASA Astrophysics Data System (ADS)

    Wang, Yongwei; Huai, Xiulan

    2018-04-01

    The intermediate heat exchanger used to enhance heat transfer is important equipment in the utilization of nuclear energy. In the present work, heat transfer and entropy generation of an intermediate heat exchanger (IHX) in an accelerator driven subcritical system (ADS) are investigated experimentally. The variation of the entropy generation number with the performance parameters of the IHX is analyzed, and the effects of the IHX inlet conditions on the entropy generation number and heat transfer are discussed. Under both working conditions of constant mass flow rate of liquid lead-bismuth eutectic (LBE) and of helium gas, the total pumping power tends to decrease with decreasing entropy generation number, but the variations of the effectiveness, the number of transfer units and the thermal capacity rate ratio are inconsistent and need to be analyzed separately. With increasing inlet mass flow rate or LBE inlet temperature, the entropy generation number increases and heat transfer is enhanced, while the opposite trend occurs with increasing helium gas inlet temperature. Further study is needed to obtain optimized operating parameters of the IHX that minimize entropy generation and enhance heat transfer.

  12. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    NASA Astrophysics Data System (ADS)

    Silva, Carlos; Annamalai, Kalyan

    2008-06-01

    The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases of the U.S. Food and Nutrition Board (FNB) and the Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts a lifespan of 73.78 and 81.61 years for the average U.S. male and female individuals, respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that entropy generation increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.
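
    The core arithmetic here is an integration of a specific entropy generation rate over age. The Python sketch below shows that bookkeeping with a purely hypothetical rate profile; the cited study derives its profile from the FNB/CDC data rather than the values assumed here.

      import numpy as np

      ages_yr = np.array([0, 1, 5, 10, 20, 40, 60, 80])              # sample ages [years]
      sigma = np.array([1.4, 1.2, 0.9, 0.7, 0.5, 0.45, 0.4, 0.38])   # kJ/(K*kg*day), hypothetical profile

      age_grid = np.linspace(0.0, 80.0, 801)
      sigma_grid = np.interp(age_grid, ages_yr, sigma)               # interpolate the rate over age
      step_yr = age_grid[1] - age_grid[0]
      lifetime_entropy = sigma_grid.sum() * step_yr * 365.25         # kJ/K per kg of body mass
      print(f"entropy generated over 80 years ≈ {lifetime_entropy:,.0f} kJ/K per kg")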

  13. Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows

    NASA Astrophysics Data System (ADS)

    Safari, Mehdi

    Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and the entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of the joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied to LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreement with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.

  14. Enhancement of heat transfer and entropy generation analysis of nanofluids turbulent convection flow in square section tubes

    NASA Astrophysics Data System (ADS)

    Bianco, Vincenzo; Nardini, Sergio; Manca, Oronzio

    2011-12-01

    In this article, developing turbulent forced convection flow of a water-Al2O3 nanofluid in a square tube, subjected to constant and uniform wall heat flux, is numerically investigated. The mixture model is employed to simulate the nanofluid flow, and the investigation is carried out for a particle size of 38 nm. An entropy generation analysis is also proposed in order to find the optimal working condition for the given geometry under the given boundary conditions. A simple analytical procedure is proposed to evaluate the entropy generation, and its results are compared with the numerical calculations, showing very good agreement. A comparison of the resulting Nusselt numbers with experimental correlations available in the literature is carried out. To minimize entropy generation, the optimal Reynolds number is determined.
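
    The closing step, finding the Reynolds number that minimizes entropy generation for a fixed heat duty, can be illustrated with a Bejan-type expression for a heated duct. The Python sketch below is an assumption-laden stand-in: it uses plain water properties, a circular-duct formula with Dittus-Boelter and Blasius correlations, and round-number dimensions, not the nanofluid model or data of the article.

      import numpy as np

      rho, mu, k, Pr = 998.0, 1e-3, 0.6, 7.0    # water properties near 300 K (assumed)
      T, D, q_prime = 300.0, 0.01, 2000.0       # bulk temperature [K], tube diameter [m], heat input [W/m]

      def entropy_per_length(Re):
          """Bejan-type entropy generation rate per unit tube length [W/(K*m)]."""
          mdot = Re * np.pi * D * mu / 4.0                       # mass flow rate consistent with Re
          Nu = 0.023 * Re**0.8 * Pr**0.4                         # Dittus-Boelter (assumed applicable)
          f = 0.316 * Re**(-0.25)                                # Blasius friction factor
          thermal = q_prime**2 / (np.pi * k * T**2 * Nu)         # finite-temperature-difference term
          friction = 8.0 * mdot**3 * f / (np.pi**2 * rho**2 * T * D**5)   # pressure-drop term
          return thermal + friction

      Re = np.linspace(4_000, 100_000, 2000)
      S = entropy_per_length(Re)
      print(f"entropy-optimal Reynolds number ≈ {Re[np.argmin(S)]:.0f}")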

  15. Towards the minimization of thermodynamic irreversibility in an electrically actuated microflow of a viscoelastic fluid under electrical double layer phenomenon

    NASA Astrophysics Data System (ADS)

    Sarma, Rajkumar; Jain, Manish; Mondal, Pranab Kumar

    2017-10-01

    We discuss the entropy generation minimization for electro-osmotic flow of a viscoelastic fluid through a parallel plate microchannel under the combined influences of interfacial slip and conjugate transport of heat. We use in this study the simplified Phan-Thien-Tanner model to describe the rheological behavior of the viscoelastic fluid. Using Navier's slip law and thermal boundary conditions of the third kind, we solve the transport equations analytically and evaluate the global entropy generation rate of the system. We examine the influential role of the following parameters on the entropy generation rate of the system, viz., the viscoelastic parameter (εDe²), the Debye-Hückel parameter (κ̄), the channel wall thickness (δ), the thermal conductivity of the wall (γ), the Biot number (Bi), the Peclet number (Pe), and the axial temperature gradient (B). This investigation finally establishes the optimum values of the abovementioned parameters, leading to the minimum entropy generation of the system. We believe that the results of this analysis could be helpful in optimizing the second-law performance of microscale thermal management devices, including micro-heat exchangers, micro-reactors, and micro-heat pipes.

  16. Entropy generation minimization (EGM) of nanofluid flow by a thin moving needle with nonlinear thermal radiation

    NASA Astrophysics Data System (ADS)

    Waleed Ahmed Khan, M.; Ijaz Khan, M.; Hayat, T.; Alsaedi, A.

    2018-04-01

    Entropy generation minimization (EGM) and heat transport in nonlinear radiative flow of nanomaterials over a thin moving needle are discussed. Nonlinear thermal radiation and viscous dissipation terms are included in the energy expression. Water is treated as the ordinary fluid, while the nanomaterials comprise titanium dioxide, copper and aluminum oxide. The nonlinear governing equations of the flow problem are transformed into ordinary differential equations and then solved numerically by a built-in shooting technique. In the first part of this investigation, the entropy expression is derived as a function of temperature and velocity gradients and is nondimensionalized using geometrical and physical flow field variables. The entropy generation analysis is carried out through the second law of thermodynamics. Results for temperature, velocity, concentration, surface drag force and heat transfer rate are explored. The outcomes reveal that the surface drag force and the Nusselt number (heat transfer) increase linearly with nanoparticle volume fraction. Furthermore, the drag force decreases for aluminum oxide and increases for copper nanoparticles. In addition, the lowest heat transfer rate is obtained for higher values of the radiative parameter. The temperature field is enhanced with an increase in the temperature ratio parameter.

  17. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    PubMed

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    A chaotic external-cavity semiconductor laser (ECL) is a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using a physical broadband white chaos generated by optical heterodyning of two ECLs as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectrum efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
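
    A toy Python sketch of the extraction step just described: quantize a waveform with an 8-bit ADC and keep only the least significant bits of each sample. The Gaussian-noise stand-in for the heterodyne chaos, the clipping range and the sample count are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      samples = rng.normal(0.0, 0.2, size=100_000)                        # stand-in for the white chaos waveform
      codes = np.clip(((samples + 1.0) / 2.0 * 255).astype(int), 0, 255)  # 8-bit quantization
      m = 4                                                               # keep 4 LSBs -> 4 bits per sample
      bits = ((codes[:, None] >> np.arange(m)) & 1).ravel()

      # At an 80 GHz sampling rate this extraction yields 4 x 80 = 320 Gbit/s, as reported above.
      print(f"bits per sample: {m}; fraction of ones = {bits.mean():.4f}")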

  18. Entropy Production in Chemical Reactors

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián C.

    2017-06-01

    We have analyzed entropy production in chemically reacting systems and extended previous results to the two limiting cases of ideal reactors, namely the continuous stirred tank reactor (CSTR) and the plug flow reactor (PFR). We have found upper and lower bounds for the entropy production in isothermal systems, given expressions for non-isothermal operation, and analyzed the influence of pressure and temperature on entropy generation minimization in reactors with a fixed volume and production. We also give a graphical picture of entropy production in chemical reactions subject to constant volume, which allows us to easily assess different options. We show that by dividing a reactor into two smaller ones, operating at different temperatures, the entropy production is lowered, by nearly 48% in the case of a CSTR and PFR in series, and by as much as 58% with two CSTRs. Finally, we study the optimal pressure and temperature for a single isothermal PFR, taking into account the irreversibility introduced by a compressor and a heat exchanger, decreasing the entropy generation by as much as 30%.

  19. Effect of entropy change of lithium intercalation in cathodes and anodes on Li-ion battery thermal management

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vilayanur V.; Choi, Daiwon; Wang, Donghai; Xu, Wu; Towne, Silas; Williford, Ralph E.; Zhang, Ji-Guang; Liu, Jun; Yang, Zhenguo

    The entropy changes (ΔS) in various cathode and anode materials, as well as in complete Li-ion batteries, were measured using an electrochemical thermodynamic measurement system (ETMS). LiCoO2 has a much larger entropy change than electrodes based on LiNixCoyMnzO2 and LiFePO4, while lithium titanate based anodes have lower entropy change compared to graphite anodes. The reversible heat generation rate was found to be a significant portion of the total heat generation rate. The appropriate combinations of cathode and anode were investigated to minimize reversible heat generation rate across the 0-100% state of charge (SOC) range. In addition to screening for battery electrode materials with low reversible heat, the techniques described in this paper can be a useful engineering tool for battery thermal management in stationary and transportation applications.

  20. Entropy generation minimization for the sloshing phenomenon in half-full elliptical storage tanks

    NASA Astrophysics Data System (ADS)

    Saghi, Hassan

    2018-02-01

    In this paper, the entropy generation in the sloshing phenomenon was obtained for elliptical storage tanks and the optimum tank geometry was suggested. To do this, a numerical model was developed to simulate the sloshing phenomenon using a coupled Reynolds-Averaged Navier-Stokes (RANS) solver and the Volume-of-Fluid (VOF) method. The RANS equations were discretized and solved using the staggered grid finite difference and SMAC methods, and available data were used for model validation. Several parameters, consisting of the maximum free surface displacement (MFSD), the maximum horizontal force exerted on the tank perimeter (MHF), the tank perimeter (TP), and the total entropy generation (Sgen), were introduced as design criteria for elliptical storage tanks. The entropy generation distribution provides designers with useful information about the causes of energy loss. Horizontal periodic sway motions of the form X = a_m sin(ωt) were applied to elliptical storage tanks with different aspect ratios AR, defined as the ratio of the large to the small diameter of the elliptical tank, and the effect of a_m and ω on the results was studied. The results show that the relation between MFSD and MHF is almost linear with respect to the sway motion amplitude, and that an increase in AR causes a decrease in MFSD and MHF. The relation between MFSD and MHF is nonlinear with respect to the sway motion angular frequency, but an increase in AR causes this relation to become linear. In addition, MFSD and MHF were minimized for a sway motion with a 7 rad/s angular frequency. Finally, the results show that an elliptical storage tank with AR = 1.2-1.4 is the optimum section.

  1. Coarse-graining errors and numerical optimization using a relative entropy framework

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2011-03-01

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, Srel, that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework.
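
    A toy Python illustration of the relative-entropy objective described above: fit a one-parameter coarse-grained Boltzmann distribution to a fine-grained reference by minimizing S_rel = sum_i p_ref(i) ln[p_ref(i)/p_cg(i)]. The double-well reference, the harmonic coarse-grained model, and the discretization are assumptions chosen only to keep the example self-contained.

      import numpy as np
      from scipy.optimize import minimize_scalar

      x = np.linspace(-3.0, 3.0, 601)
      beta = 1.0
      u_ref = (x**2 - 1.0)**2                       # double-well reference potential (assumed)
      p_ref = np.exp(-beta * u_ref)
      p_ref /= p_ref.sum()

      def s_rel(k):
          """Relative entropy of a harmonic CG model u_cg = 0.5*k*x^2 with respect to the reference."""
          p_cg = np.exp(-beta * 0.5 * k * x**2)
          p_cg /= p_cg.sum()
          return np.sum(p_ref * np.log(p_ref / p_cg))

      res = minimize_scalar(s_rel, bounds=(0.01, 50.0), method="bounded")
      print(f"optimal CG spring constant k ≈ {res.x:.2f}, S_rel ≈ {res.fun:.3f}")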

  2. An efficient algorithm for automatic phase correction of NMR spectra based on entropy minimization

    NASA Astrophysics Data System (ADS)

    Chen, Li; Weng, Zhiqiang; Goh, LaiYoong; Garland, Marc

    2002-09-01

    A new algorithm for automatic phase correction of NMR spectra based on entropy minimization is proposed. The optimal zero-order and first-order phase corrections for an NMR spectrum are determined by minimizing entropy. The objective function is constructed using a Shannon-type information entropy measure, with the entropy defined over the normalized derivative of the NMR spectral data. The algorithm has been successfully applied to experimental 1H NMR spectra. The results of automatic phase correction are found to be comparable to, or perhaps better than, manual phase correction. The advantages of this automatic phase correction algorithm include its simple mathematical basis and the straightforward, reproducible, and efficient optimization procedure. The algorithm is implemented in the Matlab program ACME (Automated phase Correction based on Minimization of Entropy).
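
    A compact Python sketch of the same idea: apply zero- and first-order phase terms to a complex spectrum and minimize a Shannon-type entropy of the normalized absolute first derivative of the real part. The synthetic two-peak spectrum, the optimizer choice, and the omission of ACME's negative-peak penalty are simplifications assumed here.

      import numpy as np
      from scipy.optimize import minimize

      def apply_phase(spectrum, ph0, ph1):
          """Zero-order (ph0) and first-order (ph1) phase correction, both in radians."""
          n = len(spectrum)
          return spectrum * np.exp(1j * (ph0 + ph1 * np.arange(n) / n))

      def derivative_entropy(params, spectrum):
          """Shannon-type entropy of the normalized |first derivative| of the real part."""
          real = apply_phase(spectrum, *params).real
          h = np.abs(np.diff(real))
          h = h / (h.sum() + 1e-12)
          h = h[h > 0]
          return -np.sum(h * np.log(h))

      # Synthetic absorption spectrum (two Lorentzian peaks) distorted by a known phase error.
      n = 1024
      x = np.arange(n)
      ideal = 1.0 / (1.0 + ((x - 300) / 5.0) ** 2) + 1.0 / (1.0 + ((x - 700) / 8.0) ** 2)
      distorted = apply_phase(ideal.astype(complex), 0.7, -1.1)

      res = minimize(derivative_entropy, x0=[0.0, 0.0], args=(distorted,), method="Nelder-Mead")
      # Expect corrections near (-0.7, +1.1); ACME adds a penalty term to resolve the
      # sign/pi ambiguity that a bare entropy objective can leave open.
      print("recovered (ph0, ph1):", np.round(res.x, 3))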

  3. Minimal entropy approximation for cellular automata

    NASA Astrophysics Data System (ADS)

    Fukś, Henryk

    2014-02-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim.

  4. Coarse-graining errors and numerical optimization using a relative entropy framework.

    PubMed

    Chaimovich, Aviel; Shell, M Scott

    2011-03-07

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, S(rel), that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework. © 2011 American Institute of Physics.

  5. Cave spiders choose optimal environmental factors with respect to the generated entropy when laying their cocoon

    PubMed Central

    Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria

    2015-01-01

    For spiders, the choice of a suitable area in which to lay eggs is promoted in terms of Darwinian fitness. Despite its importance, the factors underlying this key decision are generally poorly understood. Here, we designed a multidisciplinary study, based both on in-field data and on laboratory experiments, focusing on the European cave spider Meta menardi (Araneae, Tetragnathidae) and aiming at understanding the selective forces driving the female in the choice of the depositional area. Our in-field data analysis demonstrated a major role of air velocity and of distance from the cave entrance within a particular cave in driving the female's choice. This has been interpreted using a model based on the Entropy Generation Minimization (EGM) method, without invoking best-fit parameters and relying on independent laboratory experiments, thus demonstrating that the female chooses the depositional area according to a minimal level of thermo-fluid-dynamic irreversibility. This methodology may pave the way to a novel approach for understanding the evolutionary strategies of other living organisms. PMID:25556697

  6. Conserved charges of minimal massive gravity coupled to scalar field

    NASA Astrophysics Data System (ADS)

    Setare, M. R.; Adami, H.

    2018-02-01

    Recently, the theory of topologically massive gravity non-minimally coupled to a scalar field was proposed; it comes from the Lorentz-Chern-Simons theory (JHEP 06, 113, 2015) and is torsion-free. We extend this theory by adding an extra term which makes the torsion non-zero. We show that the BTZ spacetime is a particular solution of this theory in the case where the scalar field is constant. The quasi-local conserved charge is defined through the concept of the generalized off-shell ADT current. A general formula is also found for the entropy of stationary black hole solutions in the context of the considered theory. The obtained formulas are applied to the BTZ black hole solution in order to obtain its energy, angular momentum and entropy. The central extension term, the central charges and the eigenvalues of the Virasoro algebra generators for the BTZ black hole solution are thus obtained. The energy and the angular momentum of the BTZ black hole are then calculated using the eigenvalues of the Virasoro algebra generators, and, using the Cardy formula, its entropy is found. The results obtained in these two different ways match exactly, as expected.

  7. An Error-Entropy Minimization Algorithm for Tracking Control of Nonlinear Stochastic Systems with Non-Gaussian Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yunlong; Wang, Aiping; Guo, Lei

    This paper presents an error-entropy minimization tracking control algorithm for a class of dynamic stochastic systems. The system is represented by a set of time-varying discrete nonlinear equations with non-Gaussian stochastic input, where the statistical properties of the stochastic input are unknown. By using Parzen windowing with a Gaussian kernel to estimate the probability densities of the errors, recursive algorithms are then proposed to design the controller such that the tracking error can be minimized. The performance of the error-entropy minimization criterion is compared with mean-square-error minimization in the simulation results.
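
    The Parzen-window step can be made concrete with the standard information-theoretic-learning estimator of Renyi's quadratic error entropy, which has a closed form for a Gaussian kernel. Whether the cited paper uses exactly this estimator is an assumption, and the kernel width and sample errors below are illustrative only.

      import numpy as np

      def quadratic_error_entropy(errors, sigma=0.1):
          """Renyi quadratic entropy estimate H2 = -log V, where the information potential
          V = (1/N^2) * sum_ij G(e_i - e_j; 2*sigma^2) comes from a Gaussian Parzen window."""
          e = np.asarray(errors, dtype=float)
          diff = e[:, None] - e[None, :]
          kernel = np.exp(-diff**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
          return -np.log(kernel.mean())

      rng = np.random.default_rng(1)
      loose_errors = rng.normal(0.0, 0.5, 500)    # a poorly tracking controller (assumed data)
      tight_errors = rng.normal(0.0, 0.05, 500)   # a well-tuned controller (assumed data)
      # A controller that concentrates the error density has lower error entropy.
      print(quadratic_error_entropy(loose_errors) > quadratic_error_entropy(tight_errors))  # True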

  8. MO-FG-204-01: Improved Noise Suppression for Dual-Energy CT Through Entropy Minimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrongolo, M; Zhu, L

    2015-06-15

    Purpose: In dual energy CT (DECT), noise amplification during signal decomposition significantly limits the utility of basis material images. Since clinically relevant objects contain a limited number of materials, we propose to suppress noise for DECT based on image entropy minimization. An adaptive weighting scheme is employed during noise suppression to improve decomposition accuracy with limited effect on spatial resolution and image texture preservation. Methods: From decomposed images, we first generate a 2D plot of scattered data points, using basis material densities as coordinates. Data points representing the same material generate a highly asymmetric cluster. We orient an axis by minimizing the entropy in a 1D histogram of these points projected onto the axis. To suppress noise, we replace pixel values of decomposed images with center-of-mass values in the direction perpendicular to the optimal axis. To limit errors due to cluster overlap, we weight each data point’s contribution based on its high and low energy CT values and location within the image. The proposed method’s performance is assessed on physical phantom studies. Electron density is used as the quality metric for decomposition accuracy. Our results are compared to those without noise suppression and with a recently developed iterative method. Results: The proposed method reduces noise standard deviations of the decomposed images by at least one order of magnitude. On the Catphan phantom, this method greatly preserves the spatial resolution and texture of the CT images and limits induced error in measured electron density to below 1.2%. In the head phantom study, the proposed method performs the best in retaining fine, intricate structures. Conclusion: The entropy minimization based algorithm with adaptive weighting substantially reduces DECT noise while preserving image spatial resolution and texture. Future investigations will include extensive investigations on material decomposition accuracy that go beyond the current electron density calculations. This work was supported in part by the National Institutes of Health (NIH) under Grant Number R21 EB012700.

  9. Megahertz-Rate Semi-Device-Independent Quantum Random Number Generators Based on Unambiguous State Discrimination

    NASA Astrophysics Data System (ADS)

    Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas

    2017-05-01

    An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared, and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation, achieving rates of 16.5 Mbits /s . Combining ease of implementation, a high rate, and a real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.

  10. XMM Observations of Low Mass Groups

    NASA Technical Reports Server (NTRS)

    Davis, David S.

    2005-01-01

    This report discusses the two-dimensional XMM-Newton group survey. Azimuthally averaged analysis of the NGC 2300 and Pavo observations indicates that temperature structure is minimal in the NGC 2300 system, whereas the Pavo system shows signs of a merger in progress. XMM data are used to generate two-dimensional maps of temperature and abundance, which in turn are used to generate maps of pressure and entropy.

  11. An entropy-assisted musculoskeletal shoulder model.

    PubMed

    Xu, Xu; Lin, Jia-Hua; McGorry, Raymond W

    2017-04-01

    Optimization combined with a musculoskeletal shoulder model has been used to estimate mechanical loading of musculoskeletal elements around the shoulder. Traditionally, the objective function is to minimize the summation of the total activities of the muscles with forces, moments, and stability constraints. Such an objective function, however, tends to neglect the antagonist muscle co-contraction. In this study, an objective function including an entropy term is proposed to address muscle co-contractions. A musculoskeletal shoulder model is developed to apply the proposed objective function. To find the optimal weight for the entropy term, an experiment was conducted. In the experiment, participants generated various 3-D shoulder moments in six shoulder postures. The surface EMG of 8 shoulder muscles was measured and compared with the predicted muscle activities based on the proposed objective function using Bhattacharyya distance and concordance ratio under different weight of the entropy term. The results show that a small weight of the entropy term can improve the predictability of the model in terms of muscle activities. Such a result suggests that the concept of entropy could be helpful for further understanding the mechanism of muscle co-contractions as well as developing a shoulder biomechanical model with greater validity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The gravity dual of Rényi entropy.

    PubMed

    Dong, Xi

    2016-08-12

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.

  13. The gravity dual of Rényi entropy

    PubMed Central

    Dong, Xi

    2016-01-01

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity. PMID:27515122

  14. Entropy generation of nanofluid flow in a microchannel heat sink

    NASA Astrophysics Data System (ADS)

    Manay, Eyuphan; Akyürek, Eda Feyza; Sahin, Bayram

    2018-06-01

    The present study investigates the effects of the presence of nano-sized TiO2 particles in the base fluid on the entropy generation rate in a microchannel heat sink. Pure water was chosen as the base fluid, and TiO2 particles were suspended in it at five particle volume fractions: 0.25%, 0.5%, 1.0%, 1.5% and 2.0%. Under laminar, steady-state flow and constant heat flux boundary conditions, the thermal, frictional and total entropy generation rates and the entropy generation number ratios of the nanofluids were experimentally analyzed in microchannel flow for channel heights of 200 μm, 300 μm, 400 μm and 500 μm. It was observed that the frictional and total entropy generation rates increased, while the thermal entropy generation rate decreased, with increasing particle volume fraction. In these microchannel flows, thermal entropy generation could be neglected because of its very low rate (smaller than 1.10e-07) relative to the total entropy generation. Larger channel heights caused higher thermal entropy generation rates, and increasing the channel height yielded an increase of 30% to 52% in thermal entropy generation. When the channel height decreased, an increase of 66%-98% in frictional entropy generation was obtained. Adding TiO2 nanoparticles to the base fluid caused thermal entropy generation to decrease by about 1.8%-32.4% and frictional entropy generation to increase by about 3.3%-21.6%.

  15. Entropy uncertainty relations and stability of phase-temporal quantum cryptography with finite-length transmitted strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com

    2012-12-15

    Any key-generation session contains a finite number of quantum-state messages, and it is therefore important to understand the fundamental restrictions imposed on the minimal length of a string required to obtain a secret key of a specified length. The entropy uncertainty relations for smooth min and max entropies considerably simplify and shorten the proof of security. A proof of security of quantum key distribution with phase-temporal encryption is presented. This protocol provides the maximum critical error, compared to other protocols, up to which secure key distribution is guaranteed. In addition, unlike other basic protocols (of the BB84 type), which are vulnerable with respect to an attack by 'blinding' of avalanche photodetectors, this protocol is stable with respect to such an attack and guarantees key security.

  16. Automated EEG entropy measurements in coma, vegetative state/unresponsive wakefulness syndrome and minimally conscious state

    PubMed Central

    Gosseries, Olivia; Schnakers, Caroline; Ledoux, Didier; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurélie; Demertzi, Athéna; Noirhomme, Quentin; Lehembre, Rémy; Damas, Pierre; Goldman, Serge; Peeters, Erika; Moonen, Gustave; Laureys, Steven

    Summary Monitoring the level of consciousness in brain-injured patients with disorders of consciousness is crucial as it provides diagnostic and prognostic information. Behavioral assessment remains the gold standard for assessing consciousness but previous studies have shown a high rate of misdiagnosis. This study aimed to investigate the usefulness of electroencephalography (EEG) entropy measurements in differentiating unconscious (coma or vegetative) from minimally conscious patients. Left fronto-temporal EEG recordings (10-minute resting state epochs) were prospectively obtained in 56 patients and 16 age-matched healthy volunteers. Patients were assessed in the acute (≤1 month post-injury; n=29) or chronic (>1 month post-injury; n=27) stage. The etiology was traumatic in 23 patients. Automated online EEG entropy calculations (providing an arbitrary value ranging from 0 to 91) were compared with behavioral assessments (Coma Recovery Scale-Revised) and outcome. EEG entropy correlated with Coma Recovery Scale total scores (r=0.49). Mean EEG entropy values were higher in minimally conscious (73±19; mean and standard deviation) than in vegetative/unresponsive wakefulness syndrome patients (45±28). Receiver operating characteristic analysis revealed an entropy cut-off value of 52 differentiating acute unconscious from minimally conscious patients (sensitivity 89% and specificity 90%). In chronic patients, entropy measurements offered no reliable diagnostic information. EEG entropy measurements did not allow prediction of outcome. User-independent time-frequency balanced spectral EEG entropy measurements seem to constitute an interesting diagnostic – albeit not prognostic – tool for assessing neural network complexity in disorders of consciousness in the acute setting. Future studies are needed before using this tool in routine clinical practice, and these should seek to improve automated EEG quantification paradigms in order to reduce the remaining false negative and false positive findings. PMID:21693085

  17. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle

    NASA Astrophysics Data System (ADS)

    Marín, Dolores; Sabater, Bartolomé

    2017-04-01

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole linearly varies with the fraction (x) of glucose consumed by fermentation that is frequently estimated around 0.9. Hence, calculation shows that, in respect to pure respiration, the predominant fermentative metabolism decreases around 10% the production of entropy per mole of glucose consumed in cancer cells. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize the rates of entropy production that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.
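
    The linear-mixing argument in the abstract can be restated symbolically; a LaTeX sketch follows, with the molar entropy changes of the two pathways left as unspecified symbols because the abstract does not quote them:

      \[
        \Delta S(x) = x\,\Delta S_{\mathrm{ferm}} + (1 - x)\,\Delta S_{\mathrm{resp}},
        \qquad 0 \le x \le 1,
      \]
      \[
        \frac{\Delta S(x) - \Delta S_{\mathrm{resp}}}{\Delta S_{\mathrm{resp}}}
        = x \left( \frac{\Delta S_{\mathrm{ferm}}}{\Delta S_{\mathrm{resp}}} - 1 \right)
        \approx -0.10
        \quad \text{for } x \approx 0.9 .
      \]

    On these assumptions, the roughly 10% reduction reported for x ≈ 0.9 corresponds to the fermentative pathway producing about 11% less entropy per mole of glucose than respiration.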

  18. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle.

    PubMed

    Marín, Dolores; Sabater, Bartolomé

    2017-04-28

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole linearly varies with the fraction (x) of glucose consumed by fermentation that is frequently estimated around 0.9. Hence, calculation shows that, in respect to pure respiration, the predominant fermentative metabolism decreases around 10% the production of entropy per mole of glucose consumed in cancer cells. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize the rates of entropy production that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO 2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.

  19. Minimization of the Renyi entropy production in the space-partitioning process.

    PubMed

    Cybulski, O; Babin, V; Hołyst, R

    2005-04-01

    The spontaneous division of space in Fleming-Viot processes is studied in terms of non-extensive thermodynamics. We analyze a system of n different types of Brownian particles confined in a box. Particles of different types annihilate each other when they come into close contact. Each annihilation event is accompanied by the simultaneous nucleation of a particle of the same type, so that the number of particles of each component remains constant. The system eventually reaches a stationary state in which the available space is divided into n separate subregions, each occupied by particles of one type. Within each subregion, the particle density distribution minimizes the Renyi entropy production. We show that the sum of these entropy productions in the stationary state is also minimized, i.e., the resulting boundaries between different components adopt a configuration which minimizes the total entropy production. The evolution of the system decreases the total entropy production monotonically in time, irrespective of the initial conditions. In some circumstances the stationary state is not unique: the entropy production may have several local minima for different configurations. In the case of a rectangular box, the existence and stability of different stationary states are studied as a function of the aspect ratio of the rectangle.

  20. The gravity dual of Rényi entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Xi

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.

  1. The gravity dual of Rényi entropy

    DOE PAGES

    Dong, Xi

    2016-08-12

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.

  2. The Holographic Entropy Cone

    DOE PAGES

    Bao, Ning; Nezami, Sepehr; Ooguri, Hirosi; ...

    2015-09-21

    We initiate a systematic enumeration and classification of entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries. For 2, 3, and 4 regions, we prove that the strong subadditivity and the monogamy of mutual information give the complete set of inequalities. This is in contrast to the situation for generic quantum systems, where a complete set of entropy inequalities is not known for 4 or more regions. We also find an infinite new family of inequalities applicable to 5 or more regions. The set of all holographic entropy inequalities bounds the phase space of Ryu-Takayanagi entropies, defining the holographic entropy cone. We characterize this entropy cone by reducing geometries to minimal graph models that encode the possible cutting and gluing relations of minimal surfaces. We find that, for a fixed number of regions, there are only finitely many independent entropy inequalities. To establish new holographic entropy inequalities, we introduce a combinatorial proof technique that may also be of independent interest in Riemannian geometry and graph theory.

  3. Most energetic passive states.

    PubMed

    Perarnau-Llobet, Martí; Hovhannisyan, Karen V; Huber, Marcus; Skrzypczyk, Paul; Tura, Jordi; Acín, Antonio

    2015-10-01

    Passive states are defined as those states that do not allow for work extraction in a cyclic (unitary) process. Within the set of passive states, thermal states are the most stable ones: they maximize the entropy for a given energy, and similarly they minimize the energy for a given entropy. Here we find the passive states lying in the other extreme, i.e., those that maximize the energy for a given entropy, which we show also minimize the entropy when the energy is fixed. These extremal properties make these states useful to obtain fundamental bounds for the thermodynamics of finite-dimensional quantum systems, which we show in several scenarios.
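
    As a compact reminder of the definition used above, a state ρ with Hamiltonian H is passive when no cyclic unitary process can extract work from it; in a LaTeX sketch:

      \mathrm{Tr}\!\left(U \rho\, U^{\dagger} H\right) \;\ge\; \mathrm{Tr}\!\left(\rho H\right)
      \quad \text{for every unitary } U,

    which is equivalent to ρ being diagonal in the energy eigenbasis with populations that do not increase with energy.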

  4. Holographic Entanglement Entropy, SUSY & Calibrations

    NASA Astrophysics Data System (ADS)

    Colgáin, Eoin Ó.

    2018-01-01

    Holographic calculations of entanglement entropy boil down to identifying minimal surfaces in curved spacetimes. This generically entails solving second-order equations. For higher-dimensional AdS geometries, we demonstrate that supersymmetry and calibrations reduce the problem to first-order equations. We note that minimal surfaces corresponding to disks preserve supersymmetry, whereas strips do not.

  5. Theory and Normal Mode Analysis of Change in Protein Vibrational Dynamics on Ligand Binding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mortisugu, Kei; Njunda, Brigitte; Smith, Jeremy C

    2009-12-01

    The change of protein vibrations on ligand binding is of functional and thermodynamic importance. Here, this process is characterized using a simple analytical 'ball-and-spring' model and all-atom normal-mode analysis (NMA) of the binding of the cancer drug, methotrexate (MTX) to its target, dihydrofolate reductase (DHFR). The analytical model predicts that the coupling between protein vibrations and ligand external motion generates entropy-rich, low-frequency vibrations in the complex. This is consistent with the atomistic NMA which reveals vibrational softening in forming the DHFR-MTX complex, a result also in qualitative agreement with neutron-scattering experiments. Energy minimization of the atomistic bound-state (B) structure while gradually decreasing the ligand interaction to zero allows the generation of a hypothetical 'intermediate' (I) state, without the ligand force field but with a structure similar to that of B. In going from I to B, it is found that the vibrational entropies of both the protein and MTX decrease while the complex structure becomes enthalpically stabilized. However, the relatively weak DHFR:MTX interaction energy results in the net entropy gain arising from coupling between the protein and MTX external motion being larger than the loss of vibrational entropy on complex formation. This, together with the I structure being more flexible than the unbound structure, results in the observed vibrational softening on ligand binding.

  6. Entropy and equilibrium via games of complexity

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
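
    For readers unfamiliar with the two generalized entropies mentioned, their standard discrete forms (both reduce to the Shannon entropy in the limits q → 1 and κ → 0) are, in a LaTeX sketch:

      S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
      \qquad
      S_\kappa = -\sum_i p_i\,\frac{p_i^{\kappa} - p_i^{-\kappa}}{2\kappa}.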

  7. On Entropy Trail

    NASA Astrophysics Data System (ADS)

    Farokhi, Saeed; Taghavi, Ray; Keshmiri, Shawn

    2015-11-01

    Stealth technology is developed for military aircraft to minimize their signatures. The primary attention was focused on the radar signature, followed by the thermal and noise signatures of the vehicle. For radar evasion, advanced configuration designs and extensive use of carbon composites and radar-absorbing materials have been developed. For the thermal signature, mainly in the infra-red (IR) bandwidth, the solution was found in blended rectangular nozzles of high aspect ratio that are shielded from ground detectors. For noise, quiet and calm jets are integrated into vehicles with low-turbulence configuration design. However, these technologies are incapable of detecting the new generation of revolutionary aircraft, which will use all-electric, distributed propulsion systems that are thermally transparent. In addition, composite skins and non-emitting sensors onboard the aircraft will lead to low signatures. However, based on the second law of thermodynamics, there is no air vehicle that can escape leaving an entropy trail. Entropy is thus the only inevitable signature of any system and, once measured, can reveal the source. By characterizing the entropy field based on its statistical properties, the source may be recognized, akin to face recognition technology. Direct measurement of entropy is cumbersome; however, as a derived property, it can be easily obtained. The measurement accuracy depends on the probe design and the sensors onboard. One novel air data sensor suite is introduced with promising potential to capture the entropy trail.

  8. Entropy of Vaidya Black Hole on Apparent Horizon with Minimal Length Revisited

    NASA Astrophysics Data System (ADS)

    Tang, Hao; Wu, Bin; Sun, Cheng-yi; Song, Yu; Yue, Rui-hong

    2018-03-01

    By considering the generalized uncertainty principle, the degrees of freedom near the apparent horizon of Vaidya black hole are calculated with the thin film model. The result shows that a cut-off can be introduced naturally rather than taking by hand. Furthermore, if the minimal length is chosen to be a specific value, the statistical entropy will satisfy the conventional area law at the horizon, which might reveal some deep things of the minimal length.

  9. Entropy of Vaidya Black Hole on Apparent Horizon with Minimal Length Revisited

    NASA Astrophysics Data System (ADS)

    Tang, Hao; Wu, Bin; Sun, Cheng-yi; Song, Yu; Yue, Rui-hong

    2018-07-01

    By considering the generalized uncertainty principle, the degrees of freedom near the apparent horizon of Vaidya black hole are calculated with the thin film model. The result shows that a cut-off can be introduced naturally rather than taking by hand. Furthermore, if the minimal length is chosen to be a specific value, the statistical entropy will satisfy the conventional area law at the horizon, which might reveal some deep things of the minimal length.

  10. Joint Entropy Minimization for Learning in Nonparametric Framework

    DTIC Science & Technology

    2006-06-09


  11. Minimization of the Renyi entropy production in the stationary states of the Brownian process with matched death and birth rates.

    PubMed

    Cybulski, Olgierd; Babin, Volodymyr; Hołyst, Robert

    2004-01-01

    We analyze the Fleming-Viot process. The system is confined in a box, whose boundaries act as a sink of Brownian particles. The death rate at the boundaries is matched by the branching (birth) rate in the system and thus the number of particles is kept constant. We show that such a process is described by the Renyi entropy whose production is minimized in the stationary state. The entropy production in this process is a monotonically decreasing function of time irrespective of the initial conditions. The first Laplacian eigenvalue is shown to be equal to the Renyi entropy production in the stationary state. As an example we simulate the process in a two-dimensional box.
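
    The Fleming-Viot dynamics described above is easy to prototype; the Python sketch below (an illustrative toy, not the authors' code) evolves Brownian walkers in a two-dimensional box and, whenever a walker exits through the absorbing boundary, replaces it with a copy of a randomly chosen survivor so the particle number stays constant.

      import numpy as np

      def fleming_viot(n_particles=500, n_steps=10_000, dt=1e-4, box=1.0, seed=0):
          """Toy Fleming-Viot process in a 2D box with absorbing walls."""
          rng = np.random.default_rng(seed)
          x = rng.uniform(0.0, box, size=(n_particles, 2))  # initial positions
          for _ in range(n_steps):
              x += np.sqrt(2.0 * dt) * rng.standard_normal(x.shape)  # Brownian step
              dead = np.any((x < 0.0) | (x > box), axis=1)           # walkers that left the box
              if dead.any():
                  alive = np.flatnonzero(~dead)
                  # each dead walker is replaced by a copy of a random survivor (branching)
                  x[dead] = x[rng.choice(alive, size=dead.sum())]
          return x

      if __name__ == "__main__":
          positions = fleming_viot()
          # stationary histogram; it should approach the first Laplacian eigenmode of the box
          hist, _, _ = np.histogram2d(positions[:, 0], positions[:, 1], bins=20, density=True)
          print(hist.round(2))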

  12. Asymptotically spacelike warped anti-de Sitter spacetimes in generalized minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Setare, M. R.; Adami, H.

    2017-06-01

    In this paper we show that the warped AdS3 black hole spacetime is a solution of generalized minimal massive gravity (GMMG) and introduce suitable boundary conditions for asymptotically warped AdS3 spacetimes. Then we find the Killing vector fields such that transformations generated by them preserve the considered boundary conditions. We calculate the conserved charges which correspond to the obtained Killing vector fields and show that the algebra of the asymptotic conserved charges is given as the semidirect product of the Virasoro algebra with a U(1) current algebra. We use a particular Sugawara construction to reconstruct the conformal algebra. Thus, we are allowed to use the Cardy formula to calculate the entropy of the warped black hole. We demonstrate that the gravitational entropy of the warped black hole exactly coincides with what we obtain via Cardy's formula. As we expect, the warped Cardy formula also gives exactly the same result as the usual Cardy formula. We calculate the mass and angular momentum of the warped black hole and then check that the obtained mass, angular momentum and entropy satisfy the first law of black hole mechanics. According to the results of this paper, we believe that the dual theory of the warped AdS3 black hole solution of GMMG is a warped CFT.

  13. Transient Dissipation and Structural Costs of Physical Information Transduction

    NASA Astrophysics Data System (ADS)

    Boyd, Alexander B.; Mandal, Dibyendu; Riechers, Paul M.; Crutchfield, James P.

    2017-06-01

    A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the information-processing second law (IPSL): the physical entropy of the Universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? We identify a minimal, and thus inescapable, transient dissipation of physical information processing, which is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. A component of transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that "retrodictive" generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.

  14. Entropy generation in magnetohydrodynamic radiative flow due to rotating disk in presence of viscous dissipation and Joule heating

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Qayyum, Sumaira; Khan, Muhammad Ijaz; Alsaedi, Ahmed

    2018-01-01

    Simultaneous effects of viscous dissipation and Joule heating in flow by a rotating disk of variable thickness are examined. Radiative flow saturating porous space is considered. Much attention is given to the entropy generation outcome. The developed nonlinear ordinary differential systems are computed for convergent series solutions. Specifically, the results of velocity, temperature, entropy generation, Bejan number, coefficient of skin friction, and local Nusselt number are discussed. Clearly the entropy generation rate depends on velocity and temperature distributions. Moreover, the entropy generation rate is a decreasing function of the Hartmann number, Eckert number, and Reynolds number, while the Bejan number shows the opposite behavior.
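
    Although the precise expression depends on the modelling above, the volumetric entropy generation in such magnetohydrodynamic convection problems is typically assembled from heat-conduction, viscous-dissipation and Joule-heating contributions, and the Bejan number measures the heat-transfer share; a generic LaTeX sketch (not the paper's specific form, with Φ the viscous dissipation function, σ the electrical conductivity and B_0 the applied field):

      S_{\mathrm{gen}}''' =
        \underbrace{\frac{k}{T^2}\,(\nabla T)^2}_{\text{heat transfer}}
        + \underbrace{\frac{\mu}{T}\,\Phi}_{\text{viscous dissipation}}
        + \underbrace{\frac{\sigma B_0^2 u^2}{T}}_{\text{Joule heating}},
      \qquad
      \mathrm{Be} = \frac{(k/T^2)(\nabla T)^2}{S_{\mathrm{gen}}'''}.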

  15. Entropy Generation in Regenerative Systems

    NASA Technical Reports Server (NTRS)

    Kittel, Peter

    1995-01-01

    Heat exchange to the oscillating flows in regenerative coolers generates entropy. These flows are characterized by oscillating mass flows and oscillating temperatures. Heat is transferred between the flow and heat exchangers and regenerators. In the former case, there is a steady temperature difference between the flow and the heat exchangers. In the latter case, there is no mean temperature difference. In this paper a mathematical model of the entropy generated is developed for both cases. Estimates of the entropy generated by this process are given for oscillating flows in heat exchangers and in regenerators. The practical significance of this entropy is also discussed.

  16. Large Eddy Simulation of Entropy Generation in a Turbulent Mixing Layer

    NASA Astrophysics Data System (ADS)

    Sheikhi, Reza H.; Safari, Mehdi; Hadi, Fatemeh

    2013-11-01

    The entropy transport equation is considered in large eddy simulation (LES) of turbulent flows. The irreversible entropy generation in this equation provides a more general description of subgrid scale (SGS) dissipation due to heat conduction, mass diffusion and viscosity effects. A new methodology is developed, termed the entropy filtered density function (En-FDF), to account for all individual entropy generation effects in turbulent flows. The En-FDF represents the joint probability density function of entropy, frequency, velocity and scalar fields within the SGS. An exact transport equation is developed for the En-FDF, which is modeled by a system of stochastic differential equations, incorporating the second law of thermodynamics. The modeled En-FDF transport equation is solved by a Lagrangian Monte Carlo method. The methodology is employed to simulate a turbulent mixing layer involving transport of passive scalars and entropy. Various modes of entropy generation are obtained from the En-FDF and analyzed. Predictions are assessed against data generated by direct numerical simulation (DNS). The En-FDF predictions are in good agreement with the DNS data.

  17. Entropy of a (1+1)-dimensional charged black hole to all orders in the Planck length

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Wan; Park, Young-Jai

    2013-02-01

    We study the statistical entropy of a scalar field on the (1+1)-dimensional Maxwell-dilaton background without an artificial cutoff by considering corrections to all orders in the Planck length obtained from a generalized uncertainty principle applied to the quantum state density. In contrast to the previous results for d ≥ 3 dimensional cases, we obtain an unadjustable entropy due to the independence of the minimal length, which plays the role of an adjustable parameter. However, this entropy is still proportional to the Bekenstein-Hawking entropy.

  18. Multipass Target Search in Natural Environments

    PubMed Central

    Otte, Michael W.; Sofge, Donald; Gupta, Satyandra K.

    2017-01-01

    Consider a disaster scenario where search and rescue workers must search difficult-to-access buildings during an earthquake or flood. Often, finding survivors a few hours sooner results in a dramatic increase in saved lives, suggesting the use of drones for expedient rescue operations. Entropy can be used to quantify the generation and resolution of uncertainty. When searching for targets, maximizing mutual information of future sensor observations will minimize expected target location uncertainty by minimizing the entropy of the future estimate. Motion planning for multi-target autonomous search requires planning over an area with an imperfect sensor and may require multiple passes, which is hindered by the submodularity property of mutual information. Further, mission duration constraints must be handled accordingly: the planner must consider the vehicle's dynamics to generate feasible trajectories and must plan trajectories spanning the entire mission duration, something most information-gathering algorithms are incapable of doing. If unanticipated changes occur in an uncertain environment, new plans must be generated quickly. In addition, planning multipass trajectories requires evaluating path-dependent rewards, requiring planning in the space of all previously selected actions, compounding the problem. We present an anytime algorithm for autonomous multipass target search in natural environments. The algorithm is capable of generating long-duration, dynamically feasible multipass coverage plans that maximize mutual information using a variety of techniques such as ϵ-admissible heuristics to speed up the search. To the authors' knowledge, this is the first attempt at efficiently solving multipass target search problems of such long duration. The proposed algorithm is based on best-first branch and bound and is benchmarked against state-of-the-art algorithms adapted to the problem in natural Simplex environments, gathering the most information in the given search time. PMID:29099087

  19. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in the transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors, in the case of DNA where N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate the definition of an elementary distance function which is easier and faster to calculate, works on non-rich vectors, does not involve variational theory and does not involve differential equations, but is a better approximation of the minimal entropy path distance than the distance ||b - a||_2. We compute minimal entropy distance matrices for examples of DNA myostatin genes and amino-acid sequences across several species. Output tree dendrograms for our minimal entropy metric are compared with dendrograms based on BLAST and BLAST identity scores.
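
    To make the construction concrete, the Python sketch below (illustrative only; the paper itself provides Matlab code for the full variational solution) evaluates the entropy path integral along the straight-line segment between two probability vectors, which gives a simple upper bound on the minimal entropy path distance in the spirit of the elementary distance mentioned above.

      import numpy as np

      def shannon_entropy(p, eps=1e-12):
          """Shannon entropy H(p) of a probability vector (natural log)."""
          p = np.clip(p, eps, None)
          return float(-np.sum(p * np.log(p)))

      def straight_line_entropy_length(a, b, n_points=1000):
          """Approximate the integral of H(p(t)) ||dp/dt|| dt along the straight path
          p(t) = (1 - t) a + t b, t in [0, 1]; an upper bound on the infimum over
          all admissible paths that defines the minimal entropy distance."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          ts = np.linspace(0.0, 1.0, n_points)
          speed = np.linalg.norm(b - a)                      # |dp/dt| is constant on a straight path
          values = [shannon_entropy((1.0 - t) * a + t * b) for t in ts]
          return float(np.mean(values) * speed)              # simple quadrature over t in [0, 1]

      if __name__ == "__main__":
          # frequency profiles over the DNA bases A, C, G, T
          a = np.array([0.25, 0.25, 0.25, 0.25])
          b = np.array([0.40, 0.10, 0.30, 0.20])
          print(straight_line_entropy_length(a, b))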

  20. Cascade control of superheated steam temperature with neuro-PID controller.

    PubMed

    Zhang, Jianhua; Zhang, Fenfang; Ren, Mifeng; Hou, Guolian; Fang, Fang

    2012-11-01

    In this paper, an improved cascade control methodology for superheated processes is developed, in which the primary PID controller is implemented by neural networks trained by minimizing an error entropy criterion. The entropy of the tracking error can be estimated recursively by utilizing a receding horizon window technique. The measurable disturbances in superheated processes are input to the neuro-PID controller in addition to the sequences of tracking error in the outer loop control system; hence, feedback control is combined with feedforward control in the proposed neuro-PID controller. The convergence condition of the neural networks is analyzed. The implementation procedures of the proposed cascade control approach are summarized. Compared with a neuro-PID controller using the minimum squared error criterion, the proposed neuro-PID controller using the minimum error entropy criterion may decrease fluctuations of the superheated steam temperature. A simulation example shows the advantages of the proposed method. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
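
    The error-entropy criterion referred to above is commonly implemented with a Parzen (kernel) density estimate over a window of recent tracking errors, where minimizing Renyi's quadratic entropy amounts to maximizing the so-called information potential. A minimal Python sketch of such an estimator (a generic illustration, not the authors' recursive receding-horizon formulation):

      import numpy as np

      def quadratic_error_entropy(errors, sigma=0.5):
          """Renyi quadratic entropy estimate of a window of tracking errors.

          The information potential is the mean Gaussian kernel over all pairwise
          error differences; the entropy estimate is its negative logarithm.
          Training a controller to minimize this quantity concentrates the errors."""
          e = np.asarray(errors, float)
          diff = e[:, None] - e[None, :]                    # all pairwise differences
          kernel = np.exp(-diff**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
          information_potential = kernel.mean()
          return -np.log(information_potential)

      if __name__ == "__main__":
          window = np.random.default_rng(1).normal(0.0, 0.3, size=50)  # recent tracking errors
          print(quadratic_error_entropy(window))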

  1. Recommendations and illustrations for the evaluation of photonic random number generators

    NASA Astrophysics Data System (ADS)

    Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi

    2017-09-01

    The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.
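
    The Cohen-Procaccia idea can be illustrated in a simplified form: coarse-grain the signal with resolution ε, sample it every τ steps, and compare the Shannon entropies of symbol blocks of adjacent lengths. The Python sketch below is a naive plug-in version of that construction (for illustration; it ignores the finite-sample corrections a careful estimate requires and is not the estimator used in the paper).

      import numpy as np
      from collections import Counter

      def block_entropy_rate(signal, eps, tau, d=4):
          """Crude (eps, tau) entropy-rate estimate in bits per sample.

          Quantizes the signal into bins of width eps, subsamples every tau points,
          and returns H(d) - H(d-1), where H(m) is the plug-in entropy of length-m
          symbol blocks."""
          symbols = np.floor(np.asarray(signal, float)[::tau] / eps).astype(int)

          def block_H(length):
              blocks = [tuple(symbols[i:i + length]) for i in range(len(symbols) - length + 1)]
              counts = np.array(list(Counter(blocks).values()), float)
              p = counts / counts.sum()
              return float(-np.sum(p * np.log2(p)))

          return block_H(d) - block_H(d - 1)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          noisy = np.cumsum(rng.normal(size=20_000)) + rng.normal(scale=0.1, size=20_000)
          print(block_entropy_rate(noisy, eps=0.5, tau=2, d=3))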

  2. Harvesting Entropy for Random Number Generation for Internet of Things Constrained Devices Using On-Board Sensors

    PubMed Central

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-01-01

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things. PMID:26506357
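
    The least-significant-bit concatenation method described above is simple to prototype in software; the Python sketch below (with simulated readings standing in for real sensor data, not the authors' firmware) keeps the lowest bit of each raw sample, packs the bits into bytes and reports the per-byte Shannon entropy.

      import numpy as np

      def harvest_lsb_entropy(samples, bits_per_sample=1):
          """Concatenate the least-significant bits of integer sensor samples and
          return (packed bytes, Shannon entropy in bits per byte)."""
          samples = np.asarray(samples, dtype=np.uint64)
          bits = [((samples >> b) & 1).astype(np.uint8) for b in range(bits_per_sample)]
          bitstream = np.concatenate(bits)                  # keep only the noisiest, lowest bits
          usable = (len(bitstream) // 8) * 8
          packed = np.packbits(bitstream[:usable])
          counts = np.bincount(packed, minlength=256).astype(float)
          p = counts[counts > 0] / counts.sum()
          return packed, float(-np.sum(p * np.log2(p)))

      if __name__ == "__main__":
          # stand-in for raw temperature/humidity ADC readings
          raw = np.random.default_rng(7).integers(0, 2**12, size=80_000)
          _, h = harvest_lsb_entropy(raw, bits_per_sample=1)
          print(f"estimated entropy: {h:.2f} bits per byte")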

  3. Harvesting entropy for random number generation for internet of things constrained devices using on-board sensors.

    PubMed

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-10-22

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things.

  4. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giovannetti, Vittorio; Maccone, Lorenzo; Shapiro, Jeffrey H.

    The minimum Renyi and Wehrl output entropies are found for bosonic channels in which the signal photons are either randomly displaced by a Gaussian distribution (classical-noise channel), or coupled to a thermal environment through lossy propagation (thermal-noise channel). It is shown that the Renyi output entropies of integer orders z ≥ 2 and the Wehrl output entropy are minimized when the channel input is a coherent state.

  6. FPGA Implementation of Metastability-Based True Random Number Generator

    NASA Astrophysics Data System (ADS)

    Hata, Hisashi; Ichikawa, Shuichi

    True random number generators (TRNGs) are important as a basis for computer security. Though there are some TRNGs composed of analog circuits, the use of digital circuits is desired for the application of TRNGs to logic LSIs. Some of the digital TRNGs utilize jitter in free-running ring oscillators as a source of entropy, which consumes considerable power. Another type of TRNG exploits the metastability of a latch to generate entropy. Although this kind of TRNG has been mostly implemented with full-custom LSI technology, this study presents an implementation based on common FPGA technology. Our TRNG is composed of logic gates only, and can be integrated in any kind of logic LSI. The RS latch in our TRNG is implemented as a hard-macro to guarantee the quality of randomness by minimizing the signal skew and load imbalance of internal nodes. To improve the quality and throughput, the outputs of 64-256 latches are XORed. The derived design was verified on a Xilinx Virtex-4 FPGA (XC4VFX20), and passed the NIST statistical test suite without post-processing. Our TRNG with 256 latches occupies 580 slices, while achieving 12.5 Mbps throughput.

  7. Gaussian States Minimize the Output Entropy of One-Mode Quantum Gaussian Channels

    NASA Astrophysics Data System (ADS)

    De Palma, Giacomo; Trevisan, Dario; Giovannetti, Vittorio

    2017-04-01

    We prove the long-standing conjecture stating that Gaussian thermal input states minimize the output von Neumann entropy of one-mode phase-covariant quantum Gaussian channels among all the input states with a given entropy. Phase-covariant quantum Gaussian channels model the attenuation and the noise that affect any electromagnetic signal in the quantum regime. Our result is crucial to prove the converse theorems for both the triple trade-off region and the capacity region for broadcast communication of the Gaussian quantum-limited amplifier. Our result extends to the quantum regime the entropy power inequality that plays a key role in classical information theory. Our proof exploits a completely new technique based on the recent determination of the p → q norms of the quantum-limited amplifier [De Palma et al., arXiv:1610.09967]. This technique can be applied to any quantum channel.

  8. Gaussian States Minimize the Output Entropy of One-Mode Quantum Gaussian Channels.

    PubMed

    De Palma, Giacomo; Trevisan, Dario; Giovannetti, Vittorio

    2017-04-21

    We prove the long-standing conjecture stating that Gaussian thermal input states minimize the output von Neumann entropy of one-mode phase-covariant quantum Gaussian channels among all the input states with a given entropy. Phase-covariant quantum Gaussian channels model the attenuation and the noise that affect any electromagnetic signal in the quantum regime. Our result is crucial to prove the converse theorems for both the triple trade-off region and the capacity region for broadcast communication of the Gaussian quantum-limited amplifier. Our result extends to the quantum regime the entropy power inequality that plays a key role in classical information theory. Our proof exploits a completely new technique based on the recent determination of the p→q norms of the quantum-limited amplifier [De Palma et al., arXiv:1610.09967]. This technique can be applied to any quantum channel.

  9. On entropic uncertainty relations in the presence of a minimal length

    NASA Astrophysics Data System (ADS)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
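
    For context, the unmodified position-momentum entropic uncertainty relation (in terms of differential Shannon entropies) and the deformed commutator usually taken to encode a minimal observable length read, in a schematic LaTeX sketch (the paper's corrected relations carry extra terms in the deformation parameter β):

      h(x) + h(p) \;\ge\; \ln(e\pi\hbar),
      \qquad
      [\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{\,2}\right).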

  10. Free Energy in Introductory Physics

    NASA Astrophysics Data System (ADS)

    Prentis, Jeffrey J.; Obsniuk, Michael J.

    2016-02-01

    Energy and entropy are two of the most important concepts in science. For all natural processes where a system exchanges energy with its environment, the energy of the system tends to decrease and the entropy of the system tends to increase. Free energy is the special concept that specifies how to balance the opposing tendencies to minimize energy and maximize entropy. There are many pedagogical articles on energy and entropy. Here we present a simple model to illustrate the concept of free energy and the principle of minimum free energy.
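
    The balance described above is captured by the Helmholtz free energy: at fixed temperature and volume a system in contact with a thermal reservoir evolves so as to minimize it, so lowering the energy and raising the entropy both help. A one-line LaTeX sketch:

      F = U - TS, \qquad dF \le 0 \quad (\text{constant } T, V).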

  11. Connectivity in the human brain dissociates entropy and complexity of auditory inputs

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493

  12. Effect of Entropy Generation on Wear Mechanics and System Reliability

    NASA Astrophysics Data System (ADS)

    Gidwani, Akshay; James, Siddanth; Jagtap, Sagar; Karthikeyan, Ram; Vincent, S.

    2018-04-01

    Wear is an irreversible phenomenon. Processes such as mutual sliding and rolling between materials involve entropy generation, and these processes are monotonic with respect to time. The concept of entropy generation is further quantified using the Degradation Entropy Generation theorem formulated by Michael D. Bryant. The sliding-wear model can be extrapolated to different instances in order to provide a potential analysis of machine prognostics as well as system and process reliability for various processes beyond merely mechanical ones. In other words, using the concepts of entropy generation and wear, one can quantify the reliability of a system with respect to time using a thermodynamic variable, which is the basis of this paper. Thus, in the present investigation, a unique attempt has been made to establish a correlation between entropy, wear and reliability, which can be a useful technique in preventive maintenance.

  13. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  14. Holographic derivation of entanglement entropy from the anti-de Sitter space/conformal field theory correspondence.

    PubMed

    Ryu, Shinsei; Takayanagi, Tadashi

    2006-05-12

    A holographic derivation of the entanglement entropy in quantum (conformal) field theories is proposed from anti-de Sitter/conformal field theory (AdS/CFT) correspondence. We argue that the entanglement entropy in d + 1 dimensional conformal field theories can be obtained from the area of d dimensional minimal surfaces in AdS(d+2), analogous to the Bekenstein-Hawking formula for black hole entropy. We show that our proposal agrees perfectly with the entanglement entropy in 2D CFT when applied to AdS(3). We also compare the entropy computed in AdS(5) × S(5) with that of the free N=4 super Yang-Mills theory.
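
    The proposal summarized above is the Ryu-Takayanagi formula; for a boundary region A with bulk minimal surface γ_A homologous to A, it reads, in LaTeX form:

      S_A = \frac{\mathrm{Area}(\gamma_A)}{4 G_N^{(d+2)}}.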

  15. Topological terms, AdS2n gravity, and renormalized entanglement entropy of holographic CFTs

    NASA Astrophysics Data System (ADS)

    Anastasiou, Giorgos; Araya, Ignacio J.; Olea, Rodrigo

    2018-05-01

    We extend our topological renormalization scheme for entanglement entropy to holographic CFTs of arbitrary odd dimensions in the context of the AdS/CFT correspondence. The procedure consists in adding the Chern form as a boundary term to the area functional of the Ryu-Takayanagi minimal surface. The renormalized entanglement entropy thus obtained can be rewritten in terms of the Euler characteristic and the AdS curvature of the minimal surface. This prescription considers the use of the replica trick to express the renormalized entanglement entropy in terms of the renormalized gravitational action evaluated on the conically singular replica manifold extended to the bulk. This renormalized action is obtained in turn by adding the Chern form as the counterterm at the boundary of the 2n-dimensional asymptotically AdS bulk manifold. We explicitly show that, up to next-to-leading order in the holographic radial coordinate, the addition of this boundary term cancels the divergent part of the entanglement entropy. We discuss possible applications of the method for studying CFT parameters like central charges.

  16. Performance optimization of plate heat exchangers with chevron plates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muley, A.; Manglik, R.M.

    1999-07-01

    The enhanced heat transfer performance of a chevron plate heat exchanger (PHE) is evaluated employing (1) energy-conservation based performance evaluation criteria (PECs), and (2) the second-law based minimization of entropy generation principle. Single-phase laminar and turbulent flow convection for three different chevron-plate arrangements are considered. The influence of plate surface corrugation characteristics and their stack arrangements on the heat exchanger's thermal-hydraulic performance is delineated. Based on the different figures of merit, the results show that the extent of heat transfer enhancement increases with flow Re and chevron angle β in laminar flow, but it diminishes with increasing Re in turbulent flows. With up to 2.9 times higher Q, 48% lower A, and entropy generation number N_s,a < 1, relative to an equivalent flat-plate pack, chevron plates are found to be especially suitable in the low to medium flow rates range (20 ≤ Re ≤ 2,000). Also, there appears to be no significant advantage of using a mixed-plate over a symmetric-plate arrangement.

  17. Numerical investigation for entropy generation in hydromagnetic flow of fluid with variable properties and slip

    NASA Astrophysics Data System (ADS)

    Khan, M. Ijaz; Hayat, Tasawar; Alsaedi, Ahmed

    2018-02-01

    This modeling and computations present the study of viscous fluid flow with variable properties by a rotating stretchable disk. Rotating flow is generated through nonlinear rotating stretching surface. Nonlinear thermal radiation and heat generation/absorption are studied. Flow is conducting for a constant applied magnetic field. No polarization is taken. Induced magnetic field is not taken into account. Attention is focused on the entropy generation rate and Bejan number. The entropy generation rate and Bejan number clearly depend on velocity and thermal fields. The von Kármán approach is utilized to convert the partial differential expressions into ordinary ones. These expressions are non-dimensionalized, and numerical results are obtained for flow variables. The effects of the magnetic parameter, Prandtl number, radiative parameter, heat generation/absorption parameter, and slip parameter on velocity and temperature fields as well as the entropy generation rate and Bejan number are discussed. Drag forces (radial and tangential) and heat transfer rates are calculated and discussed. Furthermore the entropy generation rate is a decreasing function of magnetic variable and Reynolds number. The Bejan number effect on the entropy generation rate is reverse to that of the magnetic variable. Also opposite behavior of heat transfers is observed for varying estimations of radiative and slip variables.

  18. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. Copyright © 2014. Published by Elsevier Inc.

  19. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.

  20. Entropy generation in a mixed convection Poiseulle flow of molybdenum disulphide Jeffrey nanofluid

    NASA Astrophysics Data System (ADS)

    Gul, Aaiza; Khan, Ilyas; Makhanov, Stanislav S.

    2018-06-01

    Entropy analysis in a mixed convection Poiseulle flow of a Molybdenum Disulphide Jeffrey Nanofluid (MDJN) is presented. Mixed convection is caused due to buoyancy force and external pressure gradient. The problem is formulated in terms of a boundary value problem for a system of partial differential equations. An analytical solution for the velocity and the temperature is obtained using the perturbation technique. Entropy generation has been derived as a function of the velocity and temperature gradients. The solutions are displayed graphically and the relevant importance of the input parameters is discussed. A Jeffrey nanofluid (JN) has been compared with a second grade nanofluid (SGN) and Newtonian nanofluid (NN). It is found that the entropy generation decreases when the temperature increases whereas increasing the Brinkman number increases entropy generation.

  1. Two faces of entropy and information in biological systems.

    PubMed

    Mitrokhin, Yuriy

    2014-10-21

    The article attempts to overcome the well-known paradox of contradictions between emerging biological organization and entropy production in biological systems. It is assumed that a qualitative, speculative correlation between entropy and antientropy processes, taking place both in the past and today in the metabolic and genetic cellular systems, may be adequate for describing the evolution of biological organization. Since thermodynamic entropy itself cannot compensate for the high degree of organization which exists in the cell, we discuss the mode of conjunction of positive entropy events (mutations) in the genetic systems of past generations and the formation of organized structures of current cells. We argue that only the information which is generated under conditions of information entropy production (mutations and other genome reorganization) in the genetic systems of past generations provides the physical conjunction of entropy and antientropy processes separated from each other in time across generations. This is readily apparent from the requirements of the Second Law of thermodynamics. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Investigating Friction as a Main Source of Entropy Generation in the Expansion of Confined Gas in a Piston-and-Cylinder Device

    ERIC Educational Resources Information Center

    Kang, Dun-Yen; Liou, Kai-Hsin; Chang, Wei-Lun

    2015-01-01

    The expansion or compression of gas confined in a piston-and-cylinder device is a classic working example used for illustrating the First and Second Laws of Thermodynamics. The balance of energy and entropy enables the estimation of a number of thermodynamic properties. The entropy generation (also called entropy production) resulting from this…

  3. MHD effects on heat transfer and entropy generation of nanofluid flow in an open cavity

    NASA Astrophysics Data System (ADS)

    Mehrez, Zouhaier; El Cafsi, Afif; Belghith, Ali; Le Quéré, Patrick

    2015-01-01

    The present numerical work investigates the effect of an external oriented magnetic field on heat transfer and entropy generation of Cu-water nanofluid flow in an open cavity heated from below. The governing equations are solved numerically by the finite-volume method. The study has been carried out for a wide range of solid volume fraction 0≤φ≤0.06, Hartmann number 0≤Ha≤100, Reynolds number 100≤Re≤500 and Richardson number 0.001≤Ri≤1 at three inclination angles of magnetic field γ: 0°, 45° and 90°. The numerical results are given by streamlines, isotherms, average Nusselt number, average entropy generation and Bejan number. The results show that flow behavior, temperature distribution, heat transfer and entropy generation are strongly affected by the presence of a magnetic field. The average Nusselt number and entropy generation, which increase by increasing volume fraction of nanoparticles, depend mainly on the Hartmann number and inclination angle of the magnetic field. The variation rates of heat transfer and entropy generation while adding nanoparticles or applying a magnetic field depend on the Richardson and Reynolds numbers.

  4. The minimal work cost of information processing

    NASA Astrophysics Data System (ADS)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
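
    The statement above can be summarized schematically: the minimal work to implement a logical process is set by the entropy of the discarded information conditioned on the output, and for plain erasure of one unbiased bit it reduces to Landauer's bound. A hedged LaTeX sketch (conventions simplified relative to the paper's single-shot, smooth-entropy formulation), with E the discarded information and O the output:

      W_{\min} \;\approx\; k_B T \ln 2 \; H(E \mid O),
      \qquad
      W_{\text{erase one bit}} \;\ge\; k_B T \ln 2.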

  5. Extremal entanglement and mixedness in continuous variable systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adesso, Gerardo; Serafini, Alessio; Illuminati, Fabrizio

    2004-08-01

    We investigate the relationship between mixedness and entanglement for Gaussian states of continuous variable systems. We introduce generalized entropies based on Schatten p norms to quantify the mixedness of a state and derive their explicit expressions in terms of symplectic spectra. We compare the hierarchies of mixedness provided by such measures with the one provided by the purity (defined as tr ρ² for the state ρ) for generic n-mode states. We then review the analysis proving the existence of both maximally and minimally entangled states at given global and marginal purities, with the entanglement quantified by the logarithmic negativity. Based on these results, we extend such an analysis to generalized entropies, introducing and fully characterizing maximally and minimally entangled states for given global and local generalized entropies. We compare the different roles played by the purity and by the generalized p entropies in quantifying the entanglement and the mixedness of continuous variable systems. We introduce the concept of average logarithmic negativity, showing that it allows a reliable quantitative estimate of continuous variable entanglement by direct measurements of global and marginal generalized p entropies.

  6. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, so that time series data can be forecasted. Different prior estimates, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than the conventional method. Forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.

  7. Entropy Generation in a Chemical Reaction

    ERIC Educational Resources Information Center

    Miranda, E. N.

    2010-01-01

    Entropy generation in a chemical reaction is analysed without using the general formalism of non-equilibrium thermodynamics at a level adequate for advanced undergraduates. In a first approach to the problem, the phenomenological kinetic equation of an elementary first-order reaction is used to show that entropy production is always positive. A…

  8. A Boltzmann machine for the organization of intelligent machines

    NASA Technical Reports Server (NTRS)

    Moed, Michael C.; Saridis, George N.

    1990-01-01

    A three-tier structure consisting of organization, coordination, and execution levels forms the architecture of an intelligent machine using the principle of increasing precision with decreasing intelligence from a hierarchically intelligent control. This system has been formulated as a probabilistic model, where uncertainty and imprecision can be expressed in terms of entropies. The optimal strategy for decision planning and task execution can be found by minimizing the total entropy in the system. The focus is on the design of the organization level as a Boltzmann machine. Since this level is responsible for planning the actions of the machine, the Boltzmann machine is reformulated to use entropy as the cost function to be minimized. Simulated annealing, expanding subinterval random search, and the genetic algorithm are presented as search techniques to efficiently find the desired action sequence and illustrated with numerical examples.

  9. Entropy Generation Across Earth's Bow Shock

    NASA Technical Reports Server (NTRS)

    Parks, George K.; McCarthy, Michael; Fu, Suiyan; Lee, E. S.; Cao, Jinbin; Goldstein, Melvyn L.; Canu, Patrick; Dandouras, Iannis S.; Reme, Henri; Fazakerley, Andrew

    2011-01-01

    Earth's bow shock is a transition layer that causes an irreversible change in the state of plasma that is stationary in time. Theories predict entropy increases across the bow shock but entropy has never been directly measured. Cluster and Double Star plasma experiments measure 3D plasma distributions upstream and downstream of the bow shock that allow calculation of Boltzmann's entropy function H and his famous H-theorem, dH/dt ≤ 0. We present the first direct measurements of entropy density changes across Earth's bow shock. We will show that this entropy generation may be part of the processes that produce the non-thermal plasma distributions and is consistent with a kinetic entropy flux model derived from the collisionless Boltzmann equation, giving strong support that the solar wind's total entropy across the bow shock remains unchanged. As far as we know, our results are not explained by any existing shock models and should be of interest to theorists.

  10. Subtype Differentiation of Small (≤ 4 cm) Solid Renal Mass Using Volumetric Histogram Analysis of DWI at 3-T MRI.

    PubMed

    Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R

    2018-05-29

    The purpose of this article is to evaluate the utility of volumetric histogram analysis of apparent diffusion coefficient (ADC) derived from reduced-FOV DWI for small (≤ 4 cm) solid renal mass subtypes at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curves analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). Mean ADC, median ADC, 75th and 90th percentile ADC, SD ADC, and entropy of malignant tumors were significantly higher than those of benign tumors (all p < 0.05). Combination of mean ADC with histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.
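
    The volumetric histogram parameters used in this kind of analysis are straightforward to compute once whole-lesion ADC voxel values are available; the Python sketch below (illustrative, using scipy for skewness and kurtosis and a simulated voxel array in place of real data) reproduces the typical parameter set.

      import numpy as np
      from scipy.stats import kurtosis, skew

      def adc_histogram_features(adc_voxels, n_bins=128):
          """Whole-lesion ADC histogram parameters: mean, median, percentiles,
          SD, skewness, kurtosis and histogram entropy (in bits)."""
          v = np.asarray(adc_voxels, float).ravel()
          hist, _ = np.histogram(v, bins=n_bins)
          p = hist[hist > 0] / hist.sum()
          return {
              "mean": float(v.mean()),
              "median": float(np.median(v)),
              "p10": float(np.percentile(v, 10)),
              "p25": float(np.percentile(v, 25)),
              "p75": float(np.percentile(v, 75)),
              "p90": float(np.percentile(v, 90)),
              "sd": float(v.std(ddof=1)),
              "skewness": float(skew(v)),
              "kurtosis": float(kurtosis(v)),
              "entropy": float(-np.sum(p * np.log2(p))),
          }

      if __name__ == "__main__":
          # stand-in for ADC values (x10^-6 mm^2/s) from a segmented renal lesion
          voxels = np.random.default_rng(3).normal(1800.0, 300.0, size=5000)
          print(adc_histogram_features(voxels))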

  11. Free Energy in Introductory Physics

    ERIC Educational Resources Information Center

    Prentis, Jeffrey J.; Obsniuk, Michael J.

    2016-01-01

    Energy and entropy are two of the most important concepts in science. For all natural processes where a system exchanges energy with its environment, the energy of the system tends to decrease and the entropy of the system tends to increase. Free energy is the special concept that specifies how to balance the opposing tendencies to minimize energy…

  12. Investment strategy due to the minimization of portfolio noise level by observations of coarse-grained entropy

    NASA Astrophysics Data System (ADS)

    Urbanowicz, Krzysztof; Hołyst, Janusz A.

    2004-12-01

    Using a recently developed method of noise level estimation that makes use of properties of the coarse-grained entropy, we have analyzed the noise level for the Dow Jones index and a few stocks from the New York Stock Exchange. We have found that the noise level ranges from 40% to 80% of the signal variance. The condition of a minimal noise level has been applied to construct optimal portfolios from selected shares. We show that the implementation of a corresponding threshold investment strategy leads to positive returns for historical data.

  13. Numerical study of entropy generation in MHD water-based carbon nanotubes along an inclined permeable surface

    NASA Astrophysics Data System (ADS)

    Soomro, Feroz Ahmed; Rizwan-ul-Haq; Khan, Z. H.; Zhang, Qiang

    2017-10-01

    The main theme of the article is the entropy generation analysis for the magneto-hydrodynamic mixed convection flow of water-functionalized carbon nanotubes along an inclined stretching surface. Thermophysical properties of both the particles and the working fluid are incorporated in the system of governing partial differential equations. The nonlinear system of equations is reduced via similarity transformations. Moreover, solutions of these equations are further utilized to determine the volumetric and characteristic entropy generation. Solutions of the governing boundary layer equations are obtained numerically using the finite difference method. Effects of two types of carbon nanotubes, namely single-wall carbon nanotubes (SWCNTs) and multi-wall carbon nanotubes (MWCNTs) with water as the base fluid, have been analyzed for the physical quantities of interest, namely surface skin friction, heat transfer rate and entropy generation coefficients. Results for velocities, temperature, entropy generation and isotherms are plotted against the emerging parameters, namely nanoparticle fraction 0 ≤ φ ≤ 0.2, thermal convective parameter 0 ≤ λ ≤ 5, Hartmann number 0 ≤ M ≤ 2, suction/injection parameter -1 ≤ S ≤ 1, and Eckert number 0 ≤ Ec ≤ 2. It is finally concluded that skin friction increases with the magnetic parameter, suction/injection and nanoparticle volume fraction, whereas the Nusselt number shows an increasing trend with the suction parameter, mixed convection parameter and nanoparticle volume fraction. Similarly, entropy generation shows an opposite behavior for the Hartmann number and mixed convection parameter for both single-wall and multi-wall carbon nanotubes.

  14. Inhomogeneous Jacobi equation for minimal surfaces and perturbative change in holographic entanglement entropy

    NASA Astrophysics Data System (ADS)

    Ghosh, Avirup; Mishra, Rohit

    2018-04-01

    The change in holographic entanglement entropy (HEE) for small fluctuations about pure anti-de Sitter (AdS) is obtained by a perturbative expansion of the area functional in terms of the change in the bulk metric and the embedded extremal surface. However it is known that change in the embedding appears at second order or higher. It was shown that these changes in the embedding can be calculated in the 2 +1 dimensional case by solving a "generalized geodesic deviation equation." We generalize this result to arbitrary dimensions by deriving an inhomogeneous form of the Jacobi equation for minimal surfaces. The solutions of this equation map a minimal surface in a given space time to a minimal surface in a space time which is a perturbation over the initial space time. Using this we perturbatively calculate the changes in HEE up to second order for boosted black brane like perturbations over AdS4.

  15. Evidence for surprise minimization over value maximization in choice behavior

    PubMed Central

    Schwartenbeck, Philipp; FitzGerald, Thomas H. B.; Mathys, Christoph; Dolan, Ray; Kronbichler, Martin; Friston, Karl

    2015-01-01

    Classical economic models are predicated on the idea that the ultimate aim of choice is to maximize utility or reward. In contrast, an alternative perspective highlights the fact that adaptive behavior requires agents to model their environment and minimize surprise about the states they frequent. We propose that choice behavior can be more accurately accounted for by surprise minimization compared to reward or utility maximization alone. Minimizing surprise makes a prediction at variance with expected utility models; namely, that in addition to attaining valuable states, agents attempt to maximize the entropy over outcomes and thus ‘keep their options open’. We tested this prediction using a simple binary choice paradigm and show that human decision-making is better explained by surprise minimization compared to utility maximization. Furthermore, we replicated this entropy-seeking behavior in a control task with no explicit utilities. These findings highlight a limitation of purely economic motivations in explaining choice behavior and instead emphasize the importance of belief-based motivations. PMID:26564686

  16. Beyond the classical theory of heat conduction: a perspective view of future from entropy

    PubMed Central

    Lai, Xiang; Zhu, Pingan

    2016-01-01

    Energy is conserved by the first law of thermodynamics; its quality degrades constantly due to entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation with regard to how its magnitude can be reduced, and to examine the limit of entropy generation as time tends to infinity with regard to whether it is bounded or not. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also builds up the relation between the second law of thermodynamics and mathematical inequalities, by developing inequalities of either new or classical nature. A concise review of entropy is also included, in the interest of performing the analysis in this work and similar analyses for other processes in the future. PMID:27843400
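
    For reference, the standard second-law expressions for local and total entropy generation in one-dimensional Fourier conduction, in our notation rather than necessarily the paper's (k is the thermal conductivity, A the cross-sectional area):

    ```latex
    \[
      \dot{s}_{\mathrm{gen}}(x,t) = \frac{k}{T^{2}}\left(\frac{\partial T}{\partial x}\right)^{2} \ge 0,
      \qquad
      \dot{S}_{\mathrm{gen}}(t) = A \int_{0}^{L} \frac{k}{T^{2}}\left(\frac{\partial T}{\partial x}\right)^{2}\,\mathrm{d}x .
    \]
    ```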

  17. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models

    PubMed Central

    Grün, Sonja; Helias, Moritz

    2017-01-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition. PMID:28968396

  18. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models.

    PubMed

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-10-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.
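
    A minimal sketch of the Glauber dynamics associated with such a pairwise maximum-entropy model, the sampler that the abstract describes as becoming unrealistically bistable; the uniform couplings and biases below are invented to make the two activity modes appear and are not fitted to any recording.

    ```python
    import numpy as np

    def glauber_sample(h, J, n_sweeps, seed=0):
        """Glauber dynamics for P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j),
        with s_i in {0, 1}. Returns the population-averaged activity per sweep."""
        rng = np.random.default_rng(seed)
        n = len(h)
        s = rng.integers(0, 2, size=n)
        activity = []
        for _ in range(n_sweeps):
            for i in rng.permutation(n):
                field = h[i] + J[i] @ s - J[i, i] * s[i]    # local field on unit i
                s[i] = rng.random() < 1.0 / (1.0 + np.exp(-field))
            activity.append(s.mean())
        return np.array(activity)

    # Weak uniform excitation with a negative bias: for these invented values,
    # mean-field reasoning suggests both a low- and a high-activity mode,
    # echoing the bimodality discussed above.
    n = 60
    h = np.full(n, -3.0)
    J = np.full((n, n), 0.1)
    np.fill_diagonal(J, 0.0)
    trace = glauber_sample(h, J, n_sweeps=500)
    print(round(float(trace.mean()), 3), round(float(trace.min()), 3), round(float(trace.max()), 3))
    ```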

  19. Prediction of Protein Configurational Entropy (Popcoen).

    PubMed

    Goethe, Martin; Gleixner, Jan; Fita, Ignacio; Rubi, J Miguel

    2018-03-13

    A knowledge-based method for configurational entropy prediction of proteins is presented; this methodology is extremely fast, compared to previous approaches, because it does not involve any type of configurational sampling. Instead, the configurational entropy of a query fold is estimated by evaluating an artificial neural network, which was trained on molecular-dynamics simulations of ∼1000 proteins. The predicted entropy can be incorporated into a large class of protein software based on cost-function minimization/evaluation, in which configurational entropy is currently neglected for performance reasons. Software of this type is used for all major protein tasks such as structure prediction, protein design, NMR and X-ray refinement, docking, and mutation effect prediction. Integrating the predicted entropy can yield a significant accuracy increase, as we show exemplarily for native-state identification with the prominent protein software FoldX. The method has been termed Popcoen for Prediction of Protein Configurational Entropy. An implementation is freely available at http://fmc.ub.edu/popcoen/.

  20. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  1. Performance Analysis of Entropy Methods on K Means in Clustering Process

    NASA Astrophysics Data System (ADS)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K Means is a non-hierarchical data clustering method that attempts to partition existing data into one or more clusters/groups. This method partitions the data into clusters/groups so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other groups. The purpose of this data clustering is to minimize the objective function set in the clustering process, which generally attempts to minimize variation within a cluster and maximize the variation between clusters. However, the main disadvantage of this method is that the number k is often not known beforehand. Furthermore, a randomly chosen starting point may place two initial centroids too close to each other. Therefore, the entropy method is used to determine the starting point in K Means; it is a method that can assign weights and support a decision among a set of alternatives. Entropy is able to investigate the harmony in discrimination among a multitude of data sets: under the entropy criterion, the attribute with the highest variation receives the highest weight. This entropy method can thus assist the K Means procedure in determining the starting point, which is usually chosen at random, and the clustering converges more quickly because fewer iterations are needed than with standard K Means. Using the postoperative patient dataset from the UCI Machine Learning Repository, with only 12 records as a worked example, the entropy method reaches the desired end result in only two iterations.
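
    The abstract does not spell out the weighting scheme, so the sketch below shows only the generic entropy-weight calculation commonly used for this kind of attribute weighting, offered as a plausible reading rather than the paper's exact procedure; the data matrix is assumed non-negative.

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy-weight method over the columns (attributes) of a non-negative
        data matrix X: attributes whose values vary more across the rows
        (lower normalized entropy) receive larger weights."""
        X = np.asarray(X, dtype=float)
        P = X / X.sum(axis=0, keepdims=True)              # column-wise proportions
        n = X.shape[0]
        with np.errstate(divide="ignore"):
            logP = np.where(P > 0, np.log(P), 0.0)
        E = -(P * logP).sum(axis=0) / np.log(n)           # normalized entropy per column
        d = 1.0 - E                                       # degree of diversification
        return d / d.sum()

    # Example: weight three attributes of six records; the weighted attribute
    # values could then be used to rank candidate starting points for K Means.
    X = np.array([[64., 1., 3.], [72., 0., 5.], [58., 1., 2.],
                  [80., 0., 7.], [66., 1., 4.], [75., 0., 6.]])
    print(entropy_weights(X))
    ```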

  2. The Macro and Micro of it Is that Entropy Is the Spread of Energy

    NASA Astrophysics Data System (ADS)

    Phillips, Jeffrey A.

    2016-09-01

    While entropy is often described as "disorder," it is better thought of as a measure of how spread out energy is within a system. To illustrate this interpretation of entropy to introductory college or high school students, several activities have been created. Students first study the relationship between microstates and macrostates to better understand the probabilities involved. Then, each student observes how a system evolves as energy is allowed to move within it. By studying how the class's ensemble of systems evolves, the tendency of energy to spread, rather than concentrate, can be observed. All activities require minimal equipment and provide students with a tactile and visual experience with entropy.
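
    In the same spirit, a small counting example (assuming an Einstein-solid style activity, which may differ from the ones in the article): when two small solids share energy quanta, the macrostate with the most microstates, and hence the highest entropy, is the one in which the energy is most spread out, in proportion to the number of oscillators.

    ```python
    from math import comb, log

    def multiplicity(n_oscillators, quanta):
        """Number of microstates of an Einstein solid sharing `quanta` energy units."""
        return comb(quanta + n_oscillators - 1, quanta)

    NA, NB, Q = 30, 60, 60      # two solids sharing 60 quanta
    best_qa = max(range(Q + 1),
                  key=lambda qa: multiplicity(NA, qa) * multiplicity(NB, Q - qa))
    entropy_over_k = log(multiplicity(NA, best_qa)) + log(multiplicity(NB, Q - best_qa))
    print("most probable split:", best_qa, "quanta in A,", Q - best_qa, "in B")
    print("entropy/k of that macrostate:", round(entropy_over_k, 2))
    ```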

  3. Irreversible entropy model for damage diagnosis in resistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuadras, Angel, E-mail: angel.cuadras@upc.edu; Crisóstomo, Javier; Ovejas, Victoria J.

    2015-10-28

    We propose a method to characterize electrical resistor damage based on entropy measurements. Irreversible entropy and the rate at which it is generated are more convenient parameters than resistance for describing damage because they are essentially positive in virtue of the second law of thermodynamics, whereas resistance may increase or decrease depending on the degradation mechanism. Commercial resistors were tested in order to characterize the damage induced by power surges. Resistors were biased with constant and pulsed voltage signals, leading to power dissipation in the range of 4–8 W, which is well above the 0.25 W nominal power to initiate failure. Entropy was inferred from the added power and temperature evolution. A model is proposed to understand the relationship among resistance, entropy, and damage. The power surge dissipates into heat (Joule effect) and damages the resistor. The results show a correlation between entropy generation rate and resistor failure. We conclude that damage can be conveniently assessed from irreversible entropy generation. Our results for resistors can be easily extrapolated to other systems or machines that can be modeled based on their resistance.
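
    A minimal sketch of how the irreversible entropy could be accumulated from logged power and device temperature, assuming the simple reading dS = P dt / T; the paper's actual inference may differ in detail.

    ```python
    import numpy as np

    def irreversible_entropy(power_w, temperature_k, dt_s):
        """Cumulative irreversible entropy (J/K) from sampled dissipated power
        and absolute device temperature: dS = P dt / T at each sample."""
        power_w = np.asarray(power_w, dtype=float)
        temperature_k = np.asarray(temperature_k, dtype=float)
        return np.cumsum(power_w / temperature_k) * dt_s

    # Example: 6 W dissipated while the resistor heats from 300 K towards 420 K.
    t = np.linspace(0.0, 60.0, 601)
    power = np.full_like(t, 6.0)
    temperature = 300.0 + 120.0 * (1.0 - np.exp(-t / 20.0))
    print(irreversible_entropy(power, temperature, dt_s=0.1)[-1], "J/K")
    ```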

  4. Entanglement Entropy of Black Holes.

    PubMed

    Solodukhin, Sergey N

    2011-01-01

    The entanglement entropy is a fundamental quantity, which characterizes the correlations between sub-systems in a larger quantum-mechanical system. For two sub-systems separated by a surface the entanglement entropy is proportional to the area of the surface and depends on the UV cutoff, which regulates the short-distance correlations. The geometrical nature of entanglement-entropy calculation is particularly intriguing when applied to black holes when the entangling surface is the black-hole horizon. I review a variety of aspects of this calculation: the useful mathematical tools such as the geometry of spaces with conical singularities and the heat kernel method, the UV divergences in the entropy and their renormalization, the logarithmic terms in the entanglement entropy in four and six dimensions and their relation to the conformal anomalies. The focus in the review is on the systematic use of the conical singularity method. The relations to other known approaches such as 't Hooft's brick-wall model and the Euclidean path integral in the optical metric are discussed in detail. The puzzling behavior of the entanglement entropy due to fields, which non-minimally couple to gravity, is emphasized. The holographic description of the entanglement entropy of the blackhole horizon is illustrated on the two- and four-dimensional examples. Finally, I examine the possibility to interpret the Bekenstein-Hawking entropy entirely as the entanglement entropy.

  5. Entanglement Entropy of Black Holes

    NASA Astrophysics Data System (ADS)

    Solodukhin, Sergey N.

    2011-10-01

    The entanglement entropy is a fundamental quantity, which characterizes the correlations between sub-systems in a larger quantum-mechanical system. For two sub-systems separated by a surface the entanglement entropy is proportional to the area of the surface and depends on the UV cutoff, which regulates the short-distance correlations. The geometrical nature of entanglement-entropy calculation is particularly intriguing when applied to black holes when the entangling surface is the black-hole horizon. I review a variety of aspects of this calculation: the useful mathematical tools such as the geometry of spaces with conical singularities and the heat kernel method, the UV divergences in the entropy and their renormalization, the logarithmic terms in the entanglement entropy in four and six dimensions and their relation to the conformal anomalies. The focus in the review is on the systematic use of the conical singularity method. The relations to other known approaches such as 't Hooft's brick-wall model and the Euclidean path integral in the optical metric are discussed in detail. The puzzling behavior of the entanglement entropy due to fields, which non-minimally couple to gravity, is emphasized. The holographic description of the entanglement entropy of the blackhole horizon is illustrated on the two- and four-dimensional examples. Finally, I examine the possibility to interpret the Bekenstein-Hawking entropy entirely as the entanglement entropy.

  6. [Specific features in realization of the principle of minimum energy dissipation during individual development].

    PubMed

    Zotin, A A

    2012-01-01

    Realization of the principle of minimum energy dissipation (Prigogine's theorem) during individual development has been analyzed. This analysis has suggested the following reformulation of this principle for living objects: when environmental conditions are constant, the living system evolves to a current steady state in such a way that the difference between entropy production and entropy flow (psi(u) function) is positive and constantly decreases near the steady state, approaching zero. In turn, the current steady state tends to a final steady state in such a way that the difference between the specific entropy productions in an organism and its environment tends to be minimal. In general, individual development completely agrees with the law of entropy increase (second law of thermodynamics).

  7. All the entropies on the light-cone

    NASA Astrophysics Data System (ADS)

    Casini, Horacio; Testé, Eduardo; Torroba, Gonzalo

    2018-05-01

    We determine the explicit universal form of the entanglement and Renyi entropies, for regions with arbitrary boundary on a null plane or the light-cone. All the entropies are shown to saturate the strong subadditive inequality. This Renyi Markov property implies that the vacuum behaves like a product state. For the null plane, our analysis applies to general quantum field theories, and we show that the entropies do not depend on the region. For the light-cone, our approach is restricted to conformal field theories. In this case, the construction of the entropies is related to dilaton effective actions in two less dimensions. In particular, the universal logarithmic term in the entanglement entropy arises from a Wess-Zumino anomaly action. We also consider these properties in theories with holographic duals, for which we construct the minimal area surfaces for arbitrary shapes on the light-cone. We recover the Markov property and the universal form of the entropy, and argue that these properties continue to hold upon including stringy and quantum corrections. We end with some remarks on the recently proved entropic a-theorem in four spacetime dimensions.

  8. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.
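
    A rough sketch of the idea, assuming a sign-symbolized return series and block entropies as the entropy-density estimate; the estimator actually used in the paper may differ in its details.

    ```python
    import numpy as np
    from collections import Counter

    def entropy_density(series, scale, block=4):
        """Shannon entropy per symbol of `block`-length words of the up/down
        symbolized series coarse-grained to the given scale."""
        coarse = np.asarray(series, dtype=float)[::scale]
        symbols = (np.diff(coarse) > 0).astype(int)          # 1 = up move, 0 = down
        words = [tuple(symbols[i:i + block]) for i in range(len(symbols) - block + 1)]
        counts = np.array(list(Counter(words).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum() / block)

    def structure_scale(series, scales=range(1, 21)):
        """Scale at which the entropy density is minimal, i.e. the pattern clearest."""
        return min(scales, key=lambda s: entropy_density(series, s))
    ```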

  9. Electroweak phase transition and entropy release in the early universe

    NASA Astrophysics Data System (ADS)

    Chaudhuri, A.; Dolgov, A.

    2018-01-01

    It is shown that the vacuum-like energy of the Higgs potential at non-zero temperatures leads, in the course of the cosmological expansion, to a small but non-negligible rise of the entropy density in the comoving volume. This increase is calculated in the frameworks of the minimal standard model. The result can have a noticeable effect on the outcome of baryo-through-leptogenesis.

  10. Radar detection with the Neyman-Pearson criterion using supervised-learning-machines trained with the cross-entropy error

    NASA Astrophysics Data System (ADS)

    Jarabo-Amores, María-Pilar; la Mata-Moya, David de; Gil-Pita, Roberto; Rosa-Zurera, Manuel

    2013-12-01

    The application of supervised learning machines trained to minimize the Cross-Entropy error to radar detection is explored in this article. The detector is implemented with a learning machine that implements a discriminant function, whose output is compared to a threshold selected to fix a desired probability of false alarm. The study is based on the calculation of the function the learning machine approximates to during training, and the application of a sufficient condition for a discriminant function to be used to approximate the optimum Neyman-Pearson (NP) detector. In this article, the function a supervised learning machine approximates to after being trained to minimize the Cross-Entropy error is obtained. This discriminant function can be used to implement the NP detector, which maximizes the probability of detection while maintaining the probability of false alarm below or equal to a predefined value. Some experiments on signal detection using neural networks are also presented to test the validity of the study.
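
    A toy sketch of the scheme just described: a single-layer discriminant trained by minimizing the cross-entropy error, after which the threshold is set from noise-only outputs so that the desired probability of false alarm is met. The Gaussian data, the model and all parameters are invented for illustration.

    ```python
    import numpy as np

    def train_logistic(X, y, lr=0.1, epochs=500):
        """Minimize the cross-entropy error of a logistic discriminant; its output
        approximates P(target | x), a monotone function of the likelihood ratio
        used by the Neyman-Pearson detector."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            z = 1.0 / (1.0 + np.exp(-(X @ w + b)))
            grad = z - y
            w -= lr * X.T @ grad / len(y)
            b -= lr * grad.mean()
        return w, b

    rng = np.random.default_rng(1)
    X0 = rng.normal(0.0, 1.0, size=(4000, 2))        # noise-only samples
    X1 = rng.normal(1.0, 1.0, size=(4000, 2))        # signal-plus-noise samples
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(4000), np.ones(4000)]
    w, b = train_logistic(X, y)

    # Threshold chosen on noise-only scores to fix the probability of false alarm.
    threshold = np.quantile(X0 @ w + b, 1 - 1e-2)
    print("estimated Pd at Pfa = 1e-2:", float(np.mean(X1 @ w + b > threshold)))
    ```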

  11. Renormalization of entanglement entropy from topological terms

    NASA Astrophysics Data System (ADS)

    Anastasiou, Giorgos; Araya, Ignacio J.; Olea, Rodrigo

    2018-05-01

    We propose a renormalization scheme for entanglement entropy of three-dimensional CFTs with a four-dimensional asymptotically AdS gravity dual in the context of the gauge/gravity correspondence. The procedure consists in adding the Chern form as a boundary term to the area functional of the Ryu-Takayanagi minimal surface. We provide an explicit prescription for the renormalized entanglement entropy, which is derived via the replica trick. This is achieved by considering a Euclidean gravitational action renormalized by the addition of the Chern form at the spacetime boundary, evaluated in the conically-singular replica manifold. We show that the addition of this boundary term cancels the divergent part of the entanglement entropy, recovering the results obtained by Taylor and Woodhead. We comment on how this prescription for renormalizing the entanglement entropy is in line with the general program of topological renormalization in asymptotically AdS gravity.

  12. Lossless quantum data compression with exponential penalization: an operational interpretation of the quantum Rényi entropy.

    PubMed

    Bellomo, Guido; Bosyk, Gustavo M; Holik, Federico; Zozor, Steeve

    2017-11-07

    Based on the problem of quantum data compression in a lossless way, we present here an operational interpretation for the family of quantum Rényi entropies. In order to do this, we appeal to a very general quantum encoding scheme that satisfies a quantum version of the Kraft-McMillan inequality. Then, in the standard situation, where one is intended to minimize the usual average length of the quantum codewords, we recover the known results, namely that the von Neumann entropy of the source bounds the average length of the optimal codes. Otherwise, we show that by invoking an exponential average length, related to an exponential penalization over large codewords, the quantum Rényi entropies arise as the natural quantities relating the optimal encoding schemes with the source description, playing an analogous role to that of von Neumann entropy.
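
    For orientation, the classical (Campbell) counterpart of this statement in our notation, for a source p = (p_1, ..., p_n) with codeword lengths l_i and penalization parameter t > 0; the paper's quantum version replaces the source and the Shannon/Rényi entropies by a density operator and the von Neumann/quantum Rényi entropies.

    ```latex
    \[
      L_t \;=\; \frac{1}{t}\,\log_2 \sum_i p_i\, 2^{\,t\, l_i},
      \qquad
      \min_{\{l_i\}} L_t \;\ge\; H_{\alpha}(p) \;=\; \frac{1}{1-\alpha}\,\log_2 \sum_i p_i^{\alpha},
      \qquad
      \alpha = \frac{1}{1+t}.
    \]
    ```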

  13. On the design of script languages for neural simulation.

    PubMed

    Brette, Romain

    2012-01-01

    In neural network simulators, models are specified according to a language, either specific or based on a general programming language (e.g. Python). There are also ongoing efforts to develop standardized languages, for example NeuroML. When designing these languages, efforts are often focused on expressivity, that is, on maximizing the number of model types that can be described and simulated. I argue that a complementary goal should be to minimize the cognitive effort required on the part of the user to use the language. I try to formalize this notion with the concept of "language entropy", and I propose a few practical guidelines to minimize the entropy of languages for neural simulation.

  14. Minimal entropy reconstructions of thermal images for emissivity correction

    NASA Astrophysics Data System (ADS)

    Allred, Lloyd G.

    1999-03-01

    Low emissivity with corresponding low thermal emission is a problem which has long afflicted infrared thermography. The problem is aggravated by reflected thermal energy, which increases as the emissivity decreases, thus reducing the net signal-to-noise ratio and degrading the resulting temperature reconstructions. Additional errors are introduced by the traditional emissivity-correction approaches, wherein one attempts to correct for emissivity either using thermocouples or using one or more baseline images collected at known temperatures. These corrections are numerically equivalent to image differencing. Errors in the baseline images are therefore additive, causing the resulting measurement error to either double or triple. The practical application of thermal imagery usually entails coating the objective surface to increase the emissivity to a uniform and repeatable value. While the author recommends that the thermographer still adhere to this practice, he has devised a minimal entropy reconstruction which corrects not only for emissivity variations but also for variations in sensor response, using the baseline images at known temperatures to correct for these values. The minimal energy reconstruction is actually based on a modified Hopfield neural network which finds the image that best explains the observed data and baseline data while having minimal entropy change between adjacent pixels. The autocorrelation of temperatures between adjacent pixels is a feature of most close-up thermal images. A surprising result from transient heating data indicates that the resulting corrected thermal images have less measurement error and are closer to the situational truth than the original data.

  15. Heat transfer enhancement and entropy generation analysis of Al2O3-water nanofluid in an alternating oval cross-section tube using two-phase mixture model under turbulent flow

    NASA Astrophysics Data System (ADS)

    Najafi Khaboshan, Hasan; Nazif, Hamid Reza

    2018-04-01

    Heat transfer and turbulent flow of an Al2O3-water nanofluid within an alternating oval cross-section tube are numerically simulated using the Eulerian-Eulerian two-phase mixture model. The primary goal of the present study is to investigate the effects of nanoparticle volume fraction, nanoparticle diameter and different inlet velocities on the heat transfer, pressure drop and entropy generation characteristics of the alternating oval cross-section tube. For validation, the numerical results were compared with experimental data. A constant wall temperature boundary condition was imposed on the tube wall. In addition, the thermal-hydraulic performance and entropy generation characteristics of the alternating oval cross-section tube and a circular tube were compared for the same fluids. The results show that the heat transfer coefficient and pressure drop of the alternating oval cross-section tube are higher than those of the base tube for the same fluids. These two parameters also increase when Al2O3 nanoparticles are added to the water, at any inlet velocity, for both tubes. Furthermore, compared to the base fluid, the heat transfer enhancement of the nanofluid is larger than the increase of its friction factor at the same inlet boundary conditions. The entropy generation analysis shows that the total entropy generation increases with increasing nanoparticle volume fraction and decreasing nanoparticle diameter. Thermal entropy generation is the main part of the irreversibility, and the Bejan number increases slightly with increasing nanoparticle diameter. Finally, at any given inlet velocity, the frictional irreversibility grows with increasing nanoparticle volume fraction.

  16. Discrimination of coherent features in turbulent boundary layers by the entropy method

    NASA Technical Reports Server (NTRS)

    Corke, T. C.; Guezennec, Y. G.

    1984-01-01

    Entropy in information theory is defined as the expected or mean value of the measure of the amount of self-information contained in the ith point of a distribution series x_i, based on its probability of occurrence p(x_i). If p(x_i) is the probability of the ith state of the system in probability space, then the entropy, E(X) = -Σ p(x_i) log p(x_i), is a measure of the disorder in the system. Based on this concept, a method was devised which sought to minimize the entropy in a time series in order to construct the signature of the most coherent motions. The constrained minimization was performed using a Lagrange multiplier approach which resulted in the solution of a simultaneous set of non-linear coupled equations to obtain the coherent time series. The application of the method to space-time data taken by a rake of sensors in the near-wall region of a turbulent boundary layer was presented. The results yielded coherent velocity motions made up of locally decelerated or accelerated fluid having a streamwise scale of approximately 100 ν/u_τ, which is in qualitative agreement with the results from other less objective discrimination methods.

  17. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    PubMed

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  18. Entropy Generation Analysis in Convective Ferromagnetic Nano Blood Flow Through a Composite Stenosed Arteries with Permeable Wall

    NASA Astrophysics Data System (ADS)

    Sher Akbar, Noreen; Wahid Butt, Adil

    2017-05-01

    The study of heat transfer is of significant importance in many biological and biomedical industry problems. This investigation comprises an entropy generation analysis of blood flow in arteries with permeable walls. Convection through the flow is studied together with the entropy generation. The governing problem is formulated and solved under the low Reynolds number and long wavelength approximations. Exact analytical solutions have been obtained and are analyzed graphically. It is seen that the temperature for pure water is lower than that for copper-water. It gains magnitude with an increase in the slip parameter.

  19. Thermodynamic geometry for a non-extensive ideal gas

    NASA Astrophysics Data System (ADS)

    López, J. L.; Obregón, O.; Torres-Arenas, J.

    2018-05-01

    A generalized entropy arising in the context of superstatistics is applied to an ideal gas. The curvature scalar associated to the thermodynamic space generated by this modified entropy is calculated using two formalisms of the geometric approach to thermodynamics. By means of the curvature/interaction hypothesis of the geometric approach to thermodynamic geometry it is found that as a consequence of considering a generalized statistics, an effective interaction arises but the interaction is not enough to generate a phase transition. This generalized entropy seems to be relevant in confinement or in systems with not so many degrees of freedom, so it could be interesting to use such entropies to characterize the thermodynamics of small systems.

  20. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.

  1. Entropy production of active particles and for particles in active baths

    NASA Astrophysics Data System (ADS)

    Pietzonka, Patrick; Seifert, Udo

    2018-01-01

    Entropy production of an active particle in an external potential is identified through a thermodynamically consistent minimal lattice model that includes the chemical reaction providing the propulsion and ordinary translational noise. In the continuum limit, a unique expression follows, comprising a direct contribution from the active process and an indirect contribution from ordinary diffusive motion. From the corresponding Langevin equation, this physical entropy production cannot be inferred through the conventional, yet here ambiguous, comparison of forward and time-reversed trajectories. Generalizations to several interacting active particles and passive particles in a bath of active ones are presented explicitly, further ones are briefly indicated.

  2. Trends in entropy production during ecosystem development in the Amazon Basin.

    PubMed

    Holdaway, Robert J; Sparrow, Ashley D; Coomes, David A

    2010-05-12

    Understanding successional trends in energy and matter exchange across the ecosystem-atmosphere boundary layer is an essential focus in ecological research; however, a general theory describing the observed pattern remains elusive. This paper examines whether the principle of maximum entropy production could provide the solution. A general framework is developed for calculating entropy production using data from terrestrial eddy covariance and micrometeorological studies. We apply this framework to data from eight tropical forest and pasture flux sites in the Amazon Basin and show that forest sites had consistently higher entropy production rates than pasture sites (0.461 versus 0.422 W m(-2) K(-1), respectively). It is suggested that during development, changes in canopy structure minimize surface albedo, and development of deeper root systems optimizes access to soil water and thus potential transpiration, resulting in lower surface temperatures and increased entropy production. We discuss our results in the context of a theoretical model of entropy production versus ecosystem developmental stage. We conclude that, although further work is required, entropy production could potentially provide a much-needed theoretical basis for understanding the effects of deforestation and land-use change on the land-surface energy balance.

  3. Measurement of entropy generation within bypass transitional flow

    NASA Astrophysics Data System (ADS)

    Skifton, Richard; Budwig, Ralph; McEligot, Donald; Crepeau, John

    2012-11-01

    A flat plate made from quartz was submersed in the Idaho National Laboratory's Matched Index of Refraction (MIR) flow facility. PIV was utilized to capture spatial vector maps at near-wall locations, with five to ten points within the viscous sublayer. Entropy generation was calculated directly from measured velocity fluctuation derivatives. Two flows were studied: a zero pressure gradient and an adverse pressure gradient (β = -0.039). The free stream turbulence intensity used to drive bypass transition ranged between 3% (near trailing edge) and 8% (near leading edge). The pointwise entropy generation rate will be utilized as a design parameter to systematically reduce losses. As a second observation, the pointwise entropy can be shown to predict the onset of transitional flow. This research was partially supported by the DOE EPSCoR program, grant DE-SC0004751, and by the Idaho National Laboratory Center for Advanced Energy Studies.

  4. Computation and analysis for a constrained entropy optimization problem in finance

    NASA Astrophysics Data System (ADS)

    He, Changhong; Coleman, Thomas F.; Li, Yuying

    2008-12-01

    In [T. Coleman, C. He, Y. Li, Calibrating volatility function bounds for an uncertain volatility model, Journal of Computational Finance (2006) (submitted for publication)], an entropy minimization formulation has been proposed to calibrate an uncertain volatility option pricing model (UVM) from market bid and ask prices. To avoid potential infeasibility due to numerical error, a quadratic penalty function approach is applied. In this paper, we show that the solution to the quadratic penalty problem can be obtained by minimizing an objective function which can be evaluated via solving a Hamilton-Jacobi-Bellman (HJB) equation. We prove that the implicit finite difference solution of this HJB equation converges to its viscosity solution. In addition, we provide computational examples illustrating accuracy of calibration.

  5. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  6. Using heteroclinic orbits to quantify topological entropy in fluid flows

    NASA Astrophysics Data System (ADS)

    Sattari, Sulimon; Chen, Qianting; Mitchell, Kevin A.

    2016-03-01

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or "ghost," rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  7. Minimum relative entropy distributions with a large mean are Gaussian

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo

    2016-12-01

    Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q, find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that the mean of p is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H-type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
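
    In our notation, the constrained problem and its exponentially tilted solution read as follows; the paper's result is that this tilted posterior becomes approximately Gaussian as the prescribed mean grows large.

    ```latex
    \[
      \min_{p}\; D(p\,\|\,q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,\mathrm{d}x
      \quad\text{subject to}\quad \int x\,p(x)\,\mathrm{d}x = \mu
      \;\;\Longrightarrow\;\;
      p^{*}(x) = \frac{q(x)\,e^{\lambda x}}{Z(\lambda)},
      \qquad
      Z(\lambda) = \int q(x)\,e^{\lambda x}\,\mathrm{d}x,
    \]
    with the multiplier \(\lambda\) fixed by the mean constraint.
    ```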

  8. A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano

    In self-organization, energy gradients across complex systems lead to change in the structure of the systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action. It is coupled to the total energy flowing through a system, which leads to an increase in the action efficiency. We compare energy transport through a fluid cell which has random motion of its molecules, and a cell which can form convection cells. We examine the signs of change of entropy, and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission, compared to random motion. For more complex systems, those convection cells form a network of transport channels for the purpose of obeying the equations of motion in this geometry. Those transport networks are an essential feature of complex systems in biology, ecology, economy and society.

  9. Gauge field entanglement in Kitaev's honeycomb model

    NASA Astrophysics Data System (ADS)

    Dóra, Balázs; Moessner, Roderich

    2018-01-01

    A spin fractionalizes into matter and gauge fermions in Kitaev's spin liquid on the honeycomb lattice. This follows from a Jordan-Wigner mapping to fermions, allowing for the construction of a minimal entropy ground-state wave function on the cylinder. We use this to calculate the entanglement entropy by choosing several distinct partitionings. First, by partitioning an infinite cylinder into two, the -ln 2 topological entanglement entropy is reconfirmed. Second, the reduced density matrix of the gauge sector on the full cylinder is obtained after tracing out the matter degrees of freedom. This allows for evaluating the gauge entanglement Hamiltonian, which contains infinitely long-range correlations along the symmetry axis of the cylinder. The matter-gauge entanglement entropy is (N_y - 1) ln 2, with N_y the circumference of the cylinder. Third, the rules for calculating the gauge sector entanglement of any partition are determined. Rather small correctly chosen gauge partitions can still account for the topological entanglement entropy in spite of long-range correlations in the gauge entanglement Hamiltonian.

  10. Random ambience using high fidelity images

    NASA Astrophysics Data System (ADS)

    Abu, Nur Azman; Sahib, Shahrin

    2011-06-01

    Most secure communication nowadays mandates true random keys as an input. These operations are mostly designed and taken care of by the developers of the cryptosystem. Due to the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by the developers of the cryptosystem. However, these pseudorandom keys are predictable, periodic and repeatable, hence they carry minimal entropy. True random keys are believed to be generated only via hardware random number generators. Careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers that are sufficiently random to suit cryptographic use. In this underlying research, each moment in life is considered unique in itself. The random key is unique for the given moment, generated by the user whenever he or she needs random keys in practical secure communication. The ambience of a high fidelity digital image is tested for its randomness according to the NIST Statistical Test Suite. A recommendation on generating simple random cryptographic keys live at 4 megabits per second is reported.

  11. Entropy generation method to quantify thermal comfort.

    PubMed

    Boregowda, S C; Tiwari, S N; Chaturvedi, S K

    2001-12-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression based Predicted Mean Vote (PMV) values; the PMV values are generated by using the corresponding air temperatures and vapor pressures from the computer simulation in the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One of the practical applications of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

  12. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression based Predicted Mean Vote (PMV) values; the PMV values are generated by using the corresponding air temperatures and vapor pressures from the computer simulation in the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One of the practical applications of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

  13. Noise and complexity in human postural control: interpreting the different estimations of entropy.

    PubMed

    Rhea, Christopher K; Silver, Tobin A; Hong, S Lee; Ryu, Joong Hyun; Studenka, Breanna E; Hughes, Charmayne M L; Haddad, Jeffrey M

    2011-03-17

    Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses.
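
    For concreteness, a compact (unoptimized) Sample Entropy routine of the usual form used in such studies, assuming a one-dimensional series; the embedding dimension and tolerance defaults are common choices, not necessarily those of the paper.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Sample Entropy: negative log of the conditional probability that
        templates matching for m points (within tolerance r, Chebyshev distance)
        also match for m + 1 points; self-matches are excluded."""
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()
        def match_count(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            count = 0
            for i in range(len(templates) - 1):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += int(np.sum(dist <= r))
            return count
        b = match_count(m)
        a = match_count(m + 1)
        return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")

    # Example: white noise yields a higher SampEn than a slowly varying, more regular signal.
    rng = np.random.default_rng(0)
    noise = rng.normal(size=1000)
    slow = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * rng.normal(size=1000)
    print(sample_entropy(noise), sample_entropy(slow))
    ```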

  14. Generating intrinsically disordered protein conformational ensembles from a Markov chain

    NASA Astrophysics Data System (ADS)

    Cukier, Robert I.

    2018-03-01

    Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI, which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies: MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without very large entropy that might lead to a too high entropy penalty.
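
    The following is a minimal Python sketch of the kind of dichotomic (two-state) Markov chain described above, computing the stationary singlet probabilities, the neighbor-pair mutual information, and one realization of the chain; the transition probabilities are assumed values, not those fitted to any protein.

        import numpy as np

        # Assumed two-state transition matrix: P[i, j] = Pr(next = j | current = i).
        P = np.array([[0.7, 0.3],
                      [0.4, 0.6]])

        # Stationary (singlet) distribution: left eigenvector of P with eigenvalue 1.
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmax(np.real(w))])
        pi = pi / pi.sum()

        # Pair probabilities of neighboring residues and their mutual information.
        pair = pi[:, None] * P                      # p(i, j) = pi_i * P_ij
        mi = np.sum(pair * np.log(pair / (pi[:, None] * pi[None, :])))
        print("stationary:", pi, "pair MI (nats):", mi)

        # Generate one realization of the chain (a sequence of rotamer states).
        rng = np.random.default_rng(1)
        N = 20
        states = [rng.choice(2, p=pi)]
        for _ in range(N - 1):
            states.append(rng.choice(2, p=P[states[-1]]))
        print("realization:", states)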

  15. The viscosity to entropy ratio: From string theory motivated bounds to warm dense matter

    DOE PAGES

    Faussurier, G.; Libby, S. B.; Silvestrelli, P. L.

    2014-07-04

    Here, we study the ratio of viscosity to entropy density in Yukawa one-component plasmas as a function of coupling parameter at fixed screening, and in realistic warm dense matter models as a function of temperature at fixed density. In these two situations, the ratio is minimized for values of the coupling parameters that depend on screening, and for temperatures that in turn depend on density and material. In this context, we also examine Rosenfeld arguments relating transport coefficients to excess reduced entropy for Yukawa one-component plasmas. For these cases we show that this ratio is always above the lower-bound conjecture derived from string theory ideas.

  16. Entanglement entropy of AdS5 × S5 with massless flavors at nonzero temperature

    NASA Astrophysics Data System (ADS)

    Hu, Sen; Wu, Guozhen

    2018-03-01

    We consider backreacted AdS5 × S5 coupled with Nf massless flavors introduced by D7-branes at nonzero temperature. The backreacted geometry is in the Veneziano limit. The temperature of this system is related to the event horizon at rh. Dividing one of the spatial directions into a line segment of length l, we calculate the holographic entanglement entropy (HEE) between the two subspaces. We study the behavior near the event horizon and find that a confinement/deconfinement phase transition occurs near the horizon, since the difference between the entanglement entropy of the connected minimal surface and that of the disconnected one changes sign.

  17. Least action and entropy considerations of self-organization in Benard cells

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Iannacchione, Germano

    We study self-organization in complex systems using first principles in physics. Our approach involves the principle of least action and the second law of thermodynamics. In far-from-equilibrium systems, energy gradients cause internal ordering that facilitates the dissipation of energy to the environment. This internal ordering decreases the internal entropy in order to obey the principle of least action, minimizing the product of time and energy for transport through the system. We consider the connection between action and entropy decrease inside Benard cells in order to derive some general features of self-organization. We develop a mathematical treatment of this coupling and compare it to results from experiments and simulations.

  18. Numerical study of magnetic field on mixed convection and entropy generation of nanofluid in a trapezoidal enclosure

    NASA Astrophysics Data System (ADS)

    Aghaei, Alireza; Khorasanizadeh, Hossein; Sheikhzadeh, Ghanbarali; Abbaszadeh, Mahmoud

    2016-04-01

    Flow under the influence of a magnetic field is encountered in the cooling of electronic devices and voltage transformers, in nuclear reactors, in biochemistry, and in geophysical phenomena. In this study, the effects of a magnetic field on the flow field, heat transfer and entropy generation of Cu-water nanofluid mixed convection in a trapezoidal enclosure have been investigated. The top lid is cold and moves toward the right or left, the bottom wall is hot, and the side walls are insulated, with angles from the horizontal of 15°, 30°, 45° and 60°. Simulations have been carried out for a constant Grashof number of 10^4, Reynolds numbers of 30, 100, 300 and 1000, Hartmann numbers of 25, 50, 75 and 100, and nanoparticle volume fractions from zero up to 0.04. The finite volume method and the SIMPLER algorithm have been utilized to solve the governing equations numerically. The results showed that imposing and strengthening the magnetic field weakens the nanofluid convection and the strength of the flow, so the flow tends toward natural convection and finally toward pure conduction. For this reason, for all of the considered Reynolds numbers and volume fractions, increasing the Hartmann number decreases the average Nusselt number. Furthermore, for any case with constant Reynolds and Hartmann numbers, increasing the volume fraction of nanoparticles decreases the maximum stream function. For all of the studied cases, entropy generation due to friction is negligible and the total entropy generation is mainly due to irreversibility associated with heat transfer; the variation of the total entropy generation with Hartmann number is similar to that of the average Nusselt number. A change in lid movement direction changes the average Nusselt number and total entropy generation at a Reynolds number of 30, but has a negligible effect at a Reynolds number of 1000.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattari, Sulimon, E-mail: ssattari2@ucmerced.edu; Chen, Qianting, E-mail: qchen2@ucmerced.edu; Mitchell, Kevin A., E-mail: kmitchell@ucmerced.edu

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or “ghost,” rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  20. Dynamic approximate entropy electroanatomic maps detect rotors in a simulated atrial fibrillation model.

    PubMed

    Ugarte, Juan P; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping.

  1. Music viewed by its entropy content: A novel window for comparative analysis.

    PubMed

    Febres, Gerardo; Jaffe, Klaus

    2017-01-01

    Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' capture characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.
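
    As a simplified illustration of entropy computed from symbol frequency profiles, the sketch below evaluates the first-order Shannon entropy of two toy symbol sequences; extracting the Fundamental Scale symbols from an actual polyphonic file is not reproduced here, and the sequences are assumptions.

        from collections import Counter
        from math import log2

        def shannon_entropy(symbols):
            """First-order (Shannon) entropy, in bits per symbol, of a symbol sequence."""
            counts = Counter(symbols)
            total = sum(counts.values())
            return -sum((c / total) * log2(c / total) for c in counts.values())

        # Toy "pieces" encoded as note-name strings; real use would first extract the
        # Fundamental Scale symbols from the polyphonic file.
        repetitive = list("CCGGAAG" * 40)
        varied = list("CDEFGABcdefgab" * 20)
        print(shannon_entropy(repetitive), shannon_entropy(varied))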

  2. The Dominant Folding Route Minimizes Backbone Distortion in SH3

    PubMed Central

    Lammert, Heiko; Noel, Jeffrey K.; Onuchic, José N.

    2012-01-01

    Energetic frustration in protein folding is minimized by evolution to create a smooth and robust energy landscape. As a result the geometry of the native structure provides key constraints that shape protein folding mechanisms. Chain connectivity in particular has been identified as an essential component for realistic behavior of protein folding models. We study the quantitative balance of energetic and geometrical influences on the folding of SH3 in a structure-based model with minimal energetic frustration. A decomposition of the two-dimensional free energy landscape for the folding reaction into relevant energy and entropy contributions reveals that the entropy of the chain is not responsible for the folding mechanism. Instead the preferred folding route through the transition state arises from a cooperative energetic effect. Off-pathway structures are penalized by excess distortion in local backbone configurations and contact pair distances. This energy cost is a new ingredient in the malleable balance of interactions that controls the choice of routes during protein folding. PMID:23166485

  3. Classical many-particle systems with unique disordered ground states

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Stillinger, F. H.; Torquato, S.

    2017-10-01

    Classical ground states (global energy-minimizing configurations) of many-particle systems are typically unique crystalline structures, implying zero enumeration entropy of distinct patterns (aside from trivial symmetry operations). By contrast, the few previously known disordered classical ground states of many-particle systems are all high-entropy (highly degenerate) states. Here we show computationally that our recently proposed "perfect-glass" many-particle model [Sci. Rep. 6, 36963 (2016), 10.1038/srep36963] possesses disordered classical ground states with zero entropy: a highly counterintuitive situation. For all of the system sizes, parameters, and space dimensions that we have numerically investigated, the disordered ground states are unique such that they can always be superposed onto each other or their mirror image. At low energies, the density of states obtained from simulations matches that calculated from the harmonic approximation near a single ground state, further confirming ground-state uniqueness. Our discovery provides singular examples in which entropy and disorder are at odds with one another. The zero-entropy ground states provide a unique perspective on the celebrated Kauzmann entropy crisis, in which the extrapolated entropy of a supercooled liquid drops below that of the crystal. We expect our disordered unique patterns to be of value in fields beyond glass physics, including applications in cryptography as pseudorandom functions with tunable computational complexity.

  4. Experimental heat capacities, excess entropies, and magnetic properties of bulk and nano Fe3O4-Co3O4 and Fe3O4-Mn3O4 spinel solid solutions

    NASA Astrophysics Data System (ADS)

    Schliesser, Jacob M.; Huang, Baiyu; Sahu, Sulata K.; Asplund, Megan; Navrotsky, Alexandra; Woodfield, Brian F.

    2018-03-01

    We have measured the heat capacities of several well-characterized bulk and nanophase Fe3O4-Co3O4 and Fe3O4-Mn3O4 spinel solid solution samples from which magnetic properties of transitions and third-law entropies have been determined. The magnetic transitions show several features common to effects of particle and magnetic domain sizes. From the standard molar entropies, excess entropies of mixing have been generated for these solid solutions and compared with configurational entropies determined previously by assuming appropriate cation and valence distributions. The vibrational and magnetic excess entropies for bulk materials are comparable in magnitude to the respective configurational entropies indicating that excess entropies of mixing must be included when analyzing entropies of mixing. The excess entropies for nanophase materials are even larger than the configurational entropies. Changes in valence, cation distribution, bonding and microstructure between the mixing ions are the likely sources of the positive excess entropies of mixing.

  5. Mathematical model for thermal and entropy analysis of thermal solar collectors by using Maxwell nanofluids with slip conditions, thermal radiation and variable thermal conductivity

    NASA Astrophysics Data System (ADS)

    Aziz, Asim; Jamshed, Wasim; Aziz, Taha

    2018-04-01

    In the present research a simplified mathematical model for solar thermal collectors is considered in the form of a non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid along with slip and convective boundary conditions, and a comprehensive analysis of entropy generation in the system is also carried out. The effects of thermal radiation and variable thermal conductivity are also included in the present model. The mathematical formulation is carried out through a boundary layer approach and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, skin friction coefficient and Nusselt number. The discussion concludes with the effect of the various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and the rate of heat transfer at the boundary.

  6. Tsirelson's bound from a generalized data processing inequality

    NASA Astrophysics Data System (ADS)

    Dahlsten, Oscar C. O.; Lercher, Daniel; Renner, Renato

    2012-06-01

    The strength of quantum correlations is bounded from above by Tsirelson's bound. We establish a connection between this bound and the fact that correlations between two systems cannot increase under local operations, a property known as the data processing inequality (DPI). More specifically, we consider arbitrary convex probabilistic theories. These can be equipped with an entropy measure that naturally generalizes the von Neumann entropy, as shown recently in Short and Wehner (2010 New J. Phys. 12 033023) and Barnum et al (2010 New J. Phys. 12 033024). We prove that if the DPI holds with respect to this generalized entropy measure then the underlying theory necessarily respects Tsirelson's bound. We, moreover, generalize this statement to any entropy measure satisfying certain minimal requirements. A consequence of our result is that not all the entropic relations used for deriving Tsirelson's bound via information causality in Pawlowski et al (2009 Nature 461 1101-4) are necessary.

  7. Quantum thermodynamics and quantum entanglement entropies in an expanding universe

    NASA Astrophysics Data System (ADS)

    Farahmand, Mehrnoosh; Mohammadzadeh, Hosein; Mehri-Dehnavi, Hossein

    2017-05-01

    We investigate an asymptotically spatially flat Robertson-Walker space-time from two different perspectives. First, using von Neumann entropy, we evaluate the entanglement generation due to the encoded information in space-time. Then, we work out the entropy of particle creation based on the quantum thermodynamics of the scalar field on the underlying space-time. We show that the general behavior of both entropies is the same. Therefore, the entanglement can be applied to the customary quantum thermodynamics of the universe. Also, using these entropies, we can recover some information about the parameters of space-time.

  8. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co... Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and... Keywords: chemical sensing, information theory, spectral data, information entropy, information divergence, mass spectrometry, infrared spectroscopy, multisensor.

  9. Campbell's Rule for Estimating Entropy Changes

    ERIC Educational Resources Information Center

    Jensen, William B.

    2004-01-01

    Campbell's rule for estimating entropy changes is discussed in relation to an earlier article by Norman Craig, where it was proposed that the approximate value of the entropy of reaction was related to net moles of gas consumed or generated. It was seen that the average for Campbell's data set was lower than that for Craig's data set and…

  10. Entropy production and optimization of geothermal power plants

    NASA Astrophysics Data System (ADS)

    Michaelides, Efstathios E.

    2012-09-01

    Geothermal power plants are currently producing reliable and low-cost, base load electricity. Three basic types of geothermal power plants are currently in operation: single-flashing, dual-flashing, and binary power plants. Typically, the single-flashing and dual-flashing geothermal power plants utilize geothermal water (brine) at temperatures in the range of 550-430 K. Binary units utilize geothermal resources at lower temperatures, typically 450-380 K. The entropy production in the various components of the three types of geothermal power plants determines the efficiency of the plants. It is axiomatic that a lower entropy production would improve significantly the energy utilization factor of the corresponding power plant. For this reason, the entropy production in the major components of the three types of geothermal power plants has been calculated. It was observed that binary power plants generate the lowest amount of entropy and, thus, convert the highest rate of geothermal energy into mechanical energy. The single-flashing units generate the highest amount of entropy, primarily because they re-inject fluid at relatively high temperature. The calculations for entropy production provide information on the equipment where the highest irreversibilities occur, and may be used to optimize the design of geothermal processes in future geothermal power plants and thermal cycles used for the harnessing of geothermal energy.

  11. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    NASA Astrophysics Data System (ADS)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximization of KSE is analytical and in general easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum mixing time dynamics. It could be of interest in computer science and statistical physics, for computations that use random walks on graphs that can be represented as Markov chains.
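
    A minimal Python sketch of the two quantities being linked, for an assumed three-state Markov chain: the Kolmogorov-Sinai entropy of the chain and a rough mixing-time scale taken from the second-largest eigenvalue modulus.

        import numpy as np

        # Assumed 3-state transition matrix (rows sum to 1).
        P = np.array([[0.80, 0.15, 0.05],
                      [0.10, 0.80, 0.10],
                      [0.05, 0.15, 0.80]])

        # Stationary distribution.
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmax(np.real(w))])
        pi /= pi.sum()

        # Kolmogorov-Sinai entropy of the chain: h = -sum_i pi_i sum_j P_ij ln P_ij.
        with np.errstate(divide="ignore", invalid="ignore"):
            logP = np.where(P > 0, np.log(P), 0.0)
        h_ks = -np.sum(pi[:, None] * P * logP)

        # Rough mixing-time scale from the second-largest eigenvalue modulus (SLEM).
        eigvals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
        t_relax = 1.0 / (1.0 - eigvals[1])   # relaxation-time estimate, up to log factors
        print("KS entropy (nats):", h_ks, "relaxation time:", t_relax)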

  12. Dynamic Approximate Entropy Electroanatomic Maps Detect Rotors in a Simulated Atrial Fibrillation Model

    PubMed Central

    Ugarte, Juan P.; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping. PMID:25489858

  13. Entropy generation in a second grade magnetohydrodynamic nanofluid flow over a convectively heated stretching sheet with nonlinear thermal radiation and viscous dissipation

    NASA Astrophysics Data System (ADS)

    Sithole, Hloniphile; Mondal, Hiranmoy; Sibanda, Precious

    2018-06-01

    This study addresses entropy generation in magnetohydrodynamic flow of a second grade nanofluid over a convectively heated stretching sheet with nonlinear thermal radiation and viscous dissipation. The second grade fluid is assumed to be electrically conducting and is permeated by an applied non-uniform magnetic field. We further consider the impact of homogeneous-heterogeneous reactions and a convective boundary condition on the fluid properties and the Nusselt number. The mathematical equations are solved using the spectral local linearization method. Computations for the skin-friction coefficient and local Nusselt number are carried out and displayed in a table. It is observed that the effect of the thermophoresis parameter is to increase the temperature distribution throughout the boundary layer. The entropy generation is enhanced by larger magnetic parameters and increasing Reynolds number. The aim of this manuscript is to draw attention to entropy generation analysis of heat and fluid flow in second grade nanofluids in order to improve system performance. Also, the fluid velocity and temperature in the boundary layer region rise significantly with increasing values of the second grade nanofluid parameter.

  14. Operational safety assessment of turbo generators with wavelet Rényi entropy from sensor-dependent vibration signals.

    PubMed

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-04-16

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects time-varying operational characteristic of individual machinery. Derived from the sensor-dependent signals' wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied in a 50 MW turbo generator, whereupon it is proved to be reasonable and effective for operation and maintenance.
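
    As a hedged sketch of the entropy step of this assessment, the Python snippet below computes a Rényi entropy from a normalized wavelet-packet energy distribution; the energy values are assumptions standing in for the second generation wavelet package decomposition of a real vibration signal.

        import numpy as np

        def renyi_entropy(p, alpha=2.0):
            """Renyi entropy of order alpha for a discrete distribution p (alpha != 1)."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0] / p.sum()
            return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

        # Assumed relative wavelet-packet energies of a vibration signal, one value per
        # frequency band, as would be obtained from the wavelet decomposition.
        energies_healthy = np.array([0.70, 0.15, 0.08, 0.04, 0.02, 0.01])
        energies_faulty = np.array([0.25, 0.20, 0.18, 0.15, 0.12, 0.10])

        print(renyi_entropy(energies_healthy))  # low uncertainty: energy concentrated
        print(renyi_entropy(energies_faulty))   # high uncertainty: energy spread out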

  15. Quantum Liouville theory and BTZ black hole entropy

    NASA Astrophysics Data System (ADS)

    Chen, Yujun

    In this thesis I give an explicit conformal field theory description of (2+1)-dimensional BTZ black hole entropy. In the boundary Liouville field theory I investigate the reducible Verma modules in the elliptic sector, which correspond to certain irreducible representations of the quantum algebra Uq(sl2) ⊙ Uq̂(sl2). I show that there are states that decouple from these reducible Verma modules in a similar fashion to the decoupling of null states in minimal models. Because of the nonstandard form of the Ward identity for the two-point correlation functions in quantum Liouville field theory, these decoupling states have positive-definite norms. The unitary representations built on these decoupling states give the Bekenstein-Hawking entropy of the BTZ black hole.

  16. Entropy production of a Brownian ellipsoid in the overdamped limit.

    PubMed

    Marino, Raffaele; Eichhorn, Ralf; Aurell, Erik

    2016-01-01

    We analyze the translational and rotational motion of an ellipsoidal Brownian particle from the viewpoint of stochastic thermodynamics. The particle's Brownian motion is driven by external forces and torques and takes place in a heterogeneous thermal environment where friction coefficients and (local) temperature depend on space and time. Our analysis of the particle's stochastic thermodynamics is based on the entropy production associated with single particle trajectories. It is motivated by the recent discovery that the overdamped limit of vanishing inertia effects (as compared to viscous friction) produces a so-called "anomalous" contribution to the entropy production, which has no counterpart in the overdamped approximation, when inertia effects are simply discarded. Here we show that rotational Brownian motion in the overdamped limit generates an additional contribution to the "anomalous" entropy. We calculate its specific form by performing a systematic singular perturbation analysis for the generating function of the entropy production. As a side result, we also obtain the (well-known) equations of motion in the overdamped limit. We furthermore investigate the effects of particle shape and give explicit expressions of the "anomalous entropy" for prolate and oblate spheroids and for near-spherical Brownian particles.

  17. Noise and Complexity in Human Postural Control: Interpreting the Different Estimations of Entropy

    PubMed Central

    Rhea, Christopher K.; Silver, Tobin A.; Hong, S. Lee; Ryu, Joong Hyun; Studenka, Breanna E.; Hughes, Charmayne M. L.; Haddad, Jeffrey M.

    2011-01-01

    Background Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. Methods and Findings The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. Conclusions The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses. PMID:21437281

  18. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S.

    1989-05-15

    Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value.

  19. A two-phase copula entropy-based multiobjective optimization approach to hydrometeorological gauge network design

    NASA Astrophysics Data System (ADS)

    Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin

    2017-12-01

    Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibration and verification of hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures, including the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
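
    To illustrate one ingredient of this approach, here is a minimal Python sketch of a copula-based mutual information estimate between two gauge series, using a Gaussian-copula simplification rather than the paper's nonparametric copula entropy estimator; the synthetic series are assumptions.

        import numpy as np
        from scipy.stats import norm, rankdata

        def gaussian_copula_mi(x, y):
            """Mutual information (nats) under a Gaussian-copula assumption:
            rank-transform to normal scores, then MI = -0.5 * ln(1 - rho^2)."""
            n = len(x)
            u = rankdata(x) / (n + 1.0)          # empirical copula coordinates in (0, 1)
            v = rankdata(y) / (n + 1.0)
            zx, zy = norm.ppf(u), norm.ppf(v)    # normal scores
            rho = np.corrcoef(zx, zy)[0, 1]
            return -0.5 * np.log(1.0 - rho ** 2)

        # Two synthetic rainfall-like series sharing a common signal.
        rng = np.random.default_rng(42)
        common = rng.gamma(2.0, 1.0, size=500)
        x = common + rng.normal(0, 0.5, 500)
        y = common + rng.normal(0, 0.5, 500)
        print(gaussian_copula_mi(x, y))                         # redundant gauges: large MI
        print(gaussian_copula_mi(x, rng.gamma(2.0, 1.0, 500)))  # independent gauges: near 0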

  20. Ontologies of life: From thermodynamics to teleonomics. Comment on "Answering Schrödinger's question: A free-energy formulation" by Maxwell James Désormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Kirmayer, Laurence J.

    2018-03-01

    In a far-reaching essay, Ramstead and colleagues [1] offer an answer to Schrödinger's question "What is life?" [2] framed in terms of a thermodynamic/information-theoretic free energy principle. In short, "all biological systems instantiate a hierarchical generative model of the world that implicitly minimizes its internal entropy by minimizing free energy" [1]. This model generates dynamic stability, that is, a recurrent set of states that constitute a dynamic attractor. This aspect of their answer has much in common with earlier thermodynamic approaches, like that of Prigogine [3], and with the metabolic self-organization central to Maturana and Varela's notion of autopoiesis [4]. It contrasts with explanations of life that emphasize the mechanics of self-replication [5] or autocatalysis [6,7]. In this approach, there is something gained and something lost. Gained is an explanation and corresponding formalism of great generality. Lost (or at least obscured) is a way to understand the "teleonomics" [8], goal-directedness, purposiveness, or agency of living systems, arguably precisely what makes us ascribe the quality of "being alive" to an organism. Free energy minimization may be a necessary condition for life, but it is not sufficient to characterize its goals, which vary widely and, at least at the level of individual organisms or populations, clearly can run counter to this principle for long stretches of time.

  1. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography.

    PubMed

    Vassilev, Apostol; Staples, Robert

    2016-09-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised.

  2. Entropy generation, particle creation, and quantum field theory in a cosmological spacetime: When do number and entropy increase\\?

    NASA Astrophysics Data System (ADS)

    Kandrup, Henry E.

    1988-06-01

    This paper reexamines the statistical quantum field theory of a free, minimally coupled, real scalar field Φ in a statically bounded, classical Friedmann cosmology, where the time-dependent scale factor Ω(t) tends to constant values Ω1 and Ω2 for t < t1 and t > t2. The principal objective is to investigate the intuition that "entropy" S correlates with the average particle number ⟨N⟩, so that increases in ⟨N⟩ induced by parametric amplification manifest a one-to-one connection with increases in S. The definition of particle number Nk becomes unambiguous for t < t1 and t > t2, and ⟨Nk(t2)⟩ - ⟨Nk(t1)⟩ is guaranteed generically to be positive only for special initial data which, in a number representation, are characterized by "random phases" in the sense that any relative phase for the projection of ρ(t1) into two different number eigenstates is "random" or "unobservable physically," and averaged over in a density matrix. More importantly for the notion of entropy, random-phase initial data also guarantee an increase in the spread of P({Nk}), so that, e.g., the sum of the variances Δ²N±k(t2) exceeds the initial Δ²N±k(t1). It is this increasing spread in P, rather than the growth in average numbers per se, which suggests that, for initial data manifesting random phases, SN(t2) > SN(t1), a result established rigorously in the limits of strong and weak particle creation.

  3. Entropy Generation/Availability Energy Loss Analysis Inside MIT Gas Spring and "Two Space" Test Rigs

    NASA Technical Reports Server (NTRS)

    Ebiana, Asuquo B.; Savadekar, Rupesh T.; Patel, Kaushal V.

    2006-01-01

    The results of the entropy generation and availability energy loss analysis under conditions of oscillating pressure and oscillating helium gas flow in two Massachusetts Institute of Technology (MIT) test rigs, piston-cylinder and piston-cylinder-heat exchanger, are presented. Two solution domains, the gas spring (single-space) in the piston-cylinder test rig and the gas spring + heat exchanger (two-space) in the piston-cylinder-heat exchanger test rig, are of interest. Sage and CFD-ACE+ commercial numerical codes are used to obtain 1-D and 2-D computer models, respectively, of each of the two solution domains and to simulate the oscillating gas flow and heat transfer effects in these domains. Second law analysis is used to characterize the entropy generation and availability energy losses inside the two solution domains. Internal and external entropy generation and availability energy loss results predicted by Sage and CFD-ACE+ are compared. Thermodynamic loss analysis of simple systems such as the MIT test rigs is often useful for understanding some important features of complex pattern-forming processes in more complex systems like the Stirling engine. This study is aimed at improving numerical codes for the prediction of thermodynamic losses via the development of a loss post-processor. The incorporation of loss post-processors in Stirling engine numerical codes will facilitate Stirling engine performance optimization. Loss analysis using entropy-generation rates due to heat and fluid flow is a relatively new technique for assessing component performance. It offers a deep insight into the flow phenomena, allows a more exact calculation of losses than is possible with traditional means involving the application of loss correlations, and provides an effective tool for improving component and overall system performance.

  4. A computational study of entropy generation in magnetohydrodynamic flow and heat transfer over an unsteady stretching permeable sheet

    NASA Astrophysics Data System (ADS)

    Saeed Butt, Adnan; Ali, Asif

    2014-01-01

    The present article aims to investigate the entropy effects in magnetohydrodynamic flow and heat transfer over an unsteady permeable stretching surface. The time-dependent partial differential equations are converted into non-linear ordinary differential equations by suitable similarity transformations. The solutions of these equations are computed analytically by the Homotopy Analysis Method (HAM) and then numerically with a MATLAB built-in routine. Comparison of the obtained results is made with the existing literature under limiting cases to validate our study. The effects of the unsteadiness parameter, magnetic field parameter, suction/injection parameter, Prandtl number, group parameter and Reynolds number on the flow and heat transfer characteristics are examined and analysed with the aid of graphs and tables. Moreover, the effects of these parameters on the entropy generation number and Bejan number are also shown graphically. It is found that unsteadiness and the presence of a magnetic field augment the entropy production.
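
    As a hedged illustration of the entropy generation number and Bejan number mentioned above, the sketch below evaluates the standard local volumetric entropy generation for convective flow, split into thermal and frictional parts (the magnetic contribution present in the MHD problem is omitted); all numerical values are assumptions.

        def local_entropy_generation(dT_dy, du_dy, T, k=0.6, mu=1.0e-3):
            """Local volumetric entropy generation rate [W/(m^3 K)] for convective flow:
            thermal part k*(dT/dy)^2 / T^2 plus frictional part mu*(du/dy)^2 / T."""
            s_thermal = k * dT_dy ** 2 / T ** 2
            s_friction = mu * du_dy ** 2 / T
            return s_thermal, s_friction

        # Assumed near-wall gradients for a heated stretching sheet in water.
        s_th, s_fr = local_entropy_generation(dT_dy=2.0e3, du_dy=5.0e2, T=310.0)
        s_total = s_th + s_fr
        bejan = s_th / s_total    # Be close to 1: heat-transfer irreversibility dominates
        print(s_total, bejan)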

  5. Double strand breaks may be a missing link between entropy and aging.

    PubMed

    Lenart, Peter; Bienertová-Vašků, Julie

    2016-07-01

    It has been previously suggested that an increase in entropy production leads to aging. However, the mechanisms linking increased entropy production in living mass to aging are currently unclear. Even though entropy cannot be easily associated with any specific molecular damage, the increase of entropy in structural mass may be connected with heat stress, which is known to generate double strand breaks. Double strand breaks, which are in turn known to play an important role in process of aging, are thus connected to both aging and an increase of entropy. In view of these associations, we propose a new model where the increase of entropy leads to the formation of double strand breaks, resulting in an aging phenotype. This not only offers a new perspective on aging research and facilitates experimental validation, but could also serve as a useful explanatory tool. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Noisy EEG signals classification based on entropy metrics. Performance assessment using first and second generation statistics.

    PubMed

    Cuesta-Frau, David; Miró-Martínez, Pau; Jordán Núñez, Jorge; Oltra-Crespo, Sandra; Molina Picó, Antonio

    2017-08-01

    This paper evaluates the performance of first generation entropy metrics, represented by the well known and widely used Approximate Entropy (ApEn) and Sample Entropy (SampEn) metrics, and of what can be considered an evolution of these, Fuzzy Entropy (FuzzyEn), in the Electroencephalogram (EEG) signal classification context. The study uses the commonest artifacts found in real EEGs, such as white noise, and muscular, cardiac, and ocular artifacts. Using two different sets of publicly available EEG records, and a realistic range of amplitudes for interfering artifacts, this work optimises these metrics and assesses their robustness against artifacts in terms of class segmentation probability. The results show that the qualitative behaviour of the two datasets is similar, with SampEn and FuzzyEn performing best, and that noise and muscular artifacts are the most confounding factors. On the contrary, there is a wide variability with regard to initialization parameters. The poor performance achieved by ApEn suggests that this metric should not be used in these contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Determination of LEDs degradation with entropy generation rate

    NASA Astrophysics Data System (ADS)

    Cuadras, Angel; Yao, Jiaqiang; Quilez, Marcos

    2017-10-01

    We propose a method to assess the degradation and aging of light emitting diodes (LEDs) based on the irreversible entropy generation rate. We degraded several LEDs and monitored their entropy generation rate (Ṡ) in accelerated tests. We compared the thermoelectrical results with the evolution of optical light emission during degradation. We find a good relationship between aging and Ṡ(t), because Ṡ is related to both device parameters and optical performance. We propose a threshold of Ṡ(t) as a reliable damage indicator of LED end-of-life that can avoid the need to perform optical measurements to assess optical aging. The method goes beyond the typical statistical laws for lifetime prediction provided by manufacturers. We tested different LED colors and electrical stresses to validate the electrical LED model and we analyzed the degradation mechanisms of the devices.
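
    A minimal sketch of the underlying idea, under the simplifying assumption that the irreversible entropy generation rate can be approximated by the electrically dissipated power (input electrical power minus emitted optical power) divided by the junction temperature; the readings and the threshold are illustrative assumptions, not the paper's values.

        def entropy_generation_rate(voltage, current, optical_power, junction_temp_K):
            """Approximate irreversible entropy generation rate [W/K] of an LED."""
            dissipated = voltage * current - optical_power   # heat released in the device
            return dissipated / junction_temp_K

        # Illustrative readings over an accelerated-aging test (hours, V, A, W_optical, K).
        log = [(0,    3.0, 0.35, 0.40, 350.0),
               (500,  3.1, 0.35, 0.30, 365.0),
               (1000, 3.2, 0.35, 0.18, 380.0)]

        THRESHOLD = 2.2e-3   # assumed end-of-life damage indicator [W/K]
        for hours, v, i, p_opt, t_j in log:
            s_dot = entropy_generation_rate(v, i, p_opt, t_j)
            print(hours, round(s_dot, 4), "degraded" if s_dot > THRESHOLD else "ok")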

  8. Relative entropy and optimization-driven coarse-graining methods in VOTCA

    DOE PAGES

    Mashayak, S. Y.; Jochum, Mara N.; Koschke, Konstantin; ...

    2015-07-20

    We discuss recent advances of the VOTCA package for systematic coarse-graining. Two methods have been implemented, namely the downhill simplex optimization and the relative entropy minimization. We illustrate the new methods by coarse-graining SPC/E bulk water and more complex water-methanol mixture systems. The CG potentials obtained from both methods are then evaluated by comparing the pair distributions from the coarse-grained to the reference atomistic simulations. We have also added a parallel analysis framework to improve the computational efficiency of the coarse-graining process.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mintert, Florian; Zyczkowski, Karol; Uniwersytet Jagiellonski, Instytut Fizyki im. M. Smoluchowskiego, ul. Reymonta 4, 30-059 Cracow

    We propose to quantify the entanglement of pure states of NxN bipartite quantum systems by defining its Husimi distribution with respect to SU(N)xSU(N) coherent states. The Wehrl entropy is minimal if and only if the analyzed pure state is separable. The excess of the Wehrl entropy is shown to be equal to the subentropy of the mixed state obtained by partial trace of the bipartite pure state. This quantity, as well as the generalized (Renyi) subentropies, are proved to be Schur concave, so they are entanglement monotones and may be used as alternative measures of entanglement.

  10. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    PubMed

    Haseli, Y

    2016-05-01

    The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines in the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, maximum thermal efficiency, and maximum power may become equivalent under the condition of fixed heat input.
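
    The following Python sketch of a Curzon-Ahlborn-type endoreversible engine (finite thermal conductances to the reservoirs, reversible inner cycle) sweeps the hot-side working temperature and checks that the efficiency at maximum power matches the Curzon-Ahlborn value; the reservoir temperatures and conductances are assumed values.

        import numpy as np

        T_h, T_c = 600.0, 300.0      # reservoir temperatures [K] (assumed)
        K_h, K_c = 1.0, 1.0          # hot/cold-side thermal conductances [W/K] (assumed)

        best = (-np.inf, None)
        for T1 in np.linspace(T_c + 1.0, T_h - 1.0, 5000):   # hot-side working temperature
            Q_h = K_h * (T_h - T1)                           # heat drawn from the hot reservoir
            a = Q_h / T1                                     # entropy flux through the reversible core
            if a >= K_c:
                continue                                     # no consistent cold-side temperature
            T2 = K_c * T_c / (K_c - a)                       # from Q_h/T1 = Q_c/T2 and Q_c = K_c*(T2 - T_c)
            Q_c = Q_h * T2 / T1
            power = Q_h - Q_c
            if power > best[0]:
                best = (power, 1.0 - T2 / T1)

        print("efficiency at maximum power:", best[1])
        print("Curzon-Ahlborn prediction  :", 1.0 - np.sqrt(T_c / T_h))

        # With fixed heat input Q_h, the entropy production rate is
        # sigma = Q_h*(1/T_c - 1/T_h) - P/T_c, so minimizing sigma is equivalent to
        # maximizing power and thermal efficiency, as the paper concludes.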

  11. Pattern formation in nonextensive thermodynamics: selection criterion based on the Renyi entropy production.

    PubMed

    Cybulski, Olgierd; Matysiak, Daniel; Babin, Volodymyr; Holyst, Robert

    2005-05-01

    We analyze a system of two different types of Brownian particles confined in a cubic box with periodic boundary conditions. Particles of different types annihilate when they come into close contact. The annihilation rate is matched by the birth rate, thus the total number of each kind of particles is conserved. When in a stationary state, the system is divided by an interface into two subregions, each occupied by one type of particles. All possible stationary states correspond to the Laplacian eigenfunctions. We show that the system evolves towards those stationary distributions of particles which minimize the Renyi entropy production. In all cases, the Renyi entropy production decreases monotonically during the evolution despite the fact that the topology and geometry of the interface exhibit abrupt and violent changes.

  12. MHD nanofluid free convection and entropy generation in porous enclosures with different conductivity ratios

    NASA Astrophysics Data System (ADS)

    Ghasemi, Kasra; Siavashi, Majid

    2017-11-01

    MHD natural convection of Cu-water nanofluid in a square porous enclosure is investigated using a parallel LBM code, considering the temperature dependence of viscosity and viscous dissipation. The effects of nanofluid concentration (φ = 0-0.12), Rayleigh number (Ra = 10^3-10^6), Hartmann number (Ha = 0-20) and porous-fluid thermal conductivity ratio (K∗ = 1-70) on heat transfer and entropy generation are investigated. It is shown that K∗ is a very important parameter: porous media with low K∗ can confine convection effects, but as K∗ increases both conduction and convection effects improve substantially. Also, the magnetic field always has a negative impact on Nu; however, this impact can be controlled by φ and K∗. A magnetic instability has also been observed at Ra = 10^4, where Nu exhibits a sinusoidal variation with Ha. It is shown that, depending on the K∗, Ra and Ha values, the use of nanofluid with porous media to enhance heat transfer can be either beneficial or detrimental. Also, for given K∗, Ra and Ha numbers an optimal φ exists to improve heat transfer. Finally, an entropy generation study is performed and the results indicate that at low and high Ra values the thermal and frictional entropy generation are respectively dominant, while for moderate Ra they have the same order of magnitude.

  13. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography

    PubMed Central

    Vassilev, Apostol; Staples, Robert

    2016-01-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised. PMID:28003687

  14. Elevated BIS and Entropy values after sugammadex or neostigmine: an electroencephalographic or electromyographic phenomenon?

    PubMed

    Aho, A J; Kamata, K; Yli-Hankala, A; Lyytikäinen, L-P; Kulkas, A; Jäntti, V

    2012-04-01

    Sugammadex is designed to antagonize neuromuscular blockade (NMB) induced by rocuronium or vecuronium. In clinical practice, we have noticed a rise in the numerical values of bispectral index (BIS) and Entropy, two electroencephalogram (EEG)-based depth of anesthesia monitors, during the reversal of the NMB with sugammadex. The aim of this prospective, randomized, double-blind study was to test this impression and to compare the effects of sugammadex and neostigmine on the BIS and Entropy values during the reversal of the NMB. Thirty patients undergoing gynecological operations were studied. Patients were anesthetized with target-controlled infusions of propofol and remifentanil, and rocuronium was used to induce NMB. After operation, during light propofol-remifentanil anesthesia, NMB was antagonized with sugammadex or neostigmine. During the following 5 min, the numerical values of BIS, BIS electromyographic (BIS EMG) and Entropy were recorded on a laptop computer, as well as the biosignal recorded by the Entropy strip. The Entropy biosignal was studied off-line in both the time and frequency domains to see if NMB reversal causes changes in the EEG. In some patients, administration of sugammadex or neostigmine caused a significant rise in the numerical values of BIS, BIS EMG and Entropy. This phenomenon was most likely caused by increased electromyographic (EMG) activity. The administration of sugammadex or neostigmine appeared to have only minimal effect on the EEG. The EMG contamination of the EEG causes BIS and Entropy values to rise during reversal of rocuronium-induced NMB in light propofol-remifentanil anesthesia. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.

  15. Local and global approaches to the problem of Poincaré recurrences. Applications in nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.

    2015-07-01

    We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which are achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy both without and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied to solve a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon. We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.

  16. Bit Threads and Holographic Entanglement

    NASA Astrophysics Data System (ADS)

    Freedman, Michael; Headrick, Matthew

    2017-05-01

    The Ryu-Takayanagi (RT) formula relates the entanglement entropy of a region in a holographic theory to the area of a corresponding bulk minimal surface. Using the max flow-min cut principle, a theorem from network theory, we rewrite the RT formula in a way that does not make reference to the minimal surface. Instead, we invoke the notion of a "flow", defined as a divergenceless norm-bounded vector field, or equivalently a set of Planck-thickness "bit threads". The entanglement entropy of a boundary region is given by the maximum flux out of it of any flow, or equivalently the maximum number of bit threads that can emanate from it. The threads thus represent entanglement between points on the boundary, and naturally implement the holographic principle. As we explain, this new picture clarifies several conceptual puzzles surrounding the RT formula. We give flow-based proofs of strong subadditivity and related properties; unlike the ones based on minimal surfaces, these proofs correspond in a transparent manner to the properties' information-theoretic meanings. We also briefly discuss certain technical advantages that the flows offer over minimal surfaces. In a mathematical appendix, we review the max flow-min cut theorem on networks and on Riemannian manifolds, and prove in the network case that the set of max flows varies Lipschitz continuously in the network parameters.

  17. Operational Safety Assessment of Turbo Generators with Wavelet Rényi Entropy from Sensor-Dependent Vibration Signals

    PubMed Central

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-01-01

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflect the time-varying operational characteristics of individual machinery. Derived from the sensor-dependent signals’ wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied to a 50 MW turbo generator, where it proves reasonable and effective for operation and maintenance. PMID:25894934
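
    A minimal sketch of the kind of computation described above (not the authors' implementation): the Rényi entropy of the wavelet-packet energy distribution of a vibration signal. It assumes the PyWavelets package; the wavelet ('db4'), decomposition level and entropy order alpha are arbitrary choices, and the test signal is synthetic.

      import numpy as np
      import pywt

      def wavelet_renyi_entropy(signal, wavelet="db4", level=4, alpha=2.0):
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
          energies = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, order="natural")])
          p = energies / energies.sum()                     # energy distribution over the packets
          p = p[p > 0]
          return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

      # toy vibration-like signal: two harmonics plus noise
      t = np.linspace(0.0, 1.0, 4096)
      x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 180 * t) \
          + 0.1 * np.random.default_rng(0).standard_normal(t.size)
      print(wavelet_renyi_entropy(x))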

  18. Entropy generation across Earth's collisionless bow shock.

    PubMed

    Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A

    2012-02-10

    Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulence.
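
    For readers unfamiliar with the quantity being measured, the sketch below (our own, not the Cluster/Double Star pipeline) evaluates Boltzmann's H-function, H = Integral f ln f d^3v, for a drifting Maxwellian sampled on a velocity grid; the corresponding entropy density is s = -kB*H. The plasma parameters are arbitrary illustrative values.

      import numpy as np

      kB = 1.380649e-23
      n, T, m = 5.0e6, 1.0e5, 1.67262192e-27      # density [m^-3], temperature [K], proton mass [kg]
      vth = np.sqrt(kB * T / m)                    # thermal speed
      drift = 4.0e4                                # bulk drift speed [m/s]

      v = np.linspace(-6 * vth, 6 * vth, 81)
      vx, vy, vz = np.meshgrid(v, v, v, indexing="ij")
      f = n * (m / (2 * np.pi * kB * T)) ** 1.5 \
          * np.exp(-((vx - drift) ** 2 + vy ** 2 + vz ** 2) / (2 * vth ** 2))

      dv3 = (v[1] - v[0]) ** 3
      fs = np.where(f > 0, f, 1.0)                 # guard against log(0)
      H = np.sum(f * np.log(fs)) * dv3             # Boltzmann H-function
      print("entropy density s = -kB*H =", -kB * H, "J K^-1 m^-3")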

  19. Investigation of Heat and Mass Transfer and Irreversibility Phenomena Within a Three-Dimensional Tilted Enclosure for Different Shapes

    NASA Astrophysics Data System (ADS)

    Oueslati, F.; Ben-Beya, B.

    2018-01-01

    Three-dimensional thermosolutal natural convection and entropy generation within an inclined enclosure is investigated in the current study. A numerical method based on the finite volume method and a full multigrid technique is implemented to solve the governing equations. Effects of various parameters, namely, the aspect ratio, buoyancy ratio, and tilt angle on the flow patterns and entropy generation are predicted and discussed.

  20. Analysis of natural convection in nanofluid-filled H-shaped cavity by entropy generation and heatline visualization using lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Rahimi, Alireza; Sepehr, Mohammad; Lariche, Milad Janghorban; Mesbah, Mohammad; Kasaeipoor, Abbas; Malekshah, Emad Hasani

    2018-03-01

    The lattice Boltzmann simulation of natural convection in H-shaped cavity filled with nanofluid is performed. The entropy generation analysis and heatline visualization are employed to analyze the considered problem comprehensively. The produced nanofluid is SiO2-TiO2/Water-EG (60:40) hybrid nanofluid, and the thermal conductivity and dynamic viscosity of used nanofluid are measured experimentally. To use the experimental data of thermal conductivity and dynamic viscosity, two sets of correlations based on temperature for six different solid volume fractions of 0.5, 1, 1.5, 2, 2.5 and 3 vol% are derived. The influences of different governing parameters such as the aspect ratio, solid volume fraction of the nanofluid and Rayleigh number on the fluid flow, temperature field, average/local Nusselt number, total/local entropy generation and heatlines are presented.

  1. Application of exergetic sustainability index to a nano-scale irreversible Brayton cycle operating with ideal Bose and Fermi gasses

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-09-01

    In this study, a nano-scale irreversible Brayton cycle operating with quantum gases, including Bose and Fermi gases, is investigated. Developments in nanotechnology make the study of nano-scale machines, including thermal systems, unavoidable. A thermodynamic analysis of a nano-scale irreversible Brayton cycle operating with Bose and Fermi gases was performed, with particular attention to the exergetic sustainability index. In addition, a thermodynamic analysis involving classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies was conducted. Results are presented numerically and some useful recommendations are given. Important findings include: entropy generation and the exergetic sustainability index are affected most strongly by x for the Bose gas, whereas power output and exergy output are affected most strongly for the Fermi gas. At high-temperature conditions, work output and entropy generation are high compared with other degeneracy conditions.

  2. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through a dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals a clear clustering in both synthetic data and 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.
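
    The following is an illustrative sketch only of the workflow described above: classical MDS applied to a precomputed, entropy-based dissimilarity matrix. The paper's KCSE/PCSE cross-sample-entropy dissimilarities are replaced here by a much cruder stand-in (the absolute difference of per-series permutation entropies), and the toy series, names and parameters are our own.

      import numpy as np
      from math import factorial
      from sklearn.manifold import MDS

      def permutation_entropy(x, order=3):
          counts = {}
          for i in range(len(x) - order + 1):
              pattern = tuple(np.argsort(x[i:i + order]))
              counts[pattern] = counts.get(pattern, 0) + 1
          p = np.array(list(counts.values()), dtype=float)
          p /= p.sum()
          return -np.sum(p * np.log(p)) / np.log(factorial(order))   # normalized to [0, 1]

      rng = np.random.default_rng(0)
      series = {f"noise_{i}": rng.standard_normal(2000) for i in range(3)}
      series.update({f"walk_{i}": np.cumsum(rng.standard_normal(2000)) for i in range(3)})

      names = list(series)
      pe = {k: permutation_entropy(series[k]) for k in names}
      D = np.abs(np.subtract.outer([pe[k] for k in names], [pe[k] for k in names]))   # dissimilarity matrix

      coords = MDS(n_components=3, dissimilarity="precomputed", random_state=0).fit_transform(D)
      for name, xyz in zip(names, coords):
          print(name, np.round(xyz, 3))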

  3. Develop and Test a Solvent Accessible Surface Area-Based Model in Conformational Entropy Calculations

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2012-01-01

    It is of great interest in modern drug design to accurately calculate the free energies of protein-ligand or nucleic acid-ligand binding. MM-PBSA (Molecular Mechanics-Poisson Boltzmann Surface Area) and MM-GBSA (Molecular Mechanics-Generalized Born Surface Area) have gained popularity in this field. For both methods, the conformational entropy, which is usually calculated through normal mode analysis (NMA), is needed to calculate the absolute binding free energies. Unfortunately, NMA is computationally demanding and becomes a bottleneck of the MM-PB/GBSA-NMA methods. In this work, we have developed a fast approach to estimate the conformational entropy based upon solvent accessible surface area calculations. In our approach, the conformational entropy of a molecule, S, can be obtained by summing up the contributions of all atoms, whether they are buried or exposed. Each atom has two types of surface areas, solvent accessible surface area (SAS) and buried SAS (BSAS). The two types of surface areas are weighted to estimate the contribution of an atom to S. Atoms having the same atom type share the same weight, and a general parameter k is applied to balance the contributions of the two types of surface areas. This entropy model was parameterized using a large set of small molecules for which the conformational entropies were calculated at the B3LYP/6-31G* level, taking the solvent effect into account. The weighted solvent accessible surface area (WSAS) model was extensively evaluated in three tests. For convenience, TS, the product of temperature T and conformational entropy S, was calculated in those tests; T was always set to 298.15 K throughout the text. First, good correlations were achieved between WSAS TS and NMA TS for 44 protein or nucleic acid systems sampled with molecular dynamics simulations (10 snapshots were collected for post-entropy calculations): the mean squared correlation coefficient (R2) was 0.56. For the 20 complexes, the TS changes upon binding, TΔS, were also calculated, and the mean R2 was 0.67 between NMA and WSAS. In the second test, TS was calculated for 12 protein decoy sets (each set has 31 conformations) generated by the Rosetta software package. Again, good correlations were achieved for all decoy sets: the mean, maximum and minimum of R2 were 0.73, 0.89 and 0.55, respectively. Finally, binding free energies were calculated for 6 protein systems (the numbers of inhibitors range from 4 to 18) using four scoring functions. Compared to the measured binding free energies, the mean R2 values of the six protein systems were 0.51, 0.47, 0.40 and 0.43 for MM-GBSA-WSAS, MM-GBSA-NMA, MM-PBSA-WSAS and MM-PBSA-NMA, respectively. The mean RMS errors of prediction were 1.19, 1.24, 1.41 and 1.29 kcal/mol for the four scoring functions, respectively. Therefore, the two scoring functions employing WSAS achieved a prediction performance comparable to that of the scoring functions using NMA. It should be emphasized that no minimization was performed prior to the WSAS calculation in the last test. Although WSAS is not as rigorous as physical models such as quasi-harmonic analysis and thermodynamic integration (TI), it is computationally very efficient, as only a surface area calculation is involved and no structural minimization is required. Moreover, WSAS has achieved a performance comparable to normal mode analysis. We expect that this model could find applications in fields like high-throughput screening (HTS), molecular docking and rational protein design, where efficiency is crucial since there are a large number of compounds, docking poses or protein models to be evaluated. A list of acronyms and abbreviations used in this work is provided for quick reference. PMID:22497310
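
    The entropy model above is, in essence, a weighted sum over atoms of their exposed and buried surface areas. The fragment below only illustrates that bookkeeping under assumed numbers; the atom-type weights, the balance parameter k, the areas and the units are placeholders and not the published WSAS parameters.

      # Placeholder atom-type weights and balance parameter (NOT the fitted WSAS parameters).
      k = 0.5
      weights = {"C": 0.010, "N": 0.012, "O": 0.012, "H": 0.006}   # arbitrary units per A^2

      # per-atom records: (atom type, solvent-accessible area, buried area), in A^2
      atoms = [("C", 12.5, 7.5), ("N", 3.0, 12.0), ("O", 8.2, 6.8), ("H", 1.1, 4.9)]

      T = 298.15
      S = sum(weights[t] * (sas + k * bsas) for t, sas, bsas in atoms)
      print("illustrative TS =", T * S, "(arbitrary units)")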

  4. EEG entropy measures indicate decrease of cortical information processing in Disorders of Consciousness.

    PubMed

    Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel

    2016-02-01

    Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness, the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG of patients, which was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The EEG entropy analyses used here were able to differentiate patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Entropy, matter, and cosmology.

    PubMed

    Prigogine, I; Géhéniau, J

    1986-09-01

    The role of irreversible processes corresponding to creation of matter in general relativity is investigated. The use of Landau-Lifshitz pseudotensors together with conformal (Minkowski) coordinates suggests that this creation took place in the early universe at the stage of the variation of the conformal factor. The entropy production in this creation process is calculated. It is shown that these dissipative processes lead to the possibility of cosmological models that start from empty conditions and gradually build up matter and entropy. Gravitational entropy acquires a simple meaning, being associated with the entropy that is necessary to produce matter. This leads to an extension of the third law of thermodynamics, as now the zero point of entropy becomes the space-time structure out of which matter is generated. The theory can be put into a convenient form using a supplementary "C" field in Einstein's field equations. The role of the C field is to express the coupling between gravitation and matter leading to irreversible entropy production.

  6. Wavelet Packet Entropy for Heart Murmurs Classification

    PubMed Central

    Safara, Fatemeh; Doraisamy, Shyamala; Azman, Azreen; Jantan, Azrul; Ranga, Sri

    2012-01-01

    Heart murmurs are the first signs of cardiac valve disorders. Several studies have been conducted in recent years to automatically differentiate normal heart sounds from heart sounds with murmurs using various types of audio features. Entropy was successfully used as a feature to distinguish different heart sounds. In this paper, a new entropy measure was introduced to analyze heart sounds, and the feasibility of using this entropy in the classification of five types of heart sounds and murmurs was shown. This entropy had previously been introduced to analyze mammograms. Four common murmurs were considered, including aortic regurgitation, mitral regurgitation, aortic stenosis, and mitral stenosis. The wavelet packet transform was employed for heart sound analysis, and the entropy was calculated to derive feature vectors. Five classification methods were applied to evaluate the discriminatory power of the generated features. The best results were achieved by BayesNet with 96.94% accuracy. The promising results substantiate the effectiveness of the proposed wavelet packet entropy for heart sounds classification. PMID:23227043

  7. Experimental and analytical investigation of direct and indirect noise generated from non-isentropic boundaries

    NASA Astrophysics Data System (ADS)

    de Domenico, Francesca; Rolland, Erwan; Hochgreb, Simone

    2017-11-01

    Pressure fluctuations in combustors arise either directly from the heat release rate perturbations of the flame (direct noise), or indirectly from the acceleration of entropy, vorticity or compositional perturbations through nozzles or turbine guide vanes (indirect noise). In this work, the second mechanism is experimentally investigated in a simplified rig. Synthetic entropy spots are generated via the Joule effect or helium injection and then accelerated via orifice plates of different area contraction and thickness. The objective of the study is to parametrically analyse the entropy-to-sound conversion in non-isentropic contractions (e.g. with pressure losses), represented by the orifice plates. Acoustic measurements are performed to reconstruct the acoustic and entropic transfer functions of the orifices and compare experimental data with analytical predictions, to investigate the effect of orifice thickness and area ratio on the transfer functions. PIV measurements are performed to study the stretching and dispersion of the entropy waves due to mean flow effects. Secondly, PIV images taken in the jet exiting downstream of the orifices are used to investigate the coupling of the acoustic and entropy fields with the hydrodynamic field. EPSRC, Qualcomm.

  8. Increased temperature and entropy production in cancer: the role of anti-inflammatory drugs.

    PubMed

    Pitt, Michael A

    2015-02-01

    Some cancers have been shown to have a higher temperature than surrounding normal tissue. This higher temperature is due to heat generated internally in the cancer. The higher temperature of cancer (compared to surrounding tissue) enables a thermodynamic analysis to be carried out. Here I show that there is increased entropy production in cancer compared with surrounding tissue. This is termed excess entropy production. The excess entropy production is expressed in terms of heat flow from the cancer to surrounding tissue and enzymic reactions in the cancer and surrounding tissue. The excess entropy production in cancer drives it away from the stationary state that is characterised by minimum entropy production. Treatments that reduce inflammation (and therefore temperature) should drive a cancer towards the stationary state. Anti-inflammatory agents, such as aspirin, other non-steroidal anti-inflammatory drugs, corticosteroids and also thyroxine analogues have been shown (using various criteria) to reduce the progress of cancer.

  9. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix, and as such its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement can be simplified to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This increases the upper limit on the capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.

  10. An Equation for Moist Entropy in a Precipitating and Icy Atmosphere

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Simpson, Joanne; Zeng, Xiping

    2003-01-01

    Moist entropy is nearly conserved in adiabatic motion. It is redistributed rather than created by moist convection. Thus moist entropy and its equation can be used to construct analytical and numerical models for the interaction between tropical convective clouds and large-scale circulations. Hence, an accurate equation of moist entropy is needed for the analysis and modeling of atmospheric convective clouds. On the basis of the consistency between the energy and the entropy equations, a complete equation of moist entropy is derived from the energy equation. The equation expresses explicitly the internal and external sources of moist entropy, including those related to the microphysics of clouds and precipitation. In addition, an accurate formula for the surface flux of moist entropy from the underlying surface into the air above is derived. Because moist entropy deals "easily" with the transition among the three water phases, it will be used as a prognostic variable in the next generation of cloud-resolving models (e.g., a global cloud-resolving model) for low computational noise. The equation derived in this paper is accurate and complete, providing a theoretical basis for using moist entropy as a prognostic variable in the long-term modeling of clouds and large-scale circulations.

  11. Statistical physics inspired energy-efficient coded-modulation for optical communications.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2012-04-15

    Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America

  12. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    PubMed

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristic and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish EEG signals of AD from that of the normal than the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as electrode T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and higher average classification accuracy of 88.1% by support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.

  13. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristic and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish EEG signals of AD from that of the normal than the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as electrode T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and higher average classification accuracy of 88.1% by support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.

  14. On the Application of Information Theory to Sustainability

    EPA Science Inventory

    According to the 2nd Law of Thermodynamics, entropy must be an increasing function of time for the whole universe, system plus surroundings. This gives rise to conjectures regarding the loss of work with entropy generation in general processes. It can be shown that under cond...

  15. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

    Artifacts are disturbances that may occur during signal acquisition and may affect their processing. The aim of this paper is to propose a technique for automatically detecting artifacts from the electroencephalographic (EEG) recordings. In particular, a technique based on both Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
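
    The following sketch illustrates the overall idea of the record above rather than its exact implementation: scikit-learn's FastICA stands in for the ICA stage, a histogram-based quadratic (order-2) Rényi entropy is used, and components whose entropy deviates strongly from the others are flagged. The threshold, the demo data and all names are our own choices.

      import numpy as np
      from sklearn.decomposition import FastICA

      def renyi_entropy(x, bins=64, alpha=2.0):
          counts, _ = np.histogram(x, bins=bins)
          p = counts[counts > 0] / counts.sum()
          return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

      def flag_artifact_components(eeg, n_components=10, z_thresh=2.0):
          """eeg: array of shape (n_samples, n_channels); returns indices of suspect components."""
          sources = FastICA(n_components=n_components, random_state=0).fit_transform(eeg)
          h = np.array([renyi_entropy(s) for s in sources.T])
          z = (h - h.mean()) / h.std()
          return np.where(np.abs(z) > z_thresh)[0]      # components with unusually low/high entropy

      demo = np.random.default_rng(0).standard_normal((5000, 16))   # stand-in for 16-channel EEG
      print(flag_artifact_components(demo))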

  16. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

    A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause for the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  17. On the Application of Information Theory to Regime Changes and Sustainability

    EPA Science Inventory

    According to the 2nd Law of Thermodynamics, entropy must be an increasing function of time for the whole universe, system plus surroundings. This gives rise to conjectures regarding the loss of work with entropy generation in general processes. It can be shown that under cond...

  18. Mixture models with entropy regularization for community detection in networks

    NASA Astrophysics Data System (ADS)

    Chang, Zhenhai; Yin, Xianjun; Jia, Caiyan; Wang, Xiaoyang

    2018-04-01

    Community detection is a key exploratory tool in network analysis and has received much attention in recent years. NMM (Newman's mixture model) is one of the best models for exploring a range of network structures, including community structure, bipartite and core-periphery structures, etc. However, NMM needs to know the number of communities in advance. Therefore, in this study, we have proposed an entropy-regularized mixture model (called EMM), which is capable of inferring the number of communities and identifying the network structure contained in a network simultaneously. In the model, by minimizing the entropy of the mixing coefficients of NMM using an EM (expectation-maximization) solution, small clusters containing little information can be discarded step by step. The empirical study on both synthetic networks and real networks has shown that the proposed model EMM is superior to state-of-the-art methods.

  19. Rényi entropy of the totally asymmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Wood, Anthony J.; Blythe, Richard A.; Evans, Martin R.

    2017-11-01

    The Rényi entropy is a generalisation of the Shannon entropy that is sensitive to the fine details of a probability distribution. We present results for the Rényi entropy of the totally asymmetric exclusion process (TASEP). We calculate explicitly an entropy whereby the squares of configuration probabilities are summed, using the matrix product formalism to map the problem to one involving a six direction lattice walk in the upper quarter plane. We derive the generating function across the whole phase diagram, using an obstinate kernel method. This gives the leading behaviour of the Rényi entropy and corrections in all phases of the TASEP. The leading behaviour is given by the result for a Bernoulli measure and we conjecture that this holds for all Rényi entropies. Within the maximal current phase the correction to the leading behaviour is logarithmic in the system size. Finally, we remark upon a special property of equilibrium systems whereby discontinuities in the Rényi entropy arise away from phase transitions, which we refer to as secondary transitions. We find no such secondary transition for this nonequilibrium system, supporting the notion that these are specific to equilibrium cases.

  20. Thermodynamic constraints on a varying cosmological-constant-like term from the holographic equipartition law with a power-law corrected entropy

    NASA Astrophysics Data System (ADS)

    Komatsu, Nobuyoshi

    2017-11-01

    A power-law corrected entropy based on a quantum entanglement is considered to be a viable black-hole entropy. In this study, as an alternative to Bekenstein-Hawking entropy, a power-law corrected entropy is applied to Padmanabhan's holographic equipartition law to thermodynamically examine an extra driving term in the cosmological equations for a flat Friedmann-Robertson-Walker universe at late times. Deviations from the Bekenstein-Hawking entropy generate an extra driving term (proportional to the αth power of the Hubble parameter, where α is a dimensionless constant for the power-law correction) in the acceleration equation, which can be derived from the holographic equipartition law. Interestingly, the value of the extra driving term in the present model is constrained by the second law of thermodynamics. From the thermodynamic constraint, the order of the driving term is found to be consistent with the order of the cosmological constant measured by observations. In addition, the driving term tends to be constantlike when α is small, i.e., when the deviation from the Bekenstein-Hawking entropy is small.

  1. Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning

    NASA Technical Reports Server (NTRS)

    Fayyad, U.; Irani, K.

    1993-01-01

    Since most real-world applications of classification learning involve continuous-valued attributes, properly addressing the discretization process is an important problem. This paper addresses the use of the entropy minimization heuristic for discretizing the range of a continuous-valued attribute into multiple intervals.
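
    The core of the entropy minimization heuristic is the choice of a cut point that minimizes the class entropy of the induced partition. The sketch below shows that single split only; the recursive application to multiple intervals and the MDL-based stopping criterion of the full method are omitted, and the toy data are our own.

      import numpy as np

      def class_entropy(labels):
          _, counts = np.unique(labels, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p))

      def best_cut_point(values, labels):
          order = np.argsort(values)
          v, y = np.asarray(values)[order], np.asarray(labels)[order]
          best_cut, best_entropy = None, np.inf
          for i in range(1, len(v)):
              if v[i] == v[i - 1]:
                  continue                              # candidate cuts lie between distinct values
              e = (i * class_entropy(y[:i]) + (len(v) - i) * class_entropy(y[i:])) / len(v)
              if e < best_entropy:
                  best_cut, best_entropy = (v[i - 1] + v[i]) / 2.0, e
          return best_cut, best_entropy

      values = [2.1, 2.4, 3.0, 5.5, 6.1, 6.8]
      labels = ["a", "a", "a", "b", "b", "b"]
      print(best_cut_point(values, labels))             # cut near 4.25 with weighted entropy 0.0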

  2. Origin of microbial life: Nano- and molecular events, thermodynamics/entropy, quantum mechanisms and genetic instructions.

    PubMed

    Trevors, J T

    2011-03-01

    Currently, there are no agreed upon mechanisms and supporting evidence for the origin of the first microbial cells on the Earth. However, some hypotheses have been proposed with minimal supporting evidence and experimentation/observations. The approach taken in this article is that life originated at the nano- and molecular levels of biological organization, using quantum mechanic principles that became manifested as classical microbial cell(s), allowing the origin of microbial life on the Earth with a core or minimal, organic, genetic code containing the correct instructions for cell(s) for growth and division, in a micron dimension environment, with a local entropy range conducive to life (present about 4 billion years ago), and obeying the laws of thermodynamics. An integrated approach that explores all encompassing factors necessary for the origin of life, may bring forth plausible hypotheses (and mechanisms) with much needed supporting experimentation and observations for an origin of life theory. Copyright © 2010 Elsevier B.V. All rights reserved.

  3. Phase retrieval from intensity-only data by relative entropy minimization.

    PubMed

    Deming, Ross W

    2007-11-01

    A recursive algorithm, which appears to be new, is presented for estimating the amplitude and phase of a wave field from intensity-only measurements on two or more scan planes at different axial positions. The problem is framed as a nonlinear optimization, in which the angular spectrum of the complex field model is adjusted in order to minimize the relative entropy, or Kullback-Leibler divergence, between the measured and reconstructed intensities. The most common approach to this so-called phase retrieval problem is a variation of the well-known Gerchberg-Saxton algorithm devised by Misell (J. Phys. D6, L6, 1973), which is efficient and extremely simple to implement. The new algorithm has a computational structure that is very similar to Misell's approach, despite the fundamental difference in the optimization criteria used for each. Based upon results from noisy simulated data, the new algorithm appears to be more robust than Misell's approach and to produce better results from low signal-to-noise ratio data. The convergence of the new algorithm is examined.
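
    For context, the sketch below implements the Misell-type two-plane iteration that the abstract uses as its point of comparison, not the relative-entropy algorithm itself: the field estimate is propagated between the two measurement planes with the angular-spectrum method, and at each plane the modelled amplitude is replaced by the measured one while the phase is kept. Parameter names and values are illustrative.

      import numpy as np

      def angular_spectrum_propagate(field, dz, wavelength, dx):
          n = field.shape[0]
          fx = np.fft.fftfreq(n, d=dx)
          FX, FY = np.meshgrid(fx, fx, indexing="ij")
          kz2 = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
          H = np.where(kz2 > 0, np.exp(2j * np.pi * dz * np.sqrt(np.maximum(kz2, 0.0))), 0.0)
          return np.fft.ifft2(np.fft.fft2(field) * H)   # evanescent components suppressed

      def misell(amp1, amp2, dz, wavelength, dx, n_iter=200, seed=0):
          """amp1, amp2: measured amplitudes (square roots of the intensities) on the two scan planes."""
          rng = np.random.default_rng(seed)
          field = amp1 * np.exp(2j * np.pi * rng.random(amp1.shape))   # random initial phase
          for _ in range(n_iter):
              field = angular_spectrum_propagate(field, +dz, wavelength, dx)
              field = amp2 * np.exp(1j * np.angle(field))              # impose plane-2 amplitude
              field = angular_spectrum_propagate(field, -dz, wavelength, dx)
              field = amp1 * np.exp(1j * np.angle(field))              # impose plane-1 amplitude
          return field                                                 # complex field estimate at plane 1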

  4. Driver Fatigue Classification With Independent Component by Entropy Rate Bound Minimization Analysis in an EEG-Based System.

    PubMed

    Chai, Rifai; Naik, Ganesh R; Nguyen, Tuan Nghia; Ling, Sai Ho; Tran, Yvonne; Craig, Ashley; Nguyen, Hung T

    2017-05-01

    This paper presents a two-class electroencephalography-based classification of driver fatigue (fatigue state versus alert state) using data from 43 healthy participants. The system uses independent component analysis by entropy rate bound minimization (ERBM-ICA) for source separation, autoregressive (AR) modeling for feature extraction, and a Bayesian neural network as the classification algorithm. The classification results demonstrate a sensitivity of 89.7%, a specificity of 86.8%, and an accuracy of 88.2%. The combination of ERBM-ICA (source separator), AR (feature extractor), and Bayesian neural network (classifier) provides the best outcome (p < 0.05), with the highest area under the receiver operating characteristic curve (AUC-ROC = 0.93), compared with other methods such as power spectral density as the feature extractor (AUC-ROC = 0.81). The results of this study suggest the method could be used effectively in a countermeasure device for driver fatigue identification and other adverse-event applications.

  5. Finding the quantum thermoelectric with maximal efficiency and minimal entropy production at given power output

    NASA Astrophysics Data System (ADS)

    Whitney, Robert S.

    2015-03-01

    We investigate the nonlinear scattering theory for quantum systems with strong Seebeck and Peltier effects, and consider their use as heat engines and refrigerators with finite power outputs. This paper gives detailed derivations of the results summarized in a previous paper [R. S. Whitney, Phys. Rev. Lett. 112, 130601 (2014), 10.1103/PhysRevLett.112.130601]. It shows how to use the scattering theory to find (i) the quantum thermoelectric with maximum possible power output, and (ii) the quantum thermoelectric with maximum efficiency at given power output. The latter corresponds to a minimal entropy production at that power output. These quantities are of quantum origin since they depend on system size over electronic wavelength, and so have no analog in classical thermodynamics. The maximal efficiency coincides with Carnot efficiency at zero power output, but decreases with increasing power output. This gives a fundamental lower bound on entropy production, which means that reversibility (in the thermodynamic sense) is impossible for finite power output. The suppression of efficiency by (nonlinear) phonon and photon effects is addressed in detail; when these effects are strong, maximum efficiency coincides with maximum power. Finally, we show in particular limits (typically without magnetic fields) that relaxation within the quantum system does not allow the system to exceed the bounds derived for relaxation-free systems, however, a general proof of this remains elusive.

  6. A Locally Optimal Algorithm for Estimating a Generating Partition from an Observed Time Series and Its Application to Anomaly Detection.

    PubMed

    Ghalyan, Najah F; Miller, David J; Ray, Asok

    2018-06-12

    Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and failure application in a polycrystalline alloy material.

  7. Calculation of Cyclodextrin Binding Affinities: Energy, Entropy, and Implications for Drug Design

    PubMed Central

    Chen, Wei; Chang, Chia-En; Gilson, Michael K.

    2004-01-01

    The second generation Mining Minima method yields binding affinities accurate to within 0.8 kcal/mol for the associations of α-, β-, and γ-cyclodextrin with benzene, resorcinol, flurbiprofen, naproxen, and nabumetone. These calculations require hours to a day on a commodity computer. The calculations also indicate that the changes in configurational entropy upon binding oppose association by as much as 24 kcal/mol and result primarily from a narrowing of energy wells in the bound versus the free state, rather than from a drop in the number of distinct low-energy conformations on binding. Also, the configurational entropy is found to vary substantially among the bound conformations of a given cyclodextrin-guest complex. This result suggests that the configurational entropy must be accounted for to reliably rank docked conformations in both host-guest and ligand-protein complexes. In close analogy with the common experimental observation of entropy-enthalpy compensation, the computed entropy changes show a near-linear relationship with the changes in mean potential plus solvation energy. PMID:15339804

  8. Numerical study focusing on the entropy analysis of MHD squeezing flow of a nanofluid model using Cattaneo–Christov theory

    NASA Astrophysics Data System (ADS)

    Akmal, N.; Sagheer, M.; Hussain, S.

    2018-05-01

    The present study gives an account of the heat transfer characteristics of the squeezing flow of a nanofluid between two flat plates with upper plate moving vertically and the lower in the horizontal direction. Tiwari and Das nanofluid model has been utilized to give a comparative analysis of the heat transfer in the Cu-water and Al2O3-water nanofluids with entropy generation. The modeling is carried out with the consideration of Lorentz forces to observe the effect of magnetic field on the flow. The Joule heating effect is included to discuss the heat dissipation in the fluid and its effect on the entropy of the system. The nondimensional ordinary differential equations are solved using the Keller box method to assess the numerical results which are presented by the graphs and tables. An interesting observation is that the entropy is generated more near the lower plate as compared with that at the upper plate. Also, the heat transfer rate is found to be higher for the Cu nanoparticles in comparison with the Al2O3 nanoparticles.

  9. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
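
    As a concrete example of the simplest estimator in the comparison above, the plug-in (block) method, the snippet below counts empirical frequencies of all words of a fixed length in a binary sequence and reports the block entropy divided by the word length as an entropy-rate estimate in bits per symbol. The word length and the test sequences are arbitrary choices.

      import numpy as np
      from collections import Counter

      def plugin_entropy_rate(bits, word_len=8):
          words = ["".join(map(str, bits[i:i + word_len])) for i in range(len(bits) - word_len + 1)]
          counts = np.array(list(Counter(words).values()), dtype=float)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p)) / word_len

      rng = np.random.default_rng(0)
      fair = rng.integers(0, 2, 100000)                       # fair coin
      biased = (rng.random(100000) < 0.9).astype(int)         # biased coin, p(1) = 0.9
      print(plugin_entropy_rate(fair), plugin_entropy_rate(biased))   # roughly 1.0 and 0.47 bits/symbol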

  10. Pareto versus lognormal: A maximum entropy test

    NASA Astrophysics Data System (ADS)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  11. A thermodynamic approach to the 'mitosis/apoptosis' ratio in cancer

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto; Ponzetto, Antonio; Deisboeck, Thomas S.

    2015-10-01

    Cancer can be considered as an open, complex, (bio-thermo)dynamic and self-organizing system. Consequently, an entropy generation approach has been employed to analyze its mitosis/apoptosis ratio. Specifically, a novel thermodynamic anticancer strategy is suggested, based on the variation of entropy generation caused by the application of external fields, for example electro-magnetic fields, for therapeutic purposes. Eventually, this innovative approach could support conventional therapies, particularly for inoperable tumors or advanced stages of cancer, when larger tumor burden is diagnosed, and therapeutic options are often limited.

  12. Entropy of level-cut random Gaussian structures at different volume fractions

    NASA Astrophysics Data System (ADS)

    Marčelja, Stjepan

    2017-10-01

    Cutting random Gaussian fields at a given level can create a variety of morphologically different two- or several-phase structures that have often been used to describe physical systems. The entropy of such structures depends on the covariance function of the generating Gaussian random field, which in turn depends on its spectral density. But the entropy of level-cut structures also depends on the volume fractions of different phases, which is determined by the selection of the cutting level. This dependence has been neglected in earlier work. We evaluate the entropy of several lattice models to show that, even in the cases of strongly coupled systems, the dependence of the entropy of level-cut structures on molar fractions of the constituents scales with the simple ideal noninteracting system formula. In the last section, we discuss the application of the results to binary or ternary fluids and microemulsions.
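
    A small numerical illustration of the molar-fraction dependence discussed above (our own sketch): for a unit-variance Gaussian field cut at level h, the volume fraction of the cut phase is x = Phi(-h), and the claimed scaling follows the ideal mixing form s(x) = -x ln x - (1 - x) ln(1 - x) per site, up to a model-dependent prefactor.

      from math import erfc, sqrt, log

      for h in [0.0, 0.5, 1.0, 1.5]:
          x = 0.5 * erfc(h / sqrt(2.0))        # volume fraction above the cut level, Phi(-h)
          s = -x * log(x) - (1.0 - x) * log(1.0 - x)
          print(f"cut level h = {h:3.1f}   volume fraction x = {x:.3f}   ideal mixing s(x) = {s:.3f}")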

  13. Landauer-Büttiker Approach to Strongly Coupled Quantum Thermodynamics: Inside-Outside Duality of Entropy Evolution

    NASA Astrophysics Data System (ADS)

    Bruch, Anton; Lewenkopf, Caio; von Oppen, Felix

    2018-03-01

    We develop a Landauer-Büttiker theory of entropy evolution in time-dependent, strongly coupled electron systems. The formalism naturally avoids the problem of the system-bath distinction by defining the entropy current in the attached leads. This current can then be used to infer changes of the entropy of the system which we refer to as the inside-outside duality. We carry out this program in an adiabatic expansion up to first order beyond the quasistatic limit. When combined with particle and energy currents, as well as the work required to change an external potential, our formalism provides a full thermodynamic description, applicable to arbitrary noninteracting electron systems in contact with reservoirs. This provides a clear understanding of the relation between heat and entropy currents generated by time-dependent potentials and their connection to the occurring dissipation.

  14. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

    The phenomenon of data loss always occurs in the analysis of large databases. Maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate synthetic signals in which data segments are randomly removed from the original signal according to Gaussian and exponential distributions. Then, the logistic map is used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to logistic map testing. (2) Base-scale entropy analysis gives more stable results and is not sensitive to data loss. (3) The loss percentage of HRV signals should be controlled below p = 30 %, which can provide useful information in clinical applications.
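
    The sketch below mimics the experiment described above on a synthetic signal (it is not the paper's code): segments with exponentially distributed lengths are removed at random until a target loss fraction is reached, and the approximate entropy of the original and degraded series is compared. Base-scale entropy is not implemented here, and all parameter values are arbitrary.

      import numpy as np

      def approximate_entropy(x, m=2, r_factor=0.2):
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          def phi(m):
              n = len(x) - m + 1
              emb = np.array([x[i:i + m] for i in range(n)])
              # Chebyshev distances between all pairs of m-length templates
              d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              c = np.mean(d <= r, axis=1)              # includes the self-match, so c > 0
              return np.mean(np.log(c))
          return phi(m) - phi(m + 1)

      def remove_segments(x, loss_fraction=0.3, mean_len=20, seed=0):
          rng = np.random.default_rng(seed)
          keep = np.ones(len(x), dtype=bool)
          while 1.0 - keep.mean() < loss_fraction:
              start = rng.integers(0, len(x))
              length = max(1, int(rng.exponential(mean_len)))
              keep[start:start + length] = False       # wipe one randomly placed segment
          return x[keep]

      rng = np.random.default_rng(1)
      signal = np.sin(np.linspace(0, 20, 1000)) + 0.2 * rng.standard_normal(1000)
      print(approximate_entropy(signal), approximate_entropy(remove_segments(signal)))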

  15. Sonic-boom minimization.

    NASA Technical Reports Server (NTRS)

    Seebass, R.; George, A. R.

    1972-01-01

    There have been many attempts to reduce or eliminate the sonic boom. Such attempts fall into two categories: (1) aerodynamic minimization and (2) exotic configurations. In the first category, changes in the entropy and the Bernoulli constant are neglected and equivalent body shapes required to minimize the overpressure, the shock pressure rise and the impulse are deduced. These results include the beneficial effects of atmospheric stratification. In the second category, the effective length of the aircraft is increased or its base area decreased by modifying the Bernoulli constant of a significant fraction of the flow past the aircraft. A figure of merit is introduced which makes it possible to judge the effectiveness of the latter schemes.

  16. Entanglement of a quantum field with a dispersive medium.

    PubMed

    Klich, Israel

    2012-08-10

    In this Letter we study the entanglement of a quantum radiation field interacting with a dielectric medium. In particular, we describe the quantum mixed state of a field interacting with a dielectric through plasma and Drude models and show that these generate very different entanglement behavior, as manifested in the entanglement entropy of the field. We also present a formula for a "Casimir" entanglement entropy, i.e., the distance dependence of the field entropy. Finally, we study a toy model of the interaction between two plates. In this model, the field entanglement entropy is divergent; however, as in the Casimir effect, its distance-dependent part is finite, and the field matter entanglement is reduced when the objects are far.

  17. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  18. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  19. Building a Foundation for Bioenergetics

    ERIC Educational Resources Information Center

    Hamori, Eugene

    2002-01-01

    To give students a lasting comprehension of bioenergetics, first such basics as heat, work, equilibrium, entropy, free energy, closed "versus" open systems, steady state, and reversibility should be explained to them in a meticulous manner, albeit with a minimal use of mathematical formulae. The unique feature of thermodynamics, that it does not…

  20. Parabolic replicator dynamics and the principle of minimum Tsallis information gain

    PubMed Central

    2013-01-01

    Background Non-linear, parabolic (sub-exponential) and hyperbolic (super-exponential) models of prebiological evolution of molecular replicators have been proposed and extensively studied. The parabolic models appear to be the most realistic approximations of real-life replicator systems due primarily to product inhibition. Unlike the more traditional exponential models, the distribution of individual frequencies in an evolving parabolic population is not described by the Maximum Entropy (MaxEnt) Principle in its traditional form, whereby the distribution with the maximum Shannon entropy is chosen among all the distributions that are possible under the given constraints. We sought to identify a more general form of the MaxEnt principle that would be applicable to parabolic growth. Results We consider a model of a population that reproduces according to the parabolic growth law and show that the frequencies of individuals in the population minimize the Tsallis relative entropy (non-additive information gain) at each time moment. Next, we consider a model of a parabolically growing population that maintains a constant total size and provide an “implicit” solution for this system. We show that in this case, the frequencies of the individuals in the population also minimize the Tsallis information gain at each moment of the “internal time” of the population. Conclusions The results of this analysis show that the general MaxEnt principle is the underlying law for the evolution of a broad class of replicator systems including not only exponential but also parabolic and hyperbolic systems. The choice of the appropriate entropy (information) function depends on the growth dynamics of a particular class of systems. The Tsallis entropy is non-additive for independent subsystems, i.e. the information on the subsystems is insufficient to describe the system as a whole. In the context of prebiotic evolution, this “non-reductionist” nature of parabolic replicator systems might reflect the importance of group selection and competition between ensembles of cooperating replicators. Reviewers This article was reviewed by Viswanadham Sridhara (nominated by Claus Wilke), Purushottam Dixit (nominated by Sergei Maslov), and Nick Grishin. For the complete reviews, see the Reviewers’ Reports section. PMID:23937956
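    A minimal sketch of the quantity at the center of this work, the Tsallis relative entropy (non-additive information gain); the frequency vectors below are illustrative placeholders, and the q → 1 limit recovers the ordinary Kullback-Leibler divergence.

```python
# Minimal sketch: Tsallis relative entropy
# D_q(p || r) = (sum_i p_i^q r_i^(1-q) - 1) / (q - 1),
# which reduces to the Kullback-Leibler divergence as q -> 1.
import numpy as np

def tsallis_relative_entropy(p, r, q):
    p, r = np.asarray(p, float), np.asarray(r, float)
    if np.isclose(q, 1.0):                       # q -> 1 limit: ordinary KL divergence
        return float(np.sum(p * np.log(p / r)))
    return float((np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0))

# toy replicator frequencies versus a uniform reference distribution
p = np.array([0.5, 0.3, 0.2])
r = np.array([1/3, 1/3, 1/3])
for q in (0.5, 0.9, 1.0, 1.1, 2.0):
    print(f"q = {q:3}: D_q(p||r) = {tsallis_relative_entropy(p, r, q):.4f}")
```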

  1. Music viewed by its entropy content: A novel window for comparative analysis

    PubMed Central

    Febres, Gerardo; Jaffe, Klaus

    2017-01-01

    Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the ‘2nd Order Entropy’. Applying these methods to a variety of musical pieces showed how the space of ‘symbolic specific diversity-entropy’ and that of ‘2nd order entropy’ captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning. PMID:29040288

  2. Squeezed states and graviton-entropy production in the early universe

    NASA Technical Reports Server (NTRS)

    Giovannini, Massimo

    1994-01-01

    Squeezed states are a very useful framework for the quantum treatment of tensor perturbations (i.e. gravitons production) in the early universe. In particular, the non equilibrium entropy growth in a cosmological process of pair production is completely determined by the associated squeezing parameter and is insensitive to the number of particles in the initial state. The total produced entropy may represent a significant fraction of the entropy stored today in the cosmic blackbody radiation, provided pair production originates from a change in the background metric at a curvature scale of the Planck order. Within the formalism of squeezed thermal states it is also possible to discuss the stimulated emission of gravitons from an initial thermal bath, under the action of the cosmic gravitational background field. We find that at low energy the graviton production is enhanced, if compared with spontaneous creation from the vacuum; as a consequence, the inflation scale must be lowered, in order not to exceed the observed CMB quadrupole anisotropy. This effect is important, in particular, for models based on a symmetry-breaking transition which require, as initial condition, a state of thermal equilibrium at temperatures higher than the inflation scale and in which inflation has a minimal duration.

  3. Entanglement entropy between real and virtual particles in ϕ4 quantum field theory

    NASA Astrophysics Data System (ADS)

    Ardenghi, Juan Sebastián

    2015-04-01

    The aim of this work is to compute the entanglement entropy of real and virtual particles by rewriting the generating functional of ϕ4 theory as a mean value between states and observables defined through the correlation functions. Then the von Neumann definition of entropy can be applied to these quantum states and in particular, for the partial traces taken over the internal or external degrees of freedom. This procedure can be done for each order in the perturbation expansion showing that the entanglement entropy for real and virtual particles behaves as ln (m0). In particular, entanglement entropy is computed at first order for the correlation function of two external points showing that mutual information is identical to the external entropy and that conditional entropies are negative for all the domain of m0. In turn, from the definition of the quantum states, it is possible to obtain general relations between total traces between different quantum states of a ϕr theory. Finally, discussion about the possibility of taking partial traces over external degrees of freedom is considered, which implies the introduction of some observables that measure space-time points where an interaction occurs.

  4. Studies of Entanglement Entropy, and Relativistic Fluids for Thermal Field Theories

    NASA Astrophysics Data System (ADS)

    Spillane, Michael

    In this dissertation we consider physical consequences of adding a finite temperature to quantum field theories. At small length scales entanglement is a critically important feature. It is therefore unsurprising that entanglement entropy and Renyi entropy are useful tools in studying quantum phase transitions and quantum information. In this thesis we consider the corrections to entanglement and Renyi entropies due to the addition of a finite temperature. More specifically, we investigate the entanglement entropy of a massive scalar field in 1+1 dimensions at nonzero temperature. In the small mass (m) and temperature (T) limit, we put upper and lower bounds on the two largest eigenvalues of the covariance matrix used to compute the entanglement entropy. We argue that the entanglement entropy has e^{-m/T} scaling in the limit T ≪ m. Additionally, we calculate thermal corrections to Renyi entropies for free massless fermions on R × S^{d-1}. By expanding the density matrix in a Boltzmann sum, the problem of finding the Renyi entropies can be mapped to the problem of calculating a two point function on an n-sheeted cover of the sphere. We map the problem on the sphere to a conical region in Euclidean space. By using the method of images, we calculate the two point function and recover the Renyi entropies. At large length scales hydrodynamics is a useful way to study quantum field theories. We review recent interest in the Riemann problem as a method for generating a non-equilibrium steady state. The initial conditions consist of a planar interface between two halves of a system held at different temperatures in a hydrodynamic regime. The resulting fluid flow contains a fixed temperature region with a nonzero flux. We briefly discuss the effects of a conserved charge. Next we discuss deforming the relativistic equations with a nonlinear term and how that deformation affects the temperature and velocity in the region connecting the asymptotic fluids. Finally, we study properties of a non-equilibrium steady state generated when two heat baths are initially in contact with one another. The dynamics of the system in question are governed by holographic duality to a black hole. We discuss the "phase diagram" associated with the steady state of the dual, dynamical black hole and its relation to the fluid/gravity correspondence.

  5. Real topological entropy versus metric entropy for birational measure-preserving transformations

    NASA Astrophysics Data System (ADS)

    Abarenkova, N.; Anglès d'Auriac, J.-Ch.; Boukraa, S.; Maillard, J.-M.

    2000-10-01

    We consider a family of birational measure-preserving transformations of two complex variables, depending on one parameter, for which simple rational expressions for the dynamical zeta function have been conjectured, together with an equality between the topological entropy and the logarithm of the Arnold complexity (divided by the number of iterations). Similar results have been obtained for the adaptation of these two concepts to dynamical systems of real variables, leading to the introduction of a “real topological entropy” and a “real Arnold complexity”. Here we compare the Kolmogorov-Sinai metric entropy with this real Arnold complexity, or real topological entropy, for this particular example of a one-parameter dependent birational transformation of two variables. More precisely, we analyze, using an infinite precision calculation, the Lyapunov characteristic exponents for various values of the parameter of the birational transformation, in order to compare these results with the ones for the real Arnold complexity. We find a quite surprising result: for this very birational example, and, in fact, for a large set of birational measure-preserving mappings generated by involutions, the Lyapunov characteristic exponents seem to be equal to zero or, at least, extremely small, for all the orbits we have considered, and for all values of the parameter. Birational measure-preserving transformations generated by involutions could thus help clarify the difference between the topological description and the probabilistic description of discrete dynamical systems. Many birational measure-preserving transformations generated by involutions seem to provide examples of discrete dynamical systems which can be topologically chaotic while they are metrically almost quasi-periodic. Heuristically, this can be understood as a consequence of the fact that their orbits seem to form some kind of “transcendental foliation” of the two-dimensional space of variables.

  6. Effects of heat sink and source and entropy generation on MHD mixed convection of a Cu-water nanofluid in a lid-driven square porous enclosure with partial slip

    NASA Astrophysics Data System (ADS)

    Chamkha, A. J.; Rashad, A. M.; Mansour, M. A.; Armaghani, T.; Ghalambaz, M.

    2017-05-01

    In this work, the effects of the presence of a heat sink and a heat source and their lengths and locations and the entropy generation on MHD mixed convection flow and heat transfer in a porous enclosure filled with a Cu-water nanofluid in the presence of partial slip effect are investigated numerically. Both lid-driven vertical walls of the cavity are thermally insulated and are moving with constant and equal speeds in their own plane, and the effect of partial slip is imposed on these walls. A segment of the bottom wall is considered as a heat source, while a heat sink is placed on the upper wall of the cavity. There are heated and cold parts placed on the bottom and upper walls, respectively, while the remaining parts are thermally insulated. Entropy generation and local heat transfer according to different values of the governing parameters are presented in detail. It is found that the addition of nanoparticles decreases the convective heat transfer inside the porous cavity at all ranges of the heat sink and source lengths. The results for the effects of the magnetic field show that the average Nusselt number decreases considerably upon the enhancement of the Hartmann number. Also, adding nanoparticles to a pure fluid leads to increasing the entropy generation for all values of D for λ_l = -λ_r = 1.

  7. Chaotic Dynamics of Linguistic-Like Processes at the Syntactical and Semantic Levels: in the Pursuit of a Multifractal Attractor

    NASA Astrophysics Data System (ADS)

    Nicolis, John S.; Katsikas, Anastassis A.

    Collective parameters such as the Zipf's law-like statistics, the Transinformation, the Block Entropy and the Markovian character are compared for natural, genetic, musical and artificially generated long texts from generating partitions (alphabets) on homogeneous as well as on multifractal chaotic maps. It appears that minimal requirements for a language at the syntactical level, such as memory, selectivity of a few keywords and broken symmetry in one dimension (polarity), are more or less met by dynamically iterating simple maps or flows, e.g., very simple chaotic hardware. The same selectivity is observed at the semantic level, where the aim refers to partitioning a set of impinging environmental stimuli onto coexisting attractor-categories. Under the regime of pattern recognition and classification, a few key features of a pattern or a few categories claim the lion's share of the information stored in this pattern and, practically, only these key features are persistently scanned by the cognitive processor. A multifractal attractor model can in principle explain this high selectivity, both at the syntactical and the semantic levels.

  8. BIPAD: A web server for modeling bipartite sequence elements

    PubMed Central

    Bi, Chengpeng; Rogan, Peter K

    2006-01-01

    Background Many dimeric protein complexes bind cooperatively to families of bipartite nucleic acid sequence elements, which consist of pairs of conserved half-site sequences separated by intervening distances that vary among individual sites. Results We introduce the Bipad Server [1], a web interface to predict sequence elements embedded within unaligned sequences. Either a bipartite model, consisting of a pair of one-block position weight matrices (PWM's) with a gap distribution, or a single PWM matrix for contiguous single block motifs may be produced. The Bipad program performs multiple local alignment by entropy minimization and cyclic refinement using a stochastic greedy search strategy. The best models are refined by maximizing incremental information contents among a set of potential models with varying half site and gap lengths. Conclusion The web service generates information positional weight matrices, identifies binding site motifs, graphically represents the set of discovered elements as a sequence logo, and depicts the gap distribution as a histogram. Server performance was evaluated by generating a collection of bipartite models for distinct DNA binding proteins. PMID:16503993
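    A minimal sketch (assumed toy counts, not the BIPAD implementation) of the per-position information content of a position weight matrix, the incremental information quantity that BIPAD-style models maximize during refinement.

```python
# Minimal sketch: information content (bits) of a position weight matrix.
import numpy as np

def pwm_information(counts, background=None, pseudocount=0.5):
    """Per-position information content (bits) of a count matrix
    with rows = positions and columns = A, C, G, T."""
    counts = np.asarray(counts, float) + pseudocount
    p = counts / counts.sum(axis=1, keepdims=True)
    bg = np.full(4, 0.25) if background is None else np.asarray(background, float)
    return np.sum(p * np.log2(p / bg), axis=1)   # relative entropy per position

# hypothetical counts for a 4-position half-site from 20 aligned sequences
counts = [[18, 0, 1, 1],
          [0, 19, 0, 1],
          [1, 1, 17, 1],
          [5, 5, 5, 5]]
info = pwm_information(counts)
print("bits per position:", np.round(info, 2), "total:", round(info.sum(), 2))
```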

  9. Temperature and composition dependence of short-range order and entropy, and statistics of bond length: the semiconductor alloy (GaN)(1-x)(ZnO)(x).

    PubMed

    Liu, Jian; Pedroza, Luana S; Misch, Carissa; Fernández-Serra, Maria V; Allen, Philip B

    2014-07-09

    We present total energy and force calculations for the (GaN)1-x(ZnO)x alloy. Site-occupancy configurations are generated from Monte Carlo (MC) simulations, on the basis of a cluster expansion model proposed in a previous study. Local atomic coordinate relaxations of surprisingly large magnitude are found via density-functional calculations using a 432-atom periodic supercell, for three representative configurations at x = 0.5. These are used to generate bond-length distributions. The configurationally averaged composition- and temperature-dependent short-range order (SRO) parameters of the alloys are discussed. The entropy is approximated in terms of pair distribution statistics and thus related to SRO parameters. This approximate entropy is compared with accurate numerical values from MC simulations. An empirical model for the dependence of the bond length on the local chemical environments is proposed.

  10. Near horizon symmetry and entropy formula for Kerr-Newman (A)dS black holes

    NASA Astrophysics Data System (ADS)

    Setare, Mohammad Reza; Adami, Hamed

    2018-04-01

    In this paper we provide the first non-trivial evidence for the universality of the entropy formula 4π J_0^+ J_0^- beyond pure Einstein gravity in 4 dimensions. We consider Einstein-Maxwell theory in the presence of a cosmological constant and write the near horizon metric of the Kerr-Newman (A)dS black hole in the Gaussian null coordinate system. We impose near horizon fall-off conditions for the metric and the U(1) gauge field and find the asymptotic combined symmetry generator, consisting of a diffeomorphism and a U(1) gauge transformation, that preserves these fall-off conditions. Consequently, we find supertranslation, superrotation and multiple-charge modes and then show that the entropy formula holds for the Kerr-Newman (A)dS black hole. The superrotation modes suffer from a problem; by introducing a new combined symmetry generator, we cure that problem.

  11. Magnetorheological rotational flow with viscous dissipation

    NASA Astrophysics Data System (ADS)

    Ashrafi, Nariman

    2017-11-01

    Effects of a magnetic field and fluid nonlinearity are investigated for the rotational flow of the Carreau-type fluid while viscous dissipation is taken into account. The governing motion and energy balance equations are coupled, adding complexity to the already highly correlated set of differential equations. The numerical solution is obtained for the narrow-gap limit and steady-state base flow. Magnetic field effect on local entropy generation due to steady two-dimensional laminar forced convection flow was investigated. This study was focused on the entropy generation characteristics and its dependency on various dimensionless parameters. The effects of the Hartmann number, the Brinkman number, and the Deborah number on the stability of the flow were investigated. The introduction of the magnetic field induces a resistive force acting in the opposite direction of the flow, thus causing its deceleration. Moreover, the study shows that the presence of magnetic field tends to slow down the fluid motion. It, however, increases the fluid temperature. Moreover, the total entropy generation number decreases as the Hartmann number and fluid elasticity increase and increases with increasing Brinkman number.

  12. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
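    A minimal sketch of the baseline case discussed above: coarse-graining by non-overlapping averaging (a piecewise-constant filter) followed by sample entropy of the filtered series. The entropy estimator details and the white-noise test signal are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: multiscale entropy via piecewise-constant (coarse-graining) filtering.
import numpy as np

def sample_entropy(u, m=2, r_frac=0.2):
    u = np.asarray(u, float)
    r = r_frac * np.std(u)
    def count_matches(mm):
        x = np.array([u[i:i + mm] for i in range(len(u) - mm)])
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)          # exclude self-matches
        return (d <= r).sum()
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

def coarse_grain(u, scale):
    """Piecewise-constant filtering: average non-overlapping windows of length `scale`."""
    n = (len(u) // scale) * scale
    return np.asarray(u[:n], float).reshape(-1, scale).mean(axis=1)

rng = np.random.default_rng(0)
signal = rng.normal(size=1600)               # white noise as a stand-in series
for scale in (1, 2, 4, 8):
    print(f"scale {scale}: SampEn = {sample_entropy(coarse_grain(signal, scale)):.3f}")
```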

  13. Entropy information of heart rate variability and its power spectrum during day and night

    NASA Astrophysics Data System (ADS)

    Jin, Li; Jun, Wang

    2013-07-01

    Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and power spectral distribution. Moreover, the base-scale entropy changes reflect the corresponding changes in the autonomic nerve outflow. With the suppression of the vagal tone and dominance of the sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the base-scale entropy is higher for CHF subjects. With the decrease of the sympathetic tone and respiratory sinus arrhythmia (RSA) becoming more pronounced with slower breathing during sleep, the base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups have the same diurnal/nocturnal trend as the CHF series. The fluctuation dynamics trend of the data in the three groups can be described as an “HF effect”.

  14. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.

  15. Hair-brane ideas on the horizon

    DOE PAGES

    Martinec, Emil J.; Niehoff, Ben E.

    2015-11-27

    We continue an examination of the microstate geometries program begun in arXiv:1409.6017, focussing on the role of branes that wrap the cycles which degenerate when a throat in the geometry deepens and a horizon forms. An associated quiver quantum mechanical model of minimally wrapped branes exhibits a non-negligible fraction of the gravitational entropy, which scales correctly as a function of the charges. The results suggest a picture of AdS3/CFT2 duality wherein the long string that accounts for BTZ black hole entropy in the CFT description can also be seen to inhabit the horizon of BPS black holes on the gravity side.

  16. On holographic entanglement entropy with second order excitations

    NASA Astrophysics Data System (ADS)

    He, Song; Sun, Jia-Rui; Zhang, Hai-Qing

    2018-03-01

    We study the low-energy corrections to the holographic entanglement entropy (HEE) in the boundary CFT by perturbing the bulk geometry up to second order excitations. Focusing on the case that the boundary subsystem is a strip, we show that the area of the bulk minimal surface can be expanded in terms of the conserved charges, such as mass, angular momentum and electric charge of the AdS black brane. We also calculate the variation of the energy in the subsystem and verify the validity of the first law-like relation of thermodynamics at second order. Moreover, the HEE is naturally bounded at second order perturbations if the cosmic censorship conjecture for the dual black hole still holds.

  17. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling approach for product quality prediction of mineral processing, which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role, and the establishment of its predictive model is a key issue for plantwide optimization. For this purpose, a hybrid modeling approach for mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
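    A minimal sketch of the minimum-entropy idea (not the paper's LS-SVM hybrid model): a hypothetical ridge-regression parameter is selected by minimizing a histogram-based estimate of the modeling-error entropy.

```python
# Minimal sketch: model parameter selection by minimizing the modeling-error entropy.
import numpy as np

def residual_entropy(e, bins=30):
    """Plug-in estimate of the differential entropy of the error sample."""
    hist, edges = np.histogram(e, bins=bins, density=True)
    widths = np.diff(edges)
    p = hist * widths                       # probability mass per bin
    nz = p > 0
    return -np.sum(p[nz] * np.log(hist[nz]))   # approximates -integral f log f

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.laplace(scale=0.3, size=200)   # heavy-tailed modeling noise

def ridge_residuals(lmbda):
    w = np.linalg.solve(X.T @ X + lmbda * np.eye(5), X.T @ y)
    return y - X @ w

lambdas = [1e-3, 1e-2, 1e-1, 1.0, 10.0]
entropies = [residual_entropy(ridge_residuals(l)) for l in lambdas]
best = lambdas[int(np.argmin(entropies))]
print("error entropies:", np.round(entropies, 3), "-> chosen lambda:", best)
```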

  18. Tables and charts of equilibrium thermodynamic properties of ammonia for temperatures from 500 to 50,000 K.

    NASA Technical Reports Server (NTRS)

    Simmonds, A. L.; Miller, C. G., III; Nealy, J. E.

    1976-01-01

    Equilibrium thermodynamic properties for pure ammonia were generated for a range of temperature from 500 to 50,000 K and pressure from 0.01 to 40 MN/sq m and are presented in tabulated and graphical form. Properties include pressure, temperature, density, enthalpy, speed of sound, entropy, molecular-weight ratio, specific heat at constant pressure, specific heat at constant volume, isentropic exponent, and species mole fractions. These properties were calculated by the method which is based on minimization of the Gibbs free energy. The data presented herein are for an 18-species ammonia model. Heats of formation and spectroscopic constants used as input data are presented. Comparison of several thermodynamic properties calculated with the present program and a second computer code is performed for a range of pressure and for temperatures up to 30,000 K.
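    A minimal sketch of equilibrium composition by Gibbs free energy minimization, the approach named above, for a toy NH3/N2/H2 mixture; the dimensionless standard potentials and conditions are placeholder values, not the thermodynamic data used for the ammonia tables.

```python
# Minimal sketch: equilibrium composition by direct Gibbs free energy minimization.
# The dimensionless standard potentials mu0 = g_i/(RT) below are placeholders.
import numpy as np
from scipy.optimize import minimize

mu0 = np.array([-6.0, 0.0, 0.0])     # NH3, N2, H2 (hypothetical values)
P = 1.0                              # pressure in bar, ideal-gas mixture
# element balance: start from 1 mol NH3 -> totals of 1 N atom and 3 H atoms
A = np.array([[1.0, 2.0, 0.0],       # N atoms per molecule
              [3.0, 0.0, 2.0]])      # H atoms per molecule
b = np.array([1.0, 3.0])

def gibbs(n):
    n = np.maximum(n, 1e-12)
    # total dimensionless Gibbs energy G/(RT) of an ideal-gas mixture
    return float(np.sum(n * (mu0 + np.log(n * P / n.sum()))))

res = minimize(gibbs, x0=np.array([0.5, 0.25, 0.75]),
               bounds=[(1e-10, None)] * 3,
               constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
               method="SLSQP")
print("equilibrium moles NH3, N2, H2:", np.round(res.x, 4))
```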

  19. The role of Weyl symmetry in hydrodynamics

    NASA Astrophysics Data System (ADS)

    Diles, Saulo

    2018-04-01

    This article is dedicated to the analysis of Weyl symmetry in the context of relativistic hydrodynamics. We discuss how this symmetry is properly implemented using the minimal coupling prescription ∂ → ∂ + ωA. It is shown that this prescription has no problem dealing with curvature, since it gives the correct expressions for the commutator of covariant derivatives. In hydrodynamics, the Weyl gauge connection emerges from the degrees of freedom of the fluid: it is a combination of the expansion and the entropy gradient. The remaining degrees of freedom, shear, vorticity and the metric tensor, are seen in this context as charged fields under the Weyl gauge connection. The gauge nature of the connection provides natural dynamics to it via equations of motion analogous to the Maxwell equations for electromagnetism. As a consequence, a charge for the Weyl connection is defined and the notion of local charge is analyzed, generating the conservation law for the Weyl charge.

  20. Euler analysis comparison with LDV data for an advanced counter-rotation propfan at cruise

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J.; Podboy, Gary G.

    1990-01-01

    A fine mesh Euler solution of the F4/A4 unducted fan (UDF) model flowfield is compared with laser Doppler velocimeter (LDV) data taken in the NASA Lewis 8- by 6-Foot Supersonic Wind Tunnel. The comparison is made primarily at one axial plane downstream of the front rotor where the LDV particle lag errors are reduced. The agreement between measured and predicted velocities in this axial plane is good. The results show that a dense mesh is needed in the centerbody stagnation region to minimize entropy generation that weakens the aft row passage shock. The predicted radial location of the tip vortex downstream of the front rotor agrees well with the experimental results but the strength is overpredicted. With 40 points per chord line, the integrated performance quantities are nearly converged, but more points are needed to resolve passage shocks and flow field details.

  1. Enthalpy-entropy compensation: the role of solvation.

    PubMed

    Dragan, Anatoliy I; Read, Christopher M; Crane-Robinson, Colyn

    2017-05-01

    Structural modifications to interacting systems frequently lead to changes in both the enthalpy (heat) and entropy of the process that compensate each other, so that the Gibbs free energy is little changed: a major barrier to the development of lead compounds in drug discovery. The conventional explanation for such enthalpy-entropy compensation (EEC) is that tighter contacts lead to a more negative enthalpy but increased molecular constraints, i.e., a compensating conformational entropy reduction. Changes in solvation can also contribute to EEC but this contribution is infrequently discussed. We review long-established and recent cases of EEC and conclude that the large fluctuations in enthalpy and entropy observed are too great to be a result of only conformational changes and must result, to a considerable degree, from variations in the amounts of water immobilized or released on forming complexes. Two systems exhibiting EEC show a correlation between calorimetric entropies and local mobilities, interpreted to mean conformational control of the binding entropy/free energy. However, a substantial contribution from solvation gives the same effect, as a consequence of a structural link between the amount of bound water and the protein flexibility. Only by assuming substantial changes in solvation-an intrinsically compensatory process-can a more complete understanding of EEC be obtained. Faced with such large, and compensating, changes in the enthalpies and entropies of binding, the best approach to engineering elevated affinities must be through the addition of ionic links, as they generate increased entropy without affecting the enthalpy.

  2. Heavy fields and gravity

    NASA Astrophysics Data System (ADS)

    Goon, Garrett

    2017-01-01

    We study the effects of heavy fields on 4D spacetimes with flat, de Sitter and anti-de Sitter asymptotics. At low energies, matter generates specific, calculable higher derivative corrections to the GR action which perturbatively alter the Schwarzschild-(A)dS family of solutions. The effects of massive scalars, Dirac spinors and gauge fields are each considered. The six-derivative operators they produce, such as ∼R^3 terms, generate the leading corrections. The induced changes to horizon radii, Hawking temperatures and entropies are found. Modifications to the energy of large AdS black holes are derived by imposing the first law. An explicit demonstration of the replica trick is provided, as it is used to derive black hole and cosmological horizon entropies. Considering entropy bounds, it is found that scalars and fermions increase the entropy one can store inside a region bounded by a sphere of fixed size, but vectors lead to a decrease, oddly. We also demonstrate, however, that many of the corrections fall below the resolving power of the effective field theory and are therefore untrustworthy. Defining properties of black holes, such as the horizon area and Hawking temperature, prove to be remarkably robust against higher derivative gravitational corrections.

  3. Entropy Analysis in Mixed Convection MHD flow of Nanofluid over a Non-linear Stretching Sheet

    NASA Astrophysics Data System (ADS)

    Matin, Meisam Habibi; Nobari, Mohammad Reza Heirani; Jahangiri, Pouyan

    This article deals with a numerical study of entropy analysis in mixed convection MHD flow of a nanofluid over a non-linear stretching sheet, taking into account the effects of viscous dissipation and a variable magnetic field. The nanofluid consists of SiO2 nanoparticles with pure water as the base fluid. To analyze the problem, the boundary layer equations are first transformed into non-linear ordinary differential equations using a similarity transformation. The resultant equations are then solved numerically using the Keller-Box scheme based on the implicit finite-difference method. The effects of different non-dimensional governing parameters, such as the magnetic parameter, nanoparticle volume fraction, and Nusselt, Richardson, Eckert, Hartmann, Brinkman, Reynolds and entropy generation numbers, are investigated in detail. The results indicate that adding nanoparticles to the base fluid reduces the shear forces and decreases the heat transfer coefficient of the stretching sheet. Also, decreasing the magnetic parameter and increasing the Eckert number improve the heat transfer rate. Furthermore, the surface acts as a strong source of irreversibility due to the higher entropy generation number near the surface.

  4. Causality, transfer entropy, and allosteric communication landscapes in proteins with harmonic interactions.

    PubMed

    Hacisuleyman, Aysima; Erman, Burak

    2017-06-01

    A fast and approximate method of generating allosteric communication landscapes in proteins is presented by using Schreiber's entropy transfer concept in combination with the Gaussian Network Model of proteins. Predictions of the model and the allosteric communication landscapes generated show that information transfer in proteins does not necessarily take place along a single path, but an ensemble of pathways is possible. The model emphasizes that knowledge of entropy only is not sufficient for determining allosteric communication and additional information based on time delayed correlations should be introduced, which leads to the presence of causality in proteins. The model provides a simple tool for mapping entropy sink-source relations into pairs of residues. By this approach, residues that should be manipulated to control protein activity may be determined. This should be of great importance for allosteric drug design and for understanding the effects of mutations on function. The model is applied to determine allosteric communication in three proteins, Ubiquitin, Pyruvate Kinase, and the PDZ domain. Predictions are in agreement with molecular dynamics simulations and experimental evidence. Proteins 2017; 85:1056-1064. © 2017 Wiley Periodicals, Inc.
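    A minimal sketch of Schreiber's transfer entropy for two symbolized time series (an illustrative histogram estimator on synthetic data, not the Gaussian Network Model machinery used in the paper).

```python
# Minimal sketch: Schreiber transfer entropy T_{Y->X} for discretized series.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """T_{Y->X} = sum p(x+, x, y) log[ p(x+|x,y) / p(x+|x) ], natural log."""
    def symbolize(u):
        # equal-frequency binning into `bins` symbols
        cuts = np.quantile(u, np.linspace(0, 1, bins + 1)[1:-1])
        return np.searchsorted(cuts, u)
    xs, ys = symbolize(np.asarray(x)), symbolize(np.asarray(y))
    triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(xs[1:], xs[:-1]))           # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(xs[:-1], ys[:-1]))          # (x_t, y_t)
    singles_x = Counter(xs[:-1])                       # x_t
    n = len(xs) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        te += p_joint * np.log((c / pairs_xy[(x0, y0)]) /
                               (pairs_xx[(x1, x0)] / singles_x[x0]))
    return te

rng = np.random.default_rng(2)
y = rng.normal(size=5000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=5000)   # x is driven by the past of y
print("T(Y->X) =", round(transfer_entropy(x, y), 3),
      " T(X->Y) =", round(transfer_entropy(y, x), 3))
```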

  5. Minimization of a free-energy-like potential for non-equilibrium flow systems at steady state

    PubMed Central

    Niven, Robert K.

    2010-01-01

    This study examines a new formulation of non-equilibrium thermodynamics, which gives a conditional derivation of the ‘maximum entropy production’ (MEP) principle for flow and/or chemical reaction systems at steady state. The analysis uses a dimensionless potential function ϕst for non-equilibrium systems, analogous to the free energy concept of equilibrium thermodynamics. Spontaneous reductions in ϕst arise from increases in the ‘flux entropy’ of the system—a measure of the variability of the fluxes—or in the local entropy production; conditionally, depending on the behaviour of the flux entropy, the formulation reduces to the MEP principle. The inferred steady state is also shown to exhibit high variability in its instantaneous fluxes and rates, consistent with the observed behaviour of turbulent fluid flow, heat convection and biological systems; one consequence is the coexistence of energy producers and consumers in ecological systems. The different paths for attaining steady state are also classified. PMID:20368250

  6. Technology in the high entropy world.

    PubMed

    Tambo, N

    2006-01-01

    Modern growing society is mainly driven by oil and may be designated a "petroleum civilisation". However, the basic energy that drives the global ecosystem is solar radiation; fossil energy consumption is minimal in the overall global energy balance. Economic growth is mainly controlled by the fossil (commercial) energy consumption rate in urban areas. Water and sanitation systems bridge economic activities and global ecosystems. Therefore, the vast amounts of high entropy solar energy should always be taken into account in the water industry. Only in the urban and industrial areas where most of the GDP is earned are commercial-energy-driven systems inevitably introduced, with maximum effort devoted to energy saving. A water district concept to ensure use of appropriate water quality with the least deterioration of the environment is proposed. In other areas, decentralised water and sanitation systems driven along soft energy paths would be recommended. Processes and systems designed around high entropy energy would be the foundation for a future revolution in urban metabolic systems when oil-based energy becomes scarce.

  7. A Variational Level Set Approach Based on Local Entropy for Image Segmentation and Bias Field Correction.

    PubMed

    Tang, Jian; Jiang, Xiaoliang

    2017-01-01

    Image segmentation has always been a considerable challenge in image analysis and understanding due to the intensity inhomogeneity, which is also commonly known as bias field. In this paper, we present a novel region-based approach based on local entropy for segmenting images and estimating the bias field simultaneously. Firstly, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is local entropy derived from a grey level distribution of local image. The means of this objective function have a multiplicative factor that estimates the bias field in the transformed domain. Then, the bias field prior is fully used. Therefore, our model can estimate the bias field more accurately. Finally, minimization of this energy function with a level set regularization term, image segmentation, and bias field estimation can be achieved. Experiments on images of various modalities demonstrated the superior performance of the proposed method when compared with other state-of-the-art approaches.
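    A minimal sketch of the local-entropy weight that enters the LGDF energy: the grey-level entropy of a small neighbourhood around each pixel. The window size, bin count and synthetic test image are illustrative assumptions, not the paper's level-set implementation.

```python
# Minimal sketch: per-pixel local entropy map of a grey-scale image.
import numpy as np
from scipy.ndimage import generic_filter

def patch_entropy(values, bins=16):
    hist, _ = np.histogram(values, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(3)
image = np.clip(rng.normal(0.5, 0.15, size=(64, 64)), 0.0, 1.0)
image[:, 32:] += 0.2                              # crude intensity inhomogeneity
image = np.clip(image, 0.0, 1.0)

weight = generic_filter(image, patch_entropy, size=7)  # 7x7 local entropy map
print("local entropy range:", weight.min().round(3), "to", weight.max().round(3))
```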

  8. The relative entropy is fundamental to adaptive resolution simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreis, Karsten; Graduate School Materials Science in Mainz, Staudingerweg 9, 55128 Mainz; Potestio, Raffaello, E-mail: potestio@mpip-mainz.mpg.de

    Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.

  9. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on gray scale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, the plain image is diffused using DHS with entropy maximization as the fitness function. In the second step, horizontal and vertical permutations are applied to the best cipher image obtained in the previous step, with DHS used to minimize the correlation coefficient as the fitness function. The simulation results show that the proposed method achieves a maximum entropy of approximately 7.9998 and a minimum correlation coefficient of approximately 0.0001.
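    A minimal sketch of the two fitness functions used by the DHS search, the Shannon entropy of the grey levels and the correlation coefficient of adjacent pixels, evaluated here on random data standing in for a cipher image.

```python
# Minimal sketch: grey-level entropy and adjacent-pixel correlation of an image.
import numpy as np

def image_entropy(img):
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))          # a good cipher image approaches 8 bits

def adjacent_correlation(img):
    a = img[:, :-1].ravel().astype(float)
    b = img[:, 1:].ravel().astype(float)
    return np.corrcoef(a, b)[0, 1]          # a good cipher image approaches 0

rng = np.random.default_rng(4)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print("entropy  :", round(image_entropy(cipher_like), 4))
print("adj corr :", round(adjacent_correlation(cipher_like), 4))
```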

  10. Cross-entropy embedding of high-dimensional data using the neural gas model.

    PubMed

    Estévez, Pablo A; Figueroa, Cristián J; Saito, Kazumi

    2005-01-01

    A cross-entropy approach to mapping high-dimensional data into a low-dimensional space embedding is presented. The method makes it possible to project the input data and the codebook vectors, obtained with the Neural Gas (NG) quantizer algorithm, simultaneously into a low-dimensional output space. The aim of this approach is to preserve the relationship defined by the NG neighborhood function for each pair of input and codebook vectors. A cost function based on the cross-entropy between input and output probabilities is minimized by using a Newton-Raphson method. The new approach is compared with Sammon's non-linear mapping (NLM) and the hierarchical approach of combining a vector quantizer such as the self-organizing feature map (SOM) or NG with the NLM recall algorithm. In comparison with these techniques, our method delivers a clear visualization of both data points and codebooks, and it achieves a better mapping quality in terms of the topology preservation measure q(m).

  11. The relative entropy is fundamental to adaptive resolution simulations

    NASA Astrophysics Data System (ADS)

    Kreis, Karsten; Potestio, Raffaello

    2016-07-01

    Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.

  12. Optimal behavior of viscoelastic flow at resonant frequencies.

    PubMed

    Lambert, A A; Ibáñez, G; Cuevas, S; del Río, J A

    2004-11-01

    The global entropy generation rate in the zero-mean oscillatory flow of a Maxwell fluid in a pipe is analyzed with the aim of determining its behavior at resonant flow conditions. This quantity is calculated explicitly using the analytic expression for the velocity field and assuming isothermal conditions. The global entropy generation rate shows well-defined peaks at the resonant frequencies where the flow displays maximum velocities. It was found that resonant frequencies can be considered optimal in the sense that they maximize the power transmitted to the pulsating flow at the expense of maximum dissipation.

  13. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and the physical and chemical interactions between groundwater and porous media make solute transport in the medium even more complicated. An appropriate method to describe this complexity is essential when studying solute transport and transformation in porous media. Information entropy can measure uncertainty and disorder, so we investigated the complexity of solute transport in heterogeneous porous media and explored the connection between information entropy and this complexity using information entropy theory. Based on Markov theory, a two-dimensional stochastic field of hydraulic conductivity (K) was generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increases as the complexity of the solute transport process increases. For the point source, the one-dimensional entropy of solute concentration first increased and then decreased along the X and Y directions. As time increased, the entropy peak value remained essentially unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of the solute concentration increase, which results in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of the line source was higher than that of the point source, and the solute entropy obtained from continuous input was higher than that from instantaneous input. As the average length of lithofacies increases, the continuity of the medium increases, the complexity of flow and solute transport weakens, and the corresponding information entropy also decreases. The longitudinal macrodispersivity declined slightly at early times and then rose. The solute spatial and temporal distribution had significant impacts on the information entropy, and the information entropy could reflect the change of the solute distribution. Information entropy appears to be a useful tool to characterize the spatial and temporal complexity of solute migration and provides a reference for future research.
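    A minimal sketch of the information entropy of a concentration field (a spreading synthetic plume stands in for the simulated solute distribution; the dispersion parameters are illustrative assumptions): the entropy grows as the plume spreads, mirroring the increase in spatial complexity described above.

```python
# Minimal sketch: Shannon entropy of a normalized solute concentration field.
import numpy as np

def concentration_entropy(c):
    p = np.asarray(c, float).ravel()
    p = p / p.sum()                       # treat normalized concentration as a probability field
    p = p[p > 0]
    return -np.sum(p * np.log(p))

x, y = np.meshgrid(np.linspace(0, 50, 101), np.linspace(0, 20, 41))
for t in (1.0, 5.0, 20.0):                # advecting, spreading Gaussian plume as a stand-in
    sigma = np.sqrt(2.0 * 0.5 * t)        # dispersion coefficient D = 0.5 (assumed)
    c = np.exp(-((x - 1.0 * t - 5.0) ** 2 + (y - 10.0) ** 2) / (2 * sigma ** 2))
    print(f"t = {t:5.1f}: entropy = {concentration_entropy(c):.3f}")
```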

  14. Hidden conformal symmetry of rotating black holes in minimal five-dimensional gauged supergravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Setare, M. R.; Kamali, V.

    2010-10-15

    In the present paper we show that for a low frequency limit the wave equation of a massless scalar field in the background of nonextremal charged rotating black holes in five-dimensional minimal gauged and ungauged supergravity can be written as the Casimir of an SL(2,R) symmetry. Our result shows that the entropy of the black hole is reproduced by the Cardy formula. Also the absorption cross section is consistent with the finite temperature absorption cross section for a two-dimensional conformal field theory.

  15. Hidden disorder in the α'→δ transformation of Pu-1.9 at.% Ga

    DOE PAGES

    Jeffries, J. R.; Manley, M. E.; Wall, M. A.; ...

    2012-06-06

    Enthalpy and entropy are thermodynamic quantities critical to determining how and at what temperature a phase transition occurs. At a phase transition, the enthalpy and temperature-weighted entropy differences between two phases are equal (ΔH = TΔS), but there are materials where this balance has not been experimentally or theoretically realized, leading to the idea of hidden order and disorder. In a Pu-1.9 at.% Ga alloy, the δ phase is retained as a metastable state at room temperature, but at low temperatures, the δ phase yields to a mixed-phase microstructure of δ- and α'-Pu. The previously measured sources of entropy associated with the α'→δ transformation fail to sum to the entropy predicted theoretically. We report an experimental measurement of the entropy of the α'→δ transformation that corroborates the theoretical prediction, and implies that only about 65% of the entropy stabilizing the δ phase is accounted for, leaving a missing entropy of about 0.5 k_B/atom. Some previously proposed mechanisms for generating entropy are discussed, but none seem capable of providing the necessary disorder to stabilize the δ phase. This hidden disorder represents multiple accessible states per atom within the δ phase of Pu that may not be included in our current understanding of the properties and phase stability of δ-Pu.

  16. Thermodynamics of a third-generation poly(phenylene-pyridyl) dendron decorated with dodecyl groups in the range of T → 0 to 480 K

    NASA Astrophysics Data System (ADS)

    Smirnova, N. N.; Markin, A. V.; Tsvetkova, L. Ya.; Kuchkina, N. V.; Yuzik-Klimova, E. Yu.; Shifrina, Z. B.

    2016-05-01

    The heat capacity of a glassy third-generation poly(phenylene-pyridyl) dendron decorated with dodecyl groups is studied for the first time via high-precision adiabatic vacuum and differential scanning calorimetry in the temperature range of 6 to 520 K. The standard thermodynamic functions (molar heat capacity C_p°, enthalpy H°(T), entropy S°(T), and Gibbs energy G°(T) − H°(0)) in the range of T → 0 to 480 K, and the entropy of formation at 298.15 K, are calculated on the basis of the obtained data. The thermodynamic properties of the dendron and the corresponding third-generation poly(phenylene-pyridyl) dendrimer studied earlier are compared.

  17. Entanglement entropy between virtual and real excitations in quantum electrodynamics

    NASA Astrophysics Data System (ADS)

    Ardenghi, Juan Sebastián

    2018-05-01

    The aim of this work is to introduce the entanglement entropy of real and virtual excitations of fermion and photon fields. By rewriting the generating functional of quantum electrodynamics theory as an inner product between quantum operators, it is possible to obtain quantum density operators representing the propagation of real and virtual particles. These operators are partial traces, where the degrees of freedom traced out are unobserved excitations. Then the von Neumann definition of entropy can be applied to these quantum operators and in particular, for the partial traces taken over by the internal or external degrees of freedom. A universal behavior is obtained for the entanglement entropy for different quantum fields at zeroth order in the coupling constant. In order to obtain numerical results at different orders in the perturbation expansion, the Bloch-Nordsieck model is considered, where it is shown that for some particular values of the electric charge, the von Neumann entropy increases or decreases with respect to the noninteracting case.

  18. New paradigm for task switching strategies while performing multiple tasks: entropy and symbolic dynamics analysis of voluntary patterns.

    PubMed

    Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten

    2012-10-01

    It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs costs in response time to complete the next task. Conditions are also known that exaggerate or lessen the switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks, producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes on fatigue.
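    A minimal sketch of the two sequence measures compared in this study, Shannon entropy of the task-choice distribution and a pattern-based topological entropy estimate; the random choice sequence below is a stand-in for real data, and the pattern length is an illustrative assumption rather than the orbital-decomposition optimum.

```python
# Minimal sketch: Shannon and pattern-based topological entropy of a task-choice sequence.
import numpy as np
from collections import Counter

def shannon_entropy(seq):
    counts = np.array(list(Counter(seq).values()), float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def topological_entropy(seq, L=3):
    # log of the number of distinct length-L patterns, per symbol
    patterns = {tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)}
    return np.log2(len(patterns)) / L

rng = np.random.default_rng(5)
choices = list(rng.integers(0, 7, size=49))       # 49 choices among 7 tasks
print("Shannon entropy     :", round(shannon_entropy(choices), 3), "bits")
print("topological entropy :", round(topological_entropy(choices), 3), "bits/symbol")
```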

  19. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
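    A minimal sketch of the post-processing issues discussed above (simulated biased bits, not a physical entropy source): the per-bit min-entropy of a raw stream and the classic von Neumann extractor as the simplest randomness-extraction scheme.

```python
# Minimal sketch: min-entropy of a biased bit stream and von Neumann extraction.
import numpy as np

def min_entropy_per_bit(bits):
    p1 = np.mean(bits)
    return -np.log2(max(p1, 1 - p1))

def von_neumann_extract(bits):
    """Map bit pairs 01 -> 0, 10 -> 1 and drop 00/11 pairs."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

rng = np.random.default_rng(6)
raw = (rng.random(100000) < 0.7).astype(np.uint8)    # biased source, P(1) = 0.7
out = von_neumann_extract(raw)
print("raw min-entropy/bit:", round(min_entropy_per_bit(raw), 3),
      " output bias:", round(float(np.mean(out)), 3),
      " extraction rate:", round(len(out) / len(raw), 3))
```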

  20. Investigating weaknesses in Android certificate security

    NASA Astrophysics Data System (ADS)

    Krych, Daniel E.; Lange-Maney, Stephen; McDaniel, Patrick; Glodek, William

    2015-05-01

    Android's application market relies on secure certificate generation to establish trust between applications and their users; yet, cryptography is often not a priority for application developers and many fail to take the necessary security precautions. Indeed, there is cause for concern: several recent high-profile studies have observed a pervasive lack of entropy on Web-systems leading to the factorization of private keys.1 Sufficient entropy, or randomness, is essential to generate secure key pairs and combat predictable key generation. In this paper, we analyze the security of Android certificates. We investigate the entropy present in 550,000 Android application certificates using the Quasilinear GCD finding algorithm.1 Our results show that while the lack of entropy does not appear to be as ubiquitous in the mobile markets as on Web-systems, there is substantial reuse of certificates: only one third of the certificates in our dataset were unique. In other words, we find that organizations frequently reuse certificates for different applications. While such a practice is acceptable under Google's specifications for a single developer, we find that in some cases the same certificates are used by a myriad of developers, potentially compromising Android's intended trust relationships. Further, we observed duplicate certificates being used by both malicious and non-malicious applications. The top 3 repeated certificates present in our dataset accounted for a total of 11,438 separate APKs. Of these applications, 451, or roughly 4%, were identified as malicious by antivirus services.
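
    The key-reuse analysis rests on the observation that two RSA moduli sharing a prime factor can both be factored with a single GCD. The sketch below shows the naive pairwise version of that idea; the paper's Quasilinear GCD finding algorithm achieves the same result at scale using product and remainder trees, which are not reproduced here. The toy primes are hypothetical values chosen only for illustration.

```python
from math import gcd

def shared_factors(moduli):
    """Naive O(n^2) pairwise-GCD check for RSA moduli that share a prime
    factor; any hit means both keys can be factored immediately."""
    hits = []
    for i in range(len(moduli)):
        for j in range(i + 1, len(moduli)):
            g = gcd(moduli[i], moduli[j])
            if g > 1:  # shared prime => both moduli are factorable
                hits.append((i, j, g))
    return hits

# Toy example with deliberately reused primes (hypothetical values).
p, q, r = 10007, 10009, 10037
print(shared_factors([p * q, p * r, q * r]))
```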

  1. Detection of direct and indirect noise generated by synthetic hot spots in a duct

    NASA Astrophysics Data System (ADS)

    De Domenico, Francesca; Rolland, Erwan O.; Hochgreb, Simone

    2017-04-01

    Sound waves in a combustor are generated from fluctuations in the heat release rate (direct noise) or from the acceleration of entropy, vorticity or compositional perturbations through nozzles or turbine guide vanes (indirect or entropy noise). These sound waves are transmitted downstream as well as reflected upstream of the acceleration point, contributing to the overall noise emissions or triggering combustion instabilities. Previous experiments attempted to isolate indirect noise by generating thermoacoustic hot spots electrically and measuring the transmitted acoustic waves, yet there have been no measurements of the backward-propagating entropy and acoustic waves. This work presents the first measurements which clearly separate the direct and indirect noise contributions to pressure fluctuations upstream of the acceleration point. Synthetic entropy spots are produced by unsteady electrical heating of a grid of thin wires located in a tube. Compression waves (direct noise) are generated by this heating process. The hot spots are then advected with the mean flow and finally accelerated through an orifice plate located at the end of the tube, producing a strong acoustic signature which propagates upstream (indirect noise). The convective time is selected to be longer than the heating pulse length, in order to obtain a clear time separation between direct and indirect noise in the overall pressure trace. The contribution of indirect noise to the overall noise is shown to be non-negligible in both subsonic and sonic throat conditions. However, the absolute amplitude of direct noise is larger than the corresponding fraction of indirect noise, explaining the difficulty in clearly identifying the two contributions when they are merged. Further, the work shows the importance of using appropriate pressure transducer instrumentation and of correcting for the respective transfer functions in order to account for low-frequency effects in the determination of pressure fluctuations.

  2. Extension of Murray's law using a non-Newtonian model of blood flow.

    PubMed

    Revellin, Rémi; Rousset, François; Baud, David; Bonjour, Jocelyn

    2009-05-15

    So far, none of the existing treatments of Murray's law deal with the non-Newtonian behavior of blood flow, although a non-Newtonian approach to blood flow modelling is more accurate. MODELING: In the present paper, Murray's law, which is applicable to an arterial bifurcation, is generalized to a non-Newtonian blood flow model (power-law model). When the vessel size reaches the capillary limit, blood can be modeled using a non-Newtonian constitutive equation. Two different constraints are assumed in addition to the pumping power: the volume constraint or the surface constraint (related to the internal surface of the vessel). For the sake of generality, the relationships are given for an arbitrary number of daughter vessels. It is shown that for a cost function including the volume constraint, classical Murray's law remains valid (i.e., ΣR^c = const with c = 3 is verified and is independent of n, the dimensionless index in the viscosity equation; R being the radius of the vessel). On the contrary, for a cost function including the surface constraint, different values of c may be calculated depending on the value of n. We find that c varies for blood from 2.42 to 3 depending on the constraint and the fluid properties. For the Newtonian model, the surface constraint leads to c = 2.5. The cost function (based on the surface constraint) can be related to entropy generation by dividing it by the temperature. It is demonstrated that the entropy generated in all the daughter vessels is greater than the entropy generated in the parent vessel. Furthermore, it is shown that the difference in entropy generation between the parent and daughter vessels is smaller for a non-Newtonian fluid than for a Newtonian fluid.
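
    As a worked restatement of the relations quoted above (reading the constancy of ΣR^c across a bifurcation in the usual Murray sense, with R_0 the parent radius and R_i the radii of the n_d daughter vessels; the grouping below is an editorial summary, not the paper's own notation):

```latex
\[
  R_0^{\,c} \;=\; \sum_{i=1}^{n_d} R_i^{\,c},
  \qquad
  \begin{cases}
    c = 3, & \text{volume constraint (classical Murray's law, any } n\text{)},\\[2pt]
    c = 2.5, & \text{surface constraint, Newtonian fluid},\\[2pt]
    2.42 \le c \le 3, & \text{surface constraint, power-law blood model (depends on } n\text{)}.
  \end{cases}
\]
```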

  3. Self-growing neural network architecture using crisp and fuzzy entropy

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.

    1992-01-01

    The paper briefly describes the self-growing neural network algorithm, CID2, which makes decision trees equivalent to hidden layers of a neural network. The algorithm generates a feedforward architecture using crisp and fuzzy entropy measures. The results of a real-life recognition problem of distinguishing defects in a glass ribbon and of a benchmark problem of differentiating two spirals are shown and discussed.

  4. Self-growing neural network architecture using crisp and fuzzy entropy

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.

    1992-01-01

    The paper briefly describes the self-growing neural network algorithm, CID3, which makes decision trees equivalent to hidden layers of a neural network. The algorithm generates a feedforward architecture using crisp and fuzzy entropy measures. The results for a real-life recognition problem of distinguishing defects in a glass ribbon, and for a benchmark problem of telling two spirals apart, are shown and discussed.

  5. Spectral Entropy Can Predict Changes of Working Memory Performance Reduced by Short-Time Training in the Delayed-Match-to-Sample Task

    PubMed Central

    Tian, Yin; Zhang, Huiling; Xu, Wei; Zhang, Haiyong; Yang, Li; Zheng, Shuxing; Shi, Yupan

    2017-01-01

    Spectral entropy, which is obtained by applying the Shannon entropy concept to the power distribution of the Fourier-transformed electroencephalograph (EEG), was utilized to measure the uniformity of the power spectral density underlying the EEG when subjects performed working memory tasks twice, i.e., before and after training. According to Signed Residual Time (SRT) scores, which are based on a response speed and accuracy trade-off, 20 subjects were divided into two groups, namely high-performance and low-performance groups, to undertake working memory (WM) tasks. We found that spectral entropy derived from the retention period of WM on channel FC4 exhibited a high correlation with SRT scores. To this end, spectral entropy was used in a support vector machine classifier with linear kernel to differentiate these two groups. Receiver operating characteristic analysis and leave-one-out cross-validation (LOOCV) demonstrated that the averaged classification accuracy (CA) was 90.0 and 92.5% for intra-session and inter-session data, respectively, indicating that spectral entropy could be used to distinguish these two WM performance groups successfully. Furthermore, a support vector regression prediction model with radial basis function kernel and the root-mean-square error of prediction revealed that spectral entropy could be utilized to predict individual SRT scores for WM performance. After testing the changes in SRT scores and spectral entropy for each subject following short-time training, we found that the SRT scores of 16 of the 20 subjects clearly improved after training and that, for 15 of the 20 subjects, the SRT scores changed consistently with spectral entropy before and after training. The findings reveal that spectral entropy could be a promising indicator for predicting an individual's WM changes with training and further provide a novel WM-related application for brain–computer interfaces. PMID:28912701
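
    A minimal sketch of the spectral-entropy computation described above, assuming a Welch estimate of the power spectral density and base-2 logarithms; the sampling rate, window length and normalization are illustrative choices, not the study's exact settings.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs, normalize=True):
    """Shannon entropy of the normalized power spectral density."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 256))
    p = pxx / pxx.sum()                      # treat the PSD as a distribution
    h = -np.sum(p * np.log2(p + 1e-12))      # entropy in bits
    if normalize:
        h /= np.log2(len(p))                 # 1.0 = perfectly flat spectrum
    return h

rng = np.random.default_rng(1)
fs = 250.0                                   # assumed EEG sampling rate
t = np.arange(0, 10, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)           # narrow-band signal: low entropy
noise = rng.standard_normal(t.size)          # broadband noise: high entropy
print(spectral_entropy(alpha, fs), spectral_entropy(noise, fs))
```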

  6. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.

  7. Measurement-induced randomness and state-merging

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Indranil; Deshpande, Abhishek; Chatterjee, Sourav

    In this work we introduce a notion of randomness that is truly quantum mechanical in nature, arising from the act of measurement. For a composite classical system, the joint entropy quantifies the randomness present in the total system, and it happens to equal the sum of the entropy of one subsystem and the conditional entropy of the other subsystem given that we know the first. The same analogy carries over to the quantum setting by replacing the Shannon entropy with the von Neumann entropy. However, if we replace the conditional von Neumann entropy by the average conditional entropy due to measurement, we find that it differs from the joint entropy of the system. We call this difference Measurement Induced Randomness (MIR) and argue that it is unique to quantum mechanical systems, with no classical counterpart. In other words, the joint von Neumann entropy gives only the total randomness that arises because of the heterogeneity of the mixture, and we show that it is not the total randomness that can be generated in the composite system. We generalize this quantity to N-qubit systems and show that it reduces to quantum discord for two-qubit systems. Further, we show that it is exactly equal to the change in the cost of quantum state merging that arises because of the measurement. We argue that for quantum information processing tasks like state merging, the change in the cost as a result of discarding prior information can also be viewed as a rise in randomness due to measurement.

  8. Information theory-based decision support system for integrated design of multivariable hydrometric networks

    NASA Astrophysics Data System (ADS)

    Keum, Jongho; Coulibaly, Paulin

    2017-07-01

    Adequate and accurate hydrologic information from optimal hydrometric networks is an essential part of effective water resources management. Although the key hydrologic processes in the water cycle are interconnected, hydrometric networks (e.g., streamflow, precipitation, groundwater level) have been routinely designed individually. A decision support framework is proposed for integrated design of multivariable hydrometric networks. The proposed method is applied to design optimal precipitation and streamflow networks simultaneously. The epsilon-dominance hierarchical Bayesian optimization algorithm was combined with Shannon entropy of information theory to design and evaluate hydrometric networks. Specifically, the joint entropy from the combined networks was maximized to provide the most information, and the total correlation was minimized to reduce redundant information. To further optimize the efficiency between the networks, they were designed by maximizing the conditional entropy of the streamflow network given the information of the precipitation network. Compared to the traditional individual variable design approach, the integrated multivariable design method was able to determine more efficient optimal networks by avoiding the redundant stations. Additionally, four quantization cases were compared to evaluate their effects on the entropy calculations and the determination of the optimal networks. The evaluation results indicate that the quantization methods should be selected after careful consideration for each design problem since the station rankings and the optimal networks can change accordingly.
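
    The information-content side of the design problem can be illustrated with a toy greedy selection that adds, one at a time, the station that most increases the joint entropy of the already-quantized records. This is only a sketch of the entropy objective; the paper combines joint entropy, total correlation and conditional entropy within the epsilon-dominance hierarchical Bayesian optimization algorithm, which is not reproduced here. The data and station count below are synthetic.

```python
import numpy as np

def joint_entropy(data):
    """Shannon joint entropy (bits) of already-quantized series, one column
    per station, estimated from empirical frequencies of joint symbols."""
    _, counts = np.unique(data, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def greedy_network(data, k):
    """Toy greedy design: repeatedly add the station that most increases the
    joint entropy of the selected set."""
    selected = []
    remaining = list(range(data.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda j: joint_entropy(data[:, selected + [j]]))
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
base = rng.integers(0, 4, size=(1000, 1))          # shared hydrologic signal
stations = np.hstack([base + rng.integers(0, 2, size=(1000, 1)) for _ in range(5)])
print(greedy_network(stations, k=3))
```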

  9. Bitstream decoding processor for fast entropy decoding of variable length coding-based multiformat videos

    NASA Astrophysics Data System (ADS)

    Jo, Hyunho; Sim, Donggyu

    2014-06-01

    We present a bitstream decoding processor for entropy decoding of variable length coding-based multiformat videos. Since most of the computational complexity of entropy decoders comes from bitstream accesses and table look-up process, the developed bitstream processing unit (BsPU) has several designated instructions to access bitstreams and to minimize branch operations in the table look-up process. In addition, the instruction for bitstream access has the capability to remove emulation prevention bytes (EPBs) of H.264/AVC without initial delay, repeated memory accesses, and additional buffer. Experimental results show that the proposed method for EPB removal achieves a speed-up of 1.23 times compared to the conventional EPB removal method. In addition, the BsPU achieves speed-ups of 5.6 and 3.5 times in entropy decoding of H.264/AVC and MPEG-4 Visual bitstreams, respectively, compared to an existing processor without designated instructions and a new table mapping algorithm. The BsPU is implemented on a Xilinx Virtex5 LX330 field-programmable gate array. The MPEG-4 Visual (ASP, Level 5) and H.264/AVC (Main Profile, Level 4) are processed using the developed BsPU with a core clock speed of under 250 MHz in real time.
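
    For reference, the emulation-prevention-byte (EPB) handling that the designated bitstream instructions accelerate can be expressed in a few lines of software: within an H.264/AVC NAL unit, a 0x03 byte following two zero bytes is an EPB and must be dropped before entropy decoding. The sketch below is a plain software version, not the paper's hardware BsPU implementation.

```python
def remove_epbs(nal_payload: bytes) -> bytes:
    """Strip H.264/AVC emulation prevention bytes: the encoder inserts 0x03
    after any 0x00 0x00 so that start codes cannot be emulated inside a NAL
    unit; the decoder drops that 0x03 before entropy decoding."""
    out = bytearray()
    zeros = 0
    i = 0
    while i < len(nal_payload):
        b = nal_payload[i]
        if zeros >= 2 and b == 0x03:
            zeros = 0          # skip the emulation prevention byte itself
            i += 1
            continue
        out.append(b)
        zeros = zeros + 1 if b == 0x00 else 0
        i += 1
    return bytes(out)

# 00 00 03 01 carries the payload 00 00 01 without emulating a start code.
assert remove_epbs(bytes([0x00, 0x00, 0x03, 0x01])) == bytes([0x00, 0x00, 0x01])
```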

  10. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.

  11. Sample entropy applied to the analysis of synthetic time series and tachograms

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.

    2017-01-01

    Entropy is a method of non-linear analysis that provides an estimate of the irregularity of a system. However, there are several types of computational entropy, which were considered and tested here in order to identify one that gives an index of signal complexity while taking into account the number of data points in the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for the generation of fractal time series with a given value of β was used to characterize the different entropy algorithms. We obtained a significant variation with series size for most of the algorithms, which could be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of series size. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and patients with congestive heart failure were analysed. Sample entropy was calculated for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
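
    A compact reference implementation of sample entropy (SampEn), the measure selected in the study, assuming the common conventions of embedding dimension m = 2 and tolerance r equal to 0.2 times the standard deviation; these defaults are conventional choices, not necessarily the authors' settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    sequences similar for m points (Chebyshev distance <= r) remain similar
    at m + 1 points; self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(2)
print(sample_entropy(np.sin(np.arange(1000) * 0.1)))   # regular: low SampEn
print(sample_entropy(rng.standard_normal(1000)))       # irregular: higher SampEn
```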

  12. Generation of skeletal mechanism by means of projected entropy participation indices

    NASA Astrophysics Data System (ADS)

    Paolucci, Samuel; Valorani, Mauro; Ciottoli, Pietro Paolo; Galassi, Riccardo Malpica

    2017-11-01

    When the dynamics of reactive systems develop very slow and very fast time scales separated by a range of active time scales, with gaps in the fast/active and slow/active time scales, it is possible to achieve multi-scale adaptive model reduction along with the integration of the ODEs using the G-Scheme. The scheme assumes that the dynamics is decomposed into active, slow, fast, and invariant subspaces. We derive expressions that establish a direct link between time scales and entropy production by using estimates provided by the G-Scheme. To calculate the contribution to entropy production, we resort to a standard model of a constant-pressure, adiabatic batch reactor, where the mixture temperature of the reactants is initially set above the auto-ignition temperature. Numerical experiments show that the contribution to entropy production of the fast subspace is of the same magnitude as the error threshold chosen for the identification of the decomposition of the tangent space, and that the contribution of the slow subspace is generally much smaller than that of the active subspace. The information on entropy production associated with the reactions within each subspace is used to define an entropy participation index that is subsequently utilized for model reduction.

  13. Noise, chaos, and (ɛ, τ)-entropy per unit time

    NASA Astrophysics Data System (ADS)

    Gaspard, Pierre; Wang, Xiao-Jing

    1993-12-01

    The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per unit time, at different scales τ of time and ε of the observables. This quantity generalizes the Kolmogorov-Sinai entropy per unit time from deterministic chaotic processes, to stochastic processes such as fluctuations in mesoscopic physico-chemical phenomena or strong turbulence in macroscopic spacetime dynamics. The random processes that are characterized include chaotic systems, Bernoulli and Markov chains, Poisson and birth-and-death processes, Ornstein-Uhlenbeck and Yaglom noises, fractional Brownian motions, different regimes of hydrodynamical turbulence, and the Lorentz-Boltzmann process of nonequilibrium statistical mechanics. We also extend the (ε, τ)-entropy to spacetime processes like cellular automata, Conway's game of life, lattice gas automata, coupled maps, spacetime chaos in partial differential equations, as well as the ideal, the Lorentz, and the hard sphere gases. Through these examples it is demonstrated that the (ε, τ)-entropy provides a unified quantitative measure of dynamical randomness to both chaos and noises, and a method to detect transitions between dynamical states of different degrees of randomness as a parameter of the system is varied.

  14. Entropy reduction via simplified image contourization

    NASA Technical Reports Server (NTRS)

    Turner, Martin J.

    1993-01-01

    The process of contourization is presented which converts a raster image into a set of plateaux or contours. These contours can be grouped into a hierarchical structure, defining total spatial inclusion, called a contour tree. A contour coder has been developed which fully describes these contours in a compact and efficient manner and is the basis for an image compression method. Simplification of the contour tree has been undertaken by merging contour tree nodes thus lowering the contour tree's entropy. This can be exploited by the contour coder to increase the image compression ratio. By applying general and simple rules derived from physiological experiments on the human vision system, lossy image compression can be achieved which minimizes noticeable artifacts in the simplified image.

  15. Entropy generation and momentum transfer in the superconductor-normal and normal-superconductor phase transformations and the consistency of the conventional theory of superconductivity

    NASA Astrophysics Data System (ADS)

    Hirsch, J. E.

    2018-05-01

    Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.

  16. Geometric optimization of an active magnetic regenerative refrigerator via second-law analysis

    NASA Astrophysics Data System (ADS)

    Li, Peng; Gong, Maoqiong; Wu, Jianfeng

    2008-11-01

    Previous analyses [Z. Yan and J. Chen, J. Appl. Phys. 72, 1 (1992); J. Chen and Z. Yan, ibid., 84, 1791 (1998); Lin et al., Physica B 344, 147 (2004); Yang et al., ibid., 364, 33 (2005); Xia et al., ibid., 381, 246 (2006).] of irreversibilities in magnetic refrigerators overlooked several important losses that could be dominant in a real active magnetic regenerative refrigerator (AMRR). No quantitative expressions have been provided yet to estimate the corresponding entropy generations in real AMRRs. The important geometric parameters of AMRRs, such as the aspect ratio of the active magnetic regenerator and the refrigerant diameter, are still arbitrarily chosen. Expressions for calculating different types of entropy generations in the AMRR were derived and used to optimize the aspect ratio and the refrigerant diameter. An optimal coefficient of performance (15.54) was achieved at an aspect ratio of 6.39 and a refrigerant diameter of 1.1mm for our current system. Further study showed that the dissipative sources (e.g., the fluid friction and the unbalanced magnetic forces) in AMRRs, which were overlooked by previous investigations, could significantly contribute to entropy generations.

  17. Entropy generation analysis for film boiling: A simple model of quenching

    NASA Astrophysics Data System (ADS)

    Lotfi, Ali; Lakzian, Esmail

    2016-04-01

    In this paper, quenching in high-temperature materials processing is modeled as flow over a superheated isothermal flat plate. In these processes, a liquid flows over a highly superheated surface for cooling, so the surface and the liquid are separated by a vapor layer that forms where the liquid contacts the superheated surface. This is termed forced film boiling. As an objective, the distribution of entropy generation in laminar forced film boiling is obtained by a similarity solution, for the first time in the context of quenching processes. The governing partial differential equations of laminar film boiling, comprising continuity, momentum, and energy, are reduced to ordinary differential equations, and a dimensionless equation for the entropy generation inside the liquid boundary layer and the vapor layer is obtained. The ODEs are then solved by applying the 4th-order Runge-Kutta method with a shooting procedure. Moreover, the Bejan number is used as a design criterion for a qualitative study of the rate of cooling, and the effects of plate speed on the quenching process are studied. It is observed that for high plate speeds the rate of cooling (heat transfer) is higher.
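
    The numerical machinery (4th-order Runge-Kutta integration driven by a shooting procedure) can be illustrated on a simpler boundary-layer problem. The sketch below solves the classical Blasius equation as a stand-in, since the paper's coupled film-boiling equations are not reproduced here; the bisection bounds, grid and domain length are illustrative.

```python
import numpy as np

def rk4(f, y0, xs):
    """Classical 4th-order Runge-Kutta integration over the nodes xs."""
    y = np.array(y0, dtype=float)
    out = [y.copy()]
    for x0, x1 in zip(xs[:-1], xs[1:]):
        h = x1 - x0
        k1 = f(x0, y)
        k2 = f(x0 + h / 2, y + h / 2 * k1)
        k3 = f(x0 + h / 2, y + h / 2 * k2)
        k4 = f(x1, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(y.copy())
    return np.array(out)

# Blasius boundary-layer equation f''' + 0.5 f f'' = 0 as a stand-in system,
# with f(0) = f'(0) = 0 and the far-field condition f'(inf) = 1.
def blasius(_, y):
    f, fp, fpp = y
    return np.array([fp, fpp, -0.5 * f * fpp])

def shoot(guess, eta_max=10.0, n=2001):
    eta = np.linspace(0.0, eta_max, n)
    sol = rk4(blasius, [0.0, 0.0, guess], eta)
    return sol[-1, 1] - 1.0            # residual of the far-field condition

# Bisection on the unknown wall value f''(0) (the shooting parameter).
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if shoot(lo) * shoot(mid) <= 0:
        hi = mid
    else:
        lo = mid
print("f''(0) ~", 0.5 * (lo + hi))     # ~0.332 for the Blasius problem
```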

  18. A measurement of disorder in binary sequences

    NASA Astrophysics Data System (ADS)

    Gong, Longyan; Wang, Haihong; Cheng, Weiwen; Zhao, Shengmei

    2015-03-01

    We propose a complex quantity, A_L, to characterize the degree of disorder of L-length binary symbolic sequences. As examples, we apply it to typical random and deterministic sequences. One kind of random sequence is generated from a periodic binary sequence and the other from the logistic map. The deterministic sequences are the Fibonacci and Thue-Morse sequences. In these analyzed sequences, we find that the modulus of A_L, denoted by |A_L|, is a (statistically) equivalent quantity to the Boltzmann entropy, the metric entropy, the conditional block entropy and/or other quantities, so it is a useful quantitative measure of disorder. It can serve as a fruitful index to discern which sequence is more disordered. Moreover, there is one and only one value of |A_L| for the overall disorder characteristics. It requires extremely low computational cost and can easily be realized experimentally. For all these reasons, we believe that the proposed measure of disorder is a valuable complement to existing measures for symbolic sequences.
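
    For comparison with the quantities mentioned above, the following sketch computes the Shannon block entropy of length-L blocks for a deterministic Thue-Morse sequence and a pseudo-random sequence; the block lengths, sequence length and base-2 logarithms are illustrative choices, and the proposed A_L itself is not implemented here.

```python
import numpy as np

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks,
    one of the standard disorder measures the proposed |A_L| is compared with."""
    blocks = ["".join(map(str, seq[i:i + L])) for i in range(len(seq) - L + 1)]
    _, counts = np.unique(blocks, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def thue_morse(n):
    """First n symbols of the Thue-Morse sequence (parity of 1-bits in the index)."""
    return [bin(i).count("1") % 2 for i in range(n)]

rng = np.random.default_rng(3)
random_bits = rng.integers(0, 2, size=4096).tolist()
for L in (1, 4, 8):
    print(L, block_entropy(thue_morse(4096), L), block_entropy(random_bits, L))
```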

  19. Numerical study of entropy generation and melting heat transfer on MHD generalised non-Newtonian fluid (GNF): Application to optimal energy

    NASA Astrophysics Data System (ADS)

    Iqbal, Z.; Mehmood, Zaffar; Ahmad, Bilal

    2018-05-01

    This paper concerns an application to optimal energy, incorporating thermal equilibrium in an MHD generalised non-Newtonian fluid model with melting heat effect. The highly nonlinear system of partial differential equations is reduced to a nonlinear ordinary system using the boundary layer approach and similarity transformations. Numerical solutions for the velocity and temperature profiles are obtained using the shooting method. The contribution of entropy generation is appraised for the thermal and fluid velocities. The physical features of the relevant parameters are discussed through graphs and tables. Some noteworthy findings are: the Prandtl number, power-law index and Weissenberg number contribute to lowering the mass boundary layer thickness and the entropy effect while enlarging the thermal boundary layer thickness, whereas an increase in mass boundary layer thickness is due only to the melting heat parameter. Moreover, the thermal boundary layers show the same trend for all parameters, i.e., temperature is enhanced with an increase in the values of the significant parameters. Similarly, the Hartmann and Weissenberg numbers enhance the Bejan number.

  20. A Mixed QM/MM Scoring Function to Predict Protein-Ligand Binding Affinity

    PubMed Central

    Hayik, Seth A.; Dunbrack, Roland; Merz, Kenneth M.

    2010-01-01

    Computational methods for predicting protein-ligand binding free energy continue to be popular as a potential cost-cutting method in the drug discovery process. However, accurate predictions are often difficult to make as estimates must be made for certain electronic and entropic terms in conventional force field based scoring functions. Mixed quantum mechanics/molecular mechanics (QM/MM) methods allow electronic effects for a small region of the protein to be calculated, treating the remaining atoms as a fixed charge background for the active site. Such a semi-empirical QM/MM scoring function has been implemented in AMBER using DivCon and tested on a set of 23 metalloprotein-ligand complexes, where QM/MM methods provide a particular advantage in the modeling of the metal ion. The binding affinity of this set of proteins can be calculated with an R2 of 0.64 and a standard deviation of 1.88 kcal/mol without fitting and 0.71 and a standard deviation of 1.69 kcal/mol with fitted weighting of the individual scoring terms. In this study we explore using various methods to calculate terms in the binding free energy equation, including entropy estimates and minimization standards. From these studies we found that using the rotational bond estimate to ligand entropy results in a reasonable R2 of 0.63 without fitting. We also found that using the ESCF energy of the proteins without minimization resulted in an R2 of 0.57, when using the rotatable bond entropy estimate. PMID:21221417

  1. On the design of henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements of a cryptosystem. True random number generation (TRNG) is one approach to producing the key sequence. The randomness sources of TRNGs fall into three main groups: electrical-noise based, jitter based, and chaos based. Chaos-based generators utilize a non-linear dynamical system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed and then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. A comparator process is used as the harvesting method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy values that passed all NIST 800.22 statistical tests.
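
    A purely software illustration of the combined-map idea with comparator harvesting is sketched below, assuming the standard chaotic parameter choices for the Henon (a = 1.4, b = 0.3) and logistic (r = 3.99) maps. A real TRNG would be driven by a physical entropy source; the comparator mapping and seeds used here are assumptions, not the paper's design.

```python
import numpy as np

def chaotic_bits(n, x0=0.1, y0=0.1, z0=0.2, r=3.99, a=1.4, b=0.3):
    """Iterate a 2D Henon map and a 1D logistic map and harvest one bit per
    step by comparing the two (rescaled) states with a simple comparator."""
    x, y = x0, y0          # Henon state
    z = z0                 # logistic state
    bits = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x            # Henon map
        z = r * z * (1.0 - z)                        # logistic map
        bits[i] = 1 if (x + 1.5) / 3.0 > z else 0    # comparator (x roughly in [-1.5, 1.5])
    return bits

def shannon_entropy(bits):
    p1 = bits.mean()
    p = np.array([p1, 1.0 - p1])
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

b = chaotic_bits(100_000)
print("bias:", b.mean(), "entropy per bit:", shannon_entropy(b))
```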

  2. Chilly dark sectors and asymmetric reheating

    NASA Astrophysics Data System (ADS)

    Adshead, Peter; Cui, Yanou; Shelton, Jessie

    2016-06-01

    In a broad class of theories, the relic abundance of dark matter is determined by interactions internal to a thermalized dark sector, with no direct involvement of the Standard Model (SM). We point out that these theories raise an immediate cosmological question: how was the dark sector initially populated in the early universe? Motivated in part by the difficulty of accommodating large amounts of entropy carried in dark radiation with cosmic microwave background measurements of the effective number of relativistic species at recombination, N eff , we aim to establish which admissible cosmological histories can populate a thermal dark sector that never reaches thermal equilibrium with the SM. The minimal cosmological origin for such a dark sector is asymmetric reheating, when the same mechanism that populates the SM in the early universe also populates the dark sector at a lower temperature. Here we demonstrate that the resulting inevitable inflaton-mediated scattering between the dark sector and the SM can wash out a would-be temperature asymmetry, and establish the regions of parameter space where temperature asymmetries can be generated in minimal reheating scenarios. Thus obtaining a temperature asymmetry of a given size either restricts possible inflaton masses and couplings or necessitates a non-minimal cosmology for one or both sectors. As a side benefit, we develop techniques for evaluating collision terms in the relativistic Boltzmann equation when the full dependence on Bose-Einstein or Fermi-Dirac phase space distributions must be retained, and present several new results on relativistic thermal averages in an appendix.

  3. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    PubMed

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
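
    Because every term of the expansion is analytic for Gaussians, the comparison made in the paper can be mimicked in a few lines: compute the exact differential entropy from the full covariance matrix and the second-order MIE from the marginal and pairwise entropies. The exponentially decaying covariance below is an illustrative stand-in for the protein-derived distributions, not the authors' data.

```python
import numpy as np

def gaussian_entropy(cov):
    """Exact differential entropy (nats) of a multivariate Gaussian."""
    m = cov.shape[0]
    return 0.5 * (m * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def mie2(cov):
    """Second-order mutual information expansion: sum of marginal entropies
    minus all pairwise mutual informations (all terms analytic for Gaussians)."""
    m = cov.shape[0]
    h1 = [gaussian_entropy(cov[[i]][:, [i]]) for i in range(m)]
    total = sum(h1)
    for i in range(m):
        for j in range(i + 1, m):
            h_ij = gaussian_entropy(cov[np.ix_([i, j], [i, j])])
            total -= h1[i] + h1[j] - h_ij     # subtract I(x_i; x_j)
    return total

# Exponentially decaying correlations (assumed), the regime where low-order
# MIE should approach the exact entropy.
m = 20
idx = np.arange(m)
cov = 0.5 ** np.abs(idx[:, None] - idx[None, :])
print("exact:", gaussian_entropy(cov), " MIE-2:", mie2(cov))
```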

  4. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis builds upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled.
In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  5. A method for the fast estimation of a battery entropy-variation high-resolution curve - Application on a commercial LiFePO4/graphite cell

    NASA Astrophysics Data System (ADS)

    Damay, Nicolas; Forgez, Christophe; Bichat, Marie-Pierre; Friedrich, Guy

    2016-11-01

    The entropy-variation of a battery is responsible for heat generation or consumption during operation and its prior measurement is mandatory for developing a thermal model. It is generally done through the potentiometric method which is considered as a reference. However, it requires several days or weeks to get a look-up table with a 5 or 10% SoC (State of Charge) resolution. In this study, a calorimetric method based on the inversion of a thermal model is proposed for the fast estimation of a nearly continuous curve of entropy-variation. This is achieved by separating the heats produced while charging and discharging the battery. The entropy-variation is then deduced from the extracted entropic heat. The proposed method is validated by comparing the results obtained with several current rates to measurements made with the potentiometric method.

  6. Analyzing the financial crisis using the entropy density function

    NASA Astrophysics Data System (ADS)

    Oh, Gabjin; Kim, Ho-yong; Ahn, Seok-Won; Kwak, Wooseop

    2015-02-01

    The risk that is created by nonlinear interactions among subjects in economic systems is assumed to increase during an abnormal state of a financial market. Nevertheless, investigation of systemic risk in financial markets following the global financial crisis remains insufficient. In this paper, we analyze the entropy density function of the return time series for several financial markets, namely the S&P500, KOSPI, and DAX indices, from October 2002 to December 2011, and we analyze the variability in the entropy value over time. We find that the entropy density function of the S&P500 index during the subprime crisis exhibits a significant decrease compared to that in other periods, whereas the other markets, such as those in Germany and Korea, exhibit no significant decrease during the market crisis. These findings demonstrate that the S&P500 index generated a regular pattern in the return time series during the financial crisis.

  7. Entropy production of doubly stochastic quantum channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller-Hermes, Alexander, E-mail: muellerh@posteo.net; Department of Mathematical Sciences, University of Copenhagen, 2100 Copenhagen; Stilck França, Daniel, E-mail: dsfranca@mytum.de

    2016-02-15

    We study the entropy increase of quantum systems evolving under primitive, doubly stochastic Markovian noise and thus converging to the maximally mixed state. This entropy increase can be quantified by a logarithmic-Sobolev constant of the Liouvillian generating the noise. We prove a universal lower bound on this constant that stays invariant under taking tensor-powers. Our methods involve a new comparison method to relate logarithmic-Sobolev constants of different Liouvillians and a technique to compute logarithmic-Sobolev inequalities of Liouvillians with eigenvectors forming a projective representation of a finite abelian group. Our bounds improve upon similar results established before, and as an application we prove an upper bound on continuous-time quantum capacities. In the last part of this work we study entropy production estimates of discrete-time doubly stochastic quantum channels by extending the framework of discrete-time logarithmic-Sobolev inequalities to the quantum case.

  8. Unbiased All-Optical Random-Number Generator

    NASA Astrophysics Data System (ADS)

    Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja

    2017-10-01

    The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.

  9. Entanglement entropy in causal set theory

    NASA Astrophysics Data System (ADS)

    Sorkin, Rafael D.; Yazdi, Yasaman K.

    2018-04-01

    Entanglement entropy is now widely accepted as having deep connections with quantum gravity. It is therefore desirable to understand it in the context of causal sets, especially since they provide in a natural manner the UV cutoff needed to render entanglement entropy finite. Formulating a notion of entanglement entropy in a causal set is not straightforward because the type of canonical hypersurface-data on which its definition typically relies is not available. Instead, we appeal to the more global expression given in Sorkin (2012 (arXiv:1205.2953)) which, for a Gaussian scalar field, expresses the entropy of a spacetime region in terms of the field’s correlation function within that region (its ‘Wightman function’ W(x, x') ). Carrying this formula over to the causal set, one obtains an entropy which is both finite and of a Lorentz invariant nature. We evaluate this global entropy-expression numerically for certain regions (primarily order-intervals or ‘causal diamonds’) within causal sets of 1  +  1 dimensions. For the causal-set counterpart of the entanglement entropy, we obtain, in the first instance, a result that follows a (spacetime) volume law instead of the expected (spatial) area law. We find, however, that one obtains an area law if one truncates the commutator function (‘Pauli–Jordan operator’) and the Wightman function by projecting out the eigenmodes of the Pauli–Jordan operator whose eigenvalues are too close to zero according to a geometrical criterion which we describe more fully below. In connection with these results and the questions they raise, we also study the ‘entropy of coarse-graining’ generated by thinning out the causal set, and we compare it with what one obtains by similarly thinning out a chain of harmonic oscillators, finding the same, ‘universal’ behaviour in both cases.

  10. Evaluation of the entropy consistent euler flux on 1D and 2D test problems

    NASA Astrophysics Data System (ADS)

    Roslan, Nur Khairunnisa Hanisah; Ismail, Farzad

    2012-06-01

    Most CFD simulations may yield good predictions of pressure and velocity when compared to experimental data. Unfortunately, these results will most likely not adhere to the second law of thermodynamics, hence compromising the authenticity of the predicted data. Currently, the test of a good CFD code is to check how much entropy is generated in a smooth flow and to hope that the numerical entropy produced is of the correct sign when a shock is encountered. Herein, a shock-capturing code written in C++ based on a recent entropy consistent Euler flux is developed to simulate 1D and 2D flows. Unlike other finite volume schemes in commercial CFD codes, this entropy consistent (EC) flux function precisely satisfies the discrete second law of thermodynamics. The EC flux has an entropy-conserving part, preserving entropy for smooth flows, and a numerical diffusion part that produces the proper amount of entropy, consistent with the second law. Several numerical simulations with the entropy consistent flux have been performed on two-dimensional test cases. The first case is a Mach 3 flow over a forward-facing step. The second case is a flow over a NACA 0012 airfoil, while the third case is a hypersonic flow past a 2D cylinder. Local flow quantities such as velocity and pressure are analyzed and then compared, mainly with the Roe flux. The results show that the EC flux does not capture the unphysical rarefaction shock, unlike the Roe flux, and does not easily succumb to the carbuncle phenomenon. In addition, the EC flux maintains good performance in cases where the Roe flux is known to be superior.

  11. Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.

    PubMed

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, and naval ships, as well as on oil drilling platforms. However, in most cases they are monitored without personnel on duty. It is therefore highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas paths of the gas turbine. In addition, we extend the entropy to compute the information content of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms.

  12. Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model

    PubMed Central

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, and naval ships, as well as on oil drilling platforms. However, in most cases they are monitored without personnel on duty. It is therefore highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas paths of the gas turbine. In addition, we extend the entropy to compute the information content of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms. PMID:25258726
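
    The uniformity measure at the heart of this approach can be sketched as a normalized Shannon entropy of the exhaust-temperature profile; the share-of-total normalization and the temperature values below are assumptions for illustration, not the paper's kernelized formulation.

```python
import numpy as np

def exhaust_uniformity_entropy(temps_kelvin):
    """Normalized Shannon entropy of the exhaust-temperature profile as a
    uniformity index (1.0 = perfectly uniform ring). The simple share-of-total
    normalization used here is an assumption."""
    t = np.asarray(temps_kelvin, dtype=float)
    p = t / t.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(len(t))        # divide by the maximum possible entropy

healthy = np.array([850, 845, 855, 848, 852, 851.0])   # nearly uniform ring
faulty = np.array([850, 845, 760, 848, 852, 851.0])    # one cold sector
print(exhaust_uniformity_entropy(healthy), exhaust_uniformity_entropy(faulty))
```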

  13. Secure uniform random-number extraction via incoherent strategies

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Zhu, Huangjun

    2018-01-01

    To guarantee the security of uniform random numbers generated by a quantum random-number generator, we study secure extraction of uniform random numbers when the environment of a given quantum state is controlled by the third party, the eavesdropper. Here we restrict our operations to incoherent strategies that are composed of the measurement on the computational basis and incoherent operations (or incoherence-preserving operations). We show that the maximum secure extraction rate is equal to the relative entropy of coherence. By contrast, the coherence of formation gives the extraction rate when a certain constraint is imposed on the eavesdropper's operations. The condition under which the two extraction rates coincide is then determined. Furthermore, we find that the exponential decreasing rate of the leaked information is characterized by Rényi relative entropies of coherence. These results clarify the power of incoherent strategies in random-number generation, and can be applied to guarantee the quality of random numbers generated by a quantum random-number generator.
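
    The abstract's central quantity, the relative entropy of coherence, is straightforward to evaluate numerically: dephase the state in the computational basis and take the entropy difference. A minimal sketch for a single qubit follows; the example states are illustrative.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return -np.sum(vals * np.log2(vals))

def relative_entropy_of_coherence(rho):
    """C_r(rho) = S(diag(rho)) - S(rho), with dephasing taken in the
    computational basis; per the abstract, this equals the maximum secure
    extraction rate under incoherent strategies."""
    dephased = np.diag(np.diag(rho))
    return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

# |+><+| : one fully coherent qubit should yield one secure uniform bit.
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
print(relative_entropy_of_coherence(plus))           # ~1.0
print(relative_entropy_of_coherence(np.eye(2) / 2))  # maximally mixed: 0.0
```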

  14. An entropy correction method for unsteady full potential flows with strong shocks

    NASA Technical Reports Server (NTRS)

    Whitlow, W., Jr.; Hafez, M. M.; Osher, S. J.

    1986-01-01

    An entropy correction method for the unsteady full potential equation is presented. The unsteady potential equation is modified to account for entropy jumps across shock waves. The conservative form of the modified equation is solved in generalized coordinates using an implicit, approximate factorization method. A flux-biasing differencing method, which generates the proper amounts of artificial viscosity in supersonic regions, is used to discretize the flow equations in space. Comparisons between the present method and solutions of the Euler equations and between the present method and experimental data are presented. The comparisons show that the present method more accurately models solutions of the Euler equations and experiment than does the isentropic potential formulation.

  15. Maximum-entropy description of animal movement.

    PubMed

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.
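
    One member of this class, the Ornstein-Uhlenbeck position process, can be simulated exactly with its discrete-time update, which is a convenient way to generate tracks consistent with the maximum-entropy description; the time step, relaxation time and noise scale below are illustrative values, not estimates from animal data.

```python
import numpy as np

def ou_track(n_steps, dt, tau, sigma, x0=0.0, seed=0):
    """Exact discrete-time update of a 1D Ornstein-Uhlenbeck process
    dx = -(x / tau) dt + sigma dW, one of the movement models listed above."""
    rng = np.random.default_rng(seed)
    a = np.exp(-dt / tau)
    s = sigma * np.sqrt(tau / 2.0 * (1.0 - a ** 2))   # stationary-consistent noise scale
    x = np.empty(n_steps)
    x[0] = x0
    for i in range(1, n_steps):
        x[i] = a * x[i - 1] + s * rng.standard_normal()
    return x

track = ou_track(n_steps=10_000, dt=0.1, tau=5.0, sigma=1.0)
print("sample variance:", track.var(), "theory:", 1.0 ** 2 * 5.0 / 2.0)
```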

  16. Gravitational baryogenesis in running vacuum models

    NASA Astrophysics Data System (ADS)

    Oikonomou, V. K.; Pan, Supriya; Nunes, Rafael C.

    2017-08-01

    We study the gravitational baryogenesis mechanism for generating baryon asymmetry in the context of running vacuum models. Regardless of whether these models can produce a viable cosmological evolution, we demonstrate that they produce a nonzero baryon-to-entropy ratio even if the universe is filled with conformal matter. This is a sound difference between the running vacuum gravitational baryogenesis and the Einstein-Hilbert one, since in the latter case, the predicted baryon-to-entropy ratio is zero. We consider two well known and most used running vacuum models and show that the resulting baryon-to-entropy ratio is compatible with the observational data. Moreover, we also show that the mechanism of gravitational baryogenesis may constrain the running vacuum models.

  17. Integrated design of multivariable hydrometric networks using entropy theory with a multiobjective optimization approach

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Hwang, T.; Vose, J. M.; Martin, K. L.; Band, L. E.

    2016-12-01

    Obtaining quality hydrologic observations is the first step towards successful water resources management. While remote sensing techniques have made it possible to convert satellite images of the Earth's surface into hydrologic data, the importance of ground-based observations has never diminished, because in-situ data are often highly accurate and can be used to validate remote measurements. Efficient hydrometric networks are becoming increasingly important for obtaining as much information as possible with minimum redundancy. The World Meteorological Organization (WMO) has recommended a guideline for minimum hydrometric network density based on physiography; however, this guideline is not aimed at optimum network design but at avoiding serious deficiencies in a network. Moreover, all hydrologic variables are interconnected within the hydrologic cycle, while monitoring networks have been designed individually. This study proposes an integrated network design method using entropy theory with a multiobjective optimization approach. Specifically, a precipitation and a streamflow network in a semi-urban watershed in Ontario, Canada, were designed simultaneously by maximizing joint entropy, minimizing total correlation, and maximizing the conditional entropy of the streamflow network given the information of the precipitation network. Compared with typical individual network designs, the proposed design method is able to determine more efficient optimal networks by avoiding redundant stations at which hydrologic information is transferable. Additionally, four quantization cases were applied in the entropy calculations to assess their implications for the station rankings and the optimal networks. The results showed that the quantization method should be selected carefully for each design problem, because the rankings and optimal networks can change accordingly.

  18. Integrated design of multivariable hydrometric networks using entropy theory with a multiobjective optimization approach

    NASA Astrophysics Data System (ADS)

    Keum, J.; Coulibaly, P. D.

    2017-12-01

    Obtaining quality hydrologic observations is the first step towards successful water resources management. While remote sensing techniques have made it possible to convert satellite images of the Earth's surface into hydrologic data, the importance of ground-based observations has not diminished, because in-situ data are often highly accurate and can be used to validate remote measurements. Efficient hydrometric networks are becoming more important for obtaining as much information as possible with minimum redundancy. The World Meteorological Organization (WMO) has recommended a guideline for the minimum hydrometric network density based on physiography; however, this guideline is not for optimum network design but for avoiding serious deficiencies in a network. Moreover, all hydrologic variables are interconnected within the hydrologic cycle, yet monitoring networks have typically been designed individually. This study proposes an integrated network design method using entropy theory with a multiobjective optimization approach. Specifically, a precipitation network and a streamflow network in a semi-urban watershed in Ontario, Canada were designed simultaneously by maximizing joint entropy, minimizing total correlation, and maximizing the conditional entropy of the streamflow network given the precipitation network. Compared with typical individual network designs, the proposed method is able to determine more efficient optimal networks by avoiding redundant stations whose hydrologic information is transferable. Additionally, four quantization cases were applied in the entropy calculations to assess their implications for the station rankings and the optimal networks. The results showed that the selection of the quantization method should be considered carefully because the rankings and optimal networks are subject to change accordingly.

  19. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    to the query graph, or subgraph permutations with the same mismatch cost (often the case for homogeneous and/or symmetrical data/query). To avoid... decisions are generated in a bottom-up manner using the metric of entropy at the cluster level (Figure 9c). Using the definition of belief messages... for a cluster and a set of data nodes in this cluster, we compute the entropy for forward and backward messages as (,) = −∑ (

  20. Acoustic firearm discharge detection and classification in an enclosed environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luzi, Lorenzo; Gonzalez, Eric; Bruillard, Paul

    2016-05-01

    Two different signal processing algorithms are described for the detection and classification of acoustic signals generated by firearm discharges in small enclosed spaces. The first is based on the logarithm of the signal energy; the second on a joint entropy measure. The current study indicates that a system using both signal energy and joint entropy would be able to both detect weapon discharges and classify weapon type, in small spaces, with high statistical certainty.
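    A minimal sketch of the two kinds of features described, assuming the "joint entropy" is estimated from the 2-D histogram of a frame and a lagged copy of itself; the lag, bin count and toy signal are illustrative and the report's exact definitions may differ.

```python
import numpy as np

def log_energy(frame):
    """Logarithm of the frame energy."""
    return float(np.log(np.sum(frame.astype(float) ** 2) + 1e-12))

def joint_entropy(frame, lag=1, n_bins=32):
    """Joint Shannon entropy (bits) of the signal and a lagged copy,
    estimated from their 2-D histogram; one plausible reading of 'joint entropy'."""
    x, y = frame[:-lag], frame[lag:]
    hist, _, _ = np.histogram2d(x, y, bins=n_bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# toy impulsive event embedded in low-level noise
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.01, 4096)
signal[2000:2100] += np.exp(-np.arange(100) / 20.0) * np.sign(rng.normal(size=100))
print(log_energy(signal), joint_entropy(signal))
```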

  1. Intrinsic measures of field entropy in cosmological particle creation

    NASA Astrophysics Data System (ADS)

    Hu, B. L.; Pavon, D.

    1986-11-01

    Using the properties of quantum parametric oscillators, two quantities are identified which increase monotonically in time in the process of parametric amplification. The use of these quantities as possible measures of entropy generation in vacuum cosmological particle creation is suggested. These quantities, which are of a complementary nature, are both related to the number of particles spontaneously created.

  2. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram Language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses...the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function...In addition, the SD based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.

  3. IFSM fractal image compression with entropy and sparsity constraints: A sequential quadratic programming approach

    NASA Astrophysics Data System (ADS)

    Kunze, Herb; La Torre, Davide; Lin, Jianyi

    2017-01-01

    We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its fixed point f̄ is sufficiently close to f in the Lp distance. Forte and Vrscay [1] showed how to reduce this problem to a quadratic optimization model. In this paper, we extend the collage-based method developed by Kunze, La Torre and Vrscay ([2][3][4]) by proposing the minimization of the ℓ1 norm instead of the ℓ0 norm. In fact, optimization problems involving the ℓ0 norm are combinatorial in nature, and hence in general NP-hard. To overcome these difficulties, we introduce the ℓ1 norm and propose a Sequential Quadratic Programming algorithm to solve the corresponding inverse problem. As in Kunze, La Torre and Vrscay [3], in our formulation the minimization of the collage error is treated as a multi-criteria problem that includes three different and conflicting criteria, i.e., collage error, entropy and sparsity. This multi-criteria program is solved by means of a scalarization technique which reduces the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented.
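    The scalarization step can be illustrated with a toy weighted-sum objective minimized by SciPy's SLSQP routine (a sequential quadratic programming method). The three terms below merely stand in for collage error, entropy and sparsity; their exact form, signs and weights are assumptions and not the paper's functional.

```python
import numpy as np
from scipy.optimize import minimize

def scalarized_objective(x, w, A, b):
    """Weighted combination of three placeholder criteria standing in for the
    paper's collage error, entropy and sparsity terms (signs/weights illustrative)."""
    collage_error = np.sum((A @ x - b) ** 2)           # fidelity term: minimize
    p = np.abs(x) / (np.sum(np.abs(x)) + 1e-12)
    entropy = -np.sum(p * np.log(p + 1e-12))           # spread of the normalized coefficients
    sparsity = np.sum(np.abs(x))                       # l1 term: minimize
    return w[0] * collage_error - w[1] * entropy + w[2] * sparsity

rng = np.random.default_rng(0)
A, b = rng.normal(size=(40, 8)), rng.normal(size=40)
weights = (1.0, 0.1, 0.05)                             # trade-off weights (illustrative)
result = minimize(scalarized_objective, x0=np.full(8, 0.1), args=(weights, A, b),
                  method="SLSQP", bounds=[(0.0, 1.0)] * 8)
print(result.x.round(3))
```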

  4. Complexity Measures in Magnetoencephalography: Measuring "Disorder" in Schizophrenia

    PubMed Central

    Brookes, Matthew J.; Hall, Emma L.; Robson, Siân E.; Price, Darren; Palaniyappan, Lena; Liddle, Elizabeth B.; Liddle, Peter F.; Robinson, Stephen E.; Morris, Peter G.

    2015-01-01

    This paper details a methodology which, when applied to magnetoencephalography (MEG) data, is capable of measuring the spatio-temporal dynamics of ‘disorder’ in the human brain. Our method, which is based upon signal entropy, shows that spatially separate brain regions (or networks) generate temporally independent entropy time-courses. These time-courses are modulated by cognitive tasks, with an increase in local neural processing characterised by localised and transient increases in entropy in the neural signal. We explore the relationship between entropy and the more established time-frequency decomposition methods, which elucidate the temporal evolution of neural oscillations. We observe a direct but complex relationship between entropy and oscillatory amplitude, which suggests that these metrics are complementary. Finally, we provide a demonstration of the clinical utility of our method, using it to shed light on aberrant neurophysiological processing in schizophrenia. We demonstrate significantly increased task induced entropy change in patients (compared to controls) in multiple brain regions, including a cingulo-insula network, bilateral insula cortices and a right fronto-parietal network. These findings demonstrate potential clinical utility for our method and support a recent hypothesis that schizophrenia can be characterised by abnormalities in the salience network (a well characterised distributed network comprising bilateral insula and cingulate cortices). PMID:25886553
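    A minimal sample-entropy implementation of the kind that underlies such signal-entropy time-courses is sketched below; the embedding dimension m and tolerance r are conventional choices, not the values used in the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal: -ln(A/B), where B counts
    pairs of length-m templates within tolerance r and A does the same for length m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(m):
        templates = np.array([x[i:i + m] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance between template i and all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

rng = np.random.default_rng(0)
print("white noise :", sample_entropy(rng.normal(size=1000)))
print("sine wave   :", sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))
```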

  5. Galilei group with multiple central extension, vorticity, and entropy generation: Exotic fluid in 3 +1 dimensions

    NASA Astrophysics Data System (ADS)

    Das, Praloy; Ghosh, Subir

    2017-12-01

    A noncommutative extension of an ideal (Hamiltonian) fluid model in 3+1 dimensions is proposed. The model enjoys several interesting features: it allows a multiparameter central extension in the Galilean boost algebra (which is significant, being contrary to the existing belief that a similar feature can appear only in 2+1 dimensions); noncommutativity generates vorticity in a canonically irrotational fluid; and it induces a nonbarotropic pressure leading to a nonisentropic system. (Barotropic fluids are entropy preserving, as the pressure depends only on the matter density.) Our fluid model is termed "exotic" since it has a close resemblance to the extensively studied planar (2+1 dimensional) exotic models and exotic (noncommutative) field theories.

  6. Accuracy in the estimation of quantitative minimal area from the diversity/area curve.

    PubMed

    Vives, Sergi; Salicrú, Miquel

    2005-05-01

    The problem of representativity is fundamental in ecological studies. A qualitative minimal area that gives a good representation of the species pool [C.M. Bouderesque, Methodes d'etude qualitative et quantitative du benthos (en particulier du phytobenthos), Tethys 3(1) (1971) 79] can be distinguished from a quantitative minimal area which reflects the structural complexity of the community [F.X. Niell, Sobre la biologia de Ascophyllum nodosum (L.) Le Jolis en Galicia, Invest. Pesq. 43 (1979) 501]. This suggests that the populational diversity can be considered as the value of the horizontal asymptote of the sample diversity/biomass curve [F.X. Niell, Les applications de l'index de Shannon a l'etude de la vegetation intertidale, Soc. Phycol. Fr. Bull. 19 (1974) 238]. In this study we develop an expression to determine minimal areas and use it to obtain certain information about the community structure based on diversity/area curve graphs. This expression is based on the functional relationship between the expected value of the diversity and the sample size used to estimate it. In order to establish the quality of the estimation process, we obtained the confidence intervals as a particularization of the functional (h,φ)-entropies proposed in [M. Salicru, M.L. Menendez, D. Morales, L. Pardo, Asymptotic distribution of (h,φ)-entropies, Commun. Stat. (Theory Methods) 22 (7) (1993) 2015]. As an example to demonstrate the possibilities of this method, and for illustrative purposes only, data from a study on the rocky intertidal seaweed populations in the Ria of Vigo (N.W. Spain) are analyzed [F.X. Niell, Estudios sobre la estructura, dinamica y produccion del Fitobentos intermareal (Facies rocosa) de la Ria de Vigo. Ph.D. Mem. University of Barcelona, Barcelona, 1979].
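    The idea that sample diversity approaches a population asymptote as the sampled area grows can be illustrated with a toy rarefaction experiment; the synthetic community and the plain Shannon index below are illustrative and do not reproduce the paper's (h,φ)-entropy confidence intervals.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index H' = -sum p_i ln p_i; accepts raw counts or probabilities."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(42)
n_species = 60
true_p = rng.dirichlet(np.full(n_species, 0.3))        # synthetic community abundances
print("population diversity:", round(shannon_diversity(true_p), 3))

for n in (20, 100, 500, 2500, 12500):                  # increasing sample "area" (individuals counted)
    sample = rng.multinomial(n, true_p)
    print(f"sample size {n:5d}: H' = {shannon_diversity(sample):.3f}")
```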

  7. Constructing entanglement wedges for Lifshitz spacetimes with Lifshitz gravity

    NASA Astrophysics Data System (ADS)

    Cheyne, Jonathan; Mattingly, David

    2018-03-01

    Holographic relationships between entanglement entropy on the boundary of a spacetime and the area of minimal surfaces in the bulk provide an important entry in the bulk/boundary dictionary. While constructing the necessary causal and entanglement wedges is well understood in asymptotically AdS spacetimes, less is known about the equivalent constructions in spacetimes with different asymptotics. In particular, recent attempts to construct entanglement and causal wedges for asymptotically Lifshitz solutions in relativistic gravitational theories have proven problematic. We note a simple observation, that a Lifshitz bulk theory, specifically a covariant formulation of Hořava-Lifshitz gravity coupled to matter, has causal propagation defined by Lifshitz modes. We use these modes to construct causal and entanglement wedges and compute the geometric entanglement entropy, which in such a construction matches the field theory prescription.

  8. Logarithmic corrections to black hole entropy: the non-BPS branch

    NASA Astrophysics Data System (ADS)

    Castro, Alejandra; Godet, Victor; Larsen, Finn; Zeng, Yangwenxiao

    2018-05-01

    We compute the leading logarithmic correction to black hole entropy on the non-BPS branch of 4D N≥2 supergravity theories. This branch corresponds to finite temperature black holes whose extremal limit does not preserve supersymmetry, such as the D0-D6 system in string theory. Starting from a black hole in minimal Kaluza-Klein theory, we discuss in detail its embedding into N=8, 6, 4, 2 supergravity, its spectrum of quadratic fluctuations in all these environments, and the resulting quantum corrections. We find that the c-anomaly vanishes only when N≥6, in contrast to the BPS branch where c vanishes for all N≥2. We briefly discuss potential repercussions this feature could have in a microscopic description of these black holes.

  9. A plea for "variational neuroethology". Comment on "Answering Schrödinger's question: A free-energy formulation" by M.J. Desormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Daunizeau, Jean

    2018-03-01

    What is life? According to Erwin Schrödinger [13], the living cell departs from other physical systems in that it - apparently - resists the second law of thermodynamics by restricting the dynamical repertoire (minimizing the entropy) of its physiological states. This is a physical rephrasing of Claude Bernard's biological notion of homeostasis, namely: the capacity of living systems to self-organize in order to maintain the stability of their internal milieu despite uninterrupted exchanges with an ever-altering external environment [2]. The important point here is that physical systems can neither identify nor prevent a state of high entropy. The Free Energy Principle or FEP was originally proposed as a mathematical description of how the brain actually solves this issue [4]. In line with the Bayesian brain hypothesis, the FEP views the brain as a hierarchical statistical learning machine, endowed with the imperative of minimizing Free Energy, i.e. prediction error. Action prescription under the FEP, however, does not follow standard Bayesian decision theory. Rather, action is assumed to further minimize Free Energy, which makes the active brain a self-fulfilling prophecy machine [6]. This is adaptive, under the assumption that evolution has equipped the brain with innate priors centered on homeostatic set points. In turn, avoiding (surprising) violations of such prior predictions implements homeostatic regulation [10], which becomes increasingly anticipatory as learning unfolds over the course of ontological development [5].

  10. Does horizon entropy satisfy a quantum null energy conjecture?

    NASA Astrophysics Data System (ADS)

    Fu, Zicao; Marolf, Donald

    2016-12-01

    A modern version of the idea that the area of event horizons gives 4G times an entropy is the Hubeny-Rangamani causal holographic information (CHI) proposal for holographic field theories. Given a region R of a holographic QFT, CHI computes A/4G on a certain cut of an event horizon in the gravitational dual. The result is naturally interpreted as a coarse-grained entropy for the QFT. CHI is known to be finitely greater than the fine-grained Hubeny-Rangamani-Takayanagi (HRT) entropy when ∂R lies on a Killing horizon of the QFT spacetime, and in this context satisfies other non-trivial properties expected of an entropy. Here we present evidence that it also satisfies the quantum null energy condition (QNEC), which bounds the second derivative of the entropy of a quantum field theory on one side of a non-expanding null surface by the flux of stress-energy across the surface. In particular, we show CHI to satisfy the QNEC in 1+1 holographic CFTs when evaluated in states dual to conical defects in AdS3. This surprising result further supports the idea that CHI defines a useful notion of coarse-grained holographic entropy, and suggests unprecedented bounds on the rate at which bulk horizon generators emerge from a caustic. To supplement our motivation, we include an appendix deriving a corresponding coarse-grained generalized second law for 1+1 holographic CFTs perturbatively coupled to dilaton gravity.

  11. Identification of breathing cracks in a beam structure with entropy

    NASA Astrophysics Data System (ADS)

    Wimarshana, Buddhi; Wu, Nan; Wu, Christine

    2016-04-01

    A cantilever beam with a breathing crack is studied to detect and evaluate the crack using entropy measures. Closed cracks in engineering structures add complexity to their vibration responses in proportion to the weak bi-linearity imposed by the crack-breathing phenomenon. Entropy is a measure of system complexity and has the potential to quantify this complexity. The weak bi-linearity in vibration signals can be amplified using wavelet transformation to increase the sensitivity of the measurements. A mathematical model of a harmonically excited, unit-length steel cantilever beam with a breathing crack located near the fixed end is established, and an iterative numerical method is applied to generate accurate time-domain dynamic responses. The bi-linearity in the time-domain signals due to crack breathing is first amplified by wavelet transformation, and then the complexity due to the bi-linearity is quantified using sample entropy to detect the possible crack and estimate the crack depth. It is observed that the method is capable of identifying crack depths even at very early stages of 3%, with the increase in the entropy values exceeding 10% compared with the healthy beam. The current study extends entropy-based damage detection of rotary machines to structural analysis and takes a step further in high-sensitivity structural health monitoring by combining wavelet transformation with entropy calculations. The proposed technique can also be applied to other types of structures, such as plates and shells.

  12. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.

  13. Thermodynamics and evolution.

    PubMed

    Demetrius, L

    2000-09-07

    The science of thermodynamics is concerned with understanding the properties of inanimate matter in so far as they are determined by changes in temperature. The Second Law asserts that in irreversible processes there is a uni-directional increase in thermodynamic entropy, a measure of the degree of uncertainty in the thermal energy state of a randomly chosen particle in the aggregate. The science of evolution is concerned with understanding the properties of populations of living matter in so far as they are regulated by changes in generation time. Directionality theory, a mathematical model of the evolutionary process, establishes that in populations subject to bounded growth constraints, there is a uni-directional increase in evolutionary entropy, a measure of the degree of uncertainty in the age of the immediate ancestor of a randomly chosen newborn. This article reviews the mathematical basis of directionality theory and analyses the relation between directionality theory and statistical thermodynamics. We exploit an analytic relation between temperature, and generation time, to show that the directionality principle for evolutionary entropy is a non-equilibrium extension of the principle of a uni-directional increase of thermodynamic entropy. The analytic relation between these directionality principles is consistent with the hypothesis of the equivalence of fundamental laws as one moves up the hierarchy, from a molecular ensemble where the thermodynamic laws apply, to a population of replicating entities (molecules, cells, higher organisms), where evolutionary principles prevail. Copyright 2000 Academic Press.

  14. He-Ne laser-induced changes in germination, thermodynamic parameters, internal energy, enzyme activities and physiological attributes of wheat during germination and early growth

    NASA Astrophysics Data System (ADS)

    Jamil, Yasir; Perveen, Rashida; Ashraf, Muhammad; Ali, Qasim; Iqbal, Munawar; Ahmad, Muhammad Raza

    2013-04-01

    Using low-power continuous-wave He-Ne laser irradiation of seeds, the germination characteristics, thermodynamic changes and enzyme activities, as well as changes in morphological attributes, were explored for the wheat (Triticum aestivum L. cv. S-24) cultivar. The thermodynamic properties, namely the change in enthalpy (ΔH), entropy generation (ΔS_e), entropy flux (ΔS_c), entropy generation ratio (ΔS_e/Δt), and entropy flux ratio (ΔS_c/Δt), showed significant (P < 0.05) changes at an energy level of 500 mJ. The germination energy (GE), germination percentage (G%), and germination index (GI), as well as the α-amylase and protease activities, were also found to be higher at 500 mJ, while the mean emergence time (MET) and time for 50% germination (E50) decreased for the 300 mJ irradiance. The internal energy of the seeds increased significantly at all laser energy levels, but was highest for 500 mJ 72 h after sowing. The enzyme activities increased up to 24 h after sowing and then declined. The activities of α-amylase and protease were found to be positively correlated with the plant physiological attributes. These results indicate that low-power continuous-wave He-Ne laser (632 nm) treatment has considerable biological effects on seed metabolism during germination as well as on later vegetative growth.

  15. MHD mixed convection and entropy generation of water-alumina nanofluid flow in a double lid driven cavity with discrete heating

    NASA Astrophysics Data System (ADS)

    Hussain, S.; Mehmood, K.; Sagheer, M.

    2016-12-01

    In the present study, entropy generation due to mixed convection in a partially heated square double lid-driven cavity filled with an Al2O3-water nanofluid under the influence of an inclined magnetic field is numerically investigated. Two heat sources are fixed at the lower wall of the cavity, with the remaining part of the bottom wall kept insulated. The top wall and the vertically moving walls are maintained at a constant cold temperature. The buoyant force, together with the two moving vertical walls, drives the flow. The governing equations are discretized in space using the LBB-stable finite element pair Q2/P1disc, which leads to 3rd and 2nd order accuracy in the L2-norm for the velocity/temperature and pressure, respectively, and the fully implicit Crank-Nicolson scheme of 2nd order accuracy is utilized for the temporal discretization. The discretized systems of nonlinear equations are treated using the Newton method, and the associated linear subproblems are solved by means of Gaussian elimination. Numerical results are presented and analyzed by means of streamlines, isotherms, tables and several useful plots. The impacts of the governing parameters on the flow are investigated over specific ranges, namely the Reynolds number (1 ≤ Re ≤ 100), Richardson number (1 ≤ Ri ≤ 50), Hartmann number (0 ≤ Ha ≤ 100), solid volume fraction (0 ≤ ϕ ≤ 0.2), and the angle of the inclined magnetic field (0° ≤ γ ≤ 90°), and the findings are consistent with previously published analyses. The calculation of the average Nusselt number, entropy generation due to heat transfer, fluid friction and the magnetic field, total entropy generation, Bejan number and kinetic energy is the main focus of our study.
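    A typical post-processing step for such studies is the evaluation of local entropy generation from the computed temperature and velocity fields. The sketch below uses the standard dimensional heat-transfer and fluid-friction terms on a toy grid; the MHD term, the non-dimensionalization and the property values used in the paper are omitted or assumed.

```python
import numpy as np

def entropy_generation(T, u, v, dx, dy, k=0.613, mu=1.0e-3, T0=300.0):
    """Local volumetric entropy generation [W/(m^3 K)]:
    heat-transfer part k|grad T|^2 / T0^2 plus viscous part mu*Phi / T0,
    with Phi the usual 2-D dissipation function; MHD contribution omitted."""
    dTdy, dTdx = np.gradient(T, dy, dx)
    dudy, dudx = np.gradient(u, dy, dx)
    dvdy, dvdx = np.gradient(v, dy, dx)
    S_heat = k * (dTdx**2 + dTdy**2) / T0**2
    Phi = 2.0 * (dudx**2 + dvdy**2) + (dudy + dvdx)**2
    S_fric = mu * Phi / T0
    return S_heat + S_fric

# toy fields on a unit cavity grid
n = 64
x = y = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, y)
T = 300.0 + 10.0 * np.sin(np.pi * X) * (1 - Y)
u, v = np.sin(np.pi * X) * np.cos(np.pi * Y), -np.cos(np.pi * X) * np.sin(np.pi * Y)
S = entropy_generation(T, u, v, dx=x[1] - x[0], dy=y[1] - y[0])
print("total entropy generation (toy units):", S.sum() * (x[1] - x[0]) * (y[1] - y[0]))
```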

  16. An uncertainty principle for unimodular quantum groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crann, Jason; Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.

  17. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation) have been discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-time numeral and pattern human memory and the effect of stress on the dynamical tapping test); the random dynamics of RR-intervals in the human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and the chaotic dynamics of the parameters of financial markets and ecological systems.

  18. Convex foundations for generalized MaxEnt models

    NASA Astrophysics Data System (ADS)

    Frongillo, Rafael; Reid, Mark D.

    2014-12-01

    We present an approach to maximum entropy models that highlights the convex geometry and duality of generalized exponential families (GEFs) and their connection to Bregman divergences. Using our framework, we are able to resolve a puzzling aspect of the bijection of Banerjee and coauthors between classical exponential families and what they call regular Bregman divergences. Their regularity condition rules out all but Bregman divergences generated from log-convex generators. We recover their bijection and show that a much broader class of divergences corresponds to GEFs via two key observations: 1) like classical exponential families, GEFs have a "cumulant" C whose subdifferential contains the mean: E_{o∼p_θ}[φ(o)] ∈ ∂C(θ); 2) the generalized relative entropy is a C-Bregman divergence between parameters: D_F(p_θ, p_θ') = D_C(θ, θ'), where D_F becomes the KL divergence for F = -H. We also show that every incomplete market with cost function C can be expressed as a complete market, where the prices are constrained to be a GEF with cumulant C. This provides an entirely new interpretation of prediction markets, relating their design back to the principle of maximum entropy.
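    The stated relation that the Bregman divergence generated by F = -H reduces to the KL divergence is easy to check numerically; the sketch below is only that check, with generic probability vectors, and does not implement the paper's GEF machinery.

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

def neg_entropy(p):                 # F = -H, the negative Shannon entropy
    return float(np.sum(p * np.log(p)))

def grad_neg_entropy(p):
    return np.log(p) + 1.0

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(5))
q = rng.dirichlet(np.ones(5))
print(bregman(neg_entropy, grad_neg_entropy, p, q))   # equals the KL divergence below
print(kl(p, q))
```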

  19. From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2014-12-01

    The entropy-maximization paradigm of statistical physics is well known to generate the omnipresent Gauss law. In this paper we establish an analogous socioeconomic model which maximizes social equality, rather than physical disorder, in the context of the distributions of income and wealth in human societies. We show that, on a logarithmic scale, the Laplace law is the socioeconomic equality-maximizing counterpart of the physical entropy-maximizing Gauss law, and that this law manifests an optimized balance between two opposing forces: (i) the rich and powerful, striving to amass ever more wealth, and thus to increase social inequality; and (ii) the masses, struggling to form more egalitarian societies, and thus to increase social equality. Our results lead from log-Gauss statistics to log-Laplace statistics, yield Paretian power-law tails of income and wealth distributions, and show how the emergence of a middle class depends on the underlying levels of socioeconomic inequality and variability. Also, in the context of asset prices with Laplace-distributed returns, our results imply that financial markets generate an optimized balance between risk and predictability.

  20. Unresolved Problems by Shock Capturing: Taming the Overheating Problem

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2012-01-01

    The overheating problem, first observed by von Neumann [1] and later studied extensively by Noh [2] using both Eulerian and Lagrangian formulations, remains one of the unsolved problems in shock capturing. It is historically well known to occur when a flow is under compression, such as when a shock wave hits and reflects from a wall or when two streams collide with each other. The overheating phenomenon is also found numerically in a smooth flow undergoing rarefaction created by two streams receding from each other. This is contrary to one's intuition, which expects a decrease in internal energy. The excessive temperature increase is not reduced by refining the mesh size or increasing the order of accuracy. This study finds that the overheating in the receding flow correlates with the entropy generation. By requiring entropy preservation, the overheating is eliminated and the solution is grid convergent. The shock-capturing scheme, as practiced today, gives rise to the entropy generation, which in turn causes the overheating. This assertion stands up to the convergence test.

  1. Shallow water equations: viscous solutions and inviscid limit

    NASA Astrophysics Data System (ADS)

    Chen, Gui-Qiang; Perepelitsa, Mikhail

    2012-12-01

    We establish the inviscid limit of the viscous shallow water equations to the Saint-Venant system. For the viscous equations, the viscosity terms are more degenerate when the shallow water is close to the bottom than in the classical Navier-Stokes equations for barotropic gases; thus, the analysis in our earlier work for the classical Navier-Stokes equations does not apply directly, and new estimates are required to deal with the additional degeneracy. We first introduce a notion of entropy solutions to the viscous shallow water equations and develop an approach to establish the global existence of such solutions and their uniform energy-type estimates with respect to the viscosity coefficient. These uniform estimates yield the existence of measure-valued solutions to the Saint-Venant system generated by the viscous solutions. Based on the uniform energy-type estimates and the features of the Saint-Venant system, we further establish that the entropy dissipation measures of the viscous solutions for weak entropy-entropy flux pairs, generated by compactly supported C^2 test functions, are confined in a compact set in H^{-1}, which yields that the measure-valued solutions are confined by the Tartar-Murat commutator relation. Then, the reduction theorem established in Chen and Perepelitsa [5] for measure-valued solutions with unbounded support leads to the convergence of the viscous solutions to a finite-energy entropy solution of the Saint-Venant system with finite-energy initial data, measured relative to the different end-states of the bottom topography of the shallow water at infinity. The analysis also applies to the inviscid limit problem for the Saint-Venant system in the presence of friction.

  2. Experimental feasibility of investigating acoustic waves in Couette flow with entropy and pressure gradients

    NASA Technical Reports Server (NTRS)

    Parrott, Tony L.; Zorumski, William E.; Rawls, John W., Jr.

    1990-01-01

    The feasibility of an experimental program for studying the behavior of acoustic wave propagation in the presence of strong gradients of pressure, temperature, and flow is discussed. Theory suggests that gradient effects can be experimentally observed as resonant frequency shifts and mode shape changes in a waveguide. A convenient geometry for such experiments is the annular region between two co-rotating cylinders. Radial temperature gradients in a spinning annulus can be generated by differentially heating the two cylinders via electromagnetic induction. Radial pressure gradients can be controlled by varying the cylinder spin rates. Present technology appears adequate to construct an apparatus that allows independent control of temperature and pressure gradients. A complicating feature of a more advanced experiment, involving flow gradients, is the requirement for independently controlled cylinder spin rates. Also, the boundary condition at the annulus terminations must be such that flow gradients are minimally disturbed. The design and construction of an advanced apparatus to include flow gradients will require additional technology development.

  3. Entropy, non-linearity and hierarchy in ecosystems

    NASA Astrophysics Data System (ADS)

    Addiscott, T.

    2009-04-01

    Soil-plant systems are thermodynamically open systems because they exchange both energy and matter with their surroundings. Thus they are properly described by the second and third of the three stages of thermodynamics defined by Prigogine and Stengers (1984). The second stage describes a system in which the flow is linearly related to the force. Such a system tends towards a steady state in which entropy production is minimized, but this depends on the capacity of the system for self-organization. In a third-stage system, flow is non-linearly related to force, and the system can move far from equilibrium. This system maximizes entropy production but in so doing facilitates self-organization. The second-stage system was suggested earlier to provide a useful analogue of the behaviour of natural and agricultural ecosystems subjected to perturbations, but it needs the capacity for self-organization. Considering an ecosystem as a hierarchy suggests this capacity is provided by the soil population, which releases from dead plant matter the nutrients such as nitrate, phosphate and cations needed for the growth of new plants and the renewal of the whole ecosystem. This release of small molecules from macromolecules increases entropy, and the soil population maximizes entropy production by releasing nutrients and carbon dioxide as vigorously as conditions allow. In so doing it behaves as a third-stage thermodynamic system. Other authors (Schneider and Kay, 1994, 1995) consider that it is the plants in an ecosystem that maximize entropy production, mainly through transpiration, but studies on transpiration efficiency suggest that this is questionable. Prigogine, I. & Stengers, I. 1984. Order out of chaos. Bantam Books, Toronto. Schneider, E.D. & Kay, J.J. 1994. Life as a manifestation of the Second Law of Thermodynamics. Mathematical & Computer Modelling, 19, 25-48. Schneider, E.D. & Kay, J.J. 1995. Order from disorder: The Thermodynamics of Complexity in Biology. In: What is Life: the Next Fifty Years (eds. M.P. Murphy & L.A.J. O'Neill), pp. 161-172, Cambridge University Press, Cambridge.

  4. Exact symmetries in the velocity fluctuations of a hot Brownian swimmer

    NASA Astrophysics Data System (ADS)

    Falasco, Gianmaria; Pfaller, Richard; Bregulla, Andreas P.; Cichos, Frank; Kroy, Klaus

    2016-09-01

    Symmetries constrain dynamics. We test this fundamental physical principle, experimentally and by molecular dynamics simulations, for a hot Janus swimmer operating far from thermal equilibrium. Our results establish scalar and vectorial steady-state fluctuation theorems and a thermodynamic uncertainty relation that link the fluctuating particle current to its entropy production at an effective temperature. A Markovian minimal model elucidates the underlying nonequilibrium physics.

  5. Randomized shortest-path problems: two related models.

    PubMed

    Saerens, Marco; Achbany, Youssef; Fouss, François; Yen, Luh

    2009-08-01

    This letter addresses the problem of designing the transition probabilities of a finite Markov chain (the policy) in order to minimize the expected cost for reaching a destination node from a source node while maintaining a fixed level of entropy spread throughout the network (the exploration). It is motivated by the following scenario. Suppose you have to route agents through a network in some optimal way, for instance, by minimizing the total travel cost; nothing is unusual up to this point, and you could use a standard shortest-path algorithm. Suppose, however, that you want to avoid purely deterministic routing policies in order, for instance, to allow some continual exploration of the network, avoid congestion, or avoid complete predictability of your routing strategy. In other words, you want to introduce some randomness or unpredictability in the routing policy (i.e., the routing policy is randomized). This problem, which will be called the randomized shortest-path problem (RSP), is investigated in this work. The global level of randomness of the routing policy is quantified by the expected Shannon entropy spread throughout the network and is provided a priori by the designer. Then, necessary conditions to compute the optimal randomized policy, minimizing the expected routing cost, are derived. Iterating these necessary conditions, reminiscent of Bellman's value-iteration equations, allows computing an optimal policy, that is, a set of transition probabilities in each node. Interestingly and surprisingly enough, this first model, while formulated in a totally different framework, is equivalent to Akamatsu's model (1996), appearing in transportation science, for a special choice of the entropy constraint. We therefore revisit Akamatsu's model by recasting it into a sum-over-paths statistical physics formalism allowing easy derivation of all the quantities of interest in an elegant, unified way. For instance, it is shown that the unique optimal policy can be obtained by solving a simple linear system of equations. This second model is therefore more convincing because of its computational efficiency and soundness. Finally, simulation results obtained on simple, illustrative examples show that the models behave as expected.
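    The sum-over-paths construction alluded to at the end of the abstract can be sketched as follows: bias the reference random walk by edge costs, solve one linear system for a backward variable, and read off the randomized policy. The inverse-temperature-like parameter θ, the toy graph and the exact formulas below follow a common formulation of randomized shortest paths and may differ in detail from the letter.

```python
import numpy as np

def rsp_policy(P_ref, C, target, theta):
    """Randomized shortest-path style policy: W = P_ref * exp(-theta*C),
    backward variable z solves (I - W) z = e_target (target made absorbing),
    and the biased transition probabilities are p_ij = w_ij * z_j / z_i."""
    n = P_ref.shape[0]
    W = P_ref * np.exp(-theta * C)
    W[target, :] = 0.0                         # absorb at the destination
    z = np.linalg.solve(np.eye(n) - W, np.eye(n)[:, target])
    policy = W * z[None, :] / z[:, None]
    policy[target, target] = 1.0
    return policy

# tiny 4-node graph: 0 -> {1,2} -> 3, edge costs favour the path through node 1
P_ref = np.array([[0, .5, .5, 0], [0, 0, 0, 1], [0, 0, 0, 1], [0, 0, 0, 1]], float)
C     = np.array([[0,  1,  3, 0], [0, 0, 0, 1], [0, 0, 0, 3], [0, 0, 0, 0]], float)
for theta in (0.01, 1.0, 10.0):                # low theta ~ random walk, high theta ~ shortest path
    print(theta, rsp_policy(P_ref, C, target=3, theta=theta)[0].round(3))
```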

  6. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state in which the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
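    A toy version of the stated Boltzmann picture is sketched below: each elementary mode receives a usage probability that grows exponentially with its entropy production rate, so the network-level entropy production approaches the maximum as the distribution sharpens. The parameterization and the synthetic rates are assumptions for illustration only.

```python
import numpy as np

def boltzmann_usage(sigma, lam):
    """Boltzmann-like usage probabilities over elementary modes,
    p_i proportional to exp(sigma_i / lam), with sigma_i the entropy production
    rate of mode i. The parameterization is illustrative, not the paper's fitted law."""
    w = np.exp((sigma - sigma.max()) / lam)     # subtract the max for numerical stability
    return w / w.sum()

rng = np.random.default_rng(3)
sigma = rng.uniform(0.1, 1.0, size=20)          # toy entropy production rates per mode
for lam in (1.0, 0.3, 0.05):
    p = boltzmann_usage(sigma, lam)
    print(f"lam={lam:4.2f}  network entropy production = {np.dot(p, sigma):.3f} "
          f"(maximum possible {sigma.max():.3f})")
```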

  7. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible way to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariable dependence structures; however, the copula family must be chosen before application, and this choice is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective step by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with the popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending to three dimensions directly. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.

  8. Entangled de Sitter from stringy axionic Bell pair I: an analysis using Bunch-Davies vacuum

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayantan; Panda, Sudhakar

    2018-01-01

    In this work, we study quantum entanglement and compute the entanglement entropy in de Sitter space for a bipartite quantum field theory driven by an axion originating from Type IIB string compactification on a Calabi-Yau threefold (CY^3) in the presence of an NS5 brane. For this computation, we consider a spherical surface S^2, which divides the spatial slice of de Sitter space (dS_4) into exterior and interior sub-regions. We also take the initial choice of vacuum to be the Bunch-Davies state. First we derive the solution of the wave function of the axion in a hyperbolic open chart by constructing a suitable basis for the Bunch-Davies vacuum state using a Bogoliubov transformation. We then derive the expression for the density matrix by tracing over the exterior region. This allows us to compute the entanglement entropy and Rényi entropy in 3+1 dimensions. Furthermore, we quantify the UV-finite contribution of the entanglement entropy, which contains the physics of the long-range quantum correlations of our expanding universe. Finally, our analysis complements the necessary condition for generating non-vanishing entanglement entropy in primordial cosmology due to the axion.

  9. Seebeck Effects in N-Type and P-Type Polymers Driven Simultaneously by Surface Polarization and Entropy Differences Based on Conductor/Polymer/Conductor Thin-Film Devices

    DOE PAGES

    Hu, Dehua; Liu, Qing; Tisdale, Jeremy; ...

    2015-04-15

    This paper reports Seebeck effects driven by both surface polarization difference and entropy difference, using intramolecular charge-transfer states in n-type and p-type conjugated polymers, namely IIDT and IIDDT, based on vertical conductor/polymer/conductor thin-film devices. Large Seebeck coefficients of -898 V/K and 1300 V/K are observed for n-type IIDT and p-type IIDDT, respectively, when the charge-transfer states are generated by white light illumination of 100 mW/cm2. Simultaneously, the electrical conductivities increase from almost insulating states in the dark to conducting states under photoexcitation in both the n-type IIDT and p-type IIDDT devices. We find that the intramolecular charge-transfer states can largely enhance the Seebeck effects in the n-type IIDT and p-type IIDDT devices driven by both surface polarization difference and entropy difference. Furthermore, the Seebeck effects can be shifted between polarization and entropy regimes when the electrical conductivities are changed. This reveals a new concept for developing Seebeck effects by controlling polarization and entropy regimes based on charge-transfer states in vertical conductor/polymer/conductor thin-film devices.

  10. Ladar imaging detection of salient map based on PWVD and Rényi entropy

    NASA Astrophysics Data System (ADS)

    Xu, Yuannan; Zhao, Yuan; Deng, Rong; Dong, Yanbing

    2013-10-01

    Spatial-frequency information of a given image can be extracted by associating the grey-level spatial data with one of the well-known spatial/spatial-frequency distributions. The Wigner-Ville distribution (WVD) has the useful characteristic that images can be represented in joint spatial/spatial-frequency domains. For ladar intensity and range images, the statistical properties of the Rényi entropy computed through the pseudo Wigner-Ville distribution (PWVD) with a one- or two-dimensional window are studied. We also analyze how the statistical properties of the Rényi entropy of the ladar intensity and range images change when man-made objects appear. On this foundation, a novel method for generating a saliency map based on the PWVD and Rényi entropy is proposed. Target detection is then completed by segmenting the saliency map with a simple and convenient threshold method. For ladar intensity and range images, experimental results show that the proposed method can effectively detect military vehicles against a complex ground background with a low false alarm rate.
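    The saliency step can be approximated by computing the Rényi entropy of local image blocks; for brevity the sketch below uses each block's grey-level histogram rather than PWVD slices, so it is a simplification of the paper's pipeline, with an illustrative entropy order and block size.

```python
import numpy as np

def renyi_entropy(p, alpha=3.0):
    """Rényi entropy of order alpha: (1/(1-alpha)) * log(sum p^alpha)."""
    p = p[p > 0]
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def entropy_saliency(image, block=16, alpha=3.0, n_bins=32):
    """Block-wise Rényi entropy of grey-level histograms as a crude saliency map
    (the paper computes it from PWVD slices instead; this is a simplification)."""
    h, w = image.shape
    sal = np.zeros((h // block, w // block))
    for i in range(sal.shape[0]):
        for j in range(sal.shape[1]):
            patch = image[i * block:(i + 1) * block, j * block:(j + 1) * block]
            hist, _ = np.histogram(patch, bins=n_bins, range=(0, 255))
            sal[i, j] = renyi_entropy(hist / hist.sum(), alpha)
    return sal

# toy "range image": smooth background with a textured man-made-like patch
rng = np.random.default_rng(0)
img = np.clip(rng.normal(120, 3, (128, 128)), 0, 255)
img[48:80, 48:80] = rng.integers(0, 255, (32, 32))
print(entropy_saliency(img).round(2))
```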

  11. EEG based topography analysis in string recognition task

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Visual perception and recognition is a complex process, during which different parts of the brain are involved depending on the specific modality of the visual target, e.g. a face, character, or word. In this study, brain activities in a string recognition task, compared with an idle control state, are analyzed through topographies based on multiple measures, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in the string recognition task are significantly higher than those in the idle state, especially at locations P4, O2, T6 and C4. This implies that these regions are highly involved in the string recognition task. Since symbolic sample entropy measures complexity, from the perspective of new information generation, and normalized rhythm power reveals the power distribution in the frequency domain, complementary information about the underlying dynamics can be provided through the two types of indices.

  12. A synergistic approach to protein crystallization: Combination of a fixed-arm carrier with surface entropy reduction

    PubMed Central

    Moon, Andrea F; Mueller, Geoffrey A; Zhong, Xuejun; Pedersen, Lars C

    2010-01-01

    Protein crystallographers are often confronted with recalcitrant proteins not readily crystallizable, or which crystallize in problematic forms. A variety of techniques have been used to surmount such obstacles: crystallization using carrier proteins or antibody complexes, chemical modification, surface entropy reduction, proteolytic digestion, and additive screening. Here we present a synergistic approach for successful crystallization of proteins that do not form diffraction quality crystals using conventional methods. This approach combines favorable aspects of carrier-driven crystallization with surface entropy reduction. We have generated a series of maltose binding protein (MBP) fusion constructs containing different surface mutations designed to reduce surface entropy and encourage crystal lattice formation. The MBP advantageously increases protein expression and solubility, and provides a streamlined purification protocol. Using this technique, we have successfully solved the structures of three unrelated proteins that were previously unattainable. This crystallization technique represents a valuable rescue strategy for protein structure solution when conventional methods fail. PMID:20196072

  13. Rényi Entropies from Random Quenches in Atomic Hubbard and Spin Models.

    PubMed

    Elben, A; Vermersch, B; Dalmonte, M; Cirac, J I; Zoller, P

    2018-02-02

    We present a scheme for measuring Rényi entropies in generic atomic Hubbard and spin models using single copies of a quantum state and for partitions in arbitrary spatial dimensions. Our approach is based on the generation of random unitaries from random quenches, implemented using engineered time-dependent disorder potentials, and standard projective measurements, as realized by quantum gas microscopes. By analyzing the properties of the generated unitaries and the role of statistical errors, with respect to the size of the partition, we show that the protocol can be realized in existing quantum simulators and used to measure, for instance, area law scaling of entanglement in two-dimensional spin models or the entanglement growth in many-body localized systems.

  14. Rényi Entropies from Random Quenches in Atomic Hubbard and Spin Models

    NASA Astrophysics Data System (ADS)

    Elben, A.; Vermersch, B.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a scheme for measuring Rényi entropies in generic atomic Hubbard and spin models using single copies of a quantum state and for partitions in arbitrary spatial dimensions. Our approach is based on the generation of random unitaries from random quenches, implemented using engineered time-dependent disorder potentials, and standard projective measurements, as realized by quantum gas microscopes. By analyzing the properties of the generated unitaries and the role of statistical errors, with respect to the size of the partition, we show that the protocol can be realized in existing quantum simulators and used to measure, for instance, area law scaling of entanglement in two-dimensional spin models or the entanglement growth in many-body localized systems.

  15. The Root Cause of the Overheating Problem

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2017-01-01

    Previously we identified the receding flow, in which two fluid streams recede from each other, as an open numerical problem, because all well-known numerical fluxes give an anomalous temperature rise, hence called the overheating problem. This phenomenon, although presented in several textbooks and many previous publications, has scarcely been satisfactorily addressed, and the root cause of the overheating problem is not well understood. We found that this temperature rise was solely connected to an entropy rise and proposed using the method of characteristics to eradicate the problem. However, the root cause of the entropy production remained unclear. In the present study, we identify the cause of this problem: the entropy rise is rooted in the pressure flux in a finite volume formulation and is implanted at the first time step. It is found to be theoretically inevitable for all existing numerical flux schemes used in the finite volume setting, as confirmed by numerical tests. This difficulty cannot be eliminated by manipulating the time step, grid size, spatial accuracy, etc., although the rate of overheating depends on the flux scheme used. Finally, we incorporate the entropy transport equation, in place of the energy equation, to ensure the preservation of entropy, thus correcting this temperature anomaly. Its applicability is demonstrated for some relevant 1D and 2D problems. Thus, the present study validates that the entropy generated ab initio is the genesis of the overheating problem.

  16. Digital focusing of OCT images based on scalar diffraction theory and information entropy.

    PubMed

    Liu, Guozhong; Zhi, Zhongwei; Wang, Ruikang K

    2012-11-01

    This paper describes a digital method that is capable of automatically focusing optical coherence tomography (OCT) en face images without prior knowledge of the point spread function of the imaging system. The method utilizes a scalar diffraction model to simulate wave propagation from out-of-focus scatterers to the focal plane, from which the propagation distance between the out-of-focus plane and the focal plane is determined automatically via an image-definition-evaluation criterion based on information entropy theory. Using the proposed approach, we demonstrate that a lateral resolution close to that at the focal plane can be recovered from imaging planes outside the depth-of-field region with minimal loss of resolution. Fresh onion tissues and mouse fat tissues are used in the experiments to demonstrate the performance of the proposed method.
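    A minimal sketch of the refocusing idea, assuming angular-spectrum scalar diffraction for the propagation and an intensity-entropy metric for image definition; many digital-refocusing schemes take the sharpest plane to be the one that minimizes this entropy, though the exact criterion and propagation model in the paper may differ.

```python
import numpy as np

def angular_spectrum(u0, wavelength, dx, z):
    """Scalar-diffraction propagation of a complex field u0 over distance z
    using the angular-spectrum method (a standard construction; parameters illustrative)."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1.0 / wavelength**2 - FX**2 - FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * np.exp(1j * kz * z))

def intensity_entropy(u):
    """Information entropy of the normalized intensity; sharper fields concentrate
    energy in fewer pixels and therefore give lower values."""
    I = np.abs(u) ** 2
    p = I / I.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# toy defocused field: a small bright square propagated away from focus
n, dx, wavelength = 256, 2e-6, 0.8e-6
u_focus = np.zeros((n, n), complex)
u_focus[120:136, 120:136] = 1.0
u_blurred = angular_spectrum(u_focus, wavelength, dx, z=200e-6)

# search candidate back-propagation distances for the entropy optimum
zs = np.linspace(-400e-6, 0.0, 41)
scores = [intensity_entropy(angular_spectrum(u_blurred, wavelength, dx, z)) for z in zs]
print("estimated refocus distance:", zs[int(np.argmin(scores))])
```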

  17. Maximum nonlocality and minimum uncertainty using magic states

    NASA Astrophysics Data System (ADS)

    Howard, Mark

    2015-04-01

    We prove that magic states from the Clifford hierarchy give optimal solutions for tasks involving nonlocality and entropic uncertainty with respect to Pauli measurements. For both the nonlocality and uncertainty tasks, stabilizer states are the worst possible pure states, so our solutions have an operational interpretation as being highly nonstabilizer. The optimal strategy for a qudit version of the Clauser-Horne-Shimony-Holt game in prime dimensions is achieved by measuring maximally entangled states that are isomorphic to single-qudit magic states. These magic states have an appealingly simple form, and our proof shows that they are "balanced" with respect to all but one of the mutually unbiased stabilizer bases. Of all equatorial qudit states, magic states minimize the average entropic uncertainties for collision entropy and also, for small prime dimensions, min-entropy, a fact that may have implications for cryptography.

  18. On extremal surfaces and de Sitter entropy

    NASA Astrophysics Data System (ADS)

    Narayan, K.

    2018-04-01

    We study extremal surfaces in the static patch coordinatization of de Sitter space, focusing on the future and past universes. We find connected timelike codim-2 surfaces on a boundary Euclidean time slice stretching from the future boundary I+ to the past boundary I-. In a limit, these surfaces pass through the bifurcation region and have minimal area with a divergent piece alone, whose coefficient is the de Sitter entropy in 4 dimensions. These are reminiscent of rotated versions of certain surfaces in the AdS black hole. We close with some speculations on a possible dS/CFT interpretation of 4-dim de Sitter space as dual to two copies of ghost-CFTs in an entangled state. For a simple toy model of two copies of ghost-spin chains, we argue that similar entangled states always have positive norm and positive entanglement.

  19. Endocannabinoids: Multi-scaled, Global Homeostatic Regulators of Cells and Society

    NASA Astrophysics Data System (ADS)

    Melamede, Robert

    Living systems are far-from-equilibrium open systems that exhibit many scales of emergent behavior. They may be abstractly viewed as a complex weave of dissipative structures that maintain organization by passing electrons from reduced hydrocarbons to oxygen. Free radicals are unavoidable byproducts of biological electron flow. Due to their highly reactive chemical properties, free radicals modify all classes of biological molecules (carbohydrates, lipids, nucleic acids, and proteins). As a result, free radicals are destructive. The generally disruptive nature of free radicals makes them the "friction of life." As such, they are believed to be the etiological agents behind age-related illnesses such as cardiovascular, immunological, and neurological diseases, cancer, and ageing itself. Free radicals also play a critical constructive role in living systems. From a thermodynamic perspective, life can only exist if a living system takes in sufficient negative entropy from its environment to overcome the obligatory increase in entropy that would result if the system could not appropriately exchange mass, energy and information with its environment. Free radicals are generated in response to perturbations in the relationship between a living system and its environment. However, evolution has selected for biological response systems to free radicals so that the cellular biochemistry can adapt to environmental perturbations by modifying cellular gene expression and biochemistry. Endocannabinoids are marijuana-like compounds that have their origins hundreds of millions of years in the evolutionary past. They serve as fundamental modulators of energy homeostasis in all vertebrates. Their widespread biological activities may often be attributed to their ability to minimize the negative consequences of free radicals.

  20. Braid Entropy of Two-Dimensional Turbulence

    NASA Astrophysics Data System (ADS)

    Francois, Nicolas; Xia, Hua; Punzmann, Horst; Faber, Benjamin; Shats, Michael

    2015-12-01

    The evolving shape of material fluid lines in a flow underlies the quantitative prediction of the dissipation and material transport in many industrial and natural processes. However, collecting quantitative data on these dynamics remains an experimental challenge, in particular in turbulent flows. Indeed, the deformation of a fluid line, induced by its successive stretching and folding, can be difficult to determine because such a description ultimately relies on often inaccessible multi-particle information. Here we report laboratory measurements in two-dimensional turbulence that offer an alternative topological viewpoint on this issue. This approach characterizes the dynamics of a braid of Lagrangian trajectories through a global measure of their entanglement. The topological length of material fluid lines can be derived from these braids. This length is found to grow exponentially with time, giving access to the braid topological entropy. The entropy increases as the square root of the turbulent kinetic energy and is directly related to the single-particle dispersion coefficient. At long times, the probability distribution of the braid entropy is positively skewed and shows strong exponential tails. Our results suggest that the braid entropy may serve as a measure of the irreversibility of turbulence based on minimal principles and sparse Lagrangian data.

  1. Quantifying the entropic cost of cellular growth control

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele; Capuani, Fabrizio; De Martino, Andrea

    2017-07-01

    Viewing the ways a living cell can organize its metabolism as the phase space of a physical system, regulation can be seen as the ability to reduce the entropy of that space by selecting specific cellular configurations that are, in some sense, optimal. Here we quantify the amount of regulation required to control a cell's growth rate by a maximum-entropy approach to the space of underlying metabolic phenotypes, where a configuration corresponds to a metabolic flux pattern as described by genome-scale models. We link the mean growth rate achieved by a population of cells to the minimal amount of metabolic regulation needed to achieve it through a phase diagram that highlights how growth suppression can be as costly (in regulatory terms) as growth enhancement. Moreover, we provide an interpretation of the inverse temperature β controlling maximum-entropy distributions based on the underlying growth dynamics. Specifically, we show that the asymptotic value of β for a cell population can be expected to depend on (i) the carrying capacity of the environment, (ii) the initial size of the colony, and (iii) the probability distribution from which the inoculum was sampled. Results obtained for E. coli and human cells are found to be remarkably consistent with empirical evidence.
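
    To make the construction above concrete, the sketch below weights a set of candidate metabolic flux patterns by exp(beta * growth rate), which is the maximum-entropy form discussed in the abstract. The flux samples, the linear growth-rate function, and the beta values are placeholders for illustration, not the genome-scale models used in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical stand-in for feasible metabolic flux patterns: each row is one
        # phenotype, and its "growth rate" is a toy linear function of the fluxes.
        fluxes = rng.uniform(0.0, 1.0, size=(10_000, 5))
        growth_rate = fluxes @ np.array([0.2, 0.1, 0.4, 0.05, 0.25])

        def maxent_average_growth(beta):
            """Mean growth rate under the maximum-entropy weighting p(v) ~ exp(beta * lambda(v))."""
            # Subtract the max for numerical stability before exponentiating.
            w = np.exp(beta * (growth_rate - growth_rate.max()))
            p = w / w.sum()
            return float(np.sum(p * growth_rate))

        # beta = 0 recovers the unregulated (uniform) average; larger beta selects
        # faster-growing phenotypes at a higher regulatory (entropic) cost.
        for beta in (0.0, 5.0, 50.0):
            print(beta, maxent_average_growth(beta))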

  2. 16QAM Blind Equalization via Maximum Entropy Density Approximation Technique and Nonlinear Lagrange Multipliers

    PubMed Central

    Mauda, R.; Pinchas, M.

    2014-01-01

    Recently, a new blind equalization method was proposed for the 16QAM constellation input, inspired by the maximum entropy density approximation technique, with improved equalization performance compared to the maximum entropy approach, Godard's algorithm, and others. In addition, an approximated expression for the minimum mean square error (MSE) was obtained. The idea was to find those Lagrange multipliers that bring the approximated MSE to a minimum. Since the derivation of the obtained MSE with respect to the Lagrange multipliers leads to a nonlinear equation for the Lagrange multipliers, the part of the MSE expression that caused the nonlinearity in the equation for the Lagrange multipliers was ignored. Thus, the obtained Lagrange multipliers were not those that bring the approximated MSE to a minimum. In this paper, we derive a new set of Lagrange multipliers based on the nonlinear expression for the Lagrange multipliers obtained from minimizing the approximated MSE with respect to the Lagrange multipliers. Simulation results indicate that for the high signal-to-noise ratio (SNR) case, a faster convergence rate is obtained for a channel causing high initial intersymbol interference (ISI), while the same equalization performance is obtained for an easy channel (low initial ISI). PMID:24723813
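
    The improvement described above amounts to solving the nonlinear equations for the Lagrange multipliers numerically rather than discarding the nonlinear part. The toy sketch below illustrates that general pattern for a generic maximum-entropy density with moment constraints; the target moments, support, and polynomial features are illustrative assumptions, not the 16QAM/MSE expressions of the paper.

        import numpy as np
        from scipy.integrate import quad
        from scipy.optimize import fsolve

        # Illustrative targets: unit mass, zero mean, unit second moment on a finite support.
        targets = np.array([1.0, 0.0, 1.0])
        support = (-4.0, 4.0)

        def moments(lam):
            """Moments of the maximum-entropy density p(x) = exp(lam0 + lam1*x + lam2*x^2)."""
            p = lambda x: np.exp(lam[0] + lam[1] * x + lam[2] * x**2)
            return np.array([quad(lambda x: x**k * p(x), *support)[0] for k in range(3)])

        def residual(lam):
            return moments(lam) - targets

        # Solving the full nonlinear system for the multipliers, instead of dropping the
        # nonlinear terms, is the essence of the improvement described above.
        lam = fsolve(residual, x0=np.array([-1.0, 0.0, -0.5]))
        print("Lagrange multipliers:", lam)
        print("moment mismatch:", residual(lam))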

  3. Multi-Material Closure Model for High-Order Finite Element Lagrangian Hydrodynamics

    DOE PAGES

    Dobrev, V. A.; Kolev, T. V.; Rieben, R. N.; ...

    2016-04-27

    We present a new closure model for single fluid, multi-material Lagrangian hydrodynamics and its application to high-order finite element discretizations of these equations [1]. The model is general with respect to the number of materials, dimension, and space and time discretizations. Knowledge about exact material interfaces is not required. Material indicator functions are evolved by a closure computation at each quadrature point of mixed cells, which can be viewed as a high-order variational generalization of the method of Tipton [2]. This computation is defined by the notion of partial non-instantaneous pressure equilibration, while the full pressure equilibration is achieved by both the closure model and the hydrodynamic motion. Exchange of internal energy between materials is derived through entropy considerations, that is, every material produces positive entropy, and the total entropy production is maximized in compression and minimized in expansion. Results are presented for standard one-dimensional two-material problems, followed by two-dimensional and three-dimensional multi-material high-velocity impact arbitrary Lagrangian–Eulerian calculations. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  4. Braid Entropy of Two-Dimensional Turbulence

    PubMed Central

    Francois, Nicolas; Xia, Hua; Punzmann, Horst; Faber, Benjamin; Shats, Michael

    2015-01-01

    The evolving shape of material fluid lines in a flow underlies the quantitative prediction of the dissipation and material transport in many industrial and natural processes. However, collecting quantitative data on these dynamics remains an experimental challenge, in particular in turbulent flows. Indeed, the deformation of a fluid line, induced by its successive stretching and folding, can be difficult to determine because such a description ultimately relies on often inaccessible multi-particle information. Here we report laboratory measurements in two-dimensional turbulence that offer an alternative topological viewpoint on this issue. This approach characterizes the dynamics of a braid of Lagrangian trajectories through a global measure of their entanglement. The topological length of material fluid lines can be derived from these braids. This length is found to grow exponentially with time, giving access to the braid topological entropy. The entropy increases as the square root of the turbulent kinetic energy and is directly related to the single-particle dispersion coefficient. At long times, the probability distribution of the braid entropy is positively skewed and shows strong exponential tails. Our results suggest that the braid entropy may serve as a measure of the irreversibility of turbulence based on minimal principles and sparse Lagrangian data. PMID:26689261

  5. Multi-Material Closure Model for High-Order Finite Element Lagrangian Hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobrev, V. A.; Kolev, T. V.; Rieben, R. N.

    We present a new closure model for single fluid, multi-material Lagrangian hydrodynamics and its application to high-order finite element discretizations of these equations [1]. The model is general with respect to the number of materials, dimension, and space and time discretizations. Knowledge about exact material interfaces is not required. Material indicator functions are evolved by a closure computation at each quadrature point of mixed cells, which can be viewed as a high-order variational generalization of the method of Tipton [2]. This computation is defined by the notion of partial non-instantaneous pressure equilibration, while the full pressure equilibration is achieved by both the closure model and the hydrodynamic motion. Exchange of internal energy between materials is derived through entropy considerations, that is, every material produces positive entropy, and the total entropy production is maximized in compression and minimized in expansion. Results are presented for standard one-dimensional two-material problems, followed by two-dimensional and three-dimensional multi-material high-velocity impact arbitrary Lagrangian–Eulerian calculations. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  6. Stimulus-dependent Maximum Entropy Models of Neural Population Codes

    PubMed Central

    Segev, Ronen; Schneidman, Elad

    2013-01-01

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population. PMID:23516339
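
    A minimal sketch of a stimulus-dependent pairwise maximum entropy distribution is given below; the population size, the couplings J, and the linear stimulus filter are hypothetical placeholders (the study fits these quantities to recorded retinal data), and only the functional form P(sigma|s) proportional to exp(h(s)·sigma + sigma·J·sigma) is taken from the description above.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(1)
        n = 5                                   # toy population size (the paper uses ~100 cells)

        J = 0.1 * rng.standard_normal((n, n))   # hypothetical pairwise couplings
        J = np.triu(J, k=1)                     # keep each pair once

        def sdme_distribution(h_s):
            """P(sigma | s) ~ exp(h(s)·sigma + sigma·J·sigma) over binary codewords in {0,1}^n."""
            words = np.array(list(product([0, 1], repeat=n)))
            energy = words @ h_s + np.einsum('ki,ij,kj->k', words, J, words)
            w = np.exp(energy - energy.max())
            return words, w / w.sum()

        # The stimulus enters only through the single-cell fields h(s), as in a
        # linear-nonlinear front end; here h(s) is a placeholder linear projection.
        stimulus = rng.standard_normal(3)
        W = rng.standard_normal((n, 3))
        words, p = sdme_distribution(W @ stimulus)
        print("most likely codeword:", words[np.argmax(p)], "with P =", p.max())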

  7. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    NASA Technical Reports Server (NTRS)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  8. Inference of gene regulatory networks from time series by Tsallis entropy

    PubMed Central

    2011-01-01

    Background The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems of Systems Biology nowadays. Many techniques and models have been proposed for this task. However, it is not generally possible to recover the original topology with great accuracy, mainly due to the short time series data in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is here proposed. Results In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks and its gene transference function is obtained from random drawing on the set of possible Boolean functions, thus creating its dynamics. On the other hand, the DREAM time series data present variations in network size and their topologies are based on real networks. The dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions A remarkable improvement in accuracy was observed in the experimental results, as the non-Shannon entropy reduced the number of false connections in the inferred topology. The best value of the Tsallis free parameter was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/. PMID:21545720
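
    For reference, the Tsallis (non-Shannon) entropy that underlies the criterion function has the simple form below; the sketch evaluates it for the subextensive range of q reported in the conclusions. The probability vector is an arbitrary example, and the paper's criterion is a conditional-entropy variant of this quantity.

        import numpy as np

        def tsallis_entropy(p, q):
            """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); tends to Shannon entropy as q -> 1."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            if np.isclose(q, 1.0):
                return -np.sum(p * np.log(p))      # Shannon limit (in nats)
            return (1.0 - np.sum(p ** q)) / (q - 1.0)

        # Example distribution, evaluated at q = 1 (Shannon limit) and in the
        # subextensive range 2.5 <= q <= 3.5 reported above.
        p = [0.5, 0.25, 0.125, 0.125]
        for q in (1.0, 2.5, 3.5):
            print(q, tsallis_entropy(p, q))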

  9. A finite-temperature Hartree-Fock code for shell-model Hamiltonians

    NASA Astrophysics Data System (ADS)

    Bertsch, G. F.; Mehlhaff, J. M.

    2016-10-01

    The codes HFgradZ.py and HFgradT.py find axially symmetric minima of a Hartree-Fock energy functional for a Hamiltonian supplied in a shell model basis. The functional to be minimized is the Hartree-Fock energy for zero-temperature properties or the Hartree-Fock grand potential for finite-temperature properties (thermal energy, entropy). The minimization may be subjected to additional constraints besides axial symmetry and nucleon numbers. A single-particle operator can be used to constrain the minimization by adding it to the single-particle Hamiltonian with a Lagrange multiplier. One can also constrain its expectation value in the zero-temperature code. Also the orbital filling can be constrained in the zero-temperature code, fixing the number of nucleons having given Kπ quantum numbers. This is particularly useful to resolve near-degeneracies among distinct minima.

  10. A Statistical Model for Multilingual Entity Detection and Tracking

    DTIC Science & Technology

    2004-01-01

    Automatic Content Extraction (ACE) evaluation achieved top-tier results in all three evaluation languages. 1 Introduction: Detecting entities, whether named... of combining the detected mentions into groups of references to the same object. The work presented here is motivated by the ACE evaluation... Entropy (MaxEnt henceforth) (Berger et al., 1996) and Robust Risk Minimization (RRM henceforth). 1 For a description of the ACE program see http

  11. Finger Vein Segmentation from Infrared Images Based on a Modified Separable Mumford Shah Model and Local Entropy Thresholding

    PubMed Central

    Dermatas, Evangelos

    2015-01-01

    A novel method for finger vein pattern extraction from infrared images is presented. This method involves four steps: preprocessing, which performs local normalization of the image intensity; image enhancement; image segmentation; and finally postprocessing for image cleaning. In the image enhancement step, an image which is both smooth and similar to the original is sought. The enhanced image is obtained by minimizing the objective function of a modified separable Mumford-Shah model. Since this minimization procedure is computationally intensive for large images, a local application of the Mumford-Shah model in small window neighborhoods is proposed. The finger veins are located in concave nonsmooth regions, so, in order to distinguish them from the other tissue parts, all the differences between the smooth neighborhoods, obtained by the local application of the model, and the corresponding windows of the original image are added. After this step, the veins in the enhanced image are sufficiently emphasized. Thus, after image enhancement, an accurate segmentation can be obtained readily by a local entropy thresholding method. Finally, the resulting binary image may suffer from some misclassifications, so a postprocessing step is performed in order to extract a robust finger vein pattern. PMID:26120357
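
    As a rough illustration of the final segmentation step, the sketch below computes a sliding-window (local) entropy map and applies a simple cut to it. The window size, bin count, global cut on the entropy map, and synthetic test image are assumptions for illustration only, not the authors' exact local entropy thresholding procedure or data.

        import numpy as np
        from scipy.ndimage import generic_filter

        def local_entropy(window, bins=16):
            """Shannon entropy of the grey-level histogram inside one sliding window."""
            hist, _ = np.histogram(window, bins=bins, range=(0.0, 1.0))
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def entropy_threshold(image, size=9):
            """Binarize an enhanced image by thresholding its local-entropy map."""
            ent = generic_filter(image.astype(float), local_entropy, size=size)
            return ent > ent.mean()          # simple global cut on the entropy map

        # Toy usage on a synthetic "vein-like" image (values scaled to [0, 1]).
        rng = np.random.default_rng(0)
        img = rng.random((64, 64))
        img[28:34, :] *= 0.3                 # a dark ridge mimicking a vein
        mask = entropy_threshold(img / img.max())
        print(mask.shape, mask.dtype, mask.mean())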

  12. Application of band-target entropy minimization to infrared emission spectroscopy and the reconstruction of pure component emissivities from thin films and liquid samples.

    PubMed

    Cheng, Shuying; Rajarathnam, D; Meiling, Tan; Garland, Marc

    2006-05-01

    Thermal emission spectral data sets were collected for a thin solid film (parafilm) and a thin liquid film (isopropanol) over the interval 298-348 K. The measurements were performed using a conventional Fourier transform infrared (FT-IR) spectrometer with an external optical bench and an in-house-designed emission cell. Both DTGS and MCT detectors were used. The data sets were analyzed with band-target entropy minimization (BTEM), which is a pure component spectral reconstruction program. Pure component emissivities of the parafilm, isopropanol, and thermal background were all recovered without any a priori information. Furthermore, the emissivities were obtained with increased signal-to-noise ratios, and the signals due to absorbance of thermal radiation by gas-phase moisture and CO2 were significantly reduced. As expected, the MCT results displayed better signal-to-noise ratios than the DTGS results, but the latter were still rather impressive given the low temperatures used in this study. Comparison is made with spectral reconstruction using the orthogonal projection approach-alternating least squares (OPA-ALS) technique. This contribution introduces the primary equation for emission spectral reconstruction using BTEM and discusses some of the unusual characteristics of thermal emission and their impact on the analysis.

  13. Use of Raman microscopy and band-target entropy minimization analysis to identify dyes in a commercial stamp. Implications for authentication and counterfeit detection.

    PubMed

    Widjaja, Effendi; Garland, Marc

    2008-02-01

    Raman microscopy was used in mapping mode to collect more than 1000 spectra in a 100 microm x 100 microm area from a commercial stamp. Band-target entropy minimization (BTEM) was then employed to unmix the mixture spectra in order to extract the pure component spectra of the samples. Three pure component spectral patterns with good signal-to-noise ratios were recovered, and their spatial distributions were determined. The three pure component spectral patterns were then identified as copper phthalocyanine blue, calcite-like material, and yellow organic dye material by comparison to known spectral libraries. The present investigation, consisting of (1) advanced curve resolution (blind-source separation) followed by (2) spectral database matching, readily suggests extensions to authenticity and counterfeit studies of other types of commercial objects. The presence or absence of specific observable components forms the basis for assessment. The present spectral analysis (BTEM) is applicable to highly overlapping spectral information. Since a priori information, such as the number of components present or spectral libraries, is not needed in BTEM, and since minor signals arising from trace components can be reconstructed, this analysis offers a robust approach to a wide variety of material problems involving authenticity and counterfeit issues.

  14. Coherent entropy induced and acoustic noise separation in compact nozzles

    NASA Astrophysics Data System (ADS)

    Tao, Wenjie; Schuller, Thierry; Huet, Maxime; Richecoeur, Franck

    2017-04-01

    A method to separate entropy-induced noise from an acoustic pressure wave in a harmonically perturbed flow through a nozzle is presented. It is tested on an original experimental setup that simultaneously generates acoustic and temperature fluctuations in an air flow accelerated by a convergent nozzle. The setup mimics the direct and indirect noise contributions to the acoustic pressure field in a confined combustion chamber by producing synchronized acoustic and temperature fluctuations, without dealing with the complexity of the combustion process. It allows generating temperature fluctuations with amplitudes up to 10 K in the frequency range from 10 to 100 Hz. The noise separation technique uses experiments with and without temperature fluctuations to determine the relative level of acoustic and entropy fluctuations in the system and to identify the nozzle response to these forcing waves. It requires multi-point measurements of acoustic pressure and temperature. The separation method is first validated with direct numerical simulations of the nonlinear Euler equations. These simulations are used to investigate the conditions for which the separation technique is valid and yield similar trends as the experiments for the investigated flow operating conditions. The separation method then successfully gives the acoustic reflection coefficient but does not recover the same entropy reflection coefficient as predicted by the compact nozzle theory, due to the sensitivity of the method to signal noise in the explored experimental conditions. This methodology provides a framework for the experimental investigation of direct and indirect combustion noise originating from synchronized perturbations.

  15. Shock wave induced vaporization of porous solids

    NASA Astrophysics Data System (ADS)

    Shen, Andy H.; Ahrens, Thomas J.; O'Keefe, John D.

    2003-05-01

    Strong shock waves generated by hypervelocity impact can induce vaporization in solid materials. To study the chemical species in shock-induced vapors, one needs to design experiments that drive the system to thermodynamic states where sufficient vapor is generated for investigation. It is common to use porous media to reach high-entropy, vaporized states in impact experiments. We extended calculations by Ahrens [J. Appl. Phys. 43, 2443 (1972)] and Ahrens and O'Keefe [The Moon 4, 214 (1972)] to higher distentions (up to five) and improved their method with a different impedance match calculation scheme and augmented their model with recent thermodynamic and Hugoniot data of metals, minerals, and polymers. Although we reconfirmed the competing effects reported in the previous studies, namely (1) an increase of entropy production and (2) a decrease of impedance match when impacting materials of increasing distention, our calculations did not exhibit an optimal entropy-generating distention. For different materials, very different impact velocities are needed to initiate vaporization. For aluminum at distention (m)<2.2, a minimum impact velocity of 2.7 km/s is required using a tungsten projectile. For ionic solids such as NaCl at distention <2.2, 2.5 km/s is needed. For carbonate and sulfate minerals, the minimum impact velocities are much lower, ranging from less than 1 to 1.5 km/s.

  16. Study of water based nanofluid flows in annular tubes using numerical simulation and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Siadaty, Moein; Kazazi, Mohsen

    2018-04-01

    Convective heat transfer, entropy generation, and pressure drop of two water-based nanofluids (Cu-water and Al2O3-water) in horizontal annular tubes are scrutinized by means of computational fluid dynamics (CFD), response surface methodology (RSM), and sensitivity analysis. First, a central composite design is used to set up a series of numerical experiments with the diameter ratio, length-to-diameter ratio, Reynolds number, and solid volume fraction as factors. Then, CFD is used to calculate the Nusselt number, Euler number, and entropy generation. After that, RSM is applied to fit second-order polynomials to the responses. Finally, sensitivity analysis is conducted to assess the influence of the above-mentioned parameters inside the tube. In total, 62 different cases are examined. CFD results show that Cu-water and Al2O3-water have the highest and lowest heat transfer rates, respectively. In addition, analysis of variance indicates that an increase in solid volume fraction increases the dimensionless pressure drop for Al2O3-water. Moreover, it has a significant negative effect on the Cu-water Nusselt number and an insignificant effect on its Euler number. Analysis of the Bejan number indicates that frictional and thermal entropy generation are the dominant irreversibilities in the Al2O3-water and Cu-water flows, respectively. Sensitivity analysis indicates that the sensitivity of the dimensionless pressure drop to tube length for Cu-water is independent of the diameter ratio at different Reynolds numbers.
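
    The RSM step described above reduces, in its simplest form, to fitting a full second-order polynomial to the CFD responses by least squares; sensitivities then follow as partial derivatives of that fit. The sketch below shows this pattern with hypothetical factor values and a toy response standing in for the CFD-computed Nusselt number.

        import numpy as np
        from itertools import combinations_with_replacement

        def quadratic_design_matrix(X):
            """Columns: 1, x_i, and all second-order terms x_i*x_j (i <= j)."""
            n, d = X.shape
            cols = [np.ones(n)] + [X[:, i] for i in range(d)]
            cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(d), 2)]
            return np.column_stack(cols)

        # Hypothetical factors (diameter ratio, L/D, Reynolds number, volume fraction)
        # and a toy response standing in for the CFD-computed Nusselt number.
        rng = np.random.default_rng(2)
        X = rng.uniform(size=(62, 4))                       # 62 cases, as in the study
        y = 3 + 2 * X[:, 2] + 5 * X[:, 3] - 1.5 * X[:, 2] * X[:, 3] + 0.05 * rng.standard_normal(62)

        A = quadratic_design_matrix(X)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # The fitted coefficients define the response surface; sensitivities follow as
        # partial derivatives of this polynomial at a point of interest.
        print("fitted coefficients:", np.round(coef, 3))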

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donnelly, William; Freidel, Laurent

    We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.

  18. Structure-Activity Relationship and Molecular Mechanics Reveal the Importance of Ring Entropy in the Biosynthesis and Activity of a Natural Product.

    PubMed

    Tran, Hai L; Lexa, Katrina W; Julien, Olivier; Young, Travis S; Walsh, Christopher T; Jacobson, Matthew P; Wells, James A

    2017-02-22

    Macrocycles are appealing drug candidates due to their high affinity, specificity, and favorable pharmacological properties. In this study, we explored the effects of chemical modifications to a natural product macrocycle upon its activity, 3D geometry, and conformational entropy. We chose thiocillin as a model system, a thiopeptide in the ribosomally encoded family of natural products that exhibits potent antimicrobial effects against Gram-positive bacteria. Since thiocillin is derived from a genetically encoded peptide scaffold, site-directed mutagenesis allows for rapid generation of analogues. To understand thiocillin's structure-activity relationship, we generated a site-saturation mutagenesis library covering each position along thiocillin's macrocyclic ring. We report the identification of eight unique compounds more potent than wild-type thiocillin, the best having an 8-fold improvement in potency. Computational modeling of thiocillin's macrocyclic structure revealed a striking requirement for a low-entropy macrocycle for activity. The populated ensembles of the active mutants showed a rigid structure with few adoptable conformations while inactive mutants showed a more flexible macrocycle which is unfavorable for binding. This finding highlights the importance of macrocyclization in combination with rigidifying post-translational modifications to achieve high-potency binding.

  19. Three perspectives on complexity: entropy, compression, subsymmetry

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi

    2017-12-01

    There is no single universally accepted definition of `Complexity'. There are several perspectives on complexity and what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. In this paper, we explore the following perspectives on complexity: effort-to-describe (Shannon entropy H, Lempel-Ziv complexity LZ), effort-to-compress (ETC complexity) and degree-of-order (Subsymmetry or SubSym). While Shannon entropy and LZ are very popular and widely used, ETC is a relatively new complexity measure. In this paper, we also propose a novel normalized complexity measure SubSym based on the existing idea of counting the number of subsymmetries or palindromes within a sequence. We compare the performance of these complexity measures on the following tasks: (A) characterizing complexity of short binary sequences of lengths 4 to 16, (B) distinguishing periodic and chaotic time series from the 1D logistic map and the 2D Hénon map, (C) analyzing the complexity of stochastic time series generated from 2-state Markov chains, and (D) distinguishing between tonic and irregular spiking patterns generated from the `Adaptive exponential integrate-and-fire' neuron model. Our study reveals that each perspective has its own advantages and uniqueness while also overlapping with the others.
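
    The sketch below illustrates two of these perspectives for short binary sequences: Shannon entropy (effort-to-describe) and a raw count of palindromic substrings as one simple notion of subsymmetry. The exact normalization of SubSym proposed in the paper differs, and the sequences are arbitrary examples.

        from math import log2

        def shannon_entropy(seq):
            """Shannon entropy H (bits per symbol) of a binary string."""
            counts = {s: seq.count(s) for s in set(seq)}
            n = len(seq)
            return -sum(c / n * log2(c / n) for c in counts.values())

        def subsymmetry_count(seq):
            """Number of palindromic substrings of length >= 2 (one simple notion of subsymmetry)."""
            n = len(seq)
            return sum(1 for i in range(n) for j in range(i + 2, n + 1) if seq[i:j] == seq[i:j][::-1])

        for s in ("0101010101010101", "0110100110010110", "0010111011000101"):
            print(s, round(shannon_entropy(s), 3), subsymmetry_count(s))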

  20. Cattaneo-Christov based study of TiO2-CuO/EG Casson hybrid nanofluid flow over a stretching surface with entropy generation

    NASA Astrophysics Data System (ADS)

    Jamshed, Wasim; Aziz, Asim

    2018-06-01

    In the present research, a simplified mathematical model is presented to study the heat transfer and entropy generation of a thermal system containing a hybrid nanofluid. The nanofluid occupies the space over an infinite horizontal surface and the flow is induced by the non-linear stretching of the surface. A uniform transverse magnetic field, the Cattaneo-Christov heat flux model, and thermal radiation effects are also included in the present study. The similarity technique is employed to reduce the governing non-linear partial differential equations to a set of ordinary differential equations. The Keller box numerical scheme is then used to approximate the solutions for the thermal analysis. Results are presented for conventional copper oxide-ethylene glycol (CuO-EG) and hybrid titanium-copper oxide/ethylene glycol (TiO2-CuO/EG) nanofluids. Spherical, hexahedron, tetrahedron, cylindrical, and lamina-shaped nanoparticles are considered in the present analysis. The significant findings of the study are the enhanced heat transfer capability of hybrid nanofluids over conventional nanofluids, the greatest heat transfer rate for the smallest value of the shape factor parameter, and the increase of the overall entropy of the system with increasing Reynolds and Brinkman numbers.

  1. Black holes in vector-tensor theories and their thermodynamics

    NASA Astrophysics Data System (ADS)

    Fan, Zhong-Ying

    2018-01-01

    In this paper, we study Einstein gravity either minimally or non-minimally coupled to a vector field which breaks the gauge symmetry explicitly, in general dimensions. We first consider a minimal theory which is simply the Einstein-Proca theory extended with a quartic self-interaction term for the vector field. We obtain its general static maximally symmetric black hole solution and study the thermodynamics using the Wald formalism. The solution behaves much like a Reissner-Nordström black hole, despite the fact that a global charge cannot be defined for the vector. For non-minimal theories, we obtain many exact black hole solutions, depending on the parameters of the theories. In particular, many of the solutions are general static and have maximal symmetry. However, there are some subtleties and ambiguities in the derivation of the first laws because the existence of an algebraic degree of freedom of the vector in general invalidates the Wald entropy formula. The thermodynamics of these solutions deserves further study.

  2. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast; they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
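
    One standard route from entropy theory to a spectral density is the autoregressive (maximum-entropy) estimate obtained from the Yule-Walker equations, which implicitly extends the autocorrelation function as well. The sketch below applies it to a synthetic flow series with a 12-sample cycle; the AR order, the series, and the Yule-Walker route (rather than, for example, Burg's estimator) are assumptions for illustration, not the authors' exact procedure.

        import numpy as np
        from scipy.linalg import solve_toeplitz

        def maxent_ar_spectrum(x, order=8, nfreq=256):
            """Maximum-entropy (autoregressive) spectral estimate via the Yule-Walker equations."""
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            # Biased autocorrelation estimates r(0..order).
            r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
            a = solve_toeplitz(r[:order], r[1:order + 1])      # AR coefficients
            sigma2 = r[0] - np.dot(a, r[1:order + 1])          # innovation variance
            freqs = np.linspace(0.0, 0.5, nfreq)               # cycles per sample
            z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
            psd = sigma2 / np.abs(1.0 - z @ a) ** 2
            return freqs, psd

        # Toy "streamflow" with an annual cycle (period 12 samples) plus noise.
        rng = np.random.default_rng(3)
        t = np.arange(600)
        flow = 10 + 3 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(600)
        freqs, psd = maxent_ar_spectrum(flow)
        print("dominant period (samples):", 1.0 / freqs[np.argmax(psd[1:]) + 1])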

  3. Essential equivalence of the general equation for the nonequilibrium reversible-irreversible coupling (GENERIC) and steepest-entropy-ascent models of dissipation for nonequilibrium thermodynamics.

    PubMed

    Montefusco, Alberto; Consonni, Francesco; Beretta, Gian Paolo

    2015-04-01

    By reformulating the steepest-entropy-ascent (SEA) dynamical model for nonequilibrium thermodynamics in the mathematical language of differential geometry, we compare it with the primitive formulation of the general equation for the nonequilibrium reversible-irreversible coupling (GENERIC) model and discuss the main technical differences of the two approaches. In both dynamical models the description of dissipation is of the "entropy-gradient" type. SEA focuses only on the dissipative, i.e., entropy generating, component of the time evolution, chooses a sub-Riemannian metric tensor as dissipative structure, and uses the local entropy density field as potential. GENERIC emphasizes the coupling between the dissipative and nondissipative components of the time evolution, chooses two compatible degenerate structures (Poisson and degenerate co-Riemannian), and uses the global energy and entropy functionals as potentials. As an illustration, we rewrite the known GENERIC formulation of the Boltzmann equation in terms of the square root of the distribution function adopted by the SEA formulation. We then provide a formal proof that in more general frameworks, whenever all degeneracies in the GENERIC framework are related to conservation laws, the SEA and GENERIC models of the dissipative component of the dynamics are essentially interchangeable, provided of course they assume the same kinematics. As part of the discussion, we note that equipping the dissipative structure of GENERIC with the Leibniz identity makes it automatically SEA on metric leaves.

  4. Natural approach to quantum dissipation

    NASA Astrophysics Data System (ADS)

    Taj, David; Öttinger, Hans Christian

    2015-12-01

    The dissipative dynamics of a quantum system weakly coupled to one or several reservoirs is usually described in terms of a Lindblad generator. The popularity of this approach is certainly due to the linear character of the latter. However, while such linearity finds justification from an underlying Hamiltonian evolution in some scaling limit, it does not rely on solid physical motivations at small but finite values of the coupling constants, where the generator is typically used for applications. The Markovian quantum master equations we propose are instead supported by very natural thermodynamic arguments. They themselves arise from Markovian master equations for the system and the environment which preserve factorized states and mean energy and generate entropy at a non-negative rate. The dissipative structure is driven by an entropic map, called modular, which introduces nonlinearity. The generated modular dynamical semigroup (MDS) guarantees the positivity of the time-evolved state, the correct steady-state properties, the positivity of the entropy production, and a positive Onsager matrix with symmetry relations arising from Green-Kubo formulas. We show that the celebrated Davies Lindblad generator, obtained through the Born and secular approximations, generates a MDS. In doing so we also provide a nonlinear MDS which is supported by a weak-coupling argument and is free from the limitations of the Davies generator.

  5. Temperature lapse rates at restricted thermodynamic equilibrium. Part II: Saturated air and further discussions

    NASA Astrophysics Data System (ADS)

    Björnbom, Pehr

    2016-03-01

    In the first part of this work equilibrium temperature profiles in fluid columns with ideal gas or ideal liquid were obtained by numerically minimizing the column energy at constant entropy, equivalent to maximizing column entropy at constant energy. A minimum in internal plus potential energy for an isothermal temperature profile was obtained in line with Gibbs' classical equilibrium criterion. However, a minimum in internal energy alone for adiabatic temperature profiles was also obtained. This led to a hypothesis that the adiabatic lapse rate corresponds to a restricted equilibrium state, a type of state in fact discussed already by Gibbs. In this paper similar numerical results for a fluid column with saturated air suggest that also the saturated adiabatic lapse rate corresponds to a restricted equilibrium state. The proposed hypothesis is further discussed and amended based on the previous and the present numerical results and a theoretical analysis based on Gibbs' equilibrium theory.

  6. Entanglement Entropy across the Superfluid-Insulator Transition: A Signature of Bosonic Criticality.

    PubMed

    Frérot, Irénée; Roscilde, Tommaso

    2016-05-13

    We study the entanglement entropy and entanglement spectrum of the paradigmatic Bose-Hubbard model, describing strongly correlated bosons on a lattice. The use of a controlled approximation, the slave-boson approach, allows us to study entanglement in all regimes of the model (and, most importantly, across its superfluid-Mott-insulator transition) at a minimal cost. We find that the area-law scaling of entanglement, verified in all the phases, exhibits a sharp singularity at the transition. The singularity is greatly enhanced when the transition is crossed at fixed, integer filling, due to a richer entanglement spectrum containing an additional gapless mode, which descends from the amplitude (Higgs) mode of the global excitation spectrum, while this mode remains gapped at the generic (commensurate-incommensurate) transition with variable filling. Hence, the entanglement properties contain a unique signature of the two different forms of bosonic criticality exhibited by the Bose-Hubbard model.

  7. Digital focusing of OCT images based on scalar diffraction theory and information entropy

    PubMed Central

    Liu, Guozhong; Zhi, Zhongwei; Wang, Ruikang K.

    2012-01-01

    This paper describes a digital method that is capable of automatically focusing optical coherence tomography (OCT) en face images without prior knowledge of the point spread function of the imaging system. The method utilizes a scalar diffraction model to simulate wave propagation from the out-of-focus scatterers to the focal plane, from which the propagation distance between the out-of-focus plane and the focal plane is determined automatically via an image-definition-evaluation criterion based on information entropy theory. Using the proposed approach, we demonstrate that lateral resolution close to that at the focal plane can be recovered from imaging planes outside the depth-of-field region with minimal loss of resolution. Fresh onion tissues and mouse fat tissues are used in the experiments to show the performance of the proposed method. PMID:23162717
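
    A minimal sketch of the idea, under simplifying assumptions: a complex field is numerically propagated over candidate distances with the angular spectrum method, and the distance whose refocused intensity minimizes an information-entropy sharpness criterion is selected. The wavelength, pixel pitch, synthetic field, and the specific entropy criterion are placeholders, not the paper's OCT data or exact image-definition measure.

        import numpy as np

        def angular_spectrum_propagate(field, wavelength, dx, z):
            """Propagate a complex scalar field by distance z using the angular spectrum method."""
            ny, nx = field.shape
            fx = np.fft.fftfreq(nx, d=dx)
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            # Evanescent components are simply clipped to kz = 0 in this sketch.
            kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1.0 / wavelength**2 - FX**2 - FY**2))
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

        def image_entropy(intensity):
            """Information entropy of the normalized intensity; sharper images score lower."""
            p = intensity / intensity.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def autofocus(field, wavelength, dx, z_candidates):
            """Pick the propagation distance whose refocused image minimizes the entropy criterion."""
            scores = [image_entropy(np.abs(angular_spectrum_propagate(field, wavelength, dx, z))**2)
                      for z in z_candidates]
            return z_candidates[int(np.argmin(scores))]

        # Toy usage with a synthetic defocused field (placeholder values, not OCT data).
        rng = np.random.default_rng(4)
        field = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
        z_best = autofocus(field, wavelength=1.3e-6, dx=5e-6, z_candidates=np.linspace(-1e-3, 1e-3, 21))
        print("estimated defocus distance:", z_best)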

  8. A generalized complexity measure based on Rényi entropy

    NASA Astrophysics Data System (ADS)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minimal at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not, in general, minimal for maximal randomness. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
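
    For orientation, one commonly used convention for the quantities involved is sketched below (an assumption here, not necessarily the authors' exact normalization): the Rényi entropy of the single-particle density and a two-parameter LMC-Rényi-type complexity, which reduces to the exponential form of the pioneering LMC measure for (alpha, beta) = (1, 2).

        % Renyi entropy of order alpha for a single-particle density rho(r), and an
        % LMC-Renyi-type complexity built from two orders (a common convention).
        \begin{align}
          R_\alpha[\rho] &= \frac{1}{1-\alpha}\,\ln\!\int \rho^{\alpha}(\mathbf{r})\,d\mathbf{r},
          \qquad \alpha > 0,\ \alpha \neq 1, \\
          C_{\alpha,\beta}[\rho] &= e^{\,R_\alpha[\rho]-R_\beta[\rho]}, \qquad \alpha < \beta .
        \end{align}
        % In the limit alpha -> 1, R_alpha reduces to the Shannon entropy, and the pair
        % (alpha, beta) = (1, 2) recovers the exponential form of the LMC measure.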

  9. Test images for the maximum entropy image restoration method

    NASA Technical Reports Server (NTRS)

    Mackey, James E.

    1990-01-01

    One of the major activities of any experimentalist is data analysis and reduction. In solar physics, remote observations are made of the sun in a variety of wavelengths and circumstances. In no case is the data collected free from the influence of the design and operation of the data gathering instrument as well as the ever present problem of noise. The presence of significant noise invalidates the simple inversion procedure regardless of the range of known correlation functions. The Maximum Entropy Method (MEM) attempts to perform this inversion by making minimal assumptions about the data. To provide a means of testing the MEM and characterizing its sensitivity to noise, choice of point spread function, type of data, etc., one would like to have test images of known characteristics that can represent the type of data being analyzed. A means of reconstructing these images is presented.

  10. Bacterial protease uses distinct thermodynamic signatures for substrate recognition.

    PubMed

    Bezerra, Gustavo Arruda; Ohara-Nemoto, Yuko; Cornaciu, Irina; Fedosyuk, Sofiya; Hoffmann, Guillaume; Round, Adam; Márquez, José A; Nemoto, Takayuki K; Djinović-Carugo, Kristina

    2017-06-06

    Porphyromonas gingivalis and Porphyromonas endodontalis are important bacteria related to periodontitis, the most common chronic inflammatory disease in humans worldwide. Its comorbidity with systemic diseases, such as type 2 diabetes, oral cancers and cardiovascular diseases, continues to generate considerable interest. Surprisingly, these two microorganisms do not ferment carbohydrates; rather they use proteinaceous substrates as carbon and energy sources. However, the underlying biochemical mechanisms of their energy metabolism remain unknown. Here, we show that dipeptidyl peptidase 11 (DPP11), a central metabolic enzyme in these bacteria, undergoes a conformational change upon peptide binding to distinguish substrates from end products. It binds substrates through an entropy-driven process and end products in an enthalpy-driven fashion. We show that increase in protein conformational entropy is the main-driving force for substrate binding via the unfolding of specific regions of the enzyme ("entropy reservoirs"). The relationship between our structural and thermodynamics data yields a distinct model for protein-protein interactions where protein conformational entropy modulates the binding free-energy. Further, our findings provide a framework for the structure-based design of specific DPP11 inhibitors.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Dehua; Liu, Qing; Tisdale, Jeremy

    This paper reports Seebeck effects driven by both surface polarization difference and entropy difference by using intramolecular charge-transfer states in n-type and p-type conjugated polymers, namely IIDT and IIDDT, based on vertical conductor/polymer/conductor thin-film devices. Large Seebeck coefficients of -898 V/K and 1300 V/K are observed from n-type IIDT and p-type IIDDT, respectively, when the charge-transfer states are generated by white light illumination of 100 mW/cm2. Simultaneously, electrical conductivities are increased from almost insulating states in the dark to conducting states under photoexcitation in both n-type IIDT and p-type IIDDT devices. We find that the intramolecular charge-transfer states can largely enhance Seebeck effects in the n-type IIDT and p-type IIDDT devices, driven by both surface polarization difference and entropy difference. Furthermore, the Seebeck effects can be shifted between polarization and entropy regimes when electrical conductivities are changed. This reveals a new concept for developing Seebeck effects by controlling polarization and entropy regimes based on charge-transfer states in vertical conductor/polymer/conductor thin-film devices.

  12. Anosov C-systems and random number generators

    NASA Astrophysics Data System (ADS)

    Savvidy, G. K.

    2016-08-01

    We further develop our previous proposal to use hyperbolic Anosov C-systems to generate pseudorandom numbers and to use them for efficient Monte Carlo calculations in high energy particle physics. All trajectories of hyperbolic dynamical systems are exponentially unstable, and C-systems therefore have mixing of all orders, a countable Lebesgue spectrum, and a positive Kolmogorov entropy. These exceptional ergodic properties follow from the C-condition introduced by Anosov. This condition defines a rich class of dynamical systems forming an open set in the space of all dynamical systems. An important property of C-systems is that they have a countable set of everywhere dense periodic trajectories and their density increases exponentially with entropy. Of special interest are the C-systems defined on higher-dimensional tori. Such C-systems are excellent candidates for generating pseudorandom numbers that can be used in Monte Carlo calculations. An efficient algorithm was recently constructed that allows generating long C-system trajectories very rapidly. These trajectories have good statistical properties and can be used for calculations in quantum chromodynamics and in high energy particle physics.
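
    As a toy illustration of the idea, the sketch below iterates the Arnold cat map, the simplest Anosov C-system on the 2-torus, and uses the orbit as a crude pseudorandom stream. Floating-point iteration of a 2x2 map is only a demonstration of the mixing property; the generators discussed above operate with large integer matrices on higher-dimensional tori (for example, the MIXMAX family) precisely to obtain long periods and large Kolmogorov entropy.

        import numpy as np

        def cat_map_stream(n, seed=(0.12345678, 0.87654321)):
            """Toy pseudorandom stream from the Arnold cat map (x, y) -> (2x + y, x + y) mod 1,
            the simplest hyperbolic (Anosov) C-system on the 2-torus. Demonstration only."""
            x, y = seed
            out = np.empty(n)
            for i in range(n):
                x, y = (2 * x + y) % 1.0, (x + y) % 1.0
                out[i] = x
            return out

        u = cat_map_stream(100_000)
        print("mean ~ 0.5:", u.mean(), " lag-1 autocorrelation ~ 0:", np.corrcoef(u[:-1], u[1:])[0, 1])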

  13. Temperature Dependence of Uranium and Vanadium Adsorption on Amidoxime-Based Adsorbents in Natural Seawater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuo, Li-Jung; Gill, Gary A.; Tsouris, Costas

    The apparent enthalpy and entropy of the complexation of uranium (VI) and vanadium (V) with amidoxime ligands grafted onto polyethylene fiber was determined using time series measurements of adsorption capacities in natural seawater at three different temperatures. The complexation of uranium was highly endothermic, while the complexation of vanadium showed minimal temperature sensitivity. Amidoxime-based polymeric adsorbents exhibit significantly increased uranium adsorption capacities and selectivity in warmer waters.

  14. Statistical aspects of the Klein-Gordon oscillator in the frame work of GUP

    NASA Astrophysics Data System (ADS)

    Khosropour, B.

    2018-01-01

    Investigations in perturbative string theory and quantum gravity suggest that there is a measurable minimal length in nature. In this work, according to the generalized uncertainty principle, we study the statistical characteristics of the Klein-Gordon oscillator (KLO). The modified energy spectrum of the KLO is obtained. The generalized thermodynamical quantities of the KLO, such as the partition function, mean energy, and entropy, are calculated using the modified energy spectrum.
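
    The generalized thermodynamical quantities mentioned above follow from the standard canonical-ensemble relations, with the GUP-modified spectrum E_n inserted; a sketch of these relations (not of the GUP-specific spectrum itself) is:

        % Standard canonical-ensemble relations, taking the modified spectrum E_n as given.
        \begin{align}
          Z(\beta) &= \sum_{n} e^{-\beta E_n}, \qquad \beta = \frac{1}{k_B T},\\
          U &= -\frac{\partial \ln Z}{\partial \beta}, \qquad
          S = k_B\left(\ln Z + \beta U\right).
        \end{align}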

  15. Generative complexity of Gray-Scott model

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2018-03-01

    In the Gray-Scott reaction-diffusion system, one reactant is constantly fed into the system; another reactant is reproduced by consuming the supplied reactant and is also converted to an inert product. The rate of feeding one reactant into the system and the rate of removing the other reactant from the system determine the configurations of the concentration profiles: stripes, spots, and waves. We calculate the generative complexity, a morphological complexity of concentration profiles grown from a point-wise perturbation of the medium, of the Gray-Scott system for a range of the feeding and removal rates. The morphological complexity is evaluated using Shannon entropy, Simpson diversity, an approximation of Lempel-Ziv complexity, and expressivity (Shannon entropy divided by space-filling). We analyse the behaviour of the systems with the highest values of generative morphological complexity and show that the Gray-Scott systems expressing the highest levels of complexity are composed of wave-fragments (similar to wave-fragments in sub-excitable media) and travelling localisations (similar to quasi-dissipative solitons and gliders in Conway's Game of Life).
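
    A minimal sketch of the procedure, under assumed parameters: integrate the Gray-Scott equations from a point-wise perturbation and evaluate the Shannon entropy of the binned concentration field. The grid size, diffusion coefficients, time step, and the two (f, k) points are illustrative choices, not the paper's exact parameter scan, and only one of the four complexity measures is computed here.

        import numpy as np

        def laplacian(a):
            """Five-point Laplacian with periodic boundaries."""
            return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                    np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

        def gray_scott(f, k, steps=5000, n=128, Du=0.16, Dv=0.08, dt=1.0):
            """Grow a pattern from a point-wise perturbation, as in the study above."""
            u = np.ones((n, n))
            v = np.zeros((n, n))
            u[n//2-3:n//2+3, n//2-3:n//2+3] = 0.5       # point-wise perturbation
            v[n//2-3:n//2+3, n//2-3:n//2+3] = 0.25
            for _ in range(steps):
                uvv = u * v * v
                u += dt * (Du * laplacian(u) - uvv + f * (1 - u))
                v += dt * (Dv * laplacian(v) + uvv - (f + k) * v)
            return v

        def shannon_entropy(field, bins=8):
            """Shannon entropy of the binned concentration values (one of the measures used above)."""
            hist, _ = np.histogram(field, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        # Feed/removal rates are illustrative points in the (f, k) plane, not the
        # paper's exact high-complexity region.
        for f, k in [(0.030, 0.062), (0.055, 0.062)]:
            print(f, k, round(shannon_entropy(gray_scott(f, k)), 3))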

  16. Entanglement of purification: from spin chains to holography

    NASA Astrophysics Data System (ADS)

    Nguyen, Phuc; Devakul, Trithep; Halbasch, Matthew G.; Zaletel, Michael P.; Swingle, Brian

    2018-01-01

    Purification is a powerful technique in quantum physics whereby a mixed quantum state is extended to a pure state on a larger system. This process is not unique, and in systems composed of many degrees of freedom, one natural purification is the one with minimal entanglement. Here we study the entropy of the minimally entangled purification, called the entanglement of purification, in three model systems: an Ising spin chain, conformal field theories holographically dual to Einstein gravity, and random stabilizer tensor networks. We conjecture values for the entanglement of purification in all these models, and we support our conjectures with a variety of numerical and analytical results. We find that such minimally entangled purifications have a number of applications, from enhancing entanglement-based tensor network methods for describing mixed states to elucidating novel aspects of the emergence of geometry from entanglement in the AdS/CFT correspondence.

  17. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  18. Minimally Informative Prior Distributions for PSA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply "the prior"). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well-known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained noninformative prior, the robust prior places a heavy-tailed Cauchy prior on the canonical parameter of the aleatory model.
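
    As a small illustration of the sparse-data issue discussed above, the sketch below performs conjugate gamma updates of a Poisson rate for a few gamma shape choices. The event count, exposure, and shape values are hypothetical, and the constrained noninformative prior of the paper is a specific maximum-entropy construction whose exact parameters are not reproduced here; the example only shows how the choice of prior shape pulls the posterior mean when data are sparse.

        import numpy as np
        from scipy import stats

        # Aleatory model: Poisson event count x in exposure time t with rate lam.
        x, t = 1, 10.0                      # sparse data: one event in 10 exposure units (illustrative)

        # Gamma priors for a Poisson rate, parameterized as (shape, rate); the values
        # below are common conventions used here only for illustration.
        priors = {
            "Jeffreys-type (shape 0.5)": (0.5, 0.0),
            "diffuse gamma (shape 0.1)": (0.1, 0.0),
            "informative gamma (mean 1, shape 2)": (2.0, 2.0),
        }

        for name, (a, b) in priors.items():
            post = stats.gamma(a + x, scale=1.0 / (b + t))     # conjugate update
            print(f"{name:38s} posterior mean = {post.mean():.3f}, 95% interval = {post.ppf([0.025, 0.975])}")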

  19. Energetic basis on interactions between ferredoxin and ferredoxin NADP{sup +} reductase at varying physiological conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinoshita, Misaki; Kim, Ju Yaen; Kume, Satoshi

    In spite of a number of studies characterizing ferredoxin (Fd):ferredoxin NADP+ reductase (FNR) interactions under limited conditions, a detailed energetic investigation of how these proteins interact under near-physiological conditions, and its linkage to FNR activity, is still lacking. We herein performed systematic Fd:FNR binding thermodynamics using isothermal titration calorimetry (ITC) at distinct pH (6.0 and 8.0), NaCl concentrations (0–200 mM), and temperatures (19–28 °C) to mimic physiological conditions in chloroplasts. Energetically unfavorable endothermic enthalpy changes accompanied Fd:FNR complexation at all conditions. This energetic cost was compensated by favorable entropy changes, balanced by conformational and hydrational entropy. Increases in the NaCl concentration and pH weakened the interprotein affinity due to a smaller contribution of the favorable entropy change, regardless of energetic gains from the enthalpy changes, suggesting that entropy drove complexation and modulated affinity. Effects of temperature on the binding thermodynamics were much smaller than those of pH and NaCl. NaCl concentration- and pH-dependent enthalpy and heat capacity changes provided clues for distinct binding modes. Moreover, decreases in the enthalpy level in the Hammond's postulate-based energy landscape implicated kinetic advantages for FNR activity. All these energetic interplays are comprehensively demonstrated by the driving force plot with the enthalpy-entropy compensation, which may serve as an energetic buffer against outer stresses. We propose that high affinity at pH 6.0 may be beneficial for protection from proteolysis of Fd and FNR in resting states, and moderate affinity at pH 8.0 and proper NaCl concentrations with smaller endothermic enthalpy changes may contribute to increased FNR activity. - Highlights: • Energetics of Fd:FNR binding were examined by considering physiological conditions. • NaCl and pH affect Fd:FNR binding energetics, with minimal effects of temperature. • Enthalpy and heat capacity may modulate binding kinetics and modes for FNR activity. • Entropy drives complexation by overcoming unfavorable enthalpy and tunes affinity. • Driving force plot reveals condition-dependent energetic interplays for complexation.

  20. Altered Enthalpy-Entropy Compensation in Picomolar Transition State Analogues of Human Purine Nucleoside Phosphorylase†

    PubMed Central

    Edwards, Achelle A.; Mason, Jennifer M.; Clinch, Keith; Tyler, Peter C.; Evans, Gary B.; Schramm, Vern L.

    2009-01-01

    Human purine nucleoside phosphorylase (PNP) belongs to the trimeric class of PNPs and is essential for catabolism of deoxyguanosine. Genetic deficiency of PNP in humans causes a specific T-cell immune deficiency, and transition state analogue inhibitors of PNP are in development for treatment of T-cell cancers and autoimmune disorders. Four generations of Immucillins have been developed, each of which contains inhibitors binding with picomolar affinity to human PNP. Full inhibition of PNP occurs upon binding to the first of three subunits and binding to subsequent sites occurs with negative cooperativity. In contrast, substrate analogue and product bind without cooperativity. Titrations of human PNP using isothermal titration calorimetry indicate that binding of a structurally rigid first-generation Immucillin (Kd = 56 pM) is driven by large negative enthalpy values (ΔH = −21.2 kcal/mol) with a substantial entropic (−TΔS) penalty. The tightest-binding inhibitors (Kd = 5 to 9 pM) have increased conformational flexibility. Despite their conformational freedom in solution, flexible inhibitors bind with high affinity because of reduced entropic penalties. Entropic penalties are proposed to arise from conformational freezing of the PNP·inhibitor complex with the entropy term dominated by protein dynamics. The conformationally flexible Immucillins reduce the system entropic penalty. Disrupting the ribosyl 5’-hydroxyl interaction of transition state analogues with PNP causes favorable entropy of binding. Tight binding of the seventeen Immucillins is characterized by large enthalpic contributions, emphasizing their similarity to the transition state. By introducing flexibility into the inhibitor structure, the enthalpy-entropy compensation pattern is altered to permit tighter binding. PMID:19425594
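
    A quick back-of-the-envelope decomposition of the reported first-generation Immucillin numbers (Kd = 56 pM, ΔH = −21.2 kcal/mol) illustrates the entropic penalty. The temperature of 298 K and the gas constant value are assumptions; the paper's exact ITC conditions may differ.

```python
import math

R = 1.987e-3        # kcal mol^-1 K^-1
T = 298.0           # K, assumed
Kd = 56e-12         # M (reported dissociation constant)
dH = -21.2          # kcal/mol (reported binding enthalpy)

dG = R * T * math.log(Kd)     # standard free energy of binding (1 M reference state)
minus_TdS = dG - dH           # from dG = dH - T*dS
print(f"dG    = {dG:6.1f} kcal/mol")
print(f"-T*dS = {minus_TdS:6.1f} kcal/mol (positive value = entropic penalty)")
```

    With these assumptions the binding free energy is about −14 kcal/mol, so roughly 7 kcal/mol of the −21.2 kcal/mol enthalpy is paid back as an entropic penalty, consistent with the compensation pattern described above.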

  1. Dissipation, generalized free energy, and a self-consistent nonequilibrium thermodynamics of chemically driven open subsystems.

    PubMed

    Ge, Hao; Qian, Hong

    2013-06-01

    Nonequilibrium thermodynamics of a system situated in a sustained environment with influx and efflux is usually treated as a subsystem in a larger, closed "universe." A question remains with regard to what the minimally required description for the surrounding of such an open driven system is so that its nonequilibrium thermodynamics can be established solely based on the internal stochastic kinetics. We provide a solution to this problem using insights from studies of molecular motors in a chemical nonequilibrium steady state (NESS) with sustained external drive through a regenerating system or in a quasisteady state (QSS) with an excess amount of adenosine triphosphate (ATP), adenosine diphosphate (ADP), and inorganic phosphate (Pi). We introduce the key notion of minimal work that is needed, W(min), for the external regenerating system to sustain a NESS (e.g., maintaining constant concentrations of ATP, ADP and Pi for a molecular motor). Using a Markov (master-equation) description of a motor protein, we illustrate that the NESS and QSS have identical kinetics as well as the second law in terms of the same positive entropy production rate. The heat dissipation of a NESS without mechanical output is exactly the W(min). This provides a justification for introducing an ideal external regenerating system and yields a free-energy balance equation between the net free-energy input F(in) and total dissipation F(dis) in an NESS: F(in) consists of chemical input minus mechanical output; F(dis) consists of dissipative heat, i.e. the amount of useful energy becoming heat, which also equals the NESS entropy production. Furthermore, we show that for nonstationary systems, the F(dis) and F(in) correspond to the entropy production rate and housekeeping heat in stochastic thermodynamics and identify a relative entropy H as a generalized free energy. We reach a new formulation of Markovian nonequilibrium thermodynamics based on only the internal kinetic equation without further reference to the intrinsic degree of freedom within each Markov state. It includes an extended free-energy balance and a second law which are valid for driven stochastic dynamics with an ideal external regenerating system. Our result suggests new ingredients for a generalized thermodynamics of self-organization in driven systems.
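
    Schematically, the balance described above can be written as follows. The notation is paraphrased from the abstract (H the relative-entropy generalized free energy, e_p the entropy production rate), so the exact symbols and sign conventions of the paper may differ.

```latex
\frac{\mathrm{d}H}{\mathrm{d}t} \;=\; F_{\mathrm{in}} - F_{\mathrm{dis}},
\qquad F_{\mathrm{dis}} \ge 0,
\qquad \text{NESS: } \frac{\mathrm{d}H}{\mathrm{d}t} = 0
\;\Rightarrow\; F_{\mathrm{in}} = F_{\mathrm{dis}} = e_{p} \ge 0 .
```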

  2. Numerical analysis of single and multiple jets

    NASA Astrophysics Data System (ADS)

    Boussoufi, Mustapha; Sabeur-Bendehina, Amina; Ouadha, Ahmed; Morsli, Souad; El Ganaoui, Mohammed

    2017-05-01

    The present study aims to use the concept of entropy generation in order to study numerically the flow and the interaction of multiple jets. Several configurations of a single jet surrounded by equidistant 3, 5, 7 and 9 circumferential jets have been studied. The turbulent incompressible Navier-Stokes equations have been solved numerically using the commercial computational fluid dynamics code Fluent. The standard k-ɛ model has been selected to assess the eddy viscosity. The domain has been reduced to a quarter of the geometry due to symmetry. Results for axial and radial velocities have been compared with experimental measurements from the literature. Furthermore, additional results involving entropy generation rate have been presented and discussed. Contribution to the topical issue "Materials for Energy harvesting, conversion and storage II (ICOME 2016)", edited by Jean-Michel Nunzi, Rachid Bennacer and Mohammed El Ganaoui

  3. Alloy design for intrinsically ductile refractory high-entropy alloys

    NASA Astrophysics Data System (ADS)

    Sheikh, Saad; Shafeie, Samrand; Hu, Qiang; Ahlström, Johan; Persson, Christer; Veselý, Jaroslav; Zýka, Jiří; Klement, Uta; Guo, Sheng

    2016-10-01

    Refractory high-entropy alloys (RHEAs), comprising group IV (Ti, Zr, Hf), V (V, Nb, Ta), and VI (Cr, Mo, W) refractory elements, are potential new-generation high-temperature materials. However, most existing RHEAs lack room-temperature ductility, similar to conventional refractory metals and alloys. Here, we propose an alloy design strategy to intrinsically ductilize RHEAs based on electron theory, and more specifically to decrease the number of valence electrons through controlled alloying. A new ductile RHEA, Hf0.5Nb0.5Ta0.5Ti1.5Zr, was developed as a proof of concept, with a fracture stress close to 1 GPa and an elongation of nearly 20%. The findings here will shed light on the development of ductile RHEAs for ultrahigh-temperature applications in the aerospace and power-generation industries.
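
    The electron-count bookkeeping behind the design rule can be sketched as follows, using the group number of each element as its valence electron count (the usual VEC convention). The composition is the one reported above; any ductility threshold is left to the paper.

```python
# group number used as the per-element valence electron count (VEC convention)
valence = {"Ti": 4, "Zr": 4, "Hf": 4, "V": 5, "Nb": 5, "Ta": 5, "Cr": 6, "Mo": 6, "W": 6}
alloy = {"Hf": 0.5, "Nb": 0.5, "Ta": 0.5, "Ti": 1.5, "Zr": 1.0}   # Hf0.5Nb0.5Ta0.5Ti1.5Zr

vec = sum(x * valence[el] for el, x in alloy.items()) / sum(alloy.values())
print(f"VEC = {vec:.2f}")
```

    The extra Ti and Zr pull the count down to 4.25, below the 4.4 of the equimolar HfNbTaTiZr alloy, which is the direction the controlled-alloying strategy above argues for.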

  4. Universal bounds on current fluctuations.

    PubMed

    Pietzonka, Patrick; Barato, Andre C; Seifert, Udo

    2016-05-01

    For current fluctuations in nonequilibrium steady states of Markovian processes, we derive four different universal bounds valid beyond the Gaussian regime. Different variants of these bounds apply to either the entropy change or any individual current, e.g., the rate of substrate consumption in a chemical reaction or the electron current in an electronic device. The bounds vary with respect to their degree of universality and tightness. A universal parabolic bound on the generating function of an arbitrary current depends solely on the average entropy production. A second, stronger bound requires knowledge both of the thermodynamic forces that drive the system and of the topology of the network of states. These two bounds are conjectures based on extensive numerics. An exponential bound that depends only on the average entropy production and the average number of transitions per time is rigorously proved. This bound has no obvious relation to the parabolic bound but it is typically tighter further away from equilibrium. An asymptotic bound that depends on the specific transition rates and becomes tight for large fluctuations is also derived. This bound allows for the prediction of the asymptotic growth of the generating function. Even though our results are restricted to networks with a finite number of states, we show that the parabolic bound is also valid for three paradigmatic examples of driven diffusive systems for which the generating function can be calculated using the additivity principle. Our bounds provide a general class of constraints for nonequilibrium systems.

  5. Translation Invariant Extensions of Finite Volume Measures

    NASA Astrophysics Data System (ADS)

    Goldstein, S.; Kuna, T.; Lebowitz, J. L.; Speer, E. R.

    2017-02-01

    We investigate the following questions: Given a measure μ_Λ on configurations on a subset Λ of a lattice L, where a configuration is an element of Ω^Λ for some fixed set Ω, does there exist a measure μ on configurations on all of L, invariant under some specified symmetry group of L, such that μ_Λ is its marginal on configurations on Λ? When the answer is yes, what are the properties, e.g., the entropies, of such measures? Our primary focus is the case in which L = Z^d and the symmetries are the translations. For the case in which Λ is an interval in Z we give a simple necessary and sufficient condition, local translation invariance (LTI), for extendibility. For LTI measures we construct extensions having maximal entropy, which we show are Gibbs measures; this construction extends to the case in which L is the Bethe lattice. On Z we also consider extensions supported on periodic configurations, which are analyzed using de Bruijn graphs and which include the extensions with minimal entropy. When Λ ⊂ Z is not an interval, or when Λ ⊂ Z^d with d > 1, the LTI condition is necessary but not sufficient for extendibility. For Z^d with d > 1, extendibility is in some sense undecidable.

  6. Aeroacoustic and aerodynamic applications of the theory of nonequilibrium thermodynamics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Smith, Charles A.; Karamcheti, Krishnamurty

    1991-01-01

    Recent developments in the field of nonequilibrium thermodynamics associated with viscous flows are examined and related to developments to the understanding of specific phenomena in aerodynamics and aeroacoustics. A key element of the nonequilibrium theory is the principle of minimum entropy production rate for steady dissipative processes near equilibrium, and variational calculus is used to apply this principle to several examples of viscous flow. A review of nonequilibrium thermodynamics and its role in fluid motion are presented. Several formulations are presented of the local entropy production rate and the local energy dissipation rate, two quantities that are of central importance to the theory. These expressions and the principle of minimum entropy production rate for steady viscous flows are used to identify parallel-wall channel flow and irrotational flow as having minimally dissipative velocity distributions. Features of irrotational, steady, viscous flow near an airfoil, such as the effect of trailing-edge radius on circulation, are also found to be compatible with the minimum principle. Finally, the minimum principle is used to interpret the stability of infinitesimal and finite amplitude disturbances in an initially laminar, parallel shear flow, with results that are consistent with experiment and linearized hydrodynamic stability theory. These results suggest that a thermodynamic approach may be useful in unifying the understanding of many diverse phenomena in aerodynamics and aeroacoustics.
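
    For reference, one common form of the local entropy production rate for an incompressible, heat-conducting Newtonian fluid is reproduced below in standard notation; the paper presents several formulations, and this is not necessarily its exact expression.

```latex
\dot{s}_{\mathrm{gen}}
  \;=\; \underbrace{\frac{k}{T^{2}}\,\nabla T \cdot \nabla T}_{\text{heat conduction}}
  \;+\; \underbrace{\frac{\mu\,\Phi}{T}}_{\text{viscous dissipation}} \;\ge\; 0,
\qquad
\Phi = 2\,e_{ij}e_{ij},
\quad e_{ij} = \tfrac{1}{2}\left(\partial_i u_j + \partial_j u_i\right).
```

    The variational statement used above then selects, among admissible steady near-equilibrium fields, the one minimizing the volume integral of this rate.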

  7. Marginally trapped surfaces and AdS/CFT

    NASA Astrophysics Data System (ADS)

    Grado-White, Brianna; Marolf, Donald

    2018-02-01

    It has been proposed that the areas of marginally trapped or anti-trapped surfaces (also known as leaves of holographic screens) may encode some notion of entropy. To connect this to AdS/CFT, we study the case of marginally trapped surfaces anchored to an AdS boundary. We establish that such boundary-anchored leaves lie between the causal and extremal surfaces defined by the anchor and that they have area bounded below by that of the minimal extremal surface. This suggests that the area of any leaf represents a coarse-grained von Neumann entropy for the associated region of the dual CFT. We further demonstrate that the leading area-divergence of a boundary-anchored marginally trapped surface agrees with that for the associated extremal surface, though subleading divergences generally differ. Finally, we generalize an argument of Bousso and Engelhardt to show that holographic screens with all leaves anchored to the same boundary set have leaf-areas that increase monotonically along the screen, and we describe a construction through which this monotonicity can take the more standard form of requiring entropy to increase with boundary time. This construction is related to what one might call future causal holographic information, which in such cases also provides an upper bound on the area of the associated leaves.

  8. Thermodynamic perspectives on genetic instructions, the laws of biology and diseased states.

    PubMed

    Trevors, Jack T; Saier, Milton H

    2011-01-01

    This article examines in a broad perspective entropy and some examples of its relationship to evolution, genetic instructions and how we view diseases. Living organisms are programmed by functional genetic instructions (FGI), through cellular communication pathways, to grow and reproduce by maintaining a variety of hemistable, ordered structures (low entropy). Living organisms are far from equilibrium with their surrounding environmental systems, which tends towards increasing disorder (increasing entropy). Organisms free themselves from high entropy (high disorder) to maintain their cellular structures for a period of time sufficient to allow reproduction and the resultant offspring to reach reproductive ages. This time interval varies for different species. Bacteria, for example, need no sexual parents; dividing cells are nearly identical to the previous generation of cells, and can begin a new cell cycle without delay under appropriate conditions. By contrast, human infants require years of care before they can reproduce. Living organisms maintain order in spite of their changing surrounding environment that decreases order according to the second law of thermodynamics. These events actually work together since living organisms create ordered biological structures by increasing local entropy. From a disease perspective, viruses and other disease agents interrupt the normal functioning of cells. The pressure for survival may result in mechanisms that allow organisms to resist attacks by viruses, other pathogens, destructive chemicals and physical agents such as radiation. However, when the attack is successful, the organism can be damaged until the cell, tissue, organ or entire organism is no longer functional and entropy increases. Copyright © 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  9. On relativistic generalization of Perelman's W-entropy and thermodynamic description of gravitational fields and cosmology

    NASA Astrophysics Data System (ADS)

    Ruchin, Vyacheslav; Vacaru, Olivia; Vacaru, Sergiu I.

    2017-03-01

    Using double 2+2 and 3+1 nonholonomic fibrations on Lorentz manifolds, we extend the concept of W-entropy for gravitational fields in general relativity (GR). Such F- and W-functionals were introduced in the Ricci flow theory of three dimensional (3-d) Riemannian metrics by Perelman (the entropy formula for the Ricci flow and its geometric applications. arXiv:math.DG/0211159). Non-relativistic 3-d Ricci flows are characterized by associated statistical thermodynamical values determined by W-entropy. Generalizations for geometric flows of 4-d pseudo-Riemannian metrics are considered for models with local thermodynamical equilibrium and separation of dissipative and non-dissipative processes in relativistic hydrodynamics. The approach is elaborated in the framework of classical field theories (relativistic continuum and hydrodynamic models) without an underlying kinetic description, which will be elaborated in other work. The 3+1 splitting allows us to provide a general relativistic definition of gravitational entropy in the Lyapunov-Perelman sense. It increases monotonically as structure forms in the Universe. We can formulate a thermodynamic description of exact solutions in GR depending, in general, on all spacetime coordinates. A corresponding 2+2 splitting with nonholonomic deformation of linear connection and frame structures is necessary for generating in very general form various classes of exact solutions of the Einstein and general relativistic geometric flow equations. Finally, we speculate on physical macrostates and microstate interpretations of the W-entropy in GR, geometric flow theories and possible connections to string theory (a second unsolved problem also contained in Perelman's work) in Polyakov's approach.
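
    As background for the generalization described above, Perelman's W-functional for a closed 3-d Riemannian manifold (M, g) is usually written in the following standard form; it is quoted here only for orientation and is not the relativistic extension constructed in the paper.

```latex
\mathcal{W}(g,f,\tau)
  \;=\; \int_{M}\left[\,\tau\left(R + |\nabla f|^{2}\right) + f - n\,\right]
        (4\pi\tau)^{-n/2}\, e^{-f}\,\mathrm{d}V,
\qquad
\int_{M} (4\pi\tau)^{-n/2} e^{-f}\,\mathrm{d}V = 1,
```

    with n = 3 in Perelman's setting, R the scalar curvature, and τ > 0; the functional is monotone non-decreasing along the coupled Ricci flow, which is what underlies the entropy interpretation the paper extends.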

  10. Contrast statistics for foveated visual systems: fixation selection by minimizing contrast entropy

    NASA Astrophysics Data System (ADS)

    Raj, Raghu; Geisler, Wilson S.; Frazor, Robert A.; Bovik, Alan C.

    2005-10-01

    The human visual system combines a wide field of view with a high-resolution fovea and uses eye, head, and body movements to direct the fovea to potentially relevant locations in the visual scene. This strategy is sensible for a visual system with limited neural resources. However, for this strategy to be effective, the visual system needs sophisticated central mechanisms that efficiently exploit the varying spatial resolution of the retina. To gain insight into some of the design requirements of these central mechanisms, we have analyzed the effects of variable spatial resolution on local contrast in 300 calibrated natural images. Specifically, for each retinal eccentricity (which produces a certain effective level of blur), and for each value of local contrast observed at that eccentricity, we measured the probability distribution of the local contrast in the unblurred image. These conditional probability distributions can be regarded as posterior probability distributions for the ``true'' unblurred contrast, given an observed contrast at a given eccentricity. We find that these conditional probability distributions are adequately described by a few simple formulas. To explore how these statistics might be exploited by central perceptual mechanisms, we consider the task of selecting successive fixation points, where the goal on each fixation is to maximize total contrast information gained about the image (i.e., minimize total contrast uncertainty). We derive an entropy minimization algorithm and find that it performs optimally at reducing total contrast uncertainty and that it also works well at reducing the mean squared error between the original image and the image reconstructed from the multiple fixations. Our results show that measurements of local contrast alone could efficiently drive the scan paths of the eye when the goal is to gain as much information about the spatial structure of a scene as possible.

  11. Analytical approach to entropy generation and heat transfer in CNT-nanofluid dynamics through a ciliated porous medium

    NASA Astrophysics Data System (ADS)

    Akbar, Noreen Sher; Shoaib, M.; Tripathi, Dharmendra; Bhushan, Shashi; Bég, O. Anwar

    2018-04-01

    The transportation of biological and industrial nanofluids by natural propulsion like cilia movement and self-generated contraction-relaxation of flexible walls has significant applications in numerous emerging technologies. Inspired by multi-disciplinary progress and innovation in this direction, a thermo-fluid mechanical model is proposed to study the entropy generation and convective heat transfer of nanofluids fabricated by the dispersion of single-wall carbon nanotubes (SWCNT) nanoparticles in water as the base fluid. The regime studied comprises heat transfer and steady, viscous, incompressible flow, induced by metachronal wave propulsion due to beating cilia, through a cylindrical tube containing a sparse (i.e., high permeability) homogenous porous medium. The flow is of the creeping type and is restricted under the low Reynolds number and long wavelength approximations. Slip effects at the wall are incorporated and the generalized Darcy drag-force model is utilized to mimic porous media effects. Cilia boundary conditions for velocity components are employed to determine analytical solutions to the resulting non-dimensionalized boundary value problem. The influence of pertinent physical parameters on temperature, axial velocity, pressure rise and pressure gradient, entropy generation function, Bejan number and stream-line distributions are computed numerically. A comparative study between SWCNT-nanofluids and pure water is also computed. The computations demonstrate that axial flow is accelerated with increasing slip parameter and Darcy number and is greater for SWCNT-nanofluids than for pure water. Furthermore the size of the bolus for SWCNT-nanofluids is larger than that of the pure water. The study is applicable in designing and fabricating nanoscale and microfluidics devices, artificial cilia and biomimetic micro-pumps.
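
    The entropy generation function and Bejan number reported in such analyses are conventionally defined as below (generic notation; the paper's dimensionless groups may differ in detail).

```latex
\dot{S}_{\mathrm{gen}} \;=\; \dot{S}_{\mathrm{heat}} + \dot{S}_{\mathrm{fluid}},
\qquad
\mathrm{Be} \;=\; \frac{\dot{S}_{\mathrm{heat}}}{\dot{S}_{\mathrm{heat}} + \dot{S}_{\mathrm{fluid}}} \;\in\; [0,1],
```

    so Be > 1/2 indicates that heat-transfer irreversibility dominates fluid-friction (and porous-drag) irreversibility.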

  12. Analytical approach to entropy generation and heat transfer in CNT-nanofluid dynamics through a ciliated porous medium

    NASA Astrophysics Data System (ADS)

    Akbar, Noreen Sher; Shoaib, M.; Tripathi, Dharmendra; Bhushan, Shashi; Bég, O. Anwar

    2018-03-01

    The transportation of biological and industrial nanofluids by natural propulsion like cilia movement and self-generated contraction-relaxation of flexible walls has significant applications in numerous emerging technologies. Inspired by multi-disciplinary progress and innovation in this direction, a thermo-fluid mechanical model is proposed to study the entropy generation and convective heat transfer of nanofluids fabricated by the dispersion of single-wall carbon nanotubes (SWCNT) nanoparticles in water as the base fluid. The regime studied comprises heat transfer and steady, viscous, incompressible flow, induced by metachronal wave propulsion due to beating cilia, through a cylindrical tube containing a sparse (i.e., high permeability) homogenous porous medium. The flow is of the creeping type and is restricted under the low Reynolds number and long wavelength approximations. Slip effects at the wall are incorporated and the generalized Darcy drag-force model is utilized to mimic porous media effects. Cilia boundary conditions for velocity components are employed to determine analytical solutions to the resulting non-dimensionalized boundary value problem. The influence of pertinent physical parameters on temperature, axial velocity, pressure rise and pressure gradient, entropy generation function, Bejan number and stream-line distributions are computed numerically. A comparative study between SWCNT-nanofluids and pure water is also computed. The computations demonstrate that axial flow is accelerated with increasing slip parameter and Darcy number and is greater for SWCNT-nanofluids than for pure water. Furthermore the size of the bolus for SWCNT-nanofluids is larger than that of the pure water. The study is applicable in designing and fabricating nanoscale and microfluidics devices, artificial cilia and biomimetic micro-pumps.

  13. Path-integral Monte Carlo method for Rényi entanglement entropies.

    PubMed

    Herdman, C M; Inglis, Stephen; Roy, P-N; Melko, R G; Del Maestro, A

    2014-07-01

    We introduce a quantum Monte Carlo algorithm to measure the Rényi entanglement entropies in systems of interacting bosons in the continuum. This approach is based on a path-integral ground state method that can be applied to interacting itinerant bosons in any spatial dimension with direct relevance to experimental systems of quantum fluids. We demonstrate how it may be used to compute spatial mode entanglement, particle partitioned entanglement, and the entanglement of particles, providing insights into quantum correlations generated by fluctuations, indistinguishability, and interactions. We present proof-of-principle calculations and benchmark against an exactly soluble model of interacting bosons in one spatial dimension. As this algorithm retains the fundamental polynomial scaling of quantum Monte Carlo when applied to sign-problem-free models, future applications should allow for the study of entanglement entropy in large-scale many-body systems of interacting bosons.
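
    The quantities targeted by such estimators are the Rényi entanglement entropies of a spatial or particle bipartition. For completeness, the standard definition and the replica (SWAP) identity commonly exploited at α = 2 in quantum Monte Carlo are

```latex
S_{\alpha}(\rho_{A}) \;=\; \frac{1}{1-\alpha}\,\ln \operatorname{Tr}\rho_{A}^{\alpha},
\qquad
S_{2}(\rho_{A}) \;=\; -\ln \operatorname{Tr}\rho_{A}^{2}
\;=\; -\ln \big\langle \mathrm{SWAP}_{A} \big\rangle ,
```

    where ρ_A is the reduced density matrix of the subregion (or particle subset) A; whether the algorithm above uses exactly this SWAP form is not asserted here.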

  14. Novel quantum phase transition from bounded to extensive entanglement

    PubMed Central

    Zhang, Zhao; Ahmadain, Amr

    2017-01-01

    The nature of entanglement in many-body systems is a focus of intense research with the observation that entanglement holds interesting information about quantum correlations in large systems and their relation to phase transitions. In particular, it is well known that although generic, many-body states have large, extensive entropy, ground states of reasonable local Hamiltonians carry much smaller entropy, often associated with the boundary length through the so-called area law. Here we introduce a continuous family of frustration-free Hamiltonians with exactly solvable ground states and uncover a remarkable quantum phase transition whereby the entanglement scaling changes from area law into extensively large entropy. This transition shows that entanglement in many-body systems may be enhanced under special circumstances with a potential for generating “useful” entanglement for the purpose of quantum computing and that the full implications of locality and its restrictions on possible ground states may hold further surprises. PMID:28461464

  15. Novel quantum phase transition from bounded to extensive entanglement.

    PubMed

    Zhang, Zhao; Ahmadain, Amr; Klich, Israel

    2017-05-16

    The nature of entanglement in many-body systems is a focus of intense research with the observation that entanglement holds interesting information about quantum correlations in large systems and their relation to phase transitions. In particular, it is well known that although generic, many-body states have large, extensive entropy, ground states of reasonable local Hamiltonians carry much smaller entropy, often associated with the boundary length through the so-called area law. Here we introduce a continuous family of frustration-free Hamiltonians with exactly solvable ground states and uncover a remarkable quantum phase transition whereby the entanglement scaling changes from area law into extensively large entropy. This transition shows that entanglement in many-body systems may be enhanced under special circumstances with a potential for generating "useful" entanglement for the purpose of quantum computing and that the full implications of locality and its restrictions on possible ground states may hold further surprises.

  16. Nonlinear dynamic analysis of voices before and after surgical excision of vocal polyps

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; McGilligan, Clancy; Zhou, Liang; Vig, Mark; Jiang, Jack J.

    2004-05-01

    Phase space reconstruction, correlation dimension, and second-order entropy, methods from nonlinear dynamics, are used to analyze sustained vowels generated by patients before and after surgical excision of vocal polyps. Two conventional acoustic perturbation parameters, jitter and shimmer, are also employed to analyze voices before and after surgery. Presurgical and postsurgical analyses of jitter, shimmer, correlation dimension, and second-order entropy are statistically compared. Correlation dimension and second-order entropy show a statistically significant decrease after surgery, indicating reduced complexity and higher predictability of postsurgical voice dynamics. There is not a significant postsurgical difference in shimmer, although jitter shows a significant postsurgical decrease. The results suggest that jitter and shimmer should be applied to analyze disordered voices with caution; however, nonlinear dynamic methods may be useful for analyzing abnormal vocal function and quantitatively evaluating the effects of surgical excision of vocal polyps.
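
    To make the two conventional perturbation measures concrete, the sketch below computes one common "local" form of jitter and shimmer from cycle-to-cycle periods and peak amplitudes. The toy numbers are hypothetical, and the analysis software used in the study may use different variants.

```python
import numpy as np

def jitter_shimmer(periods, amplitudes):
    """Local jitter and shimmer: mean absolute cycle-to-cycle change divided by
    the mean, for periods and peak amplitudes respectively (one common variant)."""
    T = np.asarray(periods, dtype=float)
    A = np.asarray(amplitudes, dtype=float)
    jitter = np.mean(np.abs(np.diff(T))) / np.mean(T)
    shimmer = np.mean(np.abs(np.diff(A))) / np.mean(A)
    return 100.0 * jitter, 100.0 * shimmer     # as percentages

# toy cycle-to-cycle data (hypothetical): periods in ms, amplitudes in arbitrary units
periods = [7.9, 8.1, 8.0, 8.3, 7.8, 8.2]
amplitudes = [1.00, 0.95, 1.05, 0.92, 1.03, 0.98]
j, s = jitter_shimmer(periods, amplitudes)
print(f"jitter = {j:.2f} %, shimmer = {s:.2f} %")
```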

  17. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with that produced using surrogate methods. Unlike continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying fine-scale dynamics of the observed signal, while maintaining macro structural characteristics. Comparison of entropy estimates indicated observed signals had greater regularity than surrogates and were not only the result of stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.

  18. ATR applications of minimax entropy models of texture and shape

    NASA Astrophysics Data System (ADS)

    Zhu, Song-Chun; Yuille, Alan L.; Lanterman, Aaron D.

    2001-10-01

    Concepts from information theory have recently found favor in both the mainstream computer vision community and the military automatic target recognition community. In the computer vision literature, the principles of minimax entropy learning theory have been used to generate rich probabilistic models of texture and shape. In addition, the method of types and large deviation theory has permitted the difficulty of various texture and shape recognition tasks to be characterized by 'order parameters' that determine how fundamentally vexing a task is, independent of the particular algorithm used. These information-theoretic techniques have been demonstrated using traditional visual imagery in applications such as simulating cheetah skin textures and finding roads in aerial imagery. We discuss their application to problems in the specific application domain of automatic target recognition using infrared imagery. We also review recent theoretical and algorithmic developments which permit learning minimax entropy texture models for infrared textures in reasonable timeframes.

  19. Mixing entropy in Dean flows

    NASA Astrophysics Data System (ADS)

    Fodor, Petru; Vyhnalek, Brian; Kaufman, Miron

    2013-03-01

    We investigate mixing in Dean flows by solving numerically the Navier-Stokes equation for a circular channel. Tracers of two chemical species are carried by the fluid. The centrifugal forces, experienced as the fluid travels along a curved trajectory, coupled with the fluid incompressibility induce cross-sectional rotating flows (Dean vortices). These transversal flows promote the mixing of the chemical species. We generate images for different cross sections along the trajectory. The mixing efficiency is evaluated using the Shannon entropy. We have previously found [P. S. Fodor and M. Kaufman, Modern Physics Letters B 25, 1111 (2011)] this measure to be useful in understanding mixing in the staggered herringbone mixer. The mixing entropy is determined as a function of the Reynolds number, the angle of the cross section, and the observation scale (number of bins). A quantitative comparison of the mixing in the Dean micromixer and in the staggered herringbone mixer is attempted.
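
    A minimal sketch of the bin-based Shannon mixing entropy is given below. The bin count, the normalization (occupancy-weighted entropy in bits), and the toy tracer cloud are assumptions for illustration, not the exact procedure of the study.

```python
import numpy as np

def mixing_entropy(x, y, species, nbins=16):
    """Occupancy-weighted Shannon entropy (bits) of the species composition over
    an nbins x nbins partition of the cross-section; 1.0 = two species fully
    mixed at this observation scale, 0.0 = fully segregated."""
    xb = np.minimum((nbins * (x - x.min()) / (x.max() - x.min())).astype(int), nbins - 1)
    yb = np.minimum((nbins * (y - y.min()) / (y.max() - y.min())).astype(int), nbins - 1)
    H, N = 0.0, len(x)
    for b in np.unique(xb * nbins + yb):
        in_bin = (xb * nbins + yb) == b
        p = np.bincount(species[in_bin], minlength=2) / in_bin.sum()
        p = p[p > 0]
        H += in_bin.sum() * (-(p * np.log2(p)).sum())
    return H / N

# toy tracer cloud: two species initially segregated left/right in the cross-section
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 4000), rng.uniform(-1, 1, 4000)
species = (x > 0).astype(int)                 # 0 on the left, 1 on the right
print("entropy of the segregated state:", round(mixing_entropy(x, y, species), 3))
```

    As the Dean vortices fold the two streams into each other downstream, the same measure evaluated on later cross sections climbs toward 1.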

  20. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    PubMed

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploring the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.

  1. Using ordinal partition transition networks to analyze ECG data

    NASA Astrophysics Data System (ADS)

    Kulp, Christopher W.; Chobot, Jeremy M.; Freitas, Helena R.; Sprechini, Gene D.

    2016-07-01

    Electrocardiogram (ECG) data from patients with a variety of heart conditions are studied using ordinal pattern partition networks. The ordinal pattern partition networks are formed from the ECG time series by symbolizing the data into ordinal patterns. The ordinal patterns form the nodes of the network and edges are defined through the time ordering of the ordinal patterns in the symbolized time series. A network measure, called the mean degree, is computed from each time series-generated network. In addition, the entropy and the number of non-occurring ordinal patterns (NFP) are computed for each series. The distributions of mean degrees, entropies, and NFPs for each heart condition studied are compared. A statistically significant difference between healthy patients and several groups of unhealthy patients with varying heart conditions is found for the distributions of the mean degrees, unlike for any of the distributions of the entropies or NFPs.
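
    A minimal sketch of the pipeline described above is given below. The embedding dimension, delay, and the undirected mean-degree convention are assumptions for illustration; the study's exact network construction may differ.

```python
import numpy as np

def mean_degree(ts, d=4, tau=1):
    """Symbolize a time series into ordinal patterns, link temporally successive
    patterns, and return the mean degree of the resulting network."""
    ts = np.asarray(ts, dtype=float)
    n = len(ts) - (d - 1) * tau
    patterns = [tuple(np.argsort(ts[i:i + d * tau:tau])) for i in range(n)]
    nodes = set(patterns)
    edges = set(zip(patterns[:-1], patterns[1:]))   # one edge per observed transition
    return 2.0 * len(edges) / len(nodes)            # undirected mean degree

rng = np.random.default_rng(1)
ecg_like = np.sin(np.linspace(0, 40 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
print("mean degree:", round(mean_degree(ecg_like), 2))
```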

  2. Discontinuity minimization for omnidirectional video projections

    NASA Astrophysics Data System (ADS)

    Alshina, Elena; Zakharchenko, Vladyslav

    2017-09-01

    Advances in display technologies, both for head-mounted devices and for television panels, demand source-signal resolutions beyond 4K in virtual reality video streaming applications. This poses a problem for content delivery through bandwidth-limited distribution networks. Considering that the source signal covers the entire surrounding space, our investigation revealed that compression efficiency may fluctuate by 40% on average depending on the origin selected at the stage of conversion from 3D space to a 2D projection. Based on this knowledge, an origin selection algorithm for video compression applications has been proposed. Using a discontinuity entropy minimization function, the projection origin rotation may be chosen to provide optimal compression results. The outcome of this research may be applied across various video compression solutions for omnidirectional content.

  3. Dynamic Cross-Entropy.

    PubMed

    Aur, Dorian; Vila-Rodriguez, Fidel

    2017-01-01

    Complexity measures for time series have been used in many applications to quantify the regularity of one-dimensional time series; however, many dynamical systems are spatially distributed multidimensional systems. We introduced Dynamic Cross-Entropy (DCE), a novel multidimensional complexity measure that quantifies the degree of regularity of EEG signals in selected frequency bands. Time series generated by discrete logistic equations with varying control parameter r are used to test DCE measures. Sliding window DCE analyses are able to reveal specific period doubling bifurcations that lead to chaos. A similar behavior can be observed in seizures triggered by electroconvulsive therapy (ECT). Sample entropy data show the level of signal complexity in different phases of the ictal ECT. The transition to irregular activity is preceded by the occurrence of cyclic regular behavior. A significant increase of DCE values in successive order from high frequencies in the gamma band to low frequencies in the delta band reveals several phase transitions into less ordered states, possibly chaos in the human brain. To our knowledge there are no reliable techniques able to reveal the transition to chaos in the case of multidimensional time series. In addition, DCE based on sample entropy appears to be robust to EEG artifacts compared to DCE based on Shannon entropy. The applied technique may offer new approaches to better understand nonlinear brain activity. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. On the use of band-target entropy minimization to simplify the interpretation of two-dimensional correlation spectroscopy.

    PubMed

    Widjaja, Effendi; Tan, Boon Hong; Garland, Marc

    2006-03-01

    Two-dimensional (2D) correlation spectroscopy has been extensively applied to analyze various vibrational spectroscopic data, especially infrared and Raman. However, when it is applied to real-world experimental data, which often contains various imperfections (such as noise interference, baseline fluctuations, and band-shifting) and highly overlapping bands, many artifacts and misleading features in synchronous and asynchronous maps will emerge, and this will lead to difficulties with interpretation. Therefore, an approach that counters many artifacts and therefore leads to simplified interpretation of 2D correlation analysis is certainly useful. In the present contribution, band-target entropy minimization (BTEM) is employed as a spectral pretreatment to handle many of the artifact problems before the application of 2D correlation analysis. BTEM is employed to elucidate the pure component spectra of mixtures and their corresponding concentration profiles. Two alternate forms of analysis result. In the first, the normally vxv problem is converted to an equivalent nvxnv problem, where n represents the number of species present. In the second, the pure component spectra are transformed into simple distributions, and an equivalent and less computationally intensive nv'xnv' problem results (v'

  5. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    PubMed

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distribution of errors, this approach is not optimal for performance. Therefore, rather than using probabilistic modeling, we propose an alternate non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts) we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
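
    The sketch below illustrates the MEE idea on a linear decoder: the quadratic information potential of the errors is estimated with a Gaussian kernel, and the weights are adapted by gradient ascent on that potential (equivalently, descent on Rényi's quadratic error entropy). The kernel width, learning rate, and toy spike-count data are assumptions, and this is not the paper's FPGA formulation.

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """Parzen estimate of the quadratic information potential V(e); minimizing
    the quadratic error entropy H2 = -log V(e) is the same as maximizing V(e)."""
    d = e[:, None] - e[None, :]
    return np.mean(np.exp(-d**2 / (4 * sigma**2)))

def mee_step(w, X, y, lr=0.05, sigma=1.0):
    """One batch gradient-ascent step on V(e) for a linear decoder y_hat = X @ w."""
    e = y - X @ w
    d = e[:, None] - e[None, :]
    g = np.exp(-d**2 / (4 * sigma**2)) * d            # kernel-weighted error differences
    grad = (g[:, :, None] * (X[:, None, :] - X[None, :, :])).mean(axis=(0, 1))
    return w + lr * grad / (2 * sigma**2)

# toy decoding problem: 3 "neurons" (Poisson spike counts) -> 1-D movement signal
rng = np.random.default_rng(0)
X = rng.poisson(5.0, size=(200, 3)).astype(float)
y = X @ np.array([0.4, -0.2, 0.1]) + 0.3 * rng.standard_normal(200)
w = np.zeros(3)
for _ in range(300):
    w = mee_step(w, X, y)
print("information potential of residuals:", round(information_potential(y - X @ w), 3))
print("MEE-adapted weights:", np.round(w, 2))   # should move toward [0.4, -0.2, 0.1]
```

    The all-pairs kernel sums over errors are exactly the independent blocks that the paper pipelines and parallelizes in hardware.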

  6. Device-independent characterizations of a shared quantum state independent of any Bell inequalities

    NASA Astrophysics Data System (ADS)

    Wei, Zhaohui; Sikora, Jamie

    2017-03-01

    In a Bell experiment two parties share a quantum state and perform local measurements on their subsystems separately, and the statistics of the measurement outcomes are recorded as a Bell correlation. For any Bell correlation, it turns out that a quantum state with minimal size that is able to produce this correlation can always be pure. In this work, we first exhibit two device-independent characterizations for the pure state that Alice and Bob share using only the correlation data. Specifically, we give two conditions that the Schmidt coefficients must satisfy, which can be tight, and have various applications in quantum tasks. First, one of the characterizations allows us to bound the entanglement between Alice and Bob using Renyi entropies and also to bound the underlying Hilbert space dimension. Second, when the Hilbert space dimension bound is tight, the shared pure quantum state has to be maximally entangled. Third, the second characterization gives a sufficient condition that a Bell correlation cannot be generated by particular quantum states. We also show that our results can be generalized to the case of shared mixed states.

  7. Dynamical noise filter and conditional entropy analysis in chaos synchronization.

    PubMed

    Wang, Jiao; Lai, C-H

    2006-06-01

    It is shown that, in a chaotic synchronization system whose driving signal is exposed to channel noise, the estimation of the drive system states can be greatly improved by applying the dynamical noise filtering to the response system states. If the noise is bounded in a certain range, the estimation errors, i.e., the difference between the filtered responding states and the driving states, can be made arbitrarily small. This property can be used in designing an alternative digital communication scheme. An analysis based on the conditional entropy justifies the application of dynamical noise filtering in generating quality synchronization.

  8. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    PubMed

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b.
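
    The structure of such a two-box MEP calculation is sketched below. All numbers (the linearized outgoing-radiation law A + B·T and the absorbed shortwave fluxes of the two boxes) are assumed, roughly Earth-like values, not those of the paper; the point is only that the inter-box heat flow F is selected by maximizing the entropy production of the transport.

```python
import numpy as np

A, B = -390.0, 2.17              # outgoing longwave ~ A + B*T (T in K), assumed values
S_warm, S_cold = 300.0, 160.0    # absorbed shortwave in each box, W m^-2, assumed

def box_temperatures(F):
    """Steady-state energy balance of each box for a given inter-box transport F."""
    T_warm = (S_warm - F - A) / B
    T_cold = (S_cold + F - A) / B
    return T_warm, T_cold

F = np.linspace(0.0, 70.0, 2001)                 # candidate transports, W m^-2
Tw, Tc = box_temperatures(F)
sigma = F * (1.0 / Tc - 1.0 / Tw)                # entropy production of the transport
F_mep = F[np.argmax(sigma)]

Tw_mep, Tc_mep = box_temperatures(F_mep)
print(f"MEP transport F = {F_mep:.1f} W m^-2")
print(f"box temperatures: warm {Tw_mep:.1f} K, cold {Tc_mep:.1f} K")
```

    With these assumed numbers the maximum falls at a transport of roughly 35 W m^-2 and an equator-pole contrast of about 30 K, illustrating the kind of first-order estimate the abstract refers to.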

  9. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies

    PubMed Central

    Lorenz, Ralph D.

    2010-01-01

    The ‘two-box model’ of planetary climate is discussed. This model has been used to demonstrate consistency of the equator–pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b. PMID:20368253

  10. Shock melting and vaporization of metals.

    NASA Technical Reports Server (NTRS)

    Ahrens, T. J.

    1972-01-01

    The effect of initial porosity on shock induction of melting and vaporization is investigated for Ba, Sr, Li, Fe, Al, U, and Th. For the less compressible of these metals, it is found that for a given strong shock-generation system (explosive in contact, or flyer-plate impact) an optimum initial specific volume exists such that the total entropy production, and hence the amount of metal liquid or vapor, is a maximum. Initial volumes from 1.4 to 2.0 times crystal volumes, depending on the metal sample and shock-inducing system, will result in optimum post-shock entropies.

  11. A survey of the role of thermodynamic stability in viscous flow

    NASA Technical Reports Server (NTRS)

    Horne, W. C.; Smith, C. A.; Karamcheti, K.

    1991-01-01

    The stability of near-equilibrium states has been studied as a branch of the general field of nonequilibrium thermodynamics. By treating steady viscous flow as an open thermodynamic system, nonequilibrium principles such as the condition of minimum entropy-production rate for steady, near-equilibrium processes can be used to generate flow distributions from variational analyses. Examples considered in this paper are steady heat conduction, channel flow, and unconstrained three-dimensional flow. The entropy-production-rate condition has also been used for hydrodynamic stability criteria, and calculations of the stability of a laminar wall jet support this interpretation.

  12. Local subsystems in gauge theory and gravity

    DOE PAGES

    Donnelly, William; Freidel, Laurent

    2016-09-16

    We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.

  13. Quantifying control effort of biological and technical movements: an information-entropy-based approach.

    PubMed

    Haeufle, D F B; Günther, M; Wunner, G; Schmitt, S

    2014-01-01

    In biomechanics and biorobotics, muscles are often associated with reduced movement control effort and simplified control compared to technical actuators. This is based on evidence that the nonlinear muscle properties positively influence movement control. It is, however, open how to quantify the simplicity aspect of control effort and compare it between systems. Physical measures, such as energy consumption, stability, or jerk, have already been applied to compare biological and technical systems. Here a physical measure of control effort based on information entropy is presented. The idea is that control is simpler if a specific movement is generated with less processed sensor information, depending on the control scheme and the physical properties of the systems being compared. By calculating the Shannon information entropy of all sensor signals required for control, an information cost function can be formulated allowing the comparison of models of biological and technical control systems. Exemplarily applied to (bio-)mechanical models of hopping, the method reveals that the required information for generating hopping with a muscle driven by a simple reflex control scheme is only I=32 bits versus I=660 bits with a DC motor and a proportional differential controller. This approach to quantifying control effort captures the simplicity of a control scheme and can be used to compare completely different actuators and control approaches.
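
    The bookkeeping behind such an information cost can be sketched as below: quantize each sensor signal the controller reads and sum the Shannon entropies. The quantization depth, the toy signals, and the two-controller comparison are assumptions for illustration; the bit counts quoted in the abstract come from the paper's own models, not from this sketch.

```python
import numpy as np

def sensor_information_bits(signal, n_levels=32):
    """Shannon entropy (bits per sample) of a uniformly quantized sensor signal."""
    counts, _ = np.histogram(signal, bins=n_levels)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

# toy signals: a reflex-like controller reads only muscle length, while a PD
# controller reads both position and velocity
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 2000)
length = 0.05 * np.sin(2 * np.pi * 3 * t) + 0.002 * rng.standard_normal(t.size)
position = np.cos(2 * np.pi * 3 * t)
velocity = -np.sin(2 * np.pi * 3 * t)

I_reflex = sensor_information_bits(length)
I_pd = sensor_information_bits(position) + sensor_information_bits(velocity)
print(f"reflex-like controller: {I_reflex:.1f} bits/sample of processed sensor information")
print(f"PD controller:          {I_pd:.1f} bits/sample of processed sensor information")
```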

  14. The Interplay between Proto--Neutron Star Convection and Neutrino Transport in Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Mezzacappa, A.; Calder, A. C.; Bruenn, S. W.; Blondin, J. M.; Guidry, M. W.; Strayer, M. R.; Umar, A. S.

    1998-01-01

    We couple two-dimensional hydrodynamics to realistic one-dimensional multigroup flux-limited diffusion neutrino transport to investigate proto-neutron star convection in core-collapse supernovae, and more specifically, the interplay between its development and neutrino transport. Our initial conditions, time-dependent boundary conditions, and neutrino distributions for computing neutrino heating, cooling, and deleptonization rates are obtained from one-dimensional simulations that implement multigroup flux-limited diffusion and one-dimensional hydrodynamics. The development and evolution of proto-neutron star convection are investigated for both 15 and 25 M⊙ models, representative of the two classes of stars with compact and extended iron cores, respectively. For both models, in the absence of neutrino transport, the angle-averaged radial and angular convection velocities in the initial Ledoux unstable region below the shock after bounce achieve their peak values in ~20 ms, after which they decrease as the convection in this region dissipates. The dissipation occurs as the gradients are smoothed out by convection. This initial proto-neutron star convection episode seeds additional convectively unstable regions farther out beneath the shock. The additional proto-neutron star convection is driven by successive negative entropy gradients that develop as the shock, in propagating out after core bounce, is successively strengthened and weakened by the oscillating inner core. The convection beneath the shock distorts its sphericity, but on the average the shock radius is not boosted significantly relative to its radius in our corresponding one-dimensional models. In the presence of neutrino transport, proto-neutron star convection velocities are too small relative to bulk inflow velocities to result in any significant convective transport of entropy and leptons. This is evident in our two-dimensional entropy snapshots, which in this case appear spherically symmetric. The peak angle-averaged radial and angular convection velocities are orders of magnitude smaller than they are in the corresponding ``hydrodynamics-only'' models. A simple analytical model supports our numerical results, indicating that the inclusion of neutrino transport reduces the entropy-driven (lepton-driven) convection growth rates and asymptotic velocities by a factor ~3 (50) at the neutrinosphere and a factor ~250 (1000) at ρ = 10^12 g cm^-3, for both our 15 and 25 M⊙ models. Moreover, when transport is included, the initial postbounce entropy gradient is smoothed out by neutrino diffusion, whereas the initial lepton gradient is maintained by electron capture and neutrino escape near the neutrinosphere. Despite the maintenance of the lepton gradient, proto-neutron star convection does not develop over the 100 ms duration typical of all our simulations, except in the instance where ``low-test'' initial conditions are used, which are generated by core-collapse and bounce simulations that neglect neutrino-electron scattering and ion-ion screening corrections to neutrino-nucleus elastic scattering. We also considered models favoring the development of proto-neutron star convection, either by starting with more favorable, albeit artificial (low-test), initial conditions or by including transport corrections that were ignored in our ``fiducial'' models. Our conclusions nonetheless remained the same. Evidence of proto-neutron star convection in our two-dimensional entropy snapshots was minimal, and, as in our fiducial models, the angle-averaged convective velocities when neutrino transport was included remained orders of magnitude smaller than their counterparts in the corresponding hydrodynamics-only models.

  15. Transitions in eigenvalue and wavefunction structure in (1+2) -body random matrix ensembles with spin.

    PubMed

    Vyas, Manan; Kota, V K B; Chavda, N D

    2010-03-01

    Finite interacting Fermi systems with a mean-field and a chaos generating two-body interaction are modeled by one plus two-body embedded Gaussian orthogonal ensemble of random matrices with spin degree of freedom [called EGOE(1+2)-s]. Numerical calculations are used to demonstrate that, as λ, the strength of the interaction (measured in the units of the average spacing of the single-particle levels defining the mean-field), increases, generically there is Poisson to GOE transition in level fluctuations, Breit-Wigner to Gaussian transition in strength functions (also called local density of states) and also a duality region where information entropy will be the same in both the mean-field and interaction defined basis. Spin dependence of the transition points λ_c, λ_F, and λ_d, respectively, is described using the propagator for the spectral variances and the formula for the propagator is derived. We further establish that the duality region corresponds to a region of thermalization. For this purpose we compared the single-particle entropy defined by the occupancies of the single-particle orbitals with thermodynamic entropy and information entropy for various λ values, and they are very close to each other at λ = λ_d.

  16. Gravitational entropy and the cosmological no-hair conjecture

    NASA Astrophysics Data System (ADS)

    Bolejko, Krzysztof

    2018-04-01

    The gravitational entropy and no-hair conjectures seem to predict contradictory future states of our Universe. The growth of the gravitational entropy is associated with the growth of inhomogeneity, while the no-hair conjecture argues that a universe dominated by dark energy should asymptotically approach a homogeneous and isotropic de Sitter state. The aim of this paper is to study these two conjectures. The investigation is based on the Simsilun simulation, which simulates the universe using the approximation of the Silent Universe. The Silent Universe is a solution to the Einstein equations that assumes irrotational, nonviscous, and insulated dust, with vanishing magnetic part of the Weyl curvature. The initial conditions for the Simsilun simulation are sourced from the Millennium simulation, which results in a realistically appearing but, at its origin, relativistic simulation of a universe. The Simsilun simulation is evolved from the early universe (t = 25 Myr) until the far future (t = 1000 Gyr). The results of this investigation show that both conjectures are correct. On global scales, a universe with a positive cosmological constant and nonpositive spatial curvature does indeed approach the de Sitter state. At the same time it keeps generating gravitational entropy.

  17. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov, and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
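
    A compact sketch of the NSRPS-based ETC measure is given below, assuming an integer alphabet and breaking ties between equally frequent pairs by first occurrence; the reference implementation may differ in such details.

```python
from collections import Counter

def etc(seq):
    """Effort To Compress: number of NSRPS iterations needed to reduce the
    sequence to a constant (or single-symbol) sequence."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        # most frequent pair of adjacent symbols
        pair = Counter(zip(seq[:-1], seq[1:])).most_common(1)[0][0]
        new_symbol = max(seq) + 1          # assumes an integer alphabet
        out, i = [], 0
        while i < len(seq):
            # replace non-overlapping occurrences of the pair, left to right
            if i < len(seq) - 1 and (seq[i], seq[i + 1]) == pair:
                out.append(new_symbol)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps

print(etc([0, 1, 0, 1, 0, 1, 0, 1]))   # highly regular -> few iterations
print(etc([0, 1, 1, 0, 0, 0, 1, 0]))   # less regular  -> more iterations
```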

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Wei; Wang, Jin, E-mail: jin.wang.1@stonybrook.edu; State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, 130022 Changchun, China and College of Physics, Jilin University, 130021 Changchun

    We have established a general non-equilibrium thermodynamic formalism consistently applicable to both spatially homogeneous and, more importantly, spatially inhomogeneous systems, governed by the Langevin and Fokker-Planck stochastic dynamics with multiple state transition mechanisms, using the potential-flux landscape framework as a bridge connecting stochastic dynamics with non-equilibrium thermodynamics. A set of non-equilibrium thermodynamic equations, quantifying the relations of the non-equilibrium entropy, entropy flow, entropy production, and other thermodynamic quantities, together with their specific expressions, is constructed from a set of dynamical decomposition equations associated with the potential-flux landscape framework. The flux velocity plays a pivotal role on both the dynamic and thermodynamic levels. On the dynamic level, it represents a dynamic force breaking detailed balance, entailing the dynamical decomposition equations. On the thermodynamic level, it represents a thermodynamic force generating entropy production, manifested in the non-equilibrium thermodynamic equations. The Ornstein-Uhlenbeck process and more specific examples, the spatial stochastic neuronal model, in particular, are studied to test and illustrate the general theory. This theoretical framework is particularly suitable to study the non-equilibrium (thermo)dynamics of spatially inhomogeneous systems abundant in nature. This paper is the second of a series.

  19. Temporal and Spatial Evolution Characteristics of Disturbance Wave in a Hypersonic Boundary Layer due to Single-Frequency Entropy Disturbance

    PubMed Central

    Lv, Hongqing; Shi, Jianqiang

    2014-01-01

    By using a high-order accurate finite difference scheme, direct numerical simulation of hypersonic flow over an 8° half-wedge-angle blunt wedge under freestream single-frequency entropy disturbance is conducted; the generation and the temporal and spatial nonlinear evolution of boundary layer disturbance waves are investigated. Results show that, under the freestream single-frequency entropy disturbance, the entropy state of the boundary layer changes sharply and disturbance waves within a certain frequency range are induced in the boundary layer. Furthermore, the amplitudes of the disturbance waves in the period phase are larger than those in the response and ablation phases, and the frequency range in the boundary layer in the period phase is narrower than in these two phases. In addition, mode competition, dominant-mode transformation, and disturbance energy transfer exist among different modes in both the temporal and the spatial evolution. The mode competition changes the characteristics of the nonlinear evolution of the unstable waves in the boundary layer. The development of the most unstable mode along the streamwise direction depends more strongly on the forcing by upstream disturbance waves than that of the other modes. PMID:25143983

  20. Temporal and spatial evolution characteristics of disturbance wave in a hypersonic boundary layer due to single-frequency entropy disturbance.

    PubMed

    Wang, Zhenqing; Tang, Xiaojun; Lv, Hongqing; Shi, Jianqiang

    2014-01-01

    By using a high-order accurate finite difference scheme, direct numerical simulation of hypersonic flow over an 8° half-wedge-angle blunt wedge under freestream single-frequency entropy disturbance is conducted; the generation and the temporal and spatial nonlinear evolution of boundary layer disturbance waves are investigated. Results show that, under the freestream single-frequency entropy disturbance, the entropy state of the boundary layer changes sharply and disturbance waves within a certain frequency range are induced in the boundary layer. Furthermore, the amplitudes of the disturbance waves in the period phase are larger than those in the response and ablation phases, and the frequency range in the boundary layer in the period phase is narrower than in these two phases. In addition, mode competition, dominant-mode transformation, and disturbance energy transfer exist among different modes in both the temporal and the spatial evolution. The mode competition changes the characteristics of the nonlinear evolution of the unstable waves in the boundary layer. The development of the most unstable mode along the streamwise direction depends more strongly on the forcing by upstream disturbance waves than that of the other modes.

  1. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found from the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by an evolution in the degree of organization of the complex eddy motions, which is also characterized by progressively smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated by a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism generating the instability fluctuations, which is noise-induced chaos.

  2. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found from the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by an evolution in the degree of organization of the complex eddy motions, which is also characterized by progressively smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated by a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism generating the instability fluctuations, which is noise-induced chaos.
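
    The two symbolic quantifiers used in the two records above (and again in the flux-rope study later in this list) can be sketched compactly. The snippet below is a minimal, illustrative implementation of Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity for a scalar time series; the embedding order d, the delay, and the numerical normalization by the maximum Jensen-Shannon divergence are common conventions and are not taken from these papers.

        import numpy as np
        from itertools import permutations

        def ordinal_distribution(x, d=4, tau=1):
            """Probability of each ordinal pattern of order d (Bandt-Pompe symbolization)."""
            patterns = {p: 0 for p in permutations(range(d))}
            n = len(x) - (d - 1) * tau
            for i in range(n):
                window = x[i:i + d * tau:tau]
                patterns[tuple(np.argsort(window))] += 1
            p = np.array(list(patterns.values()), dtype=float)
            return p / p.sum()

        def shannon(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def permutation_entropy(p):
            """Normalized permutation entropy H in [0, 1]."""
            return shannon(p) / np.log(len(p))

        def statistical_complexity(p):
            """Jensen-Shannon statistical complexity C = Q_J * H."""
            n = len(p)
            uniform = np.full(n, 1.0 / n)
            js = shannon(0.5 * (p + uniform)) - 0.5 * shannon(p) - 0.5 * shannon(uniform)
            # Normalize by the maximum JS divergence, reached for a delta distribution.
            delta = np.zeros(n); delta[0] = 1.0
            js_max = shannon(0.5 * (delta + uniform)) - 0.5 * shannon(delta) - 0.5 * shannon(uniform)
            return (js / js_max) * permutation_entropy(p)

        rng = np.random.default_rng(0)
        noise = rng.normal(size=5000)            # white noise: H near 1, C near 0
        x = np.empty(5000); x[0] = 0.4
        for i in range(1, 5000):                 # logistic map: chaotic, lower H, higher C
            x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
        for series in (noise, x):
            p = ordinal_distribution(series, d=4)
            print(permutation_entropy(p), statistical_complexity(p))

    In a complexity-entropy causality plane built from these two numbers, uncorrelated noise and low-dimensional chaos land in different regions, which is the discrimination exploited in the records above.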

  3. Design of Novel Precipitate-Strengthened Al-Co-Cr-Fe-Nb-Ni High-Entropy Superalloys

    NASA Astrophysics Data System (ADS)

    Antonov, Stoichko; Detrois, Martin; Tin, Sammy

    2018-01-01

    A series of non-equiatomic Al-Co-Cr-Fe-Nb-Ni high-entropy alloys, with varying levels of Co, Nb and Fe, were investigated in an effort to obtain microstructures similar to those of conventional Ni-based superalloys. Elevated levels of Co were observed to significantly decrease the solvus temperature of the γ' precipitates. Both Nb and Co in excessive concentrations promoted the formation of Laves and NiAl phases, which formed either during solidification, remaining undissolved during homogenization, or upon high-temperature aging. Lowering the content of Nb, Co, or Fe prevented the formation of the eutectic-type Laves phase. In addition, lowering the Co content resulted in a higher number density and volume fraction of the γ' precipitates, while increasing the Fe content led to the destabilization of the γ' precipitates. Various aging treatments were performed, which led to different size distributions of the strengthening phase. Results from the microstructural characterization and hardness assessments of these high-entropy alloys were compared to those of a commercial, high-strength Ni-based superalloy, RR1000. Potentially, precipitation-strengthened high-entropy alloys could find applications replacing Ni-based superalloys as structural materials in power generation applications.

  4. Information entropy of humpback whale songs.

    PubMed

    Suzuki, Ryuji; Buck, John R; Tyack, Peter L

    2006-03-01

    The structure of humpback whale (Megaptera novaeangliae) songs was examined using information theory techniques. The song is an ordered sequence of individual sound elements separated by gaps of silence. Song samples were converted into sequences of discrete symbols by both human and automated classifiers. This paper analyzes the song structure in these symbol sequences using information entropy estimators and autocorrelation estimators. Both parametric and nonparametric entropy estimators are applied to the symbol sequences representing the songs. The results provide quantitative evidence consistent with the hierarchical structure proposed for these songs by Payne and McVay [Science 173, 587-597 (1971)]. Specifically, this analysis demonstrates that: (1) There is a strong structural constraint, or syntax, in the generation of the songs, and (2) the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units. This implies that no empirical Markov model is capable of representing the songs' structure. The results are robust to the choice of either human or automated song-to-symbol classifiers. In addition, the entropy estimates indicate that the maximum amount of information that could be communicated by the sequence of sounds made is less than 1 bit per second.
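
    The simplest nonparametric probe behind analyses of this kind is the per-symbol block entropy of the symbol sequence: if the normalized block entropy H(n)/n drops as the block length n grows, the sequence carries structural constraints (a "syntax"). The sketch below is a naive plug-in estimate in Python, not the specific parametric and nonparametric estimators used by the authors, and it ignores the finite-sample bias corrections a real study would need.

        import numpy as np
        from collections import Counter

        def block_entropy(symbols, n):
            """Naive plug-in estimate of the Shannon entropy (bits) of length-n blocks."""
            blocks = Counter(tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1))
            counts = np.array(list(blocks.values()), dtype=float)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def entropy_rate_estimates(symbols, max_n=4):
            """H(n)/n for n = 1..max_n; a drop with n signals structural constraints (syntax)."""
            return [block_entropy(symbols, n) / n for n in range(1, max_n + 1)]

        # Example: an i.i.d. sequence vs. a strongly constrained (repetitive) one.
        rng = np.random.default_rng(1)
        iid = rng.integers(0, 4, size=5000).tolist()
        patterned = [0, 1, 2, 3] * 1250
        print(entropy_rate_estimates(iid))        # stays near log2(4) = 2 bits/symbol
        print(entropy_rate_estimates(patterned))  # falls well below 2 bits/symbol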

  5. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In the current effort to quantify the dynamical properties of complex phenomena in financial market systems, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated with numerical simulations of two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the Chinese stock markets is quantified through the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as the trading day progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of the global stock markets (Asia, Europe and America) is quantified by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.

  6. Nonlinear Complexity Analysis of Brain fMRI Signals in Schizophrenia

    PubMed Central

    Sokunbi, Moses O.; Gradin, Victoria B.; Waiter, Gordon D.; Cameron, George G.; Ahearn, Trevor S.; Murray, Alison D.; Steele, Douglas J.; Staff, Roger T.

    2014-01-01

    We investigated the differences in brain fMRI signal complexity in patients with schizophrenia while performing the Cyberball social exclusion task, using measures of sample entropy and the Hurst exponent (H). Thirteen patients meeting Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) criteria for schizophrenia and 16 healthy controls underwent fMRI scanning at 1.5 T. The fMRI data of both groups of participants were pre-processed, the entropy characterized and the Hurst exponent extracted. Whole-brain entropy and H maps of the groups were generated and analysed. The results, after adjusting for age and sex differences, show that patients with schizophrenia exhibited higher complexity than healthy controls at mean whole-brain and regional levels. Also, both sample entropy and the Hurst exponent agree that patients with schizophrenia have more complex fMRI signals than healthy controls. These results suggest that schizophrenia is associated with more complex signal patterns when compared to healthy controls, supporting the increase-in-complexity hypothesis, where system complexity increases with age or disease, and are also consistent with the notion that schizophrenia is characterised by a dysregulation of the nonlinear dynamics of underlying neuronal systems. PMID:24824731

  7. On Entropy Generation and the Effect of Heat and Mass Transfer Coupling in a Distillation Process

    NASA Astrophysics Data System (ADS)

    Burgos-Madrigal, Paulina; Mendoza, Diego F.; López de Haro, Mariano

    2018-01-01

    The entropy production rates as obtained from the exergy analysis, entropy balance and the nonequilibrium thermodynamics approach are compared for two distillation columns. The first case is a depropanizer column involving a mixture of ethane, propane, n-butane and n-pentane. The other is a weighed sample of Mexican crude oil distilled with a pilot scale fractionating column. The composition, temperature and flow profiles, for a given duty and operating conditions in each column, are obtained with the Aspen Plus V8.4 software by using the RateFrac model with a rate-based nonequilibrium column. For the depropanizer column the highest entropy production rate is found in the central trays where most of the mass transfer occurs, while in the second column the highest values correspond to the first three stages (where the vapor mixture is in contact with the cold liquid reflux), and to the last three stages (where the highest temperatures take place). The importance of the explicit inclusion of thermal diffusion in these processes is evaluated. In the depropanizer column, the effect of the coupling between heat and mass transfer is found to be negligible, while for the fractionating column it becomes appreciable.

  8. Chaos in Magnetic Flux Ropes

    NASA Astrophysics Data System (ADS)

    Gekelman, W. N.; DeHaas, T.; Van Compernolle, B.

    2013-12-01

    Magnetic flux ropes immersed in a uniform magnetoplasma are observed to twist about themselves, writhe about each other and rotate about a central axis. They are kink unstable and smash into one another as they move. The full three-dimensional magnetic field and flows are measured at thousands of time steps. Each collision results in magnetic field line generation, the formation of a quasi-separatrix layer and induced electric fields. Three-dimensional magnetic field lines are computed by conditionally averaging the data using correlation techniques. The permutation entropy [1], which is related to the Lyapunov exponent, can be calculated from the time series of the magnetic field data (this is also done with flows) and used to place the data on a Jensen-Shannon complexity map [2]. The location of data on this map indicates whether the magnetic fields are stochastic, or fall into regions of minimal or maximal complexity. The complexity is a function of space and time. The complexity map and the analysis will be explained in the course of the talk. Other types of chaotic dynamical models, such as the Lorenz, Gissinger and Hénon processes, also fall on the map and can give a clue to the nature of the flux rope turbulence. The ropes fall in the region of the C-H plane where chaotic systems lie. The entropy and complexity change in space and time, which reflects the change, and possibly the type, of chaos associated with the ropes. The maps give insight as to the type of chaos (deterministic chaos, fractional diffusion, Lévy flights, ...) and the underlying dynamical process. The power spectra of much of the magnetic and flow data are exponential, and Lorentzian structures in the time domain are embedded in them. Other quantities such as the Hurst exponent are evaluated for both the magnetic fields and the plasma flow. Work supported by a UC-LANL Lab fund and the Basic Plasma Science Facility, which is funded by DOE and NSF. [1] C. Bandt, B. Pompe, Phys. Rev. Lett. 88, 174102 (2002). [2] O. A. Rosso et al., Phys. Rev. Lett. 99, 154102 (2007); J. Maggs, G. Morales, 55, 085015 (2013).

  9. Entropy Generation Analysis through Helical Coil Heat Exchanger in an Agitated Vessel

    NASA Astrophysics Data System (ADS)

    Ashok Reddy, K.

    2018-03-01

    Entropy generation data have been obtained from experiments with Newtonian and non-Newtonian fluids at sodium carboxymethyl cellulose concentrations of 0.05%, 0.1%, 0.15% and 0.2%, with the test fluid passed at different flow rates through a helical coil in a mixing vessel agitated by a paddle impeller. Heating of the fluids depends on the operational parameters, the geometry of the mixing vessel and the type of impeller used. A new heating element was designed and fabricated by inserting Kanthal wire into a glove knitted with fiberglass yarn, since glass fabric is flexible, heat resistant and can accommodate small differences in the size of the vessel. The knitted fabric is made to the shape of the vessel used in the experiment, and the heating elements are inserted so that they become embedded and form part of the glove knitted from fiberglass yarn.

  10. Thermodynamic properties of a liquid crystal carbosilane dendrimer

    NASA Astrophysics Data System (ADS)

    Samosudova, Ya. S.; Markin, A. V.; Smirnova, N. N.; Ogurtsov, T. G.; Boiko, N. I.; Shibaev, V. P.

    2016-11-01

    The temperature dependence of the heat capacity of a first-generation liquid crystal carbosilane dendrimer with methoxyphenyl benzoate end groups is studied for the first time in the region of 6-370 K by means of precision adiabatic vacuum calorimetry. Physical transformations are observed in this interval of temperatures, and their standard thermodynamic characteristics are determined and discussed. The standard thermodynamic functions C_p°(T), H°(T) − H°(0), S°(T) − S°(0), and G°(T) − H°(0) are calculated from the obtained experimental data for the region from T → 0 to 370 K. The standard entropy of formation of the dendrimer in the partially crystalline state at T = 298.15 K is calculated, and the standard entropy of the hypothetical reaction of its synthesis at this temperature is estimated. The thermodynamic properties of the studied dendrimer are compared to those of second- and fourth-generation liquid crystal carbosilane dendrimers with the same end groups studied earlier.
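
    For readers unfamiliar with the tabulated functions, they are connected in the usual way through G = H − TS; the Gibbs function listed above follows from the other two provided the residual entropy S°(0) is taken into account (under a third-law assumption, S°(0) = 0, the bracketed entropy difference is simply S°(T)):

        G^\circ(T) - H^\circ(0) \;=\; \bigl[H^\circ(T) - H^\circ(0)\bigr] \;-\; T\,\Bigl(\bigl[S^\circ(T) - S^\circ(0)\bigr] + S^\circ(0)\Bigr).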

  11. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

    Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.

  12. Research and implementation of group animation based on normal cloud model

    NASA Astrophysics Data System (ADS)

    Li, Min; Wei, Bin; Peng, Bao

    2011-12-01

    Group animation is a difficult problem in computer animation that has not yet been fully solved; all current methods have their limitations. This paper puts forward a method in which the motion coordinates and motion speeds of a real fish school are collected as sample data, and a reverse cloud generator is designed and run to obtain the expectation, entropy and hyper-entropy, which are the quantitative values of the qualitative concept. Using these parameters as a basis, a forward cloud generator is designed and run to produce the motion coordinates and motion speeds of a two-dimensional fish-school animation. Two mental-state variables of the fish school, the feeling of hunger and the feeling of fear, are also designed. An experiment is used to simulate the motion state of the fish-school animation as affected by the internal and external causes above; it shows that the group animation designed by this method is highly realistic.
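
    A minimal sketch of the reverse (backward) and forward normal cloud generators mentioned above, under one common moment-based formulation of the normal cloud model; the estimators used here for the entropy (from the mean absolute deviation) and the hyper-entropy (from the variance surplus) are a standard textbook choice and may differ from the paper's implementation. Names and the sample data are illustrative only.

        import numpy as np

        rng = np.random.default_rng(42)

        def backward_cloud(samples):
            """Estimate (Ex, En, He) of a normal cloud from observed drops (moment method)."""
            x = np.asarray(samples, dtype=float)
            ex = x.mean()
            en = np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - ex))    # entropy from mean absolute deviation
            he = np.sqrt(max(x.var(ddof=1) - en**2, 0.0))          # hyper-entropy from the variance surplus
            return ex, en, he

        def forward_cloud(ex, en, he, n):
            """Generate n cloud drops (value, certainty degree) from Ex, En, He."""
            enn = np.abs(rng.normal(en, he, size=n))       # per-drop entropy, perturbed by the hyper-entropy
            x = rng.normal(ex, enn)                        # drop positions
            mu = np.exp(-(x - ex) ** 2 / (2.0 * enn ** 2)) # certainty degree of each drop
            return x, mu

        # Example: recover cloud parameters from "observed" fish speeds, then synthesize new ones.
        observed_speed = rng.normal(1.2, 0.3, size=500)    # stand-in for sampled fish-school speeds
        ex, en, he = backward_cloud(observed_speed)
        synthetic_speed, membership = forward_cloud(ex, en, he, 200)
        print(ex, en, he)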

  13. A desalination battery.

    PubMed

    Pasta, Mauro; Wessells, Colin D; Cui, Yi; La Mantia, Fabio

    2012-02-08

    Water desalination is an important approach to providing fresh water around the world, although its high energy consumption, and thus high cost, call for new, efficient technology. Here, we demonstrate the novel concept of a "desalination battery", which operates by performing cycles in reverse on our previously reported mixing entropy battery. Rather than generating electricity from salinity differences, as in mixing entropy batteries, desalination batteries use an electrical energy input to extract sodium and chloride ions from seawater and to generate fresh water. The desalination battery consists of a Na(2-x)Mn(5)O(10) nanorod positive electrode and an Ag/AgCl negative electrode. Here, we demonstrate an energy consumption of 0.29 Wh l(-1) for the removal of 25% of the salt using this novel desalination battery, which is promising when compared to reverse osmosis (~0.2 Wh l(-1)), the most efficient technique presently available. © 2012 American Chemical Society

  14. Histogram-driven cupping correction (HDCC) in CT

    NASA Astrophysics Data System (ADS)

    Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.

    2010-04-01

    Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require knowledge of spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements, a C-arm flat-detector CT (FD-CT) system with a 30 × 40 cm² detector, a kilovoltage on-board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement with the reference. The minimization algorithm required fewer than 70 iterations to adjust the coefficients, only performing a linear combination of basis images and thus executing without time-consuming operations. HDCC reduced cupping artifacts without the necessity of pre-calibration or other scan information, enabling a retrospective improvement of CT image homogeneity. However, the method can also work with other cupping correction algorithms or in a calibration manner.
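
    The core of HDCC is an objective, the joint entropy of the image and its gradient, minimized by a simplex (Nelder-Mead) search over polynomial coefficients. The sketch below shows only that objective and the optimizer call; for brevity the polynomial is applied directly to the image intensities as a stand-in for the full forward-projection, correction and backprojection loop described above, so it illustrates the entropy criterion rather than the complete algorithm. Function names, bin count and the synthetic phantom are assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def joint_entropy(image, bins=64):
            """Joint entropy (bits) of image intensity and gradient magnitude."""
            gy, gx = np.gradient(image.astype(float))
            grad = np.hypot(gx, gy)
            hist, _, _ = np.histogram2d(image.ravel(), grad.ravel(), bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def apply_polynomial(coeffs, image):
            """Adds c2*x^2 + c3*x^3 + c4*x^4 on top of the identity term x."""
            x = image.astype(float)
            out = x.copy()
            for k, c in enumerate(coeffs, start=2):
                out = out + c * x**k
            return out

        def hdcc_like_correction(image, order=4):
            """Search polynomial coefficients that minimize the joint entropy (Nelder-Mead)."""
            x0 = np.zeros(order - 1)
            res = minimize(lambda c: joint_entropy(apply_polynomial(c, image)),
                           x0, method="Nelder-Mead", options={"maxiter": 200})
            return apply_polynomial(res.x, image), res.x

        # Example with a synthetic cupped phantom: a disc whose centre is artificially darkened.
        yy, xx = np.mgrid[-64:64, -64:64]
        r = np.hypot(xx, yy) / 64.0
        phantom = np.where(r < 0.9, 1.0, 0.0) * (0.7 + 0.3 * r**2)   # "cupping": darker centre
        corrected, coeffs = hdcc_like_correction(phantom)
        print(coeffs)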

  15. From quantum coherence to quantum correlations

    NASA Astrophysics Data System (ADS)

    Sun, Yuan; Mao, Yuanyuan; Luo, Shunlong

    2017-06-01

    In quantum mechanics, quantum coherence of a state relative to a quantum measurement can be identified with the quantumness that has to be destroyed by the measurement. In particular, quantum coherence of a bipartite state relative to a local quantum measurement encodes quantum correlations in the state. If one takes minimization with respect to the local measurements, then one is led to quantifiers which capture quantum correlations from the perspective of coherence. In this vein, quantum discord, which quantifies the minimal correlations that have to be destroyed by quantum measurements, can be identified as the minimal coherence, with the coherence measured by the relative entropy of coherence. To advocate and formulate this idea in a general context, we first review coherence relative to Lüders measurements, which extends the notion of coherence relative to von Neumann measurements (or equivalently, orthonormal bases), and highlight the observation that quantum discord arises as minimal coherence through two prototypical examples. Then, we introduce some novel measures of quantum correlations in terms of coherence, illustrate them through examples, investigate their fundamental properties and implications, and indicate their applications to quantum metrology.

  16. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective: Multiscale permutation entropy (MSPE) has become an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods: Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. Results: CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of unconsciousness. Conclusions: MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug-effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803

  17. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of unconsciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug-effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
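
    The two decomposition procedures compared in the records above differ only in how the scaled series is built before the permutation entropy is computed. A minimal sketch of both, assuming a plain NumPy implementation; the permutation-entropy step itself is the standard Bandt-Pompe symbolization, as sketched earlier in this list.

        import numpy as np

        def coarse_grain(x, scale):
            """CG decomposition: non-overlapping window means; the series shortens by a factor 'scale'."""
            n = (len(x) // scale) * scale
            return x[:n].reshape(-1, scale).mean(axis=1)

        def moving_average(x, scale):
            """MA decomposition: overlapping window means; the length shrinks only by scale - 1."""
            kernel = np.ones(scale) / scale
            return np.convolve(x, kernel, mode="valid")

        x = np.random.default_rng(3).normal(size=1000)
        print(len(coarse_grain(x, 5)), len(moving_average(x, 5)))   # 200 vs 996

    The sample counts illustrate why MA-based variants hold up better for short on-line EEG windows at large scales: CG discards most of the samples, while MA keeps nearly all of them, which is consistent with the finding above that CG-based MSPEs break down at high decomposition scales.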

  18. Effect of rotation preference on spontaneous alternation behavior on Y maze and introduction of a new analytical method, entropy of spontaneous alternation.

    PubMed

    Bak, Jia; Pyeon, Hae-In; Seok, Jin-I; Choi, Yun-Sik

    2017-03-01

    Y maze has been used to test spatial working memory in rodents. To this end, the percentage of spontaneous alternation has been employed. Alternation indicates sequential entries into all three arms; e.g., when an animal visits all three arms clockwise or counterclockwise sequentially, alternation is achieved. Interestingly, animals have a tendency to rotate or turn to a preferred side. Thus, when an animal has a high rotation preference, this may influence their alternation behavior. Here, we have generated a new analytical method, termed entropy of spontaneous alternation, to offset the effect of rotation preference on Y maze. To validate the entropy of spontaneous alternation, we employed a free rotation test using a cylinder and a spatial working memory test on Y maze. We identified that mice showed 65.1% rotation preference on average. Importantly, the percentage of spontaneous alternation in the high preference group (more than 70% rotation to a preferred side) was significantly higher than that in the no preference group (<55%). In addition, there was a clear correlation between rotation preference on cylinder and turning preference on Y maze. On the other hand, this potential leverage effect that arose from rotation preference disappeared when the animal behavior on Y maze was analyzed with the entropy of spontaneous alternation. Further, entropy of spontaneous alternation significantly determined the loss of spatial working memory by scopolamine administration. Combined, these data indicate that the entropy of spontaneous alternation provides higher credibility when spatial working memory is evaluated using Y maze. Copyright © 2016 Elsevier B.V. All rights reserved.
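
    To make the two readouts concrete, the sketch below computes the classical spontaneous-alternation percentage from a sequence of arm entries and, as an illustration only, a Shannon entropy over the observed arm-entry triplet types; the exact definition of the "entropy of spontaneous alternation" introduced in this paper may differ from this simplified form, and the example sequences are synthetic.

        import numpy as np
        from collections import Counter

        def spontaneous_alternation_percent(entries):
            """Percentage of successive arm-entry triplets that visit all three Y-maze arms."""
            triplets = [tuple(entries[i:i + 3]) for i in range(len(entries) - 2)]
            alternations = sum(1 for t in triplets if len(set(t)) == 3)
            return 100.0 * alternations / len(triplets)

        def triplet_entropy(entries):
            """Illustrative entropy (bits) of the distribution of arm-entry triplet types."""
            counts = Counter(tuple(entries[i:i + 3]) for i in range(len(entries) - 2))
            p = np.array(list(counts.values()), dtype=float)
            p /= p.sum()
            return -np.sum(p * np.log2(p))

        # Example: a strongly rotating mouse (A->B->C->A->...) alternates "perfectly"
        # yet its entry sequence is highly stereotyped, which the entropy exposes.
        rotating = list("ABC" * 10)
        random_entries = list(np.random.default_rng(7).choice(list("ABC"), size=30))
        print(spontaneous_alternation_percent(rotating), triplet_entropy(rotating))
        print(spontaneous_alternation_percent(random_entries), triplet_entropy(random_entries))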

  19. Thermodynamic perspectives on genetic instructions, the laws of biology, diseased states and human population control

    PubMed Central

    Saier, M. H.

    2014-01-01

    This article examines, from a broad perspective, entropy and some examples of its relationship to evolution, genetic instructions and how we view diseases. Many knowledge gaps abound, hence our understanding is still fragmented and incomplete. Living organisms are programmed by functional genetic instructions (FGI), through cellular communication pathways, to grow and reproduce by maintaining a variety of hemistable, ordered structures (low entropy). Living organisms are far from equilibrium with their surrounding environmental systems, which tend towards increasing disorder (increasing entropy). Organisms must free themselves from high entropy (high disorder) to maintain their cellular structures for a period of time sufficient to allow reproduction and for the resultant offspring to reach reproductive age. This time interval varies for different species. Bacteria, for example, need no sexual parents; dividing cells are nearly identical to the previous generation of cells and can begin a new cell cycle without delay under appropriate conditions. By contrast, human infants require years of care before they can reproduce. Living organisms maintain order in spite of their changing surrounding environment, which loses order according to the second law of thermodynamics. These tendencies actually work together, since living organisms create ordered biological structures at the expense of increasing the entropy of their surroundings. From a disease perspective, viruses and other disease agents interrupt the normal functioning of cells. The pressure for survival may result in mechanisms that allow organisms to resist attacks by viruses, other pathogens, destructive chemicals and physical agents such as radiation. However, when the attack is successful, the organism can be damaged until the cell, tissue, organ or entire organism is no longer functional and entropy increases. PMID:21262480

  20. Differentiating benign from malignant mediastinal lymph nodes visible at EBUS using grey-scale textural analysis.

    PubMed

    Edey, Anthony J; Pollentine, Adrian; Doody, Claire; Medford, Andrew R L

    2015-04-01

    Recent data suggest that grey-scale textural analysis of endobronchial ultrasound (EBUS) imaging can differentiate benign from malignant lymphadenopathy. The objective of this study was to evaluate grey-scale textural analysis and examine its clinical utility. Images from 135 consecutive clinically indicated EBUS procedures were evaluated retrospectively using MATLAB software (MathWorks, Natick, MA, USA). Manual node mapping was performed to obtain a region of interest, and grey-scale textural features (range of pixel values and entropy) were analysed. The initial analysis involved 94 subjects, and receiver operating characteristic (ROC) curves were generated. The ROC thresholds were then applied to a second cohort (41 subjects) to validate the earlier findings. A total of 371 images were evaluated. There was no difference in the proportions of malignant disease (56% vs 53%, P = 0.66) between the prediction (group 1) and validation (group 2) sets. There was no difference in the range of pixel values in group 1, but entropy was significantly higher in the malignant group (5.95 vs 5.77, P = 0.03). Higher entropy was seen in adenocarcinoma versus lymphoma (6.00 vs 5.50, P < 0.05). An ROC curve for entropy gave an area under the curve of 0.58, with 51% sensitivity and 71% specificity for entropy greater than 5.94 for malignancy. In group 2, the entropy threshold phenotyped only 47% of benign cases and 20% of malignant cases correctly. These findings suggest that the use of EBUS grey-scale textural analysis for differentiation of malignant from benign lymphadenopathy may not be accurate. Further studies are required. © 2015 Asian Pacific Society of Respirology.
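
    A minimal sketch of the two grey-scale texture features evaluated above (pixel-value range and entropy) for a manually delineated region of interest, written in Python rather than the MATLAB used in the study, and assuming an 8-bit ultrasound frame held in a NumPy array with a boolean ROI mask; the bin count and any preprocessing in the original analysis may differ.

        import numpy as np

        def roi_texture_features(image, mask, bins=256):
            """Grey-level range and Shannon entropy (bits) of the pixels inside a boolean ROI mask."""
            pixels = image[mask].astype(float)
            value_range = pixels.max() - pixels.min()
            hist, _ = np.histogram(pixels, bins=bins, range=(0, 255))
            p = hist / hist.sum()
            p = p[p > 0]
            entropy = -np.sum(p * np.log2(p))
            return value_range, entropy

        # Example with a synthetic "node": heterogeneous (noisy) texture gives higher entropy.
        rng = np.random.default_rng(5)
        img = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
        mask = np.zeros((128, 128), dtype=bool)
        mask[40:90, 40:90] = True
        print(roi_texture_features(img, mask))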

  1. Entropy, materials, and posterity

    USGS Publications Warehouse

    Cloud, P.

    1977-01-01

    Materials and energy are the interdependent feedstocks of economic systems, and thermodynamics is their moderator. It costs energy to transform the dispersed minerals of Earth's crust into ordered materials and structures. And it costs materials to collect and focus the energy to perform work - be it from solar, fossil fuel, nuclear, or other sources. The greater the dispersal of minerals sought, the more energy is required to collect them into ordered states. But available energy can be used once only. And the ordered materials of industrial economies become disordered with time. They may be partially reordered and recycled, but only at further costs in energy. Available energy everywhere degrades to bound states and order to disorder - for though entropy may be juggled it always increases. Yet industry is utterly dependent on low-entropy states of matter and energy, while decreasing grades of ore require ever higher inputs of energy to convert them to metals, with ever increasing growth both of entropy and environmental hazard. Except as we may prize a thing for its intrinsic qualities - beauty, leisure, love, or gold - low entropy is the only thing of real value. It is worth whatever the market will bear, and it becomes more valuable as entropy increases. It would be foolish of suppliers to sell it more cheaply or in larger amounts than their own enjoyment of life requires, whatever form it may take. For this reason, and because of physical constraints on the availability of all low-entropy states, the recent energy crisis is only the first of a sequence of crises to be expected in energy and materials as long as current trends continue. The apportioning of low-entropy states in a modern industrial society is achieved more or less according to the theory of competitive markets. But the rational powers of this theory suffer as the world grows increasingly polarized into rich, over-industrialized nations with diminishing resource bases and poor, supplier nations with little industry. The theory also discounts posterity, the more so as population density and per capita rates of consumption continue to grow. A new social, economic, and ecologic norm that leads to population control, conservation, and an apportionment of low-entropy states across the generations is needed to assure to posterity the options that properly belong to it as an important but voiceless constituency of the collectivity we call mankind. © 1977 Ferdinand Enke Verlag Stuttgart.

  2. Security and composability of randomness expansion from Bell inequalities

    NASA Astrophysics Data System (ADS)

    Fehr, Serge; Gelles, Ran; Schaffner, Christian

    2013-01-01

    The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness expansion. R. Colbeck and A. Kent [J. Phys. A 44, 095305 (2011)] proposed the first method for generating randomness from untrusted devices, but without providing a rigorous analysis. This was addressed subsequently by S. Pironio [Nature (London) 464, 1021 (2010)], who aimed at deriving a lower bound on the min-entropy of the data extracted from an untrusted device based only on the observed nonlocal behavior of the device. Although that article succeeded in developing important tools for reaching the stated goal, the proof itself contained a bug, and the given formal claim on the guaranteed amount of min-entropy needs to be revisited. In this paper we build on the tools provided by Pironio and obtain a meaningful lower bound on the min-entropy of the data produced by an untrusted device based on the observed nonlocal behavior of the device. Our main result confirms the essence of the (improperly formulated) claims of Pironio and puts them on solid ground. We also address the question of composability and show that different untrusted devices can be composed in an alternating manner under the assumption that they are not entangled. This enables superpolynomial randomness expansion based on two untrusted yet unentangled devices.

  3. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    NASA Astrophysics Data System (ADS)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

    A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we have retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the state preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) ∝ ξ^k exp(−ξ). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
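
    A minimal sketch of the kind of numerical experiment described above: generate surrogate series whose fluctuations follow a gamma distribution with shape parameter k, and compute the approximate entropy with the usual Pincus parameters (m = 2, r = 0.2 times the standard deviation). The ApEn implementation below follows the textbook definition and is not the authors' code; the correspondence between the shape parameter used here and the exponent convention in the abstract is only approximate.

        import numpy as np

        def approximate_entropy(x, m=2, r_factor=0.2):
            """ApEn(m, r, N) with r = r_factor * std(x), following Pincus' definition."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()

            def phi(m):
                n = len(x) - m + 1
                templates = np.array([x[i:i + m] for i in range(n)])
                # Chebyshev distance between every pair of templates
                dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                c = (dist <= r).sum(axis=1) / n      # self-matches included, as in ApEn
                return np.mean(np.log(c))

            return phi(m) - phi(m + 1)

        # Gamma-distributed surrogate "heart-rate fluctuation" series for two shape parameters.
        # For shape k and unit scale, the differential entropy is k + ln Γ(k) + (1 - k) ψ(k).
        rng = np.random.default_rng(11)
        for k in (1.0, 4.0):
            xi = rng.gamma(shape=k, scale=1.0, size=1000)
            print(k, approximate_entropy(xi))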

  4. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal.

    PubMed

    Karmakar, Chandan; Udhayakumar, Radhagayathri K; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

    Distribution entropy (DistEn) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m, and the number of bins M, which replaces the tolerance parameter r used by the existing approximation entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we have analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M). However, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length heart rate variability (HRV) signals, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn and its performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions with various input parameters among the reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short-length HRV time series.

  5. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal

    PubMed Central

    Karmakar, Chandan; Udhayakumar, Radhagayathri K.; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

    Distribution entropy (DistEn) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m, and the number of bins M, which replaces the tolerance parameter r used by the existing approximation entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we have analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M). However, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length heart rate variability (HRV) signals, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn and its performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions with various input parameters among the reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short-length HRV time series. PMID:28979215
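
    A minimal sketch of the distribution entropy described in the two records above, under its usual definition: embed the series with dimension m, collect the pairwise Chebyshev distances between embedding vectors, estimate their empirical distribution with an M-bin histogram, and normalize the Shannon entropy of that histogram by log2(M). Parameter defaults and the surrogate signals are illustrative, not values from the papers.

        import numpy as np

        def distribution_entropy(x, m=2, M=512):
            """DistEn: normalized Shannon entropy of the inter-vector Chebyshev distance histogram."""
            x = np.asarray(x, dtype=float)
            n = len(x) - m + 1
            vectors = np.array([x[i:i + m] for i in range(n)])
            # Chebyshev (maximum-coordinate) distances between all distinct vector pairs
            d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
            d = d[np.triu_indices(n, k=1)]
            hist, _ = np.histogram(d, bins=M)
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p)) / np.log2(M)

        # Short (N = 300) surrogate RR-interval series: white noise vs. a random walk
        # (a correlated, nonstationary stand-in for structured variability).
        rng = np.random.default_rng(2)
        white = rng.normal(size=300)
        walk = np.cumsum(rng.normal(size=300))
        print(distribution_entropy(white), distribution_entropy(walk))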

  6. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement

    PubMed Central

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-01-01

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and expanding the non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real-data results show an average gain in entropy of up to 0.42 dB and a significant gain in the enhancement-measure evaluation for an up-scaling factor of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual quality and accuracy. PMID:29414893

  7. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement.

    PubMed

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-02-07

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and expanding the non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real-data results show an average gain in entropy of up to 0.42 dB and a significant gain in the enhancement-measure evaluation for an up-scaling factor of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual quality and accuracy.

  8. Application of SNODAS and hydrologic models to enhance entropy-based snow monitoring network design

    NASA Astrophysics Data System (ADS)

    Keum, Jongho; Coulibaly, Paulin; Razavi, Tara; Tapsoba, Dominique; Gobena, Adam; Weber, Frank; Pietroniro, Alain

    2018-06-01

    Snow has a unique characteristic in the water cycle: snow falls during the entire winter season, but the discharge from snowmelt is typically delayed until the melting period and occurs in a relatively short time. Therefore, reliable observations from an optimal snow monitoring network are necessary for efficient management of snowmelt water for flood prevention and hydropower generation. The Dual Entropy and Multiobjective Optimization approach is applied to design snow monitoring networks in the La Grande River Basin in Québec and the Columbia River Basin in British Columbia. While the networks are optimized to have the maximum amount of information with minimum redundancy based on entropy concepts, this study extends the traditional entropy applications to hydrometric network design by introducing several improvements. First, several data quantization cases and their effects on the snow network design problems were explored. Second, the applicability of the Snow Data Assimilation System (SNODAS) products as synthetic datasets of potential stations was demonstrated in the design of the snow monitoring network of the Columbia River Basin. Third, beyond finding the Pareto-optimal networks from the entropy-based multi-objective optimization, the networks obtained for the La Grande River Basin were further evaluated by applying three hydrologic models. The calibrated hydrologic models simulated discharges using the updated snow water equivalent data from the Pareto-optimal networks. Then, the model performances for high flows were compared to determine the best optimal network for enhanced spring runoff forecasting.
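
    For a single pair of candidate stations, the entropy terms driving the dual entropy and multiobjective optimization described above reduce to a joint entropy (information content) and a transinformation (redundancy), both computed after quantizing the records. A minimal sketch, assuming a simple fixed-interval quantization; the actual design problem optimizes these objectives over many stations simultaneously, which is outside the scope of this illustration, and the synthetic snow water equivalent (SWE) records are made up.

        import numpy as np

        def quantize(x, bins=10):
            """Fixed-interval quantization of a record into integer classes (one common choice)."""
            edges = np.linspace(x.min(), x.max(), bins + 1)
            return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def joint_entropy(a, b):
            pairs = a.astype(int) * (b.max() + 1) + b.astype(int)   # encode each pair as one label
            return entropy(pairs)

        def transinformation(a, b):
            """Mutual information H(A) + H(B) - H(A, B): the redundancy between two stations."""
            return entropy(a) + entropy(b) - joint_entropy(a, b)

        # Two synthetic SWE records: station B partially mirrors station A, so they share information.
        rng = np.random.default_rng(4)
        swe_a = rng.gamma(shape=3.0, scale=20.0, size=365)
        swe_b = 0.7 * swe_a + rng.normal(scale=10.0, size=365)
        qa, qb = quantize(swe_a), quantize(swe_b)
        print(joint_entropy(qa, qb), transinformation(qa, qb))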

  9. Using quantum erasure to exorcize Maxwell's demon: I. Concepts and context

    NASA Astrophysics Data System (ADS)

    Scully, Marlan O.; Rostovtsev, Yuri; Sariyanni, Zoe-Elizabeth; Suhail Zubairy, M.

    2005-10-01

    Szilard [L. Szilard, Zeitschrift für Physik 53 (1929) 840] made a decisive step toward solving the Maxwell demon problem by introducing and analyzing the single-atom heat engine. Bennett [Sci. Am. 255 (1987) 107] completed the solution by pointing out that there must be an entropy, ΔS = k ln 2, generated as the result of the information erased on each cycle. Nevertheless, others have disagreed. For example, philosophers such as Popper "have found the literature surrounding Maxwell's demon deeply problematic." We propose and analyze a new kind of single-atom quantum heat engine which allows us to resolve the Maxwell demon paradox simply, and without invoking the notions of information or entropy. The energy source of the present quantum engine [Scully, Phys. Rev. Lett. 87 (2001) 220601] is a Stern-Gerlach apparatus acting as a demonesque heat sorter. An isothermal compressor acts as the entropy sink. In order to complete a thermodynamic cycle, an energy of ΔW = kT ln 2 must be expended. This energy is essentially a "reset" or "eraser" energy. Thus Bennett's entropy ΔS = ΔW/T emerges as a simple consequence of the quantum thermodynamics of our heat engine. It would seem that quantum mechanics contains the kernel of information entropy at its very core. That is, the concept of information erasure as it appears in quantum mechanics [Scully and Drühl, Phys. Rev. A 25 (1982) 2208] and the present quantum heat engine have a deep common origin.
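
    To put the reset energy in perspective, a back-of-the-envelope evaluation at room temperature (an illustrative figure, not taken from the paper):

        \Delta W = kT\ln 2 \approx (1.381\times10^{-23}\,\mathrm{J\,K^{-1}})(300\,\mathrm{K})(0.693)
                 \approx 2.9\times10^{-21}\,\mathrm{J} \approx 18\,\mathrm{meV}.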

  10. ELECTRIC CURRENT FILAMENTATION AT A NON-POTENTIAL MAGNETIC NULL-POINT DUE TO PRESSURE PERTURBATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelínek, P.; Karlický, M.; Murawski, K., E-mail: pjelinek@prf.jcu.cz

    2015-10-20

    An increase of electric current densities due to filamentation is an important process in any flare. We show that the pressure perturbation, followed by an entropy wave, triggers such a filamentation in the non-potential magnetic null-point. In the two-dimensional (2D), non-potential magnetic null-point, we generate the entropy wave by a negative or positive pressure pulse that is launched initially. Then, we study its evolution under the influence of the gravity field. We solve the full set of 2D time dependent, ideal magnetohydrodynamic equations numerically, making use of the FLASH code. The negative pulse leads to an entropy wave with a plasma density greater than in the ambient atmosphere and thus this wave falls down in the solar atmosphere, attracted by the gravity force. In the case of the positive pressure pulse, the plasma becomes evacuated and the entropy wave propagates upward. However, in both cases, owing to the Rayleigh–Taylor instability, the electric current in a non-potential magnetic null-point is rapidly filamented and at some locations the electric current density is strongly enhanced in comparison to its initial value. Using numerical simulations, we find that entropy waves initiated either by positive or negative pulses result in an increase of electric current densities close to the magnetic null-point and thus the energy accumulated here can be released as nanoflares or even flares.

  11. Entanglement in Nonunitary Quantum Critical Spin Chains

    NASA Astrophysics Data System (ADS)

    Couvreur, Romain; Jacobsen, Jesper Lykke; Saleur, Hubert

    2017-07-01

    Entanglement entropy has proven invaluable to our understanding of quantum criticality. It is natural to try to extend the concept to "nonunitary quantum mechanics," which has seen growing interest from areas as diverse as open quantum systems, noninteracting electronic disordered systems, or nonunitary conformal field theory (CFT). We propose and investigate such an extension here, by focusing on the case of one-dimensional quantum group symmetric or supergroup symmetric spin chains. We show that the consideration of left and right eigenstates combined with appropriate definitions of the trace leads to a natural definition of Rényi entropies in a large variety of models. We interpret this definition geometrically in terms of related loop models and calculate the corresponding scaling in the conformal case. This allows us to distinguish the role of the central charge and effective central charge in rational minimal models of CFT, and to define an effective central charge in other, less well-understood cases. The example of the sl(2|1) alternating spin chain for percolation is discussed in detail.

  12. Lattice thermal conductivity of multi-component alloys

    DOE PAGES

    Caro, Magdalena; Béland, Laurent K.; Samolyuk, German D.; ...

    2015-06-12

    High entropy alloys (HEA) have unique properties including the potential to be radiation tolerant. These materials with extreme disorder could resist damage because disorder, stabilized by entropy, is the equilibrium thermodynamic state. Disorder also reduces electron and phonon conductivity, keeping the damage energy longer at the deposition locations and eventually favoring defect recombination. In the short time-scales related to thermal spikes induced by collision cascades, phonons become the relevant energy carrier. In this paper, we perform a systematic study of phonon thermal conductivity in multiple component solid solutions represented by Lennard-Jones (LJ) potentials. We explore the conditions that minimize the phonon mean free path via extreme alloy complexity, by varying the composition and the elements (differing in mass, atomic radii, and cohesive energy). We show that alloy complexity can be tailored to modify the scattering mechanisms that control energy transport in the phonon subsystem. Finally, our analysis provides qualitative guidance for the selection criteria used in the design of HEAs with low phonon thermal conductivity.

  13. Molecular dynamics simulations of polyelectrolyte brushes under poor solvent conditions: origins of bundle formation.

    PubMed

    He, Gui-Li; Merlitz, Holger; Sommer, Jens-Uwe

    2014-03-14

    Molecular dynamics simulations are applied to investigate salt-free planar polyelectrolyte brushes under poor solvent conditions. Starting above the Θ-point with a homogeneous brush and then gradually reducing the temperature, the polymers initially display a lateral structure formation, forming vertical bundles of chains. A further reduction of the temperature (or solvent quality) leads to a vertical collapse of the brush. By varying the size and selectivity of the counterions, we show that the lateral structure formation persists and thereby demonstrate that the entropy of the counterions is the dominant factor in the formation of the bundle phase. By applying an external compression force on the brush we calculate the minimal work done on the polymer phase only and prove that the entropy gain of the counterions in the bundle state, as compared to the homogeneously collapsed state at the same temperature, is responsible for the lateral microphase segregation. As a consequence, the observed lateral structure formation has to be regarded as universal for osmotic polymer brushes below the Θ-point.

  14. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data

    NASA Astrophysics Data System (ADS)

    White, Andrew D.; Knight, Chris; Hocky, Glen M.; Voth, Gregory A.

    2017-01-01

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.
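
    The biasing idea here (add the weakest possible correction that makes a simulated average match an experimental value; by maximum entropy the minimal bias is a linear coupling to the observable) can be illustrated without any AIMD machinery. The sketch below applies an adaptive linear bias to an overdamped Langevin particle in a double well; the potential, observable, target value, and adaptation rates are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def force(x, lam):
    # Force from the double-well potential U(x) = x**4 - 2*x**2, plus the
    # linear maximum-entropy bias potential V(x) = lam * x.
    return -(4 * x**3 - 4 * x) - lam

target = 0.5          # "experimental" value of the observable f(x) = x
lam, x = 0.0, -1.0
dt, kT = 1e-3, 1.0
avg, alpha, eta = 0.0, 1e-3, 0.5   # moving-average and bias-adaptation rates

for _ in range(500_000):
    # overdamped Langevin step (unit friction)
    x += force(x, lam) * dt + np.sqrt(2 * kT * dt) * rng.normal()
    avg = (1 - alpha) * avg + alpha * x
    # steer the bias strength until the simulated average matches the target
    lam += eta * (avg - target) * dt

print(f"biased <f> ~ {avg:.2f} (target {target}), lambda = {lam:.2f}")
```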

  15. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data.

    PubMed

    White, Andrew D; Knight, Chris; Hocky, Glen M; Voth, Gregory A

    2017-01-28

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.

  16. Conservation laws shape dissipation

    NASA Astrophysics Data System (ADS)

    Rao, Riccardo; Esposito, Massimiliano

    2018-02-01

    Starting from the most general formulation of stochastic thermodynamics—i.e. a thermodynamically consistent nonautonomous stochastic dynamics describing systems in contact with several reservoirs—we define a procedure to identify the conservative and the minimal set of nonconservative contributions in the entropy production. The former is expressed as the difference between changes caused by time-dependent drivings and a generalized potential difference. The latter is a sum over the minimal set of flux-force contributions controlling the dissipative flows across the system. When the system is initially prepared at equilibrium (e.g. by turning off drivings and forces), a finite-time detailed fluctuation theorem holds for the different contributions. Our approach relies on identifying the complete set of conserved quantities and can be viewed as the extension of the theory of generalized Gibbs ensembles to nonequilibrium situations.

  17. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
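
    For reference, the Bandt-Pompe estimator discussed above maps overlapping windows of a series onto ordinal patterns and takes the Shannon entropy of the pattern frequencies. A minimal sketch (the embedding order and delay are illustrative choices):

```python
import numpy as np
from math import factorial
from collections import Counter

def permutation_entropy(x, order=4, delay=1):
    # Normalized Bandt-Pompe permutation entropy (0 = fully regular, 1 = fully random).
    x = np.asarray(x)
    n_patterns = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n_patterns)
    )
    p = np.array(list(patterns.values()), dtype=float) / n_patterns
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

rng = np.random.default_rng(0)
print(permutation_entropy(np.sin(0.05 * np.arange(3000))))   # low: deterministic signal
print(permutation_entropy(rng.normal(size=3000)))            # close to 1: white noise
```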

  18. Universal features of fluctuations in globular proteins.

    PubMed

    Erman, Burak

    2016-06-01

    Using data from 2000 non-homologous protein crystal structures, we show that the distribution of residue B factors of proteins collapses onto a single master curve. We show by maximum entropy arguments that this curve is a Gamma function whose order and dispersion are obtained from experimental data. The distribution for any given specific protein can be generated from the master curve by a linear transformation. Any perturbation of the B factor distribution of a protein, imposed at constant energy, causes a decrease in the entropy of the protein relative to that of the reference state. Proteins 2016; 84:721-725. © 2016 Wiley Periodicals, Inc.
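
    Fitting a Gamma form to a set of B factors and reading off its order (shape) and dispersion (scale) is straightforward with standard tools; the sketch below uses synthetic data and scipy, and is only an illustration of the kind of fit described, not the authors' procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
b_factors = rng.gamma(shape=2.5, scale=12.0, size=2000)   # synthetic "B factor" sample

# Maximum-likelihood Gamma fit; floc=0 pins the distribution at the origin.
shape, loc, scale = stats.gamma.fit(b_factors, floc=0)
print(f"order (shape) = {shape:.2f}, dispersion (scale) = {scale:.2f}")

# Differential entropy of the fitted distribution, useful for comparing perturbed states.
print(f"entropy = {stats.gamma.entropy(shape, loc=loc, scale=scale):.3f} nats")
```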

  19. An Integrated Theory of Everything (TOE)

    NASA Astrophysics Data System (ADS)

    Colella, Antonio

    2014-03-01

    An Integrated TOE unifies all known physical phenomena from the Planck cube to the Super Universe (multiverse). Each matter/force particle is represented by a Planck cube string. Any Super Universe object is a volume of contiguous Planck cubes. Super force Planck cube string singularities existed at the start of all universes. The foundations of an Integrated TOE are twenty independent existing theories which, without sacrificing their integrity, are replaced by twenty interrelated amplified theories. Amplifications of Higgs force theory are key to an Integrated TOE and include: 64 supersymmetric Higgs particles; super force condensations to 17 matter particles/associated Higgs forces; spontaneous symmetry breaking is bidirectional; and the sum of 8 permanent Higgs force energies is dark energy. Stellar black hole theory was amplified to include a quark star (matter) with mass, volume, near zero temperature, and maximum entropy. A black hole (energy) has energy, minimal volume (singularity), near infinite temperature, and minimum entropy. Our precursor universe's super supermassive quark star (matter) evaporated to a super supermassive black hole (energy). This transferred total conserved energy/mass and transformed entropy from maximum to minimum. Integrated Theory of Everything Book Video: https://www.youtube.com/watch?v=4a1c9IvdoGY Research Article Video: http://www.youtube.com/watch?v=CD-QoLeVbSY Research Article: http://toncolella.files.wordpress.com/2012/07/m080112.pdf.

  20. Positive gravitational subsystem energies from CFT cone relative entropies

    NASA Astrophysics Data System (ADS)

    Neuenfeld, Dominik; Saraswat, Krishan; Van Raamsdonk, Mark

    2018-06-01

    The positivity of relative entropy for spatial subsystems in a holographic CFT implies the positivity of certain quantities in the dual gravitational theory. In this note, we consider CFT subsystems whose boundaries lie on the lightcone of a point p. We show that the positive gravitational quantity which corresponds to the relative entropy for such a subsystem A is a novel notion of energy associated with a gravitational subsystem bounded by the minimal area extremal surface à associated with A and by the AdS boundary region  corresponding to the part of the lightcone from p bounded by ∂ A. This generalizes the results of arXiv:1605.01075 for ball-shaped regions by making use of the recent results in arXiv:1703.10656 for the vacuum modular Hamiltonian of regions bounded on lightcones. As part of our analysis, we give an analytic expression for the extremal surface in pure AdS associated with any such region A. We note that its form immediately implies the Markov property of the CFT vacuum (saturation of strong subadditivity) for regions bounded on the same lightcone. This gives a holographic proof of the result proven for general CFTs in arXiv:1703.10656. A similar holographic proof shows the Markov property for regions bounded on a lightsheet for non-conformal holographic theories defined by relevant perturbations of a CFT.

  1. Vestibular Activation Differentially Modulates Human Early Visual Cortex and V5/MT Excitability and Response Entropy

    PubMed Central

    Guzman-Lopez, Jessica; Arshad, Qadeer; Schultz, Simon R; Walsh, Vincent; Yousif, Nada

    2013-01-01

    Head movement imposes the additional burdens on the visual system of maintaining visual acuity and determining the origin of retinal image motion (i.e., self-motion vs. object-motion). Although maintaining visual acuity during self-motion is effected by minimizing retinal slip via the brainstem vestibular-ocular reflex, higher order visuovestibular mechanisms also contribute. Disambiguating self-motion versus object-motion also invokes higher order mechanisms, and a cortical visuovestibular reciprocal antagonism is propounded. Hence, one prediction is of a vestibular modulation of visual cortical excitability and indirect measures have variously suggested none, focal or global effects of activation or suppression in human visual cortex. Using transcranial magnetic stimulation-induced phosphenes to probe cortical excitability, we observed decreased V5/MT excitability versus increased early visual cortex (EVC) excitability, during vestibular activation. In order to exclude nonspecific effects (e.g., arousal) on cortical excitability, response specificity was assessed using information theory, specifically response entropy. Vestibular activation significantly modulated phosphene response entropy for V5/MT but not EVC, implying a specific vestibular effect on V5/MT responses. This is the first demonstration that vestibular activation modulates human visual cortex excitability. Furthermore, using information theory, not previously used in phosphene response analysis, we could distinguish between a specific vestibular modulation of V5/MT excitability from a nonspecific effect at EVC. PMID:22291031

  2. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
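
    The core construction in this abstract, a density matrix built from the graph Laplacian whose eigenvalues feed von Neumann and Rényi entropies, is easy to sketch. The snippet below assumes rho = exp(-beta*L)/Tr exp(-beta*L); the choice of beta, the Rényi order q, and the example graphs are illustrative assumptions.

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

def spectral_entropies(G, beta=1.0, q=2.0):
    # Von Neumann and Renyi-q entropies of rho = exp(-beta*L) / Tr exp(-beta*L).
    L = nx.laplacian_matrix(G).toarray().astype(float)
    rho = expm(-beta * L)
    rho /= np.trace(rho)
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]
    von_neumann = -np.sum(eig * np.log(eig))
    renyi_q = np.log(np.sum(eig ** q)) / (1.0 - q)
    return von_neumann, renyi_q

print(spectral_entropies(nx.erdos_renyi_graph(50, 0.1, seed=1)))
print(spectral_entropies(nx.barabasi_albert_graph(50, 2, seed=1)))
```

    Divergences between two networks (Kullback-Leibler, Jensen-Shannon) are then built from the same spectral density matrices.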

  3. Developing cross entropy genetic algorithm for solving Two-Dimensional Loading Heterogeneous Fleet Vehicle Routing Problem (2L-HFVRP)

    NASA Astrophysics Data System (ADS)

    Paramestha, D. L.; Santosa, B.

    2018-04-01

    The Two-Dimensional Loading Heterogeneous Fleet Vehicle Routing Problem (2L-HFVRP) is a combination of the Heterogeneous Fleet VRP and a packing problem well known as the Two-Dimensional Bin Packing Problem (BPP). The 2L-HFVRP is a Heterogeneous Fleet VRP in which the customer demands consist of sets of two-dimensional rectangular weighted items. These demands must be served from the depot by a heterogeneous fleet of vehicles with fixed and variable costs. The objective of the 2L-HFVRP is to minimize the total transportation cost. All formed routes must be consistent with the capacity and loading process of the vehicle. Sequential and unrestricted scenarios are considered in this paper. We propose a metaheuristic which is a combination of the Genetic Algorithm (GA) and the Cross Entropy (CE) method, named the Cross Entropy Genetic Algorithm (CEGA), to solve the 2L-HFVRP. The mutation concept from GA is used to speed up the CE algorithm in finding the optimal solution. The mutation mechanism was based on local improvement (2-opt, 1-1 Exchange, and 1-0 Exchange). The probability transition matrix mechanism of CE is used to avoid getting stuck in a local optimum. The effectiveness of CEGA was tested on 2L-HFVRP benchmark instances. The experiments show competitive results compared with other algorithms.
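
    The cross-entropy ingredient referred to above maintains a probability model over candidate solutions, samples from it, and refits the model to the best (elite) samples. The sketch below shows only that update loop on a toy binary optimization problem; it is not the CEGA of the paper, and the objective, sample size, elite fraction, and smoothing factor are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective: find a binary vector that matches a hidden target as closely as possible.
target = rng.integers(0, 2, size=30)
score = lambda pop: -(pop != target).sum(axis=1)   # higher is better

p = np.full(30, 0.5)                  # Bernoulli parameters of the sampling distribution
n_samples, elite_frac, smoothing = 200, 0.1, 0.7

for _ in range(50):
    pop = (rng.random((n_samples, 30)) < p).astype(int)          # sample candidates
    elite = pop[np.argsort(score(pop))[-int(elite_frac * n_samples):]]
    # Cross-entropy update: move the distribution toward the elite samples.
    p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p

best = (p > 0.5).astype(int)
print("mismatches:", (best != target).sum())
```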

  4. Identifying functional thermodynamics in autonomous Maxwellian ratchets

    NASA Astrophysics Data System (ADS)

    Boyd, Alexander B.; Mandal, Dibyendu; Crutchfield, James P.

    2016-02-01

    We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, when previous methods either misclassify or simply fail due to approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions and so it applies quite broadly—for Demons with and without memory and input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed, alternative such bounds, while the current bound still holds. As such, it broadly describes the minimal energetic cost of any computation by a thermodynamic system.

  5. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
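
    The quantities named in the abstract, finite-length entropy-rate approximations and excess entropy, can also be estimated by brute-force counting on a sample sequence, which is a useful sanity check on closed-form spectral expressions. A minimal sketch for a simple two-state Markov process (the transition matrix, sequence length, and block lengths are illustrative):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Sample a two-state Markov chain (a "golden mean"-like process is a common test case).
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])            # from state 1, always return to state 0
s = [0]
for _ in range(100_000):
    s.append(rng.choice(2, p=T[s[-1]]))
s = np.array(s)

def block_entropy(seq, L):
    # Shannon entropy (bits) of length-L blocks, estimated by counting.
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

H = [0.0] + [block_entropy(s, L) for L in range(1, 8)]
h = np.diff(H)                        # finite-L approximations h(L) = H(L) - H(L-1)
print("h(L):", np.round(h, 4))        # converges to the entropy rate from above as L grows
print("excess entropy estimate:", round(float(np.sum(h - h[-1])), 4))
```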

  6. Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung

    2015-04-01

    In environmental and other scientific applications, we must have a certain understanding of the geological lithological composition. Because of the restrictions of real conditions, only a limited amount of data can be acquired. To find out the lithological distribution in the study area, many spatial statistical methods have been used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging method in the field of geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data, and combines not only hard data but also soft data to improve estimation. Lithological classification data are discrete and categorical; therefore, this research applied categorical BME to establish a complete three-dimensional lithological estimation model. The limited hard data from the cores and the soft data generated from the geological dating data and the virtual wells are applied to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting

  7. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers at a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698

  8. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that has been presented in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons without influence of the data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on analyzing simulated biophysically realistic synapses, an Izhikevich cortical network based on the same neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and identify spurious causality in a network of three neurons exactly. We conclude that the proposed method can obtain a more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details of the neural coding. PMID:23940662
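
    As a point of reference for the transfer-entropy part of the method, the plain (un-normalized, history length one) transfer entropy between two already-symbolized sequences can be estimated by direct counting. This sketch is not the NPTE of the paper (no ordinal patterns, no normalization), and the synthetic sequences are illustrative; spike trains would first have to be binned or symbolized.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    # Transfer entropy T(X->Y) in bits for discrete sequences, one-step histories.
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))            # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))           # (y_t, x_t)
    singles = Counter(y[:-1])                         # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        # p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ], written with raw counts
        te += (c / n) * np.log2((c * singles[y0]) / (pairs_yy[(y1, y0)] * pairs_yx[(y0, x0)]))
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=50_000)
y = np.roll(x, 1)                                     # y copies x with a one-step lag
y[rng.random(50_000) < 0.1] ^= 1                      # plus 10% noise
print(f"T(X->Y) = {transfer_entropy(x, y):.3f} bits, T(Y->X) = {transfer_entropy(y, x):.3f} bits")
```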

  9. [The motive force of evolution based on the principle of organismal adjustment evolution.].

    PubMed

    Cao, Jia-Shu

    2010-08-01

    Starting from an analysis of the existing problems of the prevalent theories of evolution, this paper discusses the motive force of evolution based on the principle of organismal adjustment evolution, in order to reach a new understanding of the evolution mechanism. Guided by Schrodinger's theory that "life feeds on negative entropy", the author proposes that the "negative entropy flow" actually includes material flow, energy flow and information flow, and that this "negative entropy flow" is the motive force for life and development. By modifying the author's own principle of organismal adjustment evolution (not adaptation evolution), a new theory of a "regulation system of organismal adjustment evolution involving DNA, RNA and protein interacting with the environment" is proposed. According to the view that phylogenetic development is the "integral" of individual development, the difference in negative entropy flow between organisms and their environment is considered to be a motive force for evolution, which is a new understanding of the mechanism of evolution. Based on this understanding, evolution is regarded as "a changing process in which one subsystem passes all or part of its genetic information to the next generation in a larger system, and during the adaptation process produces some new elements, stops some old ones, and thereby lasts in the larger system". Some other controversial questions related to evolution are also discussed.

  10. Analysis of cardiac signals using spatial filling index and time-frequency domain

    PubMed Central

    Faust, Oliver; Acharya U, Rajendra; Krishnan, SM; Min, Lim Choo

    2004-01-01

    Background: Analysis of heart rate variation (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). HRV analysis is based on the concept that fast fluctuations may specifically reflect changes of sympathetic and vagal activity. It shows that the structure generating the signal is not simply linear, but also involves nonlinear contributions. These signals are essentially non-stationary; they may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random on the time scale. However, studying and pinpointing abnormalities in voluminous data collected over several hours is strenuous and time consuming. Methods: This paper presents the spatial filling index and time-frequency analysis of the heart rate variability signal for disease identification. Renyi's entropy is evaluated for the signal in the Wigner-Ville and Continuous Wavelet Transformation (CWT) domains. Results: Renyi's entropy gives a lower 'p' value for the scalogram than for the Wigner-Ville distribution, and the contours of the scalogram visually show the features of the diseases. Thus, in the time-frequency analysis, Renyi's entropy gives better results for the scalogram than for the Wigner-Ville distribution. Conclusion: The spatial filling index and Renyi's entropy have distinct regions for various diseases, with an accuracy of more than 95%. PMID:15361254

  11. Analysis of the GRNs Inference by Using Tsallis Entropy and a Feature Selection Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Fabrício M.; de Oliveira, Evaldo A.; Cesar, Roberto M.

    An important problem in the bioinformatics field is to understand how genes are regulated and interact through gene networks. This knowledge can be helpful for many applications, such as disease treatment design and drug creation. For this reason, it is very important to uncover the functional relationships among genes and then to construct the gene regulatory network (GRN) from temporal expression data. However, this task usually involves data with a large number of variables and a small number of observations. In this way, there is a strong motivation to use pattern recognition and dimensionality reduction approaches. In particular, feature selection is especially important in order to select the most important predictor genes that can explain some phenomena associated with the target genes. This work presents a first study of the sensitivity of entropy methods to the entropy functional form, applied to the problem of topology recovery of GRNs. The generalized entropy proposed by Tsallis is used to study this sensitivity. The inference process is based on a feature selection approach, which is applied to simulated temporal expression data generated by an artificial gene network (AGN) model. The inferred GRNs are validated in terms of global network measures. Some interesting conclusions can be drawn from the experimental results, as reported for the first time in the present paper.
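
    The Tsallis functional whose form is varied in the study is compact enough to state directly; the sketch below shows its q-dependence and its reduction to the Shannon entropy as q approaches 1 (the example distribution and q values are arbitrary).

```python
import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1); Shannon entropy in the limit q -> 1.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.5, 1.0, 2.0):
    print(q, round(tsallis_entropy(p, q), 4))
```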

  12. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values show large fluctuations under slight variations of the data locations, and show a significant distinction only for different lengths of time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series, the RCMWPE method shows not only the advantages inherited from MWPE but also lower sensitivity to the data locations; it is more stable and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on the daily price return series from Asian and European stock markets. There are significant differences between Asian markets and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied to a variety of fields to quantify the complexity of the systems over multiple scales more accurately.

  13. Supertranslations and Superrotations at the Black Hole Horizon.

    PubMed

    Donnay, Laura; Giribet, Gaston; González, Hernán A; Pino, Miguel

    2016-03-04

    We show that the asymptotic symmetries close to nonextremal black hole horizons are generated by an extension of supertranslations. This group is generated by a semidirect sum of Virasoro and Abelian currents. The charges associated with the asymptotic Killing symmetries satisfy the same algebra. When considering the special case of a stationary black hole, the zero mode charges correspond to the angular momentum and the entropy at the horizon.

  14. Overall heat transfer coefficient and pressure drop in a typical tubular exchanger employing alumina nano-fluid as the tube side hot fluid

    NASA Astrophysics Data System (ADS)

    Kabeel, A. E.; Abdelgaied, Mohamed

    2016-08-01

    Nano-fluids are used to improve the heat transfer rates in heat exchangers, especially the shell-and-tube heat exchanger, which is considered one of the most important types of heat exchangers. In the present study, an experimental loop is constructed to study the thermal characteristics of the shell-and-tube heat exchanger at different concentrations of Al2O3 nonmetallic particles (0.0, 2, 4, and 6 %); these are volume concentrations in pure water as the base fluid. The effects of nano-fluid concentration on the performance of the shell-and-tube heat exchanger have been evaluated based on the overall heat transfer coefficient, the friction factor, the pressure drop on the tube side, and the entropy generation rate. The experimental results show that the highest heat transfer coefficient is obtained at a nano-fluid concentration of 4 % on the shell side. On the shell side, the maximum percentage increase in the overall heat transfer coefficient reached 29.8 % for a nano-fluid concentration of 4 %, relative to the case of the base fluid (water) at the same tube-side Reynolds number. However, on the tube side the maximum relative increase in pressure drop reached 12, 28 and 48 % for nano-material concentrations of 2, 4 and 6 %, respectively, relative to the case without nano-fluid, at an approximate Reynolds number of 56,000. The entropy generation decreases with increasing nonmetallic particle volume fraction at the same flow rates: increasing the volume fraction from 0.0 to 6 % decreases the rate of entropy generation by 10 %.

  15. Entropy, pumped-storage and energy system finance

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Georgios

    2015-04-01

    Pumped-storage holds a key role in integrating renewable energy units with non-renewable fuel plants into large-scale energy systems for electricity output. An emerging issue is the development of financial engineering models with a physical basis to systematically fund energy system efficiency improvements across its operation. A fundamental physically-based economic concept is the Scarcity Rent, which concerns the pricing of a natural resource's scarcity. Specifically, the scarcity rent comprises a fraction of a depleting resource's full price and accumulates to fund its more efficient future use. In an integrated energy system, scarcity rents derive from various resources and can be deposited into a pooled fund to finance the energy system's overall efficiency increase, allowing it to benefit from economies of scale. With pumped-storage incorporated into the system, water is upgraded to a hub resource in which the scarcity rents of all connected energy sources are denominated. However, as the available water for electricity generation or storage is also limited, a scarcity rent is imposed upon it as well. It is suggested that scarcity rent generation is reducible to three (3) main factors, incorporating uncertainty: (1) water's natural renewability, (2) the energy system's intermittent components and (3) base-load prediction deviations from actual loads. For that purpose, the concept of entropy is used in order to measure the energy system's overall uncertainty, and hence the pumped-storage intensity requirements and the generated water scarcity rents. Keywords: pumped-storage, integration, energy systems, financial engineering, physical basis, Scarcity Rent, pooled fund, economies of scale, hub resource, uncertainty, entropy Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)

  16. Studies on pressure-gain combustion engines

    NASA Astrophysics Data System (ADS)

    Matsutomi, Yu

    Various aspects of the pressure-gain combustion engine are investigated analytically and experimentally in the current study. A lumped parameter model is developed to characterize the operation of a valveless pulse detonation engine. The model identified the function of the flame quenching process through the gas dynamic process. By adjusting the fuel manifold pressure and geometries, the duration of the air buffer can be effectively varied. The parametric study with the lumped parameter model has shown that an engine frequency of up to approximately 15 Hz is attainable. However, the required upstream air pressure increases significantly with higher engine frequency. The higher pressure requirement indicates pressure loss in the system and lower overall engine performance. The loss of performance due to the pressure loss is a critical issue for integrated pressure-gain combustors. Two types of transitional methods are examined using entropy-based models. An accumulator-based transition has an obvious loss due to sudden area expansion, but it can be minimized by utilizing the gas dynamics in the combustion tube. An ejector-type transition has the potential to achieve performance beyond the limit specified by a single-flow-path Humphrey cycle. The performance of an ejector was discussed in terms of apparent entropy and mixed-flow entropy. Through an ideal ejector, the apparent part of the entropy increases due to the reduction in flow unsteadiness, but the entropy of the mixed flow remains constant. The method is applied to a CFD simulation with a simple manifold for qualitative evaluation. The operation of the wave rotor constant volume combustion rig is experimentally examined. The rig has shown versatility of operation over a wide range of conditions. Large pressure rises in the rotor channel and in a section of the exhaust duct are observed even with relatively large leakage gaps on the rotor. The simplified analysis indicated that inconsistent combustion is likely due to insufficient fuel near the ignition source. However, it is difficult to draw conclusions about the fuel distribution with the current setup. Additional measurements near the rotor interfaces and better fuel control are required for future tests.

  17. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often this distribution can be characterized by a Gaussian, but just as often it is much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and the conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
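
    For concreteness, the maximum entropy method of moments amounts to choosing p(x) proportional to exp(-sum_k lambda_k x^k) and solving for the Lagrange multipliers that reproduce the prescribed moments. The sketch below does this numerically on a bounded grid by minimizing the convex dual; the grid, the number of moments, and the target moment values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-4, 4, 801)
dx = x[1] - x[0]
target = np.array([0.0, 1.0, 0.0, 2.5])            # prescribed <x>, <x^2>, <x^3>, <x^4>
powers = np.vstack([x ** (k + 1) for k in range(4)])

def dual(lam):
    # Convex dual of the maxent problem for p(x) proportional to exp(-lam . [x, x^2, x^3, x^4]).
    logp = -lam @ powers
    m = logp.max()
    logZ = m + np.log(np.sum(np.exp(logp - m)) * dx)   # log partition function on the grid
    return logZ + lam @ target

lam = minimize(dual, np.zeros(4), method="BFGS").x
p = np.exp(-lam @ powers)
p /= p.sum() * dx                                      # normalize the density
print("recovered moments:", np.round(powers @ p * dx, 3))
```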

  18. Entropy and cosmology.

    NASA Astrophysics Data System (ADS)

    Zucker, M. H.

    This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself, can raise its own temperature and thus, by itself, reverse entropy. The vast encompassing gravitational forces that the universe has at its disposal, forces that dominate the phase of contraction, provide the compacting, compressive mechanism that regenerates heat in an expanded, cooled universe and decreases entropy. And this phenomenon takes place without diminishing or depleting the finite amount of mass/energy with which the universe began. The fact that the universe can reverse the entropic process leads to possibilities previously ignored when assessing which of the three models (open, closed, or flat) most probably represents the future of the universe. After analyzing the models, the conclusion reached here is that the open model is only an expanded version of the closed model and therefore is not open, and the closed model will never collapse to a big crunch and, therefore, is not closed. This leaves a modified model, oscillating forever between limited phases of expansion and contraction (a universe in "dynamic equilibrium"), as the only feasible choice.

  19. Research of MPPT for photovoltaic generation based on two-dimensional cloud model

    NASA Astrophysics Data System (ADS)

    Liu, Shuping; Fan, Wei

    2013-03-01

    The cloud model is a mathematical representation of fuzziness and randomness in linguistic concepts. It represents a qualitative concept with an expected value Ex, entropy En and hyper entropy He, and integrates the fuzziness and randomness of a linguistic concept in a unified way. This model is a new method for the transformation between qualitative and quantitative knowledge. This paper introduces an MPPT (maximum power point tracking) controller based on a two-dimensional cloud model, developed through an analysis of the auto-optimization MPPT control of a photovoltaic power system combined with cloud model theory. Simulation results show that the cloud controller is simple, intuitive, strongly robust, and has better control performance.
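
    The (Ex, En, He) representation mentioned above is usually realized with a forward normal cloud generator: the entropy En is perturbed by the hyper entropy He, and each cloud drop gets a position and a membership degree. A minimal sketch of that generic generator (parameter values and drop count are arbitrary; this is not the paper's MPPT controller):

```python
import numpy as np

def normal_cloud(Ex, En, He, n_drops=1000, seed=0):
    # Forward normal cloud generator: returns drop positions x_i and membership degrees mu_i.
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, size=n_drops)     # entropy perturbed by the hyper entropy
    x = rng.normal(Ex, np.abs(En_prime))            # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))
    return x, mu

x, mu = normal_cloud(Ex=0.0, En=1.0, He=0.1)
print(f"mean drop = {x.mean():.3f}, mean membership = {mu.mean():.3f}")
```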

  20. Unitary n -designs via random quenches in atomic Hubbard and spin models: Application to the measurement of Rényi entropies

    NASA Astrophysics Data System (ADS)

    Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n -designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.

  1. MHD natural convection and entropy generation in an open cavity having different horizontal porous blocks saturated with a ferrofluid

    NASA Astrophysics Data System (ADS)

    Gibanov, Nikita S.; Sheremet, Mikhail A.; Oztop, Hakan F.; Al-Salem, Khaled

    2018-04-01

    In this study, natural convection combined with entropy generation of Fe3O4-water nanofluid within a square open cavity filled with two different porous blocks under the influence of uniform horizontal magnetic field is numerically studied. Porous blocks of different thermal properties, permeability and porosity are located on the bottom wall. The bottom wall of the cavity is kept at hot temperature Th, while upper open boundary is at constant cold temperature Tc and other walls of the cavity are supposed to be adiabatic. Governing equations with corresponding boundary conditions formulated in dimensionless stream function and vorticity using Brinkman-extended Darcy model for porous blocks have been solved numerically using finite difference method. Numerical analysis has been carried out for wide ranges of Hartmann number, nanoparticles volume fraction and length of the porous blocks. It has been found that an addition of spherical ferric oxide nanoparticles can order the flow structures inside the cavity.

  2. System Mass Variation and Entropy Generation in 100-kWe Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Reid, Bryan M.

    2004-01-01

    State-of-the-art closed-Brayton-cycle (CBC) space power systems were modeled to study performance trends in a trade space characteristic of interplanetary orbiters. For working-fluid molar masses of 48.6, 39.9, and 11.9 kg/kmol, peak system pressures of 1.38 and 3.0 MPa and compressor pressure ratios ranging from 1.6 to 2.4, total system masses were estimated. System mass increased as peak operating pressure increased for all compressor pressure ratios and molar mass values examined. Minimum mass point comparison between 72 percent He at 1.38 MPa peak and 94 percent He at 3.0 MPa peak showed an increase in system mass of 14 percent. Converter flow loop entropy generation rates were calculated for 1.38 and 3.0 MPa peak pressure cases. Physical system behavior was approximated using a pedigreed NASA Glenn modeling code, Closed Cycle Engine Program (CCEP), which included realistic performance prediction for heat exchangers, radiators and turbomachinery.

  3. System Mass Variation and Entropy Generation in 100-kWe Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Reid, Bryan M.

    2004-01-01

    State-of-the-art closed-Brayton-cycle (CBC) space power systems were modeled to study performance trends in a trade space characteristic of interplanetary orbiters. For working-fluid molar masses of 48.6, 39.9, and 11.9 kg/kmol, peak system pressures of 1.38 and 3.0 MPa and compressor pressure ratios ranging from 1.6 to 2.4, total system masses were estimated. System mass increased as peak operating pressure increased for all compressor pressure ratios and molar mass values examined. Minimum mass point comparison between 72 percent He at 1.38 MPa peak and 94 percent He at 3.0 MPa peak showed an increase in system mass of 14 percent. Converter flow loop entropy generation rates were calculated for 1.38 and 3.0 MPa peak pressure cases. Physical system behavior was approximated using a pedigreed NASA Glenn modeling code, Closed Cycle Engine Program (CCEP), which included realistic performance prediction for heat exchangers, radiators and turbomachinery.

  4. First steps towards a constructal Microbial Fuel Cell.

    PubMed

    Lepage, Guillaume; Perrier, Gérard; Ramousse, Julien; Merlin, Gérard

    2014-06-01

    In order to reach real operating conditions with consequent organic charge flow, a multi-channel reactor for Microbial Fuel Cells is designed. The feed-through double chamber reactor is a two-dimensional system with four parallel channels and Reticulated Vitreous Carbon as electrodes. Based on thermodynamical calculations, the constructal-inspired distributor is optimized with the aim to reduce entropy generation along the distributing path. In the case of negligible singular pressure drops, the Hess-Murray law links the lengths and the hydraulic diameters of the successive reducing ducts leading to one given working channel. The determination of generated entropy in the channels of our constructal MFC is based on the global hydraulic resistance caused by both regular and singular pressure drops. Polarization, power and Electrochemical Impedance Spectroscopy show the robustness and the efficiency of the cell, and therefore the potential of the constructal approach. Routes towards improvements are suggested in terms of design evolutions. Copyright © 2014 Elsevier Ltd. All rights reserved.
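
    The Hess-Murray relation invoked for the distributor states that, at a bifurcation designed for minimum dissipation, the cube of the parent duct diameter equals the sum of the cubes of the daughter diameters; for a symmetric split each daughter diameter is the parent diameter divided by the cube root of two. A two-line numeric check (the 8 mm parent diameter is purely illustrative):

```python
# Hess-Murray law for a symmetric bifurcation: d_parent**3 = 2 * d_daughter**3,
# i.e. each daughter duct has diameter d_parent / 2**(1/3).
d_parent = 8.0                                 # mm, illustrative
d_daughter = d_parent / 2 ** (1 / 3)
print(f"daughter diameter = {d_daughter:.2f} mm")            # ~6.35 mm
print(f"cube check: {2 * d_daughter**3:.1f} vs {d_parent**3:.1f}")
```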

  5. Enhanced asymptotic BMS3 algebra of the flat spacetime solutions of generalized minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Setare, M. R.; Adami, H.

    2018-01-01

    We apply the new fall-off conditions presented in the paper [1] to asymptotically flat spacetime solutions of Chern-Simons-like theories of gravity. We show that the considered fall-off conditions asymptotically solve the equations of motion of generalized minimal massive gravity. We demonstrate that there exist two types of solutions: one is trivial and the others are non-trivial. By looking at the non-trivial solutions, we find that for asymptotically flat spacetimes in generalized minimal massive gravity, in contrast to Einstein gravity, the cosmological parameter can be non-zero. We obtain the conserved charges of the asymptotically flat spacetimes in generalized minimal massive gravity, and by introducing Fourier modes we show that the asymptotic symmetry algebra is a semidirect product of a BMS3 algebra and two U (1) current algebras. We also verify that the BMS3 algebra can be obtained by a contraction of the AdS3 asymptotic symmetry algebra when the AdS3 radius tends to infinity in the flat-space limit. Finally we find the energy, angular momentum and entropy for a particular case and deduce that these quantities satisfy the first law of flat space cosmologies.

  6. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    NASA Astrophysics Data System (ADS)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.

  7. Using Maximum Entropy to Find Patterns in Genomes

    NASA Astrophysics Data System (ADS)

    Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs, which will lead to a better understanding of biological processes. National Institute of General Medical Science, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher Scholar Award.

  8. Whole-Lesion Apparent Diffusion Coefficient-Based Entropy-Related Parameters for Characterizing Cervical Cancers: Initial Findings.

    PubMed

    Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-12-01

    This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm2 prospectively. ADC-based entropy-related parameters including first-order entropy and second-order entropies were derived from the whole tumor volume as well as adjacent normal cervical tissues. Intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean were significantly higher, whereas entropy(H)range and entropy(H)std were significantly lower in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). The Kruskal-Wallis test showed that there were no significant differences among the values of the various second-order entropies including entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean. All second-order entropies had a larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean had the same largest area under the receiver operating characteristic curve of 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully, which showed initial potential in characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  9. Chemical library subset selection algorithms: a unified derivation using spatial statistics.

    PubMed

    Hamprecht, Fred A; Thiel, Walter; van Gunsteren, Wilfred F

    2002-01-01

    If similar compounds have similar activity, rational subset selection becomes superior to random selection in screening for pharmacological lead discovery programs. Traditional approaches to this experimental design problem fall into two classes: (i) a linear or quadratic response function is assumed; (ii) some space-filling criterion is optimized. The assumptions underlying the first approach are clear but not always defendable; the second approach yields more intuitive designs but lacks a clear theoretical foundation. We model activity in a bioassay as the realization of a stochastic process and use the best linear unbiased estimator to construct spatial sampling designs that optimize the integrated mean square prediction error, the maximum mean square prediction error, or the entropy. We argue that our approach constitutes a unifying framework encompassing most proposed techniques as limiting cases and sheds light on their underlying assumptions. In particular, vector quantization is obtained, in dimensions up to eight, in the limiting case of very smooth response surfaces for the integrated mean square error criterion. Closest packing is obtained for very rough surfaces under the integrated mean square error and entropy criteria. We suggest using either the integrated mean square prediction error or the entropy as optimization criteria rather than approximations thereof, and propose a scheme for direct iterative minimization of the integrated mean square prediction error. Finally, we discuss how the quality of chemical descriptors manifests itself and clarify the assumptions underlying the selection of diverse or representative subsets.

  10. An information-theoretical perspective on weighted ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Weijs, Steven V.; van de Giesen, Nick

    2013-08-01

    This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
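
    A minimal sketch of the minimum-relative-entropy idea under simplifying assumptions (uniform prior weights and a single constraint on the ensemble mean; function names and the bracketing interval are illustrative, not from the paper): minimizing relative entropy subject to a mean constraint yields exponentially tilted weights, with the multiplier fixed by root-finding.

```python
import numpy as np
from scipy.optimize import brentq

def mre_weights(x, target_mean):
    """Minimum-relative-entropy update of ensemble weights.

    Starting from uniform weights over n members, minimize the relative
    entropy sum_i w_i * log(n * w_i) subject to sum_i w_i = 1 and
    sum_i w_i * x_i = target_mean.  The solution is exponential tilting,
    w_i proportional to exp(lambda * x_i), with lambda set by the constraint.
    """
    x = np.asarray(x, dtype=float)

    def tilted(lam):
        z = lam * x
        w = np.exp(z - z.max())            # shift for numerical stability
        return w / w.sum()

    def gap(lam):
        return tilted(lam) @ x - target_mean

    lam = brentq(gap, -50.0, 50.0)         # bracket is an illustrative choice
    return tilted(lam)

# toy example: raise the mean of a 20-member "climatological" ensemble
rng = np.random.default_rng(2)
members = rng.normal(100.0, 20.0, size=20)          # e.g. seasonal totals
w = mre_weights(members, target_mean=members.mean() + 10.0)
print(np.round(w, 3), float(w @ members))
```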

  11. Apparent diffusion coefficient histogram shape analysis for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2016-10-22

    To explore the role of apparent diffusion coefficient (ADC) histogram shape-related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion-weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of the 2nd and 4th weeks during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape-related parameters including skewness, kurtosis, s-sDav, width, standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed high early decline rates (43.10% and 48.29%) at the end of the 2nd week of CCRT. All entropies kept decreasing significantly from 2 weeks after CCRT initiation onward. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.
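
    A rough sketch (not the study's software) of a few of the histogram shape parameters named above, computed from whole-lesion ADC values; the bin count, the definition of "width" as max minus min, and the synthetic data are assumptions made for illustration.

```python
import numpy as np
from scipy import stats

def histogram_shape_features(adc_values, n_bins=64):
    """A few ADC-histogram shape parameters used for response monitoring."""
    adc = np.asarray(adc_values, dtype=float)
    counts, _ = np.histogram(adc, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return {
        "skewness": stats.skew(adc),
        "kurtosis": stats.kurtosis(adc),          # excess kurtosis
        "width": adc.max() - adc.min(),           # assumed definition
        "std": adc.std(ddof=1),
        "entropy": -np.sum(p * np.log2(p)),       # first-order entropy, bits
    }

# tracking a hypothetical lesion before and during CCRT
rng = np.random.default_rng(3)
before = rng.gamma(shape=2.0, scale=0.4, size=4000)   # skewed, low ADC
week2  = rng.normal(1.4, 0.3, size=4000)              # higher, more symmetric
print(histogram_shape_features(before))
print(histogram_shape_features(week2))
```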

  12. On quantum Rényi entropies: A new generalization and some properties

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco

    2013-12-01

    The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
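
    For a single state with no conditioning system, the quantum Rényi entropies reduce to the classical Rényi entropy of the eigenvalue spectrum; the following sketch (illustrative, not from the paper) evaluates that special case and recovers the von Neumann, collision, and (approximately) min-entropy limits.

```python
import numpy as np

def quantum_renyi_entropy(rho, alpha):
    """Rényi entropy H_alpha(rho) = log2(Tr rho^alpha) / (1 - alpha), in bits.

    Computed from the eigenvalue spectrum of the density matrix rho;
    alpha -> 1 recovers the von Neumann entropy.
    """
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                     # drop numerical zeros
    if np.isclose(alpha, 1.0):
        return float(-np.sum(evals * np.log2(evals)))
    return float(np.log2(np.sum(evals ** alpha)) / (1.0 - alpha))

# maximally mixed qubit: every Rényi entropy equals 1 bit
rho = np.eye(2) / 2
for a in (0.5, 1.0, 2.0, 100.0):   # max-, von Neumann-, collision-, ~min-entropy
    print(a, quantum_renyi_entropy(rho, a))
```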

  13. Precipitation behavior of AlxCoCrFeNi high entropy alloys under ion irradiation

    NASA Astrophysics Data System (ADS)

    Yang, Tengfei; Xia, Songqin; Liu, Shi; Wang, Chenxu; Liu, Shaoshuai; Fang, Yuan; Zhang, Yong; Xue, Jianming; Yan, Sha; Wang, Yugang

    2016-08-01

    Materials performance is central to the satisfactory operation of current and future nuclear energy systems due to the severe irradiation environment in reactors. Searching for structural materials with excellent irradiation tolerance is crucial for developing next-generation nuclear reactors. Here, we report the irradiation responses of a novel multi-component alloy system, the high entropy alloy (HEA) AlxCoCrFeNi (x = 0.1, 0.75 and 1.5), focusing on precipitation behavior. It is found that the single-phase system, Al0.1CoCrFeNi, exhibits great phase stability against ion irradiation: no precipitate is observed even at the highest fluence. In contrast, numerous coherent precipitates are present in both multi-phase HEAs. Based on irradiation-induced/enhanced precipitation theory, the excellent structural stability against precipitation of Al0.1CoCrFeNi is attributed to its high configurational entropy and low atomic diffusion, which reduce the thermodynamic driving force and kinetically restrain precipitate formation, respectively. For the multi-phase HEAs, phase separation and the formation of ordered phases reduce the system's configurational entropy, resulting in precipitation behavior similar to that of corresponding binary or ternary conventional alloys. This study demonstrates the structural stability of single-phase HEAs under irradiation and provides important implications for the search for HEAs with higher irradiation tolerance.
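
    As a back-of-envelope illustration of the configurational-entropy argument (the standard ideal-mixing formula, not a calculation from the paper), the ideal random-solution entropy of AlxCoCrFeNi varies only mildly across the three compositions studied and peaks near the equiatomic composition; real multi-phase alloys fall below these ideal values once ordered phases form.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def ideal_config_entropy(x_al):
    """Ideal-mixing configurational entropy of Al_x CoCrFeNi, in units of R.

    Mole fractions: Al = x/(4+x); Co, Cr, Fe, Ni = 1/(4+x) each.
    """
    total = 4.0 + x_al
    fracs = np.array([x_al] + [1.0] * 4) / total
    fracs = fracs[fracs > 0]
    return float(-np.sum(fracs * np.log(fracs)))

for x in (0.1, 0.75, 1.5):
    s = ideal_config_entropy(x)
    print(f"x = {x:4.2f}:  S_conf = {s:.3f} R = {s * R:.2f} J/(mol K)")
```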

  14. Noise reduction algorithm with the soft thresholding based on the Shannon entropy and bone-conduction speech cross- correlation bands.

    PubMed

    Na, Sung Dae; Wei, Qun; Seong, Ki Woong; Cho, Jin Ho; Kim, Myoung Nam

    2018-01-01

    The conventional methods of speech enhancement, noise reduction, and voice activity detection are based on the suppression of noise or non-speech components of the target air-conduction signals. However, air-conducted speech is hard to differentiate from babble or white noise signals. To overcome this problem, the proposed algorithm uses bone-conduction speech signals and soft thresholding based on the Shannon entropy principle and the cross-correlation of air- and bone-conduction signals. A new algorithm for speech detection and noise reduction is proposed, which makes use of the Shannon entropy principle and cross-correlation with the bone-conduction speech signals to threshold the wavelet packet coefficients of the noisy speech. Each threshold is generated by the entropy and cross-correlation approaches in the decomposed bands obtained by wavelet packet decomposition. The method was evaluated in MATLAB simulations with objective quality measures (PESQ, RMSE, correlation, and SNR), which show that it reduces noise. To verify the method's feasibility, we compared the air- and bone-conduction speech signals and their spectra processed by the proposed method. The results confirm the high performance of the proposed method, making it a promising candidate for future applications in communication devices, noisy environments, construction, and military operations.
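
    A minimal sketch of the soft-thresholding step itself; in the paper the per-band thresholds are derived from the Shannon entropy and the air/bone cross-correlation, whereas here a fixed illustrative threshold stands in.

```python
import numpy as np

def soft_threshold(coeffs, threshold):
    """Soft thresholding: shrink wavelet-packet coefficients toward zero.

    Coefficients with |c| <= threshold are set to zero; larger ones are
    shrunk by the threshold, suppressing low-level noise while keeping
    the shape of strong speech components.
    """
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)

# toy example: a noisy coefficient band and a band-specific threshold
rng = np.random.default_rng(4)
band = np.array([3.0, -0.2, 0.1, -2.5, 0.05]) + rng.normal(0, 0.05, 5)
print(soft_threshold(band, threshold=0.3))
```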

  15. Cross-entropy clustering framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Tongal, Hakan; Sivakumar, Bellie

    2017-09-01

    There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
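
    An illustrative sketch (not the authors' code) of one of the three features named above, the sample entropy SampEn(m, r) of a streamflow-like series, using the common convention r = 0.2 times the series standard deviation; the vectorized pairwise distances keep the code short but limit it to modest series lengths.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (e.g. daily streamflow).

    Returns -ln(A / B), where B counts template matches of length m and
    A counts matches of length m + 1 (self-matches excluded).
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std(ddof=0)

    def count_matches(length):
        # same number of templates (len(x) - m) for both lengths, by convention
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        n = len(templates)
        return (np.sum(d <= r) - n) / 2           # exclude self-matches

    B = count_matches(m)
    A = count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

# a noisy series is less regular (higher SampEn) than a smooth seasonal one
t = np.arange(730)
smooth = np.sin(2 * np.pi * t / 365)
noisy = smooth + np.random.default_rng(5).normal(0, 0.5, t.size)
print(sample_entropy(smooth), sample_entropy(noisy))
```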

  16. Upper entropy axioms and lower entropy axioms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi

    2015-04-15

    The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.
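
    For reference, the standard definitions of two of the entropy families named above (textbook forms, not results from the paper):

```latex
\begin{align}
  H(p)   &= -\sum_{i=1}^{n} p_i \ln p_i
           && \text{(Shannon)} \\
  S_q(p) &= \frac{1}{q-1}\Bigl(1 - \sum_{i=1}^{n} p_i^{\,q}\Bigr)
           && \text{(Tsallis; } S_q \to H \text{ as } q \to 1\text{)}
\end{align}
```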

  17. Percolation

    NASA Astrophysics Data System (ADS)

    Dávila, Alán; Escudero, Christian; López, Jorge, Dr.

    2004-10-01

    Several methods have been developed to study phase transitions in nuclear fragmentation. The one used in this research is percolation. This method allows us to fit the resulting data to heavy-ion collision experiments. In systems such as atomic nuclei or molecules, energy is put into the system and the particles move away from each other until their links are broken; some particles will still be linked. The fragments' distribution is found to be a power law, so we are witnessing a critical phenomenon. In our model the particles are represented as occupied sites in a cubic array. Each particle has a bond to each of its 6 neighbors. Each bond is active if the two particles are linked, or inactive if they are not. When two or more particles are linked, a fragment is formed. The probability for a specific link to be broken cannot be calculated, so the probability for a bond to be active is used as the parameter when fitting the data. For a given probability p, several arrays are generated and the fragments are counted. The fragments' distribution is then fitted to a power law. The probability that generates the best fit is the critical probability that indicates a phase transition. The best fit is found by seeking the fragments' distribution that gives the minimal chi-squared when compared to a power law. As additional evidence of criticality, the entropy and normalized variance of the mass are also calculated for each probability.
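
    A minimal sketch of the bond-percolation model described above, under illustrative assumptions (lattice size, seed, and the union-find merge are implementation choices, not from the study; the quoted threshold p_c ≈ 0.249 is the standard literature value for the simple cubic lattice).

```python
import numpy as np
from collections import Counter

def bond_percolation_fragments(L=10, p=0.25, seed=0):
    """Fragment-size distribution for bond percolation on an L x L x L lattice.

    Every site is occupied; each of the three 'forward' bonds per site
    (+x, +y, +z) is active with probability p.  Linked sites are merged
    with a union-find structure and the resulting fragment sizes counted.
    """
    rng = np.random.default_rng(seed)
    n = L ** 3
    parent = np.arange(n)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    def idx(x, y, z):
        return (x * L + y) * L + z

    for x in range(L):
        for y in range(L):
            for z in range(L):
                i = idx(x, y, z)
                if x + 1 < L and rng.random() < p:
                    union(i, idx(x + 1, y, z))
                if y + 1 < L and rng.random() < p:
                    union(i, idx(x, y + 1, z))
                if z + 1 < L and rng.random() < p:
                    union(i, idx(x, y, z + 1))

    sizes = Counter(find(i) for i in range(n))
    return Counter(sizes.values())            # {fragment size: multiplicity}

# near the bond-percolation threshold of the simple cubic lattice (p_c ~ 0.249)
print(bond_percolation_fragments(L=10, p=0.25))
```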

  18. Moderate point: Balanced entropy and enthalpy contributions in soft matter

    NASA Astrophysics Data System (ADS)

    He, Baoji; Wang, Yanting

    2017-03-01

    Various soft materials share some common features, such as significant entropic effect, large fluctuations, sensitivity to thermodynamic conditions, and mesoscopic characteristic spatial and temporal scales. However, no quantitative definitions have yet been provided for soft matter, and the intrinsic mechanisms leading to their common features are unclear. In this work, from the viewpoint of statistical mechanics, we show that soft matter works in the vicinity of a specific thermodynamic state named moderate point, at which entropy and enthalpy contributions among substates along a certain order parameter are well balanced or have a minimal difference. Around the moderate point, the order parameter fluctuation, the associated response function, and the spatial correlation length maximize, which explains the large fluctuation, the sensitivity to thermodynamic conditions, and mesoscopic spatial and temporal scales of soft matter, respectively. Possible applications to switching chemical bonds or allosteric biomachines determining their best working temperatures are also briefly discussed. Project supported by the National Basic Research Program of China (Grant No. 2013CB932804) and the National Natural Science Foundation of China (Grant Nos. 11274319 and 11421063).

  19. Entropy Splitting for High Order Numerical Simulation of Vortex Sound at Low Mach Numbers

    NASA Technical Reports Server (NTRS)

    Mueller, B.; Yee, H. C.; Mansour, Nagi (Technical Monitor)

    2001-01-01

    A method of minimizing numerical errors and improving the nonlinear stability and accuracy associated with low Mach number computational aeroacoustics (CAA) is proposed. The method consists of two levels. At the governing equation level, we condition the Euler equations in two steps. The first step is to split the inviscid flux derivatives into a conservative and a non-conservative portion that satisfies a so-called generalized energy estimate. This involves the symmetrization of the Euler equations via a transformation of variables that are functions of the physical entropy. Owing to the large disparity of acoustic and stagnation quantities in low Mach number aeroacoustics, the second step is to reformulate the split Euler equations in perturbation form with the new unknowns as the small changes of the conservative variables with respect to their large stagnation values. At the numerical scheme level, a stable sixth-order central interior scheme with third-order boundary schemes that satisfies the discrete analogue of the integration-by-parts procedure used in the continuous energy estimate (summation-by-parts property) is employed.

  20. Functional organization of mitotic microtubules. Physical chemistry of the in vivo equilibrium system.

    PubMed Central

    Inoué, S; Fuseler, J; Salmon, E D; Ellis, G W

    1975-01-01

    Equilibrium between mitotic microtubules and tubulin is analyzed, using birefringence of the mitotic spindle to measure microtubule concentration in vivo. A newly designed temperature-controlled slide and a miniature, thermostated hydrostatic pressure chamber permit rapid alteration of temperature and of pressure. Stress birefringence of the windows is minimized, and a system for rapid recording of compensation is incorporated, so that birefringence can be measured to 0.1 nm retardation every few seconds. Both temperature and pressure data yield thermodynamic values (ΔH ≈ 35 kcal/mol, ΔS ≈ 120 entropy units [eu], ΔV ≈ 400 ml/mol of subunit polymerized) consistent with the explanation that polymerization of tubulin is entropy driven and mediated by hydrophobic interactions. Kinetic data suggest pseudo-zero-order polymerization and depolymerization following rapid temperature shifts, and a pseudo-first-order depolymerization during anaphase at constant temperature. The equilibrium properties of the in vivo mitotic microtubules are compared with properties of isolated brain tubules. PMID:1139037
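
    A back-of-envelope check (not from the paper) that the reported values imply entropy-driven assembly: with ΔH ≈ +35 kcal/mol and ΔS ≈ 120 cal/(mol·K), ΔG = ΔH - TΔS turns negative only at warmer temperatures, consistent with cold-induced depolymerization.

```python
# Back-of-envelope check of the reported thermodynamic values:
# polymerization is entropy driven when the T*dS term outweighs a
# positive dH, giving dG < 0 near physiological temperature.
dH = 35.0e3          # cal/mol      (reported ~35 kcal/mol)
dS = 120.0           # cal/(mol K)  (reported ~120 entropy units)
for T in (278.0, 293.0, 310.0):                 # 5, 20, 37 degrees C
    dG = dH - T * dS
    print(f"T = {T:5.1f} K:  dG = {dG / 1000:+.2f} kcal/mol")
```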
