Entropy-Based Search Algorithm for Experimental Design
NASA Astrophysics Data System (ADS)
Malakar, N. K.; Knuth, K. H.
2011-03-01
The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data, whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by the Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute-force search methods are slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. This algorithm is inspired by Skilling's nested sampling algorithm used in inference and borrows the concept of a rising threshold applied while a set of experiment samples is maintained. We demonstrate that this algorithm not only selects highly relevant experiments but is also more efficient than brute-force search. Such entropic search techniques promise to greatly benefit autonomous experimental design.
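The selection criterion above, scoring each candidate experiment by the Shannon entropy of the outcome distribution predicted by the probable models and taking the maximizer, can be sketched as the brute-force baseline the paper improves upon; nested entropy sampling replaces this exhaustive scan with a threshold-based sampler. The function names and the toy threshold model below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def most_informative_experiment(models, experiments, predict):
    """Score each candidate experiment by the entropy of the outcome
    distribution predicted by the set of probable models, and return
    the highest-scoring experiment (brute-force baseline)."""
    best, best_h = None, -np.inf
    for e in experiments:
        # pool the outcome predicted by each model into a histogram
        outcomes = np.array([predict(m, e) for m in models])
        _, counts = np.unique(outcomes, return_counts=True)
        h = shannon_entropy(counts / counts.sum())
        if h > best_h:
            best, best_h = e, h
    return best, best_h
```

With models disagreeing most about a mid-range measurement, the mid-range experiment wins: experiments whose outcome every model already agrees on carry zero predicted-outcome entropy and are therefore uninformative.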
A Discrete Constraint for Entropy Conservation and Sound Waves in Cloud-Resolving Modeling
NASA Technical Reports Server (NTRS)
Zeng, Xi-Ping; Tao, Wei-Kuo; Simpson, Joanne
2003-01-01
Ideal cloud-resolving models accumulate little error. When their domain is large enough to accommodate synoptic large-scale circulations, they can be used to simulate the interaction between convective clouds and the large-scale circulations. This paper sets up a framework for such models, using moist entropy as a prognostic variable and employing conservative numerical schemes. The models possess no accumulative errors of thermodynamic variables when they comply with a discrete constraint on entropy conservation and sound waves. In other words, the discrete constraint is related to the correct representation of the large-scale convergence and advection of moist entropy. Since air density is involved in entropy conservation and sound waves, the challenge is how to compute sound waves efficiently under the constraint. To address the challenge, a compensation method is introduced on the basis of a reference isothermal atmosphere whose governing equations are solved analytically. Stability analysis and numerical experiments show that the method allows the models to integrate efficiently with a large time step.
The Law of Entropy Increase - A Lab Experiment
NASA Astrophysics Data System (ADS)
Dittrich, William; Drosd, Robert; Minkin, Leonid; Shapovalov, Alexander S.
2016-09-01
The second law of thermodynamics has various formulations. There is the "Clausius formulation," which can be stated in a very intuitive way: "No process is possible whose sole result is the transfer of heat from a cooler to a hotter body." There is also the "Kelvin-Planck principle," which states that "no cyclic process exists whose sole result is the absorption of heat from a reservoir and the conversion of all this heat into work" [emphasis added] (since this would require perfect energy conversion efficiency). Both these statements can be presented to physics students in a conceptual manner, and students' "everyday" experiences will support either statement of the second law of thermodynamics. However, when the second law of thermodynamics is expressed using the concept of entropy (ΔS ≥ 0, for a closed system), most first-year physics students lack any direct experimental experience with this parameter. This paper describes a calculation of the increase in entropy that can be performed while completing three traditional thermodynamics experiments. These simple and quick calculations help students become familiar and comfortable with the concept of entropy. This paper is complementary to prior work where classroom activities were developed to provide insight into the statistical nature of entropy.
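The entropy bookkeeping for one such traditional experiment, mixing hot and cold water, reduces to terms of the form ΔS = mc·ln(Tf/Ti) for each body. A minimal sketch with illustrative lab values (the masses and temperatures are assumptions, not taken from the paper):

```python
import math

def delta_S_heating(m, c, T_i, T_f):
    """Entropy change (J/K) of a body of mass m (kg) and specific heat c
    (J/(kg*K)) brought reversibly from temperature T_i to T_f (K)."""
    return m * c * math.log(T_f / T_i)

# Illustrative numbers: mix equal masses of hot and cold water.
c_water = 4186.0            # J/(kg*K)
m = 0.1                     # kg of each sample
T_hot, T_cold = 353.15, 293.15
T_f = (T_hot + T_cold) / 2  # equal masses, equal specific heats

dS = delta_S_heating(m, c_water, T_hot, T_f) + \
     delta_S_heating(m, c_water, T_cold, T_f)
# The hot water loses less entropy than the cold water gains,
# so the total dS is positive, as the second law requires.
```

The sign of the net change is the pedagogical point: the logarithms nearly cancel, but the gain of the cold body always exceeds the loss of the hot body.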
Fault Diagnosis for Micro-Gas Turbine Engine Sensors via Wavelet Entropy
Yu, Bing; Liu, Dongdong; Zhang, Tianhong
2011-01-01
Sensor fault diagnosis is necessary to ensure the normal operation of a gas turbine system. However, the existing methods require too many resources, a need that cannot always be met. Since the sensor readings are directly affected by the sensor state, sensor fault diagnosis can be performed by extracting features of the measured signals. This paper proposes a novel fault diagnosis method for sensors based on wavelet entropy. Based on wavelet theory, wavelet decomposition is utilized to decompose the signal into different scales. Then the instantaneous wavelet energy entropy (IWEE) and instantaneous wavelet singular entropy (IWSE) are defined based on the previous wavelet entropy theory. Subsequently, a fault diagnosis method for gas turbine sensors is proposed based on the results of a numerically simulated example. Then, experiments on this method are carried out on a real micro gas turbine engine. In the experiment, four types of faults with different magnitudes are presented. The experimental results show that the proposed method for sensor fault diagnosis is efficient. PMID:22163734
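A numpy-only sketch of the wavelet-entropy feature: decompose a signal with the Haar wavelet, form the relative energy per scale, and take the Shannon entropy of that energy distribution. This illustrates the general energy-entropy idea only; the paper's IWEE and IWSE are instantaneous (windowed) variants, and the wavelet choice and decomposition level here are assumptions:

```python
import numpy as np

def haar_decompose(x, levels):
    """One-dimensional Haar wavelet decomposition: returns the detail
    coefficients at each scale plus the final approximation."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a2 = a[: len(a) // 2 * 2].reshape(-1, 2)
        details.append((a2[:, 0] - a2[:, 1]) / np.sqrt(2))
        a = (a2[:, 0] + a2[:, 1]) / np.sqrt(2)
    return details, a

def wavelet_energy_entropy(x, levels=4):
    """Shannon entropy (nats) of the relative wavelet energies across
    scales; a sketch of the energy-entropy feature, not the paper's
    exact IWEE/IWSE definitions."""
    details, approx = haar_decompose(x, levels)
    energies = np.array([np.sum(d**2) for d in details] + [np.sum(approx**2)])
    p = energies / energies.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```

A healthy, smooth sensor signal concentrates energy in few scales (low entropy), while a fault that injects broadband disturbance spreads energy across scales (higher entropy), which is what makes the feature useful for diagnosis.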
Entropy in DNA Double-Strand Break, Detection and Signaling
NASA Astrophysics Data System (ADS)
Zhang, Yang; Schindler, Christina; Heermann, Dieter
2014-03-01
In biology, the term entropy is often understood as a measure of disorder, a restrictive interpretation that can even be misleading. Recently it has become increasingly clear that entropy, contrary to conventional wisdom, can help to order and guide biological processes in living cells. DNA double-strand breaks (DSBs) are among the most dangerous lesions, and efficient damage detection and repair are essential for organism viability. However, what remains unknown is the precise mechanism of targeting the site of damage within billions of intact nucleotides and a crowded nuclear environment, a process often referred to as recruitment or signaling. Here we show that the change in entropy associated with inflicting a DSB facilitates the recruitment of damage sensor proteins. By means of computational modeling we found that higher mobility and local chromatin structure accelerate protein association at DSB ends. We compared the effect of different chromatin architectures on protein dynamics and concentrations in the vicinity of DSBs, and related these results to experiments on repair in heterochromatin. Our results demonstrate how entropy contributes to more efficient damage detection. We identify entropy as the physical basis for DNA double-strand break signaling.
Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.
Ferrari, Alberto
2017-01-01
Shannon entropy is being increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns and animal vocalizations. Yet, distributional properties of information entropy as a random variable have seldom been the object of study, leading to researchers mainly using linear models or simulation-based analytical approach to assess differences in information content, when entropy is measured repeatedly in different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results coming from studies in the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a data set coming from a real experiment on animal communication.
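The estimation problem the paper targets can be seen in a few lines: the naive plug-in entropy of observed counts versus a posterior-mean entropy under a symmetric Dirichlet prior, here approximated by Monte Carlo. This is only a sketch of the Bayesian ingredients the model builds on, not the proposed Dirichlet-multinomial regression itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(counts):
    """Naive maximum-likelihood (plug-in) entropy estimate in nats."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def dirichlet_posterior_entropy(counts, alpha=1.0, n_draws=2000):
    """Posterior-mean entropy under a symmetric Dirichlet(alpha) prior,
    approximated by averaging the entropy of posterior draws."""
    draws = rng.dirichlet(counts + alpha, size=n_draws)
    draws = np.clip(draws, 1e-12, None)
    return np.mean(-np.sum(draws * np.log(draws), axis=1))
```

With few observations the plug-in estimate is a biased function of the counts; averaging the entropy over the Dirichlet posterior propagates the counting uncertainty into the estimate, which is the distributional handle the regression model exploits when entropy is measured repeatedly across conditions.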
Haseli, Y
2016-05-01
The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, Novikov's engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, maximum thermal efficiency, and maximum power may become equivalent at the condition of fixed heat input.
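Two of the operating points compared here have standard closed forms for the Curzon-Ahlborn engine: the efficiency at maximum power is 1 − sqrt(Tc/Th), which always lies below the Carnot (maximum-efficiency) limit 1 − Tc/Th. A minimal sketch with illustrative reservoir temperatures:

```python
import math

def carnot_efficiency(T_hot, T_cold):
    """Reversible (maximum) thermal efficiency between two reservoirs."""
    return 1.0 - T_cold / T_hot

def curzon_ahlborn_efficiency(T_hot, T_cold):
    """Efficiency of an endoreversible engine operated at maximum
    power output (Curzon-Ahlborn): eta = 1 - sqrt(Tc/Th)."""
    return 1.0 - math.sqrt(T_cold / T_hot)

T_hot, T_cold = 600.0, 300.0   # illustrative reservoir temperatures (K)
eta_c = carnot_efficiency(T_hot, T_cold)
eta_ca = curzon_ahlborn_efficiency(T_hot, T_cold)
```

The gap between the two values is the price of producing finite power; the minimum-entropy-production regime studied in the paper sits between these extremes and, under fixed heat input, can coincide with them.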
Sensitivity of Tropical Cyclone Spinup Time to the Initial Entropy Deficit
NASA Astrophysics Data System (ADS)
Tang, B.; Corbosiero, K. L.; Rios-Berrios, R.; Alland, J.; Berman, J.
2014-12-01
The development timescale of a tropical cyclone from genesis to the start of rapid intensification in an axisymmetric model is hypothesized to be a function of the initial entropy deficit. We run a set of idealized simulations in which the initial entropy deficit between the boundary layer and free troposphere varies from 0 to 100 J kg-1 K-1. The development timescale is measured by changes in the integrated kinetic energy of the low-level vortex. This timescale is inversely related to the mean mass flux during the tropical cyclone gestation period. The mean mass flux, in turn, is a function of the statistics of convective updrafts and downdrafts. Contour frequency by altitude diagrams show that entrainment of dry air into updrafts is predominantly responsible for differences in the mass flux between the experiments, while downdrafts play a secondary role. Analyses of the potential and kinetic energy budgets indicate less efficient conversion of available potential energy to kinetic energy in the experiments with higher entropy deficits. Entrainment leads to the loss of buoyancy and the destruction of available potential energy. In the presence of strong downdrafts, there can even be a reversal of the conversion term. Weaker and more radially confined inflow results in less convergence of angular momentum in the experiments with higher entropy deficits. The result is a slower vortex spinup and a reduction in steady-state vortex size, despite similar steady-state maximum intensities among the experiments.
Fast and Efficient Stochastic Optimization for Analytic Continuation
Bao, Feng; Zhang, Guannan; Webster, Clayton G; ...
2016-09-28
The analytic continuation of imaginary-time quantum Monte Carlo data to extract real-frequency spectra remains a key problem in connecting theory with experiment. Here we present a fast and efficient stochastic optimization method (FESOM) as a more accessible variant of the stochastic optimization method introduced by Mishchenko et al. [Phys. Rev. B 62, 6317 (2000)], and we benchmark the resulting spectra against those obtained by the standard maximum entropy method for three representative test cases, including data taken from studies of the two-dimensional Hubbard model. Generally, we find that our FESOM approach yields spectra similar to the maximum entropy results. In particular, while the maximum entropy method yields superior results when the quality of the data is high, we find that FESOM is able to resolve fine structure in more detail when the quality of the data is poor. In addition, because of its stochastic nature, the method provides detailed information on the frequency-dependent uncertainty of the resulting spectra, while the maximum entropy method does so only for the spectral weight integrated over a finite frequency region. Therefore, we believe that this variant of the stochastic optimization approach provides a viable alternative to the routinely used maximum entropy method, especially for data of poor quality.
Mobli, Mehdi; Stern, Alan S.; Bermel, Wolfgang; King, Glenn F.; Hoch, Jeffrey C.
2010-01-01
One of the stiffest challenges in structural studies of proteins using NMR is the assignment of sidechain resonances. Typically, a panel of lengthy 3D experiments is acquired in order to establish connectivities and resolve ambiguities due to overlap. We demonstrate that these experiments can be replaced by a single 4D experiment that is time-efficient, yields excellent resolution, and captures unique carbon-proton connectivity information. The approach is made practical by the use of non-uniform sampling in the three indirect time dimensions and maximum entropy reconstruction of the corresponding 3D frequency spectrum. This 4D method will facilitate automated resonance assignment procedures and should be particularly beneficial for increasing throughput in NMR-based structural genomics initiatives. PMID:20299257
Is Water at the Graphite Interface Vapor-like or Ice-like?
Qiu, Yuqing; Lupi, Laura; Molinero, Valeria
2018-04-05
Graphitic surfaces are the main component of soot, a major constituent of atmospheric aerosols. Experiments indicate that soots of different origins display a wide range of abilities to heterogeneously nucleate ice. The ability of pure graphite to nucleate ice in experiments, however, seems to be almost negligible. Nevertheless, molecular simulations with the monatomic water model mW with water-carbon interactions parameterized to reproduce the experimental contact angle of water on graphite predict that pure graphite nucleates ice. According to classical nucleation theory, the ability of a surface to nucleate ice is controlled by the binding free energy between ice immersed in liquid water and the surface. To establish whether the discrepancy in freezing efficiencies of graphite in mW simulations and experiments arises from the coarse resolution of the model or can be fixed by reparameterization, it is important to elucidate the contributions of the water-graphite, water-ice, and ice-water interfaces to the free energy, enthalpy, and entropy of binding for both water and the model. Here we use thermodynamic analysis and free energy calculations to determine these interfacial properties. We demonstrate that liquid water at the graphite interface is not ice-like or vapor-like: it has similar free energy, entropy, and enthalpy as water in the bulk. The thermodynamics of the water-graphite interface is well reproduced by the mW model. We find that the entropy of binding between graphite and ice is positive and dominated, in both experiments and simulations, by the favorable entropy of reducing the ice-water interface. Our analysis indicates that the discrepancy in freezing efficiencies of graphite in experiments and the simulations with mW arises from the inability of the model to simultaneously reproduce the contact angle of liquid water on graphite and the free energy of the ice-graphite interface. 
This transferability issue is intrinsic to the resolution of the model, and arises from its lack of rotational degrees of freedom.
RNA Thermodynamic Structural Entropy
Garcia-Martin, Juan Antonio; Clote, Peter
2015-01-01
Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element, we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy, and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner’99 and Turner’04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. 
Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http://bioinformatics.bc.edu/clotelab/RNAentropy, including source code and ancillary programs. PMID:26555444
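The quantity computed by RNAentropy is the Shannon entropy of the Boltzmann ensemble of secondary structures. For a small, explicitly enumerated set of structure free energies the definition is direct; the paper's dynamic-programming algorithms compute the same quantity over the full (exponentially large) structure space without enumeration. The temperature and energies below are illustrative:

```python
import numpy as np

R = 1.98717e-3  # gas constant, kcal/(mol*K)

def boltzmann_entropy(energies_kcal, T=310.15):
    """Conformational entropy (kcal/(mol*K)) of a Boltzmann ensemble
    over an explicit list of structure free energies."""
    e = np.asarray(energies_kcal, dtype=float)
    w = np.exp(-(e - e.min()) / (R * T))  # shift for numerical stability
    p = w / w.sum()
    return -R * np.sum(p * np.log(p))
```

Two degenerate structures give the maximal entropy R·ln 2 for a two-state ensemble, while a single dominant low-energy structure drives the entropy toward zero; scanning the temperature argument reproduces the kind of thermoswitch profiles plotted in the paper.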
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael
2014-01-01
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
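For intuition, here is a plug-in transfer entropy estimator for discrete, stationary series with history length one; the estimator in the paper generalizes this to continuous, trial-based, non-stationary data using nearest-neighbor techniques on an ensemble of realizations. A sketch under those simplifying assumptions:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy X -> Y (nats) for discrete series, history
    length 1, estimated with plug-in probabilities:
    TE = sum p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    singles_y = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te
```

When Y simply copies X with a one-step lag, the estimate approaches ln 2 for a binary source (all uncertainty about the next Y is resolved by the past of X), while the reverse direction stays near zero, which is the asymmetry that makes the measure directional.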
2013-01-01
Here we present a novel, end-point method using the dead-end-elimination and A* algorithms to efficiently and accurately calculate the change in free energy, enthalpy, and configurational entropy of binding for ligand–receptor association reactions. We apply the new approach to the binding of a series of human immunodeficiency virus (HIV-1) protease inhibitors to examine the effect ensemble reranking has on relative accuracy as well as to evaluate the role of the absolute and relative ligand configurational entropy losses upon binding in affinity differences for structurally related inhibitors. Our results suggest that most thermodynamic parameters can be estimated using only a small fraction of the full configurational space, and we see significant improvement in relative accuracy when using an ensemble versus single-conformer approach to ligand ranking. We also find that using approximate metrics based on the single-conformation enthalpy differences between the global minimum energy configuration in the bound as well as unbound states also correlates well with experiment. Using a novel, additive entropy expansion based on conditional mutual information, we also analyze the source of ligand configurational entropy loss upon binding in terms of both uncoupled per degree of freedom losses as well as changes in coupling between inhibitor degrees of freedom. We estimate entropic free energy losses of approximately +24 kcal/mol, 12 kcal/mol of which stems from loss of translational and rotational entropy. Coupling effects contribute only a small fraction to the overall entropy change (1–2 kcal/mol) but suggest differences in how inhibitor dihedral angles couple to each other in the bound versus unbound states. The importance of accounting for flexibility in drug optimization and design is also discussed. PMID:24250277
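The additive entropy expansion can be illustrated at second order: approximate the joint configurational entropy of discretized degrees of freedom as the sum of marginal entropies minus pairwise mutual informations. This is only a sketch of the expansion idea with plug-in discrete estimators; the paper's expansion is built from conditional mutual information terms and is applied to ligand torsions:

```python
import numpy as np

def discrete_entropy(samples):
    """Plug-in entropy (nats) of a 1-D or row-wise 2-D sample array."""
    _, counts = np.unique(samples, return_counts=True, axis=0)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def entropy_expansion(dofs):
    """Second-order additive expansion: sum of marginal entropies minus
    the sum of pairwise mutual informations between degrees of freedom."""
    d = len(dofs)
    h1 = sum(discrete_entropy(x) for x in dofs)
    mi = 0.0
    for i in range(d):
        for j in range(i + 1, d):
            joint = np.column_stack([dofs[i], dofs[j]])
            mi += (discrete_entropy(dofs[i]) + discrete_entropy(dofs[j])
                   - discrete_entropy(joint))
    return h1 - mi
```

For independent degrees of freedom the correction vanishes and the expansion reduces to the uncoupled per-degree-of-freedom sum; for perfectly coupled pairs it removes the double-counted entropy, which is how coupling changes between bound and unbound states show up in the expansion.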
Quantifying and minimizing entropy generation in AMTEC cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, T.J.; Huang, C.
1997-12-31
Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.
Low-dimensional approximation searching strategy for transfer entropy from non-uniform embedding
2018-01-01
Transfer entropy from non-uniform embedding is a popular tool for the inference of causal relationships among dynamical subsystems. In this study we present an approach that makes use of low-dimensional conditional mutual information quantities to decompose the original high-dimensional conditional mutual information in the searching procedure of non-uniform embedding for significant variables at different lags. We perform a series of simulation experiments to assess the sensitivity and specificity of our proposed method to demonstrate its advantage compared to previous algorithms. The results provide concrete evidence that low-dimensional approximations can help to improve the statistical accuracy of transfer entropy in multivariate causality analysis and yield a better performance over other methods. The proposed method is especially efficient as the data length grows. PMID:29547669
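The building block of the searching procedure, a conditional mutual information evaluated on a low-dimensional set of conditioning variables, follows from the standard entropy identity I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(Z) − H(X,Y,Z). A plug-in sketch for discrete data (the discretization is an assumption; the paper works with embedded lag vectors):

```python
import numpy as np

def h(*cols):
    """Plug-in joint entropy (nats) of one or more discrete columns."""
    joint = np.column_stack(cols)
    _, counts = np.unique(joint, return_counts=True, axis=0)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def conditional_mutual_information(x, y, z):
    """I(X;Y|Z) via H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z); the
    low-dimensional quantity evaluated in place of one
    high-dimensional conditional mutual information."""
    return h(x, z) + h(y, z) - h(z) - h(x, y, z)
```

Keeping Z low-dimensional is the crux: each added conditioning lag multiplies the number of joint states, so decomposing one high-dimensional CMI into such low-dimensional terms keeps the plug-in estimates statistically reliable as candidate lags accumulate.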
Duan, Lili; Liu, Xiao; Zhang, John Z H
2016-05-04
Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
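The interaction entropy is computed directly from the fluctuations of the protein-ligand interaction energy along the MD trajectory, as −TΔS = kT·ln⟨exp(βΔE_int)⟩ with ΔE_int the deviation from the trajectory mean. A minimal numpy sketch (units and temperature are illustrative, and no claim is made about matching the paper's implementation details):

```python
import numpy as np

def interaction_entropy(e_int, T=300.0):
    """-T*dS (kcal/mol) from a trajectory of protein-ligand interaction
    energies e_int (kcal/mol):
        -T*dS = kT * ln< exp(beta * dE) >,  dE = E_int - <E_int>,
    averaged over MD snapshots; log-sum-exp is used for stability."""
    kT = 1.98717e-3 * T               # kcal/mol
    dE = np.asarray(e_int, dtype=float)
    dE = dE - dE.mean()
    m = (dE / kT).max()
    return kT * (m + np.log(np.mean(np.exp(dE / kT - m))))
```

By Jensen's inequality the result is non-negative and vanishes only for a fluctuation-free trajectory, matching the physical picture that larger interaction-energy fluctuations imply a larger entropic penalty; no extra simulation beyond the existing MD snapshots is needed, which is the method's cost advantage over normal-mode analysis.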
Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong
2017-09-03
Automated fiber placement (AFP) process includes a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aiming at optimizing processing parameters in an AFP process, where multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of micro-system could be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties of macro-meso-scale are obtained by Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which can communicate with different scales. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient entropy-enthalpy values, are calculated under different processing parameters based on molecular dynamics method. Low-entropy region is then obtained in terms of the interrelation among entropy-enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively, and further improve the mechanical properties of laminates.
Giant onsite electronic entropy enhances the performance of ceria for water splitting.
Naghavi, S Shahab; Emery, Antoine A; Hansen, Heine A; Zhou, Fei; Ozolins, Vidvuds; Wolverton, Chris
2017-08-18
Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions. Solid-state entropy of reduction increases the thermodynamic efficiency of ceria for two-step thermochemical water splitting. Here, the authors report a large and different source of entropy, the onsite electronic configurational entropy arising from coupling between orbital and spin angular momenta in f orbitals.
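The onsite electronic entropy described here is, in essence, the canonical entropy of the occupied f-electron multiplets. A sketch for Ce3+, assuming the textbook 2F5/2 / 2F7/2 multiplet structure of a single f electron and an approximate 0.28 eV spin-orbit splitting taken as an illustrative literature value:

```python
import numpy as np

KB_EV = 8.617333e-5  # Boltzmann constant, eV/K

def electronic_entropy(degeneracies, energies_ev, T):
    """Onsite electronic entropy, in units of kB, of one ion whose
    levels have the given degeneracies and energies:
        S / kB = ln Z + <E> / (kB * T)."""
    g = np.asarray(degeneracies, dtype=float)
    e = np.asarray(energies_ev, dtype=float)
    w = g * np.exp(-e / (KB_EV * T))
    Z = w.sum()
    e_mean = (w * e).sum() / Z
    return np.log(Z) + e_mean / (KB_EV * T)

# Ce3+ (f1): 2F5/2 ground multiplet (g = 6) and 2F7/2 excited
# multiplet (g = 8, roughly 0.28 eV higher, an assumed value);
# Ce4+ (f0) is a non-degenerate singlet with zero electronic entropy.
s_ce3 = electronic_entropy([6, 8], [0.0, 0.28], T=1500.0)
```

Each oxygen vacancy reduces two Ce ions, so the per-vacancy electronic entropy is roughly twice s_ce3, the same order as the ≈4.7 kB quoted in the abstract; the value interpolates between ln 6 (ground multiplet only) and ln 14 (all f1 states) as temperature rises.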
An efficient algorithm for automatic phase correction of NMR spectra based on entropy minimization
NASA Astrophysics Data System (ADS)
Chen, Li; Weng, Zhiqiang; Goh, LaiYoong; Garland, Marc
2002-09-01
A new algorithm for automatic phase correction of NMR spectra based on entropy minimization is proposed. The optimal zero-order and first-order phase corrections for an NMR spectrum are determined by minimizing entropy. The objective function is constructed using a Shannon-type information entropy measure, with the entropy computed from the normalized derivative of the NMR spectral data. The algorithm has been successfully applied to experimental 1H NMR spectra, and the results of automatic phase correction are found to be comparable to, or perhaps better than, manual phase correction. The advantages of this automatic phase correction algorithm include its simple mathematical basis and a straightforward, reproducible, and efficient optimization procedure. The algorithm is implemented in the Matlab program ACME (Automated phase Correction based on Minimization of Entropy).
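A minimal Python sketch of the objective described above: phase a synthetic complex spectrum, then search for the zero- and first-order corrections that minimize the Shannon entropy of the normalized derivative of the real part. The Lorentzian test lines, the brute-force grid (ACME itself uses a proper optimizer), and the omission of ACME's negative-intensity penalty are all simplifications:

```python
import numpy as np

def entropy_of_spectrum(real_part):
    """Shannon-type entropy of the normalized first derivative (the ACME objective)."""
    d = np.abs(np.diff(real_part))
    p = d / d.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def apply_phase(spec, phc0, phc1):
    """Zero-order phase phc0 plus a first-order ramp phc1 across the spectrum."""
    n = len(spec)
    return spec * np.exp(1j * (phc0 + phc1 * np.arange(n) / n))

# Toy spectrum: two Lorentzian absorption lines, dephased by a known error.
x = np.linspace(-1.0, 1.0, 1024)
ideal = 0.01 / (0.01 + (x - 0.3) ** 2) + 0.02 / (0.02 + (x + 0.4) ** 2)
spec = apply_phase(ideal.astype(complex), 0.7, -0.9)

# Brute-force grid search for the corrections (illustration only).
grid = np.linspace(-np.pi, np.pi, 49)
best = min(((p0, p1) for p0 in grid for p1 in grid),
           key=lambda c: entropy_of_spectrum(apply_phase(spec, *c).real))
corrected = apply_phase(spec, *best).real
```

Note that the derivative entropy alone is invariant under a sign flip of the spectrum, so the minimizer is only defined up to a π shift; the penalty term in the published algorithm resolves that ambiguity.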
Informational basis of sensory adaptation: entropy and single-spike efficiency in rat barrel cortex.
Adibi, Mehdi; Clifford, Colin W G; Arabzadeh, Ehsan
2013-09-11
We showed recently that exposure to whisker vibrations enhances coding efficiency in rat barrel cortex despite increasing correlations in variability (Adibi et al., 2013). Here, to understand how adaptation achieves this improvement in sensory representation, we decomposed the stimulus information carried in neuronal population activity into its fundamental components in the framework of information theory. In the context of sensory coding, these components are the entropy of the responses across the entire stimulus set (response entropy) and the entropy of the responses conditional on the stimulus (conditional response entropy). We found that adaptation decreased response entropy and conditional response entropy at both the level of single neurons and the pooled activity of neuronal populations. However, the net effect of adaptation was to increase the mutual information because the drop in the conditional entropy outweighed the drop in the response entropy. The information transmitted by a single spike also increased under adaptation. As population size increased, the information content of individual spikes declined but the relative improvement attributable to adaptation was maintained.
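The decomposition used above is the standard identity I(S;R) = H(R) − H(R|S). A small self-contained illustration with invented stimulus-response count data (the numbers are made up for the example, not taken from the study):

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a Counter of outcomes."""
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def decompose_information(pairs):
    """Return (H(R), H(R|S), I(S;R)) for a list of (stimulus, response) trials."""
    H_R = entropy(Counter(r for _, r in pairs))
    n = len(pairs)
    H_R_given_S = sum(
        n_s / n * entropy(Counter(r for s2, r in pairs if s2 == s))
        for s, n_s in Counter(s for s, _ in pairs).items()
    )
    return H_R, H_R_given_S, H_R - H_R_given_S

# Hypothetical spike-count responses observed for two stimuli.
trials = [(0, 0)] * 40 + [(0, 1)] * 10 + [(1, 1)] * 10 + [(1, 2)] * 40
H_R, H_RS, I = decompose_information(trials)
# In the paper, adaptation lowers both entropies, but H(R|S) drops more, so I rises.
```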
Chaotic characteristics enhanced by impeller of perturbed six-bent-bladed turbine in stirred tank
NASA Astrophysics Data System (ADS)
Luan, Deyu; Zhang, Shengfeng; Lu, Jianping; Zhang, Xiaoguang
Inducing chaotic flow in a stirred vessel is a fundamental way of improving mixing efficiency, and the impeller form plays an important role in changing the structure of the flow field and realizing chaotic mixing. Based on velocity time series acquired by particle image velocimetry (PIV) experiments and processed in Matlab, the macro-instability (MI), largest Lyapunov exponent (LLE), and Kolmogorov entropy in a stirred water tank are investigated for an impeller with a perturbed six-bent-bladed turbine (6PBT). The results show that the MI characteristics are obvious and two peak values of MI frequency are observed at the speed N = 60 rpm. With increasing speed (more than 100 rpm), the peak characteristics of MI frequency disappear and a multi-scale wavelet structure characterizing the chaotic flow field appears. Moreover, at N = 60 rpm the LLE is less than 0 and the Kolmogorov entropy is 0, which means that the flow field is in a periodic state of motion. As the speed increases beyond 100 rpm, the LLE and the Kolmogorov entropy both become greater than 0, indicating that the flow field enters chaotic mixing. When the speed reaches about 210 rpm, both the LLE and the Kolmogorov entropy achieve their optimum values, yielding strong chaos with the highest mixing efficiency. The MI frequency, the LLE and the Kolmogorov entropy are therefore feasible measures for analyzing the flow field characteristics in a stirred tank. These results promote the understanding of the chaotic mixing mechanism and provide a theoretical reference for the development of new types of impeller.
NASA Astrophysics Data System (ADS)
Siokis, Fotios M.
2018-06-01
We explore the evolution of informational efficiency for specific instruments of the U.S. money, bond and stock exchange markets before and after the outbreak of the Great Recession. We utilize the permutation entropy and the complexity-entropy causality plane to rank the time series and measure the degree of informational efficiency. We find that after the credit crunch and the collapse of Lehman Brothers, the efficiency level of specific money market instruments' yields falls considerably. This is evidence of less uncertainty in predicting the related yields throughout the financial disarray. A similar trend appears in the stock exchange indices, but their efficiency remains at much higher levels. On the other hand, bond market instruments maintained their efficiency levels even after the outbreak of the crisis, which can be interpreted as greater randomness and less predictability of their yields.
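Permutation entropy, the ranking statistic used above, is easy to sketch: count the ordinal patterns of consecutive samples (the Bandt-Pompe symbolization) and take the normalized Shannon entropy, so a value near 1 marks an informationally efficient (random-looking) series. The toy series below are illustrative, not market data:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized Shannon entropy of ordinal patterns (Bandt-Pompe); 1 = random."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda i: w[i]))   # argsort of each window
        for w in zip(*(series[i:] for i in range(order)))
    )
    n = sum(patterns.values())
    H = -sum(c / n * math.log(c / n) for c in patterns.values())
    return H / math.log(math.factorial(order))

random.seed(0)
noise = [random.random() for _ in range(5000)]      # unpredictable: efficient
smooth = [math.sin(0.05 * t) for t in range(5000)]  # predictable: inefficient
print(permutation_entropy(noise))   # close to 1
print(permutation_entropy(smooth))  # far below 1
```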
NASA Astrophysics Data System (ADS)
Traversaro, Francisco; O. Redelico, Francisco
2018-04-01
In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the permutation entropy. But there is still no known method to determine the accuracy of this measure, and there has been little research on the statistical properties of this quantity as a characterization of time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series to obtain the distribution of the permutation entropy estimator. We perform several time series simulations given by well-known stochastic processes, the 1/fα noise family, and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the permutation entropy, has been extensively used in epilepsy research to detect dynamical changes in the electroencephalogram (EEG) signal, with no consideration of the variability of this complexity measure. As an application, the parametric bootstrap methodology is used to compare normal and pre-ictal EEG signals.
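A hedged sketch of the idea (not the paper's exact procedure): fit a multinomial distribution to the observed ordinal-pattern frequencies, resample symbol sequences from it, and recompute the entropy to obtain a bootstrap distribution for the estimator. Treating overlapping patterns as i.i.d. draws is a simplification of the symbolic model used in the paper:

```python
import math
import random
from collections import Counter

def ordinal_patterns(series, order=3):
    """Bandt-Pompe symbolization: argsort pattern of each length-`order` window."""
    return [tuple(sorted(range(order), key=lambda i: w[i]))
            for w in zip(*(series[i:] for i in range(order)))]

def shannon(counts):
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

def bootstrap_permutation_entropy(series, order=3, reps=500, seed=1):
    """Parametric bootstrap: fit a multinomial to the observed pattern frequencies,
    resample, and re-estimate the entropy (patterns treated as i.i.d. draws)."""
    rng = random.Random(seed)
    observed = Counter(ordinal_patterns(series, order))
    symbols, n = list(observed), sum(observed.values())
    weights = [observed[s] for s in symbols]
    boot = [shannon(Counter(rng.choices(symbols, weights=weights, k=n)))
            for _ in range(reps)]
    return shannon(observed), boot

rng = random.Random(0)
point, boot = bootstrap_permutation_entropy([rng.random() for _ in range(2000)])
lo, hi = sorted(boot)[12], sorted(boot)[487]  # ~95% percentile interval
```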
Entropy reduction via simplified image contourization
NASA Technical Reports Server (NTRS)
Turner, Martin J.
1993-01-01
The process of contourization is presented which converts a raster image into a set of plateaux or contours. These contours can be grouped into a hierarchical structure, defining total spatial inclusion, called a contour tree. A contour coder has been developed which fully describes these contours in a compact and efficient manner and is the basis for an image compression method. Simplification of the contour tree has been undertaken by merging contour tree nodes thus lowering the contour tree's entropy. This can be exploited by the contour coder to increase the image compression ratio. By applying general and simple rules derived from physiological experiments on the human vision system, lossy image compression can be achieved which minimizes noticeable artifacts in the simplified image.
DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K
2012-04-05
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
Local curvature entropy-based 3D terrain representation using a comprehensive Quadtree
NASA Astrophysics Data System (ADS)
Chen, Qiyu; Liu, Gang; Ma, Xiaogang; Mariethoz, Gregoire; He, Zhenwen; Tian, Yiping; Weng, Zhengping
2018-05-01
Large-scale 3D digital terrain modeling is a crucial part of many real-time applications in geoinformatics. In recent years, improved speed and precision in spatial data collection have made raw terrain data larger and more complex, which poses challenges for data management, visualization and analysis. In this work, we present an effective and comprehensive 3D terrain representation based on local curvature entropy and a dynamic Quadtree. Level-of-detail (LOD) models of significant terrain features are employed to generate hierarchical terrain surfaces. In order to reduce radical changes of grid density between adjacent LODs, the local entropy of terrain curvature is used as the measure for subdividing terrain grid cells. An efficient approach is then presented to eliminate the cracks among different LODs by directly updating the Quadtree, enabled by an edge-based structure proposed in this work. Furthermore, we utilize a threshold on the local entropy stored in each parent node of the Quadtree to flexibly control its depth and to dynamically schedule large-scale LOD terrain. Several experiments were implemented to test the performance of the proposed method. The results demonstrate that our method can construct LOD 3D terrain models with good performance in terms of computational cost and the maintenance of terrain features. Our method has already been deployed in a geographic information system (GIS) for practical use, where it supports the real-time dynamic scheduling of large-scale terrain models easily and efficiently.
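The core subdivision rule can be sketched as follows: a quadtree cell is split while the Shannon entropy of its curvature histogram exceeds a threshold, so flat regions stay coarse and feature-rich regions refine. The grid, bin count and threshold below are invented for illustration and are not the paper's parameters:

```python
import math
import random
from collections import Counter

def local_entropy(values, bins=8):
    """Shannon entropy (bits) of a histogram of cell values (here: curvature)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return 0.0
    hist = Counter(min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in hist.values())

def build_quadtree(grid, x, y, size, threshold, min_size=2):
    """Split a cell while its curvature entropy exceeds the threshold."""
    cell = [grid[j][i] for j in range(y, y + size) for i in range(x, x + size)]
    if size <= min_size or local_entropy(cell) <= threshold:
        return (x, y, size)  # leaf cell
    h = size // 2
    return [build_quadtree(grid, x + dx, y + dy, h, threshold, min_size)
            for dy in (0, h) for dx in (0, h)]

# Toy 8x8 "curvature" grid: flat everywhere except a rough 4x4 corner patch.
random.seed(3)
grid = [[0.0] * 8 for _ in range(8)]
for j in range(4):
    for i in range(4):
        grid[j][i] = random.random()
tree = build_quadtree(grid, 0, 0, 8, threshold=0.6)
# Only the rough corner refines further; the three flat quadrants stay coarse leaves.
```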
Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study
NASA Astrophysics Data System (ADS)
Kingston, Diego; Razzitte, Adrián César
2018-04-01
Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70 % and that, by changing only the inlet composition, it is possible to cut it by nearly 40 %, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54 %, when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.
Crowd macro state detection using entropy model
NASA Astrophysics Data System (ADS)
Zhao, Ying; Yuan, Mengqi; Su, Guofeng; Chen, Tao
2015-08-01
In crowd security research, a primary concern is to identify the macro state of crowd behavior in order to prevent disasters and supervise the crowd. In physics, entropy is used to describe the macro state of a self-organizing system, and a change in entropy indicates a change in the system's macro state. This paper provides a method to construct crowd behavior microstates, and the corresponding probability distribution, from individuals' velocity information (magnitude and direction). An entropy model is then built to describe the crowd behavior macro state. Simulation experiments and video detection experiments were conducted. They verify that in the disordered state the crowd behavior entropy is close to the theoretical maximum entropy, while in the ordered state the entropy is much lower than half of the theoretical maximum. A sudden change of the crowd's macro state leads to a change in entropy. The proposed entropy model is more applicable than the order parameter model in crowd behavior detection, and by recognizing entropy mutations it is possible to detect the crowd behavior macro state automatically using cameras. The results provide data support for crowd emergency prevention and manual emergency intervention.
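The microstate construction described above can be sketched directly: bin each individual's velocity by speed and direction, and take the Shannon entropy of the joint histogram. The bin counts, speed cap and synthetic crowds below are assumptions chosen for illustration:

```python
import math
import random
from collections import Counter

def crowd_entropy(velocities, speed_bins=3, dir_bins=8, max_speed=2.0):
    """Shannon entropy (bits) of the joint (speed, direction) microstate histogram."""
    states = Counter()
    for vx, vy in velocities:
        s = min(int(math.hypot(vx, vy) / max_speed * speed_bins), speed_bins - 1)
        d = int((math.atan2(vy, vx) + math.pi) / (2 * math.pi) * dir_bins) % dir_bins
        states[(s, d)] += 1
    n = len(velocities)
    return -sum(c / n * math.log2(c / n) for c in states.values())

max_H = math.log2(3 * 8)  # theoretical maximum over the 24 microstates

random.seed(1)
disordered = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(1000)]
ordered = [(1.0 + random.gauss(0, 0.05), random.gauss(0, 0.05)) for _ in range(1000)]
# The disordered crowd approaches the maximum; the ordered crowd stays far below half.
```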
Magnetic stirling cycles: A new application for magnetic materials
NASA Technical Reports Server (NTRS)
Brown, G. V.
1977-01-01
The elements of the cycle are summarized. The basic advantages include high entropy density in the magnetic material, completely reversible processes, convenient control of the entropy by the applied field, the feature that heat transfer is possible during all processes, and the ability of the ideal cycle to attain Carnot efficiency. The mean field theory is used to predict the entropy of a ferromagnet in an applied field and also the isothermal entropy change and isentropic temperature change caused by applying a field. The results for isentropic temperature change are compared with experimental data on Gd. Coarse mixtures of ferromagnetic materials with different Curie points are proposed to modify the path of the cycle in the T-S diagram in order to improve the efficiency or to increase the specific power.
Using entropy measures to characterize human locomotion.
Leverick, Graham; Szturm, Tony; Wu, Christine Q
2014-12-01
Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.
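Of the measures compared above, sample entropy is the classic baseline; a compact, didactic O(n²) version is sketched below. A production implementation excludes self-matches (as here) but also uses N−m templates for both counts; this sketch keeps the code short:

```python
import math
import random

def sample_entropy(series, m=2, r=None):
    """SampEn: -log of the ratio of (m+1)-point to m-point template matches."""
    if r is None:  # conventional tolerance: 20% of the standard deviation
        mean = sum(series) / len(series)
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / len(series))
    def matches(k):
        t = [series[i:i + k] for i in range(len(series) - k + 1)]
        return sum(max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
                   for i in range(len(t)) for j in range(i + 1, len(t)))
    return -math.log(matches(m + 1) / matches(m))

random.seed(2)
regular = [math.sin(0.3 * t) for t in range(300)]
irregular = [random.gauss(0, 1) for _ in range(300)]
# Regular motion yields low SampEn; noisy, unpredictable motion yields high SampEn.
```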
Propulsion Options for the HI SPOT Long Endurance Drone Airship
1979-09-15
[Figure residue: temperature-entropy (T-S) diagram sketches comparing the Carnot, Stirling, Rankine, Diesel, Otto, and Brayton cycles.] ... (spark ignition) and Brayton (gas turbine) systems. All of these are within a few percentage points of efficiency, though the Brayton engine is generally less ... fuel consumption. The ultimate lightweight engine is the gas turbine, or Brayton cycle engine. However, while good specific fuel consumption can be
Concepts in receptor optimization: targeting the RGD peptide.
Chen, Wei; Chang, Chia-en; Gilson, Michael K
2006-04-12
Synthetic receptors have a wide range of potential applications, but it has been difficult to design low molecular weight receptors that bind ligands with high, "proteinlike" affinities. This study uses novel computational methods to understand why it is hard to design a high-affinity receptor and to explore the limits of affinity, with the bioactive peptide RGD as a model ligand. The M2 modeling method is found to yield excellent agreement with experiment for a known RGD receptor and then is used to analyze a series of receptors generated in silico with a de novo design algorithm. Forces driving binding are found to be systematically opposed by proportionate repulsions due to desolvation and entropy. In particular, strong correlations are found between Coulombic attractions and the electrostatic desolvation penalty and between the mean energy change on binding and the cost in configurational entropy. These correlations help explain why it is hard to achieve high affinity. The change in surface area upon binding is found to correlate poorly with affinity within this series. Measures of receptor efficiency are formulated that summarize how effectively a receptor uses surface area, total energy, and Coulombic energy to achieve affinity. Analysis of the computed efficiencies suggests that a low molecular weight receptor can achieve proteinlike affinity. It is also found that macrocyclization of a receptor can, unexpectedly, increase the entropy cost of binding because the macrocyclic structure further restricts ligand motion.
Fuzzy entropy thresholding and multi-scale morphological approach for microscopic image enhancement
NASA Astrophysics Data System (ADS)
Zhou, Jiancan; Li, Yuexiang; Shen, Linlin
2017-07-01
Microscopic images provide a wealth of useful information for modern diagnosis and biological research. However, due to unstable lighting conditions during image capture, two main problems arise in the resulting cell images: high levels of noise and low image contrast. In this paper, a simple but efficient enhancement framework is proposed to address these problems. The framework removes image noise using a hybrid method based on the wavelet transform and fuzzy entropy, and enhances image contrast with an adaptive morphological approach. Experiments on a real cell dataset were conducted to assess the performance of the proposed framework. The experimental results demonstrate that the proposed enhancement framework increases cell tracking accuracy to an average of 74.49%, outperforming the benchmark algorithm's 46.18%.
A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization
NASA Astrophysics Data System (ADS)
Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano
In self-organization, energy gradients across complex systems lead to changes in the structure of those systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action, and is coupled to the total energy flowing through a system, which leads to increased action efficiency. We compare energy transport through a fluid cell whose molecules move randomly with one that can form convection cells. We examine the signs of the entropy changes and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission compared to random motion. In more complex systems, such convection cells form a network of transport channels that obeys the equations of motion in this geometry. Such transport networks are an essential feature of complex systems in biology, ecology, economics and society.
A modified belief entropy in Dempster-Shafer framework.
Zhou, Deyun; Tang, Yongchuan; Jiang, Wen
2017-01-01
How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass function itself, while the information carried by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are not fully efficient. In this paper, a modified belief entropy is proposed that considers the scale of the FOD and the relative scale of a focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.
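For context, Deng entropy (the starting point named above) and a frame-size-corrected variant can be sketched as below; the exact correction factor exp((|A|−1)/|X|) is this sketch's reading of the paper and should be checked against the original:

```python
import math

def deng_entropy(masses):
    """Deng entropy; `masses` maps frozenset focal elements A to masses m(A)."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in masses.items() if m > 0)

def modified_belief_entropy(masses, frame_size):
    """Variant with a frame-of-discernment correction exp((|A|-1)/|X|);
    the correction term is an assumption of this sketch."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1) * math.exp((len(A) - 1) / frame_size))
                for A, m in masses.items() if m > 0)

# Bayesian mass function (singletons only): both measures reduce to Shannon entropy.
bayesian = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
# A non-singleton focal element adds ambiguity, which Deng entropy credits.
vague = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.5}
```

On the Bayesian example both functions return exactly 1 bit, matching Shannon entropy; on the vague example the corrected variant reports less uncertainty than plain Deng entropy, reflecting the extra information in the frame size.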
NASA Astrophysics Data System (ADS)
DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.
2013-08-01
We analyzed single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.
The gravity dual of Rényi entropy.
Dong, Xi
2016-08-12
A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying it, we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.
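As a reminder of the quantity itself (the classical definition, not the holographic calculation): the Rényi family H_α = log(Σ p_i^α)/(1−α) recovers Shannon entropy as α→1, min-entropy as α→∞, and is non-increasing in α:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_a = log(sum_i p_i^a) / (1 - a), in nats."""
    probs = [x for x in p if x > 0]
    if math.isinf(alpha):
        return -math.log(max(probs))                 # min-entropy limit
    if abs(alpha - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in probs)  # Shannon limit
    return math.log(sum(x ** alpha for x in probs)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
values = [renyi_entropy(p, a) for a in (0.5, 1.0, 2.0, float('inf'))]
# Non-increasing in alpha; the values coincide only for a uniform distribution.
```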
Entropy-based financial asset pricing.
Ormos, Mihály; Zibriczky, Dávid
2014-01-01
We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios more simply and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of beta along with entropy.
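The claim that entropy falls with diversification, much like the standard deviation, can be sketched with simulated Gaussian returns and a simple histogram estimate of differential entropy (the return parameters below are invented, not the paper's data):

```python
import math
import random
from collections import Counter

def histogram_entropy(samples, bins=30):
    """Histogram estimate of differential entropy: -sum p*ln(p) + ln(bin width)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in samples)
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in counts.values()) + math.log(width)

random.seed(7)
def security():
    return [random.gauss(0.0003, 0.02) for _ in range(5000)]  # i.i.d. daily returns

single = security()
# Equal-weight portfolio of 10 independent securities: dispersion diversifies away,
# and the entropy of the return distribution falls along with it.
portfolio = [sum(day) / 10 for day in zip(*(security() for _ in range(10)))]
```

For Gaussian returns the drop is ln(√10) ≈ 1.15 nats, mirroring the √N shrinkage of the standard deviation.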
Configurational entropy measurements in extremely supercooled liquids that break the glass ceiling.
Berthier, Ludovic; Charbonneau, Patrick; Coslovich, Daniele; Ninarello, Andrea; Ozawa, Misaki; Yaida, Sho
2017-10-24
Liquids relax extremely slowly on approaching the glass state. One explanation is that an entropy crisis, because of the rarefaction of available states, makes it increasingly arduous to reach equilibrium in that regime. Validating this scenario is challenging, because experiments offer limited resolution, while numerical studies lag more than eight orders of magnitude behind experimentally relevant timescales. In this work, we not only close the colossal gap between experiments and simulations but manage to create in silico configurations that have no experimental analog yet. Deploying a range of computational tools, we obtain four estimates of their configurational entropy. These measurements consistently confirm that the steep entropy decrease observed in experiments is also found in simulations, even beyond the experimental glass transition. Our numerical results thus extend the observational window into the physics of glasses and reinforce the relevance of an entropy crisis for understanding their formation.
Entropic bounds on currents in Langevin systems
NASA Astrophysics Data System (ADS)
Dechant, Andreas; Sasa, Shin-ichi
2018-06-01
We derive a bound on generalized currents for Langevin systems in terms of the total entropy production in the system and its environment. For overdamped dynamics, any generalized current is bounded by the total rate of entropy production. We show that this entropic bound on the magnitude of generalized currents imposes power-efficiency tradeoff relations for ratchets in contact with a heat bath: Maximum efficiency—Carnot efficiency for a Smoluchowski-Feynman ratchet and unity for a flashing or rocking ratchet—can only be reached at vanishing power output. For underdamped dynamics, while there may be reversible currents that are not bounded by the entropy production rate, we show that the output power and heat absorption rate are irreversible currents and thus obey the same bound. As a consequence, a power-efficiency tradeoff relation holds not only for underdamped ratchets but also for periodically driven heat engines. For weak driving, the bound results in additional constraints on the Onsager matrix beyond those imposed by the second law. Finally, we discuss the connection between heat and entropy in a nonthermal situation where the friction and noise intensity are state dependent.
Entropy, recycling and macroeconomics of water resources
NASA Astrophysics Data System (ADS)
Karakatsanis, Georgios; Mamassis, Nikos; Koutsoyiannis, Demetris
2014-05-01
We propose a macroeconomic model for water quantity and quality supply multipliers derived from water recycling (Karakatsanis et al. 2013). Macroeconomic models that incorporate natural resource conservation have become increasingly important (European Commission et al. 2012). In addition, as an estimated 80% of globally used freshwater is not reused (United Nations 2012), under increasing population trends water recycling becomes a solution of high priority. Recycling of water resources creates two major conservation effects: (1) conservation of water in reservoirs and aquifers and (2) conservation of ecosystem carrying capacity due to wastewater flux reduction. The statistical distribution properties of the recycling efficiencies (for both water quantity and quality) in each sector are of vital economic importance. The uncertainty and complexity of water reuse across sectors are statistically quantified by entropy. High entropy of recycling efficiency values signifies greater efficiency dispersion, which, in turn, may indicate the need for additional infrastructure to both shift and concentrate the statistical distribution towards higher efficiencies that lead to higher supply multipliers. Keywords: entropy, water recycling, water supply multipliers, conservation, recycling efficiencies, macroeconomics. References: 1. European Commission (EC), Food and Agriculture Organization (FAO), International Monetary Fund (IMF), Organization of Economic Cooperation and Development (OECD), United Nations (UN) and World Bank (2012), System of Environmental and Economic Accounting (SEEA) Central Framework (White cover publication), United Nations Statistics Division 2. Karakatsanis, G., N. Mamassis, D. Koutsoyiannis and A.
Efstratiades (2013), Entropy and reliability of water use via a statistical approach of scarcity, 5th EGU Leonardo Conference - Hydrofractals 2013 - STAHY '13, Kos Island, Greece, European Geosciences Union, International Association of Hydrological Sciences, International Union of Geodesy and Geophysics 3. United Nations (UN) (2012), World Water Development Report 4, UNESCO Publishing
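The entropy quantification described above can be sketched as a discrete Shannon entropy over binned sector recycling efficiencies; the bin count and the sample distributions are illustrative assumptions.

```python
import numpy as np

def efficiency_entropy(samples, bins=10):
    """Discrete Shannon entropy (nats) of binned recycling efficiencies in [0, 1]."""
    counts, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(1)
dispersed = rng.uniform(0.0, 1.0, 500)     # efficiencies scattered across sectors
concentrated = rng.uniform(0.8, 0.9, 500)  # efficiencies clustered near 0.85
```

High entropy (the dispersed case) flags a wide spread of efficiencies, which the abstract links to a possible need for additional infrastructure.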
Converting Sunlight to Mechanical Energy: A Polymer Example of Entropy.
ERIC Educational Resources Information Center
Mathias, Lon J.
1987-01-01
This experiment/demonstration provides elementary through high school science students with hands-on experience with polymer entropy. Construction of a simple machine for converting light into mechanical energy is described. (RH)
An Information Theoretic Characterisation of Auditory Encoding
Overath, Tobias; Cusack, Rhodri; Kumar, Sukhbinder; von Kriegstein, Katharina; Warren, Jason D; Grube, Manon; Carlyon, Robert P; Griffiths, Timothy D
2007-01-01
The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. By systematically varying the entropy of pitch sequences, we sought brain areas where neural activity and energetic demands increase as a function of entropy. Such a relationship is predicted to occur in an efficient encoding mechanism that uses less computational resource when less information is present in the signal: we specifically tested the hypothesis that such a relationship is present in the planum temporale (PT). In two convergent functional MRI studies, we demonstrated this relationship in PT for encoding, while furthermore showing that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands less computational resource to encode redundant signals than those with high information content. PMID:17958472
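The entropy manipulation of pitch sequences reduces to first-order Shannon entropy over pitch symbols; a minimal sketch with made-up sequences:

```python
from collections import Counter
import math

def sequence_entropy(pitches):
    """First-order Shannon entropy (bits) of a symbolic pitch sequence."""
    counts = Counter(pitches)
    n = len(pitches)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

low = ["A4"] * 16                      # fully redundant sequence: 0 bits/symbol
high = ["A4", "B4", "C5", "D5"] * 4    # four equiprobable pitches: 2 bits/symbol
```

Varying this quantity across stimuli is what lets the study look for brain areas whose activity scales with the information content of the signal.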
Enhanced magnetocaloric effect tuning efficiency in Ni-Mn-Sn alloy ribbons
NASA Astrophysics Data System (ADS)
Quintana-Nedelcos, A.; Sánchez Llamazares, J. L.; Daniel-Perez, G.
2017-11-01
The present work investigates the effect of microstructure on the magnetic entropy change of Ni50Mn37Sn13 ribbon alloys. The unchanged sample composition and austenite cell parameter allowed us to study strictly the correlation between the average grain size and the total magnetic-field-induced entropy change (ΔST). We found that size-dependent tuning of the martensitic transformation allows the magnetic entropy change to be tailored over a wide temperature range (>40 K) with a reasonably small variation in the peak value of the total field-induced entropy change. The peak values varied from 6.0 J kg-1 K-1 to 7.7 J kg-1 K-1 for applied fields up to 2 T. Different tuning efficiencies obtained by diverse MCE tailoring approaches are compared to highlight the advantages of the mechanism proposed here.
Evaluation of Fuel Cell Operation and Degradation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Mark; Gemmen, Randall; Richards, George
The concepts of area specific resistance (ASR) and degradation are developed for different fuel cell operating modes. The concepts of exergetic efficiency and entropy production were applied to ASR and degradation. It is shown that exergetic efficiency is a time-dependent function useful for describing the thermal efficiency of a fuel cell and the change in thermal efficiency of a degrading fuel cell. Entropy production was evaluated for constant-voltage and constant-current operation of a fuel cell undergoing ohmic degradation. It was discovered that the Gaussian hypergeometric function describes the cumulative entropy and electrical work produced by fuel cells operating at constant voltage. The Gaussian hypergeometric function is found in many applications in modern physics. This paper builds on and extends several papers recently published by the authors in the Journal of The Electrochemical Society (ECS), ECS Transactions, Journal of Power Sources, and the Journal of Fuel Cell Science and Technology.
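The idea of accumulating entropy production under ohmic degradation can be sketched in a deliberately simplified form. This toy model (an assumption, not the authors' full treatment, which yields the Gaussian hypergeometric function) holds the cell at constant voltage V while its resistance degrades linearly, R(t) = R0 + k t; ohmic dissipation then gives an entropy production rate V^2 / (R(t) T) with the closed form (V^2 / (k T)) ln(1 + k t / R0) for the cumulative entropy.

```python
import numpy as np

# Illustrative parameters: volts, ohms, ohms/s, kelvin.
V, R0, k, T = 0.8, 0.05, 1e-4, 1073.0

t = np.linspace(0.0, 3600.0, 100_000)
rate = V**2 / ((R0 + k * t) * T)                            # entropy production rate, W/K
numeric = np.sum((rate[:-1] + rate[1:]) / 2 * np.diff(t))   # trapezoid rule, J/K
closed_form = V**2 / (k * T) * np.log(1.0 + k * t[-1] / R0)
```

The numerical integral and the closed form agree, which is the kind of consistency check the analytical results in the paper make possible.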
Efficient optimization of the quantum relative entropy
NASA Astrophysics Data System (ADS)
Fawzi, Hamza; Fawzi, Omar
2018-04-01
Many quantum information measures can be written as an optimization of the quantum relative entropy between sets of states. For example, the relative entropy of entanglement of a state is the minimum relative entropy to the set of separable states. The various capacities of quantum channels can also be written in this way. We propose a unified framework to numerically compute these quantities using off-the-shelf semidefinite programming solvers, exploiting the approximation method proposed in Fawzi, Saunderson and Parrilo (2017 arXiv: 1705.00812). As a notable application, this method allows us to provide numerical counterexamples for a proposed lower bound on the quantum conditional mutual information in terms of the relative entropy of recovery.
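For fixed density matrices, the objective being optimized is straightforward to evaluate directly; below is a plain-NumPy sketch of the quantum relative entropy itself (not the semidefinite-programming approximation the paper proposes), with toy qubit states as illustrative inputs.

```python
import numpy as np

def herm_log(m):
    """Matrix logarithm of a full-rank Hermitian positive definite matrix."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.log(w)) @ v.conj().T

def quantum_relative_entropy(rho, sigma):
    """D(rho || sigma) = Tr[rho (log rho - log sigma)], in nats."""
    return float(np.real(np.trace(rho @ (herm_log(rho) - herm_log(sigma)))))

rho = np.eye(2) / 2            # maximally mixed qubit
sigma = np.diag([0.6, 0.4])    # slightly polarized qubit
```

Minimizing this quantity over a constraint set (e.g. separable states) is what requires the optimization machinery the paper develops.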
Design of high entropy alloys based on the experience from commercial superalloys
NASA Astrophysics Data System (ADS)
Wang, Z.; Huang, Y.; Wang, J.; Liu, C. T.
2015-01-01
High entropy alloys (HEAs) have been drawing increasing attention recently, and gratifying results have been obtained. However, the existing metallurgical rules for HEAs do not provide specific guidance for selecting candidate alloys for structural applications. Our brief survey reveals that many commercial superalloys have medium to high configurational entropies. The experience of commercial superalloys thus provides a clue to help in the development of HEAs for structural applications.
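The configurational-entropy screening implied above reduces to the ideal-mixing formula dS_conf = -R * sum(c_i ln c_i); the compositions below are illustrative assumptions, not alloys from the paper.

```python
import math

R = 8.314  # gas constant, J / (mol K)

def config_entropy(fractions):
    """Ideal-mixing configurational entropy, -R * sum(c_i ln c_i), in J/(mol K)."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return -R * sum(c * math.log(c) for c in fractions if c > 0)

equiatomic_5 = [0.2] * 5                              # classic HEA case: dS = R ln 5
superalloy_like = [0.55, 0.2, 0.1, 0.08, 0.05, 0.02]  # Ni-rich, illustrative composition
```

An equiatomic five-component alloy reaches R ln 5 ≈ 13.4 J/(mol K), while a Ni-rich superalloy-like composition lands in the "medium to high" range the survey refers to.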
The gravity dual of Rényi entropy
Dong, Xi
2016-08-12
A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this, we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.
Giant onsite electronic entropy enhances the performance of ceria for water splitting
Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine A.; ...
2017-08-18
Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista; Seagrave, Richard C.
1993-01-01
The objective of this paper is to present an estimate of the second law thermodynamic efficiency of the various units comprising an Environmental Control and Life Support System (ECLSS). The technique adopted here is based on an evaluation of the 'lost work' within each functional unit of the subsystem. Pertinent information for our analysis is obtained from a user-interactive integrated model of an ECLSS developed using ASPEN. A potential benefit of this analysis is the identification of subsystems with high entropy generation as the most likely candidates for engineering improvements. This work is motivated by the fact that the design objective for a long-term mission should be the evaluation of existing ECLSS technologies not only on the basis of the quantity of work needed for or obtained from each subsystem but also on the quality of that work. In a previous study, Brandhorst estimated the power consumption of partially closed and completely closed regenerable life support systems as 3.5 kW per individual and 10-12 kW per individual, respectively. With the increasing cost and scarcity of energy resources, our attention is drawn to evaluating existing ECLSS technologies on the basis of their energy efficiency. In general, the first law efficiency of a system is usually greater than 50 percent, whereas reported second law efficiencies are usually about 10 percent. The second law efficiency indicates the percentage of energy degraded as irreversibilities within the process, and estimating it points to where equipment design can be improved. From another perspective, our objective is to keep the total entropy production of a life support system as low as possible while still ensuring a positive entropy gradient between the system and the surroundings.
The reason is that as the entropy production of the system increases, the entropy gradient between the system and the surroundings decreases, and the system gradually approaches equilibrium with the surroundings until the gradient reaches zero. At that point no work can be extracted from the system; this is called the 'dead state' of the system.
Katan, Pesia; Kahta, Shani; Sasson, Ayelet; Schiff, Rachel
2017-07-01
Graph complexity as measured by topological entropy has previously been shown to affect performance on artificial grammar learning tasks among typically developing children. The aim of this study was to examine the effect of graph complexity on implicit sequential learning among children with developmental dyslexia. Our goal was to determine whether children's performance depends on the complexity level of the grammar system learned. We conducted two artificial grammar learning experiments that compared the performance of children with developmental dyslexia with that of age- and reading level-matched controls. Experiment 1 was a high topological entropy artificial grammar learning task that aimed to establish implicit learning phenomena in children with developmental dyslexia using previously published experimental conditions. Experiment 2 was a lower topological entropy variant of that task. Results indicated that, given a high topological entropy grammar system, children with developmental dyslexia, like the reading level-matched control group, had substantial difficulty performing the task compared to typically developing children, who exhibited intact implicit learning of the grammar. On the other hand, when tested on a lower topological entropy grammar system, all groups performed above chance level, indicating that children with developmental dyslexia were able to identify rules from a given grammar system. The results reinforce the significance of graph complexity when experimenting with artificial grammar learning tasks, particularly with dyslexic participants.
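The topological entropy of a grammar's transition graph is commonly taken as the logarithm of the spectral radius of its adjacency matrix; a sketch with hypothetical toy graphs (not the study's grammars):

```python
import numpy as np

def topological_entropy(adjacency):
    """Topological entropy of a directed graph: log of the spectral radius
    (largest eigenvalue magnitude) of its adjacency matrix."""
    eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
    return float(np.log(np.max(np.abs(eigenvalues))))

sparse = [[0, 1, 0],   # each state allows one successor (a pure cycle): entropy 0
          [0, 0, 1],
          [1, 0, 0]]
dense = [[1, 1, 0],    # more permitted transitions -> higher complexity
         [1, 1, 1],
         [0, 1, 1]]
```

A richer transition structure yields a larger spectral radius and hence a higher-entropy, harder-to-learn grammar, which is the manipulation contrasted in Experiments 1 and 2.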
EEG entropy measures in anesthesia
Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli
2015-01-01
Highlights: (1) Twelve entropy indices were systematically compared for monitoring depth of anesthesia and detecting burst suppression. (2) Rényi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. (3) Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Rényi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE), and Rényi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. Multifractal detrended fluctuation analysis (MDFA) was included as a non-entropy benchmark. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best.
Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
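Among the indices compared, permutation entropy is compact enough to sketch. This is a generic Shannon permutation entropy (not the study's Tsallis or Rényi variants), with illustrative signals in place of EEG:

```python
import math
import random

def permutation_entropy(signal, order=3):
    """Normalized Shannon permutation entropy of a 1-D signal: 0 means a single
    ordinal pattern dominates, 1 means all order! patterns are equiprobable."""
    counts = {}
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

random.seed(0)
monotone = list(range(100))                      # a single ordinal pattern
noisy = [random.random() for _ in range(5000)]   # patterns near-equiprobable
```

Counting ordinal patterns rather than amplitudes is what makes the PE family cheap and robust, consistent with the computational-efficiency advantage noted above.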
Autonomous entropy-based intelligent experimental design
NASA Astrophysics Data System (ADS)
Malakar, Nabin Kumar
2011-07-01
The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. 
This is a major achievement of this thesis, as it allows information-based collaboration between two robotic units toward a shared goal in an automated fashion.
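A one-dimensional caricature of the nested entropy sampling loop described above; the objective function and all parameters are hypothetical stand-ins for the entropy of predicted experimental outcomes.

```python
import random

random.seed(42)

def predicted_outcome_entropy(x):
    """Stand-in for the entropy of predicted outcomes at design x
    (a hypothetical smooth objective peaked at x = 0.7)."""
    return 1.0 - (x - 0.7) ** 2

def nested_entropy_sampling(objective, n_samples=20, n_iterations=200, step=0.1):
    """Toy sketch: keep a population of candidate experiments and repeatedly
    replace the worst one with a perturbed copy of a survivor that clears the
    rising entropy threshold."""
    samples = [random.uniform(0.0, 1.0) for _ in range(n_samples)]
    for _ in range(n_iterations):
        threshold = min(objective(x) for x in samples)   # rising threshold
        worst = min(samples, key=objective)
        samples.remove(worst)
        while True:                                       # resample above threshold
            seed = random.choice(samples)
            candidate = min(1.0, max(0.0, seed + random.gauss(0.0, step)))
            if objective(candidate) > threshold:
                samples.append(candidate)
                break
    return max(samples, key=objective)

best = nested_entropy_sampling(predicted_outcome_entropy)
```

Because the population's minimum objective can only rise, the samples concentrate near the most informative design without an exhaustive grid search, which is the efficiency claim of the thesis.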
Efficient algorithms and implementations of entropy-based moment closures for rarefied gases
NASA Astrophysics Data System (ADS)
Schaerer, Roman Pascal; Bansal, Pratyuksh; Torrilhon, Manuel
2017-07-01
We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following an approach similar to that of Garrett et al. (2015) [13], we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by mapping its inherent fine-grained parallelism onto the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.
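The dual Newton iteration mentioned above can be illustrated on a one-dimensional, three-moment toy problem with grid quadrature; this is an assumption-laden miniature, not the paper's 35-moment solver.

```python
import numpy as np

def maxent_from_moments(target, iters=50):
    """Toy 1-D maximum-entropy closure: find multipliers lam such that
    f(x) = exp(lam . phi(x)), phi = (1, x, x^2), matches the target moments,
    via Newton's method on the (convex) dual problem."""
    x = np.linspace(-6.0, 6.0, 2001)
    dx = x[1] - x[0]
    phi = np.stack([np.ones_like(x), x, x * x])   # moment basis on the grid
    lam = np.array([0.0, 0.0, -1.0])              # Gaussian-like starting guess
    for _ in range(iters):
        f = np.exp(lam @ phi)
        grad = phi @ f * dx - target              # moment mismatch
        hess = (phi * f) @ phi.T * dx             # dual Hessian
        lam = lam - np.linalg.solve(hess, grad)
    return lam, phi @ np.exp(lam @ phi) * dx

# Moments (mass, mean, second moment) of a standard Gaussian:
lam, moments = maxent_from_moments(np.array([1.0, 0.0, 1.0]))
```

The recovered distribution is the standard Gaussian (lam[2] -> -1/2), and each Newton step requires the same kind of quadrature-heavy moment evaluations that the paper accelerates on GPUs.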
Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.
Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai
2017-02-20
Chaotic external-cavity semiconductor lasers (ECLs) are a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using a physical broadband white chaos generated by optical heterodyning of two ECLs as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum, free of the signatures of relaxation oscillation and external-cavity resonance, but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
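The minimal post-processing step amounts to masking off the least significant bits of each ADC sample; a sketch with simulated (not chaotic) samples standing in for the digitized waveform:

```python
import numpy as np

def extract_lsbs(samples, n_bits=4):
    """Keep the n least significant bits of each 8-bit ADC sample and unpack
    them, most significant of the kept bits first, into a bit stream."""
    lsbs = np.asarray(samples, dtype=np.uint8) & ((1 << n_bits) - 1)
    return ((lsbs[:, None] >> np.arange(n_bits - 1, -1, -1)) & 1).ravel()

rng = np.random.default_rng(7)
adc_samples = rng.integers(0, 256, size=100_000, dtype=np.uint8)  # stand-in for chaos samples
bits = extract_lsbs(adc_samples)
```

Keeping 4 of 8 bits per sample at an 80-GHz sampling rate is exactly the 4 x 80 = 320 Gbps accounting quoted in the abstract.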
On efficiency and interpretation of sawteeth pacing with on-axis ICRH modulation in JET
NASA Astrophysics Data System (ADS)
Murari, A.; Craciunescu, T.; Peluso, E.; Lerche, E.; Gelfusa, M.; Contributors, JET
2017-12-01
In metallic machines ICRH heating is playing an increasingly important role. One of its most recent applications on the Joint European Torus (JET) is sawtooth control by ICRH modulation, aimed at avoiding the triggering of dangerous neo-classical tearing modes (NTMs) and counteracting impurity accumulation. Some of the main difficulties of these experiments are the assessment of the synchronization efficiency and the understanding of the main physical mechanisms at play. In this paper, three independent classes of statistical indicators are introduced to address these issues: recurrence plots, convergent cross mapping, and transfer entropy. The application to JET experiments with the ILW shows that the proposed indicators agree quite well among themselves and provide sound estimates of the efficiency of the synchronization scheme investigated. They also support, with a shot-by-shot analysis and an estimate of the uncertainties, the interpretation that fast ions play a fundamental role in the stabilization of the sawteeth, in both L and H mode. Proposals for future experiments to consolidate the interpretation of the results are discussed.
Entropy and climate. I - ERBE observations of the entropy production of the earth
NASA Technical Reports Server (NTRS)
Stephens, G. L.; O'Brien, D. M.
1993-01-01
An approximate method for estimating the global distributions of the entropy fluxes flowing through the upper boundary of the climate system is introduced, and an estimate of the entropy exchange between the earth and space and the entropy production of the planet is provided. Entropy fluxes calculated from the Earth Radiation Budget Experiment measurements show how the long-wave entropy flux densities dominate the total entropy fluxes at all latitudes compared with the entropy flux densities associated with reflected sunlight, although the short-wave flux densities are important in the context of clear sky-cloudy sky net entropy flux differences. It is suggested that the entropy production of the planet is both constant for the 36 months of data considered and very near its maximum possible value. The mean value of this production is 0.68 × 10^15 W/K, and the amplitude of the annual cycle is approximately 1 to 2 percent of this value.
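The quoted production figure can be sanity-checked with textbook numbers, using the blackbody entropy-flux approximation s ≈ (4/3) F / T; the values below are illustrative global means, not the paper's ERBE data.

```python
# Illustrative global-mean values (assumptions for a back-of-envelope check).
OLR = 240.0          # outgoing long-wave flux, W m^-2
T_EMIT = 255.0       # effective terrestrial emission temperature, K
T_SUN = 5778.0       # solar emission temperature, K
EARTH_AREA = 5.1e14  # m^2

entropy_out = (4.0 / 3.0) * OLR / T_EMIT * EARTH_AREA  # W/K radiated to space
entropy_in = (4.0 / 3.0) * OLR / T_SUN * EARTH_AREA    # W/K delivered by sunlight
production = entropy_out - entropy_in                  # planetary entropy production, W/K
```

The estimate lands near 0.6 x 10^15 W/K, within roughly 10% of the ERBE-derived 0.68 x 10^15 W/K, and shows why the long-wave (low-temperature) flux dominates the budget.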
Papadelis, Christos; Chen, Zhe; Kourtidou-Papadeli, Chrysoula; Bamidis, Panagiotis D; Chouvarda, Ioanna; Bekiaris, Evangelos; Maglaveras, Nikos
2007-09-01
The objective of this study is the development and evaluation of efficient neurophysiological signal statistics, which may assess the driver's alertness level and serve as potential indicators of sleepiness in the design of an on-board countermeasure system. Multichannel EEG, EOG, EMG, and ECG were recorded from sleep-deprived subjects exposed to real field driving conditions. A number of severe driving errors occurred during the experiments. The analysis was performed in two main dimensions: the macroscopic analysis that estimates the on-going temporal evolution of physiological measurements during the driving task, and the microscopic event analysis that focuses on the physiological measurements' alterations just before, during, and after the driving errors. Two independent neurophysiologists visually interpreted the measurements. The EEG data were analyzed by using both linear and non-linear analysis tools. We observed the occurrence of brief paroxysmal bursts of alpha activity and an increased synchrony among EEG channels before the driving errors. The alpha relative band ratio (RBR) significantly increased, and the Cross Approximate Entropy that quantifies the synchrony among channels also significantly decreased before the driving errors. Quantitative EEG analysis revealed significant variations of RBR by driving time in the frequency bands of delta, alpha, beta, and gamma. Most of the estimated EEG statistics, such as the Shannon Entropy, Kullback-Leibler Entropy, Coherence, and Cross-Approximate Entropy, were significantly affected by driving time. We also observed an alteration of eye-blink duration with increased driving time and a significant increase in the number and duration of eye blinks before driving errors. EEG and EOG are promising neurophysiological indicators of driver sleepiness and have the potential to monitor sleepiness in occupational settings when incorporated in a sleepiness countermeasure device.
The occurrence of brief paroxysmal bursts of alpha activity before severe driving errors is described in detail for the first time. Clear evidence is presented that eye-blinking statistics are sensitive to the driver's sleepiness and should be considered in the design of an efficient and driver-friendly sleepiness detection countermeasure device.
Conditional Entropy-Constrained Residual VQ with Application to Image Coding
NASA Technical Reports Server (NTRS)
Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.
1996-01-01
This paper introduces an extension of entropy-constrained residual vector quantization (VQ) where intervector dependencies are exploited. The method, which we call conditional entropy-constrained residual VQ, employs a high-order entropy conditioning strategy that captures local information in the neighboring vectors. When applied to coding images, the proposed method is shown to achieve better rate-distortion performance than that of entropy-constrained residual vector quantization with less computational complexity and lower memory requirements. Moreover, it can be designed to support progressive transmission in a natural way. It is also shown to outperform some of the best predictive and finite-state VQ techniques reported in the literature. This is due partly to the joint optimization between the residual vector quantizer and a high-order conditional entropy coder as well as the efficiency of the multistage residual VQ structure and the dynamic nature of the prediction.
Ehrenfest's Lottery--Time and Entropy Maximization
ERIC Educational Resources Information Center
Ashbaugh, Henry S.
2010-01-01
Successful teaching of the Second Law of Thermodynamics suffers from limited simple examples linking equilibrium to entropy maximization. I describe a thought experiment connecting entropy to a lottery that mixes marbles amongst a collection of urns. This mixing obeys diffusion-like dynamics. Equilibrium is achieved when the marble distribution is…
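The lottery is straightforward to simulate. The sketch below is a minimal illustration, not code from the article; the urn count, marble number, and number of draws are arbitrary choices. Each draw moves one randomly chosen marble to a randomly chosen urn, and we track the Shannon entropy of the marble distribution, which drifts from zero toward its maximum ln(number of urns):

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (in nats) of the marble distribution over urns."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c)

def urn_lottery(n_urns=4, n_marbles=400, n_draws=20000, seed=1):
    """Ehrenfest-style lottery: each draw picks a marble uniformly at
    random and moves it to a randomly chosen urn."""
    rng = random.Random(seed)
    counts = [n_marbles] + [0] * (n_urns - 1)  # all marbles start in urn 0
    s_start = shannon_entropy(counts)
    for _ in range(n_draws):
        src = rng.choices(range(n_urns), weights=counts)[0]  # weighted by occupancy
        counts[src] -= 1
        counts[rng.randrange(n_urns)] += 1
    return s_start, shannon_entropy(counts), math.log(n_urns)

s_start, s_final, s_max = urn_lottery()
```

The entropy climbs from 0 toward ln 4 and then fluctuates near the maximum: equilibrium is the entropy-maximizing marble distribution.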
The Nasal Geometry of the Reindeer Gives Energy-Efficient Respiration
NASA Astrophysics Data System (ADS)
Magnanelli, Elisa; Wilhelmsen, Øivind; Acquarone, Mario; Folkow, Lars P.; Kjelstrup, Signe
2017-01-01
Reindeer in the arctic region live under very harsh conditions and may face temperatures below 233 K. Therefore, efficient conservation of body heat and water is important for their survival. Alongside their insulating fur, the reindeer nasal mechanism for heat and mass exchange during respiration plays a fundamental role. We present a dynamic model to describe the heat and mass transport that takes place inside the reindeer nose, where we account for the complicated geometrical structure of the subsystems that are part of the nose. The model correctly captures the trend in experimental data for the temperature, heat and water recovery in the reindeer nose during respiration. As a reference case, we model a nose with a simple cylindrical-like geometry, where the total volume and contact area are the same as those determined in the reindeer nose. A comparison of the reindeer nose with the reference case shows that the nose geometry has a large influence on the velocity, temperature and water content of the air inside the nose. For all investigated cases, we find that the total entropy production during a breathing cycle is lower for the reindeer nose than for the reference case. The same trend is observed for the total energy consumption. The reduction in the total entropy production afforded by the complicated geometry is larger (up to 20%) at more extreme ambient conditions, when energy efficiency is presumably more important for the maintenance of energy balance in the animal. A hypothesis in the literature states that the most energy-efficient design of a system is characterized by equipartition of the entropy production. In agreement with this hypothesis, we find that the local entropy production during a breathing cycle is significantly more uniform for the reindeer nose than for the reference case. This suggests that natural selection has favored designs that give uniform entropy production when energy efficiency is an issue. Animals living in the harsh arctic climate, such as the reindeer, can therefore serve as inspiration for novel industrial designs with increased efficiency.
Enhanced electrocaloric cooling in ferroelectric single crystals by electric field reversal
NASA Astrophysics Data System (ADS)
Ma, Yang-Bin; Novak, Nikola; Koruza, Jurij; Yang, Tongqing; Albe, Karsten; Xu, Bai-Xiang
2016-09-01
An improved thermodynamic cycle is validated in ferroelectric single crystals, in which the cooling effect of an electrocaloric refrigerant is enhanced by applying a reversed electric field. In contrast to the conventional adiabatic heating or cooling by on-off cycling of the external electric field, applying a reversed field significantly improves the cooling efficiency, since the variation in configurational entropy is increased. By comparing results from computer simulations using Monte Carlo algorithms and experiments using direct electrocaloric measurements, we show that the electrocaloric cooling efficiency can be enhanced by more than 20% in standard ferroelectrics as well as in relaxor ferroelectrics such as Pb(Mg1/3Nb2/3)0.71Ti0.29O3.
Entropy Beacon: A Hairpin-Free DNA Amplification Strategy for Efficient Detection of Nucleic Acids
2015-01-01
Here, we propose an efficient strategy for enzyme- and hairpin-free nucleic acid detection called an entropy beacon (abbreviated as Ebeacon). Different from previously reported DNA hybridization/displacement-based strategies, Ebeacon is driven forward by increases in the entropy of the system, instead of free energy released from new base-pair formation. Ebeacon shows high sensitivity, with a detection limit of 5 pM target DNA in buffer and 50 pM in cellular homogenate. Ebeacon also benefits from the hairpin-free amplification strategy and zero background, excellent thermostability from 20 °C to 50 °C, and good resistance to complex environments. In particular, based on the large difference between the breathing rates of a single base pair and of two adjacent base pairs, Ebeacon shows high selectivity toward base mutations such as substitution, insertion, and deletion, and is therefore an efficient nucleic acid detection method, comparable to most reported enzyme-free strategies. PMID:26505212
NASA Astrophysics Data System (ADS)
Sadeghi, Pegah; Safavinejad, Ali
2017-11-01
Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: one in which both walls are at a prescribed temperature, and mixed boundary conditions in which one wall is at a prescribed temperature and the other is at a prescribed heat flux. The effects of wall emissivities, optical thickness, single scattering albedo, and anisotropic-scattering factor on the entropy generation are investigated in detail. The results reveal that entropy generation in the system mainly arises from irreversible radiative transfer at the wall with the lower temperature. The total entropy generation rate for the system with prescribed temperatures at the walls increases markedly as the wall emissivity increases; conversely, for the system with mixed boundary conditions, the total entropy generation rate decreases slightly. Furthermore, as the optical thickness increases, the total entropy generation rate decreases markedly for the system with prescribed temperatures at the walls; for the system with mixed boundary conditions, however, it increases. Variation of the single scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and the wall emissivities have a significant effect on the entropy generation in a system at radiative equilibrium. Accounting for the parameters that significantly affect radiative entropy generation provides an opportunity to optimize the design or increase the overall performance and efficiency of systems at radiative equilibrium by applying entropy minimization techniques.
Extended statistical entropy analysis as a quantitative management tool for water resource systems
NASA Astrophysics Data System (ADS)
Sobantka, Alicja; Rechberger, Helmut
2010-05-01
The use of entropy in hydrology and water resources has found a wide range of applications. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by determining the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. As an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, the SEA will be extended for application to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the water treatment process are taken into account and quantified in their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. This management tool would make it possible to determine the efficiency of WWTPs. By improving and optimizing the efficiency of WWTPs with respect to the state of the art of technology, waste water treatment could become more resource-preserving.
NASA Astrophysics Data System (ADS)
Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn
2016-06-01
Forests are a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve, and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming, because it requires intensive physical labor and high costs, especially when surveying remote mountainous regions. A reliable forest inventory provides more accurate and timely information for developing new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at large scales. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest comprising almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. A forest tree species inventory map was obtained from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and was also used as ground truth for the accuracy assessment of the tree species classification. The Landsat images and DEM were processed for maximum entropy modeling, and the model was applied in two experiments: the first used Landsat surface reflectance alone for tree species classification, while the second incorporated terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess classification accuracy. Results show that the second experiment, which coupled Landsat surface reflectance with terrain variables, produced the better result, with a higher overall accuracy and kappa coefficient than the first. The results indicate that the Maximum Entropy method is applicable here, and that classifying tree species from satellite imagery coupled with terrain information can improve the classification of tree species in the study area.
Development of a novel high-entropy alloy with eminent efficiency of degrading azo dye solutions
Lv, Z. Y.; Liu, X. J.; Jia, B.; Wang, H.; Wu, Y.; Lu, Z. P.
2016-01-01
In addition to its scientific importance, the degradation of azo dyes is of practical significance from the perspective of environmental protection. Although encouraging progress has been made on developing degradation approaches and materials, it is still challenging to fully resolve this long-standing problem. Herein, we report that high-entropy alloys, which have emerged as a new class of metallic materials in the last decade, show excellent performance in degrading azo dyes. In particular, the newly developed AlCoCrTiZn high-entropy alloy synthesized by mechanical alloying exhibits a prominent efficiency in degrading the azo dye Direct Blue 6 (DB6), as high as that of the best metallic glass reported so far. The AlCoCrTiZn HEA powder has a low activation energy barrier for the degradation reaction, i.e., 30 kJ/mol, and thus makes the reaction occur more easily than with other materials such as glassy Fe-based powders. The excellent capability of our high-entropy alloys in degrading azo dyes is attributed to their unique atomic structure with severe lattice distortion, chemical composition effects, residual stress, and high specific surface area. Our findings have important implications for developing novel high-entropy alloys for functional applications as catalyst materials. PMID:27677462
Computational Methods for Configurational Entropy Using Internal and Cartesian Coordinates.
Hikiri, Simon; Yoshidome, Takashi; Ikeguchi, Mitsunori
2016-12-13
The configurational entropy of solute molecules is a crucially important quantity for studying various biophysical processes. Consequently, it is necessary to establish an efficient quantitative computational method to calculate configurational entropy as accurately as possible. In the present paper, we investigate the quantitative performance of the quasi-harmonic and related computational methods, including widely used methods implemented in popular molecular dynamics (MD) software packages, compared with the Clausius method, which is capable of accurately computing the change of configurational entropy upon temperature change. Notably, we focused on the choice of coordinate system (i.e., internal or Cartesian coordinates). The Boltzmann-quasi-harmonic (BQH) method using internal coordinates outperformed all six methods examined here. The introduction of improper torsions in the BQH method improves its performance, and the anharmonicity of proper torsions in proteins is identified as the origin of the superior performance of the BQH method. In contrast, widely used methods implemented in MD packages show rather poor performance. In addition, the enhanced sampling of replica-exchange MD simulations was found to be efficient for the convergent behavior of entropy calculations. The BQH method was also reasonably accurate for the folding/unfolding transitions of the small protein Chignolin. However, the independent term without the correlation term in the BQH method was the most accurate for the folding entropy among the methods considered in this study, because the QH approximation of the correlation term in the BQH method was no longer valid for the divergent unfolded structures.
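As a minimal sketch of the quasi-harmonic idea in its classical Gaussian limit (not the paper's mass-weighted, quantum-corrected implementation), the configurational entropy of a coordinate distribution can be estimated from the determinant of the sample covariance matrix, S/k = ½ ln[(2πe)^d det C]. The two-dimensional "trajectory" below is synthetic, so the estimate can be checked against the analytic value:

```python
import math
import random

def gaussian_entropy_qh(samples):
    """Quasi-harmonic (Gaussian) estimate of configurational entropy in
    units of k_B: S = 0.5 * ln((2*pi*e)^d * det(C)), where C is the
    sample covariance of the d-dimensional coordinates (here d = 2)."""
    n = len(samples)
    d = len(samples[0])
    mean = [sum(s[i] for s in samples) / n for i in range(d)]
    # sample covariance matrix; for d = 2 the determinant is computed directly
    c = [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / (n - 1)
          for j in range(d)] for i in range(d)]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    return 0.5 * math.log((2 * math.pi * math.e) ** d * det)

rng = random.Random(0)
# synthetic "trajectory": two correlated harmonic coordinates
xs = [rng.gauss(0, 1.0) for _ in range(20000)]
samples = [(x, 0.5 * x + rng.gauss(0, 0.5)) for x in xs]
s_est = gaussian_entropy_qh(samples)
# analytic value for covariance [[1, 0.5], [0.5, 0.5]]: det = 0.25
s_true = 0.5 * math.log((2 * math.pi * math.e) ** 2 * 0.25)
```

The correlation between the two coordinates lowers det C and hence the entropy, which is exactly the correlation contribution the paper's BQH analysis scrutinizes.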
Entropy Generation Across Earth's Bow Shock
NASA Technical Reports Server (NTRS)
Parks, George K.; McCarthy, Michael; Fu, Suiyan; Lee, E. S.; Cao, Jinbin; Goldstein, Melvyn L.; Canu, Patrick; Dandouras, Iannis S.; Reme, Henri; Fazakerley, Andrew
2011-01-01
Earth's bow shock is a transition layer that causes an irreversible change in the state of a plasma that is stationary in time. Theories predict that entropy increases across the bow shock, but entropy has never been directly measured. The Cluster and Double Star plasma experiments measure 3D plasma distributions upstream and downstream of the bow shock that allow calculation of Boltzmann's entropy function H and the application of his famous H-theorem, dH/dt ≤ 0. We present the first direct measurements of entropy density changes across Earth's bow shock. We will show that this entropy generation may be part of the processes that produce the non-thermal plasma distributions, and that it is consistent with a kinetic entropy flux model derived from the collisionless Boltzmann equation, giving strong support that the solar wind's total entropy across the bow shock remains unchanged. As far as we know, our results are not explained by any existing shock models and should be of interest to theorists.
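Boltzmann's H-function can be illustrated on a discretized one-dimensional velocity distribution. The Maxwellians below are schematic stand-ins for measured upstream (cold) and downstream (shock-heated) distributions, not Cluster data:

```python
import math

def boltzmann_h(f, dv):
    """Discrete Boltzmann H-function, H = sum f ln f dv, for a 1-D
    velocity distribution f sampled on a uniform grid of spacing dv."""
    return sum(fi * math.log(fi) * dv for fi in f if fi > 0)

def maxwellian(v, n, vth):
    """1-D Maxwellian with density n and thermal speed vth."""
    return n / (math.sqrt(2 * math.pi) * vth) * math.exp(-v * v / (2 * vth ** 2))

dv = 0.01
grid = [i * dv for i in range(-3000, 3001)]  # velocities from -30 to 30
h_up = boltzmann_h([maxwellian(v, 1.0, 1.0) for v in grid], dv)    # upstream: cold
h_down = boltzmann_h([maxwellian(v, 1.0, 2.0) for v in grid], dv)  # downstream: heated
```

Since the entropy density is s = -kH, the drop in H for the heated downstream distribution corresponds to the entropy increase the theories predict across the shock.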
Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.
Mammone, Nadia; Morabito, Francesco Carlo
2008-09-01
Artifacts are disturbances that may occur during signal acquisition and may affect subsequent processing. The aim of this paper is to propose a technique for automatically detecting artifacts in electroencephalographic (EEG) recordings. In particular, a technique is presented that uses Independent Component Analysis (ICA) to extract artifactual signals and Renyi's entropy to automatically detect them. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals, against an average of 68.7% for the previous technique on the studied database. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG, in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
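As a toy version of the entropy criterion (with synthetic signals, not the EEG database used in the paper), artifact-like components concentrate their amplitude histogram in a few bins and therefore score a lower Renyi entropy than ongoing background activity:

```python
import math
import random

def renyi_entropy(signal, order=2, bins=32):
    """Renyi entropy of order alpha estimated from an amplitude
    histogram: H_a = ln(sum p_i^a) / (1 - a)."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = [0] * bins
    for x in signal:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(signal)
    return math.log(sum((c / n) ** order for c in counts if c)) / (1 - order)

rng = random.Random(42)
background = [rng.gauss(0, 1) for _ in range(5000)]  # ongoing EEG-like activity
# spiky burst: mostly background plus a short run of large-amplitude samples
artifact = background[:4900] + [25.0 + rng.random() for _ in range(100)]
```

The artifactual component's histogram is dominated by a few bins, so its Renyi entropy falls below that of the background, which is the decision statistic the detection rule thresholds.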
NASA Astrophysics Data System (ADS)
Ai, Yan-Ting; Guan, Jiao-Yue; Fei, Cheng-Wei; Tian, Jing; Zhang, Feng-Ling
2017-05-01
To monitor the operating status of rolling bearings with casings in real time, efficiently and accurately, a fusion method based on an n-dimensional characteristic parameter distance (n-DCPD) is proposed for rolling bearing fault diagnosis with two types of signals: vibration and acoustic emission. The n-DCPD is built on four information entropies (singular spectrum entropy in the time domain, power spectrum entropy in the frequency domain, and wavelet space characteristic spectrum entropy and wavelet energy spectrum entropy in the time-frequency domain), and the basic idea of the fusion information entropy fault diagnosis method with n-DCPD is presented. On a rotor simulation test rig, vibration and acoustic emission signals of six rolling bearing conditions (ball fault, inner race fault, outer race fault, inner-ball faults, inner-outer faults, and normal) were collected under different operating conditions, with rotation speeds from 800 rpm to 2000 rpm. Using the proposed fusion information entropy method with n-DCPD, the diagnosis of rolling bearing faults was completed. The fault diagnosis results show that the fusion entropy method achieves high precision in the recognition of rolling bearing faults. This study provides a novel and useful methodology for the fault diagnosis of aeroengine rolling bearings.
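The distance-based fusion step can be sketched as follows; the four-entropy feature vectors and fault templates below are made-up numbers for illustration (in the paper they are computed from real vibration and acoustic emission signals):

```python
import math

def n_dcpd(features, template):
    """n-dimensional characteristic parameter distance: Euclidean distance
    between a measured entropy-feature vector and a fault-class template."""
    return math.sqrt(sum((f - t) ** 2 for f, t in zip(features, template)))

# hypothetical 4-entropy feature vectors (singular-spectrum, power-spectrum,
# wavelet-space, wavelet-energy) for known bearing conditions
templates = {
    "normal":     (0.91, 0.88, 0.92, 0.90),
    "inner race": (0.62, 0.55, 0.60, 0.58),
    "outer race": (0.45, 0.71, 0.52, 0.66),
}
measured = (0.60, 0.57, 0.61, 0.56)  # entropies of the signal under test
diagnosis = min(templates, key=lambda k: n_dcpd(measured, templates[k]))
```

The measured vector is assigned to the nearest template, so the four entropies from different signal domains are fused into a single diagnostic decision.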
Financial time series analysis based on effective phase transfer entropy
NASA Astrophysics Data System (ADS)
Yang, Pengbo; Shang, Pengjian; Lin, Aijing
2017-02-01
Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
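A minimal discrete transfer entropy estimator with history length 1 illustrates the basic quantity (the paper's effective phase transfer entropy additionally operates on instantaneous phases with surrogate corrections; the coupled binary series below are synthetic):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X->Y} (in nats) with history length 1:
    sum over p(y1, y0, x0) * ln [ p(y1 | y0, x0) / p(y1 | y0) ]."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        te += (c / n) * math.log((c / pairs_yx[(y0, x0)]) /
                                 (pairs_yy[(y1, y0)] / singles[y0]))
    return te

rng = random.Random(7)
x = [rng.randint(0, 1) for _ in range(30000)]
# y follows x with a one-step delay 80% of the time, so information flows X -> Y
y = [0] + [xi if rng.random() < 0.8 else rng.randint(0, 1) for xi in x[:-1]]
```

The estimator is asymmetric: TE from x to y is large while TE from y to x stays near zero, which is how the measure identifies the direction of information flow.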
NASA Astrophysics Data System (ADS)
Whitney, Robert S.
2015-03-01
We investigate the nonlinear scattering theory for quantum systems with strong Seebeck and Peltier effects, and consider their use as heat engines and refrigerators with finite power outputs. This paper gives detailed derivations of the results summarized in a previous paper [R. S. Whitney, Phys. Rev. Lett. 112, 130601 (2014), 10.1103/PhysRevLett.112.130601]. It shows how to use the scattering theory to find (i) the quantum thermoelectric with maximum possible power output, and (ii) the quantum thermoelectric with maximum efficiency at given power output. The latter corresponds to a minimal entropy production at that power output. These quantities are of quantum origin since they depend on system size over electronic wavelength, and so have no analog in classical thermodynamics. The maximal efficiency coincides with Carnot efficiency at zero power output, but decreases with increasing power output. This gives a fundamental lower bound on entropy production, which means that reversibility (in the thermodynamic sense) is impossible for finite power output. The suppression of efficiency by (nonlinear) phonon and photon effects is addressed in detail; when these effects are strong, maximum efficiency coincides with maximum power. Finally, we show in particular limits (typically without magnetic fields) that relaxation within the quantum system does not allow the system to exceed the bounds derived for relaxation-free systems; however, a general proof of this remains elusive.
Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
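The cross-entropy utility can be illustrated on a discrete toy problem. The paper optimizes continuous input variables with simulated annealing and Bayesian updating; here the candidate designs and their prediction/observation distributions are hypothetical numbers:

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i ln q_i between two discrete
    distributions on the same support (terms with p_i = 0 contribute 0)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# hypothetical (model-prediction, observation) distribution pairs at three
# candidate input settings; the most informative design maximizes the distance
designs = {
    "x=0.1": ([0.7, 0.2, 0.1], [0.6, 0.3, 0.1]),
    "x=0.5": ([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]),
    "x=0.9": ([0.4, 0.4, 0.2], [0.4, 0.3, 0.3]),
}
best = max(designs, key=lambda k: cross_entropy(*designs[k]))
```

Selecting the design where prediction and observation distributions disagree most yields the experiment that provides the most conclusive comparison for model validation.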
Wang, Lei; Troyer, Matthias
2014-09-12
We present a new algorithm for calculating the Renyi entanglement entropy of interacting fermions using the continuous-time quantum Monte Carlo method. The algorithm only samples the interaction correction of the entanglement entropy, which by design ensures the efficient calculation of weakly interacting systems. Combined with Monte Carlo reweighting, the algorithm also performs well for systems with strong interactions. We demonstrate the potential of this method by studying the quantum entanglement signatures of the charge-density-wave transition of interacting fermions on a square lattice.
NASA Astrophysics Data System (ADS)
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low evaluation accuracy when evaluating the performance of mining projects, a performance evaluation model for mineral projects founded on an improved entropy method is proposed. First, a new weight assignment model is established based on compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: once the compatibility matrix analysis achieves the consistency requirement, any differences between the subjective and objective weights are resolved by moderately adjusting their proportions, and a fuzzy evaluation matrix for performance evaluation is then constructed on this basis. Simulation experiments show that, compared with the traditional entropy and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.
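The objective-weight half of such a hybrid scheme, the classical entropy value method, can be sketched as follows; the decision matrix is illustrative, and the AHP compatibility-matrix step that supplies the subjective weights is omitted:

```python
import math

def entropy_weights(matrix):
    """Objective indicator weights by the entropy value method: normalize
    each indicator column, compute its Shannon entropy (normalized by
    ln m), and weight indicators by their divergence (1 - entropy)."""
    m = len(matrix)  # number of alternatives (e.g. mining projects)
    cols = list(zip(*matrix))
    entropies = []
    for col in cols:
        s = sum(col)
        e = -sum((v / s) * math.log(v / s) for v in col if v) / math.log(m)
        entropies.append(e)
    div = [1 - e for e in entropies]  # an indicator that varies more gets more weight
    total = sum(div)
    return [d / total for d in div]
```

An indicator that is identical across all alternatives has maximum entropy and receives zero weight, since it carries no information for ranking the projects.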
Entropy-based link prediction in weighted networks
NASA Astrophysics Data System (ADS)
Xu, Zhongqi; Pu, Cunlai; Ramiz Sharafat, Rajput; Li, Lunbo; Yang, Jian
2017-01-01
Information entropy has been proved to be an effective tool to quantify the structural importance of complex networks. In previous work (Xu et al., 2016), we measure the contribution of a path in link prediction with information entropy. In this paper, we further quantify the contribution of a path with both path entropy and path weight, and propose a weighted prediction index based on the contributions of paths, namely Weighted Path Entropy (WPE), to improve the prediction accuracy in weighted networks. Empirical experiments on six weighted real-world networks show that WPE achieves higher prediction accuracy than three typical weighted indices.
Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie
2017-04-27
Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.
An entropy-assisted musculoskeletal shoulder model.
Xu, Xu; Lin, Jia-Hua; McGorry, Raymond W
2017-04-01
Optimization combined with a musculoskeletal shoulder model has been used to estimate mechanical loading of musculoskeletal elements around the shoulder. Traditionally, the objective function is to minimize the summation of the total activities of the muscles with forces, moments, and stability constraints. Such an objective function, however, tends to neglect the antagonist muscle co-contraction. In this study, an objective function including an entropy term is proposed to address muscle co-contractions. A musculoskeletal shoulder model is developed to apply the proposed objective function. To find the optimal weight for the entropy term, an experiment was conducted. In the experiment, participants generated various 3-D shoulder moments in six shoulder postures. The surface EMG of 8 shoulder muscles was measured and compared with the predicted muscle activities based on the proposed objective function using Bhattacharyya distance and concordance ratio under different weight of the entropy term. The results show that a small weight of the entropy term can improve the predictability of the model in terms of muscle activities. Such a result suggests that the concept of entropy could be helpful for further understanding the mechanism of muscle co-contractions as well as developing a shoulder biomechanical model with greater validity. Copyright © 2017 Elsevier Ltd. All rights reserved.
On the efficiency of sovereign bond markets
NASA Astrophysics Data System (ADS)
Zunino, Luciano; Fernández Bariviera, Aurelio; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.
2012-09-01
The existence of memory in financial time series has been extensively studied for several stock markets around the world by means of different approaches. However, fixed income markets, i.e. those where corporate and sovereign bonds are traded, have been much less studied. We believe that, given the relevance of these markets, not only from the investors', but also from the issuers' point of view (government and firms), it is necessary to fill this gap in the literature. In this paper, we study the sovereign market efficiency of thirty bond indices of both developed and emerging countries, using an innovative statistical tool in the financial literature: the complexity-entropy causality plane. This representation space allows us to establish an efficiency ranking of different markets and distinguish different bond market dynamics. We conclude that the classification derived from the complexity-entropy causality plane is consistent with the qualifications assigned by major rating companies to the sovereign instruments. Additionally, we find a correlation between permutation entropy, economic development and market size that could be of interest for policy makers and investors.
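Permutation entropy, the entropy axis of the complexity-entropy causality plane, has a compact implementation. The two toy series below (white noise versus a smooth deterministic signal) stand in for efficient and predictable market dynamics; they are not bond-index data:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized permutation entropy (Bandt-Pompe): Shannon entropy of
    the ordinal patterns of length `order`, divided by its maximum
    ln(order!), so the result lies in [0, 1]."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1))
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))

rng = random.Random(3)
noise = [rng.random() for _ in range(5000)]        # unpredictable: entropy near 1
trend = [math.sin(i / 50.0) for i in range(5000)]  # predictable: entropy well below 1
```

A market whose returns behave like the noise series sits near the maximum-entropy end of the ranking, while persistent, predictable dynamics score lower, which is the basis of the efficiency ordering described above.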
Sort entropy-based for the analysis of EEG during anesthesia
NASA Astrophysics Data System (ADS)
Ma, Liang; Huang, Wei-Zhi
2010-08-01
The monitoring of anesthetic depth is an essential procedure during surgical operations, and judging and controlling the depth of anesthesia remains a clinical issue that urgently needs to be resolved. In this paper, collected EEG signals are processed with sort entropy. The signal response of the surface of the cerebral cortex is determined for patients at different stages of anesthesia. The EEG is simulated and analyzed using a fast algorithm for sort entropy. The results show that the phasic changes of the EEG are detected accurately, and that sort entropy has better noise immunity than approximate entropy when analyzing the EEG under anesthesia. In conclusion, the sort entropy algorithm requires less computing time; it is highly efficient and strongly resistant to interference.
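The "sort entropy" used above is closely related to permutation (ordinal-pattern) entropy, which quantifies signal regularity from the relative ordering of consecutive samples rather than their amplitudes. A minimal sketch of that idea (the function name and parameters are illustrative, not taken from the paper):

```python
import math

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (Bandt-Pompe style)."""
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = tuple(signal[i + j * delay] for j in range(order))
        # ordinal pattern: indices that sort the values within the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))  # normalize to [0, 1]
```

A strictly monotone signal produces a single ordinal pattern and hence zero entropy, while an irregular signal spreads probability over many patterns and approaches 1.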
Quench action and Rényi entropies in integrable systems
NASA Astrophysics Data System (ADS)
Alba, Vincenzo; Calabrese, Pasquale
2017-09-01
Entropy is a fundamental concept in equilibrium statistical mechanics, yet its origin in the nonequilibrium dynamics of isolated quantum systems is not fully understood. A strong consensus is emerging around the idea that the stationary thermodynamic entropy is the von Neumann entanglement entropy of a large subsystem embedded in an infinite system. Also motivated by cold-atom experiments, here we consider the generalization to Rényi entropies. We develop a new technique to calculate the diagonal Rényi entropy in the quench action formalism. In the spirit of the replica treatment for the entanglement entropy, the diagonal Rényi entropies are generalized free energies evaluated over a thermodynamic macrostate which depends on the Rényi index and, in particular, is not the same state describing von Neumann entropy. The technical reason for this perhaps surprising result is that the evaluation of the moments of the diagonal density matrix shifts the saddle point of the quench action. An interesting consequence is that different Rényi entropies encode information about different regions of the spectrum of the postquench Hamiltonian. Our approach provides a very simple proof of the long-standing issue that, for integrable systems, the diagonal entropy is half of the thermodynamic one and it allows us to generalize this result to the case of arbitrary Rényi entropy.
Lin, Shiang-Tai; Maiti, Prabal K; Goddard, William A
2010-06-24
Presented here is the two-phase thermodynamic (2PT) model for the calculation of energy and entropy of molecular fluids from the trajectory of molecular dynamics (MD) simulations. In this method, the density of state (DoS) functions (including the normal modes of translation, rotation, and intramolecular vibration motions) are determined from the Fourier transform of the corresponding velocity autocorrelation functions. A fluidicity parameter (f), extracted from the thermodynamic state of the system derived from the same MD, is used to partition the translation and rotation modes into a diffusive, gas-like component (with 3Nf degrees of freedom) and a nondiffusive, solid-like component. The thermodynamic properties, including the absolute value of entropy, are then obtained by applying quantum statistics to the solid component and applying hard sphere/rigid rotor thermodynamics to the gas component. The 2PT method produces exact thermodynamic properties of the system in two limiting states: the nondiffusive solid state (where the fluidicity is zero) and the ideal gas state (where the fluidicity becomes unity). We examine the 2PT entropy for various water models (F3C, SPC, SPC/E, TIP3P, and TIP4P-Ew) at ambient conditions and find good agreement with literature results obtained based on other simulation techniques. We also validate the entropy of water in the liquid and vapor phases along the vapor-liquid equilibrium curve from the triple point to the critical point. We show that this method produces converged liquid phase entropy in tens of picoseconds, making it an efficient means for extracting thermodynamic properties from MD simulations.
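The first step of the 2PT procedure, obtaining the density of states from the Fourier transform of the velocity autocorrelation function, can be sketched as follows. This is a simplified illustration with unit-free prefactors; the array shapes and names are assumptions, not the authors' code:

```python
import numpy as np

def density_of_states(velocities, masses, dt):
    """Vibrational density of states vs. frequency from the mass-weighted
    velocity autocorrelation function (VACF), 2PT-style sketch.
    velocities: array of shape (nsteps, natoms, 3); dt: timestep."""
    nsteps = velocities.shape[0]
    vac = np.zeros(nsteps)
    for a in range(velocities.shape[1]):
        for d in range(3):
            v = velocities[:, a, d]
            # linear autocorrelation via zero-padded FFT (Wiener-Khinchin)
            f = np.fft.fft(v, n=2 * nsteps)
            acf = np.fft.ifft(f * np.conj(f)).real[:nsteps]
            vac += masses[a] * acf
    dos = np.abs(np.fft.rfft(vac)) * dt   # spectral density (arbitrary units)
    freqs = np.fft.rfftfreq(nsteps, d=dt)
    return freqs, dos
```

As a sanity check, a purely harmonic trajectory (a sinusoidal velocity) should produce a density of states peaked at the oscillation frequency.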
Quantum engine efficiency bound beyond the second law of thermodynamics.
Niedenzu, Wolfgang; Mukherjee, Victor; Ghosh, Arnab; Kofman, Abraham G; Kurizki, Gershon
2018-01-11
According to the second law, the efficiency of cyclic heat engines is limited by the Carnot bound that is attained by engines that operate between two thermal baths under the reversibility condition whereby the total entropy does not increase. Quantum engines operating between a thermal and a squeezed-thermal bath have been shown to surpass this bound. Yet, their maximum efficiency cannot be determined by the reversibility condition, which may yield an unachievable efficiency bound above unity. Here we identify the fraction of the exchanged energy between a quantum system and a bath that necessarily causes an entropy change and derive an inequality for this change. This inequality reveals an efficiency bound for quantum engines energised by a non-thermal bath. This bound does not imply reversibility, unless the two baths are thermal. It cannot be solely deduced from the laws of thermodynamics.
The many faces of the second law
NASA Astrophysics Data System (ADS)
Van den Broeck, C.
2010-10-01
There exists no perpetuum mobile of the second kind. We review the implications of this observation on the second law, on the efficiency of thermal machines, on Onsager symmetry, on Brownian motors and Brownian refrigerators, and on the universality of efficiency of thermal machines at maximum power. We derive a microscopic expression for the stochastic entropy production, and obtain from it the detailed and integral fluctuation theorem. We close with the remarkable observation that the second law can be split in two: the total entropy production is the sum of two contributions each of which is growing independently in time.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
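The link invoked above, that Shannon's entropy follows from Stirling's approximation to the thermodynamic (log-multiplicity) entropy, can be checked numerically. This is a generic illustration of that correspondence, not the authors' constellation-design code:

```python
import math

def log_multiplicity_per_symbol(counts):
    """Exact Boltzmann-style entropy per symbol: ln(N! / prod(n_i!)) / N,
    computed stably via lgamma."""
    n = sum(counts)
    log_w = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_w / n

def shannon_entropy(counts):
    """Shannon entropy (nats) of the empirical distribution p_i = n_i / N."""
    n = sum(counts)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)
```

For large N the two quantities agree, which is exactly Stirling's approximation at work; the gap shrinks roughly like ln(N)/N.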
Kuntzelman, Karl; Jack Rhodes, L; Harrington, Lillian N; Miskovic, Vladimir
2018-06-01
There is a broad family of statistical methods for capturing time series regularity, with increasingly widespread adoption by the neuroscientific community. A common feature of these methods is that they permit investigators to quantify the entropy of brain signals - an index of unpredictability/complexity. Despite the proliferation of algorithms for computing entropy from neural time series data there is scant evidence concerning their relative stability and efficiency. Here we evaluated several different algorithmic implementations (sample, fuzzy, dispersion and permutation) of multiscale entropy in terms of their stability across sessions, internal consistency and computational speed, accuracy and precision using a combination of electroencephalogram (EEG) and synthetic 1/ƒ noise signals. Overall, we report fair to excellent internal consistency and longitudinal stability over a one-week period for the majority of entropy estimates, with several caveats. Computational timing estimates suggest distinct advantages for dispersion and permutation entropy over other entropy estimates. Considered alongside the psychometric evidence, we suggest several ways in which researchers can maximize computational resources (without sacrificing reliability), especially when working with high-density M/EEG data or multivoxel BOLD time series signals. Copyright © 2018 Elsevier Inc. All rights reserved.
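Of the estimators compared above, sample entropy is the most established. A minimal single-scale sketch follows; multiscale variants apply the same estimator to coarse-grained copies of the signal, and the implementation details here (defaults, brute-force matching) are illustrative rather than the authors' code:

```python
import math

def sample_entropy(x, m=2, r=None):
    """Single-scale sample entropy SampEn(m, r): negative log of the
    conditional probability that sequences matching for m points
    (Chebyshev distance <= r) also match for m + 1 points.
    r defaults to 0.2 * standard deviation, a common convention."""
    n = len(x)
    if r is None:
        mu = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / n)

    def pair_count(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        matches = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    matches += 1
        return matches

    b, a = pair_count(m), pair_count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

A strictly periodic signal yields a value near zero (matches at length m almost always extend to length m + 1), whereas an irregular signal yields a larger value.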
Characterization of Early Partial Seizure Onset: Frequency, Complexity and Entropy
Jouny, Christophe C.; Bergey, Gregory K.
2011-01-01
Objective: A clear classification of partial seizure onset features is not yet established. Complexity and entropy have been very widely used to describe dynamical systems, but a systematic evaluation of these measures to characterize partial seizures has never been performed. Methods: Eighteen different measures, including power in frequency bands up to 300 Hz, Gabor atom density (GAD), Higuchi fractal dimension (HFD), Lempel-Ziv complexity, Shannon entropy, sample entropy, and permutation entropy, were selected to test sensitivity to partial seizure onset. Intracranial recordings from forty-five patients with mesial temporal, neocortical temporal and neocortical extratemporal seizure foci were included (331 partial seizures). Results: GAD, Lempel-Ziv complexity, HFD, high frequency activity, and sample entropy were the most reliable measures to assess early seizure onset. Conclusions: Increases in complexity and occurrence of high-frequency components appear to be commonly associated with early stages of partial seizure evolution from all regions. The type of measure (frequency-based, complexity or entropy) does not predict the efficiency of the method to detect seizure onset. Significance: Differences between measures such as GAD and HFD highlight the multimodal nature of partial seizure onsets. Improved methods for early seizure detection may be achieved from a better understanding of these underlying dynamics. PMID:21872526
Magnetic Stirling cycles - A new application for magnetic materials
NASA Technical Reports Server (NTRS)
Brown, G. V.
1977-01-01
There is the prospect of a fundamental new application for magnetic materials as the working substance in thermodynamic cycles. Recuperative cycles which use a rare-earth ferromagnetic material near its Curie point in the field of a superconducting magnet appear feasible for applications from below 20 K to above room temperature. The elements of the cycle, advanced in an earlier paper, are summarized. The basic advantages include high entropy density in the magnetic material, completely reversible processes, convenient control of the entropy by the applied field, the feature that heat transfer is possible during all processes, and the ability of the ideal cycle to attain Carnot efficiency. The mean field theory is used to predict the entropy of a ferromagnet in an applied field and also the isothermal entropy change and isentropic temperature change caused by applying a field. Results are presented for J = 7/2 and g = 2. The results for isentropic temperature change are compared with experimental data on Gd. Coarse mixtures of ferromagnetic materials with different Curie points are proposed to modify the path of the cycle in the T-S diagram in order to improve the efficiency or to increase the specific power.
Dynamic Impact Behaviour of High Entropy Alloys Used in the Military Domain
NASA Astrophysics Data System (ADS)
Geantă, V.; Voiculescu, I.; Stefănoiu, R.; Chereches, T.; Zecheru, T.; Matache, L.; Rotariu, A.
2018-06-01
AlFeCrCoNi high entropy alloys (HEA) feature significant compressive strength characteristics, being usable for severe impact applications in the military domain. The research paper presents the results obtained by testing the impact resistance of four HEA samples of different chemical compositions at perforation with 7.62 mm calibre incendiary armour-piercing bullets. The dynamical behaviour was modelled by numerical simulation based on the results of the dynamic tests conducted in the firing range, thus allowing the development of more efficient high entropy alloys, to be used for collective/personal protection.
Minimum entropy density method for the time series analysis
NASA Astrophysics Data System (ADS)
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
Analysis of entropy extraction efficiencies in random number generation systems
NASA Astrophysics Data System (ADS)
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
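A classic example of the trade-off between output quality and extraction efficiency discussed above is von Neumann debiasing: from independent but biased raw bits it produces perfectly unbiased output bits, at the cost of discarding at least three quarters of the raw stream on average. A minimal sketch (this generic technique is offered for illustration; it is not necessarily one of the arrival-time schemes reviewed in the paper):

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing: map raw bit pairs 01 -> 0 and 10 -> 1,
    and discard the pairs 00 and 11. Assumes the raw bits are
    independent and identically biased; yields unbiased output bits."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:        # keep only discordant pairs
            out.append(a)
    return out
```

Because P(01) = P(10) for independent identically biased bits, the surviving bits are unbiased; the extraction efficiency is 2p(1 - p) output bits per raw pair, peaking at 25% for an already unbiased source.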
Entropy-Based Registration of Point Clouds Using Terrestrial Laser Scanning and Smartphone GPS.
Chen, Maolin; Wang, Siying; Wang, Mingwei; Wan, Youchuan; He, Peipei
2017-01-20
Automatic registration of terrestrial laser scanning point clouds is a crucial but unresolved topic that is of great interest in many domains. This study combines terrestrial laser scanner with a smartphone for the coarse registration of leveled point clouds with small roll and pitch angles and height differences, which is a novel sensor combination mode for terrestrial laser scanning. The approximate distance between two neighboring scan positions is firstly calculated with smartphone GPS coordinates. Then, 2D distribution entropy is used to measure the distribution coherence between the two scans and search for the optimal initial transformation parameters. To this end, we propose a method called Iterative Minimum Entropy (IME) to correct initial transformation parameters based on two criteria: the difference between the average and minimum entropy and the deviation from the minimum entropy to the expected entropy. Finally, the presented method is evaluated using two data sets that contain tens of millions of points from panoramic and non-panoramic, vegetation-dominated and building-dominated cases and can achieve high accuracy and efficiency.
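The 2D distribution-entropy criterion at the heart of the IME search can be illustrated with the Shannon entropy of an occupancy histogram over a planimetric grid. This is a toy sketch, not the authors' implementation; the intuition is that candidate transformations that align the two scans concentrate the merged point distribution and therefore lower this value:

```python
import numpy as np

def grid_entropy_2d(points, bins=32, extent=((0.0, 1.0), (0.0, 1.0))):
    """Shannon entropy (nats) of the 2-D occupancy histogram of a
    planar point set of shape (n, 2)."""
    h, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                             bins=bins, range=extent)
    p = h / h.sum()
    p = p[p > 0]                      # empty cells contribute nothing
    return float(-(p * np.log(p)).sum())
```

A point set concentrated in a single cell has zero entropy, while points spread over k cells with equal occupancy give ln(k); a registration search would minimize this over candidate transformation parameters.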
Achieving the classical Carnot efficiency in a strongly coupled quantum heat engine
NASA Astrophysics Data System (ADS)
Xu, Y. Y.; Chen, B.; Liu, J.
2018-02-01
Generally, the efficiency of a heat engine strongly coupled with a heat bath is less than the classical Carnot efficiency. Through a model-independent method, we show that the classical Carnot efficiency is achieved in a strongly coupled quantum heat engine. First, we present the first law of quantum thermodynamics in strong coupling. Then, we show how to achieve the Carnot cycle and the classical Carnot efficiency at strong coupling. We find that this classical Carnot efficiency stems from the fact that the heat released in a nonequilibrium process is balanced by the absorbed heat. We also analyze the restrictions in the achievement of the Carnot cycle. The first restriction is that there must be two corresponding intervals of the controllable parameter in which the corresponding entropies of the work substance at the hot and cold temperatures are equal, and the second is that the entropy of the initial and final states in a nonequilibrium process must be equal. Through these restrictions, we obtain the positive work conditions, including the usual one in which the hot temperature should be higher than the cold, and a new one in which there must be an entropy interval at the hot temperature overlapping that at the cold. We demonstrate our result through a paradigmatic model—a two-level system in which a work substance strongly interacts with a heat bath. In this model, we find that the efficiency may abruptly decrease to zero due to the first restriction, and that the second restriction results in the control scheme becoming complex.
Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R
2014-01-01
Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method
Roux, Benoît; Weare, Jonathan
2013-01-01
An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
NASA Astrophysics Data System (ADS)
Chen, Jing-Han; Us Saleheen, Ahmad; Adams, Philip W.; Young, David P.; Ali, Naushad; Stadler, Shane
2018-04-01
In this work, we discuss measurement protocols for the determination of the magnetic entropy change associated with first-order magneto-structural transitions from both magnetization and calorimetric experiments. The Cu-doped Ni2MnGa Heusler alloy with a first-order magneto-structural phase transition is used as a case study to illustrate how commonly-used magnetization measurement protocols result in spurious entropy evaluations. Two magnetization measurement protocols which allow for the accurate assessment of the magnetic entropy change across first-order magneto-structural transitions are presented. In addition, calorimetric measurements were performed to validate the results from the magnetization measurements. Self-consistent results between the magnetization and calorimetric measurements were obtained when the non-equilibrium thermodynamic state was carefully handled. Such methods could be applicable to other systems displaying giant magnetocaloric effects caused by first-order phase transitions with magnetic and thermal hysteresis.
Entropy generation across Earth's collisionless bow shock.
Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A
2012-02-10
Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulences.
Zhang, Xiaomeng; Shao, Bin; Wu, Yangle; Qi, Ouyang
2013-01-01
One of the major objectives in systems biology is to understand the relation between the topological structures and the dynamics of biological regulatory networks. In this context, various mathematical tools have been developed to deduce the structures of regulatory networks from microarray expression data. In general, the whole network structure cannot be deduced from a single data set; additional expression data are usually needed. Thus, how to design a microarray expression experiment so as to obtain the most information is a practical problem in systems biology. Here we propose three methods, namely the maximum distance method, the trajectory entropy method, and the sampling method, to derive the optimal initial conditions for experiments. The performance of these methods is tested and evaluated in three well-known regulatory networks (the budding yeast cell cycle, the fission yeast cell cycle, and the E. coli SOS network). Based on the evaluation, we propose an efficient strategy for the design of microarray expression experiments.
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Individual CPT samplings were modeled as rational probability density curves using maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within a Bayesian inverse interpolation framework. The results of Gaussian sequential stochastic simulation and the Bayesian method were compared. The differences between single CPT samplings under a normal distribution and the simulated probability density curves based on maximum entropy theory are also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. The characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.
The pressure and entropy of a unitary Fermi gas with particle-hole fluctuation
NASA Astrophysics Data System (ADS)
Gong, Hao; Ruan, Xiao-Xia; Zong, Hong-Shi
2018-01-01
We calculate the pressure and entropy of a unitary Fermi gas based on universal relations combined with our previous prediction of the energy, which was calculated within the framework of the non-self-consistent T-matrix approximation with particle-hole fluctuation. The resulting entropy and pressure are compared with the experimental data and with theoretical results without the induced interaction. For the entropy, we find good agreement between our results with particle-hole fluctuation and the experimental measurements reported by the ENS group and the MIT experiment. For the pressure, our results suffer from a systematic upshift compared to the MIT data.
Entropy production in a box: Analysis of instabilities in confined hydrothermal systems
NASA Astrophysics Data System (ADS)
Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.
2017-09-01
We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.
Investigation of FeNiCrWMn - a new high entropy alloy
NASA Astrophysics Data System (ADS)
Buluc, G.; Florea, I.; Bălţătescu, O.; Florea, R. M.; Carcea, I.
2015-11-01
The term high entropy alloys arose from the analysis of multicomponent alloys, which have been produced at an experimental level since 1995 through the development of a new concept for the design of metallic materials. Recent developments in the field of high-entropy alloys have revealed that they have versatile properties such as ductility, toughness, hardness and corrosion resistance [1]. So far, it has been demonstrated that the alloys explored are feasible to synthesize, process and analyze, contrary to misunderstandings based on traditional experience. Moreover, there are many opportunities in this field for academic studies and industrial applications [1, 2]. As the combinations of composition and process for producing high entropy alloys are numerous, and each high entropy alloy has its own microstructure and properties to be identified and understood, the research work is truly limitless. The novelty of these alloys lies in their chemical composition: they have been named high entropy alloys because their atomic-scale mixing entropies are higher than those of traditional alloys. In this paper, I present the microscopy and the mechanical properties of the high entropy alloy FeNiCrWMn.
Reconstructing quantum entropy production to probe irreversibility and correlations
NASA Astrophysics Data System (ADS)
Gherardini, Stefano; Müller, Matthias M.; Trombettoni, Andrea; Ruffo, Stefano; Caruso, Filippo
2018-07-01
One of the major goals of quantum thermodynamics is the characterization of irreversibility and its consequences in quantum processes. Here, we discuss how entropy production provides a quantification of the irreversibility in open quantum systems through the quantum fluctuation theorem. We start by introducing a two-time quantum measurement scheme, in which the dynamical evolution between the measurements is described by a completely positive, trace-preserving (CPTP) quantum map (forward process). By inverting the measurement scheme and applying the time-reversed version of the quantum map, we can study how this backward process differs from the forward one. When the CPTP map is unital, we show that the stochastic quantum entropy production is a function only of the probabilities of obtaining the initial measurement outcomes in the forward and backward processes. For bipartite open quantum systems we also prove that the mean value of the stochastic quantum entropy production is sub-additive with respect to the bipartition (except for product states). Hence, we find a method to detect correlations between the subsystems. Our main result is the proposal of an efficient protocol to determine and reconstruct the characteristic functions of the stochastic entropy production for each subsystem. This procedure also enables the reconstruction of other thermodynamic quantities, such as the work distribution of the composite system and the corresponding internal energy. Efficiency and possible extensions of the protocol are also discussed. Finally, we show how our findings might be experimentally tested by exploiting state-of-the-art trapped-ion platforms.
Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways
Galinsky, Vitaly L.; Frank, Lawrence R.
2015-01-01
We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution, for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes guided by the global structure of the entropy spectrum coupled with small-scale local diffusion. The intervoxel diffusion is sampled by multi-b-shell, multi-q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in the most scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167
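The eigenvector step can be sketched compactly. Assuming the coupling reduces to a symmetric adjacency matrix on a small illustrative graph (a drastic simplification of the multi-modal diffusion coupling in the paper), the maximum-entropy transition probabilities follow from the dominant eigenpair:

```python
import numpy as np

# Minimal sketch of the eigenvector problem behind entropy spectrum pathways:
# the maximum-entropy transition probabilities on a coupling (adjacency)
# matrix A follow from its dominant eigenpair (lam, psi):
#     P[i, j] = A[i, j] * psi[j] / (lam * psi[i])
# The 5-node graph below is illustrative only.

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)

eigvals, eigvecs = np.linalg.eigh(A)    # ascending eigenvalues
lam = eigvals[-1]                       # dominant (Perron) eigenvalue
psi = np.abs(eigvecs[:, -1])            # dominant eigenvector, made positive

P = A * psi[None, :] / (lam * psi[:, None])   # max-entropy transition matrix
# Each row of P is a probability distribution over the node's neighbours.
```

Since A·psi = lam·psi, each row of `P` sums to exactly one; this is the probability distribution whose trajectories maximize entropy subject to the graph structure.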
Two-phase thermodynamic model for computing entropies of liquids reanalyzed
NASA Astrophysics Data System (ADS)
Sun, Tao; Xian, Jiawei; Zhang, Huai; Zhang, Zhigang; Zhang, Yigang
2017-11-01
The two-phase thermodynamic (2PT) model [S.-T. Lin et al., J. Chem. Phys. 119, 11792-11805 (2003)] provides a promising paradigm to efficiently determine the ionic entropies of liquids from molecular dynamics. In this model, the vibrational density of states (VDoS) of a liquid is decomposed into a diffusive gas-like component and a vibrational solid-like component. By treating the diffusive component as hard sphere (HS) gas and the vibrational component as harmonic oscillators, the ionic entropy of the liquid is determined. Here we examine three issues crucial for practical implementations of the 2PT model: (i) the mismatch between the VDoS of the liquid system and that of the HS gas; (ii) the excess entropy of the HS gas; (iii) the partition of the gas-like and solid-like components. Some of these issues have not been addressed before, yet they profoundly change the entropy predicted from the model. Based on these findings, a revised 2PT formalism is proposed and successfully tested in systems with Lennard-Jones potentials as well as many-atom potentials of liquid metals. Aside from being capable of performing quick entropy estimations for a wide range of systems, the formalism also supports fine-tuning to accurately determine entropies at specific thermal states.
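A minimal sketch of the solid-like half of the decomposition, assuming a Debye-like density of states and an arbitrary gas fraction `f` (both placeholders, not fitted quantities from the paper):

```python
import numpy as np

# 2PT-style sketch: the vibrational density of states g(nu) is weighted by
# the harmonic-oscillator entropy function and integrated to give the
# solid-like entropy.  The Debye-like g(nu) and the gas-like fraction f
# below are placeholders, not values from the paper.

kB = 1.380649e-23          # J/K
h_planck = 6.62607015e-34  # J s

def ho_entropy_weight(nu, T):
    """Harmonic-oscillator entropy per mode, in units of kB."""
    x = h_planck * nu / (kB * T)
    return x / np.expm1(x) - np.log1p(-np.exp(-x))

nu = np.linspace(1e10, 1e13, 4000)        # frequency grid (Hz)
dnu = nu[1] - nu[0]
g = 3.0 * nu**2 / np.sum(nu**2 * dnu)     # Debye-like DoS, 3 modes per atom
f = 0.2                                   # assumed diffusive (gas-like) fraction

def solid_entropy(T):
    """Entropy (J/K per atom) of the solid-like component at temperature T."""
    return kB * np.sum((1 - f) * g * ho_entropy_weight(nu, T) * dnu)

S_300 = solid_entropy(300.0)
# The remaining fraction f of g would be assigned hard-sphere gas entropy,
# which is where the issues (i)-(iii) above enter.
```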
An entropy-based statistic for genomewide association studies.
Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao
2005-07-01
Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard chi2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard chi2 statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard chi2 statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard chi2 statistic.
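For intuition, the contrast between a linear chi-square form and a nonlinear, log-based form can be sketched on a single biallelic marker. The `entropy_stat` below is the generic likelihood-ratio (relative-entropy) statistic, chosen purely to illustrate the entropy idea; it is not necessarily the authors' exact definition, and the allele counts are made up:

```python
import numpy as np

# Contrast between the standard chi-square statistic (linear in frequencies)
# and an entropy-style statistic (nonlinear, log-based) on case/control
# allele counts.  Illustrative only.

def chi2_stat(case, ctrl):
    case, ctrl = np.asarray(case, float), np.asarray(ctrl, float)
    tot = case + ctrl
    exp_case = tot * case.sum() / tot.sum()
    exp_ctrl = tot * ctrl.sum() / tot.sum()
    return np.sum((case - exp_case)**2 / exp_case
                  + (ctrl - exp_ctrl)**2 / exp_ctrl)

def entropy_stat(case, ctrl):
    # 2 * sum obs * log(obs / exp): the G-statistic, a relative-entropy
    # (Kullback-Leibler) form of the same test.
    case, ctrl = np.asarray(case, float), np.asarray(ctrl, float)
    tot = case + ctrl
    exp_case = tot * case.sum() / tot.sum()
    exp_ctrl = tot * ctrl.sum() / tot.sum()
    return 2 * (np.sum(case * np.log(case / exp_case))
                + np.sum(ctrl * np.log(ctrl / exp_ctrl)))

# Allele counts (A, a) in cases and controls -- made-up numbers.
case_counts = [180, 120]
ctrl_counts = [150, 150]
```

Under the null the two statistics agree asymptotically; the entropy form amplifies frequency differences through the logarithm, which is the mechanism the abstract credits for the power gain at many loci.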
Entropy production and optimization of geothermal power plants
NASA Astrophysics Data System (ADS)
Michaelides, Efstathios E.
2012-09-01
Geothermal power plants are currently producing reliable and low-cost, base load electricity. Three basic types of geothermal power plants are currently in operation: single-flashing, dual-flashing, and binary power plants. Typically, the single-flashing and dual-flashing geothermal power plants utilize geothermal water (brine) at temperatures in the range of 550-430 K. Binary units utilize geothermal resources at lower temperatures, typically 450-380 K. The entropy production in the various components of the three types of geothermal power plants determines the efficiency of the plants. It is axiomatic that a lower entropy production would improve significantly the energy utilization factor of the corresponding power plant. For this reason, the entropy production in the major components of the three types of geothermal power plants has been calculated. It was observed that binary power plants generate the lowest amount of entropy and, thus, convert the highest rate of geothermal energy into mechanical energy. The single-flashing units generate the highest amount of entropy, primarily because they re-inject fluid at relatively high temperature. The calculations for entropy production provide information on the equipment where the highest irreversibilities occur, and may be used to optimize the design of geothermal processes in future geothermal power plants and thermal cycles used for the harnessing of geothermal energy.
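The re-injection argument can be illustrated with a textbook entropy balance: a rejected liquid stream at temperature T_rej equilibrating with surroundings at T0 generates S_gen = m·c·[ln(T0/T_rej) + (T_rej - T0)/T0], which is positive whenever T_rej differs from T0. The flow rate and temperatures below are assumed, not plant data from the paper:

```python
import math

# Back-of-envelope entropy production from re-injecting brine at T_rej into
# surroundings at T0.  Values are illustrative, not plant data.

def reinjection_entropy_rate(m_dot, c_p, T_rej, T0=298.15):
    """Entropy generation rate (W/K) as the rejected stream equilibrates
    with the environment at T0; positive for any T_rej != T0."""
    return m_dot * c_p * (math.log(T0 / T_rej) + (T_rej - T0) / T0)

c_p = 4186.0          # J/(kg K), liquid water
m_dot = 100.0         # kg/s (assumed)

s_single_flash = reinjection_entropy_rate(m_dot, c_p, T_rej=400.0)  # hot re-injection
s_binary = reinjection_entropy_rate(m_dot, c_p, T_rej=320.0)        # cooler re-injection
# Hotter re-injection -> larger entropy production, as the abstract notes
# for single-flashing units.
```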
Optimal protocols for slowly driven quantum systems.
Zulkowski, Patrick R; DeWeese, Michael R
2015-09-01
The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.
Subband Image Coding with Jointly Optimized Quantizers
NASA Technical Reports Server (NTRS)
Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.
1995-01-01
An iterative design algorithm for the joint design of complexity- and entropy-constrained subband quantizers and associated entropy coders is proposed. Unlike conventional subband design algorithms, the proposed algorithm does not require the use of various bit allocation algorithms. Multistage residual quantizers are employed here because they provide greater control of the complexity-performance tradeoffs, and also because they allow efficient and effective high-order statistical modeling. The resulting subband coder exploits statistical dependencies within subbands, across subbands, and across stages, mainly through complexity-constrained high-order entropy coding. Experimental results demonstrate that the complexity-rate-distortion performance of the new subband coder is exceptional.
Sandford, M.T. II; Handel, T.G.; Bradley, J.N.
1998-03-10
A method of embedding auxiliary information into the digital representation of host data created by a lossy compression technique is disclosed. The method applies to data compressed with lossy algorithms based on series expansion, quantization to a finite number of symbols, and entropy coding. Lossy compression methods represent the original data as integer indices having redundancy and uncertainty in value by one unit. Indices which are adjacent in value are manipulated to encode auxiliary data. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user. Lossy compression methods use lossless compression, known also as entropy coding, to reduce the intermediate representation as indices to the final size. The efficiency of the entropy coding is increased by manipulating the indices at the intermediate stage in the manner taught by the method. 11 figs.
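The one-unit index manipulation can be illustrated with a toy parity scheme: a bit is embedded by nudging an index to the adjacent value when its parity disagrees with the bit. This is a sketch of the principle only, not the patented procedure:

```python
# Toy illustration of hiding auxiliary bits in quantization indices by
# nudging an index to the adjacent value (+/-1, within its one-unit
# uncertainty) so that its parity encodes the bit.  Not the patented method.

def embed(indices, bits):
    """Return a copy of `indices` whose parities carry `bits`."""
    out = list(indices)
    for i, bit in enumerate(bits):
        if out[i] % 2 != bit:
            out[i] += 1 if out[i] % 2 == 0 else -1   # move to adjacent index
    return out

def extract(indices, nbits):
    """Recover the embedded bits from index parities."""
    return [indices[i] % 2 for i in range(nbits)]

quantized = [12, 7, 3, 44, 9, 28]     # indices from some lossy coder (assumed)
payload = [1, 1, 0, 0, 1, 0]
stego = embed(quantized, payload)
# extract(stego, 6) -> [1, 1, 0, 0, 1, 0]
```

Each index moves by at most one unit, which is why the perturbation hides inside the quantizer's existing uncertainty.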
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Maoyuan; Besford, Quinn Alexander; Mulvaney, Thomas
The entropy of hydrophobic solvation has been explained as the result of ordered solvation structures, of hydrogen bonds, of the small size of the water molecule, of dispersion forces, and of solvent density fluctuations. We report a new approach to the calculation of the entropy of hydrophobic solvation, along with tests of and comparisons to several other methods. The methods are assessed in the light of the available thermodynamic and spectroscopic information on the effects of temperature on hydrophobic solvation. Five model hydrophobes in SPC/E water give benchmark solvation entropies via Widom's test-particle insertion method, and other methods and models are tested against these particle-insertion results. Entropies associated with distributions of tetrahedral order, of electric field, and of solvent dipole orientations are examined. We find these contributions are small compared to the benchmark particle-insertion entropy. Competitive with or better than other theories in accuracy, but with no free parameters, is the new estimate of the entropy contributed by correlations between dipole moments. Dipole correlations account for most of the hydrophobic solvation entropy for all models studied and capture the distinctive temperature dependence seen in thermodynamic and spectroscopic experiments. Entropies based on pair and many-body correlations in number density approach the correct magnitudes but fail to describe temperature and size dependences, respectively. Hydrogen-bond definitions and free energies that best reproduce entropies from simulations are reported, but it is difficult to choose one hydrogen bond model that fits a variety of experiments. The use of information theory, scaled-particle theory, and related methods is discussed briefly.
Our results provide a test of the Frank-Evans hypothesis that the negative solvation entropy is due to structured water near the solute, complement the spectroscopic detection of that solvation structure by identifying the structural feature responsible for the entropy change, and point to a possible explanation for the observed dependence on length scale. Our key results are that the hydrophobic effect, i.e. the signature, temperature-dependent, solvation entropy of nonpolar molecules in water, is largely due to a dispersion force arising from correlations between rotating permanent dipole moments, that the strength of this force depends on the Kirkwood g-factor, and that the strength of this force may be obtained exactly without simulation.
NASA Astrophysics Data System (ADS)
Keum, Jongho; Coulibaly, Paulin
2017-07-01
Adequate and accurate hydrologic information from optimal hydrometric networks is an essential part of effective water resources management. Although the key hydrologic processes in the water cycle are interconnected, hydrometric networks (e.g., streamflow, precipitation, groundwater level) have been routinely designed individually. A decision support framework is proposed for integrated design of multivariable hydrometric networks. The proposed method is applied to design optimal precipitation and streamflow networks simultaneously. The epsilon-dominance hierarchical Bayesian optimization algorithm was combined with Shannon entropy of information theory to design and evaluate hydrometric networks. Specifically, the joint entropy from the combined networks was maximized to provide the most information, and the total correlation was minimized to reduce redundant information. To further optimize the efficiency between the networks, they were designed by maximizing the conditional entropy of the streamflow network given the information of the precipitation network. Compared to the traditional individual variable design approach, the integrated multivariable design method was able to determine more efficient optimal networks by avoiding the redundant stations. Additionally, four quantization cases were compared to evaluate their effects on the entropy calculations and the determination of the optimal networks. The evaluation results indicate that the quantization methods should be selected after careful consideration for each design problem since the station rankings and the optimal networks can change accordingly.
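The two entropy objectives are straightforward to compute from quantized records. The sketch below uses a synthetic three-station record (illustrative only) to evaluate joint entropy and total correlation, the maximize/minimize pair described above:

```python
import numpy as np

# Entropy objectives for network design: joint entropy (information content)
# and total correlation (redundancy) of quantized station records.  The
# synthetic 3-station record below is illustrative only.

def entropy(*cols):
    """Shannon entropy (bits) of the joint distribution of quantized columns."""
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def total_correlation(cols):
    """Sum of marginal entropies minus joint entropy (redundancy measure)."""
    return sum(entropy(c) for c in cols) - entropy(*cols)

rng = np.random.default_rng(0)
s1 = rng.integers(0, 4, 500)                # quantized precipitation/flow classes
s2 = (s1 + rng.integers(0, 2, 500)) % 4     # strongly correlated with s1
s3 = rng.integers(0, 4, 500)                # nearly independent station

H_all = entropy(s1, s2, s3)
C_all = total_correlation([s1, s2, s3])
# A design step would keep the station subset maximizing H while minimizing C;
# here s2 contributes mostly redundant information relative to s1.
```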
Research on interpolation methods in medical image processing.
Pan, Mei-Sen; Yang, Xiao-Li; Tang, Jing-Tian
2012-04-01
Image interpolation is widely used in the field of medical image processing. In this paper, interpolation methods are divided into three groups: filter interpolation, ordinary interpolation, and general partial volume interpolation. Some commonly used filter methods for image interpolation are introduced, but their interpolation effects need to be further improved. When analyzing and discussing ordinary interpolation, many asymmetrical kernel interpolation methods are proposed. Compared with symmetrical kernel ones, the former have some advantages. After analyzing the partial volume and generalized partial volume estimation interpolations, the new concept and constraint conditions of the general partial volume interpolation are defined, and several new partial volume interpolation functions are derived. By performing experiments of image scaling, rotation, and self-registration, the interpolation methods mentioned in this paper are compared in terms of entropy, peak signal-to-noise ratio, cross entropy, normalized cross-correlation coefficient, and running time. Among the filter interpolation methods, the median and B-spline filter interpolations have relatively better interpolating performance. Among the ordinary interpolation methods, on the whole, the symmetrical cubic kernel interpolations demonstrate a strong advantage, especially the symmetrical cubic B-spline interpolation; however, they are very time-consuming and have lower time efficiency. As for the general partial volume interpolation methods, in terms of the total error of image self-registration, the symmetrical interpolations provide certain superiority, but considering processing efficiency, the asymmetrical interpolations are better.
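Two of the comparison metrics named above are simple to reproduce. A sketch of Shannon entropy of the histogram and PSNR for 8-bit images, evaluated on synthetic arrays:

```python
import numpy as np

# Two of the metrics used to compare interpolation methods: Shannon entropy
# of the intensity histogram and peak signal-to-noise ratio, for 8-bit
# images.  The test images are synthetic.

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit image histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist[hist > 0] / img.size
    return float(-np.sum(p * np.log2(p)))

def psnr(ref, test):
    """Peak signal-to-noise ratio (dB) between two 8-bit images."""
    mse = np.mean((ref.astype(float) - test.astype(float))**2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0**2 / mse)

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
# A resampled image from each interpolation method would replace `noisy`.
```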
NASA Astrophysics Data System (ADS)
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, and then three-dimensional perceptual maps of the results are provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences in stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
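Once a dissimilarity matrix is built, whether from cross-sample entropy or anything else, the classical MDS step is a double centering followed by an eigendecomposition. The sketch below feeds it an ordinary Euclidean distance matrix of known planar points so that the embedding can be verified exactly:

```python
import numpy as np

# Classical MDS: double-center the squared dissimilarities, then embed via
# the top eigenvectors.  D here is a Euclidean distance matrix of known
# planar points, so the recovered coordinates can be checked; in the paper
# D would instead come from cross-sample entropy.

def classical_mds(D, dim=3):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D**2) @ J                 # double centering
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]           # largest eigenvalues first
    L = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * L                      # embedded coordinates

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
X = classical_mds(D, dim=2)
# Pairwise distances of X reproduce D up to rotation/reflection.
```

Entropy-based dissimilarities are generally not exact Euclidean distances, which is why the `np.clip` guard against small negative eigenvalues matters in practice.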
Carroll, Robert; Lee, Chi; Tsai, Che-Wei; ...
2015-11-23
High-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly equal proportion. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining), and observe differences compared to conventional alloys with fewer elements. For a specific range of temperatures and strain-rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain-rate, and material composition. The ratio of the weak spots' healing rate to the strain-rate is the main tuning parameter, reminiscent of the Portevin-LeChatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely applicable deformation mechanism is useful for deformation control and alloy design.
An instructive model of entropy
NASA Astrophysics Data System (ADS)
Zimmerman, Seth
2010-09-01
This article first notes the misinterpretation of a common thought experiment, and the misleading comment that 'systems tend to flow from less probable to more probable macrostates'. It analyses the experiment, generalizes it and introduces a new tool of investigation, the simplectic structure. A time-symmetric model is built upon this structure, yielding several non-intuitive results. The approach is combinatorial rather than statistical, and assumes that entropy is equivalent to 'missing information'. The intention of this article is not only to present interesting results, but also, by deliberately starting with a simple example and developing it through proof and computer simulation, to clarify the often confusing subject of entropy. The article should be particularly stimulating to students and instructors of discrete mathematics or undergraduate physics.
Entropy favours open colloidal lattices
NASA Astrophysics Data System (ADS)
Mao, Xiaoming; Chen, Qian; Granick, Steve
2013-03-01
Burgeoning experimental and simulation activity seeks to understand the existence of self-assembled colloidal structures that are not close-packed. Here we describe an analytical theory based on lattice dynamics and supported by experiments that reveals the fundamental role entropy can play in stabilizing open lattices. The entropy we consider is associated with the rotational and vibrational modes unique to colloids interacting through extended attractive patches. The theory makes predictions of the implied temperature, pressure and patch-size dependence of the phase diagram of open and close-packed structures. More generally, it provides guidance for the conditions at which targeted patchy colloidal assemblies in two and three dimensions are stable, thus overcoming the difficulty in exploring by experiment or simulation the full range of conceivable parameters.
Ye, Qing; Pan, Hao; Liu, Changhua
2015-01-01
This research proposes a novel framework of final drive simultaneous failure diagnosis containing feature extraction, training paired diagnostic models, generating decision thresholds, and recognizing simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Single-failure samples are used to construct probability classifiers based on the paired sparse Bayesian extreme learning machine, which is trained only on single failure modes and inherits the high generalization and sparsity of the sparse Bayesian learning approach. To generate the optimal decision threshold, which converts the probability outputs of the classifiers into final simultaneous failure modes, this research proposes using samples containing both single and simultaneous failure modes together with the grid search method, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to the existing approaches. PMID:25722717
Efficient Ab initio Modeling of Random Multicomponent Alloys
Jiang, Chao; Uberuaga, Blas P.
2016-03-08
Here we present a novel small set of ordered structures (SSOS) method that allows extremely efficient ab initio modeling of random multi-component alloys. Using inverse II-III spinel oxides and equiatomic quinary bcc (so-called high entropy) alloys as examples, we demonstrate that a SSOS can achieve the same accuracy as a large supercell or a well-converged cluster expansion, but with significantly reduced computational cost. In particular, because of this efficiency, a large number of quinary alloy compositions can be quickly screened, leading to the identification of several new possible high entropy alloy chemistries. Furthermore, the SSOS method developed here can be broadly useful for the rapid computational design of multi-component materials, especially those with a large number of alloying elements, a challenging problem for other approaches.
Batteries for efficient energy extraction from a water salinity difference.
La Mantia, Fabio; Pasta, Mauro; Deshazer, Heather D; Logan, Bruce E; Cui, Yi
2011-04-13
The salinity difference between seawater and river water is a renewable source of enormous entropic energy, but extracting it efficiently as a form of useful energy remains a challenge. Here we demonstrate a device called "mixing entropy battery", which can extract and store it as useful electrochemical energy. The battery, containing a Na(2-x)Mn(5)O(10) nanorod electrode, was shown to extract energy from real seawater and river water and can be applied to a variety of salt waters. We demonstrated energy extraction efficiencies of up to 74%. Considering the flow rate of river water into oceans as the limiting factor, the renewable energy production could potentially reach 2 TW, or ∼13% of the current world energy consumption. The mixing entropy battery is simple to fabricate and could contribute significantly to renewable energy in the future.
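The size of the entropic resource can be estimated from ideal-solution mixing thermodynamics. The sketch below computes the free energy released when assumed river and seawater streams mix; concentrations are typical values and the ideal-solution treatment is a simplification, not the paper's analysis:

```python
import math

# Ideal-solution estimate of the mixing free energy released when river
# water meets seawater -- the entropic energy the mixing entropy battery
# taps.  Concentrations are typical assumed values.

R, T = 8.314, 298.15                     # J/(mol K), K

def gibbs_ideal(n_water, n_salt):
    """Ideal mixing term sum n_i*R*T*ln(x_i) for a two-component solution."""
    n = n_water + n_salt
    g = 0.0
    for ni in (n_water, n_salt):
        g += ni * R * T * math.log(ni / n)
    return g

# 1 m3 of river water (~0.01 M salt) mixed with 1 m3 of seawater (~0.5 M);
# roughly 55400 mol of water per m3.
g_river = gibbs_ideal(55400.0, 10.0)
g_sea = gibbs_ideal(55400.0, 500.0)
g_mixed = gibbs_ideal(2 * 55400.0, 510.0)

released = (g_river + g_sea) - g_mixed   # > 0: energy available from mixing
```

The positive sign of `released` is guaranteed by the concavity of the mixing entropy: combining solutions of unequal concentration always lowers the total Gibbs energy.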
NASA Astrophysics Data System (ADS)
Kim, Y.; Hwang, T.; Vose, J. M.; Martin, K. L.; Band, L. E.
2016-12-01
Obtaining quality hydrologic observations is the first step towards a successful water resources management. While remote sensing techniques have enabled to convert satellite images of the Earth's surface to hydrologic data, the importance of ground-based observations has never been diminished because in-situ data are often highly accurate and can be used to validate remote measurements. The existence of efficient hydrometric networks is becoming more important to obtain as much as information with minimum redundancy. The World Meteorological Organization (WMO) has recommended a guideline for the minimum hydrometric network density based on physiography; however, this guideline is not for the optimum network design but for avoiding serious deficiency from a network. Moreover, all hydrologic variables are interconnected within the hydrologic cycle, while monitoring networks have been designed individually. This study proposes an integrated network design method using entropy theory with a multiobjective optimization approach. In specific, a precipitation and a streamflow networks in a semi-urban watershed in Ontario, Canada were designed simultaneously by maximizing joint entropy, minimizing total correlation, and maximizing conditional entropy of streamflow network given precipitation network. After comparing with the typical individual network designs, the proposed design method would be able to determine more efficient optimal networks by avoiding the redundant stations, in which hydrologic information is transferable. Additionally, four quantization cases were applied in entropy calculations to assess their implications on the station rankings and the optimal networks. The results showed that the selection of quantization method should be considered carefully because the rankings and optimal networks are subject to change accordingly.
Radiation Entropy and Near-Field Thermophotovoltaics
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin M.
2008-08-01
Radiation entropy was key to the original derivation of Planck's law of blackbody radiation, in 1900. This discovery opened the door to quantum mechanical theory and Planck was awarded the Nobel Prize in Physics in 1918. Thermal radiation plays an important role in incandescent lamps, solar energy utilization, temperature measurements, materials processing, remote sensing for astronomy and space exploration, combustion and furnace design, food processing, cryogenic engineering, as well as numerous agricultural, health, and military applications. While Planck's law has been fruitfully applied to a large number of engineering problems for over 100 years, questions have been raised about its limitation in micro/nano systems, especially at subwavelength distances or in the near field. When two objects are located closer than the characteristic wavelength, wave interference and photon tunneling occurs that can result in significant enhancement of the radiative transfer. Recent studies have shown that the near-field effects can realize emerging technologies, such as superlens, sub-wavelength light source, polariton-assisted nanolithography, thermophotovoltaic (TPV) systems, scanning tunneling thermal microscopy, etc. The concept of entropy has also been applied to explain laser cooling of solids as well as the second law efficiency of devices that utilize thermal radiation to produce electricity. However, little is known as regards the nature of entropy in near-field radiation. Some history and recent advances are reviewed in this presentation with a call for research of radiation entropy in the near field, due to the important applications in the optimization of thermophotovoltaic converters and in the design of practical systems that can harvest photon energies efficiently.
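Radiation entropy has a concrete computable form: a mode with Bose-Einstein occupation n carries entropy k[(1+n)ln(1+n) - n ln n], and integrating over the Planck spectrum must reproduce the photon-gas relation S = 4U/(3T). A numerical check, with the temperature chosen arbitrarily:

```python
import numpy as np

# Numerical check of the photon-gas relation S = 4U/(3T): integrate Planck's
# spectral energy and the per-mode entropy s = k[(1+n)ln(1+n) - n ln n]
# over the vacuum mode density.

kB = 1.380649e-23   # J/K
h = 6.62607015e-34  # J s
c = 2.99792458e8    # m/s
T = 500.0           # K, arbitrary

nu = np.linspace(1e11, 2e14, 200000)          # Hz, spans the thermal band at 500 K
dnu = nu[1] - nu[0]
modes = 8 * np.pi * nu**2 / c**3              # mode density per unit volume
n = 1.0 / np.expm1(h * nu / (kB * T))         # Planck occupation number

U = np.sum(modes * n * h * nu) * dnu          # energy density (J/m^3)
S = kB * np.sum(modes * ((1 + n) * np.log1p(n) - n * np.log(n))) * dnu
# S agrees with 4*U/(3*T), and U with the Stefan-Boltzmann a*T^4.
```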
Contour entropy: a new determinant of perceiving ground or a hole.
Gillam, Barbara J; Grove, Philip M
2011-06-01
Figure-ground perception is typically described as seeing one surface occluding another. Figure properties, not ground properties, are considered the significant factors. In scenes, however, a near surface will often occlude multiple contours and surfaces, often at different depths, producing alignments that are improbable except under conditions of occlusion. We thus hypothesized that unrelated (high entropy) lines would tend to appear as ground in a figure-ground paradigm more often than similarly aligned ordered (low entropy) lines. We further hypothesized that for lines spanning a closed area, high line entropy should increase the hole-like appearance of that area. These predictions were confirmed in three experiments. The probability that patterned rectangles were seen as ground when alternated with blank rectangles increased with pattern entropy. A single rectangular shape appeared more hole-like when the entropy of the enclosed contours increased. Furthermore, these same contours, with the outline shape removed, gave rise to bounding illusory contours whose strength increased with contour entropy. We conclude that figure-ground and hole perception can be determined by properties of ground in the absence of any figural shape, or surround, factors.
de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie
2011-12-14
We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics
Application of an Entropic Approach to Assessing Systems Integration
2012-03-01
two econometric measures of information efficiency – Shannon entropy and Hurst exponent. Shannon entropy (which is explained in Chapter III) can be...applied to evaluate long-term correlation of time series, while Hurst exponent can be applied to classify the time series in accordance with the existence...of trend. Hurst exponent is the statistical measure of time series long-range dependence, and its value falls in the interval [0, 1] – a value in
A Simple Statistical Thermodynamics Experiment
ERIC Educational Resources Information Center
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
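The multiplicity counting behind this dice activity is easy to reproduce. The sketch below is our own illustration, not the article's materials; it enumerates microstates (ordered rolls) per macrostate (sum) and treats ln W as a Boltzmann-style entropy with k = 1:

```python
from itertools import product
from collections import Counter
from math import log

def multiplicities(n_dice, sides=6):
    """Count microstates (ordered rolls) for each macrostate (sum)."""
    return Counter(sum(roll) for roll in product(range(1, sides + 1), repeat=n_dice))

two = multiplicities(2)
three = multiplicities(3)

# The most probable macrostate is the one with the largest multiplicity.
assert two[7] == 6                    # sum 7 occurs 6 ways out of 36
assert three[10] == 27 and three[11] == 27  # 10 and 11 tie for three dice

# Entropy S = ln W (k = 1) grows with multiplicity, so the "disordered"
# middle sums carry the highest entropy.
S = {total: log(W) for total, W in two.items()}
assert S[7] > S[2]  # sum 7 (W = 6) has higher entropy than sum 2 (W = 1)
```

The asserts make the point of the activity concrete: high-entropy macrostates are simply those realized by more microstates.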
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozhdestvensky, Yu V
The possibility of obtaining intense cold atomic beams by using the Renyi entropy to optimise the laser cooling process is studied. It is shown that, in the case of a Gaussian velocity distribution of atoms, the Renyi entropy coincides with the density of particles in phase space. The optimisation procedure for cooling atoms by resonance optical radiation is described, which is based on the thermodynamic law of the increase of the Renyi entropy in time. Our method is compared with known methods for increasing the laser cooling efficiency, such as tuning the laser frequency in time and changing the atomic transition frequency in the inhomogeneous transverse field of a magnetic solenoid. (laser cooling)
NASA Astrophysics Data System (ADS)
Cheng, Qing; Yang, Xiaofeng; Shen, Jie
2017-07-01
In this paper, we consider numerical approximations of a hydrodynamically coupled phase field diblock copolymer model, in which the free energy contains a kinetic potential, a gradient entropy, a Ginzburg-Landau double well potential, and a long range nonlocal type potential. We develop a set of second order time marching schemes for this system using the "Invariant Energy Quadratization" approach for the double well potential, the projection method for the Navier-Stokes equation, and a subtle implicit-explicit treatment for the stress and convective terms. The resulting schemes are linear and lead to symmetric positive definite systems at each time step, so they can be solved efficiently. We further prove that these schemes are unconditionally energy stable. Various numerical experiments are performed to validate the accuracy and energy stability of the proposed schemes.
DNS of Flow in a Low-Pressure Turbine Cascade Using a Discontinuous-Galerkin Spectral-Element Method
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo Tibor; Murman, Scott; Madavan, Nateri
2015-01-01
A new computational capability under development for accurate and efficient high-fidelity direct numerical simulation (DNS) and large eddy simulation (LES) of turbomachinery is described. This capability is based on an entropy-stable Discontinuous-Galerkin spectral-element approach that extends to arbitrarily high orders of spatial and temporal accuracy and is implemented in a computationally efficient manner on a modern high performance computer architecture. A validation study using this method to perform DNS of flow in a low-pressure turbine airfoil cascade is presented. Preliminary results indicate that the method captures the main features of the flow. Discrepancies between the predicted results and the experiments are likely due to the effects of freestream turbulence not being included in the simulation and will be addressed in the final paper.
Design of new face-centered cubic high entropy alloys by thermodynamic calculation
NASA Astrophysics Data System (ADS)
Choi, Won-Mi; Jung, Seungmun; Jo, Yong Hee; Lee, Sunghak; Lee, Byeong-Joo
2017-09-01
A new face-centered cubic (fcc) high entropy alloy system with non-equiatomic compositions has been designed by utilizing a CALculation of PHAse Diagram (CALPHAD)-type thermodynamic calculation technique. The new alloy system is based on the representative fcc high entropy alloy, the Cantor alloy, an equiatomic Co-Cr-Fe-Mn-Ni five-component alloy, but fully or partly replaces the cobalt with vanadium at non-equiatomic compositions. Alloy compositions expected to have an fcc single-phase structure between 700 °C and the melting temperature are proposed. All the proposed alloys are experimentally confirmed, through X-ray diffraction analysis, to retain the fcc single phase during materials processing (> 800 °C). It is shown that there is a greater chance of finding fcc single-phase high entropy alloys if attention is paid to non-equiatomic composition regions, and that CALPHAD thermodynamic calculation can be an efficient tool for this. An alloy design technique based on thermodynamic calculation is thus demonstrated, and the applicability and limitations of the approach as a design tool for high entropy alloys are discussed.
Entropy-stable summation-by-parts discretization of the Euler equations on general curved elements
NASA Astrophysics Data System (ADS)
Crean, Jared; Hicken, Jason E.; Del Rey Fernández, David C.; Zingg, David W.; Carpenter, Mark H.
2018-03-01
We present and analyze an entropy-stable semi-discretization of the Euler equations based on high-order summation-by-parts (SBP) operators. In particular, we consider general multidimensional SBP elements, building on and generalizing previous work with tensor-product discretizations. In the absence of dissipation, we prove that the semi-discrete scheme conserves entropy; significantly, this proof of nonlinear L2 stability does not rely on integral exactness. Furthermore, interior penalties can be incorporated into the discretization to ensure that the total (mathematical) entropy decreases monotonically, producing an entropy-stable scheme. SBP discretizations with curved elements remain accurate, conservative, and entropy stable provided the mapping Jacobian satisfies the discrete metric invariants; polynomial mappings at most one degree higher than the SBP operators automatically satisfy the metric invariants in two dimensions. In three dimensions, we describe an elementwise optimization that leads to suitable Jacobians in the case of polynomial mappings. The properties of the semi-discrete scheme are verified and investigated using numerical experiments.
Polymorphism in a high-entropy alloy
Zhang, Fei; Wu, Yuan; Lou, Hongbo; ...
2017-06-01
Polymorphism, which describes the occurrence of different lattice structures in a crystalline material, is a critical phenomenon in materials science and condensed matter physics. Recently, configuration disorder was compositionally engineered into single lattices, leading to the discovery of high-entropy alloys and high-entropy oxides. For these novel entropy-stabilized forms of crystalline matter with extremely high structural stability, is polymorphism still possible? Here by employing in situ high-pressure synchrotron radiation X-ray diffraction, we reveal a polymorphic transition from face-centred-cubic (fcc) structure to hexagonal-close-packing (hcp) structure in the prototype CoCrFeMnNi high-entropy alloy. The transition is irreversible, and our in situ high-temperature synchrotron radiation X-ray diffraction experiments at different pressures of the retained hcp high-entropy alloy reveal that the fcc phase is a stable polymorph at high temperatures, while the hcp structure is more thermodynamically favourable at lower temperatures. Lastly, as pressure is increased, the critical temperature for the hcp-to-fcc transformation also rises.
The Macro and Micro of it Is that Entropy Is the Spread of Energy
NASA Astrophysics Data System (ADS)
Phillips, Jeffrey A.
2016-09-01
While entropy is often described as "disorder," it is better thought of as a measure of how spread out energy is within a system. To illustrate this interpretation of entropy to introductory college or high school students, several activities have been created. Students first study the relationship between microstates and macrostates to better understand the probabilities involved. Then, each student observes how a system evolves as energy is allowed to move within it. By studying how the class's ensemble of systems evolves, the tendency of energy to spread, rather than concentrate, can be observed. All activities require minimal equipment and provide students with a tactile and visual experience with entropy.
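The class-ensemble observation described here can be mimicked with a minimal toy simulation (our own sketch, only loosely inspired by the article's activities; the hopping rule and system sizes below are arbitrary choices): start with all energy quanta on one site, let quanta hop randomly, and watch the Shannon entropy of the energy distribution rise as the energy spreads.

```python
import random
from math import log

random.seed(1)

def shannon(counts):
    """Shannon entropy (nats) of a normalized count distribution."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in ps)

# Start with all energy concentrated on one site, then let quanta hop randomly.
n_sites, n_quanta = 50, 200
energy = [0] * n_sites
energy[0] = n_quanta
s_start = shannon(energy)  # zero: energy is fully concentrated

for _ in range(20000):
    i = random.randrange(n_sites)
    if energy[i] > 0:  # pick a random site; if occupied, move one quantum
        energy[i] -= 1
        energy[random.randrange(n_sites)] += 1  # ...to another random site

s_end = shannon(energy)
assert s_end > s_start  # energy spreads out: entropy of its distribution rises
```

Total energy is conserved throughout, so the entropy increase reflects spreading alone, which is exactly the interpretation the article advocates.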
Satellite classification and segmentation using non-additive entropy
NASA Astrophysics Data System (ADS)
Assirati, Lucas; Souto Martinez, Alexandre; Martinez Bruno, Odemir
2014-03-01
Here we compare the Boltzmann-Gibbs-Shannon (standard) entropy with the Tsallis entropy for the pattern recognition and segmentation of colored images obtained by satellites, via "Google Earth". By segmentation we mean partitioning an image to locate regions of interest. Here, we discriminate and define image partition classes according to a training basis. This training basis consists of three pattern classes: aquatic, urban, and vegetation regions. Our numerical experiments demonstrate that the Tsallis entropy, used as a feature vector composed of distinct entropic indexes q, outperforms the standard entropy. There are several applications of our proposed methodology, since satellite images can be used to monitor migration from rural to urban regions, agricultural activities, oil spreading on the ocean, etc.
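The Tsallis feature-vector idea can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the histogram values and the set of entropic indexes q are hypothetical:

```python
import math

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)  # Shannon limit
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

# Feature vector: Tsallis entropies at several entropic indexes q, computed
# from a (hypothetical) normalized gray-level histogram of an image region.
hist = [0.5, 0.25, 0.125, 0.125]
features = [tsallis(hist, q) for q in (0.5, 1.0, 1.5, 2.0)]

assert abs(tsallis(hist, 1.0) - 1.2130075659799042) < 1e-9  # Shannon (nats)
assert abs(tsallis(hist, 2.0) - 0.65625) < 1e-12            # 1 - sum p_i^2
```

A classifier would then compare such per-region feature vectors against those of the training classes (aquatic, urban, vegetation).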
Metastable high-entropy dual-phase alloys overcome the strength-ductility trade-off.
Li, Zhiming; Pradeep, Konda Gokuldoss; Deng, Yun; Raabe, Dierk; Tasan, Cemal Cem
2016-06-09
Metals have been mankind's most essential materials for thousands of years; however, their use is affected by ecological and economical concerns. Alloys with higher strength and ductility could alleviate some of these concerns by reducing weight and improving energy efficiency. However, most metallurgical mechanisms for increasing strength lead to ductility loss, an effect referred to as the strength-ductility trade-off. Here we present a metastability-engineering strategy in which we design nanostructured, bulk high-entropy alloys with multiple compositionally equivalent high-entropy phases. High-entropy alloys were originally proposed to benefit from phase stabilization through entropy maximization. Yet here, motivated by recent work that relaxes the strict restrictions on high-entropy alloy compositions by demonstrating the weakness of this connection, the concept is overturned. We decrease phase stability to achieve two key benefits: interface hardening due to a dual-phase microstructure (resulting from reduced thermal stability of the high-temperature phase); and transformation-induced hardening (resulting from the reduced mechanical stability of the room-temperature phase). This combines the best of two worlds: extensive hardening due to the decreased phase stability known from advanced steels and massive solid-solution strengthening of high-entropy alloys. In our transformation-induced plasticity-assisted, dual-phase high-entropy alloy (TRIP-DP-HEA), these two contributions lead respectively to enhanced trans-grain and inter-grain slip resistance, and hence, increased strength. Moreover, the increased strain hardening capacity that is enabled by dislocation hardening of the stable phase and transformation-induced hardening of the metastable phase produces increased ductility. This combined increase in strength and ductility distinguishes the TRIP-DP-HEA alloy from other recently developed structural materials. 
This metastability-engineering strategy should thus usefully guide design in the near-infinite compositional space of high-entropy alloys.
Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali
2018-01-01
Considerable interest has been devoted for developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected due to aging and disease. Entropy based complexity measures have widely been used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and underlying stimuli that are responsible for anomalous behavior. The single scale based traditional entropy measures yielded contradictory results about the dynamics of real world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, which was used in numerous fields since its inception. The original MSE quantified the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ, however, MSE works well for long signals. To overcome the drawback of original MSE, various variants of this method have been proposed for evaluating complexity efficiently. In this study, we have proposed multiscale normalized corrected Shannon entropy (MNCSE), in which instead of using sample entropy, symbolic entropy measure NCSE has been used as an entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE based features lead to higher classification accuracies in comparison with the MSE based features. PMID:29771977
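The two steps of the original MSE that this abstract builds on, coarse-graining followed by sample entropy, can be sketched as follows. This is a simplified illustration, not the authors' implementation: the tolerance r is a fixed absolute value here, whereas in practice it is usually set to about 0.2 times the series' standard deviation.

```python
import math

def coarse_grain(x, tau):
    """Average non-overlapping windows of length tau (MSE coarse-graining)."""
    n = len(x) // tau
    return [sum(x[i * tau:(i + 1) * tau]) / tau for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B); A, B count template matches of length m+1 and m."""
    n = len(x)
    def matches(length):
        tpl = [x[i:i + length] for i in range(n - m)]  # same template count for both
        return sum(
            1
            for i in range(len(tpl))
            for j in range(i + 1, len(tpl))
            if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r
        )
    B, A = matches(m), matches(m + 1)
    return -math.log(A / B) if A and B else float("inf")

# Multiscale entropy: SampEn of the coarse-grained series across scales tau.
x = [(-1) ** i for i in range(200)]  # perfectly regular alternating series
mse = [sample_entropy(coarse_grain(x, tau)) for tau in (1, 2)]
assert mse[0] == 0.0 and mse[1] == 0.0  # a regular series has zero SampEn at all scales
```

Note how the coarse-grained series shrinks by a factor tau, which is precisely why the original MSE becomes unreliable for short signals at large scale factors.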
NASA Astrophysics Data System (ADS)
Melchert, O.; Hartmann, A. K.
2015-02-01
In this work we consider information-theoretic observables to analyze short symbolic sequences, comprising time series that represent the orientation of a single spin in a two-dimensional (2D) Ising ferromagnet on a square lattice of size L^2 = 128^2 for different system temperatures T. The latter were chosen from an interval enclosing the critical point Tc of the model. At small temperatures the sequences are thus very regular; at high temperatures they are maximally random. In the vicinity of the critical point, nontrivial, long-range correlations appear. Here we implement estimators for the entropy rate, excess entropy (i.e., "complexity"), and multi-information. First, we implement a Lempel-Ziv string-parsing scheme, providing seemingly elaborate entropy rate and multi-information estimates and an approximate estimator for the excess entropy. Furthermore, we apply easy-to-use black-box data-compression utilities, providing approximate estimators only. For comparison and to yield results for benchmarking purposes, we implement the information-theoretic observables also based on the well-established M-block Shannon entropy, which is more tedious to apply compared to the first two "algorithmic" entropy estimation procedures. To test how well one can exploit the potential of such data-compression techniques, we aim at detecting the critical point of the 2D Ising ferromagnet. Among the above observables, the multi-information, which is known to exhibit an isolated peak at the critical point, is very easy to replicate by means of both efficient algorithmic entropy estimation procedures. Finally, we assess how well the various algorithmic entropy estimates compare to the more conventional block entropy estimates and illustrate a simple modification that yields enhanced results.
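A string-parsing entropy-rate estimator of the general kind the authors employ can be sketched as follows. This is our own LZ78-style illustration with a deliberately crude normalization, not the paper's estimator, which uses a more careful parsing and normalization; the point is only that regular symbol sequences parse into far fewer phrases than random ones.

```python
import random
from math import log2

def lz78_phrases(seq):
    """LZ78-style incremental parsing: count the distinct phrases produced."""
    phrases, current = set(), ""
    for s in seq:
        current += s
        if current not in phrases:
            phrases.add(current)
            current = ""  # phrase complete; start a new one
    return len(phrases) + (1 if current else 0)

def entropy_rate_estimate(seq):
    """Crude LZ-based entropy-rate estimate: c * log2(n) / n (bits/symbol)."""
    n = len(seq)
    return lz78_phrases(seq) * log2(n) / n

random.seed(0)
regular = "01" * 2000                                      # low-entropy sequence
noisy = "".join(random.choice("01") for _ in range(4000))  # high-entropy sequence
assert entropy_rate_estimate(regular) < entropy_rate_estimate(noisy)
```

For the low-temperature (regular) spin sequences described above, such an estimator returns small values; near maximal randomness it approaches 1 bit per symbol for a binary alphabet.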
An investigation of combustion and entropy noise
NASA Technical Reports Server (NTRS)
Strahle, W. C.
1977-01-01
The relative importance of entropy and direct combustion noise in turbopropulsion systems and the parameters upon which these noise sources depend were studied. Theory and experiment were employed to determine that at least with the apparatus used here, entropy noise can dominate combustion noise if there is a sufficient pressure gradient terminating the combustor. Measurements included combustor interior fluctuating pressure, near and far field fluctuating pressure, and combustor exit plane fluctuating temperatures, as well as mean pressures and temperatures. Analysis techniques included spectral, cross-correlation, cross power spectra, and ordinary and partial coherence analysis. Also conducted were combustor liner modification experiments to investigate the origin of the frequency content of combustion noise. Techniques were developed to extract nonpropagational pseudo-sound and the heat release fluctuation spectra from the data.
Use of Dimples to Suppress Boundary Layer Separation on a Low Pressure Turbine Blade
2002-12-01
Brayton cycle for an ideal gas turbine engine ... Figure 5. T-S diagram for a non-ideal turbine stage ... engine efficiency is well illustrated with a T-S diagram, where T is temperature and S is entropy. The ideal jet engine is represented with the Brayton ... the Brayton cycle represents an ideal engine, no losses are present, and entropy is not produced. Between station 3 and 4, fuel (energy) is added
Efficient algorithms and implementations of entropy-based moment closures for rarefied gases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel
We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.
NASA Astrophysics Data System (ADS)
Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam
2016-03-01
This article presents a framework for optimizing the thermal cycle used to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We then optimize a thermal cycle to maximize parameter identifiability for these cells; this optimization proceeds with respect to the coefficients of a Fourier discretization of the thermal cycle. Finally, we compare the parameters estimated using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
Entropy of Movement Outcome in Space-Time.
Lai, Shih-Chiung; Hsieh, Tsung-Yu; Newell, Karl M
2015-07-01
Information entropy of the joint spatial and temporal (space-time) probability of discrete movement outcome was investigated in two experiments as a function of different movement strategies (space-time, space, and time instructional emphases), task goals (point-aiming and target-aiming) and movement speed-accuracy constraints. The variance of the movement spatial and temporal errors was reduced by instructional emphasis on the respective spatial or temporal dimension, but increased on the other dimension. The space-time entropy was lower in the target-aiming task than in the point-aiming task but did not differ between instructional emphases. However, the joint probabilistic measure of spatial and temporal entropy showed that spatial error is traded for timing error in tasks with space-time criteria and that the pattern of movement error depends on the dimension of the measurement process. The unified entropy measure of movement outcome in space-time reveals a new relation for the speed-accuracy function.
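The joint space-time entropy used here can be illustrated on hypothetical binned outcomes (the trial data below are invented for the sketch): each trial yields a (spatial-error bin, temporal-error bin) pair, and the entropy of the joint distribution is compared with the marginal spatial and temporal entropies.

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (bits) of a Counter of outcomes."""
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical binned (spatial-error, temporal-error) outcomes of repeated trials.
outcomes = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (0, 0), (1, 1), (0, 1)]
joint = Counter(outcomes)
space = Counter(s for s, _ in outcomes)
time_ = Counter(t for _, t in outcomes)

H_joint = entropy(joint)
H_space, H_time = entropy(space), entropy(time_)
# Subadditivity: joint space-time entropy never exceeds the sum of the marginals,
# with equality only when spatial and temporal errors are independent.
assert H_joint <= H_space + H_time + 1e-12
```

The gap between H_space + H_time and H_joint captures exactly the kind of spatial-for-temporal error trading the abstract describes: correlated errors lower the joint entropy below the sum of the marginals.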
NASA Astrophysics Data System (ADS)
Liu, Hao; Li, Kangda; Wang, Bing; Tang, Hainie; Gong, Xiaohui
2017-01-01
A quantized block compressive sensing (QBCS) framework, which incorporates the universal measurement, quantization/inverse quantization, entropy coder/decoder, and iterative projected Landweber reconstruction, is summarized. Under the QBCS framework, this paper presents an improved reconstruction algorithm for aerial imagery, QBCS, with entropy-aware projected Landweber (QBCS-EPL), which leverages the full-image sparse transform without Wiener filter and an entropy-aware thresholding model for wavelet-domain image denoising. Through analyzing the functional relation between the soft-thresholding factors and entropy-based bitrates for different quantization methods, the proposed model can effectively remove wavelet-domain noise of bivariate shrinkage and achieve better image reconstruction quality. For the overall performance of QBCS reconstruction, experimental results demonstrate that the proposed QBCS-EPL algorithm significantly outperforms several existing algorithms. With the experiment-driven methodology, the QBCS-EPL algorithm can obtain better reconstruction quality at a relatively moderate computational cost, which makes it more desirable for aerial imagery applications.
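The soft-thresholding operation at the core of such wavelet-domain denoising is simple to state. The sketch below is generic, not the paper's entropy-aware model, which chooses the threshold adaptively from the entropy-based bitrate; the coefficient values are hypothetical:

```python
def soft_threshold(x, t):
    """Wavelet-domain soft-thresholding: shrink a coefficient toward zero by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0  # small coefficients (likely noise) are zeroed out

coeffs = [3.0, -0.4, 1.25, -2.5, 0.1]       # hypothetical wavelet coefficients
denoised = [soft_threshold(c, 0.5) for c in coeffs]
assert denoised == [2.5, 0.0, 0.75, -2.0, 0.0]
```

Large coefficients (signal) survive with reduced magnitude while small ones (noise) vanish; the QBCS-EPL contribution lies in tying the threshold t to the quantizer's entropy-based bitrate rather than fixing it.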
Generalized permutation entropy analysis based on the two-index entropic form S_{q,δ}
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Shang, Pengjian
2015-05-01
Permutation entropy (PE) is a novel measure to quantify the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PE_{q,δ}) based on the recently postulated entropic form S_{q,δ}, which was proposed as a unification of the well-known S_q of nonextensive statistical mechanics and S_δ, a possibly appropriate candidate for the black-hole entropy. We find that PE_{q,δ} with appropriate parameters can amplify minor changes and trends in complexity in comparison to PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, showing its power. Results show that PE_{q,δ} is an exponential function of q whose power k(δ) is a constant once δ is determined. Some discussion of k(δ) is provided, and we also find some interesting power-law behavior.
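Standard permutation entropy, the baseline the authors generalize, can be sketched as follows (an illustrative implementation; the embedding order m = 3 and the normalization by log(m!) are conventional choices, not taken from the paper):

```python
import random
from math import log, factorial

def permutation_entropy(x, m=3):
    """Normalized permutation entropy: Shannon entropy of ordinal patterns / log(m!)."""
    counts = {}
    for i in range(len(x) - m + 1):
        # ordinal pattern of window x[i:i+m]: indexes sorted by value
        pattern = tuple(sorted(range(m), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    n = sum(counts.values())
    h = -sum((c / n) * log(c / n) for c in counts.values())
    return h / log(factorial(m))

random.seed(2)
noisy = [random.random() for _ in range(500)]
assert permutation_entropy(list(range(100))) == 0.0  # monotone: one pattern only
assert 0.9 < permutation_entropy(noisy) <= 1.0       # i.i.d. noise: near-maximal
```

The generalization PE_{q,δ} replaces the Shannon functional applied to the ordinal-pattern distribution with the two-index form S_{q,δ}.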
NASA Astrophysics Data System (ADS)
Yinkai Lei
Atomistic simulation refers to a set of simulation methods that model materials on the atomistic scale. These methods are faster and cheaper alternatives to experiments for investigating the thermodynamics and kinetics of materials. In this dissertation, atomistic simulation methods have been used to study the thermodynamic and kinetic properties of two material systems: the entropy of Al-containing high entropy alloys (HEAs) and the vacancy migration energy of thermally grown aluminum oxide. (Abstract shortened by ProQuest.)
NASA Astrophysics Data System (ADS)
Yao, Lei; Wang, Zhenpo; Ma, Jun
2015-10-01
This paper proposes an entropy-based method for detecting connection faults in the lithium-ion batteries of electric vehicles. During electric vehicle operation, factors such as road conditions, driving habits, and vehicle performance subject the batteries to vibration, which can easily cause loose or virtual connections between batteries. By simulating battery charging and discharging experiments in a vibration environment, voltage fluctuation data can be obtained. Meanwhile, an optimal filtering method using the discrete cosine filter is adopted to analyze the characteristics of the system noise, based on the voltage sets recorded while the batteries operate at different vibration frequencies. The filtered experimental data are analyzed using local Shannon entropy, ensemble Shannon entropy, and sample entropy, and the best entropy-based approach for detecting connection faults in lithium-ion batteries for electric vehicles is identified. The experimental data show that ensemble Shannon entropy can predict the exact time and location of a battery connection failure in real time. Besides the electric-vehicle industry, this method can also be used in other areas with complex vibration environments.
Video and accelerometer-based motion analysis for automated surgical skills assessment.
Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan
2018-03-01
Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts' time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study of basic surgical skill assessment on a dataset containing video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features, approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to the existing methods of Sequential Motion Texture, Discrete Cosine Transform, and Discrete Fourier Transform for surgical skills assessment. We report the average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1% and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
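The approximate-entropy feature can be sketched generically (this is not the authors' implementation; the embedding dimension m = 2 and tolerance r = 0.2 are illustrative defaults, and the test signals are invented): a regular motion trace yields a low ApEn, an erratic one a high ApEn.

```python
import random
from math import log

def approximate_entropy(x, m=2, r=0.2):
    """ApEn(m, r): regularity statistic; smaller values mean a more predictable series."""
    def phi(mm):
        n = len(x) - mm + 1
        tpl = [x[i:i + mm] for i in range(n)]
        total = 0.0
        for i in range(n):
            # count templates within tolerance r of template i (self-match included)
            c = sum(
                1 for j in range(n)
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r
            )
            total += log(c / n)
        return total / n
    return phi(m) - phi(m + 1)

random.seed(3)
regular = [(-1) ** i for i in range(100)]             # perfectly alternating trace
noisy = [random.choice([-1, 1]) for _ in range(100)]  # coin-flip trace
assert approximate_entropy(regular) < approximate_entropy(noisy)
```

Cross-approximate entropy follows the same template-matching logic, but compares templates drawn from two different time series (e.g., video motion vs. accelerometer traces) instead of one.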
Shearlet-based measures of entropy and complexity for two-dimensional patterns
NASA Astrophysics Data System (ADS)
Brazhe, Alexey
2018-06-01
New spatial entropy and complexity measures for two-dimensional patterns are proposed. The approach is based on the notion of disequilibrium and is built on statistics of directional multiscale coefficients of the fast finite shearlet transform. Shannon entropy and Jensen-Shannon divergence measures are employed. Both local and global spatial complexity and entropy estimates can be obtained, thus allowing for spatial mapping of complexity in inhomogeneous patterns. The algorithm is validated in numerical experiments with a gradually decaying periodic pattern and Ising surfaces near critical state. It is concluded that the proposed algorithm can be instrumental in describing a wide range of two-dimensional imaging data, textures, or surfaces, where an understanding of the level of order or randomness is desired.
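The entropy-disequilibrium construction can be illustrated independently of the shearlet machinery. Assuming a probability distribution p has already been extracted from the multiscale coefficients (as the paper does from the fast finite shearlet transform), a minimal sketch of a Jensen-Shannon-based entropy-complexity pair is:

```python
import math

def shannon(p):
    # Shannon entropy in nats; zero-probability bins are skipped.
    return -sum(q * math.log(q) for q in p if q > 0)

def jensen_shannon(p, u):
    # Jensen-Shannon divergence between two distributions of equal length.
    m = [(a + b) / 2 for a, b in zip(p, u)]
    return shannon(m) - (shannon(p) + shannon(u)) / 2

def entropy_complexity(p):
    """Normalized entropy H and statistical complexity C = H * Q, where Q
    is the Jensen-Shannon disequilibrium to the uniform distribution,
    normalized by its maximum (attained by a delta distribution)."""
    n = len(p)
    u = [1.0 / n] * n
    h = shannon(p) / math.log(n)
    delta = [1.0] + [0.0] * (n - 1)
    q = jensen_shannon(p, u) / jensen_shannon(delta, u)
    return h, h * q
```

A uniform distribution gives (H, C) near (1, 0) and a delta gives (0, 0); structured distributions land in between with C > 0, which is what makes the measure useful for separating order from randomness.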
Physics of negative absolute temperatures.
Abraham, Eitan; Penrose, Oliver
2017-01-01
Negative absolute temperatures were introduced into experimental physics by Purcell and Pound, who successfully applied this concept to nuclear spins; nevertheless, the concept has proved controversial: a recent article aroused considerable interest by its claim, based on a classical entropy formula (the "volume entropy") due to Gibbs, that negative temperatures violated basic principles of statistical thermodynamics. Here we give a thermodynamic analysis that confirms the negative-temperature interpretation of the Purcell-Pound experiments. We also examine the principal arguments that have been advanced against the negative temperature concept; we find that these arguments are not logically compelling, and moreover that the underlying "volume" entropy formula leads to predictions inconsistent with existing experimental results on nuclear spins. We conclude that, despite the counterarguments, negative absolute temperatures make good theoretical sense and did occur in the experiments designed to produce them.
Brain Entropy Mapping Using fMRI
Wang, Ze; Li, Yin; Childress, Anna Rose; Detre, John A.
2014-01-01
Entropy is an important trait for life as well as the human brain. Characterizing brain entropy (BEN) may provide an informative tool to assess brain states and brain functions. Yet little is known about the distribution and regional organization of BEN in the normal brain. The purpose of this study was to examine whole-brain entropy patterns using a large cohort of normal subjects. A series of experiments were first performed to validate an approximate entropy measure regarding its sensitivity, specificity, and reliability using synthetic data and fMRI data. Resting state fMRI data from a large cohort of normal subjects (n = 1049) acquired at multiple sites were then used to derive a 3-dimensional BEN map, showing a sharp low-high entropy contrast between the neocortex and the rest of the brain. The spatial heterogeneity of resting BEN was further studied using a data-driven clustering method, and the entire brain was found to be organized into 7 hierarchical regional BEN networks that are consistent with known structural and functional brain parcellations. These findings suggest BEN mapping as a physiologically and functionally meaningful measure for studying brain functions. PMID:24657999
Equilibrium high entropy alloy phase stability from experiments and thermodynamic modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saal, James E.; Berglund, Ida S.; Sebastian, Jason T.
Long-term stability of high entropy alloys (HEAs) is a critical consideration for the design and practical application of HEAs. It has long been assumed that many HEAs are kinetically-stabilized metastable structures, and recent experiments have confirmed this hypothesis by observing HEA decomposition after long-term equilibration. In the present work, we demonstrate the use of the CALculation of PHAse Diagrams (CALPHAD) approach to predict HEA stability and processing parameters, comparing experimental long-term annealing observations to CALPHAD phase diagrams from a commercially-available HEA database. As a result, we find good agreement between single- and multi-phase predictions and experiments.
Adjusting protein graphs based on graph entropy.
Peng, Sheng-Lung; Tsay, Yu-Wei
2014-01-01
Measuring protein structural similarity attempts to establish a relationship of equivalence between polymer structures based on their conformations. In several recent studies, researchers have explored protein-graph remodeling instead of looking for a minimum superimposition of pairwise proteins. When graphs are used to represent structured objects, the problem of measuring object similarity becomes one of computing the similarity between graphs. Graph theory provides an alternative perspective as well as efficiency. Once a protein graph has been created, its structural stability must be verified. Therefore, a criterion is needed to determine if a protein graph can be used for structural comparison. In this paper, we propose a measurement for protein graph remodeling based on graph entropy. We extend the concept of graph entropy to determine whether a graph is suitable for representing a protein. The experimental results suggest that, when applied, graph entropy helps validate the conformation used in protein graph modeling. Furthermore, it indirectly contributes to protein structural comparison if a protein graph is sound.
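The abstract does not spell out the exact graph-entropy definition, but a common structural variant, the Shannon entropy of the normalized degree distribution, conveys the idea. The following sketch is an illustrative stand-in, not the authors' measure:

```python
import math

def graph_entropy(adj):
    """Shannon entropy of the normalized degree distribution of a graph
    given as an adjacency matrix (one common graph-entropy variant)."""
    degrees = [sum(row) for row in adj]
    total = sum(degrees)
    probs = [d / total for d in degrees if d > 0]
    return -sum(p * math.log(p) for p in probs)
```

Regular graphs maximize this entropy for a given node count, while hub-dominated graphs score lower, giving one simple criterion for judging a graph's structure.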
Generating intrinsically disordered protein conformational ensembles from a Markov chain
NASA Astrophysics Data System (ADS)
Cukier, Robert I.
2018-03-01
Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI, which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies: MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without very large entropy that might lead to too high an entropy penalty.
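For a stationary two-state (dichotomic) Markov chain, the pair mutual information between neighbor states has a closed form in terms of the stationary distribution and the transition matrix. A minimal sketch, with an illustrative transition matrix rather than the authors' model parameters, is:

```python
import math

def stationary(t):
    # Stationary distribution pi of a 2-state chain, solving pi = pi T
    # (closed form for two states).
    p01, p10 = t[0][1], t[1][0]
    return [p10 / (p01 + p10), p01 / (p01 + p10)]

def pair_mutual_information(t):
    """Mutual information between neighbor states of a stationary
    dichotomic Markov chain: sum_ab pi_a T_ab log(T_ab / pi_b)."""
    pi = stationary(t)
    mi = 0.0
    for a in range(2):
        for b in range(2):
            if t[a][b] > 0:
                mi += pi[a] * t[a][b] * math.log(t[a][b] / pi[b])
    return mi
```

An independent chain gives MI = 0, while a persistent chain (states tend to repeat) gives larger MI, mirroring the MoRF / not-MoRF distinction drawn above.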
Binarized cross-approximate entropy in crowdsensing environment.
Skoric, Tamara; Mohamoud, Omer; Milovanovic, Branislav; Japundzic-Zigon, Nina; Bajic, Dragana
2017-01-01
Personalised monitoring in health applications has been recognised as part of the mobile crowdsensing concept, where subjects equipped with sensors extract information and share it for personal or common benefit. Limited transmission resources impose the use of local analysis methods, but this approach is incompatible with analytical tools that require stationary and artefact-free data. This paper proposes a computationally efficient binarised cross-approximate entropy, referred to as (X)BinEn, for unsupervised cardiovascular signal processing in environments where energy and processor resources are limited. The proposed method is a descendant of the cross-approximate entropy ((X)ApEn). It operates on binary, differentially encoded data series split into m-sized vectors. The Hamming distance is used as a distance measure, while a search for similarities is performed on the vector sets. The procedure is tested on rats under shaker and restraint stress, and compared to the existing (X)ApEn results. The number of processing operations is reduced. (X)BinEn captures entropy changes in a similar manner to (X)ApEn. The coding coarseness yields an adverse effect of reduced sensitivity, but it attenuates parameter inconsistency and binary bias. A special case of (X)BinEn is equivalent to Shannon's entropy. A binary conditional entropy for m = 1 vectors is embedded into the (X)BinEn procedure. (X)BinEn can be applied to a single time series as an auto-entropy method, or to a pair of time series as a cross-entropy method. Its low processing requirements make it suitable for mobile, battery-operated, self-attached sensing devices with limited power and processor resources.
Entropy of balance - some recent results
2010-01-01
Background Entropy when applied to biological signals is expected to reflect the state of the biological system. However the physiological interpretation of the entropy is not always straightforward. When should high entropy be interpreted as a healthy sign, and when as a marker of deteriorating health? We address this question for the particular case of human standing balance and the Center of Pressure data. Methods We have measured and analyzed balance data of 136 participants (young, n = 45; elderly, n = 91) comprising in all 1085 trials, and calculated the Sample Entropy (SampEn) for medio-lateral (M/L) and anterior-posterior (A/P) Center of Pressure (COP) together with the Hurst self-similarity (ss) exponent α using Detrended Fluctuation Analysis (DFA). The COP was measured with a force plate in eight 30-second trials with eyes closed, eyes open, foam, self-perturbation and nudge conditions. Results 1) There is a significant difference in SampEn for the A/P-direction between the elderly and the younger groups (old > young). 2) For the elderly we have in general A/P > M/L. 3) For the younger group there was no significant A/P-M/L difference, with the exception of the nudge trials where we had the reverse situation, A/P < M/L. 4) For the elderly we have, Eyes Closed > Eyes Open. 5) In the case of the Hurst ss-exponent we have for the elderly, M/L > A/P. Conclusions These results seem to require some modification of the more or less established attention-constraint interpretation of entropy. This holds that higher entropy correlates with a more automatic and a less constrained mode of balance control, and that a higher entropy reflects, in this sense, a more efficient balancing. PMID:20670457
ERIC Educational Resources Information Center
Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.
2008-01-01
In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenious, the subtleness of…
Allnér, Olof; Foloppe, Nicolas; Nilsson, Lennart
2015-01-22
Molecular dynamics simulations of E. coli glutaredoxin1 in water have been performed to relate the dynamical parameters and entropy obtained in NMR relaxation experiments, with results extracted from simulated trajectory data. NMR relaxation is the most widely used experimental method to obtain data on dynamics of proteins, but it is limited to relatively short timescales and to motions of backbone amides or in some cases (13)C-H vectors. By relating the experimental data to the all-atom picture obtained in molecular dynamics simulations, valuable insights on the interpretation of the experiment can be gained. We have estimated the internal dynamics and their timescales by calculating the generalized order parameters (O) for different time windows. We then calculate the quasiharmonic entropy (S) and compare it to the entropy calculated from the NMR-derived generalized order parameter of the amide vectors. Special emphasis is put on characterizing dynamics that are not expressed through the motions of the amide group. The NMR and MD methods suffer from complementary limitations, with NMR being restricted to local vectors and dynamics on a timescale determined by the rotational diffusion of the solute, while in simulations, it may be difficult to obtain sufficient sampling to ensure convergence of the results. We also evaluate the amount of sampling obtained with molecular dynamics simulations and how it is affected by the length of individual simulations, by clustering of the sampled conformations. We find that two structural turns act as hinges, allowing the α helix between them to undergo large, long timescale motions that cannot be detected in the time window of the NMR dipolar relaxation experiments. We also show that the entropy obtained from the amide vector does not account for correlated motions of adjacent residues. 
Finally, we show that the sampling in a total of 100 ns molecular dynamics simulation can be increased by around 50%, by dividing the trajectory into 10 replicas with different starting velocities.
How to assess the efficiency of synchronization experiments in tokamaks
NASA Astrophysics Data System (ADS)
Murari, A.; Craciunescu, T.; Peluso, E.; Gelfusa, M.; Lungaroni, M.; Garzotti, L.; Frigione, D.; Gaudio, P.; JET Contributors
2016-07-01
Control of instabilities such as ELMs and sawteeth is considered an important ingredient in the development of reactor-relevant scenarios. Various forms of ELM pacing have been tried in the past to influence their behavior using external perturbations. One of the main problems with these synchronization experiments resides in the fact that ELMs are periodic or quasi-periodic in nature. Therefore, after any pulsed perturbation, if one waits long enough, an ELM is always bound to occur. To evaluate the effectiveness of ELM pacing techniques, it is crucial to determine an appropriate interval over which they can have a real influence and an effective triggering capability. In this paper, three independent statistical methods are described to address this issue: Granger causality, transfer entropy and recurrence plots. The obtained results for JET with the ITER-like wall (ILW) indicate that the proposed techniques agree very well and provide much better estimates than the traditional heuristic criteria reported in the literature. Moreover, their combined use allows for the improvement of the time resolution of the assessment and determination of the efficiency of the pellet triggering in different phases of the same discharge. Therefore, the developed methods can be used to provide a quantitative and statistically robust estimate of the triggering efficiency of ELM pacing under realistic experimental conditions.
Dynamical Disentangling and Cooling of Atoms in Bilayer Optical Lattices
NASA Astrophysics Data System (ADS)
Kantian, A.; Langer, S.; Daley, A. J.
2018-02-01
We show how experimentally available bilayer lattice systems can be used to prepare quantum many-body states with exceptionally low entropy in one layer, by dynamically disentangling the two layers. This disentangling operation moves one layer (subsystem A) into a regime where excitations in A develop a single-particle gap. As a result, this operation maps directly to cooling for subsystem A, with entropy being shuttled to the other layer. For both bosonic and fermionic atoms, we study the corresponding dynamics, showing that disentangling can be realized cleanly in ongoing experiments. The corresponding entanglement entropies are directly measurable with quantum gas microscopes, and, as a tool for producing lower-entropy states, this technique opens a range of applications, beginning with simplifying the production of magnetically ordered states of bosons and fermions.
Work and information from thermal states after subtraction of energy quanta.
Hloušek, J; Ježek, M; Filip, R
2017-10-12
Quantum oscillators prepared out of thermal equilibrium can be used to produce work and transmit information. By intensive cooling of a single oscillator, its thermal energy deterministically dissipates to a colder environment, and the oscillator substantially reduces its entropy. This out-of-equilibrium state allows us to obtain work and to carry information. Here, we propose and experimentally demonstrate an advanced approach that conditionally prepares more efficient out-of-equilibrium states using only weak dissipation, an inefficient quantum measurement of the dissipated thermal energy, and subsequent triggering on those states. Although this procedure conditionally subtracts energy quanta from the oscillator, the average energy grows, and the second-order correlation function approaches unity, as under coherent external driving. On the other hand, the Fano factor remains constant and the entropy of the subtracted state increases, which raises doubts about a possible application of this approach. To resolve this, we predict and experimentally verify that both the available work and the transmitted information can be conditionally higher in this case than for arbitrary cooling or thermal heating up to the same average energy. This qualifies the conditional procedure as a useful resource for experiments in quantum information and thermodynamics.
Zhao, Min; Chen, Yanming; Qu, Dacheng; Qu, Hong
2015-01-01
The substrates of a transporter are not only useful for inferring the function of the transporter, but also important for discovering compound-compound interactions and reconstructing metabolic pathways. Though plenty of data has been accumulated with the development of new technologies such as in vitro transporter assays, the search for substrates of transporters is far from complete. In this article, we introduce METSP, a maximum-entropy classifier devoted to retrieving transporter-substrate pairs (TSPs) from semistructured text. Based on the high-quality annotation from UniProt, METSP achieves high precision and recall in cross-validation experiments. When METSP is applied to 182,829 human transporter annotation sentences in UniProt, it identifies 3942 sentences with transporter and compound information. Finally, 1547 confident human TSPs are identified for further manual curation, among which 58.37% are pairs with novel substrates not annotated in public transporter databases. METSP is the first efficient tool to extract TSPs from semistructured annotation text in UniProt. This tool can help to determine the precise substrates and drugs of transporters, thus facilitating drug-target prediction, metabolic network reconstruction, and literature classification.
Combining Experiments and Simulations Using the Maximum Entropy Principle
Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten
2014-01-01
A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges. PMID:24586124
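The reweighting flavour of the maximum-entropy procedure can be sketched in a few lines. Assuming a set of simulation frames described by a single scalar observable, the weights closest (in the Kullback-Leibler sense) to the uniform prior that reproduce an experimental average take an exponential form; solving for the one Lagrange multiplier by bisection is an illustrative choice, not the method of the papers discussed:

```python
import math

def maxent_reweight(obs, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy reweighting: weights w_i proportional to
    exp(-lam * o_i), with lam chosen so the weighted mean of the
    observable matches the experimental target."""
    def mean(lam):
        ws = [math.exp(-lam * o) for o in obs]
        z = sum(ws)
        return sum(w * o for w, o in zip(ws, obs)) / z

    # mean(lam) decreases monotonically in lam, so bisect on lam.
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target:
            lo = mid  # need a larger lam to push the mean down
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = (lo + hi) / 2
    ws = [math.exp(-lam * o) for o in obs]
    z = sum(ws)
    return [w / z for w in ws]
```

With several experimental averages the same idea applies with one multiplier per restraint, which is where the theoretical subtleties discussed in the highlighted papers arise.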
Novel carbon-ion fuel cells. Quarterly technical report No. 9, October 1, 1995--December 31, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cocks, F.H.
1995-12-31
This report presents research to develop an entirely new, fundamentally different class of fuel cell using a solid electrolyte that transports carbon ions. This fuel cell would use solid carbon dissolved in molten metal as a fuel reservoir and anode; expensive gaseous or liquid fuel would not be required. Thermodynamic factors favor a carbon-ion fuel cell over other fuel cell designs: a combination of enthalpy, entropy, and Gibbs free energy makes the reaction of solid carbon and oxygen very efficient, and the entropy change allows this efficiency to slightly increase at high temperatures. The high temperature exhaust of the fuel cell would make it useful as a "topping cycle", to be followed by conventional steam turbine systems.
Investigating the structure preserving encryption of high efficiency video coding (HEVC)
NASA Astrophysics Data System (ADS)
Shahid, Zafar; Puech, William
2013-02-01
This paper presents a novel method for the real-time protection of the emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from the CABAC entropy coding of H.264/AVC. In the CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for the binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture, and objects.
Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten
2012-10-01
It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs a cost in the response time to complete the next task. Conditions are also known that exaggerate or lessen the switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks, producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes in fatigue.
NASA Astrophysics Data System (ADS)
Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.
2016-08-01
In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the informational efficiency evolution of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes it possible to obtain relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced by the financial crisis, whereas another set of sectors, comprising chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications, has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show behavior close to a random walk over practically the whole period of analysis, confirming a remarkable immunity to the 2008 financial crisis.
Entropic and near-field improvements of thermoradiative cells
Hsu, Wei-Chun; Tong, Jonathan K.; Liao, Bolin; Huang, Yi; Boriskina, Svetlana V.; Chen, Gang
2016-10-13
A p-n junction maintained above ambient temperature can work as a heat engine, converting some of the supplied heat into electricity and rejecting entropy by interband emission. Such thermoradiative cells have the potential to harvest low-grade heat into electricity. By analyzing the entropy content of different spectral components of thermal radiation, we identify an approach to increase the efficiency of thermoradiative cells via spectrally selecting long-wavelength photons for radiative exchange. Furthermore, we predict that near-field photon extraction, by coupling photons generated from the interband electronic transition to phonon polariton modes on the surface of a heat sink, can increase the conversion efficiency as well as the power generation density, providing more opportunities to efficiently utilize terrestrial emission for clean energy. An ideal InSb thermoradiative cell can achieve a maximum efficiency and power density of up to 20.4% and 327 W m-2, respectively, between a hot source at 500 K and a cold sink at 300 K. However, sub-bandgap and non-radiative losses will significantly degrade the cell performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fei; Wu, Yuan; Lou, Hongbo
Polymorphism, which describes the occurrence of different lattice structures in a crystalline material, is a critical phenomenon in materials science and condensed matter physics. Recently, configuration disorder was compositionally engineered into single lattices, leading to the discovery of high-entropy alloys and high-entropy oxides. For these novel entropy-stabilized forms of crystalline matter with extremely high structural stability, is polymorphism still possible? Here, by employing in situ high-pressure synchrotron radiation X-ray diffraction, we reveal a polymorphic transition from the face-centred-cubic (fcc) structure to the hexagonal-close-packed (hcp) structure in the prototype CoCrFeMnNi high-entropy alloy. The transition is irreversible, and our in situ high-temperature synchrotron radiation X-ray diffraction experiments at different pressures on the retained hcp high-entropy alloy reveal that the fcc phase is a stable polymorph at high temperatures, while the hcp structure is more thermodynamically favourable at lower temperatures. As pressure is increased, the critical temperature for the hcp-to-fcc transformation also rises.
NASA Astrophysics Data System (ADS)
Liu, Weixin; Jin, Ningde; Han, Yunfeng; Ma, Jing
2018-06-01
In the present study, a multi-scale entropy algorithm was used to characterise the complex flow phenomena of turbulent droplets in oil-water two-phase flow with high water-cut. First, we compared multi-scale weighted permutation entropy (MWPE), multi-scale approximate entropy (MAE), multi-scale sample entropy (MSE) and multi-scale complexity measure (MCM) on typical nonlinear systems. The results show that MWPE exhibits satisfactory variability with scale and good robustness to noise. Accordingly, we conducted an experiment on vertical upward oil-water two-phase flow with high water-cut and collected the signals of a high-resolution microwave resonant sensor, from which two indexes, the entropy rate and the mean value of MWPE, were extracted. In addition, the effects of total flow rate and water-cut on these two indexes were analysed. Our results show that MWPE is an effective method for uncovering the dynamic instability of oil-water two-phase flow with high water-cut.
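The multi-scale permutation entropy used above can be sketched in a few lines. This is a minimal, unweighted variant under stated assumptions (the paper's MWPE additionally weights each ordinal pattern by the local variance of its window; function names and defaults here are mine, not the authors'):

```python
import numpy as np
from math import factorial

def coarse_grain(x, scale):
    """Multi-scale step: average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def permutation_entropy(x, m=3, normalize=True):
    """Shannon entropy of the ordinal-pattern distribution of x."""
    counts = {}
    for i in range(len(x) - m + 1):
        pattern = tuple(np.argsort(x[i:i + m]))   # rank order of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(m)) if normalize else h

def multiscale_pe(x, scales=range(1, 6), m=3):
    """Entropy profile across temporal scales."""
    return [permutation_entropy(coarse_grain(np.asarray(x, float), s), m)
            for s in scales]
```

A monotone ramp produces a single ordinal pattern (normalized entropy 0), while white noise approaches the maximum of 1; it is this variation of the profile with scale that discriminates flow patterns.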
Huang, L.; Cong, D. Y.; Ma, L.; ...
2015-07-02
A polycrystalline Ni41Co9Mn40Sn10 (at. %) magnetic shape memory alloy was prepared by arc melting and characterized mainly by magnetic measurements, in-situ high-energy X-ray diffraction (HEXRD), and mechanical testing. A large magnetoresistance of 53.8% (under 5 T) and a large magnetic entropy change of 31.9 J/(kg K) (under 5 T) were simultaneously achieved. Both of these values are among the highest reported so far in Ni-Mn-Sn-based Heusler alloys. The large magnetic entropy change, closely related to the structural entropy change, is attributed to the large unit cell volume change across the martensitic transformation, as revealed by our in-situ HEXRD experiment. Furthermore, good compressive properties were also obtained. The combination of large magnetoresistance, large magnetic entropy change, and good compressive properties, as well as low cost, makes this alloy a promising candidate for multifunctional applications.
Pei, Jiquan; Han, Steve; Liao, Haijun; Li, Tao
2014-01-22
A highly efficient and simple-to-implement Monte Carlo algorithm is proposed for the evaluation of the Rényi entanglement entropy (REE) of the quantum dimer model (QDM) at the Rokhsar-Kivelson (RK) point. It makes it possible to evaluate the REE at the RK point up to the thermodynamic limit for a general QDM. We apply the algorithm to QDMs defined on the triangular and the square lattice in two dimensions and on the simple and the face-centred-cubic (fcc) lattice in three dimensions. We find that the REE on all these lattices follows perfect linear scaling in the thermodynamic limit, apart from an even-odd oscillation in the case of the square lattice. We also evaluate the topological entanglement entropy (TEE) with both a subtraction and an extrapolation procedure. We find that the QDMs on both the triangular and the fcc lattice exhibit robust Z2 topological order. The expected TEE of ln 2 is clearly demonstrated in both cases. Our large-scale simulation also proves the recently proposed extrapolation procedure in cylindrical geometry to be a highly reliable way to extract the TEE of a topologically ordered system.
The entropy reduction engine: Integrating planning, scheduling, and control
NASA Technical Reports Server (NTRS)
Drummond, Mark; Bresina, John L.; Kedar, Smadar T.
1991-01-01
The Entropy Reduction Engine, an architecture for the integration of planning, scheduling, and control, is described. The architecture is motivated, presented, and analyzed in terms of its different components; namely, problem reduction, temporal projection, and situated control rule execution. Experience with this architecture has motivated the recent integration of learning. The learning methods are described along with their impact on architecture performance.
Relative entropy and optimization-driven coarse-graining methods in VOTCA
Mashayak, S. Y.; Jochum, Mara N.; Koschke, Konstantin; ...
2015-07-20
We discuss recent advances of the VOTCA package for systematic coarse-graining. Two methods have been implemented, namely the downhill simplex optimization and the relative entropy minimization. We illustrate the new methods by coarse-graining SPC/E bulk water and more complex water-methanol mixture systems. The CG potentials obtained from both methods are then evaluated by comparing the pair distributions from the coarse-grained to the reference atomistic simulations. We have also added a parallel analysis framework to improve the computational efficiency of the coarse-graining process.
Directionality theory and the evolution of body size.
Demetrius, L
2000-12-07
Directionality theory, a dynamic theory of evolution that integrates population genetics with demography, is based on the concept of evolutionary entropy, a measure of the variability in the age of reproducing individuals in a population. The main tenets of the theory are three principles relating a population's response to the ecological constraints it experiences to trends in entropy as the population evolves under mutation and natural selection: (i) stationary size or fluctuations around a stationary size (bounded growth): a unidirectional increase in entropy; (ii) prolonged episodes of exponential growth (unbounded growth) at large population size: a unidirectional decrease in entropy; and (iii) prolonged episodes of exponential growth (unbounded growth) at small population size: random, non-directional change in entropy. We invoke these principles, together with an allometric relationship between entropy and the morphometric variable body size, to provide evolutionary explanations of three empirical patterns pertaining to trends in body size, namely (i) Cope's rule, the tendency towards size increase within phyletic lineages; (ii) the island rule, which pertains to changes in body size that occur as species migrate from mainland populations to colonize island habitats; and (iii) Bergmann's rule, the tendency towards size increase with increasing latitude. The observation that these ecotypic patterns can be explained in terms of the directionality principles for entropy underscores the significance of evolutionary entropy as a unifying concept in forging a link between micro-evolution, the dynamics of gene frequency change, and macro-evolution, dynamic changes in morphometric variables.
Hacisuleyman, Aysima; Erman, Burak
2017-01-01
It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other via entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota to evaluate entropy transfer between all pairs of residues of Ubiquitin and to quantify the changes in binding susceptibility upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with the results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of the fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the new concepts required for explaining allosteric communication in proteins.
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines permutation entropy with a support vector machine to detect low-SNR microseismic events. First, a signal feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal's permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation on the collected vibration signals and constructing a feature vector set from the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed using multi-scale permutation entropy. The detection model combining multi-scale permutation entropy with the support vector machine, which features high classification accuracy and a fast real-time algorithm, can meet the requirements of online, real-time extraction of microseismic events.
Generation of skeletal mechanism by means of projected entropy participation indices
NASA Astrophysics Data System (ADS)
Paolucci, Samuel; Valorani, Mauro; Ciottoli, Pietro Paolo; Galassi, Riccardo Malpica
2017-11-01
When the dynamics of reactive systems develop very slow and very fast time scales separated by a range of active time scales, with gaps in the fast/active and slow/active time scales, it is possible to achieve multi-scale adaptive model reduction along with the integration of the ODEs using the G-Scheme. The scheme assumes that the dynamics is decomposed into active, slow, fast, and invariant subspaces. We derive expressions that establish a direct link between time scales and entropy production by using estimates provided by the G-Scheme. To calculate the contribution to entropy production, we resort to a standard model of a constant-pressure, adiabatic batch reactor, where the mixture temperature of the reactants is initially set above the auto-ignition temperature. Numerical experiments show that the contribution to entropy production of the fast subspace is of the same magnitude as the error threshold chosen for the identification of the decomposition of the tangent space, and that the contribution of the slow subspace is generally much smaller than that of the active subspace. The information on entropy production associated with the reactions within each subspace is used to define an entropy participation index that is subsequently utilized for model reduction.
Optimal channel efficiency in a sensory network
NASA Astrophysics Data System (ADS)
Mosqueiro, Thiago S.; Maia, Leonardo P.
2013-07-01
Spontaneous neural activity has been increasingly recognized as a subject of key relevance in neuroscience. It exhibits nontrivial spatiotemporal structure reflecting the organization of the underlying neural network and has proved to be closely intertwined with stimulus-induced activity patterns. As an additional contribution in this regard, we report computational studies that strongly suggest that a stimulus-free feature rules the behavior of an important psychophysical measure of the sensitivity of a sensory system to a stimulus, the so-called dynamic range. Indeed, in this paper we show that the entropy of the distribution of avalanche lifetimes (the information efficiency, since it can be interpreted as the efficiency of the network seen as a communication channel) always accompanies the dynamic range in the benchmark model for sensory systems. Specifically, by simulating the Kinouchi-Copelli (KC) model on two broad families of model networks, we generically observed that both quantities always increase or decrease together as functions of the average branching ratio (the control parameter of the KC model) and that the information efficiency typically exhibits critical optimization jointly with the dynamic range (i.e., both quantities are optimized at the same value of that control parameter, which turns out to be the critical point of a nonequilibrium phase transition). In contrast with the practice of taking power laws to identify critical points in most studies describing measured neuronal avalanches, we rely on data collapses as more robust signatures of criticality to claim that critical optimization may happen even when the distribution of avalanche lifetimes is not a power law, as suggested by a recent experiment.
Finally, we note that the entropy of the size distribution of avalanches (the information capacity) does not always follow the dynamic range and the information efficiency when they are critically optimized, despite being more widely used than the latter to describe the computational capabilities of a neural network. This strongly suggests that dynamical rules allowing a proper temporal matching of the states of the interacting neurons are the key to achieving good performance in information processing, rather than increasing the number of available units.
Retinal blood vessel extraction using tunable bandpass filter and fuzzy conditional entropy.
Sil Kar, Sudeshna; Maity, Santi P
2016-09-01
Extraction of blood vessels in retinal images plays a significant role in screening for different ophthalmologic diseases. However, accurate extraction of the entire vessel silhouette, and of individual vessel types, from noisy images with poorly illuminated backgrounds is a complicated task. To this end, an integrated system design platform is suggested in this work for vessel extraction using a sequential bandpass filter followed by fuzzy conditional entropy maximization on the matched filter response. First, noise is eliminated from the image under consideration through curvelet-based denoising. To include the fine details and the relatively less thick vessel structures, the image is passed through a bank of sequential bandpass filters optimized for contrast enhancement. Fuzzy conditional entropy on the matched filter response is then maximized to find the set of multiple optimal thresholds to extract the different types of vessel silhouettes from the background. The Differential Evolution algorithm is used to determine the optimal gain of the bandpass filter and the combination of the fuzzy parameters. Using the multiple thresholds, the retinal image is classified into the thick, the medium and the thin vessels, including neovascularization. Performance evaluated on different publicly available retinal image databases shows that the proposed method is very efficient in identifying the diverse types of vessels, and also in extracting the abnormal and the thin blood vessels in pathological retinal images. The average values of true positive rate, false positive rate and accuracy offered by the method are 76.32%, 1.99% and 96.28%, respectively, for the DRIVE database and 72.82%, 2.6% and 96.16%, respectively, for the STARE database. Simulation results demonstrate that the proposed method outperforms existing methods in detecting the various types of vessels and the neovascularization structures.
The combination of curvelet transform and tunable bandpass filter is found to be highly effective in edge enhancement, whereas fuzzy conditional entropy efficiently distinguishes vessels of different widths.
A Boltzmann machine for the organization of intelligent machines
NASA Technical Reports Server (NTRS)
Moed, Michael C.; Saridis, George N.
1990-01-01
A three-tier structure consisting of organization, coordination, and execution levels forms the architecture of an intelligent machine, following the principle of increasing precision with decreasing intelligence from hierarchically intelligent control. This system has been formulated as a probabilistic model, where uncertainty and imprecision can be expressed in terms of entropies. The optimal strategy for decision planning and task execution can be found by minimizing the total entropy in the system. The focus is on the design of the organization level as a Boltzmann machine. Since this level is responsible for planning the actions of the machine, the Boltzmann machine is reformulated to use entropy as the cost function to be minimized. Simulated annealing, expanding subinterval random search, and the genetic algorithm are presented as search techniques for efficiently finding the desired action sequence, and are illustrated with numerical examples.
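To illustrate the search step, here is a generic simulated-annealing loop of the kind compared above. The cost function is a toy quadratic, not the machine's entropy function, and all names and parameter values are mine; a minimal sketch, not the paper's implementation:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Minimize `cost` by accepting uphill moves with probability exp(-dE/T)."""
    rng = random.Random(seed)
    x, e = x0, cost(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        ey = cost(y)
        # Always accept improvements; accept worse moves with Boltzmann prob.
        if ey <= e or rng.random() < math.exp((e - ey) / t):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t *= cooling          # geometric cooling schedule
    return best, best_e

# Toy problem: minimize x^2 over the integers, starting far from the optimum.
best, best_e = simulated_annealing(lambda x: x * x,
                                   lambda x, rng: x + rng.choice((-1, 1)),
                                   x0=40)
```

At high temperature the walk explores freely; as T decays, only cost-reducing moves survive, which is what lets the search escape local minima that a greedy descent would be stuck in.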
Entropy production and rectification efficiency in colloid transport along a pulsating channel
NASA Astrophysics Data System (ADS)
Florencia Carusela, M.; Rubi, J. Miguel
2018-06-01
We study the current rectification of particles moving in a pulsating channel under the influence of an applied force. We have shown the existence of different rectification scenarios in which entropic and energetic effects compete. The effect can be quantified by means of a rectification coefficient that is analyzed in terms of the force, the frequency and the diffusion coefficient. The energetic cost of the motion of the particles expressed in terms of the entropy production depends on the importance of the entropic contribution to the total force. Rectification is more important at low values of the applied force when entropic effects become dominant. In this regime, the entropy production is not invariant under reversal of the applied force. The phenomenon observed could be used to optimize transport in microfluidic devices or in biological channels.
ERIC Educational Resources Information Center
Rose, Michael T.; Crossan, Angus N.; Kennedy, Ivan R.
2008-01-01
Consideration of the property of action is proposed to provide a more meaningful definition of efficient energy use and sustainable production in ecosystems. Action has physical dimensions similar to angular momentum, its magnitude varying with mass, spatial configuration and relative motion. In this article, the relationship of action to…
Enzyme catalysis by entropy without Circe effect.
Kazemi, Masoud; Himo, Fahmi; Åqvist, Johan
2016-03-01
Entropic effects have often been invoked to explain the extraordinary catalytic power of enzymes. In particular, the hypothesis that enzymes can use part of the substrate-binding free energy to reduce the entropic penalty associated with the subsequent chemical transformation has been very influential. The enzymatic reaction of cytidine deaminase appears to be a distinct example. Here, substrate binding is associated with a significant entropy loss that closely matches the activation entropy penalty for the uncatalyzed reaction in water, whereas the activation entropy for the rate-limiting catalytic step in the enzyme is close to zero. Herein, we report extensive computer simulations of the cytidine deaminase reaction and its temperature dependence. The energetics of the catalytic reaction is first evaluated by density functional theory calculations. These results are then used to parametrize an empirical valence bond description of the reaction, which allows efficient sampling by molecular dynamics simulations and computation of Arrhenius plots. The thermodynamic activation parameters calculated by this approach are in excellent agreement with experimental data and indeed show an activation entropy close to zero for the rate-limiting transition state. However, the origin of this effect is a change of reaction mechanism compared to the uncatalyzed reaction. The enzyme operates by hydroxide ion attack, which is intrinsically associated with a favorable activation entropy. Hence, this has little to do with utilization of binding free energy to pay the entropic penalty but rather reflects how a preorganized active site can stabilize a reaction path that is not operational in solution.
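The "computation of Arrhenius plots" step amounts to a linear fit of ln(k/T) against 1/T (the Eyring form), whose slope and intercept give the activation enthalpy and entropy. A sketch with synthetic rate data; the parameter values are hypothetical illustrations, not the paper's results:

```python
import numpy as np

KB = 1.380649e-23     # Boltzmann constant, J/K
H  = 6.62607015e-34   # Planck constant, J*s
R  = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(T, dH, dS):
    """Transition-state-theory rate constant k(T) in 1/s."""
    return (KB * T / H) * np.exp(dS / R) * np.exp(-dH / (R * T))

def fit_activation_params(T, k):
    """Fit ln(k/T) = ln(kB/h) + dS/R - (dH/R)(1/T); recover dH and dS."""
    slope, intercept = np.polyfit(1.0 / T, np.log(k / T), 1)
    dH = -slope * R
    dS = (intercept - np.log(KB / H)) * R
    return dH, dS

T = np.linspace(280.0, 320.0, 9)           # temperatures, K
k = eyring_rate(T, dH=65e3, dS=-10.0)      # hypothetical activation parameters
dH_fit, dS_fit = fit_activation_params(T, k)
```

On noise-free synthetic rates the fit recovers dH and dS essentially exactly; an activation entropy near zero would show up as an intercept close to ln(kB/h).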
Optimization of pressure gauge locations for water distribution systems using entropy theory.
Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon
2012-12-01
It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective and quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes as a result of demand change at a specific node using entropy theory. Two cases are considered in terms of demand change: that in which demand at all nodes shows peak load by using a peak factor and that comprising the demand change of the normal distribution whose average is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes practically at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information it gives to (giving entropy) and receives from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing the sensitivity analysis based on the study results. These analysis results support the following two conclusions. Firstly, the installation priority of the pressure gauge in water distribution networks can be determined with a more objective standard through the entropy theory. Secondly, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
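The giving/receiving entropy ranking described above can be sketched as follows: given a matrix of pressure-change magnitudes (row i = demand change at node i, column j = resulting pressure change at node j), normalize rows and columns into probability distributions and rank nodes by the sum of the two entropies. The matrix below is made up for illustration, not one of the paper's networks:

```python
import numpy as np

def shannon_bits(p):
    """Shannon entropy of a probability vector, in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def node_entropies(S):
    """S[i, j] = |pressure change at node j per demand change at node i|."""
    giving    = np.array([shannon_bits(row / row.sum()) for row in S])
    receiving = np.array([shannon_bits(col / col.sum()) for col in S.T])
    return giving, receiving

# Hypothetical 4-node sensitivity matrix.
S = np.array([[4.0, 2.0, 1.0, 1.0],
              [2.0, 4.0, 1.0, 1.0],
              [1.0, 1.0, 3.0, 1.0],
              [1.0, 1.0, 1.0, 3.0]])
giving, receiving = node_entropies(S)
ranking = np.argsort(-(giving + receiving))   # best gauge candidates first
```

A node whose demand changes spread information broadly (high giving entropy) and that reflects changes from many nodes (high receiving entropy) is the most informative place for a gauge.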
Estimating Bayesian Phylogenetic Information Content
Lewis, Paul O.; Chen, Ming-Hui; Kuo, Lynn; Lewis, Louise A.; Fučíková, Karolina; Neupane, Suman; Wang, Yu-Bo; Shi, Daoyuan
2016-01-01
Measuring the phylogenetic information content of data has a long history in systematics. Here we explore a Bayesian approach to information content estimation. The entropy of the posterior distribution compared with the entropy of the prior distribution provides a natural way to measure information content. If the data have no information relevant to ranking tree topologies beyond the information supplied by the prior, the posterior and prior will be identical. Information in data discourages consideration of some hypotheses allowed by the prior, resulting in a posterior distribution that is more concentrated (has lower entropy) than the prior. We focus on measuring information about tree topology using marginal posterior distributions of tree topologies. We show that both the accuracy and the computational efficiency of topological information content estimation improve with use of the conditional clade distribution, which also allows topological information content to be partitioned by clade. We explore two important applications of our method: providing a compelling definition of saturation and detecting conflict among data partitions that can negatively affect analyses of concatenated data. [Bayesian; concatenation; conditional clade distribution; entropy; information; phylogenetics; saturation.] PMID:27155008
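The prior-versus-posterior entropy comparison can be made concrete with a toy example over the 15 unrooted topologies of a 5-taxon problem; the posterior probabilities below are invented for illustration, not estimates from any data set:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def information_content(prior, posterior):
    """Topological information: entropy reduction from prior to posterior."""
    return entropy_bits(prior) - entropy_bits(posterior)

n_topologies = 15                       # unrooted binary trees on 5 taxa
prior = np.full(n_topologies, 1.0 / n_topologies)
# A posterior concentrated on a few topologies (hypothetical values).
posterior = np.array([0.70, 0.20, 0.05] + [0.05 / 12] * 12)
info = information_content(prior, posterior)
```

An uninformative data set leaves the posterior equal to the prior, so the information is 0; here the posterior is far more concentrated, so the information is a sizeable fraction of the log2(15) ≈ 3.91 bits available.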
Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony
2018-01-01
Quantized dynamical entropy (QDE) has recently been proposed as a new measure for quantifying the complexity of dynamical systems, with the purpose of offering better computational efficiency. This paper further investigates the viability of this method using five different human gait signals. These signals were recorded during normal walking and while performing secondary tasks, in two age groups (young and older). The results are compared with the outcomes of the previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented and spatially and temporally normalized signals differs from analyzing the whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in the mediolateral direction is the best signal for showing the gait changes. Moreover, the results suggest that segmenting the data yields more information about intrastride dynamical features. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.
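For reference, the SampEn baseline used above has a compact plug-in form. This is a minimal sketch (Chebyshev distance, self-matches excluded); parameter names and defaults are mine, and production implementations handle edge cases (no matches at length m+1) that this sketch does not:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) = -ln(A/B): A, B = template matches at lengths m+1, m."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()                 # tolerance as a fraction of std
    def matches(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            c += int(np.sum(d <= r))
        return c
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b)
```

A more regular, predictable signal keeps its matches as templates lengthen (A/B near 1, SampEn near 0), while a noisy signal loses them (large SampEn); this is the sense in which cognitively loaded gait "becomes more complex".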
Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Parsani, Matteo; Fisher, Travis C.; Nielsen, Eric J.
2016-01-01
Entropy stable (SS) discontinuous spectral collocation formulations of any order are developed for the compressible Navier-Stokes equations on hexahedral elements. Recent progress on two complementary efforts is presented. The first effort is a generalization of previous SS spectral collocation work to extend the applicable set of points from tensor product, Legendre-Gauss-Lobatto (LGL) to tensor product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Although being more costly to implement, it is shown that the LG operators are significantly more accurate on comparable grids. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort generalizes previous SS work to include the possibility of p-refinement at non-conforming interfaces. A generalization of existing entropy stability machinery is developed to accommodate the nuances of fully multi-dimensional summation-by-parts (SBP) operators. The entropy stability of the compressible Euler equations on non-conforming interfaces is demonstrated using the newly developed LG operators and multi-dimensional interface interpolation operators.
Symbolic phase transfer entropy method and its application
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-10-01
In this paper, we introduce symbolic phase transfer entropy (SPTE) to infer the direction and strength of information flow among systems. The advantages of the proposed method are investigated by simulations on synthetic signals and real-world data. We demonstrate that symbolic phase transfer entropy is a robust and efficient tool for inferring the information flow between complex systems. Based on the study of the synthetic data, we find that a significant advantage of SPTE is its reduced sensitivity to noise. In addition, SPTE requires less data than symbolic transfer entropy (STE). We analyze the direction and strength of information flow between six stock markets during the period from 2006 to 2016. The results indicate that the information flow among stocks varies over different periods. We also find that the interaction network pattern among stocks undergoes hierarchical reorganization with the transition from one period to another. It is shown that the clusters are mainly classified according to period, and then by region. Stocks from the same time period are shown to fall into the same cluster.
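A plug-in sketch of symbolic transfer entropy: ordinal-symbolize each series, then estimate TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log2[p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)] by counting. This is the basic amplitude-based STE estimator, not the authors' phase-based SPTE variant (which symbolizes instantaneous phases); all names and the toy coupled system are mine:

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Ordinal symbols: the rank pattern of each length-m window."""
    x = np.asarray(x, dtype=float)
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def transfer_entropy(sx, sy):
    """TE(X -> Y) in bits from symbol sequences (plug-in estimator)."""
    n = len(sy) - 1
    triples = Counter(zip(sy[1:], sy[:-1], sx[:-1]))
    c_yy    = Counter(zip(sy[1:], sy[:-1]))
    c_yx    = Counter(zip(sy[:-1], sx[:-1]))
    c_y     = Counter(sy[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        te += (c / n) * np.log2(c * c_y[y0] / (c_yy[(y1, y0)] * c_yx[(y0, x0)]))
    return te

# Toy coupled pair: y is driven by x with a one-step lag.
rng = np.random.default_rng(1)
x = rng.standard_normal(2001)
y = np.empty_like(x)
y[0] = 0.0
y[1:] = x[:-1] + 0.1 * rng.standard_normal(2000)
sx, sy = symbolize(x), symbolize(y)
```

Because y follows x with a lag, the estimated TE(X→Y) clearly exceeds TE(Y→X), which is how the stock-market analysis assigns a direction to each pairwise flow.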
Multibody local approximation: Application to conformational entropy calculations on biomolecules
NASA Astrophysics Data System (ADS)
Suárez, Ernesto; Suárez, Dimas
2012-08-01
Multibody type expansions like mutual information expansions are widely used for computing or analyzing properties of large composite systems. The power of such expansions stems from their generality. Their weaknesses, however, are the large computational cost of including high order terms due to the combinatorial explosion and the fact that truncation errors do not decrease strictly with the expansion order. Herein, we take advantage of the redundancy of multibody expansions in order to derive an efficient reformulation that captures implicitly all-order correlation effects within a given cutoff, avoiding the combinatory explosion. This approach, which is cutoff dependent rather than order dependent, keeps the generality of the original expansions and simultaneously mitigates their limitations provided that a reasonable cutoff can be used. An application of particular interest can be the computation of the conformational entropy of flexible peptide molecules from molecular dynamics trajectories. By combining the multibody local estimations of conformational entropy with average values of the rigid-rotor and harmonic-oscillator entropic contributions, we obtain by far a tighter upper bound of the absolute entropy than the one obtained by the broadly used quasi-harmonic method.
Entropic methods applied to the inverse problem in magnetoencephalography
NASA Astrophysics Data System (ADS)
Lapalme, Ervig
2005-07-01
This thesis is devoted to biomagnetic source localization using magnetoencephalography. This problem is known to have an infinite number of solutions, so methods are required to take into account anatomical and functional information about the solution. The work presented in this thesis uses the maximum entropy on the mean method to constrain the solution. This method originates from statistical mechanics and information theory. This thesis is divided into two main parts containing three chapters each. The first part reviews the magnetoencephalographic inverse problem: the theory needed to understand its context and the hypotheses for simplifying the problem. In the last chapter of this first part, the maximum entropy on the mean method is presented: its origins are explained, and also how it is applied to our problem. The second part is the original work of this thesis, presenting three articles: one already published and two others submitted for publication. In the first article, a biomagnetic source model is developed and applied in a theoretical context, demonstrating the efficiency of the method. In the second article, we go one step further towards a realistic modeling of the cerebral activation. The main priors are estimated using the magnetoencephalographic data. This method proved to be very efficient in realistic simulations. In the third article, the previous method is extended to deal with time signals, thus exploiting the excellent time resolution offered by magnetoencephalography. Compared with our previous work, the temporal method is applied to real magnetoencephalographic data coming from a somatotopy experiment, and the results agree with previous physiological knowledge about this kind of cognitive process.
Revisiting Feynman's ratchet with thermoelectric transport theory.
Apertet, Y; Ouerdane, H; Goupil, C; Lecoeur, Ph
2014-07-01
We show how the formalism used for thermoelectric transport may be adapted to Smoluchowski's seminal thought experiment, also known as Feynman's ratchet and pawl system. Our analysis rests on the notion of useful flux, which for a thermoelectric system is the electrical current and for Feynman's ratchet is the effective jump frequency. Our approach yields original insight into the derivation and analysis of the system's properties. In particular we define an entropy per tooth in analogy with the entropy per carrier or Seebeck coefficient, and we derive the analog to Kelvin's second relation for Feynman's ratchet. Owing to the formal similarity between the heat fluxes balance equations for a thermoelectric generator (TEG) and those for Feynman's ratchet, we introduce a distribution parameter γ that quantifies the amount of heat that flows through the cold and hot sides of both heat engines. While it is well established that γ = 1/2 for a TEG, it is equal to 1 for Feynman's ratchet. This implies that no heat may be rejected in the cold reservoir for the latter case. Further, the analysis of the efficiency at maximum power shows that the so-called Feynman efficiency corresponds to that of an exoreversible engine, with γ = 1. Then, turning to the nonlinear regime, we generalize the approach based on the convection picture and introduce two different types of resistance to distinguish the dynamical behavior of the considered system from its ability to dissipate energy. We finally put forth the strong similarity between the original Feynman ratchet and a mesoscopic thermoelectric generator with a single conducting channel.
Silveira, Vladímir de Aquino; Souza, Givago da Silva; Gomes, Bruno Duarte; Rodrigues, Anderson Raiol; Silveira, Luiz Carlos de Lima
2014-01-01
We used psychometric functions to estimate the joint entropy for space discrimination and spatial frequency discrimination. Space discrimination was taken as discrimination of spatial extent. Seven subjects were tested. Gábor functions comprising unidimensional sinusoidal gratings (0.4, 2, and 10 cpd) and bidimensional Gaussian envelopes (1°) were used as reference stimuli. The experiment comprised the comparison between reference and test stimuli that differed in the grating's spatial frequency or the envelope's standard deviation. We tested 21 different envelope standard deviations around the reference standard deviation to study spatial extent discrimination and 19 different grating spatial frequencies around the reference spatial frequency to study spatial frequency discrimination. Two series of psychometric functions were obtained for 2%, 5%, 10%, and 100% stimulus contrast. The psychometric function data points for spatial extent discrimination or spatial frequency discrimination were fitted with Gaussian functions using the least-squares method, and the spatial extent and spatial frequency entropies were estimated from the standard deviation of these Gaussian functions. Then, joint entropy was obtained by multiplying the square root of the space extent entropy times the spatial frequency entropy. We compared our results to the theoretical minimum for unidimensional Gábor functions, 1/4π or 0.0796. At low and intermediate spatial frequencies and high contrasts, joint entropy reached levels below the theoretical minimum, suggesting non-linear interactions between two or more visual mechanisms. We concluded that non-linear interactions of visual pathways, such as the M and P pathways, could explain joint entropy values below the theoretical minimum at low and intermediate spatial frequencies and high contrasts.
These non-linear interactions might be at work at intermediate and high contrasts at all spatial frequencies once there was a substantial decrease in joint entropy for these stimulus conditions when contrast was raised. PMID:24466158
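The estimation pipeline described above (Gaussian fits to psychometric data points, discrimination spreads from the fitted standard deviations, and a joint value compared against the 1/4π Gábor bound) can be sketched on synthetic data. This is a minimal illustration, not the authors' code: the fitted σ stands in for the entropy estimate, the product σ_x·σ_f for the joint uncertainty, and all stimulus values and spreads are invented for the example.

```python
import numpy as np

def gaussian_sigma(x, y):
    """Least-squares fit of y = exp(-(x-mu)^2 / (2 sigma^2)) via a
    log-parabola fit; returns the fitted sigma."""
    c2, c1, c0 = np.polyfit(x, np.log(np.clip(y, 1e-9, None)), 2)
    return np.sqrt(-1.0 / (2.0 * c2))

# Synthetic psychometric data points (proportion of "same" responses).
x_space = np.linspace(-0.5, 0.5, 21)           # envelope s.d. offsets (deg)
y_space = np.exp(-x_space**2 / (2 * 0.12**2))
x_freq = np.linspace(-1.0, 1.0, 19)            # spatial-frequency offsets (cpd)
y_freq = np.exp(-x_freq**2 / (2 * 0.30**2))

sigma_x = gaussian_sigma(x_space, y_space)
sigma_f = gaussian_sigma(x_freq, y_freq)
joint = sigma_x * sigma_f                      # compare with 1/(4*pi) ~ 0.0796
print(joint, 1 / (4 * np.pi))
```

With exact Gaussian data the fit recovers the generating spreads; on real psychometric data the fit would be noisy and the comparison to 1/(4π) is the interesting part.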
The Evolution of Gas Giant Entropy During Formation by Runaway Accretion
NASA Astrophysics Data System (ADS)
Berardo, David; Cumming, Andrew; Marleau, Gabriel-Dominique
2017-01-01
We calculate the evolution of gas giant planets during the runaway gas accretion phase of formation, to understand how the luminosity of young giant planets depends on the accretion conditions. We construct steady-state envelope models, and run time-dependent simulations of accreting planets with the code Modules for Experiments in Stellar Astrophysics. We show that the evolution of the internal entropy depends on the contrast between the internal adiabat and the entropy of the accreted material, parametrized by the shock temperature T_0 and pressure P_0. At low temperatures (T_0 ≲ 300-1000 K, depending on model parameters), the accreted material has a lower entropy than the interior. The convection zone extends to the surface and can drive a high luminosity, leading to rapid cooling and cold starts. For higher temperatures, the accreted material has a higher entropy than the interior, giving a radiative zone that stalls cooling. For T_0 ≳ 2000 K, the surface-interior entropy contrast cannot be accommodated by the radiative envelope, and the accreted matter accumulates with high entropy, forming a hot start. The final state of the planet depends on the shock temperature, accretion rate, and starting entropy at the onset of runaway accretion. Cold starts with L ≲ 5×10⁻⁶ L_⊙ require low accretion rates and starting entropy, and the temperature of the accreting material needs to be maintained close to the nebula temperature. If instead the temperature is near the value required to radiate the accretion luminosity, 4πR²σT_0⁴ ∼ GMṀ/R, as suggested by previous work on radiative shocks in the context of star formation, gas giant planets form in a hot start with L ∼ 10⁻⁴ L_⊙.
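As a rough numerical check of the hot-start criterion, one can solve 4πR²σT₀⁴ = GMṀ/R for the shock temperature T₀. The mass, radius, and accretion rate below are illustrative assumptions for a runaway-accreting protoplanet, not values taken from the paper:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
sigma = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
M_J, R_J = 1.898e27, 7.149e7   # Jupiter mass (kg) and radius (m)
M_E, yr = 5.972e24, 3.156e7    # Earth mass (kg), year (s)

M = 1.0 * M_J                  # protoplanet mass (illustrative)
R = 2.0 * R_J                  # accreting radius (illustrative)
Mdot = 1e-2 * M_E / yr         # runaway accretion rate (illustrative)

L_acc = G * M * Mdot / R       # accretion luminosity, W
T0 = (L_acc / (4 * math.pi * R**2 * sigma)) ** 0.25
print(f"T0 = {T0:.0f} K")      # ~3.3e3 K: in the hot-start regime (T0 > 2000 K)
```

For these assumed values the balance temperature lands above the ~2000 K threshold quoted in the abstract, consistent with a hot start.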
It is not the entropy you produce, rather, how you produce it
Volk, Tyler; Pauluis, Olivier
2010-01-01
The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise the entropy production rate. Our analysis makes explicit fundamental questions for MEP that should be brought into focus: can MEP predict not just the overall state of entropy production of a system but also the details of the sub-systems of dissipaters within the system? Which fluxes of the system are those that are most likely to be maximized? How is it possible for MEP theory to be so domain-neutral that it can claim to apply equally to both purely physical–chemical systems and also systems governed by the ‘laws’ of biological evolution? We conclude that the principle of MEP needs to take on the issue of exactly how entropy is produced. PMID:20368249
Symmetry for the duration of entropy-consuming intervals.
García-García, Reinaldo; Domínguez, Daniel
2014-05-01
We introduce the violation fraction υ as the cumulative fraction of time that a mesoscopic system spends consuming entropy along a single trajectory in phase space. We show that the fluctuations of this quantity are described by a symmetry relation, reminiscent of fluctuation theorems, involving a function Φ that can be interpreted as an entropy associated with the fluctuations of the violation fraction. The function Φ, when evaluated for arbitrary stochastic realizations of the violation fraction, is odd under the symmetry transformations that are relevant for the associated stochastic entropy production. This fact leads to a detailed fluctuation theorem for the probability density function of Φ. We study the steady-state limit of this symmetry in the paradigmatic case of a colloidal particle dragged by optical tweezers through an aqueous solution. Finally, we briefly discuss possible applications of our results for the estimation of free-energy differences from single-molecule experiments.
NASA Astrophysics Data System (ADS)
Wang, Bingjie; Pi, Shaohua; Sun, Qi; Jia, Bo
2015-05-01
An improved classification algorithm based on multiscale wavelet packet Shannon entropy is proposed. Decomposition coefficients at all levels are obtained to build the initial Shannon entropy feature vector. After subtracting the Shannon entropy map of the background signal, the components with the strongest discriminating power in the initial feature vector are picked out to rebuild the Shannon entropy feature vector, which is fed to a radial basis function (RBF) neural network for classification. Four types of man-made vibrational intrusion signals are recorded based on a modified Sagnac interferometer. The performance of the improved classification algorithm has been evaluated by classification experiments via the RBF neural network under different diffusion coefficients. An 85% classification accuracy rate is achieved, which is higher than that of other common algorithms. The classification results show that this improved classification algorithm can be used to classify vibrational intrusion signals in an automatic real-time monitoring system.
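A minimal sketch of the entropy feature construction, assuming a Haar wavelet packet in place of whatever mother wavelet the authors used, and taking the Shannon entropy of normalized coefficient energies at the deepest decomposition level only; the background and intrusion signals are synthetic:

```python
import numpy as np

def haar_step(x):
    # One Haar analysis step: approximation and detail halves.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_packet_nodes(x, levels):
    # Full wavelet-packet tree: return the leaf-node coefficient arrays.
    nodes = [x]
    for _ in range(levels):
        nodes = [half for n in nodes for half in haar_step(n)]
    return nodes

def shannon_entropy(c):
    # Shannon entropy of the normalized coefficient energies of one node.
    p = c**2 / np.sum(c**2)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_feature_vector(signal, levels=3):
    return np.array([shannon_entropy(n)
                     for n in wavelet_packet_nodes(signal, levels)])

rng = np.random.default_rng(0)
background = rng.normal(size=1024)
intrusion = background + np.sin(2 * np.pi * 60 * np.arange(1024) / 1024)
# Background-subtracted feature vector, as in the abstract.
feature = entropy_feature_vector(intrusion) - entropy_feature_vector(background)
print(feature.shape)
```

The abstract's feature selection and RBF classification stages would operate on vectors like `feature`; they are omitted here.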
2017-01-01
It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota and evaluate entropy transfer between all pairs of residues of Ubiquitin and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that time delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins. PMID:28095404
Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.
Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei
2014-01-01
Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, and naval ships, and also on oil drilling platforms. However, in most cases they are monitored with no personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflect the overall state of the gas paths of the gas turbine. In addition, we also extend the entropy to compute the information quantity of features in kernel spaces, which helps to select the informative features for a certain recognition task. Finally, we introduce the information entropy based decision tree algorithm to extract rules from fault samples. The experiments on some real-world data show the effectiveness of the proposed algorithms.
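The exhaust-temperature uniformity idea can be illustrated with plain Shannon entropy: treating the normalized temperatures as a probability vector, a uniform profile maximizes the entropy and a hot gas path lowers it. The temperature readings below are hypothetical, and the kernelized extension from the abstract is not reproduced:

```python
import numpy as np

def uniformity_entropy(temps):
    """Shannon entropy of normalized exhaust temperatures; equals log2(n)
    for a perfectly uniform profile and drops when one gas path runs hot."""
    p = np.asarray(temps, dtype=float)
    p = p / p.sum()
    return -np.sum(p * np.log2(p))

healthy = [790, 805, 798, 801, 795, 803]   # degrees C, hypothetical readings
faulty = [790, 805, 950, 801, 795, 700]    # hot-section anomaly, hypothetical
print(uniformity_entropy(healthy), np.log2(6))
print(uniformity_entropy(faulty))
```

A drop of the entropy below its near-maximal healthy baseline would flag a non-uniform exhaust profile for further diagnosis.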
NASA Astrophysics Data System (ADS)
Boyack, Rufus; Guo, Hao; Levin, K.
2015-03-01
Recent experiments on both unitary Fermi gases and high-temperature superconductors (arXiv:1410.4835 [cond-mat.quant-gas], arXiv:1409.5820 [cond-mat.str-el]) have led to renewed interest in near perfect fluidity in condensed matter systems. This is quantified by studying the ratio of shear viscosity to entropy density. In this talk we present calculations of this ratio in homogeneous bosonic and fermionic superfluids, with the latter ranging from BCS to BEC. While the shear viscosity exhibits a power law (for bosons) or exponential suppression (for fermions), a similar dependence is found for the respective entropy densities. As a result, strict BCS and (true) bosonic superfluids have an analogous viscosity to entropy density ratio, behaving linearly with temperature times the (T-dependent) dissipation rate; this is characteristic of imperfect fluidity in weakly coupled fluids. This is contrasted with the behavior of fermions at unitarity, which we argue is a consequence of additional terms in the entropy density, thereby leading to more perfect fluidity. (arXiv:1407.7572v1 [cond-mat.quant-gas])
Force-Time Entropy of Isometric Impulse.
Hsieh, Tsung-Yu; Newell, Karl M
2016-01-01
The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993). Two experiments in an isometric single finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that the peak force variability increased either with the increment of force level or through a shorter time to peak force that also reduced timing error variability. The peak force entropy and entropy of time to peak force increased on the respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework with the joint force-time entropy at a minimum in the middle parameter range of discrete impulse.
Entropy of space-time outcome in a movement speed-accuracy task.
Hsieh, Tsung-Yu; Pacheco, Matheus Maia; Newell, Karl M
2015-12-01
The experiment reported was set up to investigate the space-time entropy of movement outcome as a function of a range of spatial (10, 20 and 30 cm) and temporal (250-2500 ms) criteria in a discrete aiming task. The variability and information entropy of the movement spatial and temporal errors considered separately increased and decreased on the respective dimension as a function of an increment of movement velocity. However, the joint space-time entropy was lowest when the relative contribution of spatial and temporal task criteria was comparable (i.e., mid-range of space-time constraints), and it increased with a greater trade-off between spatial or temporal task demands, revealing a U-shaped function across space-time task criteria. The traditional speed-accuracy functions of spatial error and temporal error considered independently mapped to this joint space-time U-shaped entropy function. The trade-off in movement tasks with joint space-time criteria is between spatial error and timing error, rather than movement speed and accuracy.
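One way to make a joint space-time entropy concrete, assuming the spatial and temporal errors are modeled as a bivariate Gaussian (the paper's exact estimator may differ), is the closed-form Gaussian differential entropy ½ log₂((2πe)² det Σ) fitted to the observed error samples. The error spreads below are invented for the example:

```python
import numpy as np

def joint_gaussian_entropy(spatial_err, temporal_err):
    # Differential entropy (bits) of a bivariate Gaussian fitted to the
    # observed spatial (cm) and temporal (ms) movement errors.
    cov = np.cov(np.vstack([spatial_err, temporal_err]))
    return 0.5 * np.log2((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))

rng = np.random.default_rng(1)
# Slow movements: small spatial error, large timing error (synthetic).
slow = joint_gaussian_entropy(rng.normal(0, 0.2, 500), rng.normal(0, 60, 500))
# Fast movements: large spatial error, small timing error (synthetic).
fast = joint_gaussian_entropy(rng.normal(0, 1.5, 500), rng.normal(0, 15, 500))
print(slow, fast)
```

Sweeping the speed-accuracy conditions and plotting this joint value against the space-time criteria would trace out the U-shaped function described in the abstract.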
NASA Astrophysics Data System (ADS)
Kollet, S. J.
2015-05-01
In this study, entropy production optimization and inference principles are applied to a synthetic semi-arid hillslope in high-resolution, physics-based simulations. The results suggest that entropy or power is indeed maximized, because of the strong nonlinearity of variably saturated flow and competing processes related to soil moisture fluxes, the depletion of gradients, and the movement of a free water table. Thus, it appears that the maximum entropy production (MEP) principle may indeed be applicable to hydrologic systems. In the application to hydrologic systems, the free water table constitutes an important degree of freedom in the optimization of entropy production and may also relate the theory to actual observations. In an ensuing analysis, an attempt is made to transfer the complex, "microscopic" hillslope model into a macroscopic model of reduced complexity, using the MEP principle as an inference tool to obtain effective conductance coefficients and forces/gradients. The results demonstrate a new approach for the application of MEP to hydrologic systems and may form the basis for fruitful discussions and research in the future.
Nighttime images fusion based on Laplacian pyramid
NASA Astrophysics Data System (ADS)
Wu, Cong; Zhan, Jinhao; Jin, Jicheng
2018-02-01
This paper describes average weighted fusion, image pyramid fusion, and wavelet transform methods, and applies them to the fusion of multiple-exposure nighttime images. By calculating the information entropy and cross entropy of the fused images, we can evaluate the effect of the different fusion methods. Experiments showed that the Laplacian pyramid image fusion algorithm is well suited to nighttime image fusion: it reduces halo artifacts while preserving image details.
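A compact sketch of Laplacian pyramid fusion with an entropy-based evaluation. It uses a box-filter downsample and nearest-neighbour upsample instead of the usual Gaussian kernel, max-absolute detail selection with an averaged base, and random arrays standing in for the under- and over-exposed frames; none of this is taken from the paper's implementation:

```python
import numpy as np

def down(img):   # 2x2 box-filter downsample
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):     # nearest-neighbour upsample
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    pyr = []
    for _ in range(levels):
        small = down(img)
        pyr.append(img - up(small))   # detail layer
        img = small
    pyr.append(img)                   # coarse base
    return pyr

def fuse(a, b, levels=3):
    pa, pb = laplacian_pyramid(a, levels), laplacian_pyramid(b, levels)
    # Max-absolute detail selection; averaged base layer.
    fused = [(la + lb) / 2 if i == levels
             else np.where(np.abs(la) >= np.abs(lb), la, lb)
             for i, (la, lb) in enumerate(zip(pa, pb))]
    out = fused[-1]
    for lap in reversed(fused[:-1]):
        out = up(out) + lap           # collapse the pyramid
    return out

def image_entropy(img, bins=256):
    hist, _ = np.histogram(img, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(2)
under = rng.uniform(0.0, 0.3, (64, 64))   # stand-in underexposed frame
over = rng.uniform(0.7, 1.0, (64, 64))    # stand-in overexposed frame
print(image_entropy(fuse(under, over)))
```

The information entropy of the fused result is the evaluation metric mentioned in the abstract; cross entropy against each source frame could be added the same way.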
On the morphological instability of a bubble during inertia-controlled growth
NASA Astrophysics Data System (ADS)
Martyushev, L. M.; Birzina, A. I.; Soboleva, A. S.
2018-06-01
The morphological stability of a spherical bubble growing under inertia control is analyzed. Based on the comparison of entropy productions for a distorted and undistorted surface and using the maximum entropy production principle, the morphological instability of the bubble under arbitrary amplitude distortions is shown. This result allows explaining a number of experiments where the surface roughness of bubbles was observed during their explosive-type growth.
Psychoacoustic entropy theory and its implications for performance practice
NASA Astrophysics Data System (ADS)
Strohman, Gregory J.
This dissertation attempts to motivate, derive and imply potential uses for a generalized perceptual theory of musical harmony called psychoacoustic entropy theory. This theory treats the human auditory system as a physical system which takes acoustic measurements. As a result, the human auditory system is subject to all the appropriate uncertainties and limitations of other physical measurement systems. This is the theoretic basis for defining psychoacoustic entropy. Psychoacoustic entropy is a numerical quantity which indexes the degree to which the human auditory system perceives instantaneous disorder within a sound pressure wave. Chapter one explains the importance of harmonic analysis as a tool for performance practice. It also outlines the critical limitations for many of the most influential historical approaches to modeling harmonic stability, particularly when compared to available scientific research in psychoacoustics. Rather than analyze a musical excerpt, psychoacoustic entropy is calculated directly from sound pressure waves themselves. This frames psychoacoustic entropy theory in the most general possible terms as a theory of musical harmony, enabling it to be invoked for any perceivable sound. Chapter two provides and examines many widely accepted mathematical models of the acoustics and psychoacoustics of these sound pressure waves. Chapter three introduces entropy as a precise way of measuring perceived uncertainty in sound pressure waves. Entropy is used, in combination with the acoustic and psychoacoustic models introduced in chapter two, to motivate the mathematical formulation of psychoacoustic entropy theory. Chapter four shows how to use psychoacoustic entropy theory to analyze certain types of musical harmonies, while chapter five applies the analytical tools developed in chapter four to two short musical excerpts to influence their interpretation. Almost every form of harmonic analysis invokes some degree of mathematical reasoning.
However, the limited scope of most harmonic systems used for Western common practice music greatly simplifies the necessary level of mathematical detail. Psychoacoustic entropy theory requires a greater degree of mathematical complexity due to its sheer scope as a generalized theory of musical harmony. Fortunately, under specific assumptions the theory can take on vastly simpler forms. Psychoacoustic entropy theory appears to be highly compatible with the latest scientific research in psychoacoustics. However, the theory itself should be regarded as a hypothesis and this dissertation an experiment in progress. The evaluation of psychoacoustic entropy theory as a scientific theory of human sonic perception must wait for more rigorous future research.
NASA Astrophysics Data System (ADS)
Ronglian, Yuan; Mingye, Ai; Qiaona, Jia; Yuxuan, Liu
2018-03-01
Sustainable development is the only way forward for human society. As an important part of the national economy, the steel industry is an energy-intensive industry and needs to go further toward sustainable development. In this paper, we use the entropy method and the TOPSIS method to evaluate the development of China’s steel industry during the “12th Five-Year Plan” from four aspects: resource utilization efficiency, main energy and material consumption, pollution status, and resource reuse rate. We also put forward some suggestions for the development of China’s steel industry.
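The entropy method assigns indicator weights from the dispersion of each indicator across the years, and TOPSIS then ranks the years by closeness to the ideal point. A minimal sketch with an invented indicator matrix (benefit-type indicators only, already scaled; the paper's actual indicators and data are not reproduced):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: rows are years, columns are indicators."""
    P = X / X.sum(axis=0)
    n = X.shape[0]
    E = -np.sum(P * np.log(np.clip(P, 1e-12, None)), axis=0) / np.log(n)
    d = 1 - E                       # degree of divergence per indicator
    return d / d.sum()              # normalized weights

def topsis(X, w):
    V = w * X / np.linalg.norm(X, axis=0)       # weighted normalized matrix
    best, worst = V.max(axis=0), V.min(axis=0)  # ideal and anti-ideal points
    d_pos = np.linalg.norm(V - best, axis=1)
    d_neg = np.linalg.norm(V - worst, axis=1)
    return d_neg / (d_pos + d_neg)              # closeness to the ideal

# Hypothetical yearly indicator matrix, one row per year of the plan period.
X = np.array([[0.62, 0.55, 0.40, 0.70],
              [0.66, 0.58, 0.45, 0.72],
              [0.71, 0.60, 0.52, 0.75],
              [0.75, 0.66, 0.58, 0.80],
              [0.80, 0.70, 0.65, 0.83]])
scores = topsis(X, entropy_weights(X))
print(scores)
```

Since every indicator improves monotonically in this toy matrix, later years score closer to the ideal; cost-type indicators would need a sign flip or reciprocal transform before weighting.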
Thermodynamics of Aryl-Dihydroxyphenyl-Thiadiazole Binding to Human Hsp90
Kazlauskas, Egidijus; Petrikaitė, Vilma; Michailovienė, Vilma; Revuckienė, Jurgita; Matulienė, Jurgita; Grinius, Leonas; Matulis, Daumantas
2012-01-01
The design of specific inhibitors against the Hsp90 chaperone and other enzymes relies on the detailed and correct understanding of both the thermodynamics of inhibitor binding and the structural features of the protein-inhibitor complex. Here we present a detailed thermodynamic study of the binding of an aryl-dihydroxyphenyl-thiadiazole inhibitor series to the recombinant human Hsp90 alpha isozyme. The inhibitors are highly potent, with intrinsic Kd values of approximately 1 nM as determined by isothermal titration calorimetry (ITC) and thermal shift assay (TSA). Dissection of protonation contributions yielded the intrinsic thermodynamic parameters of binding, such as the enthalpy, entropy, Gibbs free energy, and heat capacity. The differences in binding thermodynamic parameters between the series of inhibitors revealed contributions of the functional groups, thus providing insight into molecular reasons for improved or diminished binding efficiency. The inhibitor binding to Hsp90 alpha primarily depended on a large favorable enthalpic contribution combined with a smaller favorable entropic contribution, thus suggesting that their binding was both enthalpically and entropically optimized. The enthalpy-entropy compensation phenomenon was highly evident when comparing the inhibitor binding enthalpies and entropies. This study illustrates how detailed thermodynamic analysis helps to understand energetic reasons for binding efficiency and to develop more potent inhibitors that could be applied for therapeutic use as Hsp90 inhibitors. PMID:22655030
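The link between the measured Kd and the thermodynamic dissection follows from ΔG = RT ln Kd and ΔG = ΔH − TΔS. The enthalpy below is a hypothetical stand-in for an ITC measurement, chosen only to mirror the abstract's pattern of a large favorable enthalpy plus a smaller favorable entropy; it is not a value from the paper:

```python
import math

R = 8.314      # gas constant, J/(mol K)
T = 298.15     # temperature, K

def binding_dG(Kd):
    # Gibbs free energy of binding from the dissociation constant (in M).
    return R * T * math.log(Kd)   # J/mol, negative for Kd < 1 M

dG = binding_dG(1e-9)             # intrinsic Kd ~ 1 nM, as in the abstract
dH = -40e3                        # hypothetical ITC enthalpy, J/mol
TdS = dH - dG                     # entropic term from dG = dH - T*dS
print(dG / 1000, TdS / 1000)      # kJ/mol: large favorable dH, smaller favorable TdS
```

For a 1 nM binder, ΔG is about −51 kJ/mol at 25 °C; with the assumed enthalpy, the entropic term comes out favorable (positive TΔS), matching the pattern the abstract describes.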
NASA Astrophysics Data System (ADS)
Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing
2018-01-01
For fast and more effective tracking of multiple targets in a cluttered environment, we propose a multiple target tracking (MTT) algorithm called maximum entropy fuzzy c-means clustering joint probabilistic data association, which combines fuzzy c-means clustering with the joint probabilistic data association (PDA) algorithm. The algorithm uses the membership value to express the probability of a measurement originating from a target. The membership value is obtained through a fuzzy c-means clustering objective function optimized by the maximum entropy principle. To account for the effect of measurements shared among targets, we use a correction factor to adjust the association probability matrix used to estimate the state of each target. As this algorithm avoids confirmation matrix splitting, it can solve the high computational load problem of the joint PDA algorithm. The results of simulations and analysis conducted for tracking neighboring parallel targets and crossing targets in cluttered environments of different densities show that the proposed algorithm can realize MTT quickly and efficiently in a cluttered environment. Further, the performance of the proposed algorithm remains constant with increasing process noise variance. The proposed algorithm has the advantages of efficiency and low computational load, which can ensure optimum performance when tracking multiple targets in a dense cluttered environment.
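The maximum-entropy membership update has a simple Gibbs form: maximizing the membership entropy subject to the expected within-cluster distance gives u_ij ∝ exp(−d_ij²/λ). A two-cluster sketch on synthetic data (the JPDA machinery, correction factor, and state estimation from the abstract are omitted):

```python
import numpy as np

def me_fcm(X, lam=1.0, iters=50):
    """Maximum-entropy fuzzy c-means sketch for two clusters: memberships
    take the Gibbs form exp(-d^2/lam), the entropy-regularized optimum."""
    # Simple deterministic initialization at the data extremes (two clusters).
    centers = np.array([X.min(axis=0), X.max(axis=0)])
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        U = np.exp(-d2 / lam)
        U /= U.sum(axis=1, keepdims=True)           # rows sum to 1
        centers = (U.T @ X) / U.sum(axis=0)[:, None]  # membership-weighted means
    return U, centers

rng = np.random.default_rng(3)
# Two synthetic measurement clouds standing in for two targets.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
U, centers = me_fcm(X)
print(np.round(centers, 2))
```

In the full algorithm these memberships would play the role of association probabilities between measurements and targets; λ controls how soft the assignments are.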
NASA Astrophysics Data System (ADS)
Açıkkalp, Emin; Caner, Necmettin
2015-09-01
In this study, a nano-scale irreversible Brayton cycle operating with quantum gases, including Bose and Fermi gases, is investigated. Developments in nano-technology make the study of nano-scale machines, including thermal systems, unavoidable. A thermodynamic analysis of a nano-scale irreversible Brayton cycle operating with Bose and Fermi gases was performed, with particular attention to the exergetic sustainability index. In addition, the analysis covered classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies. Results are presented numerically and some useful recommendations are made. Some important results are: entropy generation and the exergetic sustainability index are the quantities most affected by x for the Bose gas, while power output and exergy output are those most affected by x for the Fermi gas. At high temperatures, work output and entropy generation have high values compared with other degeneracy conditions.
NASA Astrophysics Data System (ADS)
Fodor, Petru; Vyhnalek, Brian; Kaufman, Miron
2013-03-01
We investigate mixing in Dean flows by solving numerically the Navier-Stokes equations for a circular channel. Tracers of two chemical species are carried by the fluid. The centrifugal forces experienced as the fluid travels along a curved trajectory, coupled with the fluid's incompressibility, induce cross-sectional rotating flows (Dean vortices). These transversal flows promote the mixing of the chemical species. We generate images for different cross sections along the trajectory. The mixing efficiency is evaluated using the Shannon entropy. Previously [P. S. Fodor and M. Kaufman, Modern Physics Letters B 25, 1111 (2011)] we found this measure to be useful in understanding mixing in the staggered herringbone mixer. The mixing entropy is determined as a function of the Reynolds number, the angle of the cross section, and the observation scale (number of bins). A quantitative comparison of mixing in the Dean micromixer and in the staggered herringbone mixer is attempted.
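A bin-based Shannon mixing entropy of the kind described can be sketched as follows. The tiling scheme and per-box averaging are plausible assumptions for illustration, not necessarily the paper's exact procedure; the two concentration fields are synthetic.

```python
import numpy as np

def mixing_entropy(conc_a, conc_b, n_bins=16):
    """Shannon mixing entropy of two species over a cross-section.

    conc_a, conc_b: 2-D arrays of species concentrations on a grid.
    The cross-section is tiled into n_bins x n_bins boxes; in each
    box the entropy -sum(p ln p) of the species fractions is computed
    and the box entropies are averaged.
    """
    h, w = conc_a.shape
    bh, bw = h // n_bins, w // n_bins
    s_total = 0.0
    for i in range(n_bins):
        for j in range(n_bins):
            a = conc_a[i*bh:(i+1)*bh, j*bw:(j+1)*bw].sum()
            b = conc_b[i*bh:(i+1)*bh, j*bw:(j+1)*bw].sum()
            tot = a + b
            if tot == 0:
                continue  # empty box carries no entropy
            for p in (a / tot, b / tot):
                if p > 0:
                    s_total -= p * np.log(p)
    return s_total / n_bins**2  # average entropy per box

# perfectly mixed: every cell holds both species equally -> entropy ln 2
mixed = np.ones((64, 64))
S_mixed = mixing_entropy(mixed, mixed, n_bins=8)

# fully segregated: species A in left half, B in right half -> entropy 0
a = np.zeros((64, 64)); a[:, :32] = 1.0
S_seg = mixing_entropy(a, 1.0 - a, n_bins=8)
```

The entropy ranges from 0 (segregated at the observation scale) up to ln 2 (fully mixed), and depends on the number of bins, as the abstract notes.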
Conflict management based on belief function entropy in sensor fusion.
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Wireless sensor networks play an important role in intelligent navigation. They incorporate a group of sensors to overcome the limitations of a single detection system. Dempster-Shafer evidence theory can combine the sensor data of a wireless sensor network by data fusion, which contributes to improving the accuracy and reliability of the detection system. However, because the sensors differ in type and origin, there may be conflict among the sensor data in uncertain environments. Thus, this paper proposes a new method combining Deng entropy and evidence distance to address this issue. First, Deng entropy is adopted to measure the uncertain information. Then, evidence distance is applied to measure the conflict degree. The new method can cope with conflict effectively and improve the accuracy and reliability of the detection system. An example illustrates the efficiency of the new method, and the result is compared with those of existing methods.
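Deng entropy generalizes Shannon entropy to basic belief assignments: Ed(m) = -Σ m(A) log2(m(A)/(2^|A| - 1)) over focal elements A, so multi-element focal sets contribute extra uncertainty. A minimal sketch of the measure itself (the fusion pipeline of the paper is not reproduced):

```python
import math

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment.

    bba maps focal elements (frozensets of hypotheses) to masses.
    For singleton-only BBAs it reduces to Shannon entropy; larger
    focal sets add uncertainty via the (2^|A| - 1) denominator.
    """
    e = 0.0
    for a, m in bba.items():
        if m > 0:
            e -= m * math.log2(m / (2 ** len(a) - 1))
    return e

# mass split between a singleton and a two-element focal set
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
E1 = deng_entropy(m1)

# pure singleton BBA reduces to Shannon entropy: exactly 1 bit here
m2 = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
E2 = deng_entropy(m2)
```

A fully certain assignment (all mass on one singleton) gives zero entropy, while spreading mass over larger focal sets increases it.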
An entropy correction method for unsteady full potential flows with strong shocks
NASA Technical Reports Server (NTRS)
Whitlow, W., Jr.; Hafez, M. M.; Osher, S. J.
1986-01-01
An entropy correction method for the unsteady full potential equation is presented. The unsteady potential equation is modified to account for entropy jumps across shock waves. The conservative form of the modified equation is solved in generalized coordinates using an implicit, approximate factorization method. A flux-biasing differencing method, which generates the proper amounts of artificial viscosity in supersonic regions, is used to discretize the flow equations in space. Comparisons between the present method and solutions of the Euler equations and between the present method and experimental data are presented. The comparisons show that the present method more accurately models solutions of the Euler equations and experiment than does the isentropic potential formulation.
Least action and entropy considerations of self-organization in Benard cells
NASA Astrophysics Data System (ADS)
Georgiev, Georgi; Iannacchione, Germano
We study self-organization in complex systems using first principles in physics. Our approach involves the principle of least action and the second law of thermodynamics. In far-from-equilibrium systems, energy gradients cause internal ordering that facilitates the dissipation of energy into the environment. This internal ordering decreases the internal entropy of the system in order to obey the principle of least action, minimizing the product of time and energy for transport through the system. We consider the connection between action and entropy decrease inside Benard cells in order to derive some general features of self-organization, develop a mathematical treatment of this coupling, and compare it to results from experiments and simulations.
Dynamics of entanglement entropy of interacting fermions in a 1D driven harmonic trap
NASA Astrophysics Data System (ADS)
McKenney, Joshua R.; Porter, William J.; Drut, Joaquín E.
2018-03-01
Following up on a recent analysis of two cold atoms in a time-dependent harmonic trap in one dimension, we explore the entanglement entropy of two and three fermions in the same situation when driven through a parametric resonance. We find that the presence of such a resonance in the two-particle system leaves a clear imprint on the entanglement entropy. We show how the signal is modified by attractive and repulsive contact interactions, and how it remains present for the three-particle system. Additionally, we extend the work of recent experiments to demonstrate how restricting observation to a limited subsystem gives rise to locally thermal behavior.
Rost, Christina M.; Sachet, Edward; Borman, Trent; Moballegh, Ali; Dickey, Elizabeth C.; Hou, Dong; Jones, Jacob L.; Curtarolo, Stefano; Maria, Jon-Paul
2015-01-01
Configurational disorder can be compositionally engineered into mixed oxides by populating a single sublattice with many distinct cations. These formulations promote novel and entropy-stabilized forms of crystalline matter where metal cations are incorporated in new ways. Here, through rigorous experiments, a simple thermodynamic model, and a five-component oxide formulation, we demonstrate beyond reasonable doubt that entropy predominates the thermodynamic landscape, and drives a reversible solid-state transformation between a multiphase and single-phase state. In the latter, cation distributions are proven to be random and homogeneous. The findings validate the hypothesis that deliberate configurational disorder provides an orthogonal strategy to imagine and discover new phases of crystalline matter and untapped opportunities for property engineering. PMID:26415623
NASA Astrophysics Data System (ADS)
Hari, Yvonne; Dugovič, Branislav; Istrate, Alena; Fignolé, Annabel; Leumann, Christian J.; Schürch, Stefan
2016-07-01
Tricyclo-DNA (tcDNA) is a sugar-modified analogue of DNA currently tested for the treatment of Duchenne muscular dystrophy in an antisense approach. Tandem mass spectrometry plays a key role in modern medical diagnostics and has become a widespread technique for the structure elucidation and quantification of antisense oligonucleotides. Herein, mechanistic aspects of the fragmentation of tcDNA are discussed, which lay the basis for reliable sequencing and quantification of the antisense oligonucleotide. Excellent selectivity of tcDNA for complementary RNA is demonstrated in direct competition experiments. Moreover, the kinetic stability and fragmentation pattern of matched and mismatched tcDNA heteroduplexes were investigated and compared with non-modified DNA and RNA duplexes. Although the separation of the constituting strands is the entropy-favored fragmentation pathway of all nucleic acid duplexes, it was found to be only a minor pathway of tcDNA duplexes. The modified hybrid duplexes preferentially undergo neutral base loss and backbone cleavage. This difference is due to the low activation entropy for the strand dissociation of modified duplexes that arises from the conformational constraint of the tc-sugar-moiety. The low activation entropy results in a relatively high free activation enthalpy for the dissociation comparable to the free activation enthalpy of the alternative reaction pathway, the release of a nucleobase. The gas-phase behavior of tcDNA duplexes illustrates the impact of the activation entropy on the fragmentation kinetics and suggests that tandem mass spectrometric experiments are not suited to determine the relative stability of different types of nucleic acid duplexes.
NASA Astrophysics Data System (ADS)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. The second law of thermodynamics is used to determine the irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
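The second-law bookkeeping behind this kind of component screening can be illustrated with a textbook case: the entropy generated when two water streams mix adiabatically, converted to lost work via the Gouy-Stodola relation I = T0 * S_gen. The numbers below are illustrative assumptions, not ECLSS data.

```python
import math

# Entropy generation of adiabatic mixing of two liquid-water streams,
# assuming constant heat capacity. Illustrative values only.
c_p = 4186.0            # J/(kg K), liquid water
T0 = 298.15             # K, environment (dead-state) temperature
m_hot, T_hot = 1.0, 350.0    # kg, K
m_cold, T_cold = 2.0, 290.0  # kg, K

# energy balance fixes the mixed temperature
T_mix = (m_hot * T_hot + m_cold * T_cold) / (m_hot + m_cold)

# second law: S_gen = sum_i m_i c_p ln(T_mix / T_i) > 0
S_gen = c_p * (m_hot * math.log(T_mix / T_hot)
               + m_cold * math.log(T_mix / T_cold))

# Gouy-Stodola: work potential destroyed by the irreversibility
I_lost = T0 * S_gen
```

Ranking components by S_gen (or equivalently by I_lost) is exactly the selection criterion the abstract describes: the component generating the least entropy destroys the least work potential.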
Pasta, Mauro; Wessells, Colin D; Cui, Yi; La Mantia, Fabio
2012-02-08
Water desalination is an important approach for providing fresh water around the world, although its high energy consumption, and thus high cost, call for new, efficient technology. Here, we demonstrate the novel concept of a "desalination battery", which operates by performing cycles in reverse on our previously reported mixing entropy battery. Rather than generating electricity from salinity differences, as in mixing entropy batteries, desalination batteries use an electrical energy input to extract sodium and chloride ions from seawater and to generate fresh water. The desalination battery consists of a Na(2-x)Mn(5)O(10) nanorod positive electrode and a Ag/AgCl negative electrode. Here, we demonstrate an energy consumption of 0.29 Wh l(-1) for the removal of 25% salt using this novel desalination battery, which is promising when compared to reverse osmosis (~0.2 Wh l(-1)), the most efficient technique presently available. © 2012 American Chemical Society
Thermodynamically efficient solar concentrators
NASA Astrophysics Data System (ADS)
Winston, Roland
2012-10-01
Non-imaging Optics is the theory of thermodynamically efficient optics and as such depends more on thermodynamics than on optics. Hence in this paper a condition for the "best" design is proposed based on purely thermodynamic arguments, which we believe has profound consequences for design of thermal and even photovoltaic systems. This new way of looking at the problem of efficient concentration depends on probabilities, the ingredients of entropy and information theory while "optics" in the conventional sense recedes into the background.
Pagel, Anna; Arieta, Alejandro Hernandez; Riener, Robert; Vallery, Heike
2016-10-01
Despite recent advances in leg prosthetics, transfemoral amputees still experience limitations in postural control and gait symmetry. It has been hypothesized that artificial sensory information might improve the integration of the prosthesis into the human sensory-motor control loops and, thus, reduce these limitations. In three transfemoral amputees, we investigated the effect of Electrotactile Moving Sensation for Sensory Augmentation (EMSSA) without training and present preliminary findings. Experimental conditions included standing with open/closed eyes on stable/unstable ground as well as treadmill walking. For standing conditions, spatiotemporal posturographic measures and sample entropy were derived from the center of pressure. For walking conditions, step length and stance duration were calculated. Conditions without feedback showed effects congruent with findings in the literature, e.g., asymmetric weight bearing and step length, and validated the collected data. During standing, EMSSA tended to influence postural control negatively: postural control was less effective and less efficient, and the prosthetic leg was less involved. Sample entropy tended to decrease, suggesting that EMSSA demanded increased attention. During walking, no persistent positive effect of EMSSA was found. This contrasts with the positive subjective assessment and the positive effect on one subject's step length.
NASA Astrophysics Data System (ADS)
Xu, Xuefang; Qiao, Zijian; Lei, Yaguo
2018-03-01
The presence of repetitive transients in vibration signals is a typical symptom of local faults of rotating machinery. The infogram was developed to extract repetitive transients from vibration signals based on Shannon entropy. Unfortunately, the Shannon entropy is maximized for random processes and is unable to quantify repetitive transients buried in heavy random noise. In addition, vibration signals always contain multiple intrinsic oscillatory modes due to interaction and coupling effects between machine components. Under these circumstances, high values of Shannon entropy appear in several frequency bands, or a high value of Shannon entropy does not appear in the optimal frequency band, and the infogram becomes difficult to interpret. Thus, it also becomes difficult to select the optimal frequency band for extracting the repetitive transients from the whole frequency range. To solve these problems, the multiscale fractional order entropy (MSFE) infogram is proposed in this paper. With the help of the MSFE infogram, the complexity and nonlinear signatures of vibration signals can be evaluated by quantifying spectral entropy over a range of scales in the fractional domain. Moreover, the similarity tolerance of the MSFE infogram is helpful for assessing the regularity of signals. A simulation and two experiments concerning a locomotive bearing and a wind turbine gear are used to validate the MSFE infogram. The results demonstrate that the MSFE infogram is more robust to heavy noise than the infogram and that high values appear only in the optimal frequency band for repetitive transient extraction.
Optimized Kernel Entropy Components.
Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau
2017-06-01
This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
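The baseline the brief extends can be sketched directly: standard KECA builds a Gaussian kernel matrix, eigendecomposes it, and ranks components by their contribution to the Renyi entropy estimate, lambda_i * (1^T e_i)^2, rather than by eigenvalue as in kernel PCA. This is a sketch of plain KECA only (not OKECA's extra rotation), and the kernel width `sigma` is assumed given.

```python
import numpy as np

def keca(X, n_components=2, sigma=1.0):
    """Kernel entropy component analysis (standard KECA, not OKECA).

    Components are ranked by their Renyi-entropy contribution
    lambda_i * (sum of e_i)^2 instead of by eigenvalue.
    """
    # Gaussian (RBF) kernel matrix
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))
    lam, E = np.linalg.eigh(K)           # ascending eigenvalues
    lam, E = lam[::-1], E[:, ::-1]       # descending order
    entropy_contrib = lam * E.sum(axis=0) ** 2
    order = np.argsort(entropy_contrib)[::-1][:n_components]
    # projections: eigenvectors scaled by sqrt of their eigenvalues
    return E[:, order] * np.sqrt(np.clip(lam[order], 0.0, None))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
Z = keca(X, n_components=2)
```

The entropy-based ranking can select different components than variance-based kernel PCA would, which is the point the brief builds on.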
LSD-induced entropic brain activity predicts subsequent personality change.
Lebedev, A V; Kaelen, M; Lövdén, M; Nilsson, J; Feilding, A; Nutt, D J; Carhart-Harris, R L
2016-09-01
Personality is known to be relatively stable throughout adulthood. Nevertheless, it has been shown that major life events with high personal significance, including experiences engendered by psychedelic drugs, can have an enduring impact on some core facets of personality. In the present, balanced-order, placebo-controlled study, we investigated biological predictors of post-lysergic acid diethylamide (LSD) changes in personality. Nineteen healthy adults underwent resting state functional MRI scans under LSD (75µg, I.V.) and placebo (saline I.V.). The Revised NEO Personality Inventory (NEO-PI-R) was completed at screening and 2 weeks after LSD/placebo. Scanning sessions consisted of three 7.5-min eyes-closed resting-state scans, one of which involved music listening. A standardized preprocessing pipeline was used to extract measures of sample entropy, which characterizes the predictability of an fMRI time-series. Mixed-effects models were used to evaluate drug-induced shifts in brain entropy and their relationship with the observed increases in the personality trait openness at the 2-week follow-up. Overall, LSD had a pronounced global effect on brain entropy, increasing it in both sensory and hierarchically higher networks across multiple time scales. These shifts predicted enduring increases in trait openness. Moreover, the predictive power of the entropy increases was greatest for the music-listening scans and when "ego-dissolution" was reported during the acute experience. These results shed new light on how LSD-induced shifts in brain dynamics and concomitant subjective experience can be predictive of lasting changes in personality. Hum Brain Mapp 37:3203-3213, 2016. © 2016 Wiley Periodicals, Inc.
Coupled-Double-Quantum-Dot Environmental Information Engines: A Numerical Analysis
NASA Astrophysics Data System (ADS)
Tanabe, Katsuaki
2016-06-01
We conduct numerical simulations of an autonomous information engine comprising a set of coupled double quantum dots, using a simple model. The steady-state entropy production rate in each component and the heat and electron transfer rates are calculated via the probability distribution of the four electronic states from the master transition-rate equations. We define an information-engine efficiency based on the entropy change of the reservoir, pointing toward power generators that employ environmental order as a new energy resource. We derive device-design principles for the realization of corresponding practical energy converters, including that (1) energy levels of the detector-side reservoir higher than those of the detector dot provide significantly higher work production rates through faster circulation of states, (2) the efficiency depends strongly on the relative temperatures of the detector and system sides and becomes high in a particular region of Coulomb-interaction strength between the quantum dots, and (3) the efficiency depends little on the system dot's energy level relative to its reservoir but strongly on the antisymmetric relative amplitudes of the electronic tunneling rates.
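The computational core of such an analysis, solving master transition-rate equations for a steady state and evaluating its entropy production rate, can be sketched generically. The 4-state rate matrix below is a hypothetical biased cycle, not the paper's coupled-dot model.

```python
import numpy as np

def steady_state(W):
    """Steady state of a master equation dp/dt = W p.

    W[i, j] (i != j) is the transition rate j -> i; diagonals are
    set so that each column sums to zero (probability conservation).
    """
    W = W.astype(float).copy()
    np.fill_diagonal(W, 0.0)
    np.fill_diagonal(W, -W.sum(axis=0))
    A = W.copy()
    A[-1, :] = 1.0                 # replace one row by normalization
    b = np.zeros(len(W)); b[-1] = 1.0
    return np.linalg.solve(A, b)

def entropy_production_rate(W, p):
    """Steady-state entropy production rate (k_B = 1):
    (1/2) sum_{i,j} (W_ij p_j - W_ji p_i) ln(W_ij p_j / (W_ji p_i))."""
    W = W.astype(float).copy()
    np.fill_diagonal(W, 0.0)
    s = 0.0
    n = len(W)
    for i in range(n):
        for j in range(n):
            if i != j and W[i, j] > 0 and W[j, i] > 0:
                Jf, Jb = W[i, j] * p[j], W[j, i] * p[i]
                s += 0.5 * (Jf - Jb) * np.log(Jf / Jb)
    return s

# hypothetical 4-state cycle: forward rate 2, backward rate 1
W = np.array([[0., 1., 0., 2.],
              [2., 0., 1., 0.],
              [0., 2., 0., 1.],
              [1., 0., 2., 0.]])
p = steady_state(W)
sigma = entropy_production_rate(W, p)
```

By symmetry the steady state here is uniform, and the bias of the cycle (rate ratio 2:1) yields a strictly positive entropy production rate, the signature of a genuinely non-equilibrium engine.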
Efficient Bayesian experimental design for contaminant source identification
NASA Astrophysics Data System (ADS)
Zhang, Jiangjiang; Zeng, Lingzao; Chen, Cheng; Chen, Dingjiang; Wu, Laosheng
2015-01-01
In this study, an efficient full Bayesian approach is developed for the optimal sampling well location design and source parameters identification of groundwater contaminants. An information measure, i.e., the relative entropy, is employed to quantify the information gain from concentration measurements in identifying unknown parameters. In this approach, the sampling locations that give the maximum expected relative entropy are selected as the optimal design. After the sampling locations are determined, a Bayesian approach based on Markov Chain Monte Carlo (MCMC) is used to estimate unknown parameters. In both the design and estimation, the contaminant transport equation is required to be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on the adaptive sparse grid is utilized to construct a surrogate for the contaminant transport equation. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. It is shown that the methods can be used to assist in both single sampling location and monitoring network design for contaminant source identifications in groundwater.
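The design criterion, expected relative entropy (KL divergence) from prior to posterior, can be illustrated on a deliberately tiny 1-D problem. The exponential signal model, grid discretization, and candidate locations below are assumptions for illustration; the paper uses a contaminant transport equation and a sparse-grid surrogate instead.

```python
import numpy as np

# Unknown source position theta on a grid; a well at x measures
# g(theta, x) = exp(-|x - theta|) plus Gaussian noise. The design
# score of x is the expected KL divergence from prior to posterior,
# averaged over the prior predictive distribution of the data.
theta = np.linspace(0.0, 10.0, 101)          # parameter grid
prior = np.ones_like(theta) / len(theta)     # flat prior
noise = 0.05                                 # measurement noise std

def expected_kl(x, n_y=201):
    mu = np.exp(-np.abs(x - theta))          # predicted signal per theta
    ys = np.linspace(mu.min() - 4*noise, mu.max() + 4*noise, n_y)
    dy = ys[1] - ys[0]
    ekl = 0.0
    for y in ys:
        like = np.exp(-0.5 * ((y - mu) / noise) ** 2)
        evid = (like * prior).sum()
        if evid <= 0:
            continue
        post = like * prior / evid
        nz = post > 0
        kl = (post[nz] * np.log(post[nz] / prior[nz])).sum()
        # weight KL by the prior-predictive density of y
        ekl += kl * evid * dy / (noise * np.sqrt(2 * np.pi))
    return ekl

candidates = [0.0, 5.0, 20.0]                # hypothetical well sites
scores = {x: expected_kl(x) for x in candidates}
best = max(scores, key=scores.get)           # maximum-information design
```

A well far from every possible source (x = 20) measures almost pure noise and scores near zero, so the criterion correctly rejects it; the informative candidates close to the source region score much higher.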
Revisiting the European sovereign bonds with a permutation-information-theory approach
NASA Astrophysics Data System (ADS)
Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.
2013-12-01
In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenizing the stochastic characteristics of the time series and synchronized their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.
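The first of the two quantifiers, Bandt-Pompe permutation entropy, is straightforward to compute and can be sketched as follows; the embedding order and delay are the usual free parameters, and the test series are synthetic.

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series.

    Each length-`order` window is mapped to the permutation that
    sorts it; the Shannon entropy of the permutation histogram,
    normalized by log(order!), is 1 for white noise and 0 for a
    monotone series.
    """
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        w = x[i:i + (order - 1) * delay + 1:delay]
        pattern = tuple(np.argsort(w))   # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n
    h = -(probs * np.log(probs)).sum()
    return h / log(factorial(order)) if normalize else h

rng = np.random.default_rng(1)
h_noise = permutation_entropy(rng.normal(size=5000), order=3)
h_trend = permutation_entropy(np.arange(5000, dtype=float), order=3)
```

In the complexity-entropy causality plane of the paper, an informationally efficient (random-walk-like) market sits near maximal permutation entropy, while structured dynamics pull it away from that corner.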
Photon ratchet intermediate band solar cells
NASA Astrophysics Data System (ADS)
Yoshida, M.; Ekins-Daukes, N. J.; Farrell, D. J.; Phillips, C. C.
2012-06-01
In this paper, we propose an innovative concept for solar power conversion—the "photon ratchet" intermediate band solar cell (IBSC)—which may increase the photovoltaic energy conversion efficiency of IBSCs by increasing the lifetime of charge carriers in the intermediate state. The limiting efficiency calculation for this concept shows that the efficiency can be increased by introducing a fast thermal transition of carriers into a non-emissive state. At 1 sun, the introduction of a "ratchet band" results in an increase of efficiency from 46.8% to 48.5%, due to suppression of entropy generation.
Stochastic thermodynamics, fluctuation theorems and molecular machines.
Seifert, Udo
2012-12-01
Stochastic thermodynamics as reviewed here systematically provides a framework for extending the notions of classical thermodynamics such as work, heat and entropy production to the level of individual trajectories of well-defined non-equilibrium ensembles. It applies whenever a non-equilibrium process is still coupled to one (or several) heat bath(s) of constant temperature. Paradigmatic systems are single colloidal particles in time-dependent laser traps, polymers in external flow, enzymes and molecular motors in single molecule assays, small biochemical networks and thermoelectric devices involving single electron transport. For such systems, a first-law like energy balance can be identified along fluctuating trajectories. For a basic Markovian dynamics implemented either on the continuum level with Langevin equations or on a discrete set of states as a master equation, thermodynamic consistency imposes a local-detailed balance constraint on noise and rates, respectively. Various integral and detailed fluctuation theorems, which are derived here in a unifying approach from one master theorem, constrain the probability distributions for work, heat and entropy production depending on the nature of the system and the choice of non-equilibrium conditions. For non-equilibrium steady states, particularly strong results hold like a generalized fluctuation-dissipation theorem involving entropy production. Ramifications and applications of these concepts include optimal driving between specified states in finite time, the role of measurement-based feedback processes and the relation between dissipation and irreversibility. Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.
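One of the paradigmatic systems named above, a colloidal particle in a time-dependent (here, dragged) harmonic trap, also gives a compact numerical check of an integral fluctuation theorem. Since the trap stiffness never changes, Delta F = 0 and the Jarzynski relation reduces to <exp(-W)> = 1 even though <W> > 0. The parameter values are arbitrary assumptions; units are chosen so that gamma = k_B T = 1.

```python
import numpy as np

# Overdamped Langevin dynamics in a dragged harmonic trap:
#   dx = -k (x - u t) dt + sqrt(2) dW,   U(x, t) = (k/2)(x - u t)^2
# Jarzynski work increment: dW_J = (dU/dt) dt = -k u (x - u t) dt.
rng = np.random.default_rng(42)
k, u, dt, t_end, n_traj = 1.0, 0.5, 0.002, 2.0, 5000
n_steps = int(t_end / dt)

x = rng.normal(0.0, 1.0 / np.sqrt(k), size=n_traj)  # equilibrium start
W = np.zeros(n_traj)
t = 0.0
for _ in range(n_steps):
    W += -k * u * (x - u * t) * dt                  # accumulate work
    x += -k * (x - u * t) * dt + np.sqrt(2 * dt) * rng.normal(size=n_traj)
    t += dt

mean_W = W.mean()        # positive on average (second law)
ift = np.exp(-W).mean()  # integral fluctuation theorem: close to 1
```

The exponential average is dominated by rare trajectories with negative work, which is exactly the point of the fluctuation theorems: second-law behavior holds only on average, with quantifiable exceptions.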
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecker, Christian; Grumiller, Daniel; Stanzer, Philipp
In this paper, we study the time evolution of 2-point functions and entanglement entropy in strongly anisotropic, inhomogeneous and time-dependent N = 4 super Yang-Mills theory in the large N and large 't Hooft coupling limit using AdS/CFT. On the gravity side this amounts to calculating the length of geodesics and the area of extremal surfaces in the dynamical background of two colliding gravitational shockwaves, which we do numerically. We discriminate between three classes of initial conditions corresponding to wide, intermediate and narrow shocks, and show that they exhibit different phenomenology with respect to the nonlocal observables that we determine. Our results permit the use of (holographic) entanglement entropy as an order parameter to distinguish between the two phases of the cross-over from the transparency to the full-stopping scenario in dynamical Yang-Mills plasma formation, which is frequently used as a toy model for heavy ion collisions. The time evolution of entanglement entropy allows one to discern four regimes: highly efficient initial growth of entanglement, linear growth, (post) collisional drama, and late time (polynomial) fall off. Surprisingly, we found that 2-point functions can be sensitive to the geometry inside the black hole apparent horizon, while we did not find such cases for the entanglement entropy.
NASA Astrophysics Data System (ADS)
Michelon, M. F.; Antonelli, A.
2010-03-01
We have developed a methodology to study the thermodynamics of order-disorder transformations in n-component substitutional alloys that combines nonequilibrium methods, which can efficiently compute free energies, with Monte Carlo simulations, in which configurational and vibrational degrees of freedom are simultaneously considered on an equal footing. Furthermore, with this methodology one can easily perform simulations in the canonical and in the isobaric-isothermal ensembles, which allows the investigation of the bulk volume effect. We have applied this methodology to calculate configurational and vibrational contributions to the entropy of the Ni3Al alloy as functions of temperature. The simulations show that when the volume of the system is kept constant, the vibrational entropy does not change upon transition, while constant-pressure calculations indicate that the volume increase at the order-disorder transition causes a vibrational entropy increase of 0.08 kB/atom. This is significant when compared to the configurational entropy increase of 0.27 kB/atom. Our calculations also indicate that the inclusion of vibrations reduces by about 30% the order-disorder transition temperature determined solely from the configurational degrees of freedom.
NASA Astrophysics Data System (ADS)
Tušek, Jaka; Engelbrecht, Kurt; Mañosa, Lluis; Vives, Eduard; Pryds, Nini
2016-12-01
This paper presents direct and indirect methods for studying the elastocaloric effect (eCE) in shape memory materials and compares them. The eCE can be characterized by the adiabatic temperature change or the isothermal entropy change (both as a function of applied stress/strain). These quantities can be obtained either by direct methods, in which one measures (adiabatic) temperature changes, or by indirect methods, in which one measures the stress-strain-temperature characteristics of the materials and from these deduces the adiabatic temperature and isothermal entropy changes. The deduction relies on basic thermodynamic relations, i.e. the Maxwell relation and the Clausius-Clapeyron equation. This paper further presents basic thermodynamic properties of shape memory materials, such as the adiabatic temperature change, isothermal entropy change and total entropy-temperature diagrams (all as a function of temperature and applied stress/strain) of two groups of materials (Ni-Ti and Cu-Zn-Al alloys) obtained using indirect methods through phenomenological modelling and the Maxwell relation. In the last part of the paper, the efficiency of the elastocaloric thermodynamic cycle (coefficient of performance) is defined and discussed.
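The indirect route via the Maxwell relation, (dS/d sigma)_T = (d eps/d T)_sigma per unit volume, amounts to differentiating measured strain data over temperature and integrating over stress. A numerical sketch on a synthetic eps(T, sigma) surface (a placeholder, not measured shape-memory-alloy data):

```python
import numpy as np

# Indirect isothermal entropy change from stress-strain-temperature
# data:  Delta s(T, sigma) = integral_0^sigma (d eps/d T)_sigma' d sigma'
T = np.linspace(280.0, 340.0, 61)        # K
sigma = np.linspace(0.0, 500e6, 101)     # Pa
TT, SS = np.meshgrid(T, sigma, indexing='ij')

# toy strain surface, linear in both stress and temperature
eps = 0.04 * SS / 500e6 * (1.0 + 0.002 * (TT - 280.0))

deps_dT = np.gradient(eps, T, axis=0)    # (d eps/d T)_sigma

# cumulative trapezoid over sigma -> Delta s(T, sigma) in J/(m^3 K)
ds = np.concatenate(
    [np.zeros((len(T), 1)),
     np.cumsum((deps_dT[:, 1:] + deps_dT[:, :-1]) / 2.0
               * np.diff(sigma), axis=1)],
    axis=1)
delta_s_max = ds[:, -1]                  # entropy change at max stress
```

For this linear toy surface the integral evaluates to 2.0e4 J/(m^3 K) at every temperature, which makes the numerics easy to verify; real eCE data would of course produce a temperature-dependent Delta s near the transformation.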
Thermodynamics of photon-enhanced thermionic emission solar cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reck, Kasper, E-mail: kasper.reck@nanotech.dtu.dk; Hansen, Ole, E-mail: ole.hansen@nanotech.dtu.dk; CINF Center for Individual Nanoparticle Functionality, Technical University of Denmark, Kgs. Lyngby 2800
2014-01-13
Photon-enhanced thermionic emission (PETE) cells in which direct photon energy as well as thermal energy can be harvested have recently been suggested as a new candidate for high efficiency solar cells. Here, we present an analytic thermodynamical model for evaluation of the efficiency of PETE solar cells including an analysis of the entropy production due to thermionic emission of general validity. The model is applied to find the maximum efficiency of a PETE cell for given cathode and anode work functions and temperatures.
Mei, Zhanyong; Ivanov, Kamen; Zhao, Guoru; Li, Huihui; Wang, Lei
2017-04-01
In the study of the biomechanics of different foot types, temporal or spatial parameters derived from plantar pressure are often used. However, there has been no comparative study of the complexity and regularity of the center of pressure (CoP) during the stance phase among pes valgus, pes cavus, hallux valgus and the normal foot. We aim to analyze whether CoP sample entropy characteristics differ among these four foot types. Forty subjects with normal feet, 40 with pes cavus, 19 with pes valgus and 36 with hallux valgus participated in our experiment. A Footscan® system was used to collect CoP data. We used sample entropy to quantify several parameters of the investigated four foot types: the displacement in the medial-lateral (M/L) and anterior-posterior (A/P) directions, as well as the vertical ground reaction force of the CoP during the stance phase. To fully examine the potential of the sample entropy method for quantification of CoP components, we provide results for two cases: calculating the sample entropy of normalized CoP components, and calculating it from the raw data of CoP components. We also explored the optimal values of the parameters m (the matching length) and r (the tolerance range) when calculating the sample entropy of CoP data obtained during the stance phase. According to the statistical results, several factors significantly influenced the sample entropy of CoP components. The sample entropies of non-normalized A/P values for the left foot, as well as for the right foot, differed between the normal foot and pes valgus, and between the normal foot and hallux valgus. The sample entropy of the normalized M/L displacement of the right foot differed between the normal foot and pes cavus. The measured variables for A/P and M/L displacements could serve in the study of foot function.
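The measure itself, SampEn(m, r), counts template matches of length m and m+1 under a Chebyshev tolerance and takes the negative log of their ratio. A minimal sketch; expressing r as a fraction of the series' standard deviation is a common convention assumed here, and the test signals are synthetic, not CoP data.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.

    r is the tolerance as a fraction of the standard deviation.
    Lower values indicate a more regular, predictable signal.
    """
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = len(x)

    def count_matches(mm):
        # templates of length mm, same index range for mm and mm+1
        templ = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(templ)):
            # Chebyshev distance to all later templates (no self-match)
            d = np.abs(templ[i + 1:] - templ[i]).max(axis=1)
            c += int((d <= tol).sum())
        return c

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
se_noise = sample_entropy(rng.normal(size=1000))              # irregular
se_regular = sample_entropy(np.sin(np.linspace(0, 40 * np.pi, 1000)))
```

A clean periodic signal scores far lower than white noise, which is the contrast the study exploits when comparing CoP regularity across foot types.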
Study on corrosion resistance of a high-entropy alloy in acid media and its chemical properties
NASA Astrophysics Data System (ADS)
Florea, I.; Buluc, G.; Florea, R. M.; Soare, V.; Carcea, I.
2015-11-01
High-entropy alloys are a new class of alloys distinct from traditional alloys. Research on high-entropy alloys was initiated at Tsing Hua University in Taiwan in 1995 by Yeh et al. They consist of a variety of elements, each present in a similar proportion to the others, which together produce a high mixing entropy. High-entropy alloys can be defined as alloys composed of a group of 5 to 11 major elements in approximately equal concentrations; in general, the content of each element does not exceed 35% by weight of the alloy. Investigations have shown that these alloys combine high hardness with good corrosion resistance, strength and thermal stability. Experimentally, researchers have used different techniques, including traditional casting, mechanical alloying, sputtering and splat quenching, to obtain high-entropy alloys with different alloying elements and then to investigate the corresponding microstructures and mechanical, chemical, thermal and electronic performance. The present study investigates the corrosion resistance in different acid media and attempts to characterize the mechanical properties. Because of the wide composition range and the enormous number of alloy systems, the mechanical properties of high-entropy alloys can vary significantly. In terms of hardness, the most critical factors are the hardness/strength of each constituent phase in the alloy and the distribution of the constituent phases. The corrosion resistance of a high-entropy alloy was investigated with weight loss experiments in acid solutions: 3% hydrofluoric acid with 10% nitric acid (10%HNO3-3%HF), 10% sulphuric acid (10%H2SO4) and 5% hydrochloric acid (5%HCl). The weight loss test was carried out by immersing the samples in the acid solutions, which were maintained at a constant room temperature. Weight loss of the samples was measured with an electronic scale.
Metabolic networks evolve towards states of maximum entropy production.
Unrean, Pornkamol; Srienc, Friedrich
2011-11-01
A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state, the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations, the specific growth rate of the strain continuously increased, together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted for the state in which the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
Energy conservation and maximal entropy production in enzyme reactions.
Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš
2017-08-01
A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, a fixed equilibrium constant of the reaction, and fixed products of forward and backward enzyme rate constants, the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production, the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield and the stationary reaction flux are calculated. Whether these calculated values of the reaction parameters are consistent with their corresponding measured values is tested for the enzyme glucose isomerase. It is found that the calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme glucose isomerase, considered in a non-equilibrium stationary state as found in experiments using continuous stirred tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.
2014-12-01
The maximum entropy production (MEP) principle holds that non-equilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information, acquired via evolution and curated by natural selection, in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and to orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase the acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture, constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.
Compact NE213 neutron spectrometer with high energy resolution for fusion applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimbal, A.; Reginatto, M.; Schuhmacher, H.
Neutron spectrometry is a tool for obtaining important information on the fuel ion composition, velocity distribution and temperature of fusion plasmas. A compact NE213 liquid scintillator, fully characterized at Physikalisch-Technische Bundesanstalt, was installed and operated at the Joint European Torus (JET) during two experimental campaigns (C8-2002 and trace tritium experiment-TTE 2003). The results show that this system can operate in a real fusion experiment as a neutron (1.5 MeV
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms be efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment.
Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
Validity of the Stokes-Einstein relation in liquids: simple rules from the excess entropy.
Pasturel, A; Jakse, N
2016-12-07
It is becoming common practice to assume that the Stokes-Einstein relation D/T ~ η^(-1) holds for liquids above their melting temperatures, although there is also experimental evidence for its failure. Here we investigate numerically this commonly invoked assumption for simple liquid metals as well as for their liquid alloys. Using ab initio molecular dynamics simulations, we show how the entropy scaling relationships developed by Rosenfeld can be used to predict the conditions for the validity of the Stokes-Einstein relation in the liquid phase. Specifically, we demonstrate that the Stokes-Einstein relation may break down in the liquid phase of some liquid alloys, mainly due to the presence of local structural ordering as evidenced in their partial two-body excess entropies. Our findings shed new light on the understanding of transport properties of liquid materials and should trigger more experimental and theoretical studies, since the excess entropy and its two-body approximation are readily obtainable from standard experiments and simulations.
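Rosenfeld's excess-entropy scaling invoked above can be sketched numerically. This is a generic illustration using the empirical coefficients Rosenfeld reported for simple liquids (0.6 and 0.8), not the ab initio workflow of the paper; the function names are ours.

```python
import math

def reduced_diffusivity(D, rho, T, m, kB=1.380649e-23):
    """Rosenfeld's macroscopically reduced diffusion coefficient,
    D* = D * rho**(1/3) * sqrt(m / (kB * T)), with number density
    rho (m^-3) and particle mass m (kg)."""
    return D * rho ** (1.0 / 3.0) * math.sqrt(m / (kB * T))

def rosenfeld_prediction(s_ex):
    """Empirical excess-entropy scaling for simple liquids,
    D* ~ 0.6 * exp(0.8 * s_ex), with s_ex in units of kB (s_ex < 0)."""
    return 0.6 * math.exp(0.8 * s_ex)
```

Comparing a measured D* against `rosenfeld_prediction(s_ex)` is one way to flag state points where excess-entropy scaling, and with it the Stokes-Einstein relation, begins to break down.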
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier
2018-06-01
Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization that enhances both local details and the foreground-background contrast. First, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments on real infrared images show that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
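The double-plateau idea can be sketched as follows. The thresholds are taken as given here (the paper selects them by particle swarm optimization) and the local entropy weighting is omitted, so this is an illustrative skeleton rather than the full method.

```python
def double_plateau_equalize(pixels, t_low, t_high, levels=256):
    """Histogram equalization on a histogram clipped between two
    plateau thresholds (sketch for a flat list of 8-bit pixels)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Clip the histogram: raise sparse nonzero bins to t_low to protect
    # details, cap dense bins at t_high to avoid over-enhancement.
    clipped = [min(max(h, t_low) if h > 0 else 0, t_high) for h in hist]
    total = sum(clipped)
    # Standard equalization mapping built on the clipped histogram.
    cdf, acc = [], 0
    for h in clipped:
        acc += h
        cdf.append(acc)
    return [round((levels - 1) * cdf[p] / total) for p in pixels]
```

Raising the lower plateau keeps rare gray levels (small details) from being crushed, while the upper plateau stops a dominant background peak from consuming the whole output range.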
Rényi entropy measure of noise-aided information transmission in a binary channel.
Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès
2010-05-01
This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance, or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, under definite conditions, when seeking the Rényi information measures that best exploit stochastic resonance, nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. The quantitative information measures are also confronted with visual perception in an experiment on noise-aided binary image transmission.
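For a concrete reading of the order parameter, the Rényi entropy of order α can be sketched as follows. This is the generic definition (in bits), not the paper's channel model.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (bits) of a discrete distribution p for order
    alpha > 0; alpha = 1 recovers the Shannon entropy as a limit."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)
```

For a binary distribution such as `[0.2, 0.8]`, sweeping α traces out the family of measures the paper optimizes over; α = 1 is the Shannon special case.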
NASA Technical Reports Server (NTRS)
Bernstein, R. B.; Levine, R. D.
1972-01-01
Optimal means of characterizing the distribution of product energy states resulting from reactive collisions of molecules with restricted distributions of initial states are considered, along with means of characterizing the particular reactant-state distribution which yields a given set of product states at a specified total energy. It is suggested that the energy dependence of global-type results be represented as square-faced bar plots, and data from specific-type experiments as triangular-faced prismatic plots. The essential parameters defining the internal state distribution are isolated, and the information content of such a distribution is put on a quantitative basis. The relationship between the information content, the surprisal, and the entropy of the continuous distribution is established. The concept of an entropy deficiency, which characterizes the specificity of product state formation, is suggested as a useful measure of the deviation from statistical behavior. The degradation of information by experimental averaging is considered, leading to bounds on the entropy deficiency.
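The surprisal and entropy-deficiency measures mentioned above can be sketched as follows, in natural-log units; these are the generic definitions, with P0 playing the role of the statistical ("prior") distribution.

```python
import math

def surprisal(p, p0):
    """Surprisal -ln(P/P0) of an observed population P relative to
    the statistical expectation P0."""
    return -math.log(p / p0)

def entropy_deficiency(P, P0):
    """Entropy deficiency DS = sum_i P_i ln(P_i / P0_i) >= 0, a
    measure of how far a product-state distribution deviates from
    purely statistical behavior (DS = 0 when P = P0)."""
    return sum(p * math.log(p / q) for p, q in zip(P, P0) if p > 0)
```

A larger entropy deficiency indicates more specific (less statistical) product-state formation.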
Novel sonar signal processing tool using Shannon entropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quazi, A.H.
1996-06-01
Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency-domain quantities obtained using spectral analysis techniques. The objective is to investigate an alternate approach, entirely different from traditional signal processing: to utilize the Shannon entropy as a tool for the processing of sonar signals, with emphasis on detection, classification and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently or incoherently, depending upon a priori knowledge of the signals and noise. Here, the detection, classification and localization technique is based on the concept of the entropy of a random process. Under a constant energy constraint, the entropy of a received process with a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true and decreases when a correlated signal is present (H1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) make a decision: H1 is assumed if the difference is large compared to a pre-assigned threshold, and H0 is assumed otherwise. The test statistics differ between the entropies under H0 and H1. We show simulated results for detecting stationary and non-stationary signals in noise, and results on the detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
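Steps (I)-(III) can be sketched as a simple histogram-based detector. This is an illustration of the strategy only; the bin count and threshold are arbitrary choices, not values from the paper.

```python
import math

def entropy_detector(samples, bins=16, threshold=0.15):
    """Return 'H1' (signal present) when the histogram entropy of the
    data falls well below its maximum possible value log2(bins)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for s in samples:
        hist[min(int((s - lo) / width), bins - 1)] += 1
    n = len(samples)
    # Step (I): entropy of the received data.
    h = -sum(c / n * math.log2(c / n) for c in hist if c)
    # Steps (II)-(III): compare with the maximum value and decide.
    h_max = math.log2(bins)
    return "H1" if (h_max - h) / h_max > threshold else "H0"
```

Structureless noise fills the histogram nearly uniformly (entropy near the maximum), while a structured signal concentrates the amplitude distribution and lowers the entropy.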
Some practical universal noiseless coding techniques
NASA Technical Reports Server (NTRS)
Rice, R. F.
1979-01-01
Some practical adaptive techniques for the efficient noiseless coding of a broad class of data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol-probability ordering but unknown probability values. These algorithms are broadly applicable to practical problems because most real data sources can be simply transformed into this form by appropriate preprocessing. The algorithms have exhibited performance only slightly above the entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably below a measured average data entropy may be observed when the data characteristics change over the measurement span.
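The kind of code underlying these techniques can be sketched with a Rice (Golomb power-of-two) coder. The per-block adaptive selection of the parameter k, which is the heart of the practical schemes, is omitted here for brevity.

```python
def rice_encode(value, k):
    """Rice code for a non-negative integer: the quotient value >> k in
    unary (q ones and a terminating zero), then k remainder bits."""
    q = value >> k
    bits = "1" * q + "0"
    if k:
        bits += format(value & ((1 << k) - 1), "b").zfill(k)
    return bits

def rice_decode(bits, k):
    """Inverse of rice_encode for a single codeword."""
    q = bits.index("0")              # length of the unary prefix
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r
```

Small symbols (high probability under the assumed ordering) get short codewords; an adaptive coder picks the k that minimizes the coded length of each block.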
NASA Technical Reports Server (NTRS)
Bardina, J. E.
1994-01-01
A new computationally efficient 3-D compressible Reynolds-averaged implicit Navier-Stokes method with advanced two-equation turbulence models for high-speed flows is presented. All convective terms are modeled using an entropy-satisfying higher-order Total Variation Diminishing (TVD) scheme based on implicit upwind flux-difference split approximations and an arithmetic averaging procedure for the primitive variables. This method combines the best features of the data management and computational efficiency of space-marching procedures with the generality and stability of time-dependent Navier-Stokes procedures to solve flows with mixed supersonic and subsonic zones, including streamwise-separated flows. Its robust stability derives from a combination of conservative implicit upwind flux-difference splitting with Roe's property U to provide the accurate shock-capturing capability that non-conservative schemes do not guarantee, an alternating symmetric Gauss-Seidel 'method of planes' relaxation procedure coupled with a three-dimensional two-factor diagonally dominant approximate factorization scheme, TVD flux limiters of higher-order flux differences satisfying realizability, and well-posed characteristic-based implicit boundary-point approximations consistent with the local characteristic domain of dependence. The efficiency of the method is greatly increased by Newton-Raphson acceleration, which allows convergence in essentially one forward sweep for supersonic flows. The method is verified by comparison with experiment and with other Navier-Stokes methods. Here, results for adiabatic and cooled flat-plate flows, compression-corner flow, and 3-D hypersonic shock-wave/turbulent boundary layer interaction flows are presented. The robust 3-D method achieves a computational efficiency at least one order of magnitude better than that of the CNS Navier-Stokes code.
It provides cost-effective aerodynamic predictions in agreement with experiment, and the capability of predicting complex flow structures in complex geometries with good accuracy.
NASA Astrophysics Data System (ADS)
Chandler, Damon M.; Field, David J.
2007-04-01
Natural scenes, like almost all natural data sets, show considerable redundancy. Although many forms of redundancy have been investigated (e.g., pixel distributions, power spectra, contour relationships, etc.), estimates of the true entropy of natural scenes have largely been considered intractable. We describe a technique for estimating the entropy and relative dimensionality of image patches based on a function we call the proximity distribution (a nearest-neighbor technique). The advantage of this function over simple statistics such as the power spectrum is that the proximity distribution depends on all forms of redundancy. We demonstrate that this function can be used to estimate the entropy (redundancy) of 3×3 patches of known entropy as well as 8×8 patches of Gaussian white noise, natural scenes, and noise with the same power spectrum as natural scenes. The techniques are based on assumptions regarding the intrinsic dimensionality of the data, and although the estimates depend on an extrapolation model for images larger than 3×3, we argue that this approach provides the best current estimates of the entropy and compressibility of natural-scene patches and that it provides insights into the efficiency of any coding strategy that aims to reduce redundancy. We show that the sample of 8×8 patches of natural scenes used in this study has less than half the entropy of 8×8 white noise and less than 60% of the entropy of noise with the same power spectrum. In addition, given a finite number of samples (<2^20) drawn randomly from the space of 8×8 patches, the subspace of 8×8 natural-scene patches shows a dimensionality that depends on the sampling density and that, for low densities, is significantly lower than that of the space of 8×8 patches of white noise and of noise with the same power spectrum.
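A nearest-neighbor entropy estimator of the kind the proximity-distribution approach builds on can be sketched as follows. This is the standard first-nearest-neighbor (Kozachenko-Leonenko) form with its usual constants, not the paper's extrapolation model.

```python
import math

def nn_entropy(points, d):
    """Differential entropy (nats) of N d-dimensional samples,
    estimated from first-nearest-neighbor distances."""
    n = len(points)
    # Log-volume of the d-dimensional unit ball.
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    euler_gamma = 0.5772156649015329
    s = 0.0
    for i, p in enumerate(points):
        # Distance to the nearest other sample (O(N^2) for clarity).
        r = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        s += math.log(r)
    return d * s / n + log_vd + math.log(n - 1) + euler_gamma
```

For samples from a known density the estimate converges to the true differential entropy; for image patches, sparser neighbor distances signal lower entropy and lower intrinsic dimensionality.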
NASA Astrophysics Data System (ADS)
de La Sierra, Ruben Ulises
The present study introduces entropy mapping as a comprehensive method to analyze and describe complex interactive systems, and to assess the effect that entropy has on paradigm changes as described by transition theory. The dynamics of interactions among environmental, economic and demographic conditions affect a number of fast-growing locations throughout the world. One of the regions especially affected by accelerated growth, in terms of demographic and economic development, is the border region between Mexico and the US. As the contrast between these countries provides a significant economic and cultural differential, the dynamics of capital, goods, services and people, and the rates at which they interact, are rather unique. To illustrate the most fundamental economic and political changes affecting the region, a background addressing the causes of these changes, leading to the North American Free Trade Agreement (NAFTA), is presented. Although the concept of thermodynamic entropy was first observed in the physical sciences, a relevant homology exists in the biological, social and economic sciences, as the universal tendency towards disorder, dissipation and equilibrium is present in these disciplines when energy or resources become deficient. Furthermore, in information theory entropy is expressed as uncertainty and randomness, in terms of efficiency in the transmission of information. Although entropy in closed systems is unavoidable, its increase in open systems can be arrested by a flux of energy, resources and/or information. A critical component of all systems is the boundary. If a boundary is impermeable, it will prevent energy flow from the environment into the system; likewise, if the boundary is too porous, it will not be able to prevent the dissipation of energy and resources into the environment, and will not prevent entropy from entering.
Therefore, two expressions of entropy--thermodynamic and information--are identified and related to systems in transition and to spatial distribution. These expressions are used to identify causes and trends leading to growth or disorder.
NASA Astrophysics Data System (ADS)
Zingan, Valentin Nikolaevich
This work develops a discontinuous Galerkin finite element discretization of nonlinear hyperbolic conservation equations with efficient and robust high-order stabilization built on an entropy-based artificial viscosity approximation. The solutions of the equations are represented by elementwise polynomials of an arbitrary degree p > 0 which are continuous within each element but discontinuous on the boundaries. The discretization in time is done by means of high-order explicit Runge-Kutta methods identified with their respective Butcher tableaux. To stabilize the numerical solution in the vicinity of shock waves, and simultaneously preserve the smooth parts from smearing, we add a reasonable amount of artificial viscosity in accordance with the physical principle of entropy production in the interior of shock waves. The viscosity coefficient is proportional to the local size of the residual of an entropy equation and is bounded from above by the first-order artificial viscosity defined by a local wave speed. Since the residual of the entropy equation is vanishingly small in smooth regions (of the order of the local truncation error) and arbitrarily large in shocks, the entropy viscosity is almost zero everywhere except at the shocks, where it reaches the first-order upper bound. One- and two-dimensional benchmark test cases are presented for nonlinear hyperbolic scalar conservation laws and the system of compressible Euler equations. These tests demonstrate the satisfactory stability properties of the method as well as optimal convergence rates. All numerical solutions to the test problems agree well with the reference solutions found in the literature. We conclude that the new method developed in the present work is a valuable alternative to currently existing techniques of viscous stabilization.
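In standard entropy-viscosity notation (symbols assumed here, following common formulations of the technique rather than the thesis's exact ones), the construction just described reads:

```latex
% Entropy residual of the discrete solution u_h for an entropy pair (\eta, \mathbf{q}):
R_h = \partial_t \eta(u_h) + \nabla \cdot \mathbf{q}(u_h),
\qquad
% Entropy viscosity on a cell K of size h_K, capped by the first-order
% viscosity built from the local wave speed \lambda_{\max}:
\nu_E\big|_K = \min\!\Big( c_{\max}\, h_K\, \lambda_{\max},\;
    c_E\, h_K^2\, \frac{\| R_h \|_{L^\infty(K)}}{\| \eta - \bar{\eta} \|_{L^\infty(\Omega)}} \Big).
```

In smooth regions the residual term is of the order of the local truncation error, so the second argument of the min dominates and the viscosity vanishes with h_K^2; in shocks the residual blows up and the first-order cap c_max h_K λ_max takes over.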
Coding For Compression Of Low-Entropy Data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu
1994-01-01
An improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from a low-information-content source. The method of coding is implemented in relatively simple, high-speed arithmetic and logic circuits. It also increases coding efficiency beyond that of the established Huffman coding method, in that the average number of bits per code symbol can be less than 1, which is the lower bound for a Huffman code.
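To see how the average can drop below the 1 bit/symbol Huffman floor, consider run-length coding of a sparse binary source. This is an illustrative scheme, not the specific method of this work; the Rice parameter k = 5 is an arbitrary choice.

```python
import random

def runlength_bits(bits, k=5):
    """Total code length (bits) when each run of 0s, terminated by a 1,
    is sent as a Rice-coded run length (unary quotient + k-bit
    remainder)."""
    total, run = 0, 0
    for b in bits:
        if b == 0:
            run += 1
        else:
            total += (run >> k) + 1 + k   # codeword for this run
            run = 0
    if run:                               # flush a trailing run of 0s
        total += (run >> k) + 1 + k
    return total

# A low-entropy source: 1s occur with probability 0.02.
random.seed(3)
data = [1 if random.random() < 0.02 else 0 for _ in range(20000)]
rate = runlength_bits(data) / len(data)   # well below 1 bit/symbol
```

Each codeword covers an entire run of symbols, so the per-symbol rate can fall far below 1 bit, approaching the low entropy of the source.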
Zehe, Erwin; Blume, Theresa; Blöschl, Günter
2010-01-01
Preferential flow in biological soil structures is of key importance for infiltration and soil water flow at a range of scales. In the present study, we treat soil water flow as a dissipative process in an open non-equilibrium thermodynamic system, to better understand this key process. We define the chemical potential and Helmholtz free energy based on soil physical quantities, parametrize a physically based hydrological model based on field data and simulate the evolution of Helmholtz free energy in a cohesive soil with different populations of worm burrows for a range of rainfall scenarios. The simulations suggest that flow in connected worm burrows allows a more efficient redistribution of water within the soil, which implies a more efficient dissipation of free energy/higher production of entropy. There is additional evidence that the spatial pattern of worm burrow density at the hillslope scale is a major control of energy dissipation. The pattern typically found in the study is more efficient in dissipating energy/producing entropy than other patterns. This is because upslope run-off accumulates and infiltrates via the worm burrows into the dry soil in the lower part of the hillslope, which results in an overall more efficient dissipation of free energy. PMID:20368256
Application of SNODAS and hydrologic models to enhance entropy-based snow monitoring network design
NASA Astrophysics Data System (ADS)
Keum, Jongho; Coulibaly, Paulin; Razavi, Tara; Tapsoba, Dominique; Gobena, Adam; Weber, Frank; Pietroniro, Alain
2018-06-01
Snow has a unique characteristic in the water cycle: snow falls throughout the winter season, but the discharge from snowmelt is typically delayed until the melting period and occurs over a relatively short time. Therefore, reliable observations from an optimal snow monitoring network are necessary for efficient management of snowmelt water for flood prevention and hydropower generation. The Dual Entropy and Multiobjective Optimization approach is applied to design snow monitoring networks in the La Grande River Basin in Québec and the Columbia River Basin in British Columbia. While the networks are optimized to carry the maximum amount of information with minimum redundancy based on entropy concepts, this study extends traditional entropy applications to hydrometric network design with several improvements. First, several data quantization cases and their effects on the snow network design problems were explored. Second, the applicability of the Snow Data Assimilation System (SNODAS) products as synthetic datasets for potential stations was demonstrated in the design of the snow monitoring network of the Columbia River Basin. Third, beyond finding the Pareto-optimal networks from entropy with multi-objective optimization, the networks obtained for the La Grande River Basin were further evaluated by applying three hydrologic models. The calibrated hydrologic models simulated discharges using the updated snow water equivalent data from the Pareto-optimal networks. The model performances for high flows were then compared to determine the best optimal network for enhanced spring runoff forecasting.
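The entropy side of the design problem can be sketched with a greedy joint-entropy maximizer over quantized station records. This is a plain illustration of the "maximum information, minimum redundancy" objective, not the Dual Entropy and Multiobjective Optimization algorithm itself.

```python
import math
from collections import Counter

def joint_entropy(columns):
    """Joint entropy (bits) of several quantized time series."""
    n = len(columns[0])
    counts = Counter(zip(*columns))
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def greedy_network(stations, k):
    """Pick k station names whose records jointly carry the most
    information (maximum joint entropy, hence minimum redundancy)."""
    chosen = []
    while len(chosen) < k:
        best = max(
            (name for name in stations if name not in chosen),
            key=lambda name: joint_entropy(
                [stations[s] for s in chosen + [name]]
            ),
        )
        chosen.append(best)
    return chosen
```

With a duplicated record B of station A, adding B contributes no new joint entropy, so the greedy pass skips it and pairs A with an independent station instead; the quantization step corresponds to the data quantization cases studied in the paper.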
Thermodynamics and computation during collective motion near criticality
NASA Astrophysics Data System (ADS)
Crosato, Emanuele; Spinney, Richard E.; Nigmatullin, Ramil; Lizier, Joseph T.; Prokopenko, Mikhail
2018-01-01
We study self-organization of collective motion as a thermodynamic phenomenon in the context of the first law of thermodynamics. It is expected that the coherent ordered motion typically self-organises in the presence of changes in the (generalized) internal energy and of (generalized) work done on, or extracted from, the system. We aim to explicitly quantify changes in these two quantities in a system of simulated self-propelled particles and contrast them with changes in the system's configuration entropy. In doing so, we adapt a thermodynamic formulation of the curvatures of the internal energy and the work, with respect to two parameters that control the particles' alignment. This allows us to systematically investigate the behavior of the system by varying the two control parameters to drive the system across a kinetic phase transition. Our results identify critical regimes and show that during the phase transition, where the configuration entropy of the system decreases, the rates of change of the work and of the internal energy also decrease, while their curvatures diverge. Importantly, the reduction of entropy achieved through expenditure of work is shown to peak at criticality. We relate this both to a thermodynamic efficiency and the significance of the increased order with respect to a computational path. Additionally, this study provides an information-geometric interpretation of the curvature of the internal energy as the difference between two curvatures: the curvature of the free entropy, captured by the Fisher information, and the curvature of the configuration entropy.
Spatial Decomposition of Translational Water–Water Correlation Entropy in Binding Pockets
2015-01-01
A number of computational tools available today compute the thermodynamic properties of water at surfaces and in binding pockets by using inhomogeneous solvation theory (IST) to analyze explicit-solvent simulations. Such methods enable qualitative spatial mappings of both energy and entropy around a solute of interest and can also be applied quantitatively. However, the entropy estimates of existing methods have, to date, been almost entirely limited to the first-order terms in the IST’s entropy expansion. These first-order terms account for localization and orientation of water molecules in the field of the solute but not for the modification of water–water correlations by the solute. Here, we present an extension of the Grid Inhomogeneous Solvation Theory (GIST) approach which accounts for water–water translational correlations. The method involves rewriting the two-point density of water in terms of a conditional density and utilizes the efficient nearest-neighbor entropy estimation approach. Spatial maps of this second order term, for water in and around the synthetic host cucurbit[7]uril and in the binding pocket of the enzyme Factor Xa, reveal mainly negative contributions, indicating solute-induced water–water correlations relative to bulk water; particularly strong signals are obtained for sites at the entrances of cavities or pockets. This second-order term thus enters with the same, negative, sign as the first order translational and orientational terms. Numerical and convergence properties of the methodology are examined. PMID:26636620
Cell-model prediction of the melting of a Lennard-Jones solid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holian, B.L.
The classical free energy of the Lennard-Jones 6-12 solid is computed from a single-particle anharmonic cell model with a correction to the entropy given by the classical correlational entropy of quasiharmonic lattice dynamics. The free energy of the fluid is obtained from the Hansen-Ree analytic fit to Monte Carlo equation-of-state calculations. The resulting predictions of the solid-fluid coexistence curves by this corrected cell model of the solid are in excellent agreement with the computer experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stratz, S. Adam; Jones, Steven J.; Mullen, Austin D.
Newly established adsorption enthalpy and entropy values of 12 lanthanide hexafluoroacetylacetonates, denoted Ln[hfac]4, along with the experimental and theoretical methodology used to obtain these values, are presented for the first time. The results of this work can be used in conjunction with theoretical modeling techniques to optimize a large-scale gas-phase separation experiment using isothermal chromatography. The results to date indicate average adsorption enthalpy and entropy values of the 12 Ln[hfac]4 complexes ranging from -33 to -139 kJ/mol and -299 to -557 J/(mol·K), respectively.

Maximum Entropy Calculations on a Discrete Probability Space
1986-01-01
constraints acting besides normalization. Statement 3: "The aim of this paper is to show that the die experiment just spoken of has solutions by classical analysis." Statement 4: "We shall solve this problem in a purely classical way, without the need for recourse to any exotic estimator, such as ME." Note... The Maximum Entropy Principle: In a remarkable series of papers beginning in 1957, E. T. Jaynes (1957) began a revolution in inductive
Relationship between efficiency and predictability in stock price change
NASA Astrophysics Data System (ADS)
Eom, Cheoljun; Oh, Gabjin; Jung, Woo-Sung
2008-09-01
In this study, we evaluate the relationship between efficiency and predictability in the stock market. The efficiency, which is the issue addressed by the weak-form efficient market hypothesis, is calculated using the Hurst exponent and the approximate entropy (ApEn). The predictability corresponds to the hit-rate; this is the rate of consistency between the direction of the actual price change and that of the predicted price change, as calculated via the nearest neighbor prediction method. We determine that the Hurst exponent and the ApEn value are negatively correlated. However, predictability is positively correlated with the Hurst exponent.
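The approximate entropy used as one of the efficiency measures follows Pincus's standard definition; the sketch below is a minimal illustrative implementation, not the authors' code, with the conventional defaults m = 2 and r = 0.2 times the series standard deviation:

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series (after Pincus 1991)."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()                   # conventional tolerance
    def phi(m):
        n = len(x) - m + 1
        # embed the series into overlapping m-length templates
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between all template pairs
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(-1)
        c = (d <= r).mean(axis=1)           # self-match included, as in Pincus
        return np.log(c).mean()
    return phi(m) - phi(m + 1)
```

A perfectly regular series has ApEn of zero, while random noise scores high, matching its use here as an (inverse) efficiency proxy.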
Entropy, pumped-storage and energy system finance
NASA Astrophysics Data System (ADS)
Karakatsanis, Georgios
2015-04-01
Pumped-storage holds a key role in integrating renewable energy units and non-renewable fuel plants into large-scale energy systems for electricity output. An emerging issue is the development of financial engineering models with a physical basis to systematically fund energy system efficiency improvements across its operation. A fundamental physically based economic concept is the Scarcity Rent, which concerns the pricing of a natural resource's scarcity. Specifically, the scarcity rent comprises a fraction of a depleting resource's full price and accumulates to fund its more efficient future use. In an integrated energy system, scarcity rents derive from various resources and can be deposited into a pooled fund to finance the energy system's overall efficiency increase, allowing it to benefit from economies of scale. With pumped-storage incorporated into the system, water is upgraded to a hub resource in which the scarcity rents of all connected energy sources are denominated. However, as the water available for electricity generation or storage is also limited, a scarcity rent is imposed on it as well. It is suggested that scarcity rent generation is reducible to three main factors incorporating uncertainty: (1) water's natural renewability, (2) the energy system's intermittent components and (3) base-load prediction deviations from actual loads. For that purpose, the concept of entropy is used to measure the energy system's overall uncertainty, and hence pumped-storage intensity requirements and the water scarcity rents generated. Keywords: pumped-storage, integration, energy systems, financial engineering, physical basis, Scarcity Rent, pooled fund, economies of scale, hub resource, uncertainty, entropy. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
Diagnosing entropy production and dissipation in fully kinetic plasmas
NASA Astrophysics Data System (ADS)
Juno, James; Tenbarge, Jason; Hakim, Ammar; Dorland, William; Cagas, Petr
2017-10-01
Many plasma systems, from the core of a tokamak to the outer heliosphere, are weakly collisional and thus most accurately described by kinetic theory. The typical approach to solving the kinetic equation has been the particle-in-cell algorithm, which, while a powerful tool, introduces counting noise into the particle distribution function. The counting noise is particularly problematic when attempting to study grand challenge problems such as entropy production from phenomena like shocks and turbulence. In this poster, we present studies of entropy production and dissipation processes present in simple turbulence and shock calculations using the continuum Vlasov-Maxwell solver in the Gkeyll framework. Particular emphasis is placed on a novel diagnostic, the field-particle correlation, which is especially efficient at separating the secular energy transfer into its constituent components, for example, cyclotron damping, Landau damping, or transit-time damping, when applied to a noise-free distribution function. National Science Foundation SHINE award No. AGS-1622306 and the UMD DOE Grant DE-FG02-93ER54197.
Diagnosing entropy production and dissipation in fully kinetic plasmas
NASA Astrophysics Data System (ADS)
Juno, J.; TenBarge, J. M.; Hakim, A.; Dorland, W.
2017-12-01
Many plasma systems, from the core of a tokamak to the outer heliosphere, are weakly collisional and thus most accurately described by kinetic theory. The typical approach to solving the kinetic equation has been the particle-in-cell algorithm, which, while a powerful tool, introduces counting noise into the particle distribution function. The counting noise is particularly problematic when attempting to study grand challenge problems such as entropy production from phenomena like shocks and turbulence. In this poster, we present studies of entropy production and dissipation processes present in simple turbulence and shock calculations using the continuum Vlasov-Maxwell solver in the Gkeyll framework. Particular emphasis is placed on a novel diagnostic, the field-particle correlation, which is especially efficient at separating the secular energy transfer into its constituent components, for example, cyclotron damping, Landau damping, or transit-time damping, when applied to a noise-free distribution function. Using reduced systems such as completely transverse electromagnetic shocks, we also explore the signatures of perpendicular, non-resonant, energization mechanisms.
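The field-particle correlation emphasized in both entries above has a simple discrete form once the distribution function and fields are on a grid. The sketch below assumes gridded `df_dv(t, v)` and `E(t)` arrays and an illustrative normalization; it is a toy stand-in, not Gkeyll's diagnostic:

```python
import numpy as np

def field_particle_correlation(v, df_dv, E, q=1.0):
    """Sketch of the field-particle correlation (after Klein & Howes):
    C(v) = -q * v**2/2 * <(df/dv) E>_t, the time-averaged velocity-space
    energy-transfer density. `df_dv` has shape (n_time, n_velocity) and
    E has shape (n_time,); names and normalization are illustrative."""
    df_dv = np.asarray(df_dv, float)
    E = np.asarray(E, float)
    v = np.asarray(v, float)
    return -q * v ** 2 / 2 * (df_dv * E[:, None]).mean(axis=0)
```

Resonant damping shows up as a bipolar signature of C(v) around the resonant velocity; the noise-free continuum distribution function is what makes this signal clean.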
NASA Astrophysics Data System (ADS)
Michelini, Fabienne; Crépieux, Adeline; Beltako, Katawoura
2017-05-01
We discuss some thermodynamic aspects of energy conversion in electronic nanosystems able to convert light energy into electrical and/or thermal energy using the non-equilibrium Green's function formalism. In the first part, we derive the photon energy and particle currents inside a nanosystem interacting with light and in contact with two electron reservoirs at different temperatures. Energy conservation is verified, and radiation laws are discussed from electron non-equilibrium Green's functions. We further use the photon currents to formulate the rate of entropy production for steady-state nanosystems, and we recast this rate in terms of efficiency for specific photovoltaic-thermoelectric nanodevices. In the second part, a quantum-dot-based nanojunction is closely examined using a two-level model. We show analytically that the rate of entropy production is always positive, but we find numerically that it can reach negative values when the derived particle and energy currents are empirically modified, as is usually done for modeling realistic photovoltaic systems.
Classifying epileptic EEG signals with delay permutation entropy and Multi-Scale K-means.
Zhu, Guohun; Li, Yan; Wen, Peng Paul; Wang, Shuaifang
2015-01-01
Most epileptic EEG classification algorithms are supervised and require large training datasets, which hinders their use in real-time applications. This chapter proposes an unsupervised Multi-Scale K-means (MSK-means) algorithm to distinguish epileptic EEG signals and identify epileptic zones. The random initialization of the K-means algorithm can lead to wrong clusters. Based on the characteristics of EEGs, the MSK-means algorithm initializes the coarse-scale centroid of a cluster with a suitable scale factor. In this chapter, the MSK-means algorithm is proved theoretically superior to the K-means algorithm in efficiency. In addition, three classifiers, the K-means, MSK-means and support vector machine (SVM), are used to identify seizures and localize the epileptogenic zone using delay permutation entropy features. The experimental results demonstrate that identifying seizures with the MSK-means algorithm and delay permutation entropy achieves 4.7% higher accuracy than the K-means and 0.7% higher accuracy than the SVM.
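The delay permutation entropy features used by all three classifiers can be sketched as a standard normalized ordinal-pattern entropy; parameter names and defaults here are illustrative, not the chapter's settings:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized (delay) permutation entropy of a 1-D series: the Shannon
    entropy of ordinal patterns of length `order`, sampled with time lag
    `delay`, normalized to [0, 1] by log(order!)."""
    x = np.asarray(x, float)
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        key = tuple(np.argsort(window))      # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), float) / n
    return float(-(p * np.log(p)).sum() / log(factorial(order)))
```

A monotone series produces a single ordinal pattern (entropy 0), while white noise approaches the maximum of 1, which is what makes the feature discriminative for seizure EEG.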
Quantum Entanglement and the Topological Order of Fractional Hall States
NASA Astrophysics Data System (ADS)
Rezayi, Edward
2015-03-01
Fractional quantum Hall states or, more generally, topological phases of matter defy Landau classification based on order parameter and broken symmetry. Instead they have been characterized by their topological order. Quantum information concepts, such as quantum entanglement, appear to provide the most efficient method of detecting topological order solely from the knowledge of the ground state wave function. This talk will focus on real-space bi-partitioning of quantum Hall states and will present both exact diagonalization and quantum Monte Carlo studies of topological entanglement entropy in various geometries. Results on the torus for non-contractible cuts are quite rich and, through the use of minimum entropy states, yield the modular S-matrix and hence uniquely determine the topological order, as shown in recent literature. Concrete examples of minimum entropy states from known quantum Hall wave functions and their corresponding quantum numbers, used in exact diagonalizations, will be given. In collaboration with Clare Abreu and Raul Herrera. Supported by DOE Grant DE-SC0002140.
Local statistics adaptive entropy coding method for the improvement of H.26L VLC coding
NASA Astrophysics Data System (ADS)
Yoo, Kook-yeol; Kim, Jong D.; Choi, Byung-Sun; Lee, Yung Lyul
2000-05-01
In this paper, we propose an adaptive entropy coding method to improve the VLC coding efficiency of the H.26L TML-1 codec. First, we show that the VLC coding presented in TML-1 does not satisfy the sibling property of entropy coding. We then modify the coding method into a local-statistics-adaptive one that satisfies the property. The proposed method, based on local symbol statistics, dynamically changes the mapping between symbols and bit patterns in the VLC table according to the sibling property. Note that the codewords in the VLC table of the TML-1 codec are not changed. Since the changed mapping is also derived at the decoder side from the decoded symbols, the proposed VLC coding method does not require any overhead information. The simulation results show that the proposed method gives about 30% and 37% reductions in average bit rate for MB type and CBP information, respectively.
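The local-statistics adaptive idea, re-ranking symbols by their running counts while the fixed codeword table stays untouched, can be sketched as follows. The toy `codewords` table (shortest-first) is hypothetical, not the TML-1 VLC table; the decoder rebuilds the same mapping from its decoded history, so no side information is needed:

```python
def adaptive_map_codes(symbols, codewords):
    """Sketch of local-statistics adaptive VLC mapping: before each symbol
    is coded, symbols are ranked by running count and assigned to the fixed
    codeword table in order of increasing length. `codewords` must be
    ordered shortest-first and have at least one entry per symbol."""
    counts = {s: 0 for s in set(symbols)}
    out = []
    for s in symbols:
        # most frequent symbol first; stable tie-break on symbol value
        ranked = sorted(counts, key=lambda k: (-counts[k], k))
        table = {sym: codewords[i] for i, sym in enumerate(ranked)}
        out.append(table[s])
        counts[s] += 1
    return out
```

As symbol 1 becomes locally dominant it captures the shortest codeword, which is the adaptation effect the paper exploits.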
The Cross-Entropy Based Multi-Filter Ensemble Method for Gene Selection.
Sun, Yingqiang; Lu, Chengbo; Li, Xiaobo
2018-05-17
The gene expression profile is characterized by high dimensionality, small sample size, and continuous values, and it is a great challenge to use gene expression profile data for the classification of tumor samples. This paper proposes a cross-entropy based multi-filter ensemble (CEMFE) method for microarray data classification. Firstly, multiple filters are used to select from the microarray data in order to obtain several pre-selected feature subsets with different classification abilities. The top N genes with the highest rank in each subset are integrated to form a new data set. Secondly, the cross-entropy algorithm is used to remove redundant data from the data set. Finally, the wrapper method, based on forward feature selection, is used to select the best feature subset. The experimental results show that the proposed method is more efficient than other gene selection methods and that it can achieve higher classification accuracy with fewer characteristic genes.
On the optimality of a universal noiseless coder
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner H.
1993-01-01
Rice developed a universal noiseless coding structure that provides efficient performance over an extremely broad range of source entropy. This is accomplished by adaptively selecting the best of several easily implemented variable-length coding algorithms. Variations of such noiseless coders have been used in many NASA applications. Custom VLSI coder and decoder modules capable of processing over 50 million samples per second have been fabricated and tested. In this study, the first of the code options used in this module development is shown to be equivalent to a class of Huffman code under the Humblet condition, for source symbol sets having a Laplacian distribution. Except for the default option, the other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set, at specified symbol entropy values. Simulation results are obtained on actual aerial imagery over a wide entropy range, and they confirm the optimality of the scheme. Comparisons with other known techniques are performed on several widely used images, and the results further validate the coder's optimality.
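The kind of easily implemented variable-length option such a coder adaptively selects among can be illustrated with a Rice (Golomb power-of-two) code; this is a textbook sketch, not the flight VLSI implementation:

```python
def rice_encode(n, k):
    """Rice (Golomb power-of-two) code for a nonnegative integer n:
    unary-coded quotient, a '0' terminator, then k remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return '1' * q + '0' + (format(r, f'0{k}b') if k else '')

def rice_decode(bits, k):
    """Invert rice_encode for a single codeword string."""
    q = bits.index('0')                        # unary quotient length
    r = int(bits[q + 1:q + 1 + k] or '0', 2)   # k remainder bits
    return (q << k) | r
```

Sweeping k trades unary length against remainder length, which is why an adaptive outer loop that picks the best option per block performs well across a broad entropy range.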
The relative entropy is fundamental to adaptive resolution simulations
NASA Astrophysics Data System (ADS)
Kreis, Karsten; Potestio, Raffaello
2016-07-01
Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.
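The relative entropy central to the argument can be estimated between two sampled ensembles with a simple histogram estimator. This is an illustrative 1-D sketch, not the paper's configurational formalism; the pseudocount that keeps empty bins finite is an assumption of this sketch:

```python
import numpy as np

def relative_entropy(p_samples, q_samples, bins=30):
    """Histogram estimate of S_rel = sum_i p_i ln(p_i / q_i) between two
    sampled distributions (e.g. an atomistic and a coarse-grained ensemble).
    A tiny pseudocount keeps the estimate finite where q has empty bins."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    p = (p + 1e-12) / p.sum()
    q = (q + 1e-12) / q.sum()
    return float(np.sum(p * np.log(p / q)))
```

Identical ensembles give a value near zero; the larger the mismatch between the two descriptions, the larger S_rel, which is the quantity minimized in relative-entropy coarse-graining.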
Multivariate quadrature for representing cloud condensation nuclei activity of aerosol populations
Fierce, Laura; McGraw, Robert L.
2017-07-26
Sparse representations of atmospheric aerosols are needed for efficient regional- and global-scale chemical transport models. Here we introduce a new framework for representing aerosol distributions, based on the quadrature method of moments. Given a set of moment constraints, we show how linear programming, combined with an entropy-inspired cost function, can be used to construct optimized quadrature representations of aerosol distributions. The sparse representations derived from this approach accurately reproduce cloud condensation nuclei (CCN) activity for realistically complex distributions simulated by a particle-resolved model. Additionally, the linear programming techniques described in this study can be used to bound key aerosol properties, such as the number concentration of CCN. Unlike commonly used sparse representations, such as modal and sectional schemes, the maximum-entropy approach described here is not constrained to pre-determined size bins or assumed distribution shapes. This study is a first step toward a particle-based aerosol scheme that will track multivariate aerosol distributions with sufficient computational efficiency for large-scale simulations.
Introducing sampling entropy in repository based adaptive umbrella sampling
NASA Astrophysics Data System (ADS)
Zheng, Han; Zhang, Yingkai
2009-12-01
Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, one widely pursued objective to develop efficient simulation methods is to achieve uniform sampling among thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator for uniform sampling as well as for the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two dimensional free energy surfaces for the alanine dipeptide in gas phase as well as in water.
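The sampling entropy indicator has a simple form over histogram visit counts along the reaction coordinate; the normalization and binning below are illustrative, not the RBAUS definitions:

```python
import numpy as np

def sampling_entropy(visits):
    """Normalized sampling entropy over bin visit counts: 1.0 for perfectly
    uniform sampling of the thermodynamic states, lower when sampling is
    concentrated in a few bins."""
    p = np.asarray(visits, float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(visits)))
```

Monitoring this value during the biasing-potential updates gives a single scalar that approaches 1 as sampling becomes uniform, which is how it can serve as a convergence indicator.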
Material-based figure of merit for caloric materials
Griffith, L. D.; Mudryk, Y.; Slaughter, J.; ...
2018-01-21
Efficient use of reversible thermal effects in magnetocaloric, electrocaloric, and elastocaloric materials is a promising avenue that can lead to a substantially increased efficiency of refrigeration and heat pumping devices, most importantly those used in household and commercial cooling applications near ambient temperature. A proliferation in caloric materials research has resulted in a wide array of materials where only the isothermal change in entropy in response to a handful of different field strengths over a limited range of temperatures has been evaluated and reported. Given the abundance of such data, there is a clear need for a simple and reliable figure of merit enabling fast screening and down-selection to justify further detailed characterization of those materials systems that hold the greatest promise. Based on the analysis of several well-known materials that exhibit vastly different magnetocaloric effects, the Temperature averaged Entropy Change (TEC) is introduced as a suitable early indicator of the material's utility for magnetocaloric cooling applications, and its adoption by the caloric community is recommended.
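A temperature-averaged entropy change reduces to the best windowed average of measured ΔS(T); the sketch below assumes a uniform temperature grid and illustrative units, and is a generic reading of the idea rather than the paper's exact definition:

```python
import numpy as np

def tec(T, dS, lift):
    """Temperature-averaged entropy change: the largest average of dS(T)
    over any window of width `lift` kelvin on an equally spaced grid T."""
    T = np.asarray(T, float)
    dS = np.asarray(dS, float)
    step = T[1] - T[0]
    w = max(1, int(round(lift / step)))          # grid points per window
    return max(dS[i:i + w].mean() for i in range(len(dS) - w + 1))
```

A sharply peaked ΔS scores well at small temperature lifts but poorly at large ones, which is exactly the screening behavior a single-field-strength peak value cannot capture.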
Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks
Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.
2011-01-01
Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
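The iterative selection of environmentally dissimilar sites can be approximated with a greedy farthest-point rule in standardized environment space. This stand-in omits the MaxEnt modeling step the authors use for ranking; it only illustrates the select-most-dissimilar loop:

```python
import numpy as np

def select_sites(env, n_sites):
    """Greedy sketch of iterative dissimilar-site selection: standardize the
    environmental variables (rows = candidate sites, columns = variables),
    seed with the site nearest the domain mean, then repeatedly add the
    candidate farthest in environment space from everything already chosen."""
    z = (env - env.mean(axis=0)) / env.std(axis=0)
    chosen = [int(np.argmin((z ** 2).sum(axis=1)))]   # site nearest centroid
    while len(chosen) < n_sites:
        # distance of every candidate to its nearest already-chosen site
        d = ((z[:, None, :] - z[chosen][None, :, :]) ** 2).sum(axis=-1).min(axis=1)
        chosen.append(int(np.argmax(d)))              # most dissimilar candidate
    return chosen
```

Because already-chosen sites have zero distance to themselves, each iteration necessarily picks a new site, spreading the sample across the environmental envelope.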
Exploring nonlocal observables in shock wave collisions
Ecker, Christian; Grumiller, Daniel; Stanzer, Philipp; ...
2016-11-09
In this paper, we study the time evolution of 2-point functions and entanglement entropy in strongly anisotropic, inhomogeneous and time-dependent N = 4 super Yang-Mills theory in the large N and large 't Hooft coupling limit using AdS/CFT. On the gravity side this amounts to calculating the length of geodesics and the area of extremal surfaces in the dynamical background of two colliding gravitational shockwaves, which we do numerically. We discriminate between three classes of initial conditions corresponding to wide, intermediate and narrow shocks, and show that they exhibit different phenomenology with respect to the nonlocal observables that we determine. Our results permit the use of (holographic) entanglement entropy as an order parameter to distinguish between the two phases of the crossover from the transparency to the full-stopping scenario in dynamical Yang-Mills plasma formation, which is frequently used as a toy model for heavy-ion collisions. The time evolution of entanglement entropy allows one to discern four regimes: highly efficient initial growth of entanglement, linear growth, (post-)collisional drama and late-time (polynomial) fall-off. Surprisingly, we found that 2-point functions can be sensitive to the geometry inside the black hole apparent horizon, while we did not find such cases for the entanglement entropy.
Thermodynamics of an ideal generalized gas: II. Means of order alpha.
Lavenda, B H
2005-11-01
The property that power means are monotonically increasing functions of their order is shown to be the basis of the second laws not only for processes involving heat conduction, but also for processes involving deformations. This generalizes earlier work involving only pure heat conduction and underlines the incomparability of the internal energy and adiabatic potentials when expressed as powers of the adiabatic variable. In an L-potential equilibration, the final state will be one of maximum entropy, whereas in an entropy equilibration, the final state will be one of minimum L. Unlike classical equilibrium thermodynamic phase space, which lacks an intrinsic metric structure insofar as distances and other geometrical concepts do not have an intrinsic thermodynamic significance in such spaces, a metric space can be constructed for the power means: the distance between means of different order is related to the Carnot efficiency. In the ideal classical gas limit, the average change in the entropy is shown to be proportional to the difference between the Shannon and Rényi entropies for nonextensive systems that are multifractal in nature. The L potential, like the internal energy, is a Schur convex function of the empirical temperature, which satisfies Jensen's inequality, and serves as a measure of the tendency to uniformity in processes involving pure thermal conduction.
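The power means whose monotonicity in the order drives the argument are easy to state concretely; a minimal sketch, with the alpha → 0 case handled as the geometric-mean limit:

```python
from math import exp, log

def power_mean(x, alpha):
    """Mean of order alpha: M_alpha(x) = (sum(x_i**alpha)/n)**(1/alpha),
    with the alpha -> 0 limit taken as the geometric mean. The second-law
    statements above rest on M_alpha being nondecreasing in alpha."""
    if alpha == 0:
        return exp(sum(log(v) for v in x) / len(x))
    return (sum(v ** alpha for v in x) / len(x)) ** (1 / alpha)
```

For any positive data, harmonic (alpha = -1), geometric (0), arithmetic (1) and quadratic (2) means form a nondecreasing chain, the elementary fact generalized in the paper.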
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
NASA Astrophysics Data System (ADS)
Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie
2008-06-01
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. 
(iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
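The plug-in method's word-length trade-off noted in (iv) can be seen directly in code: a periodic sequence looks maximally random at short word lengths and reveals its structure only as the word length grows. A minimal sketch of the estimator:

```python
from math import log
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in estimate of the entropy rate (bits per symbol): empirical
    Shannon entropy of overlapping words of length `word_len`, divided by
    `word_len`. Badly undersampled for long words, as the study notes."""
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    counts = Counter(words)
    n = len(words)
    h = -sum(c / n * log(c / n, 2) for c in counts.values())
    return h / word_len
```

The period-4 sequence 0,1,1,0,... has true entropy rate 0, yet the plug-in estimate reads 1.0 bit/symbol at word length 2 and only drops to 0.5 at word length 4, illustrating why short words miss longer-range structure.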
NASA Astrophysics Data System (ADS)
Zhao, Liang; Adhikari, Avishek; Sakurai, Kouichi
Watermarking is one of the most effective techniques for copyright protection and information hiding, and it can be applied in many fields of our society. Nowadays, some image scrambling schemes are used as one part of a watermarking algorithm to enhance security. Therefore, how to select an image scrambling scheme, and what kind of image scrambling scheme may be used for watermarking, are key problems. An evaluation method for image scrambling schemes can serve as a test tool for exposing the properties or flaws of an image scrambling method. In this paper, a new scrambling evaluation system based on spatial distribution entropy and the centroid difference of bit-planes is presented to obtain the scrambling degree of image scrambling schemes. Our scheme is illustrated and justified through computer simulations. The experimental results show (in Figs. 6 and 7) that, for a general gray-scale image, the evaluation degree of the corresponding cipher image computed from the first 4 significant bit-planes is nearly the same as that computed from all 8 bit-planes. Consequently, instead of taking all 8 bit-planes of a gray-scale image, it is sufficient to take only the first 4 significant bit-planes to find the scrambling degree. This 50% reduction in computational cost makes our scheme efficient.
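The bit-plane selection at the heart of the cost saving can be sketched in plain NumPy (the spatial-distribution-entropy evaluation itself is omitted here):

```python
import numpy as np

def bit_planes(img, n_planes=4):
    """Extract the `n_planes` most significant bit-planes of an 8-bit
    gray-scale image, most significant first. Evaluating only the top
    four planes halves the work relative to using all eight."""
    img = np.asarray(img, np.uint8)
    return [(img >> b) & 1 for b in range(7, 7 - n_planes, -1)]
```

Each returned plane is a binary image of the same shape; the evaluation system would then score the spatial distribution of each plane's set bits.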
Zhang, Dandan; Jia, Xiaofeng; Ding, Haiyan; Ye, Datian; Thakor, Nitish V.
2011-01-01
Burst suppression (BS) activity in EEG is clinically accepted as a marker of brain dysfunction or injury. Experimental studies in a rodent model of brain injury following asphyxial cardiac arrest (CA) show evidence of BS soon after resuscitation, appearing as a transitional recovery pattern between isoelectricity and continuous EEG. The EEG trends in such experiments suggest varying levels of uncertainty or randomness in the signals. To quantify the EEG data, Shannon entropy and Tsallis entropy (TsEn) are examined. More specifically, an entropy-based measure named TsEn area (TsEnA) is proposed to reveal the presence and the extent of development of BS following brain injury. The methodology of TsEnA and the selection of its parameter are elucidated in detail. To test the validity of this measure, 15 rats were subjected to 7 or 9 min of asphyxial CA. EEG recordings immediately after resuscitation from CA were investigated and characterized by TsEnA. The results show that TsEnA correlates well with the outcome assessed by evaluating the rodents after the experiments using a well-established neurological deficit score (Pearson correlation = 0.86, p ≪ 0.01). This research shows that TsEnA reliably quantifies the complex dynamics in BS EEG, and may be useful as an experimental or clinical tool for objective estimation of the gravity of brain damage after CA. PMID:19695982
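Tsallis entropy, on which the TsEnA measure is built, generalizes Shannon entropy with a parameter q. A minimal sketch over an amplitude histogram (the bin count and q value here are illustrative; the paper's TsEnA additionally accumulates TsEn over time):

```python
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=16):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of the
    amplitude histogram of a signal segment (q != 1; the limit
    q -> 1 recovers Shannon entropy in nats)."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))
```

Computing this over sliding windows and integrating yields an area-style measure in the spirit of TsEnA: isoelectric segments contribute near zero, while bursty segments contribute substantially.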
Coherent entropy induced and acoustic noise separation in compact nozzles
NASA Astrophysics Data System (ADS)
Tao, Wenjie; Schuller, Thierry; Huet, Maxime; Richecoeur, Franck
2017-04-01
A method to separate entropy-induced noise from an acoustic pressure wave in a harmonically perturbed flow through a nozzle is presented. It is tested on an original experimental setup that simultaneously generates acoustic and temperature fluctuations in an air flow accelerated by a convergent nozzle. The setup mimics the direct and indirect noise contributions to the acoustic pressure field in a confined combustion chamber by producing synchronized acoustic and temperature fluctuations, without dealing with the complexity of the combustion process. It can generate temperature fluctuations with amplitudes up to 10 K in the frequency range from 10 to 100 Hz. The noise separation technique uses experiments with and without temperature fluctuations to determine the relative levels of acoustic and entropy fluctuations in the system and to identify the nozzle response to these forcing waves. It requires multi-point measurements of acoustic pressure and temperature. The separation method is first validated against direct numerical simulations of the nonlinear Euler equations. These simulations are used to investigate the conditions under which the separation technique is valid, and they yield trends similar to those of the experiments for the investigated flow operating conditions. The separation method then successfully retrieves the acoustic reflection coefficient, but does not recover the entropy reflection coefficient predicted by compact nozzle theory, owing to the method's sensitivity to signal noise under the explored experimental conditions. This methodology provides a framework for the experimental investigation of direct and indirect combustion noise originating from synchronized perturbations.
High-entropy alloys in hexagonal close-packed structure
Gao, Michael C.; Zhang, B.; Guo, S. M.; ...
2015-08-28
The microstructures and properties of high-entropy alloys (HEAs) based on the face-centered cubic and body-centered cubic structures have been studied extensively in the literature, but reports on HEAs with the hexagonal close-packed (HCP) structure are very limited. Using an efficient strategy that combines phase diagram inspection, CALPHAD modeling, and ab initio molecular dynamics simulations, a variety of new compositions are suggested that may hold great potential for forming single-phase HCP HEAs, based on rare-earth elements and on transition metals, respectively. Lastly, experimental verification was carried out on CoFeReRu and CoReRuV using X-ray diffraction, scanning electron microscopy, and energy dispersive spectroscopy.
NASA Astrophysics Data System (ADS)
Obuchi, Tomoyuki; Monasson, Rémi
2015-09-01
The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by measurements of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case, that of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
Psychophysical experiments on the PicHunter image retrieval system
NASA Astrophysics Data System (ADS)
Papathomas, Thomas V.; Cox, Ingemar J.; Yianilos, Peter N.; Miller, Matt L.; Minka, Thomas P.; Conway, Tiffany E.; Ghosn, Joumana
2001-01-01
Psychophysical experiments were conducted on PicHunter, a content-based image retrieval (CBIR) experimental prototype with the following properties: (1) Based on a model of how users respond, it uses Bayes's rule to predict what target users want, given their actions. (2) It possesses an extremely simple user interface. (3) It employs an entropy-based scheme to improve convergence. (4) It introduces a paradigm for assessing the performance of CBIR systems. Experiments 1-3 studied human judgment of image similarity to obtain data for the model. Experiment 4 studied the importance of using: (a) semantic information, (b) memory of earlier input, and (c) relative and absolute judgments of similarity. Experiment 5 tested an approach that we propose for comparing performances of CBIR systems objectively. Finally, experiment 6 evaluated the most informative display-updating scheme that is based on entropy minimization, and confirmed earlier simulation results. These experiments represent one of the first attempts to quantify CBIR performance based on psychophysical studies, and they provide valuable data for improving CBIR algorithms. Even though they were designed with PicHunter in mind, their results can be applied to any CBIR system and, more generally, to any system that involves judgment of image similarity by humans.
Using constrained information entropy to detect rare adverse drug reactions from medical forums.
Yi Zheng; Chaowang Lan; Hui Peng; Jinyan Li
2016-08-01
Adverse drug reaction (ADR) detection is critical for avoiding malpractice, yet challenging because of uncertainty in pre-marketing review and underreporting in post-marketing surveillance. To overcome this predicament, social-media-based ADR detection methods have been proposed recently. However, existing approaches are mostly co-occurrence-based and face several issues; in particular, they leave out rare ADRs and cannot distinguish irrelevant ADRs. In this work, we introduce a constrained information entropy (CIE) method to solve these problems. CIE first recognizes drug-related adverse reactions using a predefined keyword dictionary and then captures high- and low-frequency (rare) ADRs by information entropy. Extensive experiments on a medical forum dataset demonstrate that CIE outperforms the state-of-the-art co-occurrence-based methods, especially in rare ADR detection.
Gu, Zhi-rong; Wang, Ya-li; Sun, Yu-jing; Dind, Jun-xia
2014-09-01
To investigate the establishment and application of an entropy-weight TOPSIS model for the comprehensive quality evaluation of traditional Chinese medicine, with Angelica sinensis grown in Gansu Province as an example. The contents of ferulic acid, 3-butylphthalide, Z-butylidenephthalide, Z-ligustilide, linolic acid, volatile oil, and ethanol-soluble extractive were used as the evaluation index set. The weight of each evaluation index was determined by the information entropy method. The entropy-weight TOPSIS model was then used to comprehensively evaluate the quality of Angelica sinensis from Gansu Province by Euclidean closeness degree. The results from the established model were in line with the daodi meaning and with clinical experience. The established model is simple to calculate, objective, and reliable, and can be applied to the comprehensive quality evaluation of traditional Chinese medicine.
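The entropy-weight step described above (weights from information entropy, followed by TOPSIS ranking via closeness degree) can be sketched generically; the decision matrix and normalization below are illustrations, not the authors' exact data handling:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an m-samples x n-criteria matrix of positive
    scores: a criterion whose values vary more across samples has lower
    entropy and therefore receives a larger weight."""
    P = X / X.sum(axis=0)                      # column-wise proportions
    m = X.shape[0]
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(m)    # per-criterion entropy in [0, 1]
    d = 1.0 - E                                # degree of diversification
    return d / d.sum()
```

In TOPSIS the weighted, normalized matrix is then compared against the ideal and anti-ideal alternatives, and each sample is ranked by its Euclidean closeness degree to the ideal.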
First-order irreversible thermodynamic approach to a simple energy converter
NASA Astrophysics Data System (ADS)
Arias-Hernandez, L. A.; Angulo-Brown, F.; Paez-Hernandez, R. T.
2008-01-01
Several authors have shown that dissipative thermal cycle models based on finite-time thermodynamics exhibit loop-shaped curves of power output versus efficiency, as occurs with actual dissipative thermal engines. Within the context of first-order irreversible thermodynamics (FOIT), in this work we show that for an energy converter consisting of two coupled fluxes it is also possible to find loop-shaped curves of both power output and the so-called ecological function versus efficiency. In a previous work, Stucki [J. W. Stucki, Eur. J. Biochem. 109, 269 (1980)] used a FOIT approach to describe the modes of thermodynamic performance of oxidative phosphorylation involved in adenosine triphosphate (ATP) synthesis within mitochondria. In that work the author did not use the mentioned loop-shaped curves; he proposed that oxidative phosphorylation operates in a steady state at both minimum entropy production and maximum efficiency simultaneously, by means of a conductance matching condition between extreme states of zero and infinite conductance, respectively. In the present work we show that all of Stucki's results on the energetics of oxidative phosphorylation can be obtained without the so-called conductance matching condition. On the other hand, we also show that the minimum entropy production state implies both zero power output and zero efficiency, and therefore this state is not fulfilled by the performance of oxidative phosphorylation. Our results suggest that actual efficiency values of oxidative phosphorylation are better described by a mode of operation consisting of the simultaneous maximization of both the so-called ecological function and the efficiency.
Quan, H T
2014-06-01
We study the maximum efficiency of a heat engine based on a small system. It is revealed that, due to the finiteness of the system, irreversibility may arise when the working substance comes into contact with a heat reservoir. As a result, there is a working-substance-dependent correction to the Carnot efficiency. We derive a general and simple expression for the maximum efficiency of a Carnot cycle heat engine in terms of the relative entropy. This maximum efficiency approaches the Carnot efficiency asymptotically as the size of the working substance increases toward the thermodynamic limit. Our study extends Carnot's result of the maximum efficiency to an arbitrary working substance and elucidates the subtlety of thermodynamic laws in small systems.
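The abstract does not reproduce the maximum-efficiency expression itself, but the quantity it is written in terms of — the quantum relative entropy S(ρ‖σ) = Tr[ρ(ln ρ − ln σ)] — can be computed directly. A small numerical sketch (the eigenvalue clipping is an illustrative regularization of zero eigenvalues):

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """S(rho||sigma) = Tr[rho (ln rho - ln sigma)] for density matrices,
    in nats, via eigendecomposition of each Hermitian matrix."""
    def mat_log(m):
        vals, vecs = np.linalg.eigh(m)
        vals = np.clip(vals, 1e-300, None)   # regularize zero eigenvalues
        return vecs @ np.diag(np.log(vals)) @ vecs.conj().T
    return float(np.real(np.trace(rho @ (mat_log(rho) - mat_log(sigma)))))
```

For a pure state ρ compared against the maximally mixed state of dimension d, this returns ln d, consistent with relative entropy measuring distinguishability from equilibrium.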
Normal Mode Analysis in Zeolites: Toward an Efficient Calculation of Adsorption Entropies.
De Moor, Bart A; Ghysels, An; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B
2011-04-12
An efficient procedure for normal-mode analysis of extended systems, such as zeolites, is developed and illustrated for the physisorption and chemisorption of n-octane and isobutene in H-ZSM-22 and H-FAU using periodic DFT calculations employing the Vienna Ab Initio Simulation Package. Physisorption and chemisorption entropies resulting from partial Hessian vibrational analysis (PHVA) differ at most 10 J mol(-1) K(-1) from those resulting from full Hessian vibrational analysis, even for PHVA schemes in which only a very limited number of atoms are considered free. To acquire a well-conditioned Hessian, much tighter optimization criteria than commonly used for electronic energy calculations in zeolites are required, i.e., at least an energy cutoff of 400 eV, maximum force of 0.02 eV/Å, and self-consistent field loop convergence criteria of 10(-8) eV. For loosely bonded complexes the mobile adsorbate method is applied, in which frequency contributions originating from translational or rotational motions of the adsorbate are removed from the total partition function and replaced by free translational and/or rotational contributions. The frequencies corresponding with these translational and rotational modes can be selected unambiguously based on a mobile block Hessian-PHVA calculation, allowing the prediction of physisorption entropies within an accuracy of 10-15 J mol(-1) K(-1) as compared to experimental values. The approach presented in this study is useful for studies on other extended catalytic systems.
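The sensitivity of adsorption entropies to low-frequency modes follows directly from the harmonic-oscillator partition function underlying such normal-mode analyses. A generic sketch of the standard formula (not the paper's PHVA or mobile-adsorbate machinery):

```python
import numpy as np

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e10     # speed of light, cm/s
KB = 1.380649e-23     # Boltzmann constant, J/K
R = 8.314462618       # gas constant, J/(mol K)

def harmonic_vibrational_entropy(wavenumbers_cm, T=300.0):
    """Vibrational entropy (J mol^-1 K^-1) from harmonic frequencies in
    cm^-1: S/R = sum_i [ x_i/(e^{x_i}-1) - ln(1 - e^{-x_i}) ],
    with x_i = h c nu_i / (k_B T)."""
    x = H * C * np.asarray(wavenumbers_cm, dtype=float) / (KB * T)
    return float(R * np.sum(x / np.expm1(x) - np.log(-np.expm1(-x))))
```

A single 10 cm⁻¹ mode contributes tens of J mol⁻¹ K⁻¹ while a 3000 cm⁻¹ stretch contributes essentially nothing, which is why replacing the soft modes of a loosely bound adsorbate by free translations and rotations changes the computed entropy so strongly.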
Gas chemical adsorption characterization of lanthanide hexafluoroacetylacetonates
Stratz, S. Adam; Jones, Steven J.; Mullen, Austin D.; ...
2017-03-21
Newly established adsorption enthalpy and entropy values of 12 lanthanide hexafluoroacetylacetonates, denoted Ln[hfac]4, along with the experimental and theoretical methodology used to obtain these values, are presented for the first time. The results of this work can be used in conjunction with theoretical modeling techniques to optimize a large-scale gas-phase separation experiment using isothermal chromatography. The results to date indicate average adsorption enthalpy and entropy values of the 12 Ln[hfac]4 complexes ranging from -33 to -139 kJ/mol and from -299 to -557 J/(mol K), respectively.
Bilayer graphene phonovoltaic-FET: In situ phonon recycling
NASA Astrophysics Data System (ADS)
Melnick, Corey; Kaviany, Massoud
2017-11-01
A new heat harvester, the phonovoltaic (pV) cell, was recently proposed. The device converts optical phonons into power before they become heat. Due to the low entropy of a typical hot optical phonon population, the phonovoltaic can operate at high fractions of the Carnot limit and harvest heat more efficiently than conventional heat harvesting technologies such as the thermoelectric generator. Previously, the optical phonon source was presumed to produce optical phonons with a single polarization and momentum. Here, we examine a realistic optical phonon source in a potential pV application and the effects this has on pV operation. Supplementing this work is our investigation of bilayer graphene as a new pV material. Our ab initio calculations show that bilayer graphene has a figure of merit exceeding 0.9, well above previously investigated materials. This allows a room-temperature pV to recycle 65% of a highly nonequilibrium, minimum entropy population of phonons. However, full-band Monte Carlo simulations of the electron and phonon dynamics in a bilayer graphene field-effect transistor (FET) show that the optical phonons emitted by field-accelerated electrons can only be recycled in situ with an efficiency of 50%, and this efficiency falls as the field strength grows. Still, an appropriately designed FET-pV can recycle the phonons produced therein in situ with a much higher efficiency than a thermoelectric generator can harvest heat produced by a FET ex situ.
Mutual Information Item Selection in Adaptive Classification Testing
ERIC Educational Resources Information Center
Weissman, Alexander
2007-01-01
A general approach for item selection in adaptive multiple-category classification tests is provided. The approach uses mutual information (MI), a special case of the Kullback-Leibler distance, or relative entropy. MI works efficiently with the sequential probability ratio test and alleviates the difficulties encountered with using other local-…
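Mutual information itself is straightforward to compute from a joint distribution; in item selection one would evaluate it for each candidate item's predicted joint distribution of response and classification and pick the maximizer. A minimal sketch (the joint table is a generic stand-in):

```python
from math import log2

def mutual_information(joint):
    """MI in bits from a joint probability table p(x, y), given as a
    list of rows; equals the Kullback-Leibler distance between the
    joint and the product of its marginals."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)
```

An item whose response is independent of the examinee's classification has MI of zero and is uninformative; an item whose response determines the classification attains the maximum.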
An adaptive technique to maximize lossless image data compression of satellite images
NASA Technical Reports Server (NTRS)
Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe
1994-01-01
Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which combines image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower-entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reduction are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques on regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost-effectively and at acceptable performance rates with a combination of techniques which are selected based on image contextual information.
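The segmentation step — grouping regions of similar entropy before choosing a coder — rests on the first-order entropy of each region's histogram. A sketch of the per-block measurement (the block size and dictionary return type are illustrative choices):

```python
import numpy as np

def block_entropies(img, block=16):
    """First-order (histogram) entropy, in bits/pixel, of each
    non-overlapping block of an 8-bit image; blocks of similar entropy
    can then be grouped and coded with the best-suited technique."""
    h, w = img.shape
    out = {}
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            tile = img[i:i + block, j:j + block]
            counts = np.bincount(tile.ravel(), minlength=256)
            p = counts[counts > 0] / tile.size
            out[(i, j)] = float(-(p * np.log2(p)).sum())
    return out
```

The per-block entropy is also a lower bound on the achievable lossless rate for a memoryless model of that block, which is what makes it a natural criterion for selecting a coder per region.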
Predictive uncertainty in auditory sequence processing
Hansen, Niels Chr.; Pearce, Marcus T.
2014-01-01
Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
Signatures of Solvation Thermodynamics in Spectra of Intermolecular Vibrations
2017-01-01
This study explores the thermodynamic and vibrational properties of water in the three-dimensional environment of solvated ions and small molecules using molecular simulations. The spectrum of intermolecular vibrations in liquid solvents provides detailed information on the shape of the local potential energy surface, which in turn determines local thermodynamic properties such as the entropy. Here, we extract this information using a spatially resolved extension of the two-phase thermodynamics method to estimate hydration water entropies based on the local vibrational density of states (3D-2PT). Combined with an analysis of solute–water and water–water interaction energies, this allows us to resolve local contributions to the solvation enthalpy, entropy, and free energy. We use this approach to study effects of ions on their surrounding water hydrogen bond network, its spectrum of intermolecular vibrations, and resulting thermodynamic properties. In the three-dimensional environment of polar and nonpolar functional groups of molecular solutes, we identify distinct hydration water species and classify them by their characteristic vibrational density of states and molecular entropies. In each case, we are able to assign variations in local hydration water entropies to specific changes in the spectrum of intermolecular vibrations. This provides an important link for the thermodynamic interpretation of vibrational spectra that are accessible to far-infrared absorption and Raman spectroscopy experiments. Our analysis provides unique microscopic details regarding the hydration of hydrophobic and hydrophilic functional groups, which enable us to identify interactions and molecular degrees of freedom that determine relevant contributions to the solvation entropy and consequently the free energy. PMID:28783431
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urniezius, Renaldas
2011-03-14
The principle of Maximum relative Entropy optimization was analyzed for dead-reckoning localization of a rigid body from the observation data of two attached accelerometers. Model constraints were derived from the relationships between the sensors. The experimental results confirmed that the noise on each accelerometer axis can be successfully filtered by exploiting the dependency between channels and the dependency within the time-series data: dependency between channels was used for the a priori calculation, and the a posteriori distribution was derived using dependency within the time series. Data from an autocalibration experiment were then revisited, removing the initial assumption that the instantaneous rotation axis of the rigid body was known. Performance results confirmed that such an approach can be used for online dead-reckoning localization.
An Instructive Model of Entropy
ERIC Educational Resources Information Center
Zimmerman, Seth
2010-01-01
This article first notes the misinterpretation of a common thought experiment, and the misleading comment that "systems tend to flow from less probable to more probable macrostates". It analyses the experiment, generalizes it and introduces a new tool of investigation, the simplectic structure. A time-symmetric model is built upon this structure,…
Unsteady specific work and isentropic efficiency of a radial turbine driven by pulsed detonations
NASA Astrophysics Data System (ADS)
Rouser, Kurt P.
There has been longstanding government and industry interest in pressure-gain combustion for use in Brayton cycle based engines. Theoretically, pressure-gain combustion allows heat addition with a reduced entropy rise. The pulsed detonation combustor (PDC) is a device that can provide such pressure-gain combustion and possibly replace typical steady deflagration combustors. The PDC is inherently unsteady, however, and comparisons with conventional steady deflagration combustors must be based upon time-integrated performance variables. In this study, the radial turbine of a Garrett automotive turbocharger was coupled directly to a PDC and driven at full admission in experiments fueled by hydrogen or ethylene. Data included pulsed-cycle time histories of turbine inlet and exit temperature, pressure, velocity, mass flow, and enthalpy. The unsteady inlet flowfield showed momentary reverse flow, and thus unsteady accumulation and expulsion of mass and enthalpy within the device. The coupled turbine-driven compressor provided a time-resolved measure of turbine power. Peak power increased with PDC fill fraction, and duty cycle increased with PDC frequency. Cycle-averaged unsteady specific work increased with fill fraction and frequency. An unsteady turbine efficiency formulation is proposed, including heat transfer effects, an enthalpy-flux-weighted total pressure ratio, and ensemble averaging over multiple cycles. Turbine efficiency increased with frequency but was lower than the manufacturer-reported conventional steady turbine efficiency.
Approximate reversibility in the context of entropy gain, information gain, and complete positivity
NASA Astrophysics Data System (ADS)
Buscemi, Francesco; Das, Siddhartha; Wilde, Mark M.
2016-06-01
There are several inequalities in physics which limit how well we can process physical systems to achieve some intended goal, including the second law of thermodynamics, entropy bounds in quantum information theory, and the uncertainty principle of quantum mechanics. Recent results provide physically meaningful enhancements of these limiting statements, determining how well one can attempt to reverse an irreversible process. In this paper, we apply and extend these results to give strong enhancements to several entropy inequalities, having to do with entropy gain, information gain, entropic disturbance, and complete positivity of open quantum systems dynamics. Our first result is a remainder term for the entropy gain of a quantum channel. This result implies that a small increase in entropy under the action of a subunital channel is a witness to the fact that the channel's adjoint can be used as a recovery map to undo the action of the original channel. We apply this result to pure-loss, quantum-limited amplifier, and phase-insensitive quantum Gaussian channels, showing how a quantum-limited amplifier can serve as a recovery from a pure-loss channel and vice versa. Our second result regards the information gain of a quantum measurement, both without and with quantum side information. We find here that a small information gain implies that it is possible to undo the action of the original measurement if it is efficient. The result also has operational ramifications for the information-theoretic tasks known as measurement compression without and with quantum side information. Our third result shows that the loss of Holevo information caused by the action of a noisy channel on an input ensemble of quantum states is small if and only if the noise can be approximately corrected on average. 
We finally establish that the reduced dynamics of a system-environment interaction are approximately completely positive and trace preserving if and only if the data processing inequality holds approximately.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee
2007-08-15
We have previously presented a knowledge-based computer-assisted detection (KB-CADe) system for the detection of mammographic masses. The system is designed to compare a query mammographic region with mammographic templates of known ground truth. The templates are stored in an adaptive knowledge database. Image similarity is assessed with information-theoretic measures (e.g., mutual information) derived directly from the image histograms. A previous study suggested that the diagnostic performance of the system steadily improves as the knowledge database is initially enriched with more templates. However, as the database increases in size, an exhaustive comparison of the query case with each stored template becomes computationally burdensome. Furthermore, blind storing of new templates may result in redundancies that do not necessarily improve diagnostic performance. To address these concerns we investigated an entropy-based indexing scheme for improving the speed of analysis and for satisfying database storage restrictions without compromising the overall diagnostic performance of our KB-CADe system. The indexing scheme was evaluated on two different datasets as (i) a search mechanism to sort through the knowledge database, and (ii) a selection mechanism to build a smaller, concise knowledge database that is easier to maintain but still effective. There were two important findings in the study. First, entropy-based indexing is an effective strategy for quickly identifying the subset of templates most relevant to a given query; only this subset then needs to be analyzed in detail using mutual information for optimized decision making regarding the query. Second, a selective entropy-based deposit strategy may be preferable, where only high-entropy cases are maintained in the knowledge database. Overall, the proposed entropy-based indexing scheme was shown to reduce the computational cost of our KB-CADe system by 55% to 80% while maintaining the system's diagnostic performance.
Entropy, non-linearity and hierarchy in ecosystems
NASA Astrophysics Data System (ADS)
Addiscott, T.
2009-04-01
Soil-plant systems are thermodynamically open systems because they exchange both energy and matter with their surroundings. Thus they are properly described by the second and third of the three stages of thermodynamics defined by Prigogine and Stengers (1984). The second stage describes a system in which the flow is linearly related to the force. Such a system tends towards a steady state in which entropy production is minimized, but this depends on the capacity of the system for self-organization. In a third-stage system, flow is non-linearly related to force, and the system can move far from equilibrium. Such a system maximizes entropy production, but in so doing facilitates self-organization. The second-stage system was suggested earlier to provide a useful analogue of the behaviour of natural and agricultural ecosystems subjected to perturbations, but it needs the capacity for self-organization. Considering an ecosystem as a hierarchy suggests this capacity is provided by the soil population, which releases from dead plant matter the nutrients, such as nitrate, phosphate and cations, needed for the growth of new plants and the renewal of the whole ecosystem. This release of small molecules from macromolecules increases entropy, and the soil population maximizes entropy production by releasing nutrients and carbon dioxide as vigorously as conditions allow. In so doing it behaves as a third-stage thermodynamic system. Other authors (Schneider and Kay, 1994, 1995) consider that it is the plants in an ecosystem that maximize entropy production, mainly through transpiration, but studies of transpiration efficiency suggest that this is questionable. Prigogine, I. & Stengers, I. 1984. Order out of chaos. Bantam Books, Toronto. Schneider, E.D. & Kay, J.J. 1994. Life as a manifestation of the Second Law of Thermodynamics. Mathematical & Computer Modelling, 19, 25-48. Schneider, E.D. & Kay, J.J. 1995. Order from disorder: The Thermodynamics of Complexity in Biology. 
In: What is Life: the Next Fifty Years (eds. M.P. Murphy & L.A.J. O'Neill), pp. 161-172, Cambridge University Press, Cambridge.
Determination of the Latent Heats and Triple Point of Perfluorocyclobutane
ERIC Educational Resources Information Center
Briggs, A. G.; Strachan, A. N.
1977-01-01
Proposes the use of Perfluorocyclobutane in physical chemistry courses to conduct experiments on latent heat, triple point temperatures and pressures, boiling points, and entropy of vaporization. (SL)
Information theory and robotics meet to study predator-prey interactions
NASA Astrophysics Data System (ADS)
Neri, Daniele; Ruberto, Tommaso; Cord-Cruz, Gabrielle; Porfiri, Maurizio
2017-07-01
Transfer entropy holds promise to advance our understanding of animal behavior, by affording the identification of causal relationships that underlie animal interactions. A critical step toward the reliable implementation of this powerful information-theoretic concept entails the design of experiments in which causal relationships could be systematically controlled. Here, we put forward a robotics-based experimental approach to test the validity of transfer entropy in the study of predator-prey interactions. We investigate the behavioral response of zebrafish to a fear-evoking robotic stimulus, designed after the morpho-physiology of the red tiger oscar and actuated along preprogrammed trajectories. From the time series of the positions of the zebrafish and the robotic stimulus, we demonstrate that transfer entropy correctly identifies the influence of the stimulus on the focal subject. Building on this evidence, we apply transfer entropy to study the interactions between zebrafish and a live red tiger oscar. The analysis of transfer entropy reveals a change in the direction of the information flow, suggesting a mutual influence between the predator and the prey, where the predator adapts its strategy as a function of the movement of the prey, which, in turn, adjusts its escape as a function of the predator motion. Through the integration of information theory and robotics, this study posits a new approach to study predator-prey interactions in freshwater fish.
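The central quantity here, transfer entropy, can be sketched with a simple plug-in (histogram) estimator for binarized position series. This is a minimal illustration with history length 1, not the estimator actually used in the study; the function name and discretization are assumptions:

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits for binary series,
    with history length 1: I(target_{t+1}; source_t | target_t)."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    n = len(target) - 1
    p_xyz = {k: v / n for k, v in triples.items()}
    # accumulate the marginals needed for the conditional probabilities
    p_yz, p_y, p_xy = Counter(), Counter(), Counter()
    for (x1, y, z), p in p_xyz.items():
        p_yz[(y, z)] += p   # p(target_t, source_t)
        p_y[y] += p         # p(target_t)
        p_xy[(x1, y)] += p  # p(target_{t+1}, target_t)
    # TE = sum p(x1,y,z) * log[ p(x1|y,z) / p(x1|y) ]
    return sum(p * log2(p * p_y[y] / (p_yz[(y, z)] * p_xy[(x1, y)]))
               for (x1, y, z), p in p_xyz.items())
```

When the target is a delayed copy of the source, the estimate approaches the source's entropy rate; for independent series it stays near zero (up to a small positive estimation bias).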
Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T
2015-02-01
The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
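The Bandt-Pompe permutation entropy used above maps each window of d consecutive samples to its ordinal pattern and takes the Shannon entropy of the pattern frequencies. A minimal sketch (the normalization to [0, 1] and the default order of 3 are choices of this illustration, not of the paper):

```python
from collections import Counter
from math import log2, factorial

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy.

    0 = fully predictable ordinal structure, 1 = all patterns equiprobable.
    """
    # ordinal pattern of each window: the argsort of its `order` samples
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * log2(c / n) for c in patterns.values())
    return h / log2(factorial(order))  # divide by the maximum, log2(order!)
```

A monotone series uses a single pattern and scores 0, while white noise approaches 1, matching the placement of stochastic signals in the high-entropy corner of the CH plane.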
NASA Astrophysics Data System (ADS)
Cheung, Shao-Yong; Lee, Chieh-Han; Yu, Hwa-Lung
2017-04-01
Due to limited hydrogeological observation data and the high level of uncertainty within them, parameter estimation for groundwater models has been an important issue. There are many methods of parameter estimation; for example, the Kalman filter provides real-time calibration of parameters from measurements at groundwater monitoring wells, and related methods such as the Extended Kalman Filter and the Ensemble Kalman Filter are widely applied in groundwater research. However, Kalman filtering is limited to linear systems. This study proposes a novel method, Bayesian Maximum Entropy Filtering, which accounts for the uncertainty of the data during parameter estimation. With these two methods, parameters can be estimated from hard (certain) and soft (uncertain) data at the same time. We use Python and QGIS with the MODFLOW groundwater model and implement both the Extended Kalman Filter and Bayesian Maximum Entropy Filtering in Python for parameter estimation, providing a conventional filtering method while also accounting for data uncertainty. A numerical model experiment was conducted in which the Bayesian maximum entropy filter was combined with a hypothetical MODFLOW groundwater model, and virtual observation wells were used to sample the simulated groundwater periodically. The results showed that, by considering the uncertainty of the data, the Bayesian maximum entropy filter provides good real-time parameter estimation.
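The Kalman-filter baseline that the study builds on can be illustrated with a single scalar measurement update. This is a generic textbook sketch, not the Extended Kalman Filter or the Bayesian Maximum Entropy implementation described above:

```python
def kalman_update(x_est, p_est, z, r):
    """One scalar Kalman measurement update.

    Fuses a prior estimate (mean x_est, variance p_est) with an
    observation z of measurement variance r.
    """
    k = p_est / (p_est + r)          # Kalman gain: trust data vs. prior
    x_new = x_est + k * (z - x_est)  # corrected estimate
    p_new = (1 - k) * p_est          # reduced uncertainty
    return x_new, p_new
```

Repeated updates from a monitoring well with a stable reading drive the parameter estimate toward the observed value while the posterior variance shrinks; nonlinearity of the groundwater model is what forces the move to extended or entropy-based variants.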
Information efficiency in visual communication
NASA Astrophysics Data System (ADS)
Alter-Gartenberg, Rachel; Rahman, Zia-ur
1993-08-01
This paper evaluates the quantization process in the context of the end-to-end performance of the visual-communication channel. Results show that the trade-off between data transmission and visual quality revolves around the information in the acquired signal, not around its energy. Improved information efficiency is gained by frequency dependent quantization that maintains the information capacity of the channel and reduces the entropy of the encoded signal. Restorations with energy bit-allocation lose both in sharpness and clarity relative to restorations with information bit-allocation. Thus, quantization with information bit-allocation is preferred for high information efficiency and visual quality in optimized visual communication.
Serotonergic Psychedelics Temporarily Modify Information Transfer in Humans
Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miquel Àngel
2015-01-01
Background: Psychedelics induce intense modifications in the sensorium, the sense of “self,” and the experience of reality. Despite advances in our understanding of the molecular and cellular level mechanisms of these drugs, knowledge of their actions on global brain dynamics is still incomplete. Recent imaging studies have found changes in functional coupling between frontal and parietal brain structures, suggesting a modification in information flow between brain regions during acute effects. Methods: Here we assessed the psychedelic-induced changes in directionality of information flow during the acute effects of a psychedelic in humans. We measured modifications in connectivity of brain oscillations using transfer entropy, a nonlinear measure of directed functional connectivity based on information theory. Ten healthy male volunteers with prior experience with psychedelics participated in 2 experimental sessions. They received a placebo or a dose of ayahuasca, a psychedelic preparation containing the serotonergic 5-HT2A agonist N,N-dimethyltryptamine. Results: The analysis showed significant changes in the coupling of brain oscillations between anterior and posterior recording sites. Transfer entropy analysis showed that frontal sources decreased their influence over central, parietal, and occipital sites. Conversely, sources in posterior locations increased their influence over signals measured at anterior locations. Exploratory correlations found that anterior-to-posterior transfer entropy decreases were correlated with the intensity of subjective effects, while the imbalance between anterior-to-posterior and posterior-to-anterior transfer entropy correlated with the degree of incapacitation experienced. Conclusions: These results suggest that psychedelics induce a temporary disruption of neural hierarchies by reducing top-down control and increasing bottom-up information transfer in the human brain. PMID:25820842
NASA Astrophysics Data System (ADS)
Prot, Olivier; SantolíK, OndřEj; Trotignon, Jean-Gabriel; Deferaudy, Hervé
2006-06-01
An entropy regularization algorithm (ERA) has been developed to compute the wave-energy density from electromagnetic field measurements. It is based on the wave distribution function (WDF) concept. To assess its suitability and efficiency, the algorithm is applied to experimental data that have already been analyzed using other inversion techniques. The FREJA satellite data used consist of six spectral matrices corresponding to six time-frequency points of an ELF hiss-event spectrogram. The WDF analysis is performed on these six points and the results are compared with those obtained previously. A statistical stability analysis confirms the stability of the solutions. The WDF computation is fast and requires no prespecified parameters. The regularization parameter has been chosen in accordance with Morozov's discrepancy principle. The Generalized Cross Validation and L-curve criteria are then tentatively used to provide a fully data-driven method; however, these criteria fail to determine a suitable value of the regularization parameter. Although the entropy regularization leads to solutions that agree fairly well with those already published, some differences are observed, and these are discussed in detail. The main advantage of the ERA is that it returns the WDF exhibiting the largest entropy and avoids the use of a priori models, which sometimes seem to be more accurate but without any justification.
Thermodynamics of concentrated solid solution alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Michael C.; Zhang, C.; Gao, P.
This study reviews the three main approaches for predicting the formation of concentrated solid solution alloys (CSSA) and for modeling their thermodynamic properties, in particular, utilizing the methodologies of empirical thermo-physical parameters, the CALPHAD method, and first-principles calculations combined with hybrid Monte Carlo/Molecular Dynamics (MC/MD) simulations. In order to speed up CSSA development, a variety of empirical parameters based on Hume-Rothery rules have been developed. Herein, these parameters have been systematically and critically evaluated for their efficiency in predicting solid solution formation. The phase stability of representative CSSA systems is then illustrated from the perspectives of phase diagrams and nucleation driving force plots of the σ phase using the CALPHAD method. The temperature-dependent total entropies of the FCC, BCC, HCP, and σ phases in equimolar compositions of various systems are presented next, followed by the thermodynamic properties of mixing of the BCC phase in Al-containing and Ti-containing refractory metal systems. First-principles calculations on model FCC, BCC and HCP CSSA reveal the presence of both positive and negative vibrational entropies of mixing, while the calculated electronic entropies of mixing are negligible. Temperature-dependent configurational entropy is determined from the atomic structures obtained from MC/MD simulations. Current status and challenges in using these methodologies as they pertain to thermodynamic property analysis and CSSA design are discussed.
Na, Sung Dae; Wei, Qun; Seong, Ki Woong; Cho, Jin Ho; Kim, Myoung Nam
2018-01-01
The conventional methods of speech enhancement, noise reduction, and voice activity detection are based on the suppression of noise or non-speech components of the target air-conduction signals. However, air-conducted speech is hard to differentiate from babble or white noise signals. To overcome this problem, the proposed algorithm uses bone-conduction speech signals and soft thresholding based on the Shannon entropy principle and the cross-correlation of air- and bone-conduction signals. A new algorithm for speech detection and noise reduction is proposed, which makes use of the Shannon entropy principle and cross-correlation with the bone-conduction speech signals to threshold the wavelet packet coefficients of the noisy speech. Each threshold is generated by the entropy and cross-correlation approaches in the bands of the wavelet packet decomposition. The efficiency of the proposed method was confirmed with objective quality measures: PESQ, RMSE, correlation, and SNR. To verify the method's feasibility, we compared the air- and bone-conduction speech signals and their spectra in MATLAB simulations, which show that the proposed method reduces the noise. The results confirm the high performance of the proposed method, which makes it quite instrumental to future applications in communication devices, noisy environments, construction, and military operations.
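The two ingredients, a Shannon-entropy measure on a band of wavelet packet coefficients and soft thresholding, can be sketched as follows. How the paper maps entropy and cross-correlation values to per-band thresholds is not specified here, so this is only an illustrative scaffold with assumed function names:

```python
from math import log2

def soft_threshold(coeffs, threshold):
    """Soft-threshold a band: shrink each coefficient toward zero by `threshold`."""
    return [max(abs(c) - threshold, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def shannon_entropy(coeffs):
    """Shannon entropy (bits) of the normalized coefficient energies in a band.

    Returns 0.0 for an all-zero band. Low entropy suggests energy concentrated
    in few coefficients (speech-like); high entropy suggests noise-like bands.
    """
    total = sum(c * c for c in coeffs) or 1.0
    probs = [c * c / total for c in coeffs if c != 0]
    return -sum(p * log2(p) for p in probs)
```

In an entropy-driven scheme, bands whose energy distribution looks noise-like (high entropy) would receive a larger threshold, suppressing more of their content.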
NASA Astrophysics Data System (ADS)
Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong
2012-05-01
Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio) and the feature frequencies are often dense and overlapping, it is difficult to identify the feature frequencies of machine tools in the complex current spectrum with traditional signal-processing methods such as the FFT. In studying MCSA, it was found that entropy, which is associated with the probability distribution of a random variable, is important for frequency identification and therefore plays an important role in signal processing. To solve the problem that the feature frequencies are difficult to identify, an entropy optimization technique based on the motor current signal is presented in this paper for extracting the typical feature frequencies of machine tools while effectively suppressing disturbances. Simulated current signals were generated in MATLAB, and a measured current signal was obtained from a complex gearbox of an iron works in Luxembourg. In diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate, and reliable enough to extract the feature frequencies of the current signal, providing a new strategy for the fault diagnosis and condition monitoring of machine tools.
Azizi, Susan; Mahdavi Shahri, Mahnaz; Mohamad, Rosfarizan
2017-06-08
In the present study, ZnO nanoparticles (NPs) were synthesized in zerumbone solution by a green approach and appraised for their ability to adsorb Pb(II) ions from aqueous solution. The formation of the as-synthesized NPs was established by X-ray diffraction (XRD), transmission electron microscopy (TEM), and UV-visible studies. The XRD and TEM analyses revealed the high purity and wurtzite hexagonal structure of the ZnO NPs, with a mean size of 10.01 ± 2.6 nm. Batch experiments were performed to investigate the impact of process parameters, viz. Pb(II) concentration, solution pH, adsorbent mass, solution temperature, and contact time, on the removal efficiency of Pb(II). The adsorption isotherm data indicated that the adsorption process was mainly monolayer on the ZnO NPs. The adsorption process follows pseudo-second-order kinetics. The maximum removal efficiency was 93% at pH 5. Thermodynamic parameters such as the enthalpy change (ΔH⁰), free energy change (ΔG⁰), and entropy change (ΔS⁰) were calculated; the adsorption process was spontaneous and endothermic. The good efficiency of the as-synthesized NPs makes them attractive for applications in water treatment, for the removal of heavy metals from aqueous systems.
Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo
2018-01-01
We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT, and text files containing recorded or generated spike-train time series. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented that deals with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows accurate and rapid analysis of multiple spike trains from thousands of electrodes.
NASA Astrophysics Data System (ADS)
Li, Gang; Zhao, Qing
2017-03-01
In this paper, a minimum entropy deconvolution based sinusoidal synthesis (MEDSS) filter is proposed to improve the fault detection performance of the regular sinusoidal synthesis (SS) method. The SS filter is an efficient linear predictor that exploits the frequency properties during model construction. The phase information of the harmonic components is not used in the regular SS filter. However, the phase relationships are important in differentiating noise from characteristic impulsive fault signatures. Therefore, in this work, the minimum entropy deconvolution (MED) technique is used to optimize the SS filter during the model construction process. A time-weighted-error Kalman filter is used to estimate the MEDSS model parameters adaptively. Three simulation examples and a practical application case study are provided to illustrate the effectiveness of the proposed method. The regular SS method and the autoregressive MED (ARMED) method are also implemented for comparison. The MEDSS model has demonstrated superior performance compared to the regular SS method and it also shows comparable or better performance with much less computational intensity than the ARMED method.
Information entropy of Gegenbauer polynomials and Gaussian quadrature
NASA Astrophysics Data System (ADS)
Sánchez-Ruiz, Jorge
2003-05-01
In a recent paper (Buyarov V S, López-Artés P, Martínez-Finkelshtein A and Van Assche W 2000 J. Phys. A: Math. Gen. 33 6549-60), an efficient method was provided for evaluating in closed form the information entropy of the Gegenbauer polynomials C_n^λ(x) in the case when λ = l ∈ ℕ. For given values of n and l, this method requires the computation by means of recurrence relations of two auxiliary polynomials, P(x) and H(x), of degrees 2l − 2 and 2l − 4, respectively. Here it is shown that P(x) is related to the coefficients of the Gaussian quadrature formula for the Gegenbauer weights w_l(x) = (1 − x²)^{l−1/2}, and this fact is used to obtain the explicit expression of P(x). From this result, an explicit formula is also given for the polynomial S(x) = lim_{n→∞} P(1 − x/(2n²)), which is relevant to the study of the asymptotic (n → ∞ with l fixed) behaviour of the entropy.
Information gains from cosmic microwave background experiments
NASA Astrophysics Data System (ADS)
Seehars, Sebastian; Amara, Adam; Refregier, Alexandre; Paranjape, Aseem; Akeret, Joël
2014-07-01
To shed light on the fundamental problems posed by dark energy and dark matter, a large number of experiments have been performed and combined to constrain cosmological models. We propose a novel way of quantifying the information gained by updates on the parameter constraints from a series of experiments which can either complement earlier measurements or replace them. For this purpose, we use the Kullback-Leibler divergence or relative entropy from information theory to measure differences in the posterior distributions in model parameter space from a pair of experiments. We apply this formalism to a historical series of cosmic microwave background experiments ranging from Boomerang to WMAP, SPT, and Planck. Considering different combinations of these experiments, we thus estimate the information gain in units of bits and distinguish contributions from the reduction of statistical errors and the "surprise" corresponding to a significant shift of the parameters' central values. For this experiment series, we find individual relative entropy gains ranging from about 1 to 30 bits. In some cases, e.g. when comparing WMAP and Planck results, we find that the gains are dominated by the surprise rather than by improvements in statistical precision. We discuss how this technique provides a useful tool for both quantifying the constraining power of data from cosmological probes and detecting the tensions between experiments.
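For Gaussian posteriors on a single parameter, the relative entropy has a closed form, which makes "bits of information gain" concrete. A sketch under the 1-D Gaussian assumption (the paper works with full multivariate posteriors in parameter space):

```python
from math import log

def kl_gaussian_bits(mu0, sig0, mu1, sig1):
    """Relative entropy D(p1 || p0) in bits between 1-D Gaussian constraints.

    p0 = N(mu0, sig0^2) is the earlier posterior, p1 = N(mu1, sig1^2)
    the updated one. Closed form:
    D = ln(sig0/sig1) + (sig1^2 + (mu1-mu0)^2) / (2*sig0^2) - 1/2  [nats]
    """
    nats = (log(sig0 / sig1)
            + (sig1 ** 2 + (mu1 - mu0) ** 2) / (2 * sig0 ** 2)
            - 0.5)
    return nats / log(2)  # convert nats to bits
```

The two contributions discussed in the abstract are visible here: shrinking the error bar (sig1 < sig0) gives a modest gain, while a large shift of the central value (mu1 far from mu0 in units of sig0) produces the "surprise" term that can dominate.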
Efficient option valuation of single and double barrier options
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Milev, Mariyan; Koleva-Petkova, Dessislava; Vladev, Veselin
2017-12-01
In this paper we present an implementation of pricing algorithm for single and double barrier options using Mellin transformation with Maximum Entropy Inversion and its suitability for real-world applications. A detailed analysis of the applied algorithm is accompanied by implementation in C++ that is then compared to existing solutions in terms of efficiency and computational power. We then compare the applied method with existing closed-form solutions and well known methods of pricing barrier options that are based on finite differences.
NASA Astrophysics Data System (ADS)
Hu, Xiaoqian; Tao, Jinxu; Ye, Zhongfu; Qiu, Bensheng; Xu, Jinzhang
2018-05-01
In order to solve the problem of medical image segmentation, a wavelet neural network medical image segmentation algorithm based on a combined maximum entropy criterion is proposed. Firstly, a bee colony algorithm is used to optimize the parameters of the wavelet neural network (network structure, initial weights, threshold values, and so on), so that training converges quickly to high precision and avoids falling into local extrema; then the optimal number of iterations is obtained by calculating the maximum entropy of the segmented image, so as to achieve automatic and accurate segmentation. Medical image segmentation experiments show that the proposed algorithm can effectively reduce sample training time and improve convergence precision, and its segmentation results are more accurate and effective than those of a traditional BP neural network (back-propagation neural network: a multilayer feed-forward neural network trained with the error back-propagation algorithm).
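The maximum-entropy criterion for segmentation is commonly realized as Kapur's thresholding, which picks the gray level that maximizes the summed entropies of foreground and background. A minimal sketch of that classic criterion (the paper's combined criterion and wavelet network are not reproduced here; the function name is hypothetical):

```python
from math import log2

def max_entropy_threshold(hist):
    """Kapur's maximum-entropy threshold for a grayscale histogram.

    Returns the bin index t that maximizes H(background: bins < t)
    + H(foreground: bins >= t), each entropy taken over the class-
    normalized probabilities.
    """
    total = sum(hist)
    probs = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(hist)):
        w0 = sum(probs[:t])          # background mass
        w1 = 1.0 - w0                # foreground mass
        if w0 == 0 or w1 == 0:
            continue
        h0 = -sum(p / w0 * log2(p / w0) for p in probs[:t] if p > 0)
        h1 = -sum(p / w1 * log2(p / w1) for p in probs[t:] if p > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t
```

On a bimodal histogram the criterion lands in the valley between the modes, which is why it serves as a stopping/selection rule for iterative segmentation schemes.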
Elder, Robert M; Jayaraman, Arthi
2013-10-10
Gene therapy relies on the delivery of DNA into cells, and polycations are one class of vectors enabling efficient DNA delivery. Nuclear localization sequences (NLS), cationic oligopeptides that target molecules for nuclear entry, can be incorporated into polycations to improve their gene delivery efficiency. We use simulations to study the effect of peptide chemistry and sequence on the DNA-binding behavior of NLS-grafted polycations by systematically mutating the residues in the grafts, which are based on the SV40 NLS (peptide sequence PKKKRKV). Replacing arginine (R) with lysine (K) reduces binding strength by eliminating arginine-DNA interactions, but placing R in a less hindered location (e.g., farther from the grafting point to the polycation backbone) has surprisingly little effect on polycation-DNA binding strength. Changing the positions of the hydrophobic proline (P) and valine (V) residues relative to the polycation backbone changes hydrophobic aggregation within the polycation and, consequently, changes the conformational entropy loss that occurs upon polycation-DNA binding. Since conformational entropy loss affects the free energy of binding, the positions of P and V in the grafts affect DNA binding affinity. The insight from this work guides synthesis of polycations with tailored DNA binding affinity and, in turn, efficient DNA delivery.
Long, Chengjiang; Hua, Gang; Kapoor, Ashish
2015-01-01
We present a noise resilient probabilistic model for active learning of a Gaussian process classifier from crowds, i.e., a set of noisy labelers. It explicitly models both the overall label noise and the expertise level of each individual labeler with two levels of flip models. Expectation propagation is adopted for efficient approximate Bayesian inference of our probabilistic model for classification, based on which a generalized EM algorithm is derived to estimate both the global label noise and the expertise of each individual labeler. The probabilistic nature of our model immediately allows the adoption of the prediction entropy for active selection of data samples to be labeled, and active selection of high-quality labelers, based on their estimated expertise, to label the data. We apply the proposed model to four visual recognition tasks, i.e., object category recognition, multi-modal activity recognition, gender recognition, and fine-grained classification, on four datasets with real crowd-sourced labels from the Amazon Mechanical Turk. The experiments clearly demonstrate the efficacy of the proposed model. In addition, we extend the proposed model with the Predictive Active Set Selection Method to speed up the active learning system, whose efficacy is verified by conducting experiments on the first three datasets. The results show that our extended model not only preserves high accuracy but also achieves higher efficiency. PMID:26924892
NASA Astrophysics Data System (ADS)
di Liberto, Francesco; Pastore, Raffaele; Peruggi, Fulvio
2011-05-01
When some entropy is transferred, by means of a reversible engine, from a hot heat source to a colder one, the maximum efficiency is achieved, i.e. the maximum available work is obtained. Similarly, a reversible heat pump transfers entropy from a cold heat source to a hotter one with the minimum expense of energy. In contrast, with non-reversible devices there is some lost work for heat engines and some extra work for heat pumps. Both quantities are related to entropy production. The lost work is also called 'degraded energy' or 'energy unavailable to do work'. The extra work is the excess of work performed on the system in the irreversible process with respect to the reversible one (or the excess of heat given to the hotter source in the irreversible process). Both quantities are analysed in detail and are evaluated for a complex process, the stepwise circular cycle, which is similar to the stepwise Carnot cycle. The stepwise circular cycle is performed by means of N small weights, dw, which are first added to and then removed from the piston of the vessel containing the gas, or vice versa. The work performed by the gas can be found as the increase in the potential energy of the dw's. Each dw is identified and its increase in potential energy evaluated. In this way it is found how the energy output of the cycle is distributed among the dw's. The size of the dw's affects entropy production and therefore the lost and extra work. The distribution of increases depends on the chosen removal process.
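The identity behind the lost work, W_rev − W_actual = T_cold · ΔS_produced, can be checked numerically for a simple engine cycle between two reservoirs. A sketch with assumed example values (not the stepwise circular cycle of the paper):

```python
def lost_work(q_hot, t_hot, t_cold, w_actual):
    """Lost work of a heat-engine cycle, computed two equivalent ways.

    q_hot:    heat drawn from the hot reservoir at temperature t_hot
    w_actual: work actually delivered by the (possibly irreversible) cycle
    Returns (W_rev - W_actual, T_cold * S_produced); the two agree.
    """
    w_rev = q_hot * (1 - t_cold / t_hot)          # reversible (Carnot) work
    q_cold = q_hot - w_actual                     # heat dumped to cold reservoir
    s_produced = q_cold / t_cold - q_hot / t_hot  # net entropy production
    return w_rev - w_actual, t_cold * s_produced
```

For example, drawing 1000 J from a 500 K source with a 300 K sink, a cycle delivering only 300 J (instead of the reversible 400 J) has 100 J of degraded energy, exactly T_cold times the entropy produced.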
NASA Astrophysics Data System (ADS)
Tiotsop, M.; Fotue, A. J.; Fotsin, H. B.; Fai, L. C.
2017-08-01
A bound polaron in an RbCl delta quantum dot under an electric field and with a Coulombic impurity is considered. The ground- and first-excited-state energies are derived by employing the Pekar variational and unitary transformation methods. Applying the Fermi golden rule, expressions for the temperature and the polaron lifetime are derived. Decoherence is studied through the Tsallis entropy. Results show that decreasing (or increasing) the lifetime increases (or decreases) the temperature and the delta parameter (electric field strength and hydrogenic impurity). This suggests that, to accelerate quantum transitions in a nanostructure, the temperature and the delta parameter have to be enhanced. Increasing the electric field and the Coulomb parameter increases the lifetime of the delta quantum dot qubit. The energy spectrum of the polaron increases with increasing temperature, electric field strength, Coulomb parameter, delta parameter, and polaronic radius. The energies of the delta quantum dot can thus be controlled via the electric field, the Coulomb impurity, and the delta parameter. Results also show that the non-extensive entropy is an oscillatory function of time. With enhancement of the delta parameter, non-extensive parameter, Coulomb parameter, and electric field strength, the entropy increases sinusoidally with time. From the study of decoherence through the Tsallis entropy, it may be advised that, to have a quantum system with efficient transmission of information, the non-extensive and delta parameters need to be significant. The study of the probability density shows an increase from the boundary to the center of the dot, where it reaches its maximum value, and it oscillates with period T0 = ℏ / ΔE with the tuning of the delta parameter, electric field strength, and Coulomb parameter. These results may be very helpful for the transmission of information in nanostructures and the control of decoherence.
Quantization Distortion in Block Transform-Compressed Data
NASA Technical Reports Server (NTRS)
Boden, A. F.
1995-01-01
The popular JPEG image compression standard is an example of a block transform-based compression scheme; the image is systematically subdivided into blocks that are individually transformed, quantized, and encoded. The compression is achieved by quantizing the transformed data, reducing the data entropy and thus facilitating efficient encoding. A generic block transform model is introduced.
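The entropy-reduction effect described above can be sketched numerically. The toy signal and uniform scalar quantizer below are illustrative assumptions, not the JPEG transform pipeline itself:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy "transform coefficients": many distinct values before quantization.
coeffs = [round(100 * math.sin(0.7 * k), 1) for k in range(256)]

# Uniform scalar quantization with step q collapses nearby values onto
# shared indices, lowering entropy and aiding subsequent entropy coding.
q = 25
indices = [round(c / q) for c in coeffs]

h_before = shannon_entropy(coeffs)
h_after = shannon_entropy(indices)
assert h_after < h_before  # coarser symbols carry fewer bits each
```

The coarser the quantization step, the lower the index entropy, which is the trade the abstract summarizes: distortion is accepted in exchange for cheaper encoding.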
High-Speed Magnetohydrodynamic Flow Control Analyses With 3-D Simulations
2008-01-01
Magnetohydrodynamic studies of high-speed flow control are described with emphasis on understanding fluid response to specific... interactions play a crucial role by distorting the velocity field. The interaction with an external circuit through electrodes is relatively efficient when... [Contents include: Entropy layer; Energy management; Conclusion.]
Logistic Map for Cancellable Biometrics
NASA Astrophysics Data System (ADS)
Supriya, V. G., Dr; Manjunatha, Ramachandra, Dr
2017-08-01
This paper presents the design and implementation of a secured biometric template protection system that transforms the biometric template using binary chaotic signals and three different key streams to obtain another form of the template. Its efficiency is demonstrated by the results, and its security is investigated through analyses including key space analysis, information entropy, and key sensitivity analysis.
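The binary chaotic keystream such a scheme builds on can be sketched with a logistic map. The map parameter, threshold, and single key stream below are illustrative assumptions (the paper combines three key streams):

```python
def logistic_bits(x0, n, r=3.99):
    """Binary chaotic keystream from the logistic map x -> r*x*(1-x)."""
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

def transform_template(template_bits, key):
    """XOR the biometric template with the chaotic keystream (invertible)."""
    stream = logistic_bits(key, len(template_bits))
    return [b ^ s for b, s in zip(template_bits, stream)]

template = [1, 0, 1, 1, 0, 0, 1, 0]
protected = transform_template(template, key=0.3141592653)
# Reapplying the same keystream recovers the original template.
recovered = transform_template(protected, key=0.3141592653)
assert recovered == template
```

The key sensitivity the paper analyzes comes from the map's chaotic divergence: after enough iterations, even a tiny change in the key value yields an uncorrelated keystream.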
Sandford, M.T. II; Handel, T.G.; Bradley, J.N.
1998-07-07
A method and apparatus for embedding auxiliary information into the digital representation of host data created by a lossy compression technique and a method and apparatus for constructing auxiliary data from the correspondence between values in a digital key-pair table with integer index values existing in a representation of host data created by a lossy compression technique are disclosed. The methods apply to data compressed with algorithms based on series expansion, quantization to a finite number of symbols, and entropy coding. Lossy compression methods represent the original data as ordered sequences of blocks containing integer indices having redundancy and uncertainty of value by one unit, allowing indices which are adjacent in value to be manipulated to encode auxiliary data. Also included is a method to improve the efficiency of lossy compression algorithms by embedding white noise into the integer indices. Lossy compression methods use loss-less compression to reduce to the final size the intermediate representation as indices. The efficiency of the loss-less compression, known also as entropy coding compression, is increased by manipulating the indices at the intermediate stage. Manipulation of the intermediate representation improves lossy compression performance by 1 to 10%. 21 figs.
Anosov C-systems and random number generators
NASA Astrophysics Data System (ADS)
Savvidy, G. K.
2016-08-01
We further develop our previous proposal to use hyperbolic Anosov C-systems to generate pseudorandom numbers and to use them for efficient Monte Carlo calculations in high energy particle physics. All trajectories of hyperbolic dynamical systems are exponentially unstable, and C-systems therefore have mixing of all orders, a countable Lebesgue spectrum, and a positive Kolmogorov entropy. These exceptional ergodic properties follow from the C-condition introduced by Anosov. This condition defines a rich class of dynamical systems forming an open set in the space of all dynamical systems. An important property of C-systems is that they have a countable set of everywhere dense periodic trajectories and their density increases exponentially with entropy. Of special interest are the C-systems defined on higher-dimensional tori. Such C-systems are excellent candidates for generating pseudorandom numbers that can be used in Monte Carlo calculations. An efficient algorithm was recently constructed that allows generating long C-system trajectories very rapidly. These trajectories have good statistical properties and can be used for calculations in quantum chromodynamics and in high energy particle physics.
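A minimal flavor of a hyperbolic torus automorphism can be sketched with the Arnold cat map iterated on an integer lattice. The matrix, modulus, and seeds below are illustrative, not the specific high-dimensional C-system operators the paper studies:

```python
def cat_map_stream(x, y, n, m=2**31 - 1):
    """Toy Anosov-like automorphism of the 2-torus (Arnold cat map),
    iterated over the integer lattice Z_m x Z_m; emits the x-coordinates."""
    out = []
    for _ in range(n):
        x, y = (x + y) % m, (x + 2 * y) % m   # matrix [[1,1],[1,2]], det = 1
        out.append(x)
    return out

stream = cat_map_stream(123456789, 362436069, 10_000)
# Crude uniformity check: the mean should sit near m/2 for a usable stream.
m = 2**31 - 1
mean = sum(stream) / len(stream)
assert 0.4 * m < mean < 0.6 * m
```

The hyperbolicity (eigenvalues off the unit circle) is what stretches trajectories exponentially and produces the mixing the abstract cites; a real generator would use much larger matrices and pass far stricter statistical tests.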
A system architecture for online data interpretation and reduction in fluorescence microscopy
NASA Astrophysics Data System (ADS)
Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer
2010-01-01
In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large number of samples during the experiment, thus greatly minimizing the post-analysis phase that is common practice today. By utilizing data reduction algorithms, relevant information about the target cells is extracted from the online collected data stream and then used to adjust the experiment parameters in real time, allowing the system to react dynamically to changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute the underlying computer vision algorithms efficiently, and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluously collected data is minimized, while at the same time the information entropy increases.
Decoherence estimation in quantum theory and beyond
NASA Astrophysics Data System (ADS)
Pfister, Corsin
The quantum physics literature provides many different characterizations of decoherence. Most of them have in common that they describe decoherence as a kind of influence on a quantum system upon interacting with another system. In the spirit of quantum information theory, we adopt a particular viewpoint on decoherence which describes it as the loss of information into a system that is possibly controlled by an adversary. We use a quantitative framework for decoherence that builds on operational characterizations of the min-entropy that have been developed in the quantum information literature. It characterizes decoherence as an influence on quantum channels that reduces their suitability for a variety of quantifiable tasks such as the distribution of secret cryptographic keys of a certain length or the distribution of a certain number of maximally entangled qubit pairs. This allows for a quantitative and operational characterization of decoherence via operational characterizations of the min-entropy. In this thesis, we present a series of results about the estimation of the min-entropy, subdivided into three parts. The first part concerns the estimation of a quantum adversary's uncertainty about classical information--expressed by the smooth min-entropy--as it is done in protocols for quantum key distribution (QKD). We analyze this form of min-entropy estimation in detail and find that some of the more recently suggested QKD protocols have previously unnoticed security loopholes. We show that the specifics of the sifting subroutine of a QKD protocol are crucial for security by pointing out mistakes in the security analysis in the literature and by presenting eavesdropping attacks on those problematic protocols. We provide solutions to the identified problems and present a formalized analysis of the min-entropy estimate that incorporates the sifting stage of QKD protocols.
In the second part, we extend ideas from QKD to a protocol that allows estimating an adversary's uncertainty about quantum information, expressed by the fully quantum smooth min-entropy. Roughly speaking, we show that a protocol that resembles the parallel execution of two QKD protocols can be used to lower bound the min-entropy of some unmeasured qubits. We explain how this result may influence the ongoing search for protocols for entanglement distribution. The third part is dedicated to the development of a framework that allows the estimation of decoherence even in experiments that cannot be correctly described by quantum theory. Inspired by an equivalent formulation of the min-entropy that relates it to the fidelity with a maximally entangled state, we define a decoherence quantity for a very general class of probabilistic theories that reduces to the min-entropy in the special case of quantum theory. This entails a definition of maximal entanglement for generalized probabilistic theories. Using techniques from semidefinite and linear programming, we show how bounds on this quantity can be estimated through Bell-type experiments. This allows testing of models for decoherence that cannot be described by quantum theory. As an example application, we devise an experimental test of a model for gravitational decoherence that has been suggested in the literature.
Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun
2017-11-01
The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.
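The ordinal-pattern quantifier used above can be sketched directly from its definition. The Bandt-Pompe permutation entropy below assumes an embedding order of 3 and toy signals, not the measured velocity data:

```python
import math
import random
from itertools import permutations

def permutation_entropy(signal, order=3):
    """Normalized permutation entropy (Bandt-Pompe): Shannon entropy of
    ordinal-pattern frequencies, scaled to [0, 1] by log(order!)."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    n = len(signal) - order + 1
    probs = [c / n for c in counts.values() if c]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))

# A slow sine is dominated by monotone patterns; noise uses all patterns.
random.seed(0)
sine = [math.sin(0.1 * t) for t in range(2000)]
noise = [random.random() for _ in range(2000)]
assert permutation_entropy(sine) < permutation_entropy(noise)
```

The statistical complexity used in the abstract combines this entropy with a disequilibrium term; together they place a signal in the complexity-entropy causality plane.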
Schiff, Rachel; Katan, Pesia; Sasson, Ayelet; Kahta, Shani
2017-07-01
There is a long-held view that chunks play a crucial role in artificial grammar learning performance. We compared chunk strength influences on performance, in high and low topological entropy (a measure of complexity) grammar systems, with dyslexic children, age-matched and reading-level-matched control participants. Findings show that age-matched control participants' performance reflected equivalent influence of chunk strength in the two topological entropy conditions, as typically found in artificial grammar learning experiments. By contrast, dyslexic children and reading-level-matched controls' performance reflected knowledge of chunk strength only under the low topological entropy condition. In the high topological entropy grammar system, they appeared completely unable to utilize chunk strength to make appropriate test item selections. In line with previous research, this study suggests that for typically developing children, it is the chunks that are attended during artificial grammar learning and create a foundation on which implicit associative learning mechanisms operate, and these chunks are unitized to different strengths. However, for children with dyslexia, it is complexity that may influence the subsequent memorability of chunks, independently of their strength.
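Topological entropy of a grammar can be illustrated as the logarithm of the spectral radius of the transition graph's adjacency matrix, which counts how fast the number of legal strings grows with length. The two-state toy grammars below are hypothetical, not the stimulus sets used in the study:

```python
import math

def topological_entropy(adj, iters=200):
    """Topological entropy of a grammar's transition graph: log of the
    spectral radius of the adjacency matrix, estimated by power iteration."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return math.log(lam)

# The denser transition graph permits more strings per length,
# hence a higher topological entropy (greater complexity).
sparse = [[0, 1],
          [1, 0]]   # spectral radius 1 -> entropy 0
dense = [[1, 1],
         [1, 1]]    # spectral radius 2 -> entropy log 2
assert topological_entropy(sparse) < topological_entropy(dense)
```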
Thermodynamic and Information Entropy in Electroconvection
NASA Astrophysics Data System (ADS)
Cressman, John; Daum, Marcus; Patrick, David; Cerbus, Rory; Goldburg, Walter
Transitions in driven systems often produce wild fluctuations that can be both detrimental and beneficial. Our fundamental understanding of these transients is inadequate to permit optimal interactions with systems ranging from biology, to energy generation, to finance. Here we report on experiments performed in electroconvecting liquid crystals where we abruptly change the electrical forcing across the sample from a state below defect turbulence into a state of defect turbulence. We simultaneously measure the electrical power flow through the liquid crystal and image the structure in the sample. These measurements enable us to simultaneously track the evolution of the thermodynamic and information entropies. Our experiments demonstrate that there are strong correlations between the fluctuations in these two entropic measures; however, the correspondence is not exact. We will discuss these discrepancies as well as the relevance of large transient fluctuations in non-equilibrium transitions in general.
Thermodynamic calculations for the liquid systems NaK, KCs and LiPb
NASA Astrophysics Data System (ADS)
Alblas, B. P.; Van Der Lugt, W.; Visser, E. G.; De Hosson, J. Th. M.
1982-06-01
The semi-empirical model for the calculation of the Gibbs free energy of mixing via the entropy of mixing, proposed by Visser et al. [1], is used to determine the activity coefficients and the long-wavelength limit of the structure factor, SCC(0). For the liquid alloy systems NaK and KCs the method leads to fairly accurate results, indicating almost ideal behaviour. For the compound-forming liquid alloy system LiPb the agreement with experiment is less favourable, but the calculations clearly demonstrate the important influence of the volume contraction on the entropy.
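For reference, the ideal-solution limits against which SCC(0) and the mixing entropy are judged can be written down directly. This sketch assumes a simple binary alloy and is not the semi-empirical model of Visser et al.:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(x):
    """Ideal entropy of mixing per mole for a binary alloy A_x B_(1-x)."""
    return -R * (x * math.log(x) + (1 - x) * math.log(1 - x))

def scc0_ideal(x):
    """Long-wavelength concentration structure factor S_CC(0) = x(1-x)
    for an ideal solution; compound-forming melts fall well below this."""
    return x * (1 - x)

s = ideal_mixing_entropy(0.5)
assert abs(s - R * math.log(2)) < 1e-9   # maximum at equiatomic composition
assert scc0_ideal(0.5) == 0.25
```

Near-ideal systems such as NaK sit close to these curves; strong compound formation (as in LiPb) suppresses SCC(0) well below x(1-x).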
Algorithmic cooling in liquid-state nuclear magnetic resonance
NASA Astrophysics Data System (ADS)
Atia, Yosi; Elias, Yuval; Mor, Tal; Weinstein, Yossi
2016-01-01
Algorithmic cooling is a method that employs thermalization to increase the qubit purification level; namely, it reduces the qubit system's entropy. We utilized gradient ascent pulse engineering, an optimal control algorithm, to implement algorithmic cooling in liquid-state nuclear magnetic resonance. Various cooling algorithms were applied to the three qubits of ¹³C₂-trichloroethylene, cooling the system beyond Shannon's entropy bound in several different ways. In particular, in one experiment a carbon qubit was cooled by a factor of 4.61. This work is a step towards potentially integrating tools of NMR quantum computing into in vivo magnetic-resonance spectroscopy.
Simple model of hydrophobic hydration.
Lukšič, Miha; Urbic, Tomaz; Hribar-Lee, Barbara; Dill, Ken A
2012-05-31
Water is an unusual liquid in its solvation properties. Here, we model the process of transferring a nonpolar solute into water. Our goal was to capture the physical balance between water's hydrogen bonding and van der Waals interactions in a model that is simple enough to be nearly analytical and not heavily computational. We develop a 2-dimensional Mercedes-Benz-like model of water with which we compute the free energy, enthalpy, entropy, and the heat capacity of transfer as a function of temperature, pressure, and solute size. As validation, we find that this model gives the same trends as Monte Carlo simulations of the underlying 2D model and gives qualitative agreement with experiments. The advantages of this model are that it gives simple insights and that computational time is negligible. It may provide a useful starting point for developing more efficient and more realistic 3D models of aqueous solvation.
The influence of following on bidirectional flow through a doorway
NASA Astrophysics Data System (ADS)
Graves, Amy; Diamond, Rachel; Saakashvili, Eduard
Pedestrian dynamics is a subset of the study of self-propelled particles. We simulate two species of pedestrians undergoing bidirectional flow through a narrow doorway. Using the Helbing-Molnár-Farkas-Vicsek Social Force Model, our pedestrians are soft discs that experience psychosocial and physical contact forces. We vary the ``following'' parameter, which determines the degree to which a pedestrian matches its direction of movement to the average of nearby, same-species pedestrians. Current density, efficiency, and statistics of bursts and lags are calculated. These indicate that choosing different following parameters for each species affects the efficacy of transport, with greater following being associated with lower efficacy. The information entropy associated with velocity and the long-time tails of the complementary CDF of lag times are additional indicators of the dynamical consequences of following during bidirectional flow. Acknowledgement is made to the donors of the ACS Petroleum Research Fund, and the Vandervelde-Cheung Fund of Swarthmore College.
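The ``following'' parameter can be sketched as a convex blend between a pedestrian's own preferred heading and the mean heading of same-species neighbors. The blending rule below is an illustrative reading of that parameter, not the full social force model with its psychosocial and contact force terms:

```python
import math

def desired_direction(own_dir, neighbor_dirs, following):
    """Blend a pedestrian's preferred direction with the mean heading of
    nearby same-species neighbors; `following` in [0, 1] sets the weight."""
    mx = sum(d[0] for d in neighbor_dirs) / len(neighbor_dirs)
    my = sum(d[1] for d in neighbor_dirs) / len(neighbor_dirs)
    ex = (1 - following) * own_dir[0] + following * mx
    ey = (1 - following) * own_dir[1] + following * my
    norm = math.hypot(ex, ey)
    return (ex / norm, ey / norm)

# A pedestrian heading +x among neighbors heading +y:
d0 = desired_direction((1.0, 0.0), [(0.0, 1.0), (0.0, 1.0)], following=0.0)
d1 = desired_direction((1.0, 0.0), [(0.0, 1.0), (0.0, 1.0)], following=0.5)
assert d0 == (1.0, 0.0)   # no following: keep own heading
assert d1[1] > 0.0        # following pulls the heading toward the group
```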
Entropy-driven one-step formation of Phi29 pRNA 3WJ from three RNA fragments.
Binzel, Daniel W; Khisamutdinov, Emil F; Guo, Peixuan
2014-04-15
The emerging field of RNA nanotechnology necessitates creation of functional RNA nanoparticles but has been limited by particle instability. It has been shown that the three-way junction (3WJ) of the bacteriophage phi29 motor pRNA has unusual stability and can self-assemble from three fragments with high efficiency. It is generally believed that RNA and DNA folding is energy landscape-dependent and that the folding of RNA is driven by enthalpy. Here we examine the thermodynamic characteristics of the 3WJ components as 2'-fluoro RNA, DNA, and RNA. The three fragments existed either in the 3WJ complex or as monomers, with the intermediate dimers almost undetectable, and assembled into the 3WJ complex rapidly and efficiently. A low dissociation constant (apparent KD) of 11.4 nM was determined for RNA; inclusion of 2'-F pyrimidines strengthened the KD to 4.5 nM, and substitution of DNA weakened it to 47.7 nM. The ΔG°37 values were -36, -28, and -15 kcal/mol for 3WJ(2'-F), 3WJ(RNA), and 3WJ(DNA), respectively. The formation of the three-component complex was found to be governed by entropy, instead of enthalpy as usually found in RNA complexes. Here, 'entropy-driven' refers to a dominating entropic contribution to the increased stability of the 3WJ(2'-F) and 3WJ(RNA) compared to the 3WJ(DNA), rather than to the absolute role or total energy governing 3WJ folding.
Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments
Shockley, Keith R.
2014-01-01
Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active or inactive (or inconclusive). However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
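The core idea, ranking profiles by the Shannon entropy of their normalized response mass across concentrations, can be sketched as follows. This unweighted version and the toy profiles are simplifications of Shockley's weighted statistic:

```python
import math

def profile_entropy(responses, eps=1e-12):
    """Shannon entropy of the probability mass formed by normalizing
    absolute responses across the tested concentrations.  Flat (inactive)
    profiles give high entropy; concentrated responses give low entropy."""
    mags = [abs(r) + eps for r in responses]
    total = sum(mags)
    probs = [m / total for m in mags]
    return -sum(p * math.log2(p) for p in probs)

flat_profile = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]   # no dose response
active_profile = [0.0, 0.1, 0.2, 1.0, 10.0, 90.0] # strong high-dose hit
assert profile_entropy(active_profile) < profile_entropy(flat_profile)
```

Ranking by ascending entropy then prioritizes the most concentration-dependent profiles without committing to a Hill-equation fit, which is the model-free property the abstract emphasizes.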
Statistical mechanics of monatomic liquids
NASA Astrophysics Data System (ADS)
Wallace, Duane C.
1997-10-01
Two key experimental properties of elemental liquids, together with an analysis of the condensed-system potential-energy surface, lead us logically to the dynamical theory of monatomic liquids. Experimentally, the ion motional specific heat is approximately 3Nk for N ions, implying the normal modes of motion are approximately 3N independent harmonic oscillators. This implies the potential surface contains nearly harmonic valleys. The equilibrium configuration at the bottom of each valley is a ``structure.'' Structures are crystalline or amorphous, and amorphous structures can have a remnant of local crystal symmetry, or can be random. The random structures are by far the most numerous, and hence dominate the statistical mechanics of the liquid state, and their macroscopic properties are uniform over the structure class, for large-N systems. The Hamiltonian for any structural valley is the static structure potential, a sum of harmonic normal modes, and an anharmonic correction. Again from experiment, the constant-density entropy of melting contains a universal disordering contribution of NkΔ, suggesting the random structural valleys are of universal number w^N, where ln w = Δ. Our experimental estimate for Δ is 0.80. In quasiharmonic approximation, the liquid theory for entropy agrees with experiment, for all currently analyzable experimental data at elevated temperatures, to within 1-2% of the total entropy. Further testable predictions of the theory are mentioned.
Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun
2016-12-01
This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm² prospectively. ADC-based entropy-related parameters, including first-order entropy and second-order entropies, were derived from the whole tumor volume as well as adjacent normal cervical tissues. Intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean were significantly higher, whereas entropy(H)range and entropy(H)std were significantly lower, in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). The Kruskal-Wallis test showed no significant differences among the values of the various second-order entropies, including entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean. All second-order entropies had a larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean shared the same largest area under the curve of 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully and showed initial potential for characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues.
Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
On quantum Rényi entropies: A new generalization and some properties
NASA Astrophysics Data System (ADS)
Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco
2013-12-01
The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
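On a single state the Rényi family reduces to a function of the eigenvalue spectrum of the density matrix, which makes the special cases easy to check numerically. The spectrum below is an arbitrary example:

```python
import math

def renyi_entropy(eigvals, alpha):
    """Quantum Rényi entropy S_alpha(rho) = log2(sum_i p_i^alpha) / (1 - alpha),
    evaluated on the eigenvalues p_i of the density matrix rho."""
    if alpha == 1.0:  # von Neumann entropy as the alpha -> 1 limit
        return -sum(p * math.log2(p) for p in eigvals if p > 0)
    return math.log2(sum(p ** alpha for p in eigvals)) / (1.0 - alpha)

rho = [0.5, 0.25, 0.25]          # spectrum of a 3-dimensional state
s1 = renyi_entropy(rho, 1.0)     # von Neumann entropy
s2 = renyi_entropy(rho, 2.0)     # collision entropy
s_min = renyi_entropy(rho, 1000) # approaches min-entropy, -log2(max p)
assert s2 < s1                   # S_alpha is non-increasing in alpha
assert abs(s_min - 1.0) < 0.02   # -log2(0.5) = 1
```

For conditional entropies of bipartite states, where the proposed sandwiched definition differs from earlier quasi-entropies, the full operator machinery is needed; this eigenvalue form only covers a single state.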
Toward a Classical Thermodynamic Model for Retro-cognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, Edwin C.
2011-11-29
Retro-cognition--a human response before a randomly determined future stimulus--has always been part of our experience. Experiments over the last 80 years show a small but statistically significant effect. If this turns out to be true, then it suggests a form of macroscopic retro-causation. The 2nd Law of Thermodynamics provides an explanation for the apparent single direction of time at the macroscopic level although time is reversible at the microscopic level. In a preliminary study, I examined seven anomalous cognition (a.k.a., ESP) studies in which the entropic gradients and the entropy of their associated target systems were calculated, and the quality of the response was estimated by a rating system called the figure of merit. The combined Spearman's correlation coefficient for these variables for the seven studies was 0.211 (p = 6.4×10⁻⁴) with a 95% confidence interval for the correlation of [0.084, 0.332]; whereas, the same data for a correlation with the entropy itself was 0.028 (p = 0.36; 95% confidence interval of [-0.120, 0.175]). This suggests that anomalous cognition is mediated via some kind of a sensory system in that all the normal sensory systems are more sensitive to changes than they are to inputs that are not changing. A standard relationship for the change of entropy of a binary sequence appears to provide an upper limit to anomalous cognition functioning for free response and for forced-choice Zener card guessing. This entropic relation and an apparent limit set by the entropy may provide a clue for understanding macroscopic retro-causation.
Serotonergic psychedelics temporarily modify information transfer in humans.
Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miquel Àngel; Riba, Jordi
2015-03-28
Psychedelics induce intense modifications in the sensorium, the sense of "self," and the experience of reality. Despite advances in our understanding of the molecular and cellular level mechanisms of these drugs, knowledge of their actions on global brain dynamics is still incomplete. Recent imaging studies have found changes in functional coupling between frontal and parietal brain structures, suggesting a modification in information flow between brain regions during acute effects. Here we assessed the psychedelic-induced changes in directionality of information flow during the acute effects of a psychedelic in humans. We measured modifications in connectivity of brain oscillations using transfer entropy, a nonlinear measure of directed functional connectivity based on information theory. Ten healthy male volunteers with prior experience with psychedelics participated in 2 experimental sessions. They received a placebo or a dose of ayahuasca, a psychedelic preparation containing the serotonergic 5-HT2A agonist N,N-dimethyltryptamine. The analysis showed significant changes in the coupling of brain oscillations between anterior and posterior recording sites. Transfer entropy analysis showed that frontal sources decreased their influence over central, parietal, and occipital sites. Conversely, sources in posterior locations increased their influence over signals measured at anterior locations. Exploratory correlations found that anterior-to-posterior transfer entropy decreases were correlated with the intensity of subjective effects, while the imbalance between anterior-to-posterior and posterior-to-anterior transfer entropy correlated with the degree of incapacitation experienced. These results suggest that psychedelics induce a temporary disruption of neural hierarchies by reducing top-down control and increasing bottom-up information transfer in the human brain. © The Author 2015. Published by Oxford University Press on behalf of CINP.
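A history-length-1 transfer entropy estimator for symbolic signals can be sketched as follows. The binary toy series are illustrative, not EEG oscillation data, and real analyses use longer histories and bias corrections:

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """Transfer entropy (bits) from `source` to `target` with history
    length 1: how much the source's past reduces uncertainty about the
    target's next symbol beyond the target's own past."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_xy = Counter(zip(target[:-1], source[:-1]))
    pairs_xx = Counter(zip(target[1:], target[:-1]))
    singles = Counter(target[:-1])
    n = len(target) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te

random.seed(1)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                      # x copies y with a one-step lag
noise = [random.randint(0, 1) for _ in range(5000)]
assert transfer_entropy(y, x) > 0.9   # y's past determines x's future
assert transfer_entropy(noise, x) < 0.05
```

The asymmetry is the point: TE(y→x) is large while TE(x→y) is not, which is how the study distinguishes top-down (anterior-to-posterior) from bottom-up information flow.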
High-entropy fireballs and jets in gamma-ray burst sources
NASA Technical Reports Server (NTRS)
Meszaros, P.; Rees, M. J.
1992-01-01
Two mechanisms whereby compact coalescing binaries can produce relatively 'clean' fireballs via neutrino-antineutrino annihilation are proposed. Preejected mass due to tidal heating will collimate the fireball into jets. The resulting anisotropic gamma-ray emission can be efficient and intense enough to provide an acceptable model for gamma-ray bursts, if these originate at cosmological distances.
Upper entropy axioms and lower entropy axioms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi
2015-04-15
The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.
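Two of the properties the abstract mentions can be checked numerically (a hedged sketch; the paper's upper/lower entropy axioms themselves are not reproduced here): subadditivity of Shannon entropy for an arbitrary joint distribution, and the pseudo-additivity identity of Tsallis entropy on independent systems.

```python
import math
import random

def shannon(p):
    """Boltzmann-Gibbs/Shannon entropy (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q):
    """Tsallis entropy of index q (q != 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Subadditivity of Shannon entropy: H(X,Y) <= H(X) + H(Y) for any joint.
random.seed(1)
w = [[random.random() for _ in range(3)] for _ in range(3)]
z = sum(map(sum, w))
joint = [[v / z for v in row] for row in w]
px = [sum(row) for row in joint]
py = [sum(col) for col in zip(*joint)]
flat = [v for row in joint for v in row]
print(shannon(flat) <= shannon(px) + shannon(py))  # True

# Tsallis pseudo-additivity for independent systems:
# S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
q = 2.0
pa, pb = [0.5, 0.3, 0.2], [0.6, 0.4]
pab = [a * b for a in pa for b in pb]
lhs = tsallis(pab, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
print(abs(lhs - rhs) < 1e-12)  # True
```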
[Assessment of laparoscopic training based on eye tracker and electroencephalograph].
Liu, Yun; Wang, Shuyi; Zhang, Yangun; Xu, Mingzhe; Ye, Shasha; Wang, Peng
2017-02-01
The aim of this study was to evaluate the effect of laparoscopic simulation training under different levels of attention. Attention was appraised using the sample entropy and the θ/β value, calculated from electroencephalograph (EEG) signals collected with a Brain Link device. The effect of laparoscopic simulation training was evaluated using the completion time, number of errors, and fixation number, calculated from eye-movement signals collected with a Tobii eye tracker. Twenty volunteers were recruited in this study. Those with a sample entropy lower than 0.77 were classified into group A and those higher than 0.77 into group B. The results showed that the sample entropy of group A was lower than that of group B and fluctuated more steadily. The sample entropy of group B fluctuated steadily in the first five training sessions and then relatively dramatically in the later five. Compared with that of group B, the θ/β value of group A was smaller and fluctuated steadily. Group A had a shorter completion time, fewer errors, and a faster decrease in fixation number. This study therefore concluded that the trainees' attention affects the training effect: members of group A, who had higher attention, trained more efficiently and faster, while members of group B, although their skills improved, needed a longer time to reach a plateau.
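The sample entropy used here to grade attention can be sketched as a standard SampEn(m, r) computation on a plain series. This is a hedged toy: the study's EEG preprocessing and the 0.77 threshold are outside its scope, and the tolerance r is taken as an absolute value rather than a fraction of the signal's standard deviation.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -log of the ratio of (m+1)-length to m-length
    template matches under the Chebyshev distance (tolerance r)."""
    n = len(x)

    def matches(length):
        # the standard definition uses the same n - m template positions
        # for both lengths, so the ratio is a conditional probability
        templates = [x[i:i + length] for i in range(n - m)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) < r:
                    hits += 1
        return hits

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(2)
regular = [math.sin(0.2 * i) for i in range(200)]       # predictable
noise = [random.uniform(-1, 1) for _ in range(200)]     # unpredictable
print(sample_entropy(regular) < sample_entropy(noise))  # True
```

Lower values indicate more regular, predictable signals, which is why the study uses the 0.77 cut to separate the two attention groups.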
Efficiency in the European agricultural sector: environment and resources.
Moutinho, Victor; Madaleno, Mara; Macedo, Pedro; Robaina, Margarita; Marques, Carlos
2018-04-22
This article intends to compute agriculture technical efficiency scores of 27 European countries during the period 2005-2012, using both data envelopment analysis (DEA) and stochastic frontier analysis (SFA) with a generalized cross-entropy (GCE) approach, for comparison purposes. Afterwards, using the scores as the dependent variable, we apply quantile regressions with a set of possible influencing variables within the agricultural sector able to explain the technical efficiency scores. The results allow us to conclude that although DEA and SFA are quite distinct methodologies, and although the attained technical efficiency scores differ, both identify the worst- and best-performing countries analogously. They also suggest that it is important to include resource productivity and subsidies in determining technical efficiency, given their positive and significant influence.
Entropy Generation Analysis through Helical Coil Heat Exchanger in an Agitated Vessel
NASA Astrophysics Data System (ADS)
Ashok Reddy, K.
2018-03-01
Entropy generation was obtained from experiments with different sodium carboxymethyl cellulose concentrations (0.05%, 0.1%, 0.15% and 0.2%) of Newtonian and non-Newtonian fluids, with data collected by passing the test fluid at different flow rates through a helical coil in a mixing vessel with a paddle impeller. Heating of fluids depends on the operational parameters, the geometry of the mixing vessel, and the type of impeller used. A new heating element was designed and fabricated by inserting Kanthal wire into a glove knitted with fiberglass yarn, since glass fabric is flexible, heat resistant, and can accommodate small differences in vessel size. The knitted fabric is made to the shape of the vessel used in the experiment, and the heating elements are inserted so that they become embedded and form part of the glove.
Intrasubject multimodal groupwise registration with the conditional template entropy.
Polfliet, Mathias; Klein, Stefan; Huizinga, Wyke; Paulides, Margarethus M; Niessen, Wiro J; Vandemeulebroucke, Jef
2018-05-01
Image registration is an important task in medical image analysis. Whereas most methods are designed for the registration of two images (pairwise registration), there is an increasing interest in simultaneously aligning more than two images using groupwise registration. Multimodal registration in a groupwise setting remains difficult, due to the lack of generally applicable similarity metrics. In this work, a novel similarity metric for such groupwise registration problems is proposed. The metric calculates the sum of the conditional entropy between each image in the group and a representative template image constructed iteratively using principal component analysis. The proposed metric is validated in extensive experiments on synthetic and intrasubject clinical image data. These experiments showed equivalent or improved registration accuracy compared to other state-of-the-art (dis)similarity metrics and improved transformation consistency compared to pairwise mutual information. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
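The core of the proposed metric, the conditional entropy between each image and a template, can be sketched from a joint grey-level histogram. This is a hedged toy on 1-D intensity lists; the paper constructs the template iteratively via principal component analysis, which is not reproduced here.

```python
import math
import random
from collections import Counter

def conditional_entropy(image, template, bins=8):
    """H(image | template) in bits, from a joint grey-level histogram.
    Intensities are assumed to lie in [0, 1)."""
    def bucket(v):
        return min(int(v * bins), bins - 1)
    n = len(image)
    joint = Counter((bucket(a), bucket(b)) for a, b in zip(image, template))
    marg = Counter(bucket(b) for b in template)
    h_joint = -sum(c / n * math.log2(c / n) for c in joint.values())
    h_marg = -sum(c / n * math.log2(c / n) for c in marg.values())
    return h_joint - h_marg  # H(I, T) - H(T)

def groupwise_metric(images, template):
    """Sum of conditional entropies between each image and the template."""
    return sum(conditional_entropy(img, template) for img in images)

random.seed(5)
template = [random.random() for _ in range(2000)]
aligned = list(template)                          # perfectly registered copy
unrelated = [random.random() for _ in range(2000)]
print(abs(conditional_entropy(aligned, template)) < 1e-9)  # True: predictable
print(conditional_entropy(unrelated, template) > 2.0)      # True: uninformative
```

A well-registered group makes each image predictable from the template, driving the summed conditional entropy down; optimizing transformations against this sum is the registration step.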
Research and implementation of group animation based on normal cloud model
NASA Astrophysics Data System (ADS)
Li, Min; Wei, Bin; Peng, Bao
2011-12-01
Group animation is a long-standing unsolved problem in computer animation, and all current methods have limitations. This paper puts forward a method in which the motion coordinates and motion speeds of a real fish school are collected as sample data, and a reverse cloud generator is designed and run to obtain the expectation, entropy, and hyper-entropy, which are the quantitative values of the qualitative concept. Using these parameters, a forward cloud generator is designed and run to produce the motion coordinates and motion speeds of a two-dimensional animated fish school, and two mental-state variables of the school, the feeling of hunger and the feeling of fear, are designed. An experiment simulated the motion of the animated fish school as affected by the internal and external causes above, and it shows that group animation designed by this method is highly realistic.
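The reverse and forward cloud generators can be sketched as follows. This is a minimal 1-D normal cloud model using the common moment-based estimators; the fish-school state variables (hunger, fear) are assumptions outside this sketch.

```python
import math
import random

def reverse_cloud(samples):
    """Backward cloud generator: estimate (Ex, En, He) from drops."""
    n = len(samples)
    ex = sum(samples) / n                       # expectation
    # first-order absolute central moment estimator for the entropy En
    en = math.sqrt(math.pi / 2.0) * sum(abs(s - ex) for s in samples) / n
    var = sum((s - ex) ** 2 for s in samples) / (n - 1)
    he = math.sqrt(max(var - en ** 2, 0.0))     # hyper-entropy
    return ex, en, he

def forward_cloud(ex, en, he, n):
    """Forward cloud generator: produce n drops from (Ex, En, He)."""
    drops = []
    for _ in range(n):
        en_i = random.gauss(en, he)             # per-drop entropy
        drops.append(random.gauss(ex, abs(en_i)))
    return drops

random.seed(3)
drops = forward_cloud(5.0, 1.0, 0.1, 20000)
ex, en, he = reverse_cloud(drops)
print(round(ex, 1), round(en, 1))  # 5.0 1.0
```

Running the reverse generator on observed motion data and feeding the recovered parameters back into the forward generator is exactly the round trip the paper describes for producing realistic group motion.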
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWI's or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. High entropy value indicates that the PDs are distributed relatively uniformly, while low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. 
Such residual artifacts cause a directional bias in the measured PDs and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially those caused by vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
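The PD entropy idea can be illustrated in miniature. The sketch below bins 2-D orientations rather than 3-D principal diffusion directions, so it is only an analogy to the paper's regional measure: uniformly spread directions give high entropy, clustered directions give low entropy.

```python
import math
import random
from collections import Counter

def pd_entropy(directions, bins=12):
    """Entropy (bits) of binned 2-D principal directions. A direction
    and its negation are equivalent, so angles are folded into [0, pi)."""
    counts = Counter()
    for x, y in directions:
        theta = math.atan2(y, x) % math.pi
        counts[min(int(theta / math.pi * bins), bins - 1)] += 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(4)
spread = [(math.cos(a), math.sin(a))
          for a in (random.uniform(0, math.pi) for _ in range(5000))]
clustered = [(math.cos(a), math.sin(a))
             for a in (random.gauss(0.5, 0.05) for _ in range(5000))]
print(pd_entropy(spread) > pd_entropy(clustered))  # True
```

A vibration artifact that pulls PDs toward one dominant direction would show up as an anomalously low regional entropy, which is the signature the QC measure looks for.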
NASA Astrophysics Data System (ADS)
Zucker, M. H.
This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact, an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder", order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy, overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole.
The universe is the only system that can, by itself, raise its own temperature and thus, by itself, reverse entropy. The vast encompassing gravitational forces that the universe has at its disposal, forces that dominate the phase of contraction, provide the compacting, compressive mechanism that regenerates heat in an expanded, cooled universe and decreases entropy. And this phenomenon takes place without diminishing or depleting the finite amount of mass/energy with which the universe began. The fact that the universe can reverse the entropic process leads to possibilities previously ignored when assessing which of the three models (open, closed, or flat) most probably represents the future of the universe. After analyzing the models, the conclusion reached here is that the open model is only an expanded version of the closed model and therefore is not open, and that the closed model will never collapse to a big crunch and therefore is not closed. This leaves a modified model, oscillating forever between limited phases of expansion and contraction (a universe in "dynamic equilibrium"), as the only feasible choice.
General results for higher spin Wilson lines and entanglement in Vasiliev theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegde, Ashwin; Kraus, Per; Perlmutter, Eric
2016-01-28
Here, we develop tools for the efficient evaluation of Wilson lines in 3D higher spin gravity, and use these to compute entanglement entropy in the hs[λ ] Vasiliev theory that governs the bulk side of the duality proposal of Gaberdiel and Gopakumar. Our main technical advance is the determination of SL(N) Wilson lines for arbitrary N, which, in suitable cases, enables us to analytically continue to hs[λ ] via N→ -λ. We then apply this result to compute various quantities of interest, including entanglement entropy expanded perturbatively in the background higher spin charge, chemical potential, and interval size. This includes a computation of entanglement entropy in the higher spin black hole of the Vasiliev theory. Our results are consistent with conformal field theory calculations. We also provide an alternative derivation of the Wilson line, by showing how it arises naturally from earlier work on scalar correlators in higher spin theory. The general picture that emerges is consistent with the statement that the SL(N) Wilson line computes the semiclassical W N vacuum block, and our results provide an explicit result for this object.
Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.
Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir
2017-01-01
Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponent. Detecting and extracting their components may help clinicians to localize brain neurological dysfunctions in patients with motor control disorders, since movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for detecting EEG signal components from the time-frequency distribution (TFD) is proposed in this paper. The algorithm utilizes a modification of the Rényi entropy-based technique for estimating the number of components, called the short-term Rényi entropy (STRE), upgraded by an iterative algorithm which was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to the analysis of limb-movement EEG signals in both noise-free and noisy environments, and was shown to be an efficient technique providing a spectral description of brain activity at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs shows potential to enhance the diagnosis and treatment of neurological disorders in patients with motor control illnesses. Copyright © 2016 Elsevier Ltd. All rights reserved.
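The short-term Rényi entropy can be sketched as the Rényi entropy of each normalized time slice of a TFD; for idealized components, the entropy step relative to a single-component slice counts the components. This is a hedged toy, not the paper's iterative algorithm.

```python
import math

def renyi_entropy(p, alpha=3.0):
    """Rényi entropy of order alpha, in bits, of a normalized distribution."""
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

def short_term_renyi(tfd, alpha=3.0):
    """STRE: Rényi entropy of each (normalized) time slice of a TFD."""
    out = []
    for column in tfd:
        s = sum(column)
        out.append(renyi_entropy([v / s for v in column], alpha))
    return out

# idealized TFD slices: one concentrated atom vs. two equal atoms
one_component = [0.0, 1.0, 0.0, 0.0]
two_components = [0.5, 0.0, 0.0, 0.5]
h1, h2 = short_term_renyi([one_component, two_components])
print(2 ** (h2 - h1))  # 2.0: the entropy step counts the components
```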
Bahreinizad, Hossein; Salimi Bani, Milad; Hasani, Mojtaba; Karimi, Mohammad Taghi; Sharifmoradi, Keyvan; Karimi, Alireza
2017-08-09
The influence of various musculoskeletal disorders has been evaluated using different kinetic and kinematic parameters, but the efficiency of walking can also be evaluated by measuring the effort of the subject, in other words, the energy required to walk. The aim of this study was to identify mechanical energy differences between normal and pathological groups. Four groups participated in this study: 15 healthy subjects, 13 Parkinson's subjects, 4 osteoarthritis subjects, and 4 ACL-reconstructed subjects. The motions of the foot, shank, and thigh were recorded using a three-dimensional motion analysis system. The kinetic, potential, and total mechanical energy of each segment were calculated from the 3D marker positions and anthropometric measurements. The maximum value and sample entropy of the energies were compared between the normal and abnormal subjects. The maximum potential energy of the OA subjects was lower than that of the normal subjects. Furthermore, the sample entropy of mechanical energy was lower for the Parkinson's subjects and higher for the ACL subjects in comparison with the normal subjects. The findings of this study suggest that subjects with different abilities show different mechanical energy patterns during walking.
Multi-GPU maximum entropy image synthesis for radio astronomy
NASA Astrophysics Data System (ADS)
Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.
2018-01-01
The maximum entropy method (MEM) is a well-known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it achieves better resolution and image quality under certain conditions. This work presents a high-performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single- and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that a speedup of 1000 to 5000 times over a sequential version can be achieved, depending on the data and image size. This allows the HD142527 CO(6-5) short-baseline data set to be reconstructed in 2.1 min, instead of the 2.5 days taken by a sequential version on CPU.
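The MEM objective can be illustrated in miniature: minimize a chi-squared data term plus a (negative) entropy regularizer over a positive image. This is a hedged toy on a 12-pixel ring solved by projected gradient descent, not the paper's non-gridding GPU algorithm, which uses far more capable optimizers.

```python
import math

def mem_deconvolve(y, K, lam=0.01, lr=0.2, iters=3000):
    """Toy MEM: minimize 0.5*||K x - y||^2 + lam * sum(x ln x)
    (the second term is minus the image entropy) with x kept positive."""
    m, n = len(K), len(K[0])
    x = [0.5] * n
    for _ in range(iters):
        r = [sum(K[j][i] * x[i] for i in range(n)) - y[j] for j in range(m)]
        for i in range(n):
            g = sum(K[j][i] * r[j] for j in range(m)) \
                + lam * (math.log(x[i]) + 1.0)
            x[i] = max(x[i] - lr * g, 1e-9)   # positivity projection
    return x

def chi2(x, K, y):
    return sum((sum(K[j][i] * x[i] for i in range(len(x))) - y[j]) ** 2
               for j in range(len(y)))

# circulant 3-point blur ("dirty beam") on a ring of 12 pixels
n = 12
K = [[0.0] * n for _ in range(n)]
for j in range(n):
    K[j][(j - 1) % n], K[j][j], K[j][(j + 1) % n] = 0.25, 0.5, 0.25
x_true = [0.0] * n
x_true[3], x_true[8] = 1.0, 0.5
y = [sum(K[j][i] * x_true[i] for i in range(n)) for j in range(n)]

x_hat = mem_deconvolve(y, K)
print(chi2(x_hat, K, y) < chi2([0.5] * n, K, y))  # True: the fit improves
```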
Inhomogeneity of epidemic spreading with entropy-based infected clusters.
Wen-Jie, Zhou; Xing-Yuan, Wang
2013-12-01
Considering the difference in the sizes of the infected clusters in dynamic complex networks, the normalized entropy based on infected clusters (δ*) is proposed to characterize the inhomogeneity of epidemic spreading. δ* gives information on the variability of the infected clusters in the system. We investigate the variation in the inhomogeneity of the distribution of the epidemic with the absolute velocity v of the moving agents, the infection density ρ, and the interaction radius r. By comparing δ* in the dynamic networks with δH* in the homogeneous mode, the simulation experiments show that the inhomogeneity of epidemic spreading becomes smaller with increases in v, ρ, and r.
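The normalized entropy of infected-cluster sizes can be sketched as below. This is a hedged reading of δ*: the Shannon entropy of the cluster-size distribution normalized by its maximum; the paper's exact definition may differ in detail.

```python
import math

def normalized_cluster_entropy(cluster_sizes):
    """Normalized Shannon entropy of the infected-cluster-size
    distribution: 1.0 when all clusters are equal (homogeneous
    spreading), smaller when a few clusters dominate."""
    k = len(cluster_sizes)
    if k < 2:
        return 0.0
    total = sum(cluster_sizes)
    h = -sum(s / total * math.log(s / total) for s in cluster_sizes if s > 0)
    return h / math.log(k)

print(abs(normalized_cluster_entropy([10, 10, 10, 10]) - 1.0) < 1e-12)  # True
print(normalized_cluster_entropy([37, 1, 1, 1]) < 0.5)                  # True
```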
Thermodynamics of nuclear track chemical etching
NASA Astrophysics Data System (ADS)
Rana, Mukhtar Ahmed
2018-05-01
This brief paper presents new scientific information on nuclear track chemical etching, describing it using basic concepts of thermodynamics. Enthalpy, entropy and free energy parameters are considered for nuclear track etching. The free energy of etching is determined from etching experiments on fission-fragment tracks in CR-39. The relationship between the free energy and the etching temperature is explored, found to be approximately linear, and discussed. A simple enthalpy-entropy model of chemical etching is presented. The experimental and computational results presented here are of fundamental interest for nuclear track detection methodology.
Refined two-index entropy and multiscale analysis for complex system
NASA Astrophysics Data System (ADS)
Bian, Songhan; Shang, Pengjian
2016-10-01
As a fundamental concept for describing complex systems, the entropy measure has been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, permutation entropy, etc. This paper proposes a new two-index entropy Sq,δ, which we find applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexity of simulated series and effectively classifies several financial markets in various regions of the world.
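Two of the baseline measures the abstract lists can be checked in a few lines; the new Sq,δ itself is not reproduced here. The sketch shows the one-index (Tsallis) entropy recovering BG entropy in the q → 1 limit, and its sensitivity to how peaked a distribution is.

```python
import math

def shannon(p):
    """Boltzmann-Gibbs entropy (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q):
    """One-index (Tsallis) entropy, q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]
# the one-index entropy recovers BG entropy in the q -> 1 limit
print(abs(tsallis(p, 1.0 + 1e-6) - shannon(p)) < 1e-5)  # True
# larger q emphasizes the dominant probabilities (fluctuation range)
flat, peaked = [0.25] * 4, [0.97, 0.01, 0.01, 0.01]
print(tsallis(peaked, 2.0) < tsallis(flat, 2.0))  # True
```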
New Standard State Entropy for Sphene (Titanite)
NASA Astrophysics Data System (ADS)
Manon, M. R.; Dachs, E.; Essene, E. J.
2004-12-01
Several recent papers have questioned the accepted standard-state (STP) entropy of sphene (CaTiSiO5), which had been considered to be in the range 129-132 J/mol.K (Berman, 1988: 129.3 J/mol.K; Robie and Hemingway, 1995: 129.2 J/mol.K; Holland and Powell, 1995: 131.2 J/mol.K). However, Xirouchakis and Lindsley (1998) recommended a much lower value of 106 J/mol.K for the STP entropy of sphene. Tangeman and Xirouchakis (2001) inferred a value less than 124 or 120 J/mol.K, based on enthalpy constraints combined with the tightly reversed reaction sphene + kyanite = rutile + anorthite of Bohlen and Manning (1991). Their recommendations are in conflict with the accepted values for the STP entropy of sphene, including values calculated by direct measurement of Cp from 50 to 300 K by King (1954). In order to resolve this discrepancy, we have collected new data on the Cp of sphene between 5 and 300 K. Our measurements were made in the PPMS at Salzburg on a 21.4 g sample of sphene generously furnished by Tangeman and Xirouchakis (2001), the same sample as used in their experiments. The Cp data are slightly lower than those of King (1954) but merge smoothly with the data of Tangeman and Xirouchakis (2001) from 330 to 483 K, where a transition is recorded in the Cp data as a lambda anomaly. Tangeman and Xirouchakis also obtained data above the transition up to 950 K. Integration of the new Cp data yields an STP entropy of 127.3 J/mol.K, lower than the generally accepted value by ca. 2 J/mol.K. A change in the STP entropy of sphene will affect many Ti-bearing reactions that occur within the earth, although the magnitude of this change is not nearly as large as that suggested by Xirouchakis and Lindsley (1998). Above 700 K, the entropy calculated using the new STP entropy with the heat capacity equation of Tangeman and Xirouchakis (2001) is within 1 J/mol.K of the value tabulated in Robie and Hemingway (1995) and of that calculated from Berman (1988).
The effect on most phase-equilibrium calculations will not be large except for reactions with small ΔS. The use of 127.3 J/mol.K as the standard entropy of sphene is recommended, especially in calculations of geobarometers involving that phase.
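The central computation, integrating Cp/T over the measured temperature range to obtain the calorimetric entropy, can be sketched as a trapezoidal rule. A hedged toy: real determinations also extrapolate below the lowest measured temperature and handle the lambda-anomaly contribution separately.

```python
def entropy_from_cp(temps, cps):
    """S(T_hi) - S(T_lo) = integral of Cp/T dT (trapezoidal rule)."""
    s = 0.0
    for i in range(len(temps) - 1):
        f0 = cps[i] / temps[i]
        f1 = cps[i + 1] / temps[i + 1]
        s += 0.5 * (f0 + f1) * (temps[i + 1] - temps[i])
    return s

# sanity check against Cp = a*T, for which Cp/T is constant
# and the integral is exactly a*(T_hi - T_lo)
a = 0.1
temps = list(range(5, 301, 5))        # 5 K to 300 K
cps = [a * t for t in temps]
print(entropy_from_cp(temps, cps))    # 29.5 (= 0.1 * 295)
```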
On the pH Dependence of the Potential of Maximum Entropy of Ir(111) Electrodes.
Ganassin, Alberto; Sebastián, Paula; Climent, Víctor; Schuhmann, Wolfgang; Bandarenka, Aliaksandr S; Feliu, Juan
2017-04-28
Studies of the entropy of the components forming the electrode/electrolyte interface can give fundamental insight into the properties of electrified interphases. In particular, the potential at which the entropy of formation of the double layer is maximal (the potential of maximum entropy, PME) is an important parameter for the characterization of electrochemical systems. Indeed, this parameter affects the majority of electrode processes. In this work, we determine PMEs for Ir(111) electrodes, which currently play an important role in understanding electrocatalysis for energy provision; at the same time, iridium is one of the most stable metals against corrosion. For the experiments, we used a combination of the laser-induced potential transient to determine the PME and CO charge displacement to determine the potential of zero total charge (EPZTC). Both the PME and EPZTC were assessed for perchlorate solutions in the pH range from 1 to 4. Surprisingly, we found that these are located in the potential regions where the adsorption of hydrogen and hydroxyl species takes place, respectively. The PMEs shifted by ~30 mV per pH unit (on the RHE scale). Connections between the PME and the electrocatalytic properties of the electrode surface are discussed.
Chatter detection in milling process based on VMD and energy entropy
NASA Astrophysics Data System (ADS)
Liu, Changfu; Zhu, Lida; Ni, Chenbing
2018-05-01
This paper presents a novel approach to detecting milling chatter based on Variational Mode Decomposition (VMD) and energy entropy. VMD has already been employed for feature extraction from non-stationary signals, but its parameters, such as the number of modes (K) and the quadratic penalty (α), need to be selected empirically when the raw signal is decomposed. To solve the problem of how to select K and α, an automatic kurtosis-based selection method for the VMD parameters is proposed in this paper. When chatter occurs in the milling process, energy is absorbed into the chatter frequency bands. To detect the chatter frequency bands automatically, a chatter detection method based on energy entropy is presented. A vibration signal containing chatter frequencies is simulated, and three groups of experiments representing three cutting conditions were conducted. To verify the effectiveness of the method presented in this paper, chatter feature extraction was successfully performed on the simulated and experimental signals. The simulation and experimental results show that the proposed method can effectively detect chatter.
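The energy-entropy statistic can be sketched once the mode (band) energies are in hand; VMD itself is outside this sketch and the band energies below are hypothetical numbers. Per the abstract, chatter concentrates energy into the chatter bands, which lowers the entropy.

```python
import math

def energy_entropy(band_energies):
    """Shannon entropy of the normalized energy distribution across the
    decomposed modes (e.g. VMD modes); lower means more concentrated."""
    total = sum(band_energies)
    return -sum(e / total * math.log(e / total)
                for e in band_energies if e > 0)

# hypothetical band energies (illustrative numbers only):
stable = [2.5, 2.0, 1.5, 1.0]   # energy spread over tooth-passing harmonics
chatter = [8.0, 0.4, 0.3, 0.3]  # energy absorbed into a chatter band
print(energy_entropy(chatter) < energy_entropy(stable))  # True
```

A threshold on this entropy, tracked over consecutive signal frames, is one simple way such a statistic can be turned into an online chatter alarm.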
Anomalous phase behavior of first-order fluid-liquid phase transition in phosphorus
NASA Astrophysics Data System (ADS)
Zhao, G.; Wang, H.; Hu, D. M.; Ding, M. C.; Zhao, X. G.; Yan, J. L.
2017-11-01
Although the existence of a liquid-liquid phase transition has become more and more convincing, whether it terminates at a critical point and what the order parameter is remain open questions. To explore these questions, we revisit the fluid-liquid phase transition (FLPT) in phosphorus (P) and study its phase behavior by performing extensive first-principles molecular dynamics simulations. The FLPT observed in experiments is well reproduced, and a fluid-liquid critical point (FLCP) at T = 3000-3500 K, P = 1.5-2.0 kbar is found. With decreasing temperature from the FLCP along the transition line, the density difference (Δρ) between the two coexisting phases first increases from zero and then anomalously decreases; however, the entropy difference (ΔS) increases continuously from zero. These features suggest that an order parameter containing contributions from both the density and the entropy is needed to describe the FLPT in P and that, at least at low temperatures, the entropy, rather than the density, governs the FLPT.
Tang, Jian; Jiang, Xiaoliang
2017-01-01
Image segmentation has always been a considerable challenge in image analysis and understanding due to intensity inhomogeneity, commonly known as the bias field. In this paper, we present a novel region-based approach based on local entropy that segments images and estimates the bias field simultaneously. Firstly, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is the local entropy derived from the grey-level distribution of the local image. The means of this objective function carry a multiplicative factor that estimates the bias field in the transformed domain, and the bias field prior is fully used; therefore, our model can estimate the bias field more accurately. Finally, minimizing this energy function with a level set regularization term yields both the segmentation and the bias field estimate. Experiments on images of various modalities demonstrated the superior performance of the proposed method when compared with other state-of-the-art approaches.
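The local entropy weight can be sketched as the Shannon entropy of the grey-level histogram inside a window (a hedged 1-D toy; the LGDF energy and level-set machinery are not reproduced): homogeneous windows score low, textured or edge windows score high.

```python
import math
from collections import Counter

def local_entropy(window_values, bins=16):
    """Shannon entropy (bits) of the grey-level histogram in a local
    window; intensities are assumed to lie in [0, 1)."""
    counts = Counter(min(int(v * bins), bins - 1) for v in window_values)
    n = len(window_values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

flat_window = [0.5] * 81                 # homogeneous region
edge_window = [0.2] * 40 + [0.8] * 40    # two-sided grey-level mix
print(local_entropy(flat_window) == 0.0)  # True
print(local_entropy(edge_window))         # 1.0
```

Sliding such a window over the image produces the spatially varying weight that the energy functional uses to emphasize informative regions.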
Divalent cation shrinks DNA but inhibits its compaction with trivalent cation.
Tongu, Chika; Kenmotsu, Takahiro; Yoshikawa, Yuko; Zinchenko, Anatoly; Chen, Ning; Yoshikawa, Kenichi
2016-05-28
Our observations reveal the effects of divalent and trivalent cations on the higher-order structure of giant DNA (T4 DNA, 166 kbp) by fluorescence microscopy. We found that the divalent cations Mg(2+) and Ca(2+) inhibit DNA compaction induced by the trivalent cation spermidine (SPD(3+)). On the other hand, in the absence of SPD(3+), divalent cations cause shrinkage of DNA. As a control, we confirmed that a monovalent cation, Na(+), has minimal effect on the DNA higher-order structure. We interpret the competition between the 2+ and 3+ cations in terms of the change in the translational entropy of the counterions. For compaction with SPD(3+), we consider the increase in translational entropy due to the exchange of the intrinsic monovalent cations condensed on the highly charged polyelectrolyte, double-stranded DNA, for the 3+ cations. In contrast, the presence of the 2+ cation decreases the entropy gain from the ion exchange between monovalent and 3+ ions.
NASA Astrophysics Data System (ADS)
Jarabo-Amores, María-Pilar; la Mata-Moya, David de; Gil-Pita, Roberto; Rosa-Zurera, Manuel
2013-12-01
The application of supervised learning machines trained to minimize the cross-entropy error to radar detection is explored in this article. The detector is implemented with a learning machine that realizes a discriminant function, whose output is compared to a threshold selected to fix a desired probability of false alarm. The study is based on calculating the function the learning machine approximates during training, and on applying a sufficient condition for a discriminant function to approximate the optimum Neyman-Pearson (NP) detector. In this article, the function a supervised learning machine approximates after being trained to minimize the cross-entropy error is obtained. This discriminant function can be used to implement the NP detector, which maximizes the probability of detection while keeping the probability of false alarm at or below a predefined value. Experiments on signal detection using neural networks are also presented to test the validity of the study.
NASA Astrophysics Data System (ADS)
Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo
To investigate the reading processes of learners of Japanese, we conducted an experiment recording eye movements during Japanese text reading using an eye-tracking system. We previously showed that native Japanese speakers frequently use "forward and backward jumping eye movements" [13],[14]. In this paper, we analyze the same eye-tracking data further. Our goal is to examine whether Japanese learners fixate at boundaries of linguistic units such as words, phrases, or clauses when they start or end a "backward jump". We consider conventional linguistic boundaries as well as boundaries defined empirically from the entropy of an N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of the syntactic structure of sentences. Our analysis shows that (1) Japanese learners often fixate at linguistic boundaries, and (2) the average entropy is greatest at the fifth depth of the syntactic structure.
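The idea of entropy-defined boundaries can be sketched with a toy character-level model (a generic illustration under assumed inputs; the paper's actual corpus, units, and model order are not reproduced here):

```python
import math
from collections import Counter, defaultdict

def ngram_entropy(text, n=2):
    """Conditional entropy H(next symbol | preceding n-1 symbols) for each
    context of a toy character-level N-gram model. Positions where this
    entropy is high are candidates for empirically defined unit boundaries."""
    ctx_counts = defaultdict(Counter)
    for i in range(len(text) - n + 1):
        ctx, nxt = text[i:i + n - 1], text[i + n - 1]
        ctx_counts[ctx][nxt] += 1
    entropies = {}
    for ctx, counts in ctx_counts.items():
        total = sum(counts.values())
        entropies[ctx] = -sum((c / total) * math.log2(c / total)
                              for c in counts.values())
    return entropies

# A fully periodic string is perfectly predictable: zero entropy everywhere.
print(ngram_entropy("ababab"))
```

On real text, high-entropy contexts tend to coincide with word or phrase boundaries, which is the empirical boundary notion the abstract refers to.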
Microcanonical entropy for classical systems
NASA Astrophysics Data System (ADS)
Franzosi, Roberto
2018-03-01
The definition of entropy in the microcanonical ensemble is revisited. We propose a novel definition of the microcanonical entropy that resolves the debate over the correct definition. In particular, we show that this definition fixes the problem inherent in the exact extensivity of the caloric equation. Furthermore, this entropy reproduces results that agree with those predicted by the standard Boltzmann entropy when applied to macroscopic systems. By contrast, the predictions obtained with the standard Boltzmann entropy and with the entropy we propose differ for small system sizes. Thus, we conclude that the Boltzmann entropy provides a correct description for macroscopic systems, whereas extremely small systems are better described by the entropy proposed here.
The DOSY experiment provides insights into the protegrin-lipid interaction
NASA Astrophysics Data System (ADS)
Malliavin, T. E.; Louis, V.; Delsuc, M. A.
1998-02-01
The measurement of translational diffusion using PFG NMR has seen renewed interest with the development of DOSY experiments. Extracting diffusion coefficients from these experiments requires an inverse Laplace transform. We present here the use of the Maximum Entropy technique to perform this transform, and an application of the method to investigate the protegrin-lipid interaction. We show that analysis by DOSY experiments makes it possible to determine some of the features of this interaction.
NASA Astrophysics Data System (ADS)
Hu, Anzi; Freericks, J. K.; Maśka, M. M.; Williams, C. J.
2011-04-01
We discuss the application of a strong-coupling expansion (perturbation theory in the hopping) for studying light-Fermi-heavy-Bose (e.g., 40K-87Rb) mixtures in optical lattices. We use the strong-coupling method to evaluate the efficiency of preforming molecules, the entropy per particle, and the thermal fluctuations. We show that within the strong-interaction regime (and at high temperature), the strong-coupling expansion is an economical way to study this problem; in some cases, it remains valid even down to low temperatures. Because the computational effort is minimal, the strong-coupling approach allows us to work with much larger system sizes, where boundary effects can be eliminated, which is particularly important at higher temperatures. Since the approach is so efficient and accurate, it allows one to rapidly scan parameter space to optimize the preforming of molecules on a lattice (by choosing the lattice depth and interspecies attraction). Based on the strong-coupling calculations, we test a thermometry scheme based on the fluctuation-dissipation theorem and find that it gives accurate temperature estimates even at very low temperature. We believe this approach and these results will be useful in designing the next generation of experiments and will hopefully lead to the ability to form dipolar matter in the quantum degenerate regime.
On S-mixing entropy of quantum channels
NASA Astrophysics Data System (ADS)
Mukhamedov, Farrukh; Watanabe, Noboru
2018-06-01
In this paper, an S-mixing entropy of quantum channels is introduced as a generalization of Ohya's S-mixing entropy, and several of its properties are investigated. Moreover, relations between the S-mixing entropy and the existing map and output entropies of quantum channels are established. These relations allow us to draw connections between separable states and the introduced entropy, yielding a sufficient condition for detecting entangled states. Finally, the entropies of qubit and phase-damping channels are calculated.
NASA Astrophysics Data System (ADS)
Sanyal, Tanmoy; Shell, M. Scott
2016-07-01
Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.
NASA Astrophysics Data System (ADS)
Zhang, Huiyan; Li, Ran; Zhang, Leilei; Zhang, Tao
2014-04-01
The influence of interchangeable substitution of similar heavy rare-earth elements (HRE), i.e., Gd-Ho, Gd-Er, and Ho-Er, on the magnetic and magnetocaloric properties of HRE55Al27.5Co17.5 metallic glasses was evaluated. The magnetic transition temperature (T_C) can be tuned over a wide range from 8 K to 93 K by adjusting the substitutional concentration in the resulting metallic glasses. A roughly linear correlation between the peak value of the magnetic entropy change (|ΔS_M^pk|) and T_C^(-2/3) was obtained in the three systems. This kind of substitutional adjustment provides a useful method for designing desirable candidate metallic glasses with high magnetic entropy change, large magnetic cooling efficiency, and tunable T_C for magnetic refrigeration in the nitrogen and hydrogen liquefaction temperature ranges.
On the optimality of code options for a universal noiseless coder
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner
1991-01-01
A universal noiseless coding structure was developed that provides efficient performance over an extremely broad range of source entropy. This is accomplished by adaptively selecting the best of several easily implemented variable-length coding algorithms. Custom VLSI coder and decoder modules capable of processing over 20 million samples per second are currently under development. The first of the code options used in this module development is shown to be equivalent to a class of Huffman codes under the Humblet condition; other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set at specified symbol entropy values. Simulation results obtained on actual aerial imagery confirm the optimality of the scheme. On sources having Gaussian or Poisson distributions, coder performance is also projected through analysis and simulation.
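The flavor of such variable-length code options can be illustrated with a Golomb-Rice code, a standard simple option that adaptive Rice-style coders select among per block (a generic sketch, not the flight module's actual code set):

```python
def rice_encode(value, k):
    """Golomb-Rice code for a nonnegative integer with parameter k:
    unary-coded quotient (value >> k), a terminating 0 bit, then the k
    low-order remainder bits. Small k suits low-entropy sources; an
    adaptive coder picks the k (or option) that minimizes block length."""
    q = value >> k
    bits = "1" * q + "0"                      # unary quotient
    if k:
        bits += format(value & ((1 << k) - 1), "0{}b".format(k))
    return bits

print(rice_encode(9, 2))   # 9 = 2*4 + 1 -> "110" + "01" = "11001"
```

An adaptive selector would simply encode each block with every candidate k and keep the shortest output, prefixed by the option identifier.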
Spin-Orbit Torque from a Magnetic Heterostructure of High-Entropy Alloy
NASA Astrophysics Data System (ADS)
Chen, Tian-Yue; Chuang, Tsao-Chi; Huang, Ssu-Yen; Yen, Hung-Wei; Pai, Chi-Feng
2017-10-01
High-entropy alloys (HEAs) are a family of metallic materials with nearly equal partitions of five or more metals, which may possess mechanical and transport properties different from those of conventional binary or ternary alloys. In this work, we demonstrate current-induced spin-orbit torque (SOT) magnetization switching in a Ta-Nb-Hf-Zr-Ti HEA-based magnetic heterostructure with perpendicular magnetic anisotropy. The maximum dampinglike SOT efficiency of this particular HEA-based magnetic heterostructure is determined by hysteresis-loop-shift measurements to be |ζ_DL^HEA| ≈ 0.033, while that of the Ta control sample is |ζ_DL^Ta| ≈ 0.04. Our results indicate that HEA-based magnetic heterostructures can serve as an alternative group of candidates for SOT device applications, owing to the possibility of tuning buffer-layer properties with more than two constituent elements.
Shock wave induced vaporization of porous solids
NASA Astrophysics Data System (ADS)
Shen, Andy H.; Ahrens, Thomas J.; O'Keefe, John D.
2003-05-01
Strong shock waves generated by hypervelocity impact can induce vaporization in solid materials. To identify the chemical species in shock-induced vapors, one needs to design experiments that drive the system to thermodynamic states in which sufficient vapor is generated for investigation. It is common to use porous media to reach high-entropy, vaporized states in impact experiments. We extended the calculations of Ahrens [J. Appl. Phys. 43, 2443 (1972)] and Ahrens and O'Keefe [The Moon 4, 214 (1972)] to higher distentions (up to five), improved their method with a different impedance-match calculation scheme, and augmented their model with recent thermodynamic and Hugoniot data for metals, minerals, and polymers. Although we reconfirmed the competing effects reported in the previous studies when impacting materials of increasing distention, namely (1) increased entropy production and (2) decreased impedance match, our calculations did not exhibit an optimal entropy-generating distention. Very different impact velocities are needed to initiate vaporization in different materials. For aluminum at distention (m) < 2.2, a minimum impact velocity of 2.7 km/s is required using a tungsten projectile. For ionic solids such as NaCl at distention < 2.2, 2.5 km/s is needed. For carbonate and sulfate minerals, the minimum impact velocities are much lower, ranging from less than 1 to 1.5 km/s.
Entropy and equilibrium via games of complexity
NASA Astrophysics Data System (ADS)
Topsøe, Flemming
2004-09-01
It is suggested that thermodynamical equilibrium equals game-theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with the maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game-theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
Universal bound on the efficiency of molecular motors
NASA Astrophysics Data System (ADS)
Pietzonka, Patrick; Barato, Andre C.; Seifert, Udo
2016-12-01
The thermodynamic uncertainty relation provides an inequality relating any mean current, the associated dispersion and the entropy production rate for arbitrary non-equilibrium steady states. Applying it here to a general model of a molecular motor running against an external force or torque, we show that the thermodynamic efficiency of such motors is universally bounded by an expression involving only experimentally accessible quantities. For motors pulling cargo through a viscous fluid, a universal bound for the corresponding Stokes efficiency follows as a variant. A similar result holds if mechanical force is used to synthesize molecules of high chemical potential. Crucially, no knowledge of the detailed underlying mechano-chemical mechanism is required for applying these bounds.
Engines with ideal efficiency and nonzero power for sublinear transport laws
NASA Astrophysics Data System (ADS)
Koning, Jesper; Indekeu, Joseph O.
2016-11-01
It is known that an engine with ideal efficiency (η = 1 for a chemical engine and e = e_Carnot for a thermal one) has zero power because a reversible cycle takes an infinite time. However, at least from a theoretical point of view, it is possible to conceive (irreversible) engines with nonzero power that reach ideal efficiency. Here this is achieved by replacing the usual linear transport law with a sublinear one and taking the step-function limit of the particle current (chemical engine) or heat current (thermal engine) versus the applied force. It is shown that in taking this limit, exact thermodynamic inequalities relating the currents to the entropy production are not violated.
Possible Origin of Efficient Navigation in Small Worlds
NASA Astrophysics Data System (ADS)
Hu, Yanqing; Wang, Yougui; Li, Daqing; Havlin, Shlomo; di, Zengru
2011-03-01
The small-world phenomenon is one of the most important properties found in social networks. It includes both short path lengths and efficient navigation between two individuals. Kleinberg found that navigation is efficient only if the probability density for an individual to have a friend at distance r scales as P(r) ∼ r^(-1). Although this spatial scaling has been found in many empirical studies, how it emerges has remained unexplained. In this Letter, we propose an origin for this scaling law using the concept of entropy from statistical physics and show that the scaling results from optimizing the collection of information in social networks.
A novel parallel pipeline structure of VP9 decoder
NASA Astrophysics Data System (ADS)
Qin, Huabiao; Chen, Wu; Yi, Sijun; Tan, Yunfei; Yi, Huan
2018-04-01
To improve the efficiency of the VP9 decoder, a novel parallel pipeline structure is presented in this paper. Following the decoding workflow, the VP9 decoder can be divided into sub-modules comprising entropy decoding, inverse quantization, inverse transform, intra prediction, inter prediction, deblocking, and pixel adaptive compensation. By analyzing the computing time of each module, hotspot modules are located and the causes of the decoder's low efficiency identified. A pipelined decoder structure is then designed using mixed parallel decoding methods of data division and function division. The experimental results show that this structure greatly improves VP9 decoding efficiency.
Quantile based Tsallis entropy in residual lifetime
NASA Astrophysics Data System (ADS)
Khammar, A. H.; Jahanshahi, S. M. A.
2018-02-01
Tsallis entropy is a one-parameter (α) generalization of the Shannon entropy that, unlike the Shannon entropy, is nonadditive. Shannon entropy may be negative for some distributions, but Tsallis entropy can always be made nonnegative by choosing an appropriate value of α. In this paper, we derive the quantile form of this nonadditive entropy in the residual lifetime, namely the residual quantile Tsallis entropy (RQTE), and obtain bounds for it in terms of Renyi's residual quantile entropy. We also relate the RQTE to the proportional hazards model in the quantile setup. Based on the new measure, we propose a stochastic order and aging classes and study their properties. Finally, we prove characterization theorems for some well-known lifetime distributions. It is shown that the RQTE uniquely determines the parent distribution, unlike the residual Tsallis entropy.
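For a discrete distribution, the underlying Tsallis entropy and its Shannon limit can be sketched as follows (the standard textbook definition, not the paper's quantile-based residual form):

```python
import math

def tsallis_entropy(p, alpha):
    """Tsallis entropy S_alpha = (1 - sum_i p_i^alpha) / (alpha - 1).
    Nonadditive for alpha != 1; recovers the Shannon entropy
    -sum_i p_i log p_i in the limit alpha -> 1."""
    if abs(alpha - 1.0) < 1e-9:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

p = [0.5, 0.25, 0.25]
print(tsallis_entropy(p, 2.0))   # (1 - 0.375) / 1 = 0.625
```

Choosing alpha > 1 downweights rare outcomes, which is one route to the nonnegativity property the abstract mentions.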
Spatial econometric analysis of factors influencing regional energy efficiency in China.
Song, Malin; Chen, Yu; An, Qingxian
2018-05-01
The increased environmental pollution and energy consumption caused by China's rapid development have raised considerable public concern and become a focus for the government and the public. This study employs the super-efficiency slack-based model-data envelopment analysis (SBM-DEA) to measure the total-factor energy efficiency of 30 provinces in China. The estimation model for the spatial interaction intensity of regional total-factor energy efficiency is based on Wilson's maximum entropy model. The model is used to analyze the factors that affect the potential value of total-factor energy efficiency using spatial dynamic panel data for the 30 provinces over 2000-2014. The study found differences and spatial correlations in energy efficiency among provinces and regions in China. Energy efficiency in the eastern, central, and western regions fluctuated significantly, mainly because of the strong influence of industrial structure, energy intensity, and technological progress. This research is of great significance to China's energy efficiency and regional coordinated development.
Gold, currencies and market efficiency
NASA Astrophysics Data System (ADS)
Kristoufek, Ladislav; Vosvrda, Miloslav
2016-05-01
Gold and currency markets form a unique pair with specific interactions and dynamics. We focus on the efficiency ranking of gold markets with respect to the currency of purchase. Utilizing the Efficiency Index (EI), based on fractal dimension, approximate entropy, and long-term memory, on a wide portfolio of 142 gold price series in different currencies, we construct an efficiency ranking based on the extended EI methodology we provide. Rather unexpected results are uncovered: gold prices in major currencies lie among the least efficient, whereas very minor currencies are among the most efficient. We argue that such counterintuitive results can be partly attributed to a unique examination period (2011-2014) characterized by quantitative easing and rather unorthodox monetary policies, together with the investigated illegal collusion of major foreign exchange market participants, as well as other factors discussed in some detail.
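One ingredient of the EI, approximate entropy, can be sketched with the standard Pincus-style definition (a generic regularity statistic under assumed parameters m and r, not the paper's exact EI recipe):

```python
import math

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series: the log-likelihood
    that runs of length m that match within tolerance r also match at
    length m+1. Small values mean a regular, predictable series; larger
    values mean an irregular one."""
    def phi(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        fractions = []
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            fractions.append(c / len(templates))
        return sum(math.log(f) for f in fractions) / len(templates)
    return phi(m) - phi(m + 1)

print(approx_entropy([1, 2] * 6))   # ~0: a periodic series is highly regular
```

In an efficiency-index setting, higher entropy (less predictability) would push a price series toward the "efficient" end of the ranking.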
Time-dependent entropy evolution in microscopic and macroscopic electromagnetic relaxation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker-Jarvis, James
This paper is a study of entropy and its evolution in the time and frequency domains upon application of electromagnetic fields to materials. An understanding of entropy and its evolution in electromagnetic interactions bridges the boundaries between electromagnetism and thermodynamics. The approach used here is a Liouville-based statistical-mechanical theory. I show that the microscopic entropy is reversible and the macroscopic entropy satisfies an H theorem. The spectral entropy development can be very useful for studying the frequency response of materials. Using a projection-operator based nonequilibrium entropy, different equations are derived for the entropy and entropy production and are applied to the polarization, magnetization, and macroscopic fields. I begin by proving an exact H theorem for the entropy, progress to application of time-dependent entropy in electromagnetics, and then apply the theory to relevant applications in electromagnetics. The paper concludes with a discussion of the relationship of the frequency-domain form of the entropy to the permittivity, permeability, and impedance.
Entropy flow and entropy production in the human body in basal conditions.
Aoki, I
1989-11-08
Entropy inflow and outflow for the naked human body in basal conditions in the respiration calorimeter due to infrared radiation, convection, evaporation of water and mass-flow are calculated by use of the energetic data obtained by Hardy & Du Bois. Also, the change of entropy content in the body is estimated. The entropy production in the human body is obtained as the change of entropy content minus the net entropy flow into the body. The entropy production thus calculated becomes positive. The magnitude of entropy production per effective radiating surface area does not show any significant variation with subjects. The entropy production is nearly constant at the calorimeter temperatures of 26-32 degrees C; the average in this temperature range is 0.172 J m-2 sec-1 K-1. The forced air currents around the human body and also clothing have almost no effect in changing the entropy production. Thus, the entropy production of the naked human body in basal conditions does not depend on its environmental factors.
NASA Astrophysics Data System (ADS)
Boutet de Monvel, Jacques; Le Calvez, Sophie; Ulfendahl, Mats
2000-05-01
Image restoration algorithms provide efficient tools for recovering part of the information lost in the imaging process of a microscope. We describe recent progress in the application of deconvolution to confocal microscopy. The point spread function of a Biorad-MRC1024 confocal microscope was measured under various imaging conditions and used to process 3D confocal images acquired in an intact preparation of the inner ear developed at Karolinska Institutet. Using these experiments, we investigate the application of denoising methods based on wavelet analysis as a natural regularization of the deconvolution process. Within the Bayesian approach to image restoration, we compare wavelet denoising with the use of a maximum entropy constraint as another natural regularization method. Numerical experiments performed with test images show a clear advantage of the wavelet denoising approach, which allows one to 'cool down' the image with respect to the signal while suppressing much of the fine-scale artifacts that appear during deconvolution due to the presence of noise, incomplete knowledge of the point spread function, or undersampling. We further describe a natural development of this approach, which consists of performing the Bayesian inference directly in the wavelet domain.
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of the information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes's maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy, which we refer to as S_EXT for the extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: Pólya urn processes, which are simple self-reinforcing processes; sample-space-reducing (SSR) processes, which are simple history-dependent processes associated with power-law statistics; and multinomial mixture processes.
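The degenerate functional shared by the three notions for multinomial processes is easy to state concretely (a minimal sketch of H(p) = -∑_i p_i log p_i itself, not of the generalized S_EXT, S_IT, or S_MEP functionals derived in the paper):

```python
import math
from collections import Counter

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i (natural log): the common functional form
    taken by the thermodynamic, information-theoretic, and maximum-entropy
    notions of entropy for multinomial/ergodic processes."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Empirical distribution of a short symbol sequence
seq = "aabbbc"
p = [c / len(seq) for c in Counter(seq).values()]
print(round(shannon_entropy(p), 4))   # 1.0114
```

For history-dependent processes such as Pólya urns, the paper's point is precisely that this single formula no longer serves all three roles at once.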
Enthalpy versus entropy: What drives hard-particle ordering in condensed phases?
Anthamatten, Mitchell; Ou, Jane J.; Weinfeld, Jeffrey A.; ...
2016-07-27
In support of mesoscopic-scale materials processing, spontaneous hard-particle ordering has been actively pursued for over half a century. The generally accepted view that entropy alone can drive hard-particle ordering is evaluated. Furthermore, a thermodynamic analysis of hard-particle ordering was conducted and shown to agree with existing computations and experiments. The conclusions are that (i) hard-particle ordering transitions between states in equilibrium are forbidden at constant volume but allowed at constant pressure; (ii) spontaneous ordering transitions at constant pressure are driven by enthalpy; and (iii) ordering at constant volume necessarily involves a non-equilibrium initial state, which has yet to be rigorously defined.
A Numerical Investigation of the Burnett Equations Based on the Second Law
NASA Technical Reports Server (NTRS)
Comeaux, Keith A.; Chapman, Dean R.; MacCormack, Robert W.; Edwards, Thomas A. (Technical Monitor)
1995-01-01
The Burnett equations have been shown to potentially violate the second law of thermodynamics. The objective of this investigation is to correlate the numerical problems experienced by the Burnett equations with negative production of entropy. The equations have a long history of numerical instability to small-wavelength disturbances. Recently, Zhong corrected the instability problem and made solutions attainable for one-dimensional shock waves and hypersonic blunt bodies. Difficulties still exist, however, when attempting to solve hypersonic flat-plate boundary layers and blunt-body wake flows. Numerical experiments include one-dimensional shock waves, quasi-one-dimensional nozzles, and expanding Prandtl-Meyer flows, and specifically examine the entropy production in these cases.
Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method
Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui
2014-01-01
A bogie system is the key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge, with current practice relying heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
Detection of Dendritic Spines Using Wavelet Packet Entropy and Fuzzy Support Vector Machine.
Wang, Shuihua; Li, Yang; Shao, Ying; Cattani, Carlo; Zhang, Yudong; Du, Sidan
2017-01-01
The morphology of dendritic spines is highly correlated with neuron function, so characterizing spine morphology benefits research on dendritic spines. At present, however, spine types must be labeled manually for statistical analysis. In this work, we propose an approach combining wavelet contour analysis for backbone detection, wavelet packet entropy, and a fuzzy support vector machine for spine classification. The experiments show that this approach is promising: the average detection accuracy reaches 97.3% for "MushRoom", 94.6% for "Stubby", and 97.2% for "Thin" spines.
Liu, Zhigang; Han, Zhiwei; Zhang, Yang; Zhang, Qiaoge
2014-11-01
Multiwavelets possess better properties than traditional wavelets, and the multiwavelet packet transformation provides more high-frequency information. Spectral entropy can serve as an index of the complexity or uncertainty of a signal. This paper defines four multiwavelet packet entropies to extract features of different transmission line faults and uses a radial basis function (RBF) neural network to recognize and classify 10 fault types on power transmission lines. First, the preprocessing and postprocessing problems of multiwavelets are presented, and Shannon entropy and Tsallis entropy are introduced and contrasted. Second, multiwavelet packet energy entropy, time entropy, Shannon singular entropy, and Tsallis singular entropy are defined as feature extraction methods for transmission line fault signals. Third, a scheme for transmission line fault recognition using the multiwavelet packet entropies and an RBF neural network is proposed. Finally, the experimental results show that the scheme with the four multiwavelet packet energy entropies defined in this paper achieves better fault recognition performance. The performance with the SA4 (symmetric-antisymmetric) multiwavelet packet Tsallis singular entropy is the best among the combinations of multiwavelet packets and the four entropies.
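The energy-entropy idea behind such features can be sketched generically: distribute the signal's energy over sub-band coefficient blocks and take the Shannon entropy of that distribution (a generic version of "wavelet packet energy entropy"; the paper's multiwavelet pre/postprocessing and the other three entropies are not reproduced here):

```python
import math

def energy_entropy(subbands):
    """Shannon energy entropy over sub-band coefficient blocks:
    p_j = E_j / E_total with E_j = sum of squared coefficients in block j,
    then H = -sum_j p_j log p_j. Faults that spread energy across bands
    yield high entropy; energy concentrated in one band yields low entropy."""
    energies = [sum(c * c for c in block) for block in subbands]
    total = sum(energies)
    ps = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in ps)

# Energy spread evenly over 4 sub-bands gives the maximal entropy log(4)
print(energy_entropy([[1.0], [1.0], [1.0], [1.0]]))   # ≈ 1.3863
```

A classifier such as the RBF network in the abstract would consume a vector of such per-signal entropy features.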
Minimal entropy probability paths between genome families.
Ahlbrandt, Calvin; Benson, Gary; Casey, William
2004-05-01
We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is numerical, approximating the true solution in the set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in the transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function based on the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth, "rich" optimal probability paths may be numerically estimated by a hybrid method: Newton's method is iterated on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors.
These methods motivate the definition of an elementary distance function that is easier and faster to calculate, works on non-rich vectors, and involves neither variational theory nor differential equations, yet is a better approximation of the minimal entropy path distance than the distance ||b - a||_2. We compute minimal entropy distance matrices for examples of DNA myostatin genes and amino-acid sequences across several species. Output tree dendrograms for our minimal entropy metric are compared with dendrograms based on BLAST and BLAST identity scores.
Recent Developments and Applications of the MMPBSA Method
Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray
2018-01-01
The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
Uniqueness and characterization theorems for generalized entropies
NASA Astrophysics Data System (ADS)
Enciso, Alberto; Tempesta, Piergiulio
2017-12-01
The requirement that an entropy function be composable is key: it means that the entropy of a compound system can be calculated in terms of the entropy of its independent components. We prove that, under mild regularity assumptions, the only composable generalized entropy in trace form is the Tsallis one-parameter family (which contains Boltzmann-Gibbs as a particular case). This result leads to the use of generalized entropies that are not of trace form, such as Rényi’s entropy, in the study of complex systems. In this direction, we also present a characterization theorem for a large class of composable non-trace-form entropy functions with features akin to those of Rényi’s entropy.
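Composability has an explicit form for the Tsallis family: for independent subsystems A and B, S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B), recovering additivity in the Boltzmann-Gibbs limit q → 1. This is easy to verify numerically (a minimal sketch; the distributions and the value of q are arbitrary illustrative choices):

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); q -> 1 gives Boltzmann-Gibbs."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p**q)) / (q - 1.0))

# Two independent subsystems and their joint (product) distribution.
pa = np.array([0.2, 0.3, 0.5])
pb = np.array([0.6, 0.4])
joint = np.outer(pa, pb).ravel()

q = 1.7
sa, sb = tsallis(pa, q), tsallis(pb, q)
lhs = tsallis(joint, q)                     # entropy of the compound system
rhs = sa + sb + (1.0 - q) * sa * sb        # Tsallis composition law
```

The agreement of `lhs` and `rhs` for any product distribution is exactly the composability property the theorem singles out among trace-form entropies.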
Wang, Junmei; Hou, Tingjun
2012-01-01
It is of great interest in modern drug design to accurately calculate the free energies of protein-ligand or nucleic acid-ligand binding. MM-PBSA (Molecular Mechanics-Poisson Boltzmann Surface Area) and MM-GBSA (Molecular Mechanics-Generalized Born Surface Area) have gained popularity in this field. For both methods, the conformational entropy, which is usually calculated through normal mode analysis (NMA), is needed to calculate absolute binding free energies. Unfortunately, NMA is computationally demanding and becomes a bottleneck of the MM-PB/GBSA-NMA methods. In this work, we have developed a fast approach to estimate the conformational entropy based upon solvent accessible surface area calculations. In our approach, the conformational entropy of a molecule, S, is obtained by summing up the contributions of all atoms, regardless of whether they are buried or exposed. Each atom has two types of surface area, solvent accessible surface area (SAS) and buried SAS (BSAS). The two types of surface area are weighted to estimate the contribution of an atom to S. Atoms having the same atom type share the same weight, and a general parameter k is applied to balance the contributions of the two types of surface area. This entropy model was parameterized using a large set of small molecules whose conformational entropies were calculated at the B3LYP/6-31G* level, taking the solvent effect into account. The weighted solvent accessible surface area (WSAS) model was extensively evaluated in three tests. For convenience, TS, the product of temperature T and conformational entropy S, was calculated in these tests; T was set to 298.15 K throughout. First, good correlations were achieved between WSAS TS and NMA TS for 44 protein or nucleic acid systems sampled with molecular dynamics simulations (10 snapshots were collected for the entropy calculations): the mean squared correlation coefficient (R2) was 0.56.
For the 20 complexes, the TS changes upon binding, TΔS, were also calculated, and the mean R2 was 0.67 between NMA and WSAS. In the second test, TS was calculated for 12 protein decoy sets (each set has 31 conformations) generated by the Rosetta software package. Again, good correlations were achieved for all decoy sets: the mean, maximum, and minimum R2 were 0.73, 0.89, and 0.55, respectively. Finally, binding free energies were calculated for 6 protein systems (the number of inhibitors ranges from 4 to 18) using four scoring functions. Compared to the measured binding free energies, the mean R2 over the six protein systems was 0.51, 0.47, 0.40, and 0.43 for MM-GBSA-WSAS, MM-GBSA-NMA, MM-PBSA-WSAS, and MM-PBSA-NMA, respectively. The mean RMS errors of prediction were 1.19, 1.24, 1.41, and 1.29 kcal/mol for the four scoring functions, correspondingly. Therefore, the two scoring functions employing WSAS achieved prediction performance comparable to that of the scoring functions using NMA. It should be emphasized that no minimization was performed prior to the WSAS calculation in the last test. Although WSAS is not as rigorous as physical models such as quasi-harmonic analysis and thermodynamic integration (TI), it is computationally very efficient, as only a surface area calculation is involved and no structural minimization is required. Moreover, WSAS has achieved performance comparable to normal mode analysis. We expect that this model could find applications in fields like high-throughput screening (HTS), molecular docking, and rational protein design, where efficiency is crucial because a large number of compounds, docking poses, or protein models must be evaluated. A list of acronyms and abbreviations used in this work is provided for quick reference. PMID:22497310
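The per-atom form of the WSAS estimate is only summarized in the abstract. One plausible reading — a per-atom-type weight applied to SAS plus k times BSAS, summed over atoms — can be sketched as follows (the functional form, weights, and k value here are all assumptions for illustration, not the paper's fitted parameters):

```python
import numpy as np

def wsas_entropy(sas, bsas, weights, k):
    """
    Weighted solvent-accessible surface area (WSAS) entropy estimate, assumed form:
        S ~ sum_i w(type_i) * (SAS_i + k * BSAS_i)
    sas, bsas : per-atom exposed and buried surface areas (A^2)
    weights   : per-atom weights looked up by atom type (hypothetical values)
    k         : global parameter balancing the two surface-area types
    """
    sas = np.asarray(sas, dtype=float)
    bsas = np.asarray(bsas, dtype=float)
    w = np.asarray(weights, dtype=float)
    return float(np.sum(w * (sas + k * bsas)))
```

Because the estimate is a single weighted sum over atoms, it needs no minimization or Hessian diagonalization, which is the source of the speedup over NMA that the abstract reports.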
NASA Astrophysics Data System (ADS)
Schliesser, Jacob M.; Huang, Baiyu; Sahu, Sulata K.; Asplund, Megan; Navrotsky, Alexandra; Woodfield, Brian F.
2018-03-01
We have measured the heat capacities of several well-characterized bulk and nanophase Fe3O4-Co3O4 and Fe3O4-Mn3O4 spinel solid solution samples, from which the magnetic properties of the transitions and the third-law entropies have been determined. The magnetic transitions show several features attributable to particle-size and magnetic-domain-size effects. From the standard molar entropies, excess entropies of mixing have been generated for these solid solutions and compared with configurational entropies determined previously by assuming appropriate cation and valence distributions. The vibrational and magnetic excess entropies for the bulk materials are comparable in magnitude to the respective configurational entropies, indicating that excess entropies of mixing must be included when analyzing entropies of mixing. The excess entropies for the nanophase materials are even larger than the configurational entropies. Changes in valence, cation distribution, bonding, and microstructure between the mixing ions are the likely sources of the positive excess entropies of mixing.
Abe, Sumiyoshi
2002-10-01
The q-exponential distributions, which are generalizations of the Zipf-Mandelbrot power-law distribution, are frequently encountered in complex systems at their stationary states. From the viewpoint of the principle of maximum entropy, they can apparently be derived from three different generalized entropies: the Rényi entropy, the Tsallis entropy, and the normalized Tsallis entropy. Accordingly, mere fittings of observed data by the q-exponential distributions do not lead to identification of the correct physical entropy. Here, stabilities of these entropies, i.e., their behaviors under arbitrary small deformation of a distribution, are examined. It is shown that, among the three, the Tsallis entropy is stable and can provide an entropic basis for the q-exponential distributions, whereas the others are unstable and cannot represent any experimentally observable quantities.
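For reference, the three candidate entropies can be written down in a few lines; all reduce to Shannon entropy as q → 1. This is a minimal numpy sketch of the definitions only — the stability analysis itself concerns their response to small deformations of the distribution as the number of categories grows, which this snippet does not reproduce:

```python
import numpy as np

def renyi(p, q):
    """Renyi entropy: ln(sum p_i^q) / (1 - q)."""
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p**q)) / (1.0 - q))

def tsallis(p, q):
    """Tsallis entropy: (1 - sum p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return float((1.0 - np.sum(p**q)) / (q - 1.0))

def tsallis_normalized(p, q):
    """Normalized Tsallis entropy: S_q / sum p_i^q (escort normalization)."""
    p = np.asarray(p, dtype=float)
    return tsallis(p, q) / float(np.sum(p**q))
```

All three are maximized by q-exponential distributions under suitable constraints, which is why a fit to data alone cannot discriminate among them; the stability argument of the abstract is what singles out the Tsallis form.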
Atomic Bose-Hubbard Systems with Single-Particle Control
NASA Astrophysics Data System (ADS)
Preiss, Philipp Moritz
Experiments with ultracold atoms in optical lattices provide outstanding opportunities to realize exotic quantum states due to a high degree of tunability and control. In this thesis, I present experiments that extend this control from global parameters to the level of individual particles. Using a quantum gas microscope for 87Rb, we have developed a single-site addressing scheme based on digital amplitude holograms. The system self-corrects for aberrations in the imaging setup and creates arbitrary beam profiles. We are thus able to shape optical potentials on the scale of single lattice sites and control the dynamics of individual atoms. We study the role of quantum statistics and interactions in the Bose-Hubbard model on the fundamental level of two particles. Bosonic quantum statistics are apparent in the Hong-Ou-Mandel interference of massive particles, which we observe in tailored double-well potentials. These underlying statistics, in combination with tunable repulsive interactions, dominate the dynamics in single- and two-particle quantum walks. We observe highly coherent position-space Bloch oscillations, bosonic bunching in Hanbury Brown-Twiss interference and the fermionization of strongly interacting bosons. Many-body states of indistinguishable quantum particles are characterized by large-scale spatial entanglement, which is difficult to detect in itinerant systems. Here, we extend the concept of Hong-Ou-Mandel interference from individual particles to many-body states to directly quantify entanglement entropy. We perform collective measurements on two copies of a quantum state and detect entanglement entropy through many-body interference. We measure the second order Renyi entropy in small Bose-Hubbard systems and detect the buildup of spatial entanglement across the superfluid-insulator transition. Our experiments open new opportunities for the single-particle-resolved preparation and characterization of many-body quantum states.
On the entropy variation in the scenario of entropic gravity
NASA Astrophysics Data System (ADS)
Xiao, Yong; Bai, Shi-Yang
2018-05-01
In the scenario of entropic gravity, entropy varies as a function of the location of the matter, and the tendency toward increasing entropy appears as gravity. We concentrate on studying the entropy variation of a typical gravitational system for different relative positions between the mass and the gravitational source. The result is that the entropy of the system does not increase when the mass is displaced closer to the gravitational source. This disproves the proposal that entropic gravity arises from thermodynamic entropy; it does not exclude the possibility that gravity originates from non-thermodynamic entropy such as entanglement entropy.
NASA Astrophysics Data System (ADS)
Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen
2018-06-01
As part of our efforts to further improve regenerative cooling technology for scramjets, experiments on the dynamic characteristics of thermo-acoustic instability in flowing hydrocarbon fuel have been conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. To gain a deeper understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the MSWE method detects the developing process of thermo-acoustic instability well, even from noise and weak signals, and that the stable regime, the developing process, and the instability can be distinguished. These properties make the method particularly powerful for early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure both influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamics at supercritical pressure based on the wavelet entropy method offers guidance on the control of the scramjet fuel supply, helping to secure stable fuel flow in the regenerative cooling system.
Multiscale permutation entropy analysis of laser beam wandering in isotropic turbulence.
Olivares, Felipe; Zunino, Luciano; Gulich, Damián; Pérez, Darío G; Rosso, Osvaldo A
2017-10-01
We have experimentally quantified the temporal structural diversity from the coordinate fluctuations of a laser beam propagating through isotropic optical turbulence. The main focus here is on the characterization of the long-range correlations in the wandering of a thin Gaussian laser beam over a screen after propagating through a turbulent medium. To fulfill this goal, a laboratory-controlled experiment was conducted in which coordinate fluctuations of the laser beam were recorded at a sufficiently high sampling rate for a wide range of turbulent conditions. Horizontal and vertical displacements of the laser beam centroid were subsequently analyzed by implementing the symbolic technique based on ordinal patterns to estimate the well-known permutation entropy. We show that the permutation entropy estimations at multiple time scales evidence an interplay between different dynamical behaviors. More specifically, a crossover between two different scaling regimes is observed. We confirm a transition from an integrated stochastic process contaminated with electronic noise to a fractional Brownian motion with a Hurst exponent H=5/6 as the sampling time increases. Besides, we are able to quantify, from the estimated entropy, the amount of electronic noise as a function of the turbulence strength. We have also demonstrated that these experimental observations are in very good agreement with numerical simulations of noisy fractional Brownian motions with a well-defined crossover between two different scaling regimes.
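The ordinal-pattern estimator underlying this analysis is the standard Bandt-Pompe permutation entropy; a compact sketch follows (embedding dimension and delay are free parameters here — evaluating the estimator over increasing delays yields the multiscale curve whose crossover the abstract describes):

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, d=4, tau=1):
    """Normalized Bandt-Pompe permutation entropy of order d with delay tau."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (d - 1) * tau
    counts = Counter()
    for i in range(n):
        window = x[i : i + d * tau : tau]          # delay-embedded vector
        counts[tuple(np.argsort(window))] += 1     # its ordinal pattern
    p = np.array(list(counts.values()), dtype=float) / n
    # Normalize by log(d!) so fully random data gives a value near 1.
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(d)))
```

A strictly monotonic series produces a single pattern (entropy 0), while white noise visits all d! patterns nearly uniformly (entropy near 1); intermediate values and their scale dependence are what distinguish the noise-dominated and fractional-Brownian regimes reported in the abstract.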
Mak, Chi H
2015-11-25
While single-stranded (ss) segments of DNAs and RNAs are ubiquitous in biology, details about their structures have only recently begun to emerge. To study ssDNA and RNAs, we have developed a new Monte Carlo (MC) simulation using a free energy model for nucleic acids that has the atomistic accuracy to capture fine molecular details of the sugar-phosphate backbone. Formulated on the basis of a first-principles calculation of the conformational entropy of the nucleic acid chain, this free energy model correctly reproduced both the long and short length-scale structural properties of ssDNA and RNAs in a rigorous comparison against recent data from fluorescence resonance energy transfer, small-angle X-ray scattering, force spectroscopy, and fluorescence correlation transport measurements on sequences up to ∼100 nucleotides long. With this new MC algorithm, we conducted a comprehensive investigation of the entropy landscape of small RNA stem-loop structures. From a simulated ensemble of ∼10^6 equilibrium conformations, the entropy for the initiation of different-size RNA hairpin loops was computed and compared against thermodynamic measurements. Starting from seeded hairpin loops, constrained MC simulations were then used to estimate the entropic costs associated with propagation of the stem. The numerical results provide new direct molecular insights into thermodynamic measurements from macroscopic calorimetry and melting experiments.
Theory and Normal Mode Analysis of Change in Protein Vibrational Dynamics on Ligand Binding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mortisugu, Kei; Njunda, Brigitte; Smith, Jeremy C
2009-12-01
The change of protein vibrations on ligand binding is of functional and thermodynamic importance. Here, this process is characterized using a simple analytical 'ball-and-spring' model and all-atom normal-mode analysis (NMA) of the binding of the cancer drug, methotrexate (MTX) to its target, dihydrofolate reductase (DHFR). The analytical model predicts that the coupling between protein vibrations and ligand external motion generates entropy-rich, low-frequency vibrations in the complex. This is consistent with the atomistic NMA which reveals vibrational softening in forming the DHFR-MTX complex, a result also in qualitative agreement with neutron-scattering experiments. Energy minimization of the atomistic bound-state (B) structure while gradually decreasing the ligand interaction to zero allows the generation of a hypothetical 'intermediate' (I) state, without the ligand force field but with a structure similar to that of B. In going from I to B, it is found that the vibrational entropies of both the protein and MTX decrease while the complex structure becomes enthalpically stabilized. However, the relatively weak DHFR:MTX interaction energy results in the net entropy gain arising from coupling between the protein and MTX external motion being larger than the loss of vibrational entropy on complex formation. This, together with the I structure being more flexible than the unbound structure, results in the observed vibrational softening on ligand binding.
Logarithmic black hole entropy corrections and holographic Rényi entropy
NASA Astrophysics Data System (ADS)
Mahapatra, Subhash
2018-01-01
The entanglement and Rényi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Rényi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at order G_D^0. The entropic c-function and the inequalities of the Rényi entropy are also satisfied even with the correction terms.
Maximum Relative Entropy of Coherence: An Operational Coherence Measure.
Bu, Kaifeng; Singh, Uttam; Fei, Shao-Ming; Pati, Arun Kumar; Wu, Junde
2017-10-13
The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on maximum relative entropy. We prove that the maximum relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of the maximum relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the maximum relative entropy of coherence. By introducing a suitable smooth maximum relative entropy of coherence, we prove that the smooth maximum relative entropy of coherence provides a lower bound of one-shot coherence cost, and the maximum relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similar to the maximum relative entropy of coherence, the minimum relative entropy of coherence has also been investigated. We show that the minimum relative entropy of coherence provides an upper bound of one-shot coherence distillation, and in the asymptotic limit the minimum relative entropy of coherence is equivalent to the relative entropy of coherence.
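In the asymptotic limit the quantifier reduces to the ordinary relative entropy of coherence, C_r(ρ) = S(Δ(ρ)) − S(ρ), where Δ dephases the state in the incoherent basis. That limiting quantity is straightforward to compute (a sketch; the incoherent basis is taken to be the computational basis, and entropies are in bits):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr rho log2 rho, computed from the eigenvalues."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]            # drop (numerically) zero eigenvalues
    return float(-np.sum(ev * np.log2(ev)))

def rel_entropy_of_coherence(rho):
    """C_r(rho) = S(diag(rho)) - S(rho) in the fixed incoherent basis."""
    dephased = np.diag(np.diag(rho))
    return von_neumann_entropy(dephased) - von_neumann_entropy(rho)
```

For the maximally coherent qubit state |+><+| this gives 1 bit, and for any incoherent (diagonal) state it gives 0, matching the boundary cases any coherence measure must satisfy.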
NASA Astrophysics Data System (ADS)
Jamshed, Wasim; Aziz, Asim
2018-06-01
The efficiency of any nanofluid-based thermal solar system depends on the thermophysical properties of the operating fluids, the type and shape of the nanoparticles, the volumetric concentration of nanoparticles in the base fluid, and the geometry/length of the system in which the fluid is flowing. Recent research in thermal solar energy has focused on increasing the efficiency of solar thermal collector systems. In the present research, a simplified mathematical model is studied for inclusion in thermal solar systems with the aim of improving the overall efficiency of the system. The flow of a Powell-Eyring nanofluid is induced by non-uniform stretching of a porous horizontal surface, with the fluid occupying the space over the surface. The thermal conductivity of the nanofluid is assumed to vary as a linear function of temperature, and the thermal radiation is assumed to travel only a short distance in the optically thick nanofluid. The Keller box numerical scheme is implemented on the system of nonlinear ordinary differential equations that results from applying a similarity transformation to the governing nonlinear partial differential equations. The impact of the nondimensional physical parameters appearing in the system on the velocity and temperature profiles, as well as on the entropy of the system, has been observed. The velocity gradient (skin friction coefficient) and the strength of convective heat exchange (Nusselt number) are also investigated.
Characterising RNA secondary structure space using information entropy
2013-01-01
Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
The evolutionary synchronization of the exchange rate system in ASEAN+6
NASA Astrophysics Data System (ADS)
Feng, Xiaobing; Hu, Haibo; Wang, Xiaofan
2010-12-01
Although there is extensive research on the behavior of the world currency network, the complexity of the Asian regional currency system is not well understood despite its importance. Using daily exchange rates, this paper examines exchange rate co-movements in the region before and after China's exchange rate reform. It was found that the correlation between Asian currencies and the US Dollar, previously the regional key currency, has become weaker, while intra-Asia interactions have increased. Cross-sample entropy and cross entropy approaches are also applied to examine the synchrony among the Asian currencies. The study also shows that the Asian exchange rate markets are neither purely stochastic nor efficient. These findings may shed light on the collective behaviors in a regional currency network; they also lay a theoretical foundation for further policy formulation in Asian currency integration.
NASA Astrophysics Data System (ADS)
Pei, Zongrui; Eisenbach, Markus
2017-06-01
Dislocations are among the most important defects in determining the mechanical properties of both conventional alloys and high-entropy alloys. The Peierls-Nabarro model supplies an efficient pathway to their geometries and mobility. The difficulty in solving the integro-differential Peierls-Nabarro equation is how to effectively avoid local minima in the energy landscape of a dislocation core. Among the methods available to optimize dislocation core structures, we choose Particle Swarm Optimization, an algorithm that simulates the social behavior of organisms. By employing more particles (a bigger swarm) and more iterative steps (allowing them to explore for a longer time), local minima can be effectively avoided, although at greater computational cost. The advantage of this algorithm is that it is readily parallelized on modern high-performance computing architectures. We demonstrate that the performance of our parallelized algorithm scales linearly with the number of cores employed.
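A minimal global-best PSO loop conveys the idea (a generic serial sketch with conventional inertia and acceleration parameters, not the paper's parallel implementation; in the application each particle would encode a discretized dislocation core profile, and the per-particle energy evaluations are what would be distributed across cores):

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer with a global-best topology."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))       # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # per-particle best
    pbest_f = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_f)].copy()              # swarm best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)             # embarrassingly parallel step
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())
```

The fitness evaluations inside each iteration are independent, which is why the method parallelizes so naturally: the speedup is bounded only by the per-iteration synchronization on the global best.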
Neural net classification of REM sleep based on spectral measures as compared to nonlinear measures.
Grözinger, M; Fell, J; Röschke, J
2001-11-01
In various studies, the implementation of nonlinear and nonconventional measures has significantly improved EEG (electroencephalogram) analyses compared to using conventional parameters alone. We investigated, in this regard, a neural network algorithm well validated in our laboratory for the automatic recognition of rapid eye movement (REM) sleep. The network was originally based on a broad range of spectral power inputs; we additionally supplied the nonlinear measures of the largest Lyapunov exponent and correlation dimension, as well as the nonconventional stochastic measures of spectral entropy and entropy of amplitudes. No improvement in the detection of REM sleep could be achieved by the inclusion of the new measures. The accuracy of the classification was significantly worse, however, when the network was supplied with these variables alone. In view of results demonstrating the efficiency of nonconventional measures in EEG analysis, the benefit appears to depend on the nature of the problem.
NASA Astrophysics Data System (ADS)
Li, Yongbo; Yang, Yuantao; Li, Guoyan; Xu, Minqiang; Huang, Wenhu
2017-07-01
Health condition identification of planetary gearboxes is crucial to reduce downtime and maximize productivity. This paper develops a novel fault diagnosis method based on modified multi-scale symbolic dynamic entropy (MMSDE) and minimum redundancy maximum relevance (mRMR) to identify the different health conditions of a planetary gearbox. MMSDE is proposed to quantify the regularity of time series and can assess dynamical characteristics over a range of scales; it has clear advantages in detecting dynamical changes and in computational efficiency. The mRMR approach is then introduced to refine the fault features. Lastly, the obtained features are fed into a least squares support vector machine (LSSVM) to complete the fault pattern identification. The proposed method is numerically and experimentally demonstrated to recognize the different fault types of planetary gearboxes.
Martiniani, Stefano; Schrenk, K Julian; Stevenson, Jacob D; Wales, David J; Frenkel, Daan
2016-01-01
We present a numerical calculation of the total number of disordered jammed configurations Ω of N repulsive, three-dimensional spheres in a fixed volume V. To make these calculations tractable, we increase the computational efficiency of the approach of Xu et al. [Phys. Rev. Lett. 106, 245502 (2011)10.1103/PhysRevLett.106.245502] and Asenjo et al. [Phys. Rev. Lett. 112, 098002 (2014)10.1103/PhysRevLett.112.098002] and we extend the method to allow computation of the configurational entropy as a function of pressure. The approach that we use computes the configurational entropy by sampling the absolute volume of basins of attraction of the stable packings in the potential energy landscape. We find a surprisingly strong correlation between the pressure of a configuration and the volume of its basin of attraction in the potential energy landscape. This relation is well described by a power law. Our methodology to compute the number of minima in the potential energy landscape should be applicable to a wide range of other enumeration problems in statistical physics, string theory, cosmology, and machine learning that aim to find the distribution of the extrema of a scalar cost function that depends on many degrees of freedom.
Query construction, entropy, and generalization in neural-network models
NASA Astrophysics Data System (ADS)
Sollich, Peter
1994-05-01
We study query construction algorithms, which aim at improving the generalization ability of systems that learn from examples by choosing optimal, nonredundant training sets. We set up a general probabilistic framework for deriving such algorithms from the requirement of optimizing a suitable objective function; specifically, we consider the objective functions entropy (or information gain) and generalization error. For two learning scenarios, the high-low game and the linear perceptron, we evaluate the generalization performance obtained by applying the corresponding query construction algorithms and compare it to training on random examples. We find qualitative differences between the two scenarios due to the different structure of the underlying rules (nonlinear and ``noninvertible'' versus linear); in particular, for the linear perceptron, random examples lead to the same generalization ability as a sequence of queries in the limit of an infinite number of examples. We also investigate learning algorithms which are ill matched to the learning environment and find that, in this case, minimum entropy queries can in fact yield a lower generalization ability than random examples. Finally, we study the efficiency of single queries and its dependence on the learning history, i.e., on whether the previous training examples were generated randomly or by querying, and the difference between globally and locally optimal query construction.
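For the high-low game, entropy-based query construction can be sketched directly: score each candidate query by the expected entropy of the posterior it leaves, then ask the one that minimizes it. This toy discretization (the threshold encoding and function names are illustrative, not the paper's formulation) recovers bisection for a uniform prior:

```python
import numpy as np

def shannon_bits(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def best_query(prior):
    """Pick the query x minimizing expected posterior entropy for a
    threshold game: the outcome reports whether theta >= x."""
    n = len(prior)
    best_x, best_h = None, np.inf
    for x in range(1, n):
        p_hi = prior[x:].sum()                     # P(theta >= x)
        h = 0.0
        for mask, pm in ((np.arange(n) >= x, p_hi),
                         (np.arange(n) < x, 1.0 - p_hi)):
            if pm > 0:
                post = np.where(mask, prior, 0.0) / pm
                h += pm * shannon_bits(post)       # expected remaining entropy
        if h < best_h:
            best_x, best_h = x, h
    return best_x, best_h
```

Minimizing expected posterior entropy is equivalent to maximizing expected information gain; for the nonlinear high-low rule this beats random examples, whereas (as the abstract notes) for a linear rule the advantage can vanish asymptotically.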
NASA Astrophysics Data System (ADS)
Novak, Nikola; Weyland, Florian; Patel, Satyanarayan; Guo, Hanzheng; Tan, Xiaoli; Rödel, Jürgen; Koruza, Jurij
2018-03-01
The electrocaloric effect in ferroics is considered a promising basis for solid-state cooling technology. Its potential is closely tied to the inverse electrocaloric effect, and both rest on mechanisms that decrease or increase the dipolar entropy under an applied electric field. Nevertheless, the mechanism underlying the increase of the dipolar entropy with applied electric field remains unclear and controversial. This study investigates the electrocaloric response of the antiferroelectric Pb0.99Nb0.02[(Zr0.58Sn0.43)0.92Ti0.08]0.98O3, in which the critical electric field is low enough to induce the ferroelectric phase over a broad temperature range. Utilizing temperature- and electric-field-dependent dielectric measurements, direct electrocaloric measurements, and in situ transmission electron microscopy, a crossover from conventional to inverse electrocaloric response is demonstrated. The origin of the inverse electrocaloric effect is rationalized by investigating the field-induced phase transition between the antiferroelectric and ferroelectric phases. The disappearance of the latent heat at the field-induced transition coincides with the crossover of the electrocaloric effect and demonstrates that the overall electrocaloric response is an interplay of different entropy contributions. This opens new opportunities for highly efficient, environmentally friendly cooling devices based on ferroic materials.
An Improved Evidential-IOWA Sensor Data Fusion Approach in Fault Diagnosis
Zhou, Deyun; Zhuang, Miaoyan; Fang, Xueyi; Xie, Chunhe
2017-01-01
As an important tool of information fusion, Dempster–Shafer evidence theory is widely applied to handling uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to failure in locating the fault. To deal with this problem, an improved evidential-Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed in the frame of Dempster–Shafer evidence theory. In the new method, the IOWA operator is used to determine the weight of the different sensor data sources; in determining the parameters of the IOWA operator, both the distance of evidence and the belief entropy are taken into consideration. First, based on the global distance of evidence and the global belief entropy, the α value of the IOWA operator is obtained. Simultaneously, a weight vector is given based on the maximum entropy method model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method performs better in conflict management and fault diagnosis because the information volume of each body of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method. PMID:28927017
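The combination step the abstract builds on is standard Dempster–Shafer; a minimal sketch follows (illustrative only: the IOWA weighting and the α-parameter selection of the paper are not reproduced, and the belief entropy is assumed here to be Deng entropy, as is common in this literature).

```python
import math
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions,
    given as dicts mapping frozenset hypotheses to mass."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 - conflict                 # normalisation factor
    return {s: v / k for s, v in combined.items()}

def deng_entropy(m):
    """Deng (belief) entropy: generalises Shannon entropy to mass
    functions; the 2^|A| - 1 term accounts for the hypothesis set size."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)

# Two hypothetical sensors reporting on faults F1, F2:
m1 = {frozenset({'F1'}): 0.6, frozenset({'F1', 'F2'}): 0.4}
m2 = {frozenset({'F1'}): 0.5, frozenset({'F2'}): 0.5}
fused = dempster_combine(m1, m2)   # {F1}: 5/7, {F2}: 2/7
```

For singleton-only mass functions, Deng entropy reduces to the ordinary Shannon entropy, which is why it can serve as an "information volume" weight in fusion schemes like the one described.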
Double symbolic joint entropy in nonlinear dynamic complexity analysis
NASA Astrophysics Data System (ADS)
Yao, Wenpo; Wang, Jun
2017-07-01
Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods (the symbolic transformations of Wessel et al.'s symbolic entropy and of base-scale entropy) and two local dynamic methods (the symbolizations of permutation entropy and of differential entropy) constitute four double symbolic joint entropies, which accurately detect complexity in the chaotic logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of healthy young subjects have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations performs best. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
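Of the symbolizations mentioned, the ordinal-pattern (permutation) approach is the most compact to illustrate. A minimal sketch of normalized permutation entropy, assuming the usual convention (Shannon entropy over ordinal patterns, normalized by log(order!)); the paper's joint-entropy combination of two symbolizations is not reproduced:

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1):
    """Normalised permutation entropy: Shannon entropy of the ordinal
    patterns of length `order`, divided by log(order!) so that the
    result lies in [0, 1]."""
    patterns = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = tuple(series[i + j * delay] for j in range(order))
        # ordinal pattern = ranks of the window's values
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(patterns.values())
    h = -sum(c / total * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))
```

A monotone series yields a single pattern (entropy 0), while the chaotic logistic map produces a value strictly between 0 and 1, since one ordinal pattern of order 3 is forbidden for that map.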
Effect of entropy on anomalous transport in ITG-modes of magneto-plasma
NASA Astrophysics Data System (ADS)
Yaqub Khan, M.; Qaiser Manzoor, M.; Haq, A. ul; Iqbal, J.
2017-04-01
The ideal gas equation and S = c_v log(P/ρ) (where S is entropy, P is pressure and ρ is the mass density) define the interconnection of entropy with the temperature and density of a plasma. Therefore, different phenomena relating plasma and entropy need to be investigated. By employing the Braginskii transport equations for a nonuniform electron-ion magnetoplasma, two new parameters, the entropy distribution function and the entropy gradient drift, are defined; a new dispersion relation is obtained; and the dependence of anomalous transport on entropy is proved. Some results, such as monotonicity, the entropy principle and the second law of thermodynamics, are proved with the new definition of entropy. This work will open new horizons in fusion processes, not only by controlling entropy in tokamak plasmas (particularly in the pedestal regions of the H-mode) and space plasmas, but also in the engineering sciences.
Approximate entropy: a new evaluation approach of mental workload under multitask conditions
NASA Astrophysics Data System (ADS)
Yao, Lei; Li, Xiaoling; Wang, Wei; Dong, Yuanzhe; Jiang, Ying
2014-04-01
There are numerous instruments and an abundance of complex information in the traditional cockpit display-control system, and pilots require a long time to familiarize themselves with the cockpit interface. This can cause accidents when they cope with emergency events, suggesting that it is necessary to evaluate pilot cognitive workload. In order to establish a simplified method for evaluating cognitive workload under multitask conditions, we designed a series of experiments involving different instrument panels and collected electroencephalograms (EEG) from 10 healthy volunteers. The data were classified and analyzed with approximate entropy (ApEn) signal processing. ApEn increased with increasing experiment difficulty, suggesting that it can be used to evaluate cognitive workload. Our results demonstrate that ApEn can be used as an evaluation criterion for cognitive workload and has good specificity and sensitivity. Moreover, we determined an empirical formula to assess the cognitive workload interval, which can simplify cognitive workload evaluation under multitask conditions.
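Approximate entropy follows Pincus's definition; a minimal pure-Python sketch (the choices m = 2 and r = 0.2 are common conventions, not necessarily the settings used in the study, and r is normally scaled to the signal's standard deviation):

```python
import math

def approximate_entropy(u, m=2, r=0.2):
    """ApEn(m, r) of Pincus: a regularity statistic.
    Lower values mean a more regular (predictable) signal."""
    n = len(u)

    def phi(m):
        # all length-m template vectors
        x = [u[i:i + m] for i in range(n - m + 1)]
        logs = 0.0
        for xi in x:
            # fraction of templates within Chebyshev distance r (self-match included)
            c = sum(1 for xj in x
                    if max(abs(a - b) for a, b in zip(xi, xj)) <= r)
            logs += math.log(c / len(x))
        return logs / len(x)

    return phi(m) - phi(m + 1)
```

A strictly alternating signal gives ApEn near 0, while an i.i.d. random signal of the same length gives a clearly larger value, mirroring the workload-difficulty trend described in the abstract.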
Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria
2015-01-01
The choice of a suitable area in which to lay eggs contributes to spiders' Darwinian fitness. Despite its importance, the underlying factors behind this key decision are generally poorly understood. Here, we designed a multidisciplinary study, based both on in-field data and laboratory experiments, focusing on the European cave spider Meta menardi (Araneae, Tetragnathidae) and aiming at understanding the selective forces driving the female in the choice of the depositional area. Our in-field data analysis demonstrated a major role of air velocity and distance from the cave entrance within a particular cave in driving the female's choice. This has been interpreted using a model based on the Entropy Generation Minimization (EGM) method, without invoking best-fit parameters and thanks to independent lab experiments, thus demonstrating that the female chooses the depositional area according to the minimal level of thermo-fluid-dynamic irreversibility. This methodology may pave the way to a novel approach for understanding evolutionary strategies of other living organisms. PMID:25556697
Mechanical transduction via a single soft polymer
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Wang, Nan; Bao, Weizhu; Wang, Zhisong
2018-04-01
Molecular machines from biology and nanotechnology often depend on soft structures to perform mechanical functions, but the underlying mechanisms and advantages or disadvantages over rigid structures are not fully understood. We report here a rigorous study of mechanical transduction along a single soft polymer based on exact solutions to the realistic three-dimensional wormlike-chain model and augmented with analytical relations derived from simpler polymer models. The results reveal surprisingly that a soft polymer with vanishingly small persistence length below a single chemical bond still transduces biased displacement and mechanical work up to practically significant amounts. This "soft" approach possesses unique advantages over the conventional wisdom of rigidity-based transduction, and potentially leads to a unified mechanism for effective allostery-like transduction and relay of mechanical actions, information, control, and molecules from one position to another in molecular devices and motors. This study also identifies an entropy limit unique to soft transduction, and thereby suggests the possibility of detecting higher efficiency for the kinesin motor and its mutants in future experiments.
NodePM: A Remote Monitoring Alert System for Energy Consumption Using Probabilistic Techniques
Filho, Geraldo P. R.; Ueyama, Jó; Villas, Leandro A.; Pinto, Alex R.; Gonçalves, Vinícius P.; Pessin, Gustavo; Pazzi, Richard W.; Braun, Torsten
2014-01-01
In this paper, we propose an intelligent method, named the Novelty Detection Power Meter (NodePM), to detect novelties in electronic equipment monitored by a smart grid. Considering the entropy of each monitored device, which is calculated based on a Markov chain model, the proposed method identifies novelties through a machine learning algorithm. To this end, the NodePM is integrated into a platform for the remote monitoring of energy consumption, which consists of a wireless sensor network (WSN). It should be stressed that, unlike many related works, which are evaluated in simulated environments, our experiments were conducted in real environments. The results show that the NodePM reduces the power consumption of the monitored equipment by 13.7%. In addition, the NodePM detects novelties more efficiently than an approach from the literature, surpassing it in different scenarios in all the evaluations carried out. PMID:24399157
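The abstract does not spell out which entropy of the device's Markov chain model is used; one standard candidate is the entropy rate of the chain, sketched here under that assumption (the transition matrix and power-iteration approach are illustrative, not the paper's implementation):

```python
import math

def entropy_rate(P, iters=500):
    """Entropy rate (bits/step) of a finite Markov chain with row-stochastic
    transition matrix P, using power iteration for the stationary
    distribution: H = -sum_i pi_i sum_j P_ij log2 P_ij."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)
```

A device whose state transitions look like fair coin flips has the maximal rate of 1 bit per step, while a deterministic on/off cycle has rate 0; a sudden change in this statistic is the kind of signal a novelty detector can flag.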
An improvement of the measurement of time series irreversibility with visibility graph approach
NASA Astrophysics Data System (ADS)
Wu, Zhenyu; Shang, Pengjian; Xiong, Hui
2018-07-01
We propose a method to improve the measurement of real-valued time series irreversibility which combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedded dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect it from multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as global stock markets over the period 2005-2015. The results show that the amount of time irreversibility reaches its peak at embedded dimension d = 3 for both the simulated series and the financial markets.
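The two ingredients named above can be sketched directly; this is the plain directed horizontal visibility graph plus KL divergence, without the paper's embedding-dimension reframing, and degrees absent from one distribution are simply skipped (a common practical tweak, labeled as an assumption here):

```python
import math
from collections import Counter

def dhvg_degrees(x):
    """Directed horizontal visibility graph: link i -> j (i < j) iff every
    intermediate value lies strictly below min(x[i], x[j]).
    Returns (in-degrees, out-degrees)."""
    n = len(x)
    k_in, k_out = [0] * n, [0] * n
    for i in range(n):
        m = -math.inf                    # running max of values between i and j
        for j in range(i + 1, n):
            if m < x[i] and m < x[j]:
                k_out[i] += 1
                k_in[j] += 1
            m = max(m, x[j])
            if m >= x[i]:                # nothing further can be visible from i
                break
    return k_in, k_out

def kld_irreversibility(x):
    """KL divergence D(P_out || P_in) of the degree distributions;
    degrees missing from the in-distribution are skipped."""
    k_in, k_out = dhvg_degrees(x)
    p, q, n = Counter(k_out), Counter(k_in), len(x)
    return sum((c / n) * math.log((c / n) / (q[k] / n))
               for k, c in p.items() if q.get(k, 0) > 0)
```

A statistically reversible series (e.g., a monotone ramp, whose in- and out-degree distributions coincide) gives divergence 0, while the time-irreversible logistic map gives a strictly positive value.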
Herding, minority game, market clearing and efficient markets in a simple spin model framework
NASA Astrophysics Data System (ADS)
Kristoufek, Ladislav; Vosvrda, Miloslav
2018-01-01
We present a novel approach to the financial Ising model. Most studies utilize the model to find settings which generate returns closely mimicking financial stylized facts such as fat tails, volatility clustering and persistence. We tackle the model's utility from the other side and look for the combination of parameters which yields the return dynamics of an efficient market in the view of the efficient market hypothesis. Working with the Ising model, we are able to present readily interpretable results, as the model is based on only two parameters. Apart from showing the results of our simulation study, we offer a new interpretation of the Ising model parameters via inverse temperature and entropy. We show that market frictions (up to a certain level) and herding behavior of the market participants do not go against market efficiency; on the contrary, they are needed for the markets to be efficient.
Finite-size effect on optimal efficiency of heat engines.
Tajima, Hiroyasu; Hayashi, Masahito
2017-07-01
The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths to the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of the macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the quantum coherence effect on the optimal efficiency of the cycle processes without restricting their cycle time by comparing the classical and quantum optimal efficiencies.
Quantifying Extrinsic Noise in Gene Expression Using the Maximum Entropy Framework
Dixit, Purushottam D.
2013-01-01
We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. PMID:23790383
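As a toy illustration of the maximum entropy principle invoked here (not the paper's gene-expression model): the max-ent distribution over a discrete set of values subject to a fixed mean has the Gibbs form p_i ∝ exp(λ v_i), with the multiplier λ found numerically; all numbers below are illustrative.

```python
import math

def maxent_mean(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum entropy distribution over `values` with a prescribed mean.
    The solution is p_i proportional to exp(lam * v_i); since the mean is
    monotonically increasing in lam, bisection finds the multiplier."""
    def mean(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]
```

With the target mean at the midpoint of the values, λ = 0 and the result is uniform, the least-committed distribution; constraining the mean lower tilts the probability toward small copy numbers, exactly the kind of tilt a max-ent estimate of extrinsic factors produces.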
NASA Astrophysics Data System (ADS)
Huang, Lida; Chen, Tao; Wang, Yan; Yuan, Hongyong
2015-12-01
Gatherings of large human crowds often result in crowd disasters such as the Love Parade disaster in Duisburg, Germany on July 24, 2010. To avoid such tragedies, video surveillance and early warning are becoming more and more significant. In this paper, the velocity entropy is first defined as the criterion for congestion detection; it represents the motion magnitude distribution and the motion direction distribution simultaneously. The detection method is then verified on simulation data generated with the AnyLogic software. To test the generalization performance of the method, video recordings of a real-world case, the Love Parade disaster, are also used in the experiments. The velocity histograms of the foreground objects in the videos are extracted by a Gaussian Mixture Model (GMM) and optical flow computation. With a sequential change-point detection algorithm, the velocity entropy can be applied to detect congestion at the Love Parade festival. It turns out that, without recognizing and tracking individual pedestrians, our method can detect abnormal crowd behaviors in real time.
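A velocity entropy over the joint magnitude/direction distribution can be sketched as follows; the bin counts, magnitude cap, and input format are illustrative assumptions, and the GMM/optical-flow front end that produces the velocity vectors is omitted:

```python
import math
from collections import Counter

def velocity_entropy(vectors, mag_bins=8, dir_bins=8, max_mag=10.0):
    """Shannon entropy (bits) of the joint (magnitude, direction) histogram
    of velocity vectors; higher entropy indicates more disordered motion."""
    counts = Counter()
    for vx, vy in vectors:
        mag = min(math.hypot(vx, vy), max_mag - 1e-9)
        ang = math.atan2(vy, vx) % (2 * math.pi)
        counts[(int(mag / max_mag * mag_bins),
                int(ang / (2 * math.pi) * dir_bins))] += 1
    n = len(vectors)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A crowd moving coherently (all vectors in one bin) gives entropy 0, while turbulent, multidirectional motion spreads mass over many bins and drives the entropy up, which is the signal the change-point detector watches for.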
Yeu, In Won; Park, Jaehong; Han, Gyuseung; Hwang, Cheol Seong; Choi, Jung-Hae
2017-09-06
A detailed understanding of the atomic configuration of the compound semiconductor surface, especially after reconstruction, is very important for device fabrication and performance. While there have been numerous experimental studies using scanning probe techniques, further theoretical studies on surface reconstruction are necessary to promote a clear understanding of the origins and development of such subtle surface structures. In this work, therefore, a pressure-temperature surface reconstruction diagram was constructed for the model case of the InAs (001) surface, considering both the vibrational entropy and the configurational entropy based on density functional theory. Notably, the equilibrium fraction of the various reconstructions was determined as a function of pressure and temperature, not as a function of the chemical potential, which greatly facilitated direct comparison with experiments. By taking the entropy effects into account, the coexistence of multiple reconstructions and the fractional change of each reconstruction with the thermodynamic conditions were predicted and were in agreement with previous experimental observations. This work provides the community with a useful framework for this type of theoretical study.
Thermodynamic and Differential Entropy under a Change of Variables
Hnizdo, Vladimir; Gilson, Michael K.
2013-01-01
The differential Shannon entropy of information theory can change under a change of variables (coordinates), but the thermodynamic entropy of a physical system must be invariant under such a change. This difference is puzzling, because the Shannon and Gibbs entropies have the same functional form. We show that a canonical change of variables can, indeed, alter the spatial component of the thermodynamic entropy just as it alters the differential Shannon entropy. However, there is also a momentum part of the entropy, which turns out to undergo an equal and opposite change when the coordinates are transformed, so that the total thermodynamic entropy remains invariant. We furthermore show how one may correctly write the change in total entropy for an isothermal physical process in any set of spatial coordinates. PMID:24436633
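The coordinate dependence of differential entropy is easy to demonstrate numerically: rescaling x → 2x (equivalently, doubling a Gaussian's σ) shifts h by exactly ln 2, the term that the momentum part of the thermodynamic entropy must cancel for the total to stay invariant. A sketch by direct numerical integration (grid limits and step count are illustrative):

```python
import math

def gaussian_diff_entropy(sigma, lo=-50.0, hi=50.0, steps=200000):
    """Differential entropy h = -integral p ln p dx for N(0, sigma^2),
    computed with the midpoint rule."""
    dx = (hi - lo) / steps
    h = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        p = math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))
        if p > 0:                       # skip underflowed tails
            h -= p * math.log(p) * dx
    return h
```

The analytic value is h = (1/2) ln(2πeσ²), so h(2σ) − h(σ) = ln 2: the differential entropy is not invariant under the change of variables, which is precisely the puzzle the paper resolves via the compensating momentum term.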
Entropy for Mechanically Vibrating Systems
NASA Astrophysics Data System (ADS)
Tufano, Dante
The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is stated, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis builds upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes.
The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.
Entropy is more resistant to artifacts than bispectral index in brain-dead organ donors.
Wennervirta, Johanna; Salmi, Tapani; Hynynen, Markku; Yli-Hankala, Arvi; Koivusalo, Anna-Maria; Van Gils, Mark; Pöyhiä, Reino; Vakkuri, Anne
2007-01-01
To evaluate the usefulness of entropy and the bispectral index (BIS) in brain-dead subjects. A prospective, open, nonselective, observational study in a university hospital. 16 brain-dead organ donors. Time-domain electroencephalography (EEG), spectral entropy of the EEG, and BIS were recorded during solid organ harvest. State entropy differed significantly from 0 (isoelectric EEG) for 28% of the total recorded time, response entropy for 29%, and BIS for 68%. The median values during the operation were state entropy 0.0, response entropy 0.0, and BIS 3.0. In four of the 16 organ donors studied the EEG was not isoelectric, and nonreactive rhythmic activity was noted in the time-domain EEG. After excluding the results from subjects with persistent residual EEG activity, state entropy, response entropy, and BIS values differed from zero for 17%, 18%, and 62% of the recorded time, respectively. Median values were 0.0, 0.0, and 2.0 for state entropy, response entropy, and BIS, respectively. The highest index values in entropy and BIS monitoring were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, 50-Hz artifact, handling of the donor, ballistocardiography, electromyography, and electrocardiography. Both entropy and BIS showed nonzero values due to artifacts after brain-death diagnosis. BIS was more liable to artifacts than entropy. Neither of these indices is a diagnostic tool, and care should be taken when interpreting EEG and EEG-derived indices in the evaluation of brain death.
Marronnier, Arthur; Roma, Guido; Boyer-Richard, Soline; Pedesseau, Laurent; Jancu, Jean-Marc; Bonnassieux, Yvan; Katan, Claudine; Stoumpos, Constantinos C; Kanatzidis, Mercouri G; Even, Jacky
2018-04-24
Hybrid organic-inorganic perovskites emerged as a new generation of absorber materials for high-efficiency low-cost solar cells in 2009. Very recently, fully inorganic perovskite quantum dots also led to promising efficiencies, making them a potentially stable and efficient alternative to their hybrid cousins. Currently, the record efficiency is obtained with CsPbI3, whose crystallographic characterization is still limited. Here, we show through high-resolution in situ synchrotron XRD measurements that CsPbI3 can be undercooled below its transition temperature and temporarily maintained in its perovskite structure down to room temperature, stabilizing a metastable perovskite polytype (black γ-phase) crucial for photovoltaic applications. Our analysis of the structural phase transitions reveals a highly anisotropic evolution of the individual lattice parameters versus temperature. Structural, vibrational, and electronic properties of all the experimentally observed black phases are further inspected based on several theoretical approaches. Whereas the black γ-phase is shown to behave harmonically around equilibrium, for the tetragonal phase, density functional theory reveals the same anharmonic behavior, with a Brillouin zone-centered double-well instability, as for the cubic phase. Using total energy and vibrational entropy calculations, we highlight the competition between all the low-temperature phases of CsPbI3 (γ, δ, β) and show that avoiding the order-disorder entropy term arising from double-well instabilities is key to preventing the formation of the yellow perovskitoid phase. A symmetry-based tight-binding model, validated by self-consistent GW calculations including spin-orbit coupling, affords further insight into their electronic properties, with evidence of Rashba effect for both cubic and tetragonal phases when using the symmetry-breaking structures obtained through frozen phonon calculations.
Thermal isomerization of azobenzenes: on the performance of Eyring transition state theory.
Rietze, Clemens; Titov, Evgenii; Lindner, Steven; Saalfrank, Peter
2017-08-09
The thermal Z → E (back-)isomerization of azobenzenes is a prototypical reaction occurring in molecular switches. It has been studied for decades, yet its kinetics is not fully understood. In this paper, quantum chemical calculations are performed to model the kinetics of an experimental benchmark system, where a modified azobenzene (AzoBiPyB) is embedded in a metal-organic framework (MOF). The molecule can be switched thermally from cis to trans, under solvent-free conditions. We critically test the validity of Eyring transition state theory for this reaction. As previously found for other azobenzenes (albeit in solution), good agreement between theory and experiment emerges for activation energies and activation free energies, already at a comparatively simple level of theory, B3LYP/6-31G* including dispersion corrections. However, theoretical Arrhenius prefactors and activation entropies are in qualitative disagreement with experiment. Several factors are discussed that may have an influence on activation entropies, among them dynamical and geometric constraints (imposed by the MOF). For a simpler model, Z → E isomerization in azobenzene, a systematic test of quantum chemical methods from both density functional theory and wavefunction theory is carried out in the context of Eyring theory. Also, the effect of anharmonicities on activation entropies is discussed for this model system. Our work highlights capabilities and shortcomings of Eyring transition state theory and quantum chemical methods, when applied to the Z → E (back-)isomerization of azobenzenes under solvent-free conditions.
NASA Astrophysics Data System (ADS)
Jeon, Wonju; Lee, Sang-Hee
2012-12-01
In our previous study, we defined the branch length similarity (BLS) entropy for a simple network consisting of a single node and numerous branches. As the first application of this entropy to characterizing shapes, the BLS entropy profiles of 20 battle tank shapes were calculated from simple networks created by connecting pixels on the boundary of each shape, and the profiles successfully distinguished the tank shapes. Following that application, the entropy was used to characterize human emotional facial expressions, such as happy and sad faces, and to measure the degree of complexity of termite tunnel networks. These applications indirectly indicate that the BLS entropy profile can be a useful tool for characterizing networks and shapes. However, the ability of the BLS entropy profile to characterize a shape depends on the image resolution, because the entropy is determined by the number of nodes on the boundary of the shape; higher resolution means more nodes. If the entropy is to be widely used in the scientific community, the effect of resolution on the entropy profile should be understood. In the present study, we mathematically investigated the BLS entropy profile of a shape with infinite resolution and numerically investigated the variation in the pattern of the entropy profile caused by resolution changes in the case of finite resolution.
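A minimal sketch of BLS entropy for a single node with branches of given lengths; the Shannon form over normalized branch lengths follows the definition above, while the normalization by log n (to make values comparable across node counts, i.e., resolutions) is an assumption of this sketch:

```python
import math

def bls_entropy(branch_lengths):
    """Branch length similarity entropy of a single-node network:
    Shannon entropy of the normalised branch lengths, scaled to [0, 1]
    by log(n). Equal branch lengths give the maximum value 1."""
    total = sum(branch_lengths)
    n = len(branch_lengths)
    h = -sum(l / total * math.log(l / total) for l in branch_lengths if l > 0)
    return h / math.log(n) if n > 1 else 0.0
```

Because the number of boundary pixels (branches) grows with image resolution, the profile built from these values shifts with resolution, which is the effect the study quantifies.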
NASA Astrophysics Data System (ADS)
Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin
2014-07-01
Flow entropy is a measure of the uniformity of pipe flows in water distribution systems (WDSs). By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage of the common definition of flow entropy, which does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. The new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (the resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between the simple flow entropy and the new method is conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than the simple flow entropy. Therefore, the new flow entropy can serve as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem for WDSs. Sensitivity analysis shows that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
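For reference, the common (diameter-insensitive) flow entropy that the paper extends is Shannon-type. The sketch below is a simplified single-level version over pipe flow rates; the full WDS definition also conditions on node-wise flow splits, and the paper's diameter-sensitive weighting is not reproduced here:

```python
import math

def flow_entropy(pipe_flows):
    """Shannon-type flow entropy over pipe flow rates (a simplified sketch;
    the full WDS formulation also accounts for flow splits at each node)."""
    total = sum(pipe_flows)
    fractions = [q / total for q in pipe_flows if q > 0]
    return sum(-f * math.log(f) for f in fractions)

# A uniform flow split maximizes entropy for a fixed number of pipes:
print(flow_entropy([10.0, 10.0, 10.0]) > flow_entropy([28.0, 1.0, 1.0]))  # -> True
```

The intuition is that uniform flows leave more alternative paths with meaningful capacity when a pipe fails, which is why higher flow entropy correlates with reliability.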
Entropy generation of nanofluid flow in a microchannel heat sink
NASA Astrophysics Data System (ADS)
Manay, Eyuphan; Akyürek, Eda Feyza; Sahin, Bayram
2018-06-01
The present study aims to investigate the effects of the presence of nano-sized TiO2 particles in the base fluid on the entropy generation rate in a microchannel heat sink. Pure water was chosen as the base fluid, and TiO2 particles were suspended in it at five particle volume fractions: 0.25%, 0.5%, 1.0%, 1.5% and 2.0%. Under laminar, steady-state flow and constant heat flux boundary conditions, the thermal, frictional and total entropy generation rates and the entropy generation number ratios of the nanofluids were experimentally analyzed in microchannel flow for channel heights of 200 μm, 300 μm, 400 μm and 500 μm. It was observed that the frictional and total entropy generation rates increased while the thermal entropy generation rate decreased with increasing particle volume fraction. In microchannel flows, thermal entropy generation could be neglected because its contribution to total entropy generation is very small (below 1.10e-07). Larger channel heights caused higher thermal entropy generation rates: increasing the channel height yielded an increase of 30% to 52% in thermal entropy generation, while decreasing the channel height produced an increase of 66%-98% in frictional entropy generation. Adding TiO2 nanoparticles to the base fluid caused thermal entropy generation to decrease by about 1.8%-32.4% and frictional entropy generation to increase by about 3.3%-21.6%.
NASA Astrophysics Data System (ADS)
Guo, Ran
2018-04-01
In this paper, we investigate the definition of the entropy in the Fokker–Planck equation under the generalized fluctuation–dissipation relation (FDR), which describes a Brownian particle moving in a complex medium with friction and multiplicative noise. The friction and the noise are related by the generalized FDR. The entropy for such a system is defined first. From this definition, we calculate the entropy production and the entropy flux. Finally, we perform numerical calculations to illustrate the results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prasad, Saurav; Chakravarty, Charusita
Experiments and simulations demonstrate some intriguing equivalences in the effect of pressure and electrolytes on the hydrogen-bonded network of water. Here, we examine the extent and nature of equivalence effects between pressure and salt concentration using relationships between structure, entropy, and transport properties based on two key ideas: first, the approximation of the excess entropy of the fluid by the contribution due to the atom-atom pair correlation functions and second, Rosenfeld-type excess entropy scaling relations for transport properties. We perform molecular dynamics simulations of LiCl–H2O and bulk SPC/E water spanning the concentration range 0.025–0.300 mole fraction of LiCl at 1 atm and the pressure range from 0 to 7 GPa, respectively. The temperature range considered was from 225 to 350 K for both systems. To establish that the time-temperature-transformation behaviour of electrolyte solutions and water is equivalent, we use the additional observation, based on our simulations, that the pair entropy behaves as a near-linear function of pressure in bulk water and of composition in LiCl–H2O. This allows for the alignment of pair entropy isotherms and a simple mapping of pressure onto composition. Rosenfeld scaling implies that pair entropy is semiquantitatively related to the transport properties. At a given temperature, equivalent state points in bulk H2O and LiCl–H2O (at 1 atm) are defined as those for which the pair entropy, diffusivity, and viscosity are nearly identical. The microscopic basis for this equivalence lies in the ability of both pressure and ions to convert the liquid phase into a pair-dominated fluid, as demonstrated by the O–O–O angular distribution within the first coordination shell of a water molecule. There are, however, sharp differences in local order and mechanisms for the breakdown of tetrahedral order by pressure and electrolytes.
Increasing pressure increases orientational disorder within the first neighbour shell, while addition of ions shifts local orientational order from tetrahedral to close-packed as water molecules are incorporated into ionic hydration shells. The variations in local order within the first hydration shell may underlie ion-specific effects, such as the Hofmeister series.
Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I
2016-01-01
Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inferences about contrast sensitivity, compared to the quick contrast sensitivity function method (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. The results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.
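The entropy-based stimulus selection that both HADO and the standard procedure build on can be sketched as follows: pick the stimulus that minimizes the expected posterior entropy of the model parameter. The logistic model, parameter grid, and stimulus set below are illustrative assumptions, not the contrast sensitivity model used in the study:

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution (zero terms dropped)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def select_stimulus(prior, thetas, stimuli):
    """Return the stimulus minimizing expected posterior entropy under a
    hypothetical one-parameter logistic response model P(yes|theta, x)."""
    best, best_h = None, np.inf
    for x in stimuli:
        p_yes = 1.0 / (1.0 + np.exp(-(x - thetas)))  # P(yes | theta, x)
        m_yes = np.sum(prior * p_yes)                # marginal P(yes)
        post_yes = prior * p_yes / m_yes             # posterior if "yes"
        post_no = prior * (1 - p_yes) / (1 - m_yes)  # posterior if "no"
        h = m_yes * entropy(post_yes) + (1 - m_yes) * entropy(post_no)
        if h < best_h:
            best, best_h = x, h
    return best

thetas = np.linspace(-3, 3, 201)
prior = np.ones_like(thetas) / len(thetas)  # diffuse (noninformative) prior
print(select_stimulus(prior, thetas, [-2.0, 0.0, 2.0]))  # -> 0.0
```

HADO's contribution, in this picture, is to replace the diffuse `prior` with an informative one estimated hierarchically from previous observers, so fewer trials are needed to concentrate the posterior.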
A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings
Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun
2017-01-01
The operating condition of rolling bearings affects productivity and quality in rotating machine processes. Developing an effective rolling bearing condition monitoring approach is critical to accurately identifying the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, in which interval-valued features are used to efficiently recognize and classify machine states. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation of aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted feature information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify fault types and fault severity levels. Finally, the experimental results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings, and is efficient enough to quantify the two uncertainty components. PMID:28524088
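The multi-scale permutation entropy feature mentioned above reduces, at a single scale, to the Bandt-Pompe permutation entropy. A minimal sketch (without the multi-scale coarse-graining step, which would be applied to the signal before this call):

```python
import math

def permutation_entropy(signal, order=3, delay=1):
    """Permutation entropy (Bandt-Pompe): Shannon entropy of the ordinal
    patterns of length `order`, normalized to [0, 1]."""
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = tuple(signal[i + j * delay] for j in range(order))
        # ordinal pattern = index permutation that sorts the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = sum(-p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))

# A monotone signal has a single ordinal pattern, hence zero entropy:
print(permutation_entropy([1, 2, 3, 4, 5, 6, 7, 8], order=3))  # -> 0.0
```

Irregular vibration signals populate many ordinal patterns and push the value toward 1, which is what makes the measure useful as a fault-sensitive feature.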
Low-complex energy-aware image communication in visual sensor networks
NASA Astrophysics Data System (ADS)
Phamila, Yesudhas Asnath Victy; Amutha, Ramachandran
2013-10-01
A low-complexity, low-bit-rate, energy-efficient image compression algorithm is presented, explicitly designed for resource-constrained visual sensor networks used in surveillance, battlefield and habitat monitoring applications, where voluminous image data must be communicated over a bandwidth-limited wireless medium. The proposed method overcomes the energy limitation of individual nodes and is investigated in terms of image quality, entropy, processing time, overall energy consumption, and system lifetime. The algorithm is highly energy efficient and extremely fast because it applies an energy-aware zonal binary discrete cosine transform (DCT) that computes only the few required significant coefficients and codes them using an enhanced complementary Golomb-Rice code, without any floating point operations. Experiments were performed on Atmel ATmega128 and MSP430 processors to measure the resultant energy savings. Simulation results show that the proposed energy-aware fast zonal transform consumes only 0.3% of the energy needed by the conventional DCT and only 6% of the energy needed by the Independent JPEG Group (fast) version, making it suitable for embedded systems requiring low power consumption. The proposed scheme is unique in that it significantly enhances the lifetime of the camera sensor node and the network without the distributed processing traditionally required by existing algorithms.
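The zonal DCT idea, computing only the low-frequency coefficients of each block instead of all N x N of them, can be illustrated with a naive floating-point sketch. The paper's binary, integer-only variant is not reproduced here, and the zone size is an illustrative choice:

```python
import math

def zonal_dct(block, zone=3):
    """Zonal 2-D DCT-II: compute only the low-frequency zone x zone
    coefficients of an N x N block (a sketch of the energy-saving idea)."""
    n = len(block)
    out = [[0.0] * zone for _ in range(zone)]
    for u in range(zone):
        for v in range(zone):
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            out[u][v] = cu * cv * s
    return out

flat = [[10.0] * 8 for _ in range(8)]  # constant block: only DC survives
coeffs = zonal_dct(flat)
print(round(coeffs[0][0], 3), round(abs(coeffs[0][1]), 3))  # -> 80.0 0.0
```

For a zone of 3 on an 8 x 8 block, only 9 of 64 coefficients are computed, which is where the bulk of the reported energy saving comes from.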
Single water entropy: hydrophobic crossover and application to drug binding.
Sasikala, Wilbee D; Mukherjee, Arnab
2014-09-11
The entropy of water plays an important role in both chemical and biological processes, e.g., the hydrophobic effect and molecular recognition. Here we use a new approach to calculate the translational and rotational entropy of individual water molecules around different hydrophobic and charged solutes. We show that for small hydrophobic solutes, the translational and rotational entropies of each water molecule increase as a function of its distance from the solute, finally reaching a constant bulk value. As the size of the solute increases (0.746 nm), the behavior of the translational entropy is opposite: water molecules closest to the solute have higher entropy, which decreases with distance from the solute. This indicates that there is a crossover in the translational entropy of water molecules around hydrophobic solutes from negative to positive values as the size of the solute is increased. The rotational entropy of water molecules around hydrophobic solutes of all sizes increases with distance from the solute, indicating the absence of a crossover in rotational entropy. This pushes the crossover in the total entropy (translation + rotation) of a water molecule to a much larger solute size (>1.5 nm) for hydrophobic solutes. The translational entropy of a single water molecule scales logarithmically (S_tr(QH) = C + k_B ln V) with the volume V obtained from the ellipsoid of inertia. We further discuss the origin of the higher entropy of water around water and show the possibility of recovering the entropy loss of some hypothetical solutes. The results obtained are helpful for understanding water entropy behavior around various hydrophobic and charged environments within biomolecules. Finally, we show how our approach can be used to calculate the entropy of individual water molecules in a protein cavity that may be replaced during ligand binding.
Relating different quantum generalizations of the conditional Rényi entropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomamichel, Marco; Berta, Mario
2014-08-15
Recently a new quantum generalization of the Rényi divergence and the corresponding conditional Rényi entropies was proposed. Here, we report on a surprising relation between conditional Rényi entropies based on this new generalization and conditional Rényi entropies based on the quantum relative Rényi entropy that was used in previous literature. Our result generalizes the well-known duality relation H(A|B) + H(A|C) = 0 of the conditional von Neumann entropy for tripartite pure states to Rényi entropies of two different kinds. As a direct application, we prove a collection of inequalities that relate different conditional Rényi entropies and derive a new entropic uncertainty relation.
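The von Neumann special case of the duality relation, H(A|B) + H(A|C) = 0 for tripartite pure states, is easy to verify numerically. A sketch for a random three-qubit pure state (not the paper's Rényi-level result):

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

rng = np.random.default_rng(0)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
t = psi.reshape(2, 2, 2)  # tensor indices A, B, C

# reduced states via partial traces
rho_AB = np.einsum('abc,xyc->abxy', t, t.conj()).reshape(4, 4)
rho_AC = np.einsum('abc,xby->acxy', t, t.conj()).reshape(4, 4)
rho_B = np.einsum('abc,ayc->by', t, t.conj())
rho_C = np.einsum('abc,aby->cy', t, t.conj())

h_ab = vn_entropy(rho_AB) - vn_entropy(rho_B)  # H(A|B)
h_ac = vn_entropy(rho_AC) - vn_entropy(rho_C)  # H(A|C)
print(abs(h_ab + h_ac) < 1e-9)  # -> True
```

The cancellation follows because for a pure tripartite state S(AB) = S(C) and S(AC) = S(B); the paper extends this duality to pairs of Rényi entropies.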
Exact analytical thermodynamic expressions for a Brownian heat engine
NASA Astrophysics Data System (ADS)
Taye, Mesfin Asfaw
2015-09-01
The nonequilibrium thermodynamic features of a Brownian motor operating between two different heat baths are explored as a function of time t. Using the Gibbs entropy and the Schnakenberg microscopic stochastic approach, we find exact closed-form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy from the system. Its entropy production and extraction rates decrease in time and saturate to constant values. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both rates become zero. Furthermore, many thermodynamic theories can be checked via the present model.
ERIC Educational Resources Information Center
Harris, Ronald M.
1978-01-01
Presents material dealing with an application of statistical thermodynamics to the diatomic solid I2(s). The objective is to enhance the student's appreciation of the power of the statistical formulation of thermodynamics. The simple Einstein model is used. (Author/MA)
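The Einstein model referred to above gives closed-form thermodynamic functions. For example, the molar heat capacity can be evaluated as below; the Einstein temperature used here is an illustrative value, not one fitted to I2(s):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def c_v_einstein(T, theta_E):
    """Einstein-model molar heat capacity:
    C_v = 3R x^2 e^x / (e^x - 1)^2 with x = theta_E / T."""
    x = theta_E / T
    return 3 * R * x**2 * math.exp(x) / (math.exp(x) - 1.0)**2

# At T >> theta_E the Dulong-Petit limit 3R is recovered:
print(round(c_v_einstein(3000.0, 60.0), 2))  # -> 24.94 (close to 3R)
```

At low temperatures the same expression falls off exponentially, which is the pedagogical point of the model: a single fitted vibrational frequency reproduces the qualitative temperature dependence of a solid's heat capacity.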
Exergy and the economic process
NASA Astrophysics Data System (ADS)
Karakatsanis, Georgios
2016-04-01
The Second Law of Thermodynamics (2nd Law) dictates that the generation of physical work in a system requires the existence of a heat gradient, according to the universal notion of the Carnot heat engine. This is the cornerstone of the notion of exergy as well: exergy is the potential for physical work generation across the process of equilibration of a number of unified systems with different thermodynamic states. However, although energy concerns the abstract ability to generate work, exergy concerns the specific ability to generate work, owing to the requirement of specifying an environment of reference in relation to which the thermodynamic equilibration takes place; this reference environment also determines heat engine efficiencies. Consequently, while energy is always conserved, exergy, deriving from heat gradient equilibration, is always consumed. According to this perspective, the availability of heat gradients is what fundamentally drives the evolution of econosystems, via enhancing, or even substituting, human labor (Boulding 1978; Chen 2005; Ayres and Warr 2009). In addition, exergy consumption is irreversible, via the gradual transformation of useful physical work into entropy, which reduces its future economic availability. Extending Georgescu-Roegen's approach (1971), it can be postulated that this irreversible exhaustion of exergy comprises the fundamental cause of economic scarcity, which is the cornerstone of economic science. In conclusion, scarcity consists in: (a) the difficulty of allocating, in the Earth System, very high heat gradients that would make humanity's heat engines very efficient and (b) the irreversible depletion of existing heat gradients due to entropy production. In addition, the concept of exergy can be used to study natural resource degradation and pollution at the biogeochemical level and to understand why heat gradient scarcity in the Earth System was eventually inevitable.
All of these issues are analyzed both theoretically and quantitatively. Keywords: 2nd Law, physical work, heat gradient, Carnot heat engine, exergy, energy, reference environment, econosystems, irreversibility, entropy, scarcity, resource degradation, pollution. References: 1. Ayres, Robert U. and Benjamin Warr (2009), The Economic Growth Engine: How Energy and Work Drive Material Prosperity, Edward Elgar and IIASA. 2. Boulding, Kenneth E. (1978), Ecodynamics: A New Theory of Societal Evolution, Sage Publications. 3. Chen, Jing (2005), The Physical Foundations of Economics: An Analytic Thermodynamic Theory, World Scientific. 4. Georgescu-Roegen, Nicholas (1971), The Entropy Law and the Economic Process, Harvard University Press.
Modeling the Overalternating Bias with an Asymmetric Entropy Measure
Gronchi, Giorgio; Raglianti, Marco; Noventa, Stefano; Lazzeri, Alessandro; Guazzini, Andrea
2016-01-01
Psychological research has found that human perception of randomness is biased. In particular, people consistently show the overalternating bias: they rate binary sequences of symbols (such as heads and tails in coin flipping) with an excess of alternation as more random than prescribed by the normative criterion of Shannon's entropy. Within data mining for medical applications, Marcellin proposed an asymmetric measure of entropy that can be ideal for accounting for this bias and quantifying subjective randomness. We fitted Marcellin's entropy and Rényi's entropy (a generalized uncertainty measure comprising many different kinds of entropies) to experimental data found in the literature with the Differential Evolution algorithm. We observed a better fit for Marcellin's entropy than for Rényi's entropy. The fitted asymmetric entropy measure also showed good predictive properties when applied to different datasets of randomness-related tasks. We conclude that Marcellin's entropy can be a parsimonious and effective measure of subjective randomness, useful in psychological research on randomness perception. PMID:27458418
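For context, the Rényi family used above as the comparison model is a one-parameter generalization of Shannon entropy, recovered in the limit α → 1 (Marcellin's asymmetric measure is not reproduced here):

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha = log(sum p_i^alpha) / (1 - alpha), in nats;
    alpha -> 1 recovers the Shannon entropy."""
    if abs(alpha - 1.0) < 1e-9:
        return sum(-p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

p = [0.7, 0.3]
# Near alpha = 1 the Rényi value converges to the Shannon value:
print(round(renyi_entropy(p, 0.999), 4) == round(renyi_entropy(p, 1.0), 4))  # -> True
```

Fitting α to human ratings lets the model bend the symmetric Shannon criterion, but, as the abstract reports, the asymmetric Marcellin measure captured the overalternating bias better.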
Entropy for the Complexity of Physiological Signal Dynamics.
Zhang, Xiaohua Douglas
2017-01-01
Recently, the rapid development of large-scale data storage technologies, mobile network technology, and portable medical devices has made it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial for capturing individual characteristics of biological dynamics. Wearable noninvasive medical devices and the analysis/management of the related digital medical data will revolutionize the management and treatment of diseases, subsequently resulting in the establishment of a new healthcare system. One key feature that can be extracted from the data obtained by wearable noninvasive medical devices is the complexity of physiological signals, which can be represented by the entropy of the biological dynamics contained in the signals measured by these continuous monitoring devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics: Shannon entropy, Kolmogorov entropy, Rényi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy to assess the complexity of glucose dynamics.
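Of the entropies listed, sample entropy is among the most widely used for physiological signals. A direct, unoptimized sketch of the definition (in practice the tolerance r is usually set to a fraction of the signal's standard deviation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m templates
    matching within tolerance r (Chebyshev distance) and A counts the same
    for templates of length m + 1."""
    def count(mm):
        n = len(x) - mm
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')

# A strictly periodic signal is highly predictable, so its value is low:
periodic = [0.0, 1.0] * 50
print(sample_entropy(periodic) < 0.05)  # -> True
```

Low sample entropy indicates regular, predictable dynamics; healthy physiological signals such as glucose or heart-rate series typically show higher, more complex values.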
Information Entropy Analysis of the H1N1 Genetic Code
NASA Astrophysics Data System (ADS)
Martwick, Andy
2010-03-01
During the current H1N1 pandemic, viral samples are being obtained from large numbers of infected people world-wide and are being sequenced in the NCBI Influenza Virus Resource Database. The information entropy of the sequences was computed from the probability of occurrence of each nucleotide base at every position of each set of sequences using Shannon's definition of information entropy, H = Σ_b p_b log2(1/p_b), where H is the observed information entropy at each nucleotide position and p_b is the probability of occurrence of each of the bases A, C, G, U. The information entropy of the current H1N1 pandemic is compared to reference human and swine H1N1 entropy. As expected, the current H1N1 entropy is in a low entropy state and has a very large mutation potential. Using the entropy method on mature genes, we can identify low-entropy regions of nucleotides that generally correlate with critical protein function.
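The per-position entropy calculation described above can be sketched directly from the formula, with toy aligned sequences standing in for the NCBI data:

```python
import math

def positional_entropy(sequences):
    """Shannon information entropy H = sum_b p_b log2(1/p_b) at each
    position of a set of aligned, equal-length sequences."""
    length = len(sequences[0])
    entropies = []
    for i in range(length):
        column = [s[i] for s in sequences]
        h = 0.0
        for base in set(column):
            p = column.count(base) / len(column)
            h += p * math.log2(1.0 / p)
        entropies.append(h)
    return entropies

seqs = ["ACGU", "ACGA", "ACUA", "ACGA"]
print([round(h, 2) for h in positional_entropy(seqs)])  # -> [0.0, 0.0, 0.81, 0.81]
```

Perfectly conserved positions score 0 bits, while a position split across all four bases would score the 2-bit maximum; the low-entropy (conserved) regions are the ones the abstract links to critical protein function.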
Generalized Entanglement Entropy and Holography
NASA Astrophysics Data System (ADS)
Obregón, O.
2018-04-01
A nonextensive statistical mechanics entropy that depends only on the probability distribution is proposed in the framework of superstatistics. It is based on a Γ(χ²) distribution that depends on β and also on p_l. The corresponding modified von Neumann entropy is constructed; it is shown that it can also be obtained from a generalized replica trick. We address the question of whether the generalized entanglement entropy can play a role in the gauge/gravity duality, paying attention to 2d CFTs and their gravity duals. The correction terms to the von Neumann entropy turn out to be more relevant than the usual UV ones (for c = 1) and also than those due to the area-dependent AdS3 entropy, which are comparable to the UV ones. The correction terms due to the new entropy would therefore modify the Ryu-Takayanagi identification between the CFT entanglement entropy and the AdS entropy in a different manner than the UV corrections or the corrections to the area-dependent AdS3 entropy.
Tensor-product preconditioners for a space-time discontinuous Galerkin method
NASA Astrophysics Data System (ADS)
Diosady, Laslo T.; Murman, Scott M.
2014-10-01
A space-time discontinuous Galerkin spectral element discretization is presented for direct numerical simulation of the compressible Navier-Stokes equations. An efficient solution technique based on a matrix-free Newton-Krylov method is presented. A diagonalized alternating direction implicit preconditioner is extended to a space-time formulation using entropy variables. The effectiveness of this technique is demonstrated for the direct numerical simulation of turbulent flow in a channel.
Molinari, Filippo; Acharya, U Rajendra; Martis, Roshan Joy; De Luca, Riccardo; Petraroli, Giuliana; Liboni, William
2013-12-01
Diabetes mellitus (DM) is a metabolic disorder that is widely rampant throughout the world population these days. Uncontrolled DM may lead to complications of the eye, heart, kidney and nerves. The most common type of diabetes is type 2, or insulin-resistant, DM. Near-infrared spectroscopy (NIRS) technology is widely used in non-invasive monitoring of physiological signals. Three types of NIRS signals are used in this work to analyze the effect of exercise on diabetic subjects: (i) the variation in oxygenated haemoglobin (O2Hb) concentration, (ii) deoxygenated haemoglobin (HHb), and (iii) the ratio of oxygenated over the sum of oxygenated and deoxygenated haemoglobin, defined as the tissue oxygenation index (TOI). The NIRS signal is non-linear and non-stationary, so the very small changes in this time series can be efficiently extracted using higher-order statistics (HOS). In this work, we have therefore used sample and HOS entropies to analyze these NIRS signals. These computer-aided techniques will assist clinicians to diagnose and monitor health accurately and easily, without inter- or intra-observer variability. Results showed that after a one-year physical exercise programme, all diabetic subjects increased the sample entropy of their NIRS signals, revealing better muscle performance and improved recruitment by the central nervous system. Moreover, after one year of physical therapy, diabetic subjects showed a NIRS muscular metabolic pattern that was indistinguishable from that of controls. We believe that sample and bispectral entropy analysis is needed when the aim is to compare the inner structure of NIRS signals during muscle contraction, particularly when dealing with neuromuscular impairments. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.
Bergeron, Dominic; Tremblay, A-M S
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation
NASA Astrophysics Data System (ADS)
Bergeron, Dominic; Tremblay, A.-M. S.
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in a broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
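Contribution (2), reading off the noise-fitting and information-fitting regimes from the behavior of χ² as a function of the entropy weight α, can be illustrated on a toy ill-conditioned inversion. Everything below is a sketch, not the authors' implementation: the Laplace-like kernel, the flat default model, and the quadratic (Gaussian) expansion of the entropy term around the default model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inversion standing in for analytic continuation:
# g(tau) = sum_w K(tau, w) A(w) dw + noise, with an ill-conditioned
# Laplace-like kernel K = exp(-tau * w).
w = np.linspace(0.05, 10.0, 120)
dw = w[1] - w[0]
tau = np.linspace(0.1, 5.0, 40)
K = np.exp(-np.outer(tau, w))

A_true = np.exp(-((w - 3.0) ** 2) / 0.5)      # "spectrum" to recover
sigma = 1e-3                                   # data noise level
g = K @ A_true * dw + sigma * rng.standard_normal(len(tau))

D = np.full_like(w, A_true.mean())             # flat default model

def fit(alpha):
    """Minimize chi^2/2 - alpha*S with the entropy S expanded to
    quadratic order around D, S ≈ -sum (A-D)^2 / (2D), which makes
    the minimizer the solution of a linear system."""
    M = (K.T @ K) * dw**2 / sigma**2 + alpha * np.diag(1.0 / D)
    rhs = K.T @ g * dw / sigma**2 + alpha * np.ones_like(D)
    A = np.linalg.solve(M, rhs)
    chi2 = np.sum(((K @ A * dw - g) / sigma) ** 2)
    return A, chi2

# Small alpha: noise fitting, chi^2 well below the number of data
# points. Large alpha: the fit is dominated by the default model and
# chi^2 grows steeply. The crossover locates a sensible alpha.
alphas = np.logspace(-4, 6, 11)
chi2s = [fit(a)[1] for a in alphas]
```

Scanning α over several decades and inspecting χ²(α), as sketched here, is the consistency check the abstract describes; the full method of the paper uses the true Shannon entropy rather than its quadratic approximation.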
Ciliates learn to diagnose and correct classical error syndromes in mating strategies
Clark, Kevin B.
2013-01-01
Preconjugal ciliates learn classical repetition error-correction codes to safeguard mating messages and replies from corruption by “rivals” and local ambient noise. Because individual cells behave as memory channels with Szilárd engine attributes, these coding schemes also might be used to limit, diagnose, and correct mating-signal errors due to noisy intracellular information processing. The present study, therefore, assessed whether heterotrich ciliates effect fault-tolerant signal planning and execution by modifying engine performance, and consequently entropy content of codes, during mock cell–cell communication. Socially meaningful serial vibrations emitted from an ambiguous artificial source initiated ciliate behavioral signaling performances known to advertise mating fitness with varying courtship strategies. Microbes, employing calcium-dependent Hebbian-like decision making, learned to diagnose then correct error syndromes by recursively matching Boltzmann entropies between signal planning and execution stages via “power” or “refrigeration” cycles. All eight serial contraction and reversal strategies incurred errors in entropy magnitude by the execution stage of processing. Absolute errors, however, subtended expected threshold values for single bit-flip errors in three-bit replies, indicating coding schemes protected information content throughout signal production. Ciliate preparedness for vibrations selectively and significantly affected the magnitude and valence of Szilárd engine performance during modal and non-modal strategy corrective cycles. But entropy fidelity for all replies mainly improved across learning trials as refinements in engine efficiency. Fidelity neared maximum levels for only modal signals coded in resilient three-bit repetition error-correction sequences. Together, these findings demonstrate microbes can elevate survival/reproductive success by learning to implement classical fault-tolerant information processing in social contexts. 
PMID:23966987
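The classical three-bit repetition code underlying the mating-signal analysis above can be written out directly. This sketch illustrates only the coding-theoretic claim, that any single bit-flip in a three-bit reply is correctable by majority vote, and none of the biological machinery:

```python
def encode(bit):
    """Three-bit repetition code: 0 -> (0,0,0), 1 -> (1,1,1)."""
    return (bit, bit, bit)

def decode(word):
    """Majority vote over the three received bits corrects any
    single bit-flip error syndrome."""
    return 1 if sum(word) >= 2 else 0

def flip(word, i):
    """Return the codeword with bit i flipped (a channel error)."""
    w = list(word)
    w[i] ^= 1
    return tuple(w)
```

Single bit-flips are always corrected, while two flips defeat the code, which is why the abstract emphasizes that the observed absolute errors stayed below the single bit-flip threshold.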
Thermodynamic framework for compact q-Gaussian distributions
NASA Astrophysics Data System (ADS)
Souza, Andre M. C.; Andrade, Roberto F. S.; Nobre, Fernando D.; Curado, Evaldo M. F.
2018-02-01
Recent works have associated systems of particles, characterized by short-range repulsive interactions and evolving under overdamped motion, to a nonlinear Fokker-Planck equation within the class of nonextensive statistical mechanics, with a nonlinear diffusion contribution whose exponent is given by ν = 2 - q. The particular case ν = 2 applies to interacting vortices in type-II superconductors, whereas ν > 2 covers systems of particles characterized by short-range power-law interactions, where correlations among particles are taken into account. In the former case, several studies presented a consistent thermodynamic framework based on the definition of an effective temperature θ (presenting experimental values much higher than typical room temperatures T, so that thermal noise could be neglected), conjugated to a generalized entropy sν (with ν = 2). Herein, the whole thermodynamic scheme is revisited and extended to systems of particles interacting repulsively, through short-ranged potentials, described by an entropy sν, with ν > 1, covering the ν = 2 (vortices in type-II superconductors) and ν > 2 (short-range power-law interactions) physical examples. One basic requirement concerns a cutoff in the equilibrium distribution Peq(x), approached due to a confining external harmonic potential, ϕ(x) = αx²/2 (α > 0). The main results achieved are: (a) The definition of an effective temperature θ conjugated to the entropy sν; (b) The construction of a Carnot cycle, whose efficiency is shown to be η = 1 - (θ2/θ1), where θ1 and θ2 are the effective temperatures associated with two isothermal transformations, with θ1 > θ2; (c) Thermodynamic potentials, Maxwell relations, and response functions. The present thermodynamic framework, for a system of interacting particles under the above-mentioned conditions, and associated to an entropy sν, with ν > 1, certainly enlarges the possibility of experimental verifications.
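The compact-support equilibrium distribution and the generalized Carnot efficiency quoted in this abstract can be made concrete. The functional form below is the standard compact q-Gaussian for a nonlinear diffusion exponent ν > 1 (with q = 2 - ν); the parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def compact_q_gaussian(x, nu, beta):
    """Unnormalized stationary solution of the nonlinear Fokker-Planck
    equation with diffusion exponent nu (q = 2 - nu) under harmonic
    confinement: P(x) ∝ [1 - (nu-1)*beta*x^2]_+^(1/(nu-1)) for nu > 1.
    The [.]_+ cutoff gives the distribution its compact support."""
    core = np.clip(1.0 - (nu - 1.0) * beta * x**2, 0.0, None)
    return core ** (1.0 / (nu - 1.0))

def carnot_efficiency(theta1, theta2):
    """Efficiency of the generalized Carnot cycle between effective
    temperatures theta1 > theta2: eta = 1 - theta2/theta1."""
    return 1.0 - theta2 / theta1

# nu = 2 corresponds to interacting vortices in type-II superconductors.
nu, beta = 2.0, 1.0
x = np.linspace(-2.0, 2.0, 4001)
dx = x[1] - x[0]
p = compact_q_gaussian(x, nu, beta)
p /= p.sum() * dx                             # normalize numerically

cutoff = 1.0 / np.sqrt((nu - 1.0) * beta)     # support is |x| < cutoff
```

For ν = 2 the distribution reduces to an inverted parabola with a sharp cutoff at |x| = 1/√β, which is exactly the kind of compact support the abstract lists as a basic requirement of the framework.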
Tertiary structure-based analysis of microRNA–target interactions
Gan, Hin Hark; Gunsalus, Kristin C.
2013-01-01
Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to within ∼3.8 Å root-mean-square deviation from their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009
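The comparison of computed duplex enthalpies, entropies, and free energies with titration calorimetry rests on the standard thermodynamic relation ΔG = ΔH - TΔS. A minimal sketch, using hypothetical numbers rather than the paper's measured values:

```python
def duplex_free_energy(dH, dS, T=310.15):
    """Gibbs free energy of duplex formation, dG = dH - T*dS.
    dH in kcal/mol, dS in kcal/(mol*K), T in kelvin
    (310.15 K = 37 °C, physiological temperature)."""
    return dH - T * dS

# Hypothetical duplex: formation is enthalpy-driven (dH < 0) and
# entropically opposed (dS < 0), as is typical for RNA hybridization.
dG = duplex_free_energy(dH=-50.0, dS=-0.12)
```

A negative ΔG at physiological temperature indicates spontaneous duplex formation; calorimetry supplies ΔH and ΔS directly, which is what makes it a natural benchmark for the computed values.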