Sample records for information entropy analysis

  1. Information Entropy Analysis of the H1N1 Genetic Code

    NASA Astrophysics Data System (ADS)

    Martwick, Andy

    2010-03-01

    During the current H1N1 pandemic, viral samples are being obtained from large numbers of infected people world-wide and are being sequenced on the NCBI Influenza Virus Resource Database. The information entropy of the sequences was computed from the probability of occurrence of each nucleotide base at every position of each set of sequences using Shannon's definition of information entropy, H = ∑_b p_b log₂(1/p_b), where H is the observed information entropy at each nucleotide position and p_b is the probability of occurrence of each of the nucleotide bases A, C, G, U. Information entropy of the current H1N1 pandemic is compared to reference human and swine H1N1 entropy. As expected, the current H1N1 entropy is in a low entropy state and has a very large mutation potential. Using the entropy method in mature genes we can identify low entropy regions of nucleotides that generally correlate to critical protein function.
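
The per-position entropy described above can be sketched in a few lines of Python. This is a toy illustration of the formula only, not the paper's pipeline; the function name and the tiny hand-made "alignment" are ours, and real sequence handling is assumed away:

```python
from collections import Counter
from math import log2

def positional_entropy(seqs):
    """Shannon information entropy H = sum_b p_b * log2(1/p_b) at each
    position of a set of equal-length nucleotide sequences."""
    H = []
    for i in range(len(seqs[0])):
        counts = Counter(s[i] for s in seqs)
        n = sum(counts.values())
        H.append(sum((c / n) * log2(n / c) for c in counts.values()))
    return H

# Toy alignment: position 0 is fully conserved (H = 0 bits), position 1
# is maximally variable over the four bases A, C, G, U (H = 2 bits).
print(positional_entropy(["AA", "AC", "AG", "AU"]))  # [0.0, 2.0]
```

Low-entropy positions in such a profile are the conserved ones, which is what the abstract correlates with critical protein function.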

  2. Entropy and generalized least square methods in assessment of the regional value of streamgages

    USGS Publications Warehouse

    Markus, M.; Vernon, Knapp H.; Tasker, Gary D.

    2003-01-01

    The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One of the important aspects of the study was to assess the regional value of each station through an assessment of the information transfer among gaging records for low, average, and high flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method, developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.

  3. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed, which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservations are considered as optimization constraints. The optimal enzyme rate constants computed in this way also yield the most uniform probability distribution of the enzyme states, which accounts for the maximal Shannon information entropy. By means of the stability analysis it is also demonstrated that maximal density of entropy production in that enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example, in which the density of entropy production and the Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.

  4. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
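
The filtering viewpoint on MSE's coarse-graining step can be made concrete with a minimal sketch. This is our simplification, not the paper's FME/PLFME code: the boxcar averaging below is the piecewise-constant filter the authors reinterpret, and the sample-entropy estimator is a bare-bones plug-in version without the usual self-match bookkeeping:

```python
from math import log

def coarse_grain(x, scale):
    """Classic MSE coarse-graining: average non-overlapping windows of
    length `scale`, i.e. filter x with a piecewise-constant (boxcar)
    filter and then downsample."""
    return [sum(x[i*scale:(i+1)*scale]) / scale
            for i in range(len(x) // scale)]

def sample_entropy(x, m=2, r=0.2):
    """Simplified sample entropy: -ln(A/B), where B and A count template
    pairs of length m and m+1 within Chebyshev tolerance r."""
    def pairs(mm):
        t = [x[i:i+mm] for i in range(len(x) - mm + 1)]
        return sum(1 for i in range(len(t)) for j in range(i + 1, len(t))
                   if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r)
    B, A = pairs(m), pairs(m + 1)
    return -log(A / B) if A and B else float("inf")

x = [0, 1] * 50                  # strictly periodic toy signal
print(coarse_grain(x, 2))        # boxcar at scale 2 flattens it to 0.5s
print(sample_entropy(x))         # highly regular signal -> near zero
```

Replacing `coarse_grain` with another filter (e.g. a piecewise-linear one) is the generalization the abstract describes.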

  5. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  6. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
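
For finite-alphabet sequences, the model-free case mentioned above, a plug-in transfer entropy estimate can be sketched directly from empirical counts (history length 1; the function and variable names are ours). Under the null of zero transfer entropy, the paper's log-likelihood ratio statistic is, up to the 2n·ln 2 scaling, this same plug-in quantity:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of TE(Y -> X), history length 1, in bits:
    sum over observed triples of p(x1,x0,y0) log2 p(x1|x0,y0)/p(x1|x0)."""
    n = len(x) - 1
    jxy = Counter((x[t+1], x[t], y[t]) for t in range(n))
    jx  = Counter((x[t+1], x[t]) for t in range(n))
    cxy = Counter((x[t], y[t]) for t in range(n))
    cx  = Counter(x[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in jxy.items():
        te += (c / n) * log2((c / cxy[(x0, y0)]) / (jx[(x1, x0)] / cx[x0]))
    return te

# Y drives X with a one-step lag, so TE(Y->X) should be about 1 bit,
# while TE(X->Y) should be near zero (plug-in bias only).
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]
print(transfer_entropy(x, y))   # close to 1 bit
print(transfer_entropy(y, x))   # close to 0
```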

  7. Measurement of entanglement entropy in the two-dimensional Potts model using wavelet analysis.

    PubMed

    Tomita, Yusuke

    2018-05-01

    A method is introduced to measure the entanglement entropy using a wavelet analysis. Using this method, the two-dimensional Haar wavelet transform of a configuration of Fortuin-Kasteleyn (FK) clusters is performed. The configuration represents a direct snapshot of spin-spin correlations since spin degrees of freedom are traced out in FK representation. A snapshot of FK clusters loses image information at each coarse-graining process by the wavelet transform. It is shown that the loss of image information measures the entanglement entropy in the Potts model.

  8. Use of information entropy measures of sitting postural sway to quantify developmental delay in infants

    PubMed Central

    Deffeyes, Joan E; Harbourne, Regina T; DeJong, Stacey L; Kyvelidou, Anastasia; Stuberg, Wayne A; Stergiou, Nicholas

    2009-01-01

    Background By quantifying the information entropy of postural sway data, the complexity of the postural movement of different populations can be assessed, giving insight into pathologic motor control functioning. Methods In this study, developmental delay of motor control function in infants was assessed by analysis of sitting postural sway data acquired from force plate center of pressure measurements. Two types of entropy measures were used: symbolic entropy, including a new asymmetric symbolic entropy measure, and approximate entropy, a more widely used entropy measure. For each method of analysis, parameters were adjusted to optimize the separation of the results from the infants with delayed development from infants with typical development. Results The method that gave the widest separation between the populations was the asymmetric symbolic entropy method, which we developed by modification of the symbolic entropy algorithm. The approximate entropy algorithm also performed well, using parameters optimized for the infant sitting data. The infants with delayed development were found to have less complex patterns of postural sway in the medial-lateral direction, and were found to have different left-right symmetry in their postural sway, as compared to typically developing infants. Conclusion The results of this study indicate that optimization of the entropy algorithm for infant sitting postural sway data can greatly improve the ability to separate the infants with developmental delay from typically developing infants. PMID:19671183
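
A toy version of the symbolization idea, binarizing the sway signal and taking the Shannon entropy of short symbol words, might look like the sketch below. This is our simplification for illustration; the study's symbolic and asymmetric-symbolic algorithms are more involved:

```python
from collections import Counter
from math import log2

def symbolic_entropy(x, word_len=3):
    """Binarize a signal about its mean, then compute the Shannon entropy
    (in bits) of overlapping symbol words of length word_len."""
    mean = sum(x) / len(x)
    sym = ''.join('1' if v > mean else '0' for v in x)
    words = Counter(sym[i:i+word_len] for i in range(len(sym) - word_len + 1))
    n = sum(words.values())
    return sum((c / n) * log2(n / c) for c in words.values())

# A strictly alternating signal uses only the words 010 and 101,
# so its word entropy is exactly 1 bit; a constant signal gives 0.
print(symbolic_entropy([0, 1] * 50))  # 1.0
```

Lower word entropy corresponds to the less complex sway patterns the study reports for infants with delayed development.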

  9. Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.

    PubMed

    Ferrari, Alberto

    2017-01-01

    Shannon entropy is being increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns and animal vocalizations. Yet, the distributional properties of information entropy as a random variable have seldom been the object of study, leading researchers to rely mainly on linear models or simulation-based analytical approaches to assess differences in information content when entropy is measured repeatedly in different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results coming from studies in the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a data set coming from a real experiment on animal communication.

  10. Financial time series analysis based on effective phase transfer entropy

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.

  11. Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy

    NASA Astrophysics Data System (ADS)

    Dun, Xiaohong

    2018-05-01

    With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response to this phenomenon, the system's power quality issues need timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, combining the unique advantages of the three in signal processing: the wavelet transform provides time-frequency localization, singular value decomposition extracts the basic modal characteristics of the data, and information entropy quantifies the resulting features. Based on the theory of singular value decomposition, the wavelet coefficient matrix obtained from the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. Then the statistical properties of information entropy are used to analyze the uncertainty of the singular value set, so as to give a definite measurement of the complexity of the original signal. Wavelet singular entropy thus has good application prospects in fault detection, classification and protection. The MATLAB simulation shows that the use of wavelet singular entropy for harmonic analysis of the locomotive and traction power system is effective.

  12. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are extended to the fractional cases: weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in the above two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows a higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, nonlinear complexity behaviors are compared numerically between the return series of the Potts financial model and actual stock markets, and the empirical results confirm the feasibility of the proposed model.

  13. Computing algebraic transfer entropy and coupling directions via transcripts

    NASA Astrophysics Data System (ADS)

    Amigó, José M.; Monetti, Roberto; Graff, Beata; Graff, Grzegorz

    2016-11-01

    Most random processes studied in nonlinear time series analysis take values on sets endowed with a group structure, e.g., the real and rational numbers, and the integers. This fact allows one to associate with each pair of group elements a third element, called their transcript, which is defined as the product of the second element in the pair times the inverse of the first one. The transfer entropy of two such processes is called algebraic transfer entropy. It measures the information transferred between two coupled processes whose values belong to a group. In this paper, we show that, subject to one constraint, the algebraic transfer entropy matches the (in general, conditional) mutual information of certain transcripts with one variable less. This property has interesting practical applications, especially to the analysis of short time series. We also derive weak conditions for the 3-dimensional algebraic transfer entropy to yield the same coupling direction as the corresponding mutual information of transcripts. A related issue concerns the use of mutual information of transcripts to determine coupling directions in cases where the conditions just mentioned are not fulfilled. We checked the latter possibility in the lowest dimensional case with numerical simulations and cardiovascular data, and obtained positive results.
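
In the symmetric group, the natural setting for ordinal-pattern time series, the transcript construction can be sketched directly (permutations as tuples; function names are ours, for illustration only):

```python
def compose(p, q):
    """Composition of permutations given as tuples: (p o q)[i] = p[q[i]]."""
    return tuple(p[i] for i in q)

def inverse(p):
    """Inverse permutation: inverse(p)[p[i]] = i."""
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def transcript(alpha, beta):
    """Transcript of the pair (alpha, beta): tau = beta o alpha^{-1},
    the unique group element with tau o alpha = beta."""
    return compose(beta, inverse(alpha))

a, b = (1, 2, 0), (2, 0, 1)
t = transcript(a, b)
print(t, compose(t, a) == b)  # the transcript carries beta relative to alpha
```

For the additive group of real numbers the same construction reduces to the increment b - a, which is why transcripts generalize differencing.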

  14. An Information Transmission Measure for the Analysis of Effective Connectivity among Cortical Neurons

    PubMed Central

    Law, Andrew J.; Sharma, Gaurav; Schieber, Marc H.

    2014-01-01

    We present a methodology for detecting effective connections between simultaneously recorded neurons using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally-measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible connectivity structure than transfer entropy. PMID:21096617

  15. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    NASA Astrophysics Data System (ADS)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has been applied to a variety of problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by determining the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, the SEA is to be extended to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the water treatment process are taken into account, and their impact on the environment and human health is quantified. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs, with this management tool yielding a determination of the efficiency of WWTPs. By improving and optimizing the efficiency of WWTPs with respect to the state of the art of technology, waste water treatment could become more resource-preserving.

  16. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.

  17. Multi-scale symbolic transfer entropy analysis of EEG

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetrical information using multi-scale transfer entropy. Multi-scale processing with scale factors from 1 to 199 in steps of 2 is applied to EEG from healthy people and epileptic patients, and the sequences are then symbolized by permutation with embedding dimension 3 and by a global approach. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale-factor intervals over which the two kinds of EEG are satisfactorily distinguished by entropy are (37, 57) for permutation and (65, 85) for the global approach. At scale factor 67, the transfer entropies of the healthy and epileptic subjects under permutation symbolization, 0.1137 and 0.1028, show their biggest difference; the corresponding values under global symbolization, 0.0641 and 0.0601, occur at scale factor 165. The results show that permutation, which takes the contribution of local information into account, discriminates better and is more effective in our multi-scale transfer entropy analysis of EEG.
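
The permutation symbolization with embedding dimension 3 can be sketched as follows (standard Bandt-Pompe ordinal patterns; the coarse-graining and transfer-entropy stages of the paper are omitted, and the function name is ours):

```python
from itertools import permutations

def permutation_symbols(x, dim=3):
    """Map each overlapping window of length `dim` to the index of its
    ordinal pattern, i.e. the ranking of the values in the window."""
    patterns = {p: i for i, p in enumerate(permutations(range(dim)))}
    symbols = []
    for k in range(len(x) - dim + 1):
        w = x[k:k+dim]
        order = tuple(sorted(range(dim), key=lambda i: w[i]))
        symbols.append(patterns[order])
    return symbols

# Windows (4,7,9), (7,9,10) are increasing; (9,10,6) has its smallest
# value last; and so on. With dim=3 there are 3! = 6 possible symbols.
print(permutation_symbols([4, 7, 9, 10, 6, 11, 3]))  # [0, 0, 4, 2, 4]
```

The resulting symbol streams (and their time-reversed counterparts) are what a transfer-entropy estimator would then consume.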

  18. Rényi entropy measure of noise-aided information transmission in a binary channel.

    PubMed

    Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès

    2010-05-01

    This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, in definite conditions, when seeking the Rényi information measures that best exploit stochastic resonance, then nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. A confrontation of the quantitative information measures with visual perception is also proposed in an experiment of noise-aided binary image transmission.

  19. EEG entropy measures indicate decrease of cortical information processing in Disorders of Consciousness.

    PubMed

    Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel

    2016-02-01

    Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG in patients, that was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The utilized EEG entropy analyses were able to relate to patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption however, is a major obstacle to the application of these estimators in neuroscience as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489

  21. Cardiorespiratory Information Dynamics during Mental Arithmetic and Sustained Attention

    PubMed Central

    Widjaja, Devy; Montalto, Alessandro; Vlemincx, Elke; Marinazzo, Daniele; Van Huffel, Sabine; Faes, Luca

    2015-01-01

    An analysis of cardiorespiratory dynamics during mental arithmetic, which induces stress, and sustained attention was conducted using information theory. The information storage and internal information of heart rate variability (HRV) were determined respectively as the self-entropy of the tachogram, and the self-entropy of the tachogram conditioned to the knowledge of respiration. The information transfer and cross information from respiration to HRV were assessed as the transfer and cross-entropy, both measures of cardiorespiratory coupling. These information-theoretic measures identified significant nonlinearities in the cardiorespiratory time series. Additionally, it was shown that, although mental stress is related to a reduction in vagal activity, no difference in cardiorespiratory coupling was found when several mental states (rest, mental stress, sustained attention) are compared. However, the self-entropy of HRV conditioned to respiration was very informative to study the predictability of RR interval series during mental tasks, and showed higher predictability during mental arithmetic compared to sustained attention or rest. PMID:26042824

  22. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

    The phenomenon of data loss always occurs in the analysis of large databases, so maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate synthetic signals in which data segments are randomly removed from the original signal according to Gaussian and exponential distributions. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to logistic map testing. (2) Base-scale entropy analysis yields more stable results and is not sensitive to data loss. (3) The loss percentage of HRV signals should be kept below p = 30 %, within which the analysis can still provide useful information in clinical applications.
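
The segmentation-style data-loss simulation can be sketched like this. It is a hypothetical re-implementation, not the authors' code: segment lengths are drawn from an exponential distribution, removal stops at a target loss fraction p, and the parameter names are ours:

```python
import random

def remove_segments(x, p, mean_len):
    """Delete randomly placed, non-overlapping runs of samples, with run
    lengths drawn from an exponential distribution of mean mean_len,
    until exactly a fraction p of the samples has been removed."""
    x = list(x)
    target = int(p * len(x))
    removed = 0
    while removed < target:
        seg = max(1, int(random.expovariate(1.0 / mean_len)))
        start = random.randrange(len(x))
        seg = min(seg, len(x) - start, target - removed)  # cap at the target
        del x[start:start + seg]
        removed += seg
    return x

random.seed(42)
kept = remove_segments(range(1000), 0.30, 10)
print(len(kept))  # 700: exactly 30% of the samples removed
```

Entropy measures computed on `kept` versus the original signal then quantify how robust each estimator is to the loss.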

  23. Competition between Homophily and Information Entropy Maximization in Social Networks

    PubMed Central

    Zhao, Jichang; Liang, Xiao; Xu, Ke

    2015-01-01

    In social networks, it is conventionally thought that two individuals with more overlapping friends tend to establish a new friendship, which could be stated as homophily breeding new connections, while the recent hypothesis of maximum information entropy has been presented as a possible origin of effective navigation in small-world networks. We find that there exists a competition between information entropy maximization and homophily in local structure, through both theoretical and experimental analysis. This competition suggests that a newly built relationship between two individuals with more common friends would lead to less information entropy gain for them. We demonstrate that both assumptions coexist in the evolution of the social network: the rule of maximum information entropy produces weak ties in the network, while the law of homophily makes the network highly clustered locally, giving individuals strong and trusted ties. A toy model is also presented to demonstrate the competition and evaluate the roles of the different rules in the evolution of real networks. Our findings could shed light on social network modeling from a new perspective. PMID:26334994

  4. Towards an information geometric characterization/classification of complex systems. I. Use of generalized entropies

    NASA Astrophysics Data System (ADS)

    Ghikas, Demetris P. K.; Oikonomou, Fotios D.

    2018-04-01

    Using the generalized entropies which depend on two parameters we propose a set of quantitative characteristics derived from the Information Geometry based on these entropies. Our aim, at this stage, is to construct first some fundamental geometric objects which will be used in the development of our geometrical framework. We first establish the existence of a two-parameter family of probability distributions. Then using this family we derive the associated metric and we state a generalized Cramer-Rao Inequality. This gives a first two-parameter classification of complex systems. Finally computing the scalar curvature of the information manifold we obtain a further discrimination of the corresponding classes. Our analysis is based on the two-parameter family of generalized entropies of Hanel and Thurner (2011).

  5. Analysis of the phase transition in the two-dimensional Ising ferromagnet using a Lempel-Ziv string-parsing scheme and black-box data-compression utilities

    NASA Astrophysics Data System (ADS)

    Melchert, O.; Hartmann, A. K.

    2015-02-01

    In this work we consider information-theoretic observables to analyze short symbolic sequences, comprising time series that represent the orientation of a single spin in a two-dimensional (2D) Ising ferromagnet on a square lattice of size L^2 = 128^2 for different system temperatures T. The latter were chosen from an interval enclosing the critical point Tc of the model. At small temperatures the sequences are thus very regular; at high temperatures they are maximally random. In the vicinity of the critical point, nontrivial, long-range correlations appear. Here we implement estimators for the entropy rate, excess entropy (i.e., "complexity"), and multi-information. First, we implement a Lempel-Ziv string-parsing scheme, providing seemingly elaborate entropy rate and multi-information estimates and an approximate estimator for the excess entropy. Furthermore, we apply easy-to-use black-box data-compression utilities, providing approximate estimators only. For comparison and to yield results for benchmarking purposes, we implement the information-theoretic observables also based on the well-established M-block Shannon entropy, which is more tedious to apply compared to the first two "algorithmic" entropy estimation procedures. To test how well one can exploit the potential of such data-compression techniques, we aim at detecting the critical point of the 2D Ising ferromagnet. Among the above observables, the multi-information, which is known to exhibit an isolated peak at the critical point, is very easy to replicate by means of both efficient algorithmic entropy estimation procedures. Finally, we assess how well the various algorithmic entropy estimates compare to the more conventional block entropy estimates and illustrate a simple modification that yields enhanced results.
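The two "algorithmic" estimators can be sketched in a few lines (NumPy assumed; the spin time series is represented as a binary string). The normalization c(n)·log2(n)/n for the Lempel-Ziv phrase count and the compressed-bits-per-symbol ratio for the black-box route are the standard choices; finite-size corrections are omitted.

```python
import zlib
import numpy as np

def lz76_phrases(s):
    """Phrase count of the Lempel-Ziv (1976) parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        length = 1
        # grow the phrase while it can still be copied from earlier text
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        c += 1
        i += length
    return c

def lz_entropy_rate(s):
    """Entropy-rate estimate in bits/symbol: c(n) * log2(n) / n."""
    return lz76_phrases(s) * np.log2(len(s)) / len(s)

def zlib_entropy_rate(s):
    """Black-box estimate: compressed size in bits per input symbol."""
    raw = s.encode()
    return 8 * len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(0)
hot = ''.join(rng.choice(['0', '1'], size=2000))   # T -> inf: random spin
cold = '0' * 2000                                  # T -> 0: frozen spin
print(lz_entropy_rate(hot), lz_entropy_rate(cold))
```

The random sequence should score near 1 bit/symbol and the frozen one near 0, mirroring the high- and low-temperature limits described above.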

  6. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis of Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

  7. Shannon information entropy in heavy-ion collisions

    NASA Astrophysics Data System (ADS)

    Ma, Chun-Wang; Ma, Yu-Gang

    2018-03-01

    The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. Shannon information entropy quantifies the information carried by a quantity through its distribution, and information-entropy-based methods have been developed extensively in many scientific areas, including physics. The dynamical nature of the heavy-ion collision (HIC) process makes the nuclear matter and its evolution difficult and complex to study; here Shannon information entropy theory can provide new methods and observables for understanding the physical phenomena both theoretically and experimentally. To better understand HIC processes, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamical models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on key open questions. It is suggested to further develop information entropy methods in nuclear reaction models, as well as new analysis methods for studying the properties of nuclear matter in HICs, especially the evolution of the dynamical system.

  8. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
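The core maximum-entropy argument for travel times is easy to check numerically (SciPy assumed): among nonnegative densities with a fixed mean, the exponential has the largest differential entropy, so any alternative with the same mean, e.g. a gamma, must score lower. The mean value here is hypothetical.

```python
import numpy as np
from scipy import stats

mean_time = 2.0   # hypothetical mean particle travel time

# differential entropy (nats) of the exponential with this mean
h_exp = stats.expon(scale=mean_time).entropy()

# gamma densities with the same mean but different shapes
for k in (1.5, 2.0, 3.0):
    h_gamma = stats.gamma(k, scale=mean_time / k).entropy()
    assert h_gamma < h_exp   # exponential maximizes entropy at fixed mean

print(float(h_exp))
```

The same comparison with a Laplace density (fixed mean and mean absolute deviation, support on the whole line) motivates the acceleration result quoted above.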

  9. Information Transfer Analysis of Spontaneous Low-frequency Fluctuations in Cerebral Hemodynamics and Cardiovascular Dynamics

    NASA Astrophysics Data System (ADS)

    Katura, Takusige; Tanaka, Naoki; Obata, Akiko; Sato, Hiroki; Maki, Atsushi

    2005-08-01

    In this study, from the information-theoretic viewpoint, we analyzed the interrelation between the spontaneous low-frequency fluctuations around 0.1 Hz in hemoglobin concentration in the cerebral cortex, mean arterial blood pressure, and heart rate. For this analysis, as measures of information transfer, we used the transfer entropy (TE) proposed for two-factor systems by Schreiber and the intrinsic transfer entropy (ITE), introduced for further analysis of three-factor systems by extending the original TE. Analysis based on both TE and ITE suggests that the systemic cardiovascular fluctuations alone cannot account for the cerebrovascular fluctuations; that is, regulation of regional cerebral energy metabolism is an important candidate generation mechanism. Such information transfer analysis seems useful for revealing the interrelations among elements that regulate one another in a complex manner.
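Schreiber's transfer entropy has a compact plug-in form for discretized series. The sketch below (NumPy assumed) uses history length 1 and binary symbols; the intrinsic TE used in the paper additionally conditions on a third series, which is omitted here.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in TE_{X->Y} (bits), history length 1:
    sum over (y1, y0, x0) of p(y1,y0,x0) * log[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((y1, y0) for y1, y0, _ in triples)
    c_z = Counter((y0, x0) for _, y0, x0 in triples)
    c_y = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_xyz.items():
        te += (c / n) * np.log2(c * c_y[y0] / (c_yz[(y1, y0)] * c_z[(y0, x0)]))
    return te

rng = np.random.default_rng(3)
x = list(rng.integers(0, 2, 5000))
y_driven = [0] + x[:-1]                 # y copies x with one step of delay
y_indep = list(rng.integers(0, 2, 5000))
print(transfer_entropy(x, y_driven), transfer_entropy(x, y_indep))
```

The driven pair should yield close to 1 bit and the independent pair close to 0, which is the asymmetry that makes TE a directed measure.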

  10. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes, but no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B), and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. The root mean square error (RMSE) values were very small, indicating that the five generalized distributions fit the extreme rainfall data well. Among them, the GB2 and the Halphen family generally gave a better fit according to the Akaike information criterion (AIC), making these generalized distributions among the best choices for frequency analysis. The entropy-based derivation thus leads to a new way of performing frequency analysis of hydrometeorological extremes.
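How such an AIC comparison works in practice can be sketched with SciPy, which ships the generalized gamma as `scipy.stats.gengamma`. The data here are synthetic stand-ins, not the study's rainfall records, and the location parameter is pinned at zero for stability.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rain = rng.gamma(shape=0.8, scale=12.0, size=2000)   # synthetic "extreme rainfall"

def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * n_params - 2 * log_likelihood

scores = {}
for dist in (stats.gengamma, stats.gamma, stats.expon):
    params = dist.fit(rain, floc=0)                  # location fixed at zero
    ll = dist.logpdf(rain, *params).sum()
    scores[dist.name] = aic(ll, len(params) - 1)     # loc was fixed, not fitted

print(sorted(scores, key=scores.get))                # best (lowest AIC) first
```

The same loop, pointed at observed annual-maximum series, is the model-selection step the abstract describes; the Halphen family is not in SciPy and would need custom densities.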

  11. Refined generalized multiscale entropy analysis for physiological signals

    NASA Astrophysics Data System (ADS)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measure and has been successfully applied in various fields. However, it takes into account only the information of mean values (the first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments to coarse-grain a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined entropy estimate, and the statistical reliability of the sample entropy estimate decreases as the scale factor increases. We therefore developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy; it is especially suitable for short time series. Besides, we discuss how outliers, data loss, and other signal processing issues affect RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure, and patients with atrial fibrillation, respectively, and compare it to several popular complexity metrics. The results demonstrate that RMSEσ2-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
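A sketch (NumPy assumed) of the σ² coarse-graining step together with a plain sample-entropy estimator; the paper's refinements (e.g. how the tolerance r is handled across scales) are deliberately left out.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B), self-matches excluded."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.15 * np.std(x)
    def matches(mm):
        # use the same number of templates for both orders
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)[:len(x) - m]
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)              # drop self-matches
        return np.count_nonzero(d <= r)
    b, a = matches(m), matches(m + 1)
    return np.inf if a == 0 else -np.log(a / b)

def coarse_grain_sigma2(x, scale):
    """Second-moment coarse-graining: each window becomes its variance."""
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).var(axis=1)

rng = np.random.default_rng(1)
noise = rng.standard_normal(1000)
for scale in (1, 2, 4):
    series = coarse_grain_sigma2(noise, scale) if scale > 1 else noise
    print(scale, sample_entropy(series))
```

Note how the coarse-grained series shrinks by a factor of the scale; that shrinkage is exactly why SampEn becomes unreliable (or undefined, A = 0) at large scales, the problem RMSEσ2 targets.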

  12. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainty always accompanies a forecast, and it may affect the forecasting results and lead to large variations. Uncertainties must therefore be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns; streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, since spectral analysis can characterize patterns of streamflow variation and identify the periodicity of streamflow; that is, it permits extraction of significant information for understanding the streamflow process and predicting it. Applying entropy theory to streamflow forecasting involves determining the spectral density, determining the parameters, and extending the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy, and information theory is used to describe how these uncertainties are transported and aggregated through these processes.
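The spectral step, identifying the dominant periodicity before building the entropy-based forecast, can be sketched with a plain periodogram (NumPy assumed; the daily series below is synthetic).

```python
import numpy as np

def dominant_period(x, dt=1.0):
    """Locate the dominant cycle of a series from its periodogram."""
    x = np.asarray(x, float) - np.mean(x)
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), dt)
    k = 1 + int(np.argmax(spec[1:]))      # skip the zero-frequency bin
    return 1.0 / freqs[k]

t = np.arange(365 * 4)                    # four years of daily data
flow = 10.0 + 3.0 * np.sin(2 * np.pi * t / 365)  # synthetic annual cycle
print(dominant_period(flow))
```

With the annual period in hand, the paper's procedure fits the spectral density, solves for the entropy parameters, and extends the autocorrelation function to forecast.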

  13. Information theory analysis of Australian humpback whale song.

    PubMed

    Miksis-Olds, Jennifer L; Buck, John R; Noad, Michael J; Cato, Douglas H; Stokes, M Dale

    2008-10-01

    Songs produced by migrating whales were recorded off the coast of Queensland, Australia, over six consecutive weeks in 2003. Forty-eight independent song sessions were analyzed using information theory techniques. The average length of the songs estimated by correlation analysis was approximately 100 units, with song sessions lasting from 300 to over 3100 units. Song entropy, a measure of structural constraints, was estimated using three different methodologies: (1) the independently identically distributed model, (2) a first-order Markov model, and (3) the nonparametric sliding window match length (SWML) method, as described by Suzuki et al. [(2006). "Information entropy of humpback whale song," J. Acoust. Soc. Am. 119, 1849-1866]. The analysis finds that the song sequences of migrating Australian whales are consistent with the hierarchical structure proposed by Payne and McVay [(1971). "Songs of humpback whales," Science 173, 587-597], and recently supported mathematically by Suzuki et al. (2006) for singers on the Hawaiian breeding grounds. Both the SWML entropy estimates and the song lengths for the Australian singers in 2003 were lower than that reported by Suzuki et al. (2006) for Hawaiian whales in 1976-1978; however, song redundancy did not differ between these two populations separated spatially and temporally. The average total information in the sequence of units in Australian song was approximately 35 bits/song. Aberrant songs (8%) yielded entropies similar to the typical songs.
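The first two entropy estimates used above have compact plug-in forms (NumPy assumed; the SWML estimator is more involved and omitted). Symbols here stand in for the paper's song units.

```python
import numpy as np
from collections import Counter

def iid_entropy(seq):
    """Zeroth-order (independently identically distributed) entropy, bits/unit."""
    counts = np.array(list(Counter(seq).values()), float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def markov_entropy_rate(seq):
    """First-order Markov entropy rate H(X_t | X_{t-1}), bits/unit."""
    pairs = Counter(zip(seq[:-1], seq[1:]))
    firsts = Counter(seq[:-1])
    n = len(seq) - 1
    return -sum((c / n) * np.log2(c / firsts[a]) for (a, b), c in pairs.items())

rigid = 'abcde' * 100   # perfectly ordered "song": structure removes all uncertainty
print(iid_entropy(rigid), markov_entropy_rate(rigid))
```

The gap between the two estimates is the redundancy contributed by sequential structure, which is what the song comparison above turns on.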

  14. Identification of genome regions determining semen quality in Holstein-Friesian bulls using information theory.

    PubMed

    Borowska, Alicja; Szwaczkowski, Tomasz; Kamiński, Stanisław; Hering, Dorota M; Kordan, Władysław; Lecewicz, Marek

    2018-05-01

    Use of information theory can be an alternative statistical approach to detecting genome regions and candidate genes that are associated with livestock traits. The aim of this study was to verify the validity of SNP effects on some semen quality variables of bulls using entropy analysis. Records from 288 Holstein-Friesian bulls from one AI station were included. The following semen quality variables were analyzed: CASA kinematic variables of sperm (total motility, average path velocity, straight-line velocity, curvilinear velocity, amplitude of lateral head displacement, beat cross frequency, straightness, linearity), sperm membrane integrity (plasmalemma integrity, mitochondrial function), and sperm ATP content. Molecular data included 48,192 SNPs. After filtering (call rate = 0.95 and MAF = 0.05), 34,794 SNPs were included in the entropy analysis. The entropy and conditional entropy were estimated for each SNP; conditional entropy quantifies the uncertainty remaining about the values of a variable given knowledge of the SNP. The most informative SNPs for each variable were determined. The computations were performed using the R statistical package. A majority of the loci had relatively small contributions. The most informative SNPs for all variables were mainly located on chromosomes 3, 4, 5 and 16. The results indicate that important genome regions and candidate genes determining semen quality variables in bulls are located on a number of chromosomes. Some detected clusters of SNPs were located in RNA genes (U6 and 5S_rRNA) for all the analyzed variables. Associations between the PARK2 and GALNT13 genes and some semen characteristics were also detected. Copyright © 2018 Elsevier B.V. All rights reserved.
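The study's core quantity, the conditional entropy of a trait given a SNP genotype, is a short computation once counts are in place. This sketch uses made-up genotype and phenotype codes purely for illustration (the study itself used R).

```python
import numpy as np
from collections import Counter

def entropy(values):
    """Plug-in Shannon entropy (bits) of a list of discrete values."""
    counts = np.array(list(Counter(values).values()), float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(trait, snp):
    """H(trait | SNP) = sum_g p(g) * H(trait | SNP = g); a low value means
    knowing the genotype removes much of the uncertainty about the trait."""
    n = len(trait)
    h = 0.0
    for g in set(snp):
        sub = [t for t, s in zip(trait, snp) if s == g]
        h += (len(sub) / n) * entropy(sub)
    return h

# toy example: genotypes coded 0/1/2, trait classes "low"/"high" (hypothetical)
snp   = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
trait = ['low', 'low', 'low', 'high', 'high', 'high', 'low', 'high', 'low', 'high']
print(entropy(trait), conditional_entropy(trait, snp))
```

Ranking SNPs by how far H(trait | SNP) falls below H(trait) is the "most informative SNP" criterion described above.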

  15. On quantum Rényi entropies: A new generalization and some properties

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco

    2013-12-01

    The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.

  16. Multiwavelet packet entropy and its application in transmission line fault recognition and classification.

    PubMed

    Liu, Zhigang; Han, Zhiwei; Zhang, Yang; Zhang, Qiaoge

    2014-11-01

    Multiwavelets possess better properties than traditional wavelets. Multiwavelet packet transformation has more high-frequency information. Spectral entropy can be applied as an analysis index to the complexity or uncertainty of a signal. This paper tries to define four multiwavelet packet entropies to extract the features of different transmission line faults, and uses a radial basis function (RBF) neural network to recognize and classify 10 fault types of power transmission lines. First, the preprocessing and postprocessing problems of multiwavelets are presented. Shannon entropy and Tsallis entropy are introduced, and their difference is discussed. Second, multiwavelet packet energy entropy, time entropy, Shannon singular entropy, and Tsallis singular entropy are defined as the feature extraction methods of transmission line fault signals. Third, the plan of transmission line fault recognition using multiwavelet packet entropies and an RBF neural network is proposed. Finally, the experimental results show that the plan with the four multiwavelet packet energy entropies defined in this paper achieves better performance in fault recognition. The performance with SA4 (symmetric antisymmetric) multiwavelet packet Tsallis singular entropy is the best among the combinations of different multiwavelet packets and the four multiwavelet packet entropies.
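The relation between the two entropies compared above is compact: Tsallis entropy recovers the Shannon form in the limit q → 1. A sketch in nats (NumPy assumed):

```python
import numpy as np

def shannon(p):
    """Shannon entropy (nats)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); -> Shannon as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

p = np.array([0.5, 0.3, 0.2])   # e.g. normalized wavelet-packet energies
print(shannon(p), tsallis(p, 0.999), tsallis(p, 2.0))
```

In the paper's construction, such a p would be the normalized multiwavelet packet energy spectrum; q tunes how strongly rare high-frequency components are weighted.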

  17. Rényi’s information transfer between financial time series

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad

    2012-05-01

    In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of underlying distributions. This is particularly pertinent in financial time series, where the knowledge of marginal events such as spikes or sudden jumps is of a crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices as sampled at a daily rate in the time period 02.01.1990-31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered in the period 02.04.2008-11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric with a distinct information surplus flowing from the Asia-Pacific region to both European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is particularly clearly seen from a careful analysis of Rényi information flow between the DAX and S&P500 indices.
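The accentuation controlled by q is easy to see on a toy distribution (a sketch; for the Rényi transfer-entropy construction itself, see the paper):

```python
import numpy as np

def renyi_entropy(p, q):
    """H_q = log2(sum_i p_i^q) / (1 - q); reduces to Shannon as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** q).sum()) / (1.0 - q))

# a distribution with a heavy centre and light tails
p = np.array([0.7, 0.2, 0.05, 0.05])
for q in (0.5, 1.0, 2.0):
    print(q, renyi_entropy(p, q))   # small q stresses rare events, large q frequent ones
```

H_q is nonincreasing in q, so q < 1 inflates the weight of marginal events such as spikes, which is exactly why the financial application above varies q.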

  18. Temporal information entropy of the Blood-Oxygenation Level-Dependent signals increases in the activated human primary visual cortex

    NASA Astrophysics Data System (ADS)

    DiNuzzo, Mauro; Mascali, Daniele; Moraschi, Marta; Bussu, Giorgia; Maraviglia, Bruno; Mangia, Silvia; Giove, Federico

    2017-02-01

    Time-domain analysis of blood-oxygenation level-dependent (BOLD) signals allows the identification of clusters of voxels responding to photic stimulation in primary visual cortex (V1). However, the characterization of information encoding into temporal properties of the BOLD signals of an activated cluster is poorly investigated. Here, we used Shannon entropy to determine spatial and temporal information encoding in the BOLD signal within the most strongly activated area of the human visual cortex during a hemifield photic stimulation. We determined the distribution profile of BOLD signals during epochs at rest and under stimulation within small (19-121 voxels) clusters designed to include only voxels driven by the stimulus as highly and uniformly as possible. We found consistent and significant increases (2-4% on average) in temporal information entropy during activation in contralateral but not ipsilateral V1, which was mirrored by an expected loss of spatial information entropy. These opposite changes coexisted with increases in both spatial and temporal mutual information (i.e. dependence) in contralateral V1. Thus, we showed that the first cortical stage of visual processing is characterized by a specific spatiotemporal rearrangement of intracluster BOLD responses. Our results indicate that while in the space domain BOLD maps may be incapable of capturing the functional specialization of small neuronal populations due to relatively low spatial resolution, some information encoding may still be revealed in the temporal domain by an increase of temporal information entropy.

  19. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes's maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes; for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics; and finally for multinomial mixture processes.
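The degeneracy claim for multinomial processes can be checked directly: the Boltzmann entropy per sample, (1/N) log W with W = N!/∏ n_i!, converges to the Shannon form (stdlib plus NumPy sketch):

```python
import numpy as np
from math import lgamma

def shannon(p):
    """Shannon entropy (nats)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def boltzmann_per_sample(counts):
    """(1/N) ln W for the multiplicity W = N! / prod(n_i!), via log-gamma."""
    n = sum(counts)
    return (lgamma(n + 1) - sum(lgamma(c + 1) for c in counts)) / n

counts = [5000, 3000, 2000]
p = np.array(counts) / sum(counts)
print(shannon(p), boltzmann_per_sample(counts))
```

The two numbers agree to O(log N / N), which is the degeneracy the abstract describes; for the history-dependent processes it studies, the multiplicity W no longer factorizes this way and the agreement breaks.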

  20. Backward transfer entropy: Informational measure for detecting hidden Markov models and its interpretations in thermodynamics, gambling and causality

    PubMed Central

    Ito, Sosuke

    2016-01-01

    The transfer entropy is a well-established measure of information flow, which quantifies directed influence between two stochastic time series and has been shown to be useful in a variety of fields of science. Here we introduce the transfer entropy of the backward time series, called the backward transfer entropy, and show that it quantifies how far the dynamics are from a hidden Markov model. Furthermore, we discuss physical interpretations of the backward transfer entropy in the completely different settings of thermodynamics for information processing and gambling with side information. In both settings, the backward transfer entropy characterizes a possible loss of some benefit, where the conventional transfer entropy characterizes a possible benefit. Our result implies a deep connection between thermodynamics and gambling in the presence of information flow, and suggests that the backward transfer entropy would be useful as a novel measure of information flow in nonequilibrium thermodynamics, biochemical sciences, economics, and statistics. PMID:27833120

  2. Optimization of rainfall networks using information entropy and temporal variability analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy can not only represent the uncertainty of the rainfall distribution but can also reflect the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability. Through an easy-to-implement greedy ranking algorithm based on the criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability in network evaluation. We propose a dynamic network evaluation framework that accounts for this temporal variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). In this way we can identify rainfall stations that are temporarily important or redundant and provide useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, the optimal MIMR networks for different periods (wet season or dry season) differ in entropy values, with the wet-season network tending to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that the rainfall network may best be optimized separately for changing meteorological conditions.
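A simplified greedy ranking in the spirit of MIMR (NumPy assumed): this sketch maximizes joint entropy gain only, whereas the actual MIMR criterion also weighs transmitted information against redundancy. The station data are synthetic and discretized into bins.

```python
import numpy as np

def joint_entropy(columns):
    """Plug-in joint entropy (bits) of discretized columns."""
    arr = np.column_stack(columns)
    _, counts = np.unique(arr, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def greedy_rank(data, k):
    """Greedily add the station whose inclusion adds the most joint entropy."""
    chosen, rest = [], list(range(data.shape[1]))
    for _ in range(k):
        best = max(rest, key=lambda j: joint_entropy([data[:, i] for i in chosen + [j]]))
        chosen.append(best)
        rest.remove(best)
    return chosen

rng = np.random.default_rng(5)
a = rng.integers(0, 4, 2000)   # binned rainfall at station A
b = a.copy()                   # station B duplicates A (fully redundant)
c = rng.integers(0, 4, 2000)   # station C is independent of A and B
data = np.column_stack([a, b, c])
print(greedy_rank(data, 2))    # the independent station is picked early
```

Because B adds no joint entropy once A is in, the greedy pass skips it, which is the redundancy-pruning behavior the ranking above relies on.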

  3. Prediction of microsleeps using pairwise joint entropy and mutual information between EEG channels.

    PubMed

    Baseer, Abdul; Weddell, Stephen J; Jones, Richard D

    2017-07-01

    Microsleeps are involuntary and brief instances of complete loss of responsiveness, typically of 0.5-15 s duration. They adversely affect performance in extended attention-driven jobs and can be fatal. Our aim was to predict microsleeps from 16-channel EEG signals. Two information-theoretic concepts, pairwise joint entropy and mutual information, were independently used to continuously extract features from the EEG signals. A k-nearest-neighbor (kNN) estimator with k = 3 was used to calculate both joint entropy and mutual information. Highly correlated features were discarded and the rest were ranked using the Fisher score followed by the average 3-fold cross-validation area under the receiver operating characteristic curve (AUC_ROC). The leave-one-out method (LOOM) was used to test the performance of the microsleep prediction system on independent data. The best prediction, 0.25 s ahead, achieved an AUC_ROC, sensitivity, precision, geometric mean (GM), and φ of 0.93, 0.68, 0.33, 0.75, and 0.38, respectively, with joint-entropy features and a single linear discriminant analysis (LDA) classifier.

  4. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
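A minimal version of the relative-wavelet-energy entropy used above, with the Haar filter hand-rolled in NumPy; a production analysis would use a proper wavelet library and smoother filters, and the level count here is illustrative.

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step -> (approximation, detail) coefficients."""
    x = np.asarray(x, float)
    return (x[0::2] + x[1::2]) / np.sqrt(2.0), (x[0::2] - x[1::2]) / np.sqrt(2.0)

def wavelet_entropy(x, levels=5):
    """Relative energies per resolution level and their Shannon entropy (nats)."""
    energies, a = [], x
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(float(np.sum(d ** 2)))   # detail-band energy
    energies.append(float(np.sum(a ** 2)))       # residual approximation energy
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return p, float(-(p * np.log(p)).sum())

rng = np.random.default_rng(2)
_, h_noise = wavelet_entropy(rng.standard_normal(1024))   # energy spread over bands
_, h_tonic = wavelet_entropy(np.sin(2 * np.pi * np.arange(1024) / 256))
print(h_noise, h_tonic)
```

A narrowband rhythm concentrates energy in one band and drives the entropy down, which is the signature of the ordered recruitment rhythm described in the abstract.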

  5. Bayesian or Laplacian inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using the Bayesian approach, entropy, information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing, such as entropy in source separation, Fisher information in model order selection, different maximum-entropy-based methods in time-series spectral estimation and, finally, general linear inverse problems.
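
    The Maximum Entropy Principle mentioned above can be made concrete with the classic dice example: find the distribution of maximum entropy on {1,…,6} subject to a prescribed mean. A hedged sketch follows; the exponential-family form is standard, while the bisection solver and its bounds are illustrative choices.

```python
import math

def maxent_mean(xs, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Max-entropy distribution on support points xs with a fixed mean:
    p_i proportional to exp(lam * x_i); lam is found by bisection,
    using the fact that the mean is monotone increasing in lam."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]
```

    With the unconstrained mean 3.5 the solution is the uniform distribution; a larger target mean tilts probability toward the high faces.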

  6. A general methodology for population analysis

    NASA Astrophysics Data System (ADS)

    Lazov, Petar; Lazov, Igor

    2014-12-01

    For a given population with N (current) and M (maximum) number of entities, modeled by a Birth-Death Process (BDP) of size M+1, we introduce a utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and an information parameter ν, which can be interpreted as the population's information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. With these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution pn=Prob{N=n}, n=0,1,…,M, of the quantity N, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; here, by definition, population entropy is the uncertainty related to the population. In this approach - its essential contribution - the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology: if one instead supposes a population of infinite size, most of the key quantities and results for finite populations that emerge in this methodology vanish.
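
    For the special case of a birth-death chain in which ρ is a constant birth/death-rate ratio (a simplifying assumption; the paper's BDP is more general), the equilibrium distribution and the population entropy reduce to the sketch below.

```python
import math

def bdp_equilibrium(rho, M):
    """Equilibrium distribution of a finite birth-death chain with constant
    birth/death-rate ratio rho: p_n proportional to rho**n, n = 0..M."""
    w = [rho ** n for n in range(M + 1)]
    z = sum(w)
    return [x / z for x in w]

def population_entropy(p):
    """Shannon entropy (nats) of the population distribution, i.e. the mean
    value of the population information -log p_n."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

    At ρ = 1 every macrostate is equally likely and the entropy attains its maximum, log(M+1).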

  7. Regional Sustainable Development Analysis Based on Information Entropy-Sichuan Province as an Example.

    PubMed

    Liang, Xuedong; Si, Dongyang; Zhang, Xinli

    2017-10-13

    According to the implementation of a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, which includes an economic subsystem, an ecological environmental subsystem and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measurement model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and the entropy change in each calendar year in Sichuan Province were analyzed to evaluate the province's sustainable development capacity. The established model effectively captured actual changes in sustainable development levels through the entropy-change response of the system; at the same time, it clearly demonstrated how the forty-six indicators from the three subsystems affect regional sustainable development, helping to fill a gap in sustainable development research.
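
    The entropy-weighting step behind this kind of evaluation index system is standard and can be sketched as follows; the toy indicator matrix and normalization are assumptions, and the paper's Brusselator analysis is not reproduced here.

```python
import math

def entropy_weights(matrix):
    """Entropy-weight method: rows are observations (e.g. years), columns are
    indicators; lower-entropy (more discriminating) indicators get higher
    weight. Entries must be positive."""
    m = len(matrix)
    entropies = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # entropy normalized by log(m) so it lies in [0, 1]
        entropies.append(-sum(pi * math.log(pi) for pi in p if pi > 0)
                         / math.log(m))
    divergence = [1.0 - h for h in entropies]   # degree of divergence
    s = sum(divergence)
    return [d / s for d in divergence]
```

    An indicator that is identical in every year carries no information (normalized entropy 1, weight 0); all weight shifts to indicators that actually vary.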

  8. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and physical and chemical interactions between groundwater and the porous media make solute transport in the medium more complicated still. An appropriate method to describe this complexity is essential when studying solute transport and transformation in porous media. Because information entropy measures uncertainty and disorder, we used information entropy theory to investigate the complexity of solute transport in heterogeneous porous media and to explore the connection between information entropy and that complexity. Based on Markov theory, a two-dimensional stochastic field of hydraulic conductivity (K) was generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increased as the complexity of the solute transport process increased. For the point source, the one-dimensional entropy of solute concentration first increased and then decreased along the X and Y directions. As time increased, the entropy peak value remained essentially unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of the solute concentration increased, resulting in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of a line source was higher than that of a point source, and the solute entropy for continuous input was higher than for instantaneous input. As the average lithofacies length increased, media continuity increased, flow and solute transport complexity weakened, and the corresponding information entropy also decreased. Longitudinal macrodispersivity declined slightly at early times and then rose. The spatial and temporal distribution of solute had significant impacts on the information entropy, and the entropy reflected changes in the solute distribution. Information entropy thus appears to be a useful tool for characterizing the spatial and temporal complexity of solute migration and provides a reference for future research.
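
    The two-dimensional entropy of a concentration field used in this kind of analysis can be sketched as below; treating each grid cell's normalized concentration as a probability is the usual convention, though the paper's exact discretization is an assumption here.

```python
import math

def concentration_entropy(conc):
    """Shannon entropy (nats) of a 2-D solute-concentration field,
    with cell concentrations normalized to a probability distribution."""
    total = sum(sum(row) for row in conc)
    h = 0.0
    for row in conc:
        for c in row:
            if c > 0:
                p = c / total
                h -= p * math.log(p)
    return h
```

    A plume concentrated in one cell has entropy 0; as dispersion spreads the solute over more cells, the entropy grows toward log of the number of occupied cells.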

  9. Informational basis of sensory adaptation: entropy and single-spike efficiency in rat barrel cortex.

    PubMed

    Adibi, Mehdi; Clifford, Colin W G; Arabzadeh, Ehsan

    2013-09-11

    We showed recently that exposure to whisker vibrations enhances coding efficiency in rat barrel cortex despite increasing correlations in variability (Adibi et al., 2013). Here, to understand how adaptation achieves this improvement in sensory representation, we decomposed the stimulus information carried in neuronal population activity into its fundamental components in the framework of information theory. In the context of sensory coding, these components are the entropy of the responses across the entire stimulus set (response entropy) and the entropy of the responses conditional on the stimulus (conditional response entropy). We found that adaptation decreased response entropy and conditional response entropy at both the level of single neurons and the pooled activity of neuronal populations. However, the net effect of adaptation was to increase the mutual information because the drop in the conditional entropy outweighed the drop in the response entropy. The information transmitted by a single spike also increased under adaptation. As population size increased, the information content of individual spikes declined but the relative improvement attributable to adaptation was maintained.
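
    The decomposition used in this abstract, I(S;R) = H(R) − H(R|S), can be sketched for discrete stimulus/response symbols; binning of real spike counts into such symbols is assumed.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a list of discrete symbols."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(stimuli, responses):
    """I(S;R) = H(R) - H(R|S): response entropy minus conditional
    response entropy, as in the abstract's decomposition."""
    h_r = entropy(responses)
    n = len(stimuli)
    h_r_given_s = 0.0
    for s, cnt in Counter(stimuli).items():
        rs = [r for si, r in zip(stimuli, responses) if si == s]
        h_r_given_s += (cnt / n) * entropy(rs)
    return h_r - h_r_given_s
```

    Adaptation can lower both terms; the mutual information still rises whenever the conditional entropy drops by more than the response entropy does.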

  10. Polarimetric Decomposition Analysis of the Deepwater Horizon Oil Slick Using L-Band UAVSAR Data

    NASA Technical Reports Server (NTRS)

    Jones, Cathleen; Minchew, Brent; Holt, Benjamin

    2011-01-01

    We report here an analysis of the polarization dependence of L-band radar backscatter from the main slick of the Deepwater Horizon oil spill, with specific attention to the utility of polarimetric decomposition analysis for discrimination of oil from clean water and identification of variations in the oil characteristics. For this study we used data collected with the UAVSAR instrument from opposing look directions directly over the main oil slick. We find that both the Cloude-Pottier and Shannon entropy polarimetric decomposition methods offer promise for oil discrimination, with the Shannon entropy method yielding the same information as contained in the Cloude-Pottier entropy and averaged intensity parameters, but with significantly less computational complexity.

  11. Multifractal characteristics of multiparticle production in heavy-ion collisions at SPS energies

    NASA Astrophysics Data System (ADS)

    Khan, Shaista; Ahmad, Shakeel

    Entropy, dimensions and other multifractal characteristics of the multiplicity distributions of relativistic charged hadrons produced in ion-ion collisions at SPS energies are investigated. The analysis of the experimental data is carried out in terms of the phase-space bin-size dependence of the multiplicity distributions, following Takagi's approach. A second method is also applied to study multifractality; it is not tied to the bin width or the detector resolution, but instead involves the multiplicity distribution of charged particles in full phase space in terms of information entropy and its generalization, Rényi's order-q information entropy. The findings reveal the presence of multifractal structure, a remarkable property of the fluctuations. Nearly constant values of the multifractal specific heat “c” estimated by the two methods of analysis indicate that the parameter “c” may be used as a universal characteristic of particle production in high-energy collisions. The results obtained from the analysis of the experimental data agree well with the predictions of the Monte Carlo model AMPT.
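
    Rényi's order-q information entropy, the generalization invoked above, has the standard form sketched below; it is applied here to a toy symbol list, not to the experimental multiplicity distributions.

```python
import math
from collections import Counter

def renyi_entropy(symbols, q):
    """Renyi order-q entropy (nats) of the empirical distribution of a
    symbol list; reduces to Shannon entropy in the limit q -> 1."""
    n = len(symbols)
    ps = [c / n for c in Counter(symbols).values()]
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in ps)   # Shannon limit
    return math.log(sum(p ** q for p in ps)) / (1.0 - q)
```

    For a uniform distribution the Rényi entropy is log of the alphabet size for every order q; departures of the q-dependence from flatness are what signal multifractality.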

  12. Spatial-dependence recurrence sample entropy

    NASA Astrophysics Data System (ADS)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

    Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in literature. Sample entropy is a popular measure for quantifying signal irregularity. However, the sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
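
    The sample entropy that this method builds on can be sketched in a few lines. Note this is a simplified variant: the canonical definition restricts both template counts to N−m vectors, whereas this sketch uses all available templates, so values differ slightly at small N.

```python
import math

def sample_entropy(xs, m=2, r=0.2):
    """Simplified sample entropy: -ln(A/B), where B counts pairs of
    length-m templates within Chebyshev distance r and A does the same
    for length m+1. Self-matches are excluded via i < j."""
    def count_matches(mm):
        templates = [xs[i:i + mm] for i in range(len(xs) - mm + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b)
```

    A perfectly regular signal keeps nearly all its length-m matches when extended to length m+1, giving a sample entropy near zero; irregular signals lose matches and score higher.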

  13. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  14. The Dynameomics Entropy Dictionary: A Large-Scale Assessment of Conformational Entropy across Protein Fold Space.

    PubMed

    Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie

    2017-04-27

    Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.
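
    A first-order histogram estimate of the entropy of a single dihedral angle, in units of the gas constant R, can be sketched as follows; the bin width and the neglect of correlations between dihedrals are simplifying assumptions, not the Dynameomics protocol.

```python
import math

def dihedral_entropy(angles, bin_width=10.0):
    """Conformational entropy (units of R) of one sampled dihedral angle,
    from a histogram over [-180, 180) with fixed-width bins (degrees)."""
    counts = {}
    for a in angles:
        b = int((a + 180.0) // bin_width)
        counts[b] = counts.get(b, 0) + 1
    n = len(angles)
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

    A dihedral locked in one rotamer well contributes zero entropy; one that samples all bins uniformly contributes the maximum, log of the bin count.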

  15. How hidden are hidden processes? A primer on crypticity and entropy convergence

    NASA Astrophysics Data System (ADS)

    Mahoney, John R.; Ellison, Christopher J.; James, Ryan G.; Crutchfield, James P.

    2011-09-01

    We investigate a stationary process's crypticity—a measure of the difference between its hidden state information and its observed information—using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. This is also a first step in developing applications in spatially extended and network dynamical systems.

  16. An automatic classifier of emotions built from entropy of noise.

    PubMed

    Ferreira, Jacqueline; Brás, Susana; Silva, Carlos F; Soares, Sandra C

    2017-04-01

    The electrocardiogram (ECG) signal has been widely used to study the physiological substrates of emotion. However, searching for better filtering techniques in order to obtain a signal with better quality and with the maximum relevant information remains an important issue for researchers in this field. Signal processing is largely performed for ECG analysis and interpretation, but this process can be susceptible to error in the delineation phase. In addition, it can lead to the loss of important information that is usually considered as noise and, consequently, discarded from the analysis. The goal of this study was to evaluate if the ECG noise allows for the classification of emotions, while using its entropy as an input in a decision tree classifier. We collected the ECG signal from 25 healthy participants while they were presented with videos eliciting negative (fear and disgust) and neutral emotions. The results indicated that the neutral condition showed a perfect identification (100%), whereas the classification of negative emotions indicated good identification performances (60% of sensitivity and 80% of specificity). These results suggest that the entropy of noise contains relevant information that can be useful to improve the analysis of the physiological correlates of emotion. © 2016 Society for Psychophysiological Research.

  17. Information theory and robotics meet to study predator-prey interactions

    NASA Astrophysics Data System (ADS)

    Neri, Daniele; Ruberto, Tommaso; Cord-Cruz, Gabrielle; Porfiri, Maurizio

    2017-07-01

    Transfer entropy holds promise to advance our understanding of animal behavior, by affording the identification of causal relationships that underlie animal interactions. A critical step toward the reliable implementation of this powerful information-theoretic concept entails the design of experiments in which causal relationships could be systematically controlled. Here, we put forward a robotics-based experimental approach to test the validity of transfer entropy in the study of predator-prey interactions. We investigate the behavioral response of zebrafish to a fear-evoking robotic stimulus, designed after the morpho-physiology of the red tiger oscar and actuated along preprogrammed trajectories. From the time series of the positions of the zebrafish and the robotic stimulus, we demonstrate that transfer entropy correctly identifies the influence of the stimulus on the focal subject. Building on this evidence, we apply transfer entropy to study the interactions between zebrafish and a live red tiger oscar. The analysis of transfer entropy reveals a change in the direction of the information flow, suggesting a mutual influence between the predator and the prey, where the predator adapts its strategy as a function of the movement of the prey, which, in turn, adjusts its escape as a function of the predator motion. Through the integration of information theory and robotics, this study posits a new approach to study predator-prey interactions in freshwater fish.
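
    Transfer entropy for discretized trajectories, with history length 1, can be sketched as below; symbolizing the continuous fish/robot positions into discrete states is assumed, and the history length is an illustrative choice.

```python
import math
from collections import Counter

def transfer_entropy(src, dst):
    """Discrete transfer entropy T(src -> dst), history length 1 (bits):
    sum over (y_next, y, x) of p * log2( p(y_next|y,x) / p(y_next|y) )."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((y, x) for _, y, x in triples)
    c_yy = Counter((yp, y) for yp, y, _ in triples)
    c_y = Counter(y for _, y, _ in triples)
    te = 0.0
    for (yp, y, x), c in c_yyx.items():
        p = c / n
        p_cond_joint = c / c_yx[(y, x)]
        p_cond = c_yy[(yp, y)] / c_y[y]
        te += p * math.log2(p_cond_joint / p_cond)
    return te
```

    When the target simply copies the source with a one-step lag, the transfer entropy in the driving direction approaches 1 bit, while the reverse direction stays near zero: exactly the kind of asymmetry used to infer who influences whom.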

  18. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-time numeral and pattern human memory, and the effect of stress on the dynamical tapping test); random dynamics of RR intervals in human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and the chaotic dynamics of the parameters of financial markets and ecological systems.

  19. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  20. Quantum Non-thermal Effect from Black Holes Surrounded by Quintessence

    NASA Astrophysics Data System (ADS)

    Gong, Tian-Xi; Wang, Yong-Jiu

    2009-11-01

    We present a short and direct derivation of Hawking radiation as a tunneling process across the horizon and compute the tunneling probability. Taking self-gravitation and energy conservation into account, we use the Keski-Vakkuri, Kraus, and Wilczek (KKW) analysis to compute the temperature and entropy of black holes surrounded by quintessence, and find that they differ from the Hawking temperature and the Bekenstein-Hawking entropy. This result offers a possible mechanism for addressing the information loss paradox, because the spectrum is not purely thermal.

  1. Information entropy of humpback whale songs.

    PubMed

    Suzuki, Ryuji; Buck, John R; Tyack, Peter L

    2006-03-01

    The structure of humpback whale (Megaptera novaeangliae) songs was examined using information theory techniques. The song is an ordered sequence of individual sound elements separated by gaps of silence. Song samples were converted into sequences of discrete symbols by both human and automated classifiers. This paper analyzes the song structure in these symbol sequences using information entropy estimators and autocorrelation estimators. Both parametric and nonparametric entropy estimators are applied to the symbol sequences representing the songs. The results provide quantitative evidence consistent with the hierarchical structure proposed for these songs by Payne and McVay [Science 173, 587-597 (1971)]. Specifically, this analysis demonstrates that: (1) There is a strong structural constraint, or syntax, in the generation of the songs, and (2) the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units. This implies that no empirical Markov model is capable of representing the songs' structure. The results are robust to the choice of either human or automated song-to-symbol classifiers. In addition, the entropy estimates indicate that the maximum amount of information that could be communicated by the sequence of sounds made is less than 1 bit per second.
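
    The nonparametric entropy estimation applied to such symbol sequences typically starts from block entropies and their differences; a minimal sketch on a toy sequence (the songs' actual symbol alphabets and estimators are not reproduced here):

```python
import math
from collections import Counter

def block_entropy(symbols, n):
    """Shannon entropy (bits) of length-n blocks of a symbol sequence."""
    blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    total = len(blocks)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(blocks).values())

def entropy_rate_estimate(symbols, n):
    """Conditional entropy H(X_n | X_1..X_{n-1}) = H_n - H_{n-1}."""
    return block_entropy(symbols, n) - block_entropy(symbols, n - 1)
```

    For a structured sequence the conditional entropy keeps dropping as the context n grows; long-range periodicities like those reported for the songs show up as drops at large n that no finite-order Markov model captures.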

  2. Fuzzy geometry, entropy, and image information

    NASA Technical Reports Server (NTRS)

    Pal, Sankar K.

    1991-01-01

    Presented here are various uncertainty measures arising from grayness ambiguity and spatial ambiguity in an image, and their possible applications as image information measures. Definitions are given of an image in the light of fuzzy set theory, and of information measures and tools relevant for processing/analysis, e.g., fuzzy geometrical properties, correlation, bound functions and entropy measures. Also given is a formulation of algorithms, along with management of uncertainties, for segmentation and object extraction, and edge detection. The output obtained is both fuzzy and nonfuzzy. Ambiguity in the evaluation and assessment of membership functions is also described.

  3. Naturalistic stimulation changes the dynamic response of action potential encoding in a mechanoreceptor

    PubMed Central

    Pfeiffer, Keram; French, Andrew S.

    2015-01-01

    Naturalistic signals were created from vibrations made by locusts walking on a Sansevieria plant. Both naturalistic and Gaussian noise signals were used to mechanically stimulate VS-3 slit-sense mechanoreceptor neurons of the spider, Cupiennius salei, with stimulus amplitudes adjusted to give similar firing rates for either stimulus. Intracellular microelectrodes recorded action potentials, receptor potential, and receptor current, using current clamp and voltage clamp. Frequency response analysis showed that naturalistic stimulation contained relatively more power at low frequencies, and caused increased neuronal sensitivity to higher frequencies. In contrast, varying the amplitude of Gaussian stimulation did not change neuronal dynamics. Naturalistic stimulation contained less entropy than Gaussian, but signal entropy was higher than stimulus in the resultant receptor current, indicating addition of uncorrelated noise during transduction. The presence of added noise was supported by measuring linear information capacity in the receptor current. Total entropy and information capacity in action potentials produced by either stimulus were much lower than in earlier stages, and limited to the maximum entropy of binary signals. We conclude that the dynamics of action potential encoding in VS-3 neurons are sensitive to the form of stimulation, but entropy and information capacity of action potentials are limited by firing rate. PMID:26578975

  4. Pressure transfer function of a JT15D nozzle due to acoustic and convected entropy fluctuations

    NASA Astrophysics Data System (ADS)

    Miles, J. H.

    An acoustic transmission matrix analysis of sound propagation in a variable-area duct with and without flow is extended to include convected entropy fluctuations. The boundary conditions used in the analysis are a transfer function relating entropy and pressure at the nozzle inlet and the nozzle exit impedance. The calculated nozzle pressure transfer function is compared with JT15D turbofan engine nozzle data. The one-dimensional theory for sound propagation in a variable-area nozzle with flow but without convected entropy is good at the low engine speeds where the nozzle exit Mach number is low (M=0.2) and the duct exit impedance model is good. The effect of convected entropy appears to be so negligible that it is obscured by the inaccuracy of the nozzle exit impedance model, the lack of information on the magnitude of the convected entropy and its phase relationship with the pressure, and the scatter in the data. An improved duct exit impedance model is required at the higher engine speeds where the nozzle exit Mach number is high (M=0.56) and at low frequencies (below 120 Hz).

  5. Information theory applications for biological sequence analysis.

    PubMed

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.

  6. Classifying the Quantum Phases of Matter

    DTIC Science & Technology

    2015-01-01

    Kim related entanglement entropy to topological storage of quantum information [8]. Michalakis et al. showed that a particle-like excitation spectrum...Perturbative analysis of topological entanglement entropy from conditional independence, Phys. Rev. B 86, 254116 (2012), arXiv:1210.2360. [3] I. Kim...symmetries or long-range entanglement ), (2) elucidating the properties of three-dimensional quantum codes (in particular those which admit no string-like

  7. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds, and provide an interpretation for the shape of the fractile-constrained maximum entropy distribution (FMED). We also discuss limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We then present a heuristic approximation for the case where, in addition to its fractiles, the distribution is known to be continuous, and work through full examples to illustrate the approach.
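
    The flat-over-each-interval shape of the FMED follows directly from maximum entropy: between two known fractiles, the least-informative density is uniform. A minimal sketch (function name and data layout are illustrative, not from the paper):

```python
def fmed_pieces(fractiles):
    """Maximum-entropy density given CDF points (x, F(x)):
    constant (uniform) over each fractile interval."""
    return [(x0, x1, (p1 - p0) / (x1 - x0))
            for (x0, p0), (x1, p1) in zip(fractiles, fractiles[1:])]

# Support [0, 10] with the median assessed at x = 2:
pieces = fmed_pieces([(0, 0.0), (2, 0.5), (10, 1.0)])
# -> [(0, 2, 0.25), (2, 10, 0.0625)]
```

    The density jump at x = 2 is exactly the discontinuity the paper criticizes, which motivates its continuous heuristic approximation.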

  8. Serotonergic Psychedelics Temporarily Modify Information Transfer in Humans

    PubMed Central

    Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miquel Àngel

    2015-01-01

    Background: Psychedelics induce intense modifications in the sensorium, the sense of “self,” and the experience of reality. Despite advances in our understanding of the molecular and cellular level mechanisms of these drugs, knowledge of their actions on global brain dynamics is still incomplete. Recent imaging studies have found changes in functional coupling between frontal and parietal brain structures, suggesting a modification in information flow between brain regions during acute effects. Methods: Here we assessed the psychedelic-induced changes in directionality of information flow during the acute effects of a psychedelic in humans. We measured modifications in connectivity of brain oscillations using transfer entropy, a nonlinear measure of directed functional connectivity based on information theory. Ten healthy male volunteers with prior experience with psychedelics participated in 2 experimental sessions. They received a placebo or a dose of ayahuasca, a psychedelic preparation containing the serotonergic 5-HT2A agonist N,N-dimethyltryptamine. Results: The analysis showed significant changes in the coupling of brain oscillations between anterior and posterior recording sites. Transfer entropy analysis showed that frontal sources decreased their influence over central, parietal, and occipital sites. Conversely, sources in posterior locations increased their influence over signals measured at anterior locations. Exploratory correlations found that anterior-to-posterior transfer entropy decreases were correlated with the intensity of subjective effects, while the imbalance between anterior-to-posterior and posterior-to-anterior transfer entropy correlated with the degree of incapacitation experienced. Conclusions: These results suggest that psychedelics induce a temporary disruption of neural hierarchies by reducing top-down control and increasing bottom-up information transfer in the human brain. PMID:25820842
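
    Transfer entropy, the directed-connectivity measure used in this study, can be illustrated with a plug-in estimator for symbolic (e.g. binarized) series. This is a toy sketch with order-1 histories and our own function names, not the authors' estimator:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, order-1 histories:
    how much x[t] reduces uncertainty about y[t+1] beyond y[t] alone."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_src = Counter(zip(y[:-1], x[:-1]))
    pairs_self = Counter(zip(y[1:], y[:-1]))
    hist = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_src[(y0, x0)]           # p(y1 | y0, x0)
        p_self = pairs_self[(y1, y0)] / hist[y0]   # p(y1 | y0)
        te += (c / n) * log2(p_full / p_self)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]   # y is x delayed by one step: x drives y
# TE(x -> y) is close to 1 bit; TE(y -> x) is close to zero.
```

    The asymmetry between the two directions is what lets the study distinguish top-down (anterior-to-posterior) from bottom-up information flow.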

  9. Signatures of Solvation Thermodynamics in Spectra of Intermolecular Vibrations

    PubMed Central

    2017-01-01

    This study explores the thermodynamic and vibrational properties of water in the three-dimensional environment of solvated ions and small molecules using molecular simulations. The spectrum of intermolecular vibrations in liquid solvents provides detailed information on the shape of the local potential energy surface, which in turn determines local thermodynamic properties such as the entropy. Here, we extract this information using a spatially resolved extension of the two-phase thermodynamics method to estimate hydration water entropies based on the local vibrational density of states (3D-2PT). Combined with an analysis of solute–water and water–water interaction energies, this allows us to resolve local contributions to the solvation enthalpy, entropy, and free energy. We use this approach to study effects of ions on their surrounding water hydrogen bond network, its spectrum of intermolecular vibrations, and resulting thermodynamic properties. In the three-dimensional environment of polar and nonpolar functional groups of molecular solutes, we identify distinct hydration water species and classify them by their characteristic vibrational density of states and molecular entropies. In each case, we are able to assign variations in local hydration water entropies to specific changes in the spectrum of intermolecular vibrations. This provides an important link for the thermodynamic interpretation of vibrational spectra that are accessible to far-infrared absorption and Raman spectroscopy experiments. Our analysis provides unique microscopic details regarding the hydration of hydrophobic and hydrophilic functional groups, which enable us to identify interactions and molecular degrees of freedom that determine relevant contributions to the solvation entropy and consequently the free energy. PMID:28783431

  10. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is the key component of many hydrological and hydraulic designs in which the dependence between rainfall and runoff needs to be studied. It is known that conventional bivariate distributions are often unable to model rainfall-runoff variables, because they either constrain the range of the dependence or impose a fixed form on the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, in two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data are collected from the small agricultural experimental watersheds in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the log-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that: (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff; and (2) the K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution can capture dependence structures that conventional bivariate joint distributions cannot. (Figure: joint rainfall-runoff entropy-based PDF, with the corresponding marginal PDFs and histograms, for the W12 watershed. Table: K-S test results and RMSE for the univariate distributions derived from the maximum-entropy joint distribution.)

  11. Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy

    DTIC Science & Technology

    2014-03-27

    Intrusion Detection...alarms (Rem)...Figure 25: TP% for...literature concerning the focus areas of this research. The focus areas include SCADA vulnerabilities, information theory, and intrusion detection

  12. New Fault Recognition Method for Rotary Machinery Based on Information Entropy and a Probabilistic Neural Network.

    PubMed

    Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu

    2018-01-24

    Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. To cope with the complexity of the vibration signal of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from vibration signals, namely singular spectrum entropy, power spectrum entropy, and approximate entropy. A feature fusion model is then constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of methods using the three kinds of information entropy separately. The new approach is thus shown to be an effective fault recognition method for rotating machinery.
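
    Of the three entropies fused by the method, approximate entropy is the most involved to compute. A compact, unoptimized (O(n^2)) sketch of Pincus-style ApEn follows; it is illustrative only, not the authors' implementation, and the default m and r are common conventions rather than the paper's settings:

```python
import random
from math import log

def approximate_entropy(u, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of signal u: m is the template
    length, r the matching tolerance. Low values mean regularity."""
    def phi(m):
        n = len(u) - m + 1
        templates = [u[i:i + m] for i in range(n)]
        logs = 0.0
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(p - q) for p, q in zip(a, b)) <= r)
            logs += log(c / n)
        return logs / n
    return phi(m) - phi(m + 1)

random.seed(2)
periodic = [0.0, 1.0] * 50                     # strictly regular signal
noisy = [random.random() for _ in range(100)]  # irregular signal
```

    A periodic vibration signature scores near zero, while an irregular one scores higher, which is why ApEn is a useful fault-sensitive feature.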

  13. EEG based topography analysis in string recognition task

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Vision perception and recognition is a complex process, during which different parts of the brain are involved depending on the specific modality of the vision target, e.g. face, character, or word. In this study, brain activity in a string recognition task, compared with an idle control state, is analyzed through topographies based on multiple measures, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in the string recognition task are significantly higher than those in the idle state, especially at locations P4, O2, T6 and C4. This implies that these regions are highly involved in the string recognition task. Since symbolic sample entropy measures complexity from the perspective of new information generation, while normalized rhythm power reveals the power distribution in the frequency domain, the two types of indices provide complementary information about the underlying dynamics.

  14. Time series analysis of the Antarctic Circumpolar Wave via symbolic transfer entropy

    NASA Astrophysics Data System (ADS)

    Oh, Mingi; Kim, Sehyun; Lim, Kyuseong; Kim, Soo Yong

    2018-06-01

    An attempt to interpret a large-scale climate phenomenon in the Southern Ocean (SO), the Antarctic Circumpolar Wave (ACW), has been made using an information entropy method, symbolic transfer entropy (STE). Over the 50-60∘S latitude belt, information flow is examined for four climate variables: sea surface temperature (SST), sea-ice edge (SIE), sea level pressure (SLP) and meridional wind speed (MWS). We found a tendency for eastward information flow to be preferred, but only for the oceanic variables; eastward propagation, as a wave making a circuit around Antarctica, is a main characteristic of the ACW. Since the ACW is a coherent pattern in both ocean and atmosphere, it is reasonable to infer that this tendency reflects the Antarctic Circumpolar Current (ACC) encircling Antarctica, rather than being evidence of the ACW. We observed one feature common to all four variables, a strong information flow over the eastern Pacific Ocean, which suggests a signature of the El Niño Southern Oscillation (ENSO).

  15. Low-dimensional approximation searching strategy for transfer entropy from non-uniform embedding

    PubMed Central

    2018-01-01

    Transfer entropy from non-uniform embedding is a popular tool for the inference of causal relationships among dynamical subsystems. In this study we present an approach that makes use of low-dimensional conditional mutual information quantities to decompose the original high-dimensional conditional mutual information in the searching procedure of non-uniform embedding for significant variables at different lags. We perform a series of simulation experiments to assess the sensitivity and specificity of our proposed method to demonstrate its advantage compared to previous algorithms. The results provide concrete evidence that low-dimensional approximations can help to improve the statistical accuracy of transfer entropy in multivariate causality analysis and yield a better performance over other methods. The proposed method is especially efficient as the data length grows. PMID:29547669

  16. The dynamics of information-driven coordination phenomena: A transfer entropy analysis

    PubMed Central

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-01-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875

  17. The dynamics of information-driven coordination phenomena: A transfer entropy analysis.

    PubMed

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-04-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.

  18. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
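
    One plausible reading of the structure-scale search can be sketched as follows. This is our own simplification (binary up/down symbols, entropy per symbol), not the authors' exact estimator:

```python
from collections import Counter
from math import log2

def pattern_entropy(series, scale):
    """Shannon entropy (bits) of up/down patterns of length `scale`."""
    signs = ['u' if b > a else 'd' for a, b in zip(series, series[1:])]
    windows = [''.join(signs[i:i + scale])
               for i in range(len(signs) - scale + 1)]
    counts = Counter(windows)
    n = len(windows)
    return -sum(c / n * log2(c / n) for c in counts.values())

def structure_scale(series, max_scale=6):
    """MEDM-style sketch: the scale minimizing entropy per symbol,
    i.e. where the uncertainty is lowest and the pattern clearest."""
    return min(range(1, max_scale + 1),
               key=lambda s: pattern_entropy(series, s) / s)
```

    For a strictly alternating series the per-symbol uncertainty keeps falling as the window grows, so the longest tested scale wins; a trending series already has zero pattern entropy at every scale.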

  19. Automated EEG entropy measurements in coma, vegetative state/unresponsive wakefulness syndrome and minimally conscious state

    PubMed Central

    Gosseries, Olivia; Schnakers, Caroline; Ledoux, Didier; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurélie; Demertzi, Athéna; Noirhomme, Quentin; Lehembre, Rémy; Damas, Pierre; Goldman, Serge; Peeters, Erika; Moonen, Gustave; Laureys, Steven

    Summary Monitoring the level of consciousness in brain-injured patients with disorders of consciousness is crucial as it provides diagnostic and prognostic information. Behavioral assessment remains the gold standard for assessing consciousness but previous studies have shown a high rate of misdiagnosis. This study aimed to investigate the usefulness of electroencephalography (EEG) entropy measurements in differentiating unconscious (coma or vegetative) from minimally conscious patients. Left fronto-temporal EEG recordings (10-minute resting state epochs) were prospectively obtained in 56 patients and 16 age-matched healthy volunteers. Patients were assessed in the acute (≤1 month post-injury; n=29) or chronic (>1 month post-injury; n=27) stage. The etiology was traumatic in 23 patients. Automated online EEG entropy calculations (providing an arbitrary value ranging from 0 to 91) were compared with behavioral assessments (Coma Recovery Scale-Revised) and outcome. EEG entropy correlated with Coma Recovery Scale total scores (r=0.49). Mean EEG entropy values were higher in minimally conscious (73±19; mean and standard deviation) than in vegetative/unresponsive wakefulness syndrome patients (45±28). Receiver operating characteristic analysis revealed an entropy cut-off value of 52 differentiating acute unconscious from minimally conscious patients (sensitivity 89% and specificity 90%). In chronic patients, entropy measurements offered no reliable diagnostic information. EEG entropy measurements did not allow prediction of outcome. User-independent time-frequency balanced spectral EEG entropy measurements seem to constitute an interesting diagnostic – albeit not prognostic – tool for assessing neural network complexity in disorders of consciousness in the acute setting. 
Future studies are needed before using this tool in routine clinical practice, and these should seek to improve automated EEG quantification paradigms in order to reduce the remaining false negative and false positive findings. PMID:21693085

  20. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.

  1. Entropy: Order or Information

    ERIC Educational Resources Information Center

    Ben-Naim, Arieh

    2011-01-01

    Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)

  2. Entropy of measurement and erasure: Szilard's membrane model revisited

    NASA Astrophysics Data System (ADS)

    Leff, Harvey S.; Rex, Andrew F.

    1994-11-01

    It is widely believed that measurement is accompanied by irreversible entropy increase. This conventional wisdom is based in part on Szilard's 1929 study of entropy decrease in a thermodynamic system by intelligent intervention (i.e., a Maxwell's demon) and Brillouin's association of entropy with information. Bennett subsequently argued that information acquisition is not necessarily irreversible, but information erasure must be dissipative (Landauer's principle). Inspired by the ensuing debate, we revisit the membrane model introduced by Szilard and find that it can illustrate and clarify (1) reversible measurement, (2) information storage, (3) decoupling of the memory from the system being measured, and (4) entropy increase associated with memory erasure and resetting.
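
    Landauer's principle, on which the debate above turns, puts a concrete number on erasure: at least kT ln 2 of heat must be dissipated per erased bit. A quick illustrative calculation:

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)

def landauer_limit(temperature_k, bits=1.0):
    """Minimum heat (joules) dissipated into a reservoir at the given
    temperature when erasing `bits` of stored information."""
    return bits * K_B * temperature_k * log(2)

# Erasing one bit at room temperature (300 K) costs about 2.9e-21 J.
```

    The number is minuscule on human scales, which is consistent with the observation that memory erasure, not measurement, is where the demon's entropy bookkeeping is settled.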

  3. Mapping entropy: Analysis of population-environment dynamics using integrated remote sensing and transition theory based on a general systems perspective

    NASA Astrophysics Data System (ADS)

    de La Sierra, Ruben Ulises

    The present study introduces entropy mapping as a comprehensive method to analyze and describe complex interactive systems, and to assess the effect that entropy has on paradigm changes as described by transition theory. Dynamics of interactions among environmental, economic and demographic conditions affect a number of fast-growing locations throughout the world. One region especially affected by accelerated demographic and economic growth is the border region between Mexico and the US. As the contrast between these countries provides a significant economic and cultural differential, the dynamics of capital, goods, services and people, and the rates at which they interact, are rather unique. To illustrate the most fundamental economic and political changes affecting the region, a background is presented addressing the causes of these changes, which led to the North American Free Trade Agreement (NAFTA). Although the concept of thermodynamic entropy was first observed in the physical sciences, a relevant homology exists in the biological, social and economic sciences, as the universal tendency towards disorder, dissipation and equilibrium is present in these disciplines when energy or resources become deficient. Furthermore, in information theory entropy is expressed as uncertainty and randomness, in terms of the efficiency of information transmission. Although entropy in closed systems is unavoidable, its increase in open systems can be arrested by a flux of energy, resources and/or information. A critical component of all systems is the boundary. If a boundary is impermeable, it will prevent energy flow from the environment into the system; likewise, if the boundary is too porous, it will not be able to prevent the dissipation of energy and resources into the environment, nor prevent entropy from entering. 
Therefore, two expressions of entropy--thermodynamic and information--are identified and related to systems in transition and to spatial distribution. These expressions are used to identify causes and trends leading to growth or disorder.

  4. Conformational Entropy of Intrinsically Disordered Proteins from Amino Acid Triads

    PubMed Central

    Baruah, Anupaul; Rani, Pooja; Biswas, Parbati

    2015-01-01

    This work quantitatively characterizes intrinsic disorder in proteins in terms of sequence composition and backbone conformational entropy. Analysis of the normalized relative composition of the amino acid triads highlights a distinct boundary between globular and disordered proteins. The conformational entropy is calculated from the dihedral angles of the middle amino acid in the amino acid triad for the conformational ensemble of the globular, partially and completely disordered proteins relative to the non-redundant database. Both Monte Carlo (MC) and Molecular Dynamics (MD) simulations are used to characterize the conformational ensemble of the representative proteins of each group. The results show that the globular proteins span approximately half of the allowed conformational states in the Ramachandran space, while the amino acid triads in disordered proteins sample the entire range of the allowed dihedral angle space following Flory’s isolated-pair hypothesis. Therefore, only the sequence information in terms of the relative amino acid triad composition may be sufficient to predict protein disorder and the backbone conformational entropy, even in the absence of well-defined structure. The predicted entropies are found to agree with those calculated using mutual information expansion and the histogram method. PMID:26138206
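
    The backbone conformational entropy here is estimated from dihedral-angle distributions. The histogram estimator below is a toy sketch of that idea only: it is not the paper's mutual-information-expansion machinery, the bin count is an arbitrary choice, and the two synthetic "ensembles" merely stand in for disordered versus globular torsions:

```python
import random
from collections import Counter
from math import log, pi

def dihedral_entropy(angles, bins=36):
    """Histogram estimate of the differential entropy (nats) of a
    torsion-angle distribution on (-pi, pi]."""
    width = 2 * pi / bins
    counts = Counter(int((a + pi) // width) for a in angles)
    n = len(angles)
    discrete = -sum(c / n * log(c / n) for c in counts.values())
    return discrete + log(width)  # correct for bin width

random.seed(1)
disordered = [random.uniform(-pi, pi) for _ in range(20000)]  # free rotation
ordered = [random.gauss(0.0, 0.05) for _ in range(20000)]     # one tight well
```

    A uniformly sampled (disordered) angle approaches the maximal ln 2π ≈ 1.84 nats, while a tightly clustered (ordered) one scores far lower, mirroring the disordered-versus-globular contrast the paper quantifies.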

  5. Analysis of crude oil markets with improved multiscale weighted permutation entropy

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Liu, Cheng

    2018-03-01

    Entropy measures have recently been used extensively to study the complexity properties of nonlinear systems. Weighted permutation entropy (WPE) overcomes the neglect of amplitude information in time series compared with PE, and shows a distinctive ability to extract complexity information from data having abrupt changes in magnitude. The improved (sometimes called composite) multi-scale (MS) method has the advantage of reducing errors and improving accuracy when evaluating the multiscale entropy values of time series that are not sufficiently long. In this paper, we combine the merits of WPE and improved MS to propose the improved multiscale weighted permutation entropy (IMWPE) method for investigating the complexity of a time series. It is validated as effective on artificial data, white noise and 1/f noise, and on real market data for Brent and Daqing crude oil. The complexity properties of the crude oil markets are then explored for the return series, for volatility series with multiple exponents, and for the EEMD-produced intrinsic mode functions (IMFs), which represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed via the Hilbert transform applied to each IMF.
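
    The core of WPE, weighting each ordinal pattern by the local variance so that amplitude information is not discarded, fits in a short function. This is a single-scale sketch under common conventions; the improved multiscale layer and the paper's parameter choices are omitted:

```python
import random
from collections import defaultdict
from math import log2

def weighted_permutation_entropy(x, order=3):
    """Weighted permutation entropy (bits): ordinal patterns of length
    `order`, each weighted by the variance of its embedding vector."""
    weights = defaultdict(float)
    total = 0.0
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        mean = sum(window) / order
        w = sum((v - mean) ** 2 for v in window) / order  # local variance
        weights[pattern] += w
        total += w
    return -sum(w / total * log2(w / total)
                for w in weights.values() if w > 0)

random.seed(3)
noise = [random.random() for _ in range(3000)]  # irregular: high entropy
trend = list(range(100))                        # one ordinal pattern only
```

    Because large-amplitude windows dominate the weights, a burst of volatility moves WPE where plain permutation entropy barely reacts, which is the property exploited for crude-oil series.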

  6. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which can be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.

  7. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co...Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and...22308 Chemical sensing Information theory Spectral data Information entropy Information divergence Mass spectrometry Infrared spectroscopy Multisensor

  8. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel et al.'s symbolic entropy and of base-scale entropy, and two local dynamic ones, the symbolizations of permutation entropy and of differential entropy, constitute four double symbolic joint entropies that detect complexity accurately in chaotic model series (the logistic and Henon maps). In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by the double symbolic joint entropy, with the combination of base-scale and differential symbolizations performing best. The test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
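
    The pairing of one global static symbolization with one local dynamic one can be sketched as follows. The particular pair here (above/below the series median, and the sign of the first difference) is our own illustrative choice, not necessarily the transformations used in the paper:

```python
import random
from collections import Counter
from math import log2

def double_symbolic_joint_entropy(x):
    """Joint Shannon entropy (bits) of a global static symbol (above or
    below the series median) and a local dynamic symbol (sign of the
    first difference), taken pointwise."""
    med = sorted(x)[len(x) // 2]
    static = [1 if v > med else 0 for v in x[1:]]
    dynamic = [1 if b > a else 0 for a, b in zip(x, x[1:])]
    counts = Counter(zip(static, dynamic))
    n = len(static)
    return -sum(c / n * log2(c / n) for c in counts.values())

random.seed(4)
noise = [random.random() for _ in range(2000)]  # irregular series
ramp = list(range(2000))                        # monotone series
```

    The joint distribution over symbol pairs captures information that neither symbolization sees alone, which is the motivation for combining them.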

  9. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S.

    1989-05-15

    Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value.

  10. Correlation as a Determinant of Configurational Entropy in Supramolecular and Protein Systems

    PubMed Central

    2015-01-01

    For biomolecules in solution, changes in configurational entropy are thought to contribute substantially to the free energies of processes like binding and conformational change. In principle, the configurational entropy can be strongly affected by pairwise and higher-order correlations among conformational degrees of freedom. However, the literature offers mixed perspectives regarding the contributions that changes in correlations make to changes in configurational entropy for such processes. Here we take advantage of powerful techniques for simulation and entropy analysis to carry out rigorous in silico studies of correlation in binding and conformational changes. In particular, we apply information-theoretic expansions of the configurational entropy to well-sampled molecular dynamics simulations of a model host–guest system and the protein bovine pancreatic trypsin inhibitor. The results bear on the interpretation of NMR data, as they indicate that changes in correlation are important determinants of entropy changes for biologically relevant processes and that changes in correlation may either balance or reinforce changes in first-order entropy. The results also highlight the importance of main-chain torsions as contributors to changes in protein configurational entropy. As simulation techniques grow in power, the mathematical techniques used here will offer new opportunities to answer challenging questions about complex molecular systems. PMID:24702693

  11. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy

    PubMed Central

    2011-01-01

    Background Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information-theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows, for example, the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametric statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans), where a neuronal one-way connection is likely present. Results In simulated data, TE detected information flow in the simulated direction reliably, with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse direction were detected. Conclusions TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information-theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox. PMID:22098775

  12. Serotonergic psychedelics temporarily modify information transfer in humans.

    PubMed

    Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miquel Àngel; Riba, Jordi

    2015-03-28

    Psychedelics induce intense modifications in the sensorium, the sense of "self," and the experience of reality. Despite advances in our understanding of the molecular and cellular level mechanisms of these drugs, knowledge of their actions on global brain dynamics is still incomplete. Recent imaging studies have found changes in functional coupling between frontal and parietal brain structures, suggesting a modification in information flow between brain regions during acute effects. Here we assessed the psychedelic-induced changes in directionality of information flow during the acute effects of a psychedelic in humans. We measured modifications in connectivity of brain oscillations using transfer entropy, a nonlinear measure of directed functional connectivity based on information theory. Ten healthy male volunteers with prior experience with psychedelics participated in 2 experimental sessions. They received a placebo or a dose of ayahuasca, a psychedelic preparation containing the serotonergic 5-HT2A agonist N,N-dimethyltryptamine. The analysis showed significant changes in the coupling of brain oscillations between anterior and posterior recording sites. Transfer entropy analysis showed that frontal sources decreased their influence over central, parietal, and occipital sites. Conversely, sources in posterior locations increased their influence over signals measured at anterior locations. Exploratory correlations found that anterior-to-posterior transfer entropy decreases were correlated with the intensity of subjective effects, while the imbalance between anterior-to-posterior and posterior-to-anterior transfer entropy correlated with the degree of incapacitation experienced. These results suggest that psychedelics induce a temporary disruption of neural hierarchies by reducing top-down control and increasing bottom-up information transfer in the human brain. © The Author 2015. Published by Oxford University Press on behalf of CINP.

  13. TRENTOOL: a Matlab open source toolbox to analyse information flow in time series data with transfer entropy.

    PubMed

    Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael

    2011-11-18

    Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information-theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows, for example, the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametric statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans), where a neuronal one-way connection is likely present. In simulated data, TE detected information flow in the simulated direction reliably, with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse direction were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information-theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.
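
    The transfer entropy that TRENTOOL estimates with sophisticated nearest-neighbour techniques reduces, for discretized data with history length 1, to a conditional mutual information. A toy plug-in estimator (our own sketch, not TRENTOOL's algorithm) might look like:

    ```python
    from collections import Counter
    import math
    import random

    def transfer_entropy(x, y):
        """Plug-in estimate of TE(y -> x) in bits, history length 1."""
        triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_now, y_now)
        pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_next, x_now)
        pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_now, y_now)
        singles = Counter(x[:-1])                       # x_now
        n = len(x) - 1
        te = 0.0
        for (xn, xc, yc), c in triples.items():
            p_joint = c / n
            p_cond_both = c / pairs_xy[(xc, yc)]            # p(x_next | x_now, y_now)
            p_cond_self = pairs_xx[(xn, xc)] / singles[xc]  # p(x_next | x_now)
            te += p_joint * math.log2(p_cond_both / p_cond_self)
        return te

    # y drives x with a one-step lag, so TE(y -> x) should be near 1 bit
    # while TE(x -> y) should stay near zero.
    random.seed(0)
    y = [random.randint(0, 1) for _ in range(5000)]
    x = [0] + [y[i - 1] for i in range(1, 5000)]
    ```

    Real estimators must additionally handle continuous data, longer histories and finite-sample bias, which is exactly the complexity the toolbox manages for the user.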

  14. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…

  15. Modified cross sample entropy and surrogate data analysis method for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-09-01

    For researching multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine surrogate data analysis with it in order to compute entropy differences between original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to show accuracy and then employed to US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that there are synchrony containing in the original financial time series and they have some intrinsic relations, which are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of cross-correlations between these financial time series are investigated by multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of cross-correlation between these US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own region. It can be concluded that the weaker cross-correlation between US markets gives the evidence for the better inner mechanism in the US stock markets than that of Chinese stock markets. To study the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.

  16. Refined composite multivariate generalized multiscale fuzzy entropy: A tool for complexity analysis of multichannel signals

    NASA Astrophysics Data System (ADS)

    Azami, Hamed; Escudero, Javier

    2017-01-01

    Multiscale entropy (MSE) is an appealing tool to characterize the complexity of time series over multiple temporal scales. Recent developments in the field have tried to extend the MSE technique in different ways. Building on these trends, we propose the so-called refined composite multivariate multiscale fuzzy entropy (RCmvMFE) whose coarse-graining step uses variance (RCmvMFEσ2) or mean (RCmvMFEμ). We investigate the behavior of these multivariate methods on multichannel white Gaussian and 1/f noise signals, and two publicly available biomedical recordings. Our simulations demonstrate that RCmvMFEσ2 and RCmvMFEμ lead to more stable results and are less sensitive to the signals' length in comparison with the other existing multivariate multiscale entropy-based methods. The classification results also show that using both the variance and mean in the coarse-graining step offers complexity profiles with complementary information for biomedical signal analysis. We also made freely available all the Matlab codes used in this paper.
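
    The coarse-graining step described above, collapsing non-overlapping windows by either their mean (RCmvMFEμ) or their variance (RCmvMFEσ2), can be sketched as follows (a generic illustration of coarse-graining, not the authors' Matlab code):

    ```python
    import statistics

    def coarse_grain(x, scale, stat="mean"):
        """Collapse non-overlapping windows of length `scale` into one value each."""
        windows = [x[i:i + scale] for i in range(0, len(x) - scale + 1, scale)]
        if stat == "mean":
            return [statistics.fmean(w) for w in windows]
        if stat == "variance":
            return [statistics.pvariance(w) for w in windows]
        raise ValueError(stat)

    x = [1.0, 3.0, 2.0, 6.0, 4.0, 4.0]
    coarse_grain(x, 2, "mean")       # -> [2.0, 4.0, 4.0]
    coarse_grain(x, 2, "variance")   # -> [1.0, 4.0, 0.0]
    ```

    An entropy measure (fuzzy entropy in the paper) is then applied to the coarse-grained series at each scale; mean and variance capture complementary aspects of the windows, which is the point made in the abstract.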

  17. The Shannon entropy information for mixed Manning Rosen potential in D-dimensional Schrodinger equation

    NASA Astrophysics Data System (ADS)

    Suparmi, A.; Cari, C.; Nur Pratiwi, Beta; Arya Nugraha, Dewanta

    2017-01-01

    The D-dimensional Schrodinger equation for the mixed Manning-Rosen potential was investigated using supersymmetric quantum mechanics. We obtained the energy eigenvalues from the radial solution, and wavefunctions from the radial and angular solutions. From the lowest radial wavefunctions, we evaluated the Shannon entropy information using Matlab software. The entropy densities, demonstrated graphically, show that the position information entropy density wave moves right as the potential parameter q increases, while it moves left as the parameter α increases. The momentum information entropy densities were also expressed in graphs; their amplitude increases with increasing q and α.

  18. Entropy and Information: A Multidisciplinary Overview.

    ERIC Educational Resources Information Center

    Shaw, Debora; Davis, Charles H.

    1983-01-01

    Cites representative extensions of concept of entropy (measure of the amount of energy unavailable for useful work; from the second law of thermodynamics) noting basic relationships between entropy, order, information, and meaning in such disciplines as biology, economics, information science, the arts, and religion. Seventy-eight references are…

  19. Measuring Gaussian quantum information and correlations using the Rényi entropy of order 2.

    PubMed

    Adesso, Gerardo; Girolami, Davide; Serafini, Alessio

    2012-11-09

    We demonstrate that the Rényi-2 entropy provides a natural measure of information for any multimode Gaussian state of quantum harmonic systems, operationally linked to the phase-space Shannon sampling entropy of the Wigner distribution of the state. We prove that, in the Gaussian scenario, such an entropy satisfies the strong subadditivity inequality, a key requirement for quantum information theory. This allows us to define and analyze measures of Gaussian entanglement and more general quantum correlations based on such an entropy, which are shown to satisfy relevant properties such as monogamy.

  20. Randomness in denoised stock returns: The case of Moroccan family business companies

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2018-02-01

    In this paper, we scrutinize entropy in family business stocks listed on the Casablanca stock exchange and in the market index to assess randomness in their returns. For this purpose, we adopt a novel approach based on a combination of the stationary wavelet transform and Tsallis entropy for empirical analysis of the return series. The obtained empirical results show strong evidence that the respective entropy functions are characterized by opposite dynamics. Indeed, the information contents of their respective dynamics are statistically significantly different. Information on regular events carried by family business returns is more certain, whilst that carried by market returns is uncertain. Such results are useful for understanding the nonlinear dynamics of returns on family business companies and those of the market, and could be helpful for quantitative portfolio managers and investors.
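
    The Tsallis entropy used in this record generalizes Shannon entropy with a parameter q: for a discrete distribution, S_q = (1 - Σ p_i^q)/(q - 1), recovering the Shannon value as q → 1. A minimal sketch of the definition (our own illustration, not the paper's wavelet-based pipeline):

    ```python
    import math

    def tsallis_entropy(p, q):
        """Tsallis entropy S_q of a discrete probability vector p (q != 1)."""
        return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

    def shannon_entropy(p):
        """Shannon entropy in nats, the q -> 1 limit of S_q."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.25]
    tsallis_entropy(p, 2)     # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
    # As q -> 1, S_q approaches the Shannon value for the same distribution.
    ```

    Choosing q tunes how strongly rare versus frequent events contribute, which is what makes the measure attractive for heavy-tailed financial returns.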

  1. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal

    PubMed Central

    Mohapatra, Biswajit

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis. PMID:29854361

  2. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal.

    PubMed

    Nayak, Suraj K; Bit, Arindam; Dey, Anilesh; Mohapatra, Biswajit; Pal, Kunal

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis.

  3. Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep.

    PubMed

    Lucchini, Maristella; Pini, Nicolò; Fifer, William P; Burtchen, Nina; Signorini, Maria G

    2017-05-01

    Sleep is a central activity in human adults and characterizes most of newborn infant life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. Mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. Signal processing approaches have focused on cardiorespiratory analysis to elucidate this co-regulation. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to ventricular depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power is more characteristic of quiet sleep. Entropy analysis provides higher SampEn and Quadratic Sample Entropy (QSE) indices in quiet sleep. Transfer Entropy values were also higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature. Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities.
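
    Sample entropy (SampEn), one of the indices used in this record, counts how often templates of length m that match within tolerance r still match at length m+1, and reports -ln(A/B). A compact sketch under the standard definition (not the authors' code; here r is an absolute tolerance rather than a fraction of the standard deviation):

    ```python
    import math
    import random

    def sample_entropy(x, m=2, r=0.2):
        """SampEn = -ln(A/B): B counts template pairs of length m matching
        within Chebyshev distance r; A counts the same at length m + 1."""
        def count_matches(length):
            templates = [x[i:i + length] for i in range(len(x) - length)]
            hits = 0
            for i in range(len(templates)):
                for j in range(i + 1, len(templates)):
                    if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                        hits += 1
            return hits
        a, b = count_matches(m + 1), count_matches(m)
        return -math.log(a / b)

    random.seed(1)
    regular = [0, 1] * 50                            # perfectly periodic
    irregular = [random.random() for _ in range(100)]
    # Lower SampEn means more regular: the periodic series scores near 0,
    # the random one scores much higher.
    ```

    The same counting idea underlies QSE, which adjusts SampEn for the tolerance r to make values comparable across signals.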

  4. Investigating dynamical complexity in the magnetosphere using various entropy measures

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open, spatially extended, nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) for investigating dynamical complexity in the magnetosphere. We show that as a magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet-time to the storm-time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, our results suggest an important principle: a significant decrease in complexity and an increase in persistency in the Dst time series can be confirmed as the magnetic storm approaches, and can be used as diagnostic tools for magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison with the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can prove convenient for space weather applications.

  5. Entanglement entropy and mutual information production rates in acoustic black holes.

    PubMed

    Giovanazzi, Stefano

    2011-01-07

    A method to investigate acoustic Hawking radiation is proposed, where entanglement entropy and mutual information are measured from the fluctuations of the number of particles. The rate of entropy radiated per one-dimensional (1D) channel is given by S=κ/12, where κ is the sound acceleration on the sonic horizon. This entropy production is accompanied by a corresponding formation of mutual information to ensure the overall conservation of information. The predictions are confirmed using an ab initio analytical approach in transonic flows of 1D degenerate ideal Fermi fluids.

  6. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.

  7. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  8. Complex Networks/Foundations of Information Systems

    DTIC Science & Technology

    2013-03-06

    the benefit of feedback or dynamic correlations in coding and protocol. Using Renyi correlation analysis and entropy to model this wider class of...dynamic heterogeneous conditions. Lizhong Zheng, MIT Renyi Channel Correlation Analysis (connected to geometric curvature) Network Channel

  9. Fusion information entropy method of rolling bearing fault diagnosis based on n-dimensional characteristic parameter distance

    NASA Astrophysics Data System (ADS)

    Ai, Yan-Ting; Guan, Jiao-Yue; Fei, Cheng-Wei; Tian, Jing; Zhang, Feng-Ling

    2017-05-01

    To monitor rolling bearing operating status with casings in real time, efficiently and accurately, a fusion method based on n-dimensional characteristic parameter distance (n-DCPD) was proposed for rolling bearing fault diagnosis with two types of signals: vibration and acoustic emission. The n-DCPD was investigated based on four information entropies (singular spectrum entropy in the time domain, power spectrum entropy in the frequency domain, and wavelet space characteristic spectrum entropy and wavelet energy spectrum entropy in the time-frequency domain), and the basic idea of the fusion information entropy fault diagnosis method with n-DCPD was given. On a rotor simulation test rig, the vibration and acoustic emission signals of six rolling bearing conditions (ball fault, inner race fault, outer race fault, inner-ball faults, inner-outer faults and normal) were collected under different operating conditions, with emphasis on rotation speeds from 800 rpm to 2000 rpm. With the proposed fusion information entropy method with n-DCPD, the diagnosis of rolling bearing faults was completed. The fault diagnosis results show that the fusion entropy method achieves high precision in the recognition of rolling bearing faults. The efforts of this study provide a novel and useful methodology for the fault diagnosis of an aeroengine rolling bearing.
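
    Of the four information entropies listed above, the power spectrum entropy is the simplest to sketch: normalize the power spectrum to a probability distribution and take its Shannon entropy (a generic illustration using a direct DFT, not the authors' implementation):

    ```python
    import cmath
    import math

    def power_spectrum_entropy(x):
        """Shannon entropy (bits) of the normalized one-sided DFT power spectrum."""
        n = len(x)
        power = []
        for k in range(n // 2 + 1):
            coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            power.append(abs(coeff) ** 2)
        total = sum(power)
        probs = [p / total for p in power if p > 0]
        return -sum(p * math.log2(p) for p in probs)

    # A pure tone concentrates power in one bin (low spectral entropy);
    # a broadband signal spreads power across bins (higher spectral entropy).
    tone = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
    broadband = [((t * 37) % 64) / 64 - 0.5 for t in range(64)]
    ```

    A healthy bearing tends toward a few strong spectral lines, while faults broaden the spectrum, which is why this entropy is useful as a fault feature.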

  10. Large deviation analysis of a simple information engine

    NASA Astrophysics Data System (ADS)

    Maitland, Michael; Grosskinsky, Stefan; Harris, Rosemary J.

    2015-11-01

    Information thermodynamics provides a framework for studying the effect of feedback loops on entropy production. It has enabled the understanding of novel thermodynamic systems such as the information engine, which can be seen as a modern version of "Maxwell's Dæmon," whereby a feedback controller processes information gained by measurements in order to extract work. Here, we analyze a simple model of such an engine that uses feedback control based on measurements to obtain negative entropy production. We focus on the distribution and fluctuations of the information obtained by the feedback controller. Significantly, our model allows an analytic treatment for a two-state system with exact calculation of the large deviation rate function. These results suggest an approximate technique for larger systems, which is corroborated by simulation data.

  11. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    PubMed Central

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118

  12. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    NASA Astrophysics Data System (ADS)

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging.
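
    The parametric imaging idea in the two records above, an entropy value computed in a small window slid across the envelope image, can be sketched generically as follows (our illustration; the window size, histogram choice and entropy estimator differ from the paper's):

    ```python
    import math

    def window_entropy(window, bins=8):
        """Shannon entropy (bits) of a histogram of the values in one window."""
        lo, hi = min(window), max(window)
        if hi == lo:
            return 0.0
        counts = [0] * bins
        for v in window:
            counts[min(int((v - lo) / (hi - lo) * bins), bins - 1)] += 1
        n = len(window)
        return -sum(c / n * math.log2(c / n) for c in counts if c)

    def entropy_map(image, w=3):
        """Slide a w x w window over a 2D list; map each position to its entropy."""
        rows, cols = len(image), len(image[0])
        out = []
        for i in range(rows - w + 1):
            row = []
            for j in range(cols - w + 1):
                window = [image[i + a][j + b] for a in range(w) for b in range(w)]
                row.append(window_entropy(window))
            out.append(row)
        return out

    flat = [[1.0] * 5 for _ in range(5)]                                  # uniform region
    textured = [[float((3 * i + 5 * j) % 7) for j in range(5)] for i in range(5)]
    ```

    A uniform region maps to zero entropy while a textured region maps to a positive value; the paper's contribution is showing that such maps remain informative even when the window is as small as one pulse length.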

  13. Entropy information of heart rate variability and its power spectrum during day and night

    NASA Astrophysics Data System (ADS)

    Jin, Li; Jun, Wang

    2013-07-01

    Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and power spectral distribution. Moreover, the base-scale entropy changes reflect corresponding changes in autonomic nerve outflow. With the suppression of vagal tone and dominance of sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the higher base-scale entropy belongs to the CHF subjects. With the decrease of sympathetic tone, and respiratory sinus arrhythmia (RSA) becoming more pronounced with slower breathing during sleep, base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups have the same diurnal/nocturnal trend as the CHF series. The fluctuation dynamics of the data in the three groups can be described as an “HF effect”.

  14. Regional Sustainable Development Analysis Based on Information Entropy—Sichuan Province as an Example

    PubMed Central

    Liang, Xuedong; Si, Dongyang; Zhang, Xinli

    2017-01-01

From a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, yet regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, comprising an economic subsystem, an ecological environmental subsystem and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measurement model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and the entropy change in each calendar year in Sichuan Province were analyzed to evaluate the province's sustainable development capacity. It was found that the established model could effectively track actual changes in sustainable development levels through the entropy changes of the system; at the same time, the model clearly demonstrates how the forty-six indicators from the three subsystems affect regional sustainable development, addressing a gap in sustainable development research. PMID:29027982
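
    The information entropy calculation behind such indicator systems is commonly the standard entropy weight method. The sketch below is a generic, hedged illustration of that method only (the actual Sichuan model also applies the Brusselator principle, which is not reproduced here); the toy data matrix is an assumption.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values vary more across
    samples (e.g., years or regions) carry lower entropy, and hence
    receive more weight in the composite index.
    X: (m samples x n indicators), positive values."""
    P = X / X.sum(axis=0)                      # share of each sample per indicator
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    m = X.shape[0]
    E = -(P * logs).sum(axis=0) / np.log(m)    # normalized entropy in [0, 1]
    d = 1.0 - E                                # degree of diversification
    return d / d.sum()                         # weights summing to 1
```

    Indicators that barely vary across years receive entropy close to 1 and thus almost no weight; strongly varying indicators dominate the composite score.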

  15. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion, the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause for the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  16. Empirical and Theoretical Aspects of Generation and Transfer of Information in a Neuromagnetic Source Network

    PubMed Central

    Vakorin, Vasily A.; Mišić, Bratislav; Krakovska, Olga; McIntosh, Anthony Randal

    2011-01-01

Variability in source dynamics across the sources in an activated network may be indicative of how the information is processed within a network. Information-theoretic tools allow one not only to characterize local brain dynamics but also to describe interactions between distributed brain activity. This study follows such a framework and explores the relations between signal variability and asymmetry in mutual interdependencies in a data-driven pipeline of non-linear analysis of neuromagnetic sources reconstructed from human magnetoencephalographic (MEG) data collected as a reaction to a face recognition task. Asymmetry in non-linear interdependencies in the network was analyzed using transfer entropy, which quantifies predictive information transfer between the sources. Variability of the source activity was estimated using multi-scale entropy, quantifying the rate at which information is generated. The empirical results are supported by an analysis of synthetic data based on the dynamics of coupled systems with time delay in coupling. We found that the amount of information transferred from one source to another was correlated with the difference in variability between the dynamics of these two sources, with the directionality of net information transfer depending on the time scale at which the sample entropy was computed. The results based on synthetic data suggest that both time delay and strength of coupling can contribute to the relations between variability of brain signals and information transfer between them. Our findings support the previous attempts to characterize functional organization of the activated brain, based on a combination of non-linear dynamics and temporal features of brain connectivity, such as time delay. PMID:22131968
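
    Transfer entropy, used above to quantify predictive information transfer between sources, can be illustrated for discrete signals. This is a minimal plug-in estimator with history length 1, not the authors' MEG pipeline (which operates on continuous source signals and multiple time scales); the binary test signals are assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y), history length 1 (in bits):
    TE = sum p(y_t+1, y_t, x_t) * log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ].
    Non-negative, since it is the conditional mutual information
    I(Y_t+1; X_t | Y_t) of the empirical joint distribution."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_t+1, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_t+1, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y_t+1 | y_t, x_t)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y_t+1 | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te
```

    For y a delayed copy of x, TE(X→Y) approaches 1 bit while TE(Y→X) stays near zero, reflecting the directionality of the coupling.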

  17. Discussion of enthalpy, entropy and free energy of formation of GaN

    NASA Astrophysics Data System (ADS)

    Jacob, K. T.; Rajitha, G.

    2009-07-01

Presented in this letter is a critical discussion of a recent paper on experimental investigation of the enthalpy, entropy and free energy of formation of gallium nitride (GaN) published in this journal [T.J. Peshek, J.C. Angus, K. Kash, J. Cryst. Growth 311 (2008) 185-189]. It is shown that the experimental technique employed detects neither the equilibrium partial pressure of N2 corresponding to the equilibrium between Ga and GaN at fixed temperatures nor the equilibrium temperature at constant pressure of N2. The results of Peshek et al. are discussed in the light of other information on the Gibbs energy of formation available in the literature. The entropy of GaN is derived from heat-capacity measurements. Based on a critical analysis of all thermodynamic information now available, a set of optimized parameters is identified and a table of thermodynamic data for GaN is developed from 298.15 to 1400 K.

  18. Advanced image fusion algorithms for Gamma Knife treatment planning. Evaluation and proposal for clinical use.

    PubMed

    Apostolou, N; Papazoglou, Th; Koutsouris, D

    2006-01-01

    Image fusion is a process of combining information from multiple sensors. It is a useful tool implemented in the treatment planning programme of Gamma Knife Radiosurgery. In this paper we evaluate advanced image fusion algorithms for Matlab platform and head images. We develop nine level grayscale image fusion methods: average, principal component analysis (PCA), discrete wavelet transform (DWT) and Laplacian, filter - subtract - decimate (FSD), contrast, gradient, morphological pyramid and a shift invariant discrete wavelet transform (SIDWT) method in Matlab platform. We test these methods qualitatively and quantitatively. The quantitative criteria we use are the Root Mean Square Error (RMSE), the Mutual Information (MI), the Standard Deviation (STD), the Entropy (H), the Difference Entropy (DH) and the Cross Entropy (CEN). The qualitative are: natural appearance, brilliance contrast, presence of complementary features and enhancement of common features. Finally we make clinically useful suggestions.

  19. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.

  20. Decoherence estimation in quantum theory and beyond

    NASA Astrophysics Data System (ADS)

    Pfister, Corsin

The quantum physics literature provides many different characterizations of decoherence. Most of them have in common that they describe decoherence as a kind of influence on a quantum system upon interacting with another system. In the spirit of quantum information theory, we adopt a particular viewpoint on decoherence which describes it as the loss of information into a system that is possibly controlled by an adversary. We use a quantitative framework for decoherence that builds on operational characterizations of the min-entropy that have been developed in the quantum information literature. It characterizes decoherence as an influence on quantum channels that reduces their suitability for a variety of quantifiable tasks, such as the distribution of secret cryptographic keys of a certain length or the distribution of a certain number of maximally entangled qubit pairs. This allows for a quantitative and operational characterization of decoherence via operational characterizations of the min-entropy. In this thesis, we present a series of results about the estimation of the min-entropy, subdivided into three parts. The first part concerns the estimation of a quantum adversary's uncertainty about classical information, expressed by the smooth min-entropy, as is done in protocols for quantum key distribution (QKD). We analyze this form of min-entropy estimation in detail and find that some of the more recently suggested QKD protocols have previously unnoticed security loopholes. We show that the specifics of the sifting subroutine of a QKD protocol are crucial for security by pointing out mistakes in the security analyses in the literature and by presenting eavesdropping attacks on the problematic protocols. We provide solutions to the identified problems and present a formalized analysis of the min-entropy estimate that incorporates the sifting stage of QKD protocols. 
In the second part, we extend ideas from QKD to a protocol that allows one to estimate an adversary's uncertainty about quantum information, expressed by the fully quantum smooth min-entropy. Roughly speaking, we show that a protocol resembling the parallel execution of two QKD protocols can be used to lower bound the min-entropy of some unmeasured qubits. We explain how this result may influence the ongoing search for protocols for entanglement distribution. The third part is dedicated to the development of a framework that allows the estimation of decoherence even in experiments that cannot be correctly described by quantum theory. Inspired by an equivalent formulation of the min-entropy that relates it to the fidelity with a maximally entangled state, we define a decoherence quantity for a very general class of probabilistic theories that reduces to the min-entropy in the special case of quantum theory. This entails a definition of maximal entanglement for generalized probabilistic theories. Using techniques from semidefinite and linear programming, we show how bounds on this quantity can be estimated through Bell-type experiments. This makes it possible to test models of decoherence that cannot be described by quantum theory. As an example application, we devise an experimental test of a model of gravitational decoherence that has been suggested in the literature.

  1. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

Analyzing, visualizing and communicating uncertainties are important issues, as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy, introduced in the 1950s, assigns every location in the model a scalar value measuring predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how model quality is interconnected with the input data used) and model evolution (i.e. whether new data - or a changed geological hypothesis - improves the model). In other words, information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to quantitatively classify the indefiniteness of single units and the mean entropy of a model. 
Due to the relationship of this measure to the missing information, we expect the method to have a great potential in many types of geoscientific data assimilation problems — beyond pure visualization.
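
    The per-cell information entropy described above can be sketched directly. This is a hedged illustration, not the authors' code: unit probabilities are estimated as frequencies over a suite of simulated model realizations, and the array shapes and unit count are assumptions.

```python
import numpy as np

def cell_entropy(unit_probs):
    """Information entropy (bits) at each cell from unit membership
    probabilities; unit_probs has shape (..., n_units) and sums to 1
    over the last axis."""
    p = np.clip(unit_probs, 1e-12, 1.0)        # avoid log(0)
    return -(unit_probs * np.log2(p)).sum(axis=-1)

def model_entropy_from_suite(models, n_units):
    """models: (n_realizations, nx, ny) integer unit ids from a suite of
    simulated geological models. Returns the entropy map and the mean
    entropy of the whole model."""
    probs = np.stack([(models == u).mean(axis=0) for u in range(n_units)],
                     axis=-1)
    H = cell_entropy(probs)
    return H, H.mean()
```

    Cells where all realizations agree get zero entropy; a cell split evenly between two units gets one bit, and the mean over all cells gives the single overall model measure mentioned in (c).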

  2. Information-theoretic CAD system in mammography: Entropy-based indexing for computational efficiency and robust performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee

    2007-08-15

We have previously presented a knowledge-based computer-assisted detection (KB-CADe) system for the detection of mammographic masses. The system is designed to compare a query mammographic region with mammographic templates of known ground truth. The templates are stored in an adaptive knowledge database. Image similarity is assessed with information-theoretic measures (e.g., mutual information) derived directly from the image histograms. A previous study suggested that the diagnostic performance of the system steadily improves as the knowledge database is initially enriched with more templates. However, as the database increases in size, an exhaustive comparison of the query case with each stored template becomes computationally burdensome. Furthermore, blind storing of new templates may result in redundancies that do not necessarily improve diagnostic performance. To address these concerns we investigated an entropy-based indexing scheme for improving the speed of analysis and for satisfying database storage restrictions without compromising the overall diagnostic performance of our KB-CADe system. The indexing scheme was evaluated on two different datasets as (i) a search mechanism to sort through the knowledge database, and (ii) a selection mechanism to build a smaller, concise knowledge database that is easier to maintain but still effective. There were two important findings in the study. First, entropy-based indexing is an effective strategy to quickly identify a subset of templates that are most relevant to a given query. Only this subset need be analyzed in more detail using mutual information for optimized decision making regarding the query. Second, a selective entropy-based deposit strategy may be preferable, where only high-entropy cases are maintained in the knowledge database. 
Overall, the proposed entropy-based indexing scheme was shown to reduce the computational cost of our KB-CADe system by 55% to 80% while maintaining the system's diagnostic performance.

  3. Entropy-based goodness-of-fit test: Application to the Pareto distribution

    NASA Astrophysics Data System (ADS)

    Lequesne, Justine

    2013-08-01

    Goodness-of-fit tests based on entropy have been introduced in [13] for testing normality. The maximum entropy distribution in a class of probability distributions defined by linear constraints induces a Pythagorean equality between the Kullback-Leibler information and an entropy difference. This allows one to propose a goodness-of-fit test for maximum entropy parametric distributions which is based on the Kullback-Leibler information. We will focus on the application of the method to the Pareto distribution. The power of the proposed test is computed through Monte Carlo simulation.
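
    Entropy-based goodness-of-fit tests of this kind are typically built on a spacing estimator of differential entropy, such as Vasicek's. The sketch below shows only that building block; the paper's exact test statistic, based on Kullback-Leibler information for maximum entropy families, is not reproduced, and the window parameter m is a conventional choice.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek spacing estimator of differential entropy (nats):
    H_mn = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clipped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n) / 2)))   # common heuristic choice
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))
```

    A normality test in the spirit of [13] rejects when exp(H_est)/s, with s the sample standard deviation, falls well below sqrt(2*pi*e), the value attained by the maximum entropy (normal) distribution.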

  4. Entropy/information flux in Hawking radiation

    NASA Astrophysics Data System (ADS)

    Alonso-Serrano, Ana; Visser, Matt

    2018-01-01

    Blackbody radiation contains (on average) an entropy of 3.9 ± 2.5 bits per photon. If the emission process is unitary, then this entropy is exactly compensated by "hidden information" in the correlations. We extend this argument to the Hawking radiation from GR black holes, demonstrating that the assumption of unitarity leads to a perfectly reasonable entropy/information budget. The key technical aspect of our calculation is a variant of the "average subsystem" approach developed by Page, which we extend beyond bipartite pure systems, to a tripartite pure system that considers the influence of the environment.

  5. Simple proof of the concavity of the entropy power with respect to Gaussian noise

    NASA Technical Reports Server (NTRS)

    Dembo, Amir

    1989-01-01

A very simple proof of M. H. Costa's result that the entropy power of X_t = X + N(0, tI) is concave in t is derived as an immediate consequence of an inequality concerning Fisher information. This relationship between Fisher information and entropy is found to be useful for proving the central limit theorem. Thus, one who seeks new entropy inequalities should try first to find new inequalities about Fisher information, or at least to exploit the existing ones in new ways.

  6. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190

  7. Noise and complexity in human postural control: interpreting the different estimations of entropy.

    PubMed

    Rhea, Christopher K; Silver, Tobin A; Hong, S Lee; Ryu, Joong Hyun; Studenka, Breanna E; Hughes, Charmayne M L; Haddad, Jeffrey M

    2011-03-17

    Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses.
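
    Of the measures compared above, sample entropy (SampEn) is the most compact to state. The following is a standard textbook-style sketch, not the authors' implementation; the choices m = 2 and r = 0.2 x SD are the conventional defaults for CoP data.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (Chebyshev distance <= r) also match
    for m + 1 points. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                       # conventional tolerance
    def match_count(mm):
        # Use N - m templates for both lengths, as in the standard definition.
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B)
```

    A strongly periodic signal produces many template matches at length m + 1 and hence low SampEn, whereas white noise produces few and hence high SampEn.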

  8. Estimating the mutual information of an EEG-based Brain-Computer Interface.

    PubMed

    Schlögl, A; Neuper, C; Pfurtscheller, G

    2002-01-01

An EEG-based Brain-Computer Interface (BCI) could be used as an additional communication channel between human thoughts and the environment. The efficacy of such a BCI depends mainly on the transmitted information rate. Shannon's communication theory was used to quantify the information rate of BCI data. For this purpose, experimental EEG data from four BCI experiments were analyzed off-line. Subjects imagined left and right hand movements during EEG recording from the sensorimotor area. Adaptive autoregressive (AAR) parameters were used as features of single-trial EEG and classified with linear discriminant analysis. The intra-trial variation as well as the inter-trial variability, the signal-to-noise ratio, the entropy of information, and the information rate were estimated. The entropy difference was used as a measure of the separability of two classes of EEG patterns.
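
    For a Gaussian channel model of this kind, the per-sample mutual information follows directly from the estimated signal-to-noise ratio. A minimal sketch, not the paper's exact estimator; the Gaussian assumption is the key hypothesis.

```python
import math

def mi_from_snr(snr):
    """Gaussian-channel mutual information estimate (bits per sample):
    I = 0.5 * log2(1 + SNR)."""
    return 0.5 * math.log2(1.0 + snr)

def entropy_difference(var_total, var_noise):
    """Difference of Gaussian differential entropies (bits),
    0.5*log2(2*pi*e*var) each; the 2*pi*e terms cancel."""
    return 0.5 * math.log2(var_total) - 0.5 * math.log2(var_noise)
```

    With var_total = var_signal + var_noise the two expressions coincide, which is why the entropy difference can serve as a class-separability and information-rate measure.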

  9. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

Artifacts are disturbances that may occur during signal acquisition and may affect their processing. The aim of this paper is to propose a technique for automatically detecting artifacts in electroencephalographic (EEG) recordings. In particular, a technique based on both Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the available database studied. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
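
    Renyi's entropy, the detection criterion above, generalizes Shannon's entropy with an order parameter alpha. A minimal sketch for a discrete distribution; the order used in the paper and the way component distributions are estimated are not reproduced here.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy (bits): H_a = log2(sum_i p_i**a) / (1 - a);
    converges to the Shannon entropy as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                # drop zero-probability bins
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log2(p))          # Shannon limit
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)
```

    For the uniform distribution all orders agree (log2 of the alphabet size), while for peaked distributions higher orders penalize the concentration more strongly, which is what makes the measure sensitive to artifactual components.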

  10. Entropy changes in brain function.

    PubMed

    Rosso, Osvaldo A

    2007-04-01

    The traditional way of analyzing brain electrical activity, on the basis of electroencephalography (EEG) records, relies mainly on visual inspection and years of training. Although it is quite useful, of course, one has to acknowledge its subjective nature that hardly allows for a systematic protocol. In the present work quantifiers based on information theory and wavelet transform are reviewed. The "relative wavelet energy" provides information about the relative energy associated with different frequency bands present in the EEG and their corresponding degree of importance. The "normalized total wavelet entropy" carries information about the degree of order-disorder associated with a multi-frequency signal response. Their application in the analysis and quantification of short duration EEG signals (event-related potentials) and epileptic EEG records are summarized.

  11. Information content exploitation of imaging spectrometer's images for lossless compression

    NASA Astrophysics Data System (ADS)

    Wang, Jianyu; Zhu, Zhenyu; Lin, Kan

    1996-11-01

Imaging spectrometers such as MAIS produce a tremendous volume of image data, with raw data rates of up to 5.12 Mbps, which urgently requires a real-time, efficient and reversible compression implementation. Between a lossy scheme with high compression ratio and a lossless scheme with high fidelity, the choice must be based on an analysis of the particular information content of each imaging spectrometer's image data. In this paper, we present a careful analysis of information-preserving compression for the imaging spectrometer MAIS, with an entropy and autocorrelation study of its hyperspectral images. First, the statistical information in an actual MAIS image, captured at Marble Bar, Australia, is measured through its entropy, conditional entropy, mutual information and autocorrelation coefficients in both the spatial dimensions and the spectral dimension. These analyses show that there is high redundancy in the spatial dimensions, but the correlation in the spectral dimension of the raw images is smaller than expected. The main reason for the nonstationarity in the spectral dimension is the instrument's discrepancies in detector response and channel amplification across spectral bands. To restore the natural correlation, we preprocess the signal in advance. There are two ways to accomplish this: onboard radiation calibration and normalization; better results are achieved by the former. After preprocessing, the spectral correlation increases so much that it contributes substantial redundancy in addition to the spatial correlation. Finally, an on-board hardware implementation of the lossless compression is presented, with ideal results.

  12. Entropy coders for image compression based on binary forward classification

    NASA Astrophysics Data System (ADS)

    Yoo, Hoon; Jeong, Jechang

    2000-12-01

Entropy coders, as a noiseless compression method, are widely used as the final compression step for images, and there have been many contributions to increasing entropy coder performance and reducing entropy coder complexity. In this paper, we propose entropy coders based on binary forward classification (BFC). The BFC requires classification overhead, but there is no change between the amount of input information and the total amount of classified output information, a property we prove in this paper. Using this property, we propose entropy coders consisting of the BFC followed by Golomb-Rice coders (BFC+GR) and the BFC followed by arithmetic coders (BFC+A). The proposed entropy coders introduce negligible additional complexity due to the BFC. Simulation results also show better performance than other entropy coders of similar complexity.
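
    One of the two proposed back ends, the Golomb-Rice coder, is simple enough to sketch in full. This is a generic Rice codec (fixed parameter k, non-negative integer inputs assumed), not the paper's BFC stage.

```python
def rice_encode(values, k):
    """Rice code (Golomb code with M = 2**k): unary-coded quotient,
    0-terminated, followed by a k-bit binary remainder."""
    out = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        code = "1" * q + "0"                     # unary quotient
        if k:
            code += format(r, f"0{k}b")          # k-bit remainder
        out.append(code)
    return "".join(out)

def rice_decode(bits, k, n):
    """Decode n values from a Rice-coded bit string."""
    vals, i = [], 0
    for _ in range(n):
        q = 0
        while bits[i] == "1":                    # read unary quotient
            q += 1
            i += 1
        i += 1                                   # skip terminating 0
        r = int(bits[i:i + k], 2) if k else 0    # read remainder
        i += k
        vals.append((q << k) | r)
    return vals
```

    Each value is split into a unary-coded quotient and a k-bit remainder; small values therefore get short codes, which suits the skewed residual statistics that classification produces.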

  13. Memory and betweenness preference in temporal networks induced from time series

    NASA Astrophysics Data System (ADS)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Zheng, Rui; Hui, Pan

    2017-02-01

    We construct temporal networks from time series via unfolding the temporal information into an additional topological dimension of the networks. Thus, we are able to introduce memory entropy analysis to unravel the memory effect within the considered signal. We find distinct patterns in the entropy growth rate of the aggregate network at different memory scales for time series with different dynamics ranging from white noise, 1/f noise, autoregressive process, periodic to chaotic dynamics. Interestingly, for a chaotic time series, an exponential scaling emerges in the memory entropy analysis. We demonstrate that the memory exponent can successfully characterize bifurcation phenomenon, and differentiate the human cardiac system in healthy and pathological states. Moreover, we show that the betweenness preference analysis of these temporal networks can further characterize dynamical systems and separate distinct electrocardiogram recordings. Our work explores the memory effect and betweenness preference in temporal networks constructed from time series data, providing a new perspective to understand the underlying dynamical systems.

  14. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise-corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise-free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
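
    Multiscale entropy combines coarse-graining with sample entropy. A compact, self-contained sketch; the conventional parameters m = 2 and r = 0.15 x SD are assumptions, and the paper's exact settings are not reproduced.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

def sampen(x, m=2, r=0.15):
    """Compact sample entropy (Chebyshev distance, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - m)])
        return sum(int(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r))
                   for i in range(len(t)))
    return -np.log(matches(m + 1) / matches(m))

def mse_curve(x, scales=(1, 2, 3, 4, 5)):
    """Multiscale entropy: SampEn of the coarse-grained series at each
    scale, with r fixed at a fraction of the scale-1 standard deviation."""
    r = 0.15 * np.std(x)
    return [sampen(coarse_grain(x, s), r=r) for s in scales]
```

    For uncorrelated noise the MSE curve decreases with scale, consistent with the postulate that noise-corrupted signals have reduced complexity at larger time scales.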

  15. Quantum thermodynamics and quantum entanglement entropies in an expanding universe

    NASA Astrophysics Data System (ADS)

    Farahmand, Mehrnoosh; Mohammadzadeh, Hosein; Mehri-Dehnavi, Hossein

    2017-05-01

We investigate an asymptotically spatially flat Robertson-Walker space-time from two different perspectives. First, using von Neumann entropy, we evaluate the entanglement generation due to the encoded information in space-time. Then, we work out the entropy of particle creation based on the quantum thermodynamics of the scalar field on the underlying space-time. We show that the general behavior of both entropies is the same. Therefore, the entanglement can be applied to the customary quantum thermodynamics of the universe. Also, using these entropies, we can recover some information about the parameters of the space-time.

  16. A duality principle for the multi-block entanglement entropy of free fermion systems.

    PubMed

    Carrasco, J A; Finkel, F; González-López, A; Tempesta, P

    2017-09-11

    The analysis of the entanglement entropy of a subsystem of a one-dimensional quantum system is a powerful tool for unravelling its critical nature. For instance, the scaling behaviour of the entanglement entropy determines the central charge of the associated Virasoro algebra. For a free fermion system, the entanglement entropy depends essentially on two sets, namely the set A of sites of the subsystem considered and the set K of excited momentum modes. In this work we make use of a general duality principle establishing the invariance of the entanglement entropy under exchange of the sets A and K to tackle complex problems by studying their dual counterparts. The duality principle is also a key ingredient in the formulation of a novel conjecture for the asymptotic behavior of the entanglement entropy of a free fermion system in the general case in which both sets A and K consist of an arbitrary number of blocks. We have verified that this conjecture reproduces the numerical results with excellent precision for all the configurations analyzed. We have also applied the conjecture to deduce several asymptotic formulas for the mutual and r-partite information generalizing the known ones for the single block case.

  17. Power-law scaling for macroscopic entropy and microscopic complexity: Evidence from human movement and posture

    NASA Astrophysics Data System (ADS)

    Hong, S. Lee; Bodfish, James W.; Newell, Karl M.

    2006-03-01

    We investigated the relationship between macroscopic entropy and microscopic complexity of the dynamics of body rocking and sitting still across adults with stereotyped movement disorder and mental retardation (profound and severe) against controls matched for age, height, and weight. This analysis was performed through the examination of center of pressure (COP) motion on the mediolateral (side-to-side) and anteroposterior (fore-aft) dimensions and the entropy of the relative phase between the two dimensions of motion. Intentional body rocking and stereotypical body rocking possessed similar slopes for their respective frequency spectra, but differences were revealed during maintenance of sitting postures. The dynamics of sitting in the control group produced lower spectral slopes and higher complexity (approximate entropy). In the controls, the higher complexity found on each dimension of motion was related to a weaker coupling between dimensions. Information entropy of the relative phase between the two dimensions of COP motion and irregularity (complexity) of their respective motions fitted a power-law function, revealing a relationship between macroscopic entropy and microscopic complexity across both groups and behaviors. This power-law relation affords the postulation that the organization of movement and posture dynamics occurs as a fractal process.

  18. Influence of conversion on the location of points and lines: The change of location entropy and the probability of a vector point inside the converted grid point

    NASA Astrophysics Data System (ADS)

    Chen, Nan

    2018-03-01

    Conversion of points or lines from vector to grid format, or vice versa, is the first operation required for most spatial analyses. Conversion, however, usually causes the location of points or lines to change, which influences the reliability of the results of spatial analysis or even results in analysis errors. The purpose of this paper is to evaluate the change of the location of points and lines during conversion using the concepts of probability and entropy. This paper shows that when a vector point is converted to a grid point, the vector point may be outside or inside the grid point. This paper deduces a formula for computing the probability that the vector point is inside the grid point. It was found that the probability increased with the side length of the grid and with the variances of the coordinates of the vector point. In addition, the location entropy of points and lines is defined in this paper. Formulae for computing the change of the location entropy during conversion are deduced. The probability mentioned above and the change of location entropy may be used to evaluate the location reliability of points and lines in Geographic Information Systems and may be used to choose an appropriate range of the side length of grids before conversion. The results of this study may help scientists and users to avoid mistakes caused by the change of location during conversion as well as in spatial decision-making and analysis.

  19. Optimization of pressure gauge locations for water distribution systems using entropy theory.

    PubMed

    Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon

    2012-12-01

    It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective and quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes as a result of demand change at a specific node using entropy theory. Two cases are considered in terms of demand change: that in which demand at all nodes shows peak load by using a peak factor and that comprising the demand change of the normal distribution whose average is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes practically at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information it gives to (giving entropy) and receives from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing the sensitivity analysis based on the study results. These analysis results support the following two conclusions. Firstly, the installation priority of the pressure gauge in water distribution networks can be determined with a more objective standard through the entropy theory. Secondly, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
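
    The "giving/receiving entropy" ranking can be illustrated with a toy sketch: treat each node's pressure changes across demand scenarios as a symbol sequence and rank nodes by the total transinformation they share with the rest of the network. This is an assumed reading of the method; the node names, binning rule, and scoring below are illustrative, not from the paper.

```python
import math
from collections import Counter

def discretize(xs, bins=4):
    """Map real-valued pressure deltas to integer bins."""
    lo, hi = min(xs), max(xs)
    w = (hi - lo) / bins or 1.0
    return [min(int((v - lo) / w), bins - 1) for v in xs]

def mutual_information(a, b):
    """I(A;B) in nats from two equally long symbol sequences (plug-in estimate)."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum((c / n) * math.log(n * c / (pa[u] * pb[v]))
               for (u, v), c in pab.items())

def rank_gauge_nodes(pressure_changes):
    """pressure_changes: dict node -> pressure deltas over demand scenarios.
    Returns nodes sorted by total transinformation shared with the network."""
    symbols = {k: discretize(v) for k, v in pressure_changes.items()}
    score = {k: sum(mutual_information(symbols[k], symbols[j])
                    for j in symbols if j != k) for k in symbols}
    return sorted(score, key=score.get, reverse=True)

# hypothetical 3-node network: "B" tracks "A" exactly, "C" is independent
pressure_changes = {
    "A": [i % 4 for i in range(40)],
    "B": [(i % 4) * 1.5 for i in range(40)],
    "C": [(i // 4) % 4 for i in range(40)],
}
ranking = rank_gauge_nodes(pressure_changes)
# the isolated node "C" carries the least information about the system
```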

  20. Role of information theoretic uncertainty relations in quantum theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz; ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin; Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and the Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and the Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  1. [Study on infrared spectrum change of Ganoderma lucidum and its extracts].

    PubMed

    Chen, Zao-Xin; Xu, Yong-Qun; Chen, Xiao-Kang; Huang, Dong-Lan; Lu, Wen-Guan

    2013-05-01

    From the determination of the infrared spectra of four substances (original Ganoderma lucidum and its water extract, 95% ethanol extract and petroleum ether extract), it was found that the infrared spectrum carries systematic chemical information and basically reflects the distribution of each component of the analyte. Ganoderma lucidum and its extracts can be distinguished according to the ratios of the absorption peak areas at 3 416-3 279, 1 541 and 723 cm(-1) to that at 2 935-2 852 cm(-1). A method of calculating the information entropy of a sample set from Euclidean distances was proposed, the relationship between the information entropy and the amount of chemical information carried by the sample set was discussed, and the authors conclude that the sample set of original Ganoderma lucidum carries the most abundant chemical information. In hierarchical cluster analysis of the four sample sets, the infrared spectrum set of original Ganoderma lucidum gave the best clustering of Ganoderma atrum, cyan Ganoderma, Ganoderma multiplicatum and Ganoderma lucidum. The results show that the infrared spectrum carries chemical information about the material structure and relates closely to the chemical composition of the system. The higher the information entropy, the richer the chemical information and the greater the benefit for pattern recognition. This study provides guidance for the construction of sample sets in pattern recognition.
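
    One plausible way to compute an information entropy of a spectrum set from Euclidean distances is to normalize the pairwise distances into a probability distribution and take its Shannon entropy. This is a hypothetical sketch only; the abstract does not give the paper's exact weighting, and the spectra below are made-up vectors of absorbances.

```python
import math

def spectra_set_entropy(spectra):
    """Hypothetical sketch: Shannon entropy of the normalized pairwise
    Euclidean distances within a set of spectra (lists of absorbances)."""
    dists = []
    for i in range(len(spectra)):
        for j in range(i + 1, len(spectra)):
            d = math.sqrt(sum((a - b) ** 2
                              for a, b in zip(spectra[i], spectra[j])))
            dists.append(d)
    total = sum(dists)
    if total == 0:
        return 0.0  # identical spectra: no distance information at all
    probs = [d / total for d in dists]
    return -sum(p * math.log(p) for p in probs if p > 0)

# three toy "spectra"; with 3 samples there are 3 pairwise distances,
# so the entropy is bounded by log(3)
set1 = [[1.0, 2.0, 3.0], [1.1, 2.1, 2.9], [4.0, 0.5, 1.0]]
h = spectra_set_entropy(set1)
```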

  2. The Conditional Entropy Power Inequality for Bosonic Quantum Systems

    NASA Astrophysics Data System (ADS)

    De Palma, Giacomo; Trevisan, Dario

    2018-06-01

    We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under the heat semigroup evolution. The beam-splitter and the squeezing are the central elements of quantum optics, and can model the attenuation, the amplification and the noise of electromagnetic signals. This conditional Entropy Power Inequality will have a strong impact in quantum information and quantum cryptography. Among its many possible applications there is the proof of a new uncertainty relation for the conditional Wehrl entropy.

  4. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, then three-dimensional perceptual maps of the results are provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, which are caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
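
    Classical MDS, the step into which the modified dissimilarities are fed, can be sketched as follows. For illustration the cross-sample-entropy dissimilarity is replaced here by a plain Euclidean distance matrix on made-up points; only the MDS machinery itself is shown.

```python
import numpy as np

def classical_mds(D, k=3):
    """Classical MDS: embed n points in k dimensions from a symmetric
    dissimilarity matrix D via double-centering and eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the k largest
    scale = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * scale                 # n x k coordinates

# two toy clusters; their Euclidean distance matrix stands in for the
# cross-sample-entropy dissimilarity used in the paper
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, k=2)
# the embedding reproduces the original distances exactly for Euclidean input
```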

  5. Intrinsic Information Processing and Energy Dissipation in Stochastic Input-Output Dynamical Systems

    DTIC Science & Technology

    2015-07-09

    Excerpted publication list: Crutchfield, "Information Anatomy of Stochastic Equilibria," Entropy (08 2014), doi: 10.3390/e16094713; Virgil Griffith, Edwin Chong, Ryan James, Christopher Ellison, James Crutchfield, "Intersection Information Based on Common Randomness," Entropy (04 2014), doi: 10.3390/e16041985.

  6. Impact of Information Entropy on Teaching Effectiveness

    ERIC Educational Resources Information Center

    Wang, Zhi-guo

    2007-01-01

    Information entropy refers to the process in which information is sent out from the information source, transmitted through information channel and acquired by information sink, while the teaching process is the one of transmitting teaching information from teachers and teaching material to students. How to improve teaching effectiveness is…

  7. Applications of quantum entropy to statistics

    NASA Astrophysics Data System (ADS)

    Silver, R. N.; Martz, H. F.

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles are proposed to act as statistical regularization and other hyperparameters, such as conservation of information and smoothness. ME provides an alternative to hierarchical Bayes methods.

  8. Two faces of entropy and information in biological systems.

    PubMed

    Mitrokhin, Yuriy

    2014-10-21

    The article attempts to overcome the well-known paradox of the contradiction between emerging biological organization and entropy production in biological systems. It is assumed that a qualitative, speculative correlation between entropy and antientropy processes, taking place both in the past and today in the metabolic and genetic cellular systems, may be adequate for describing the evolution of biological organization. Since thermodynamic entropy itself cannot compensate for the high degree of organization which exists in the cell, we discuss the mode of conjunction of positive entropy events (mutations) in the genetic systems of past generations and the formation of organized structures in current cells. We argue that only the information which is generated under conditions of information entropy production (mutations and other genome reorganizations) in the genetic systems of past generations provides the physical conjunction of entropy and antientropy processes separated from each other in time across generations. This follows readily from the requirements of the second law of thermodynamics. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Impaired information processing triggers altered states of consciousness.

    PubMed

    Fritzsche, M

    2002-04-01

    Schizophrenia, intoxication with tetrahydrocannabinol (Delta-THC), and cannabis psychosis induce characteristic time and space distortions suggesting a common psychotic dysfunction. Since genetic research into schizophrenia has led to disappointing dead ends, the present study focuses on this phenotype. It is shown that information theory can account for the dynamical basis of higher sensorimotor information processing and consciousness under physiologic as well as pathologic conditions. If Kolmogorov entropy (inherent in the processing of action and time) breaks down in acute psychosis, it is predicted that Shannon entropy (inherent in the processing of higher dimensional perception) will increase, provoking positive symptoms and altered states of consciousness. In the search for candidate genes and the protection of vulnerable individuals from cannabis abuse, non-linear EEG analysis of Kolmogorov information could thus present us with a novel diagnostic tool to directly assess the breakdown of information processing in schizophrenia. Copyright 2002 Elsevier Science Ltd. All rights reserved.

  10. Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy

    NASA Astrophysics Data System (ADS)

    Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng

    2018-06-01

    To analyse the component of fuzzy output entropy, a decomposition method of fuzzy output entropy is first presented. After the decomposition of fuzzy output entropy, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropy contributed by fuzzy inputs. Based on the decomposition of fuzzy output entropy, a new global sensitivity analysis model is established for measuring the effects of uncertainties of fuzzy inputs on the output. The global sensitivity analysis model can not only tell the importance of fuzzy inputs but also simultaneously reflect the structural composition of the response function to a certain degree. Several examples illustrate the validity of the proposed global sensitivity analysis, which is a significant reference in engineering design and optimization of structural systems.

  11. Entropy production in a Glauber–Ising irreversible model with dynamical competition

    NASA Astrophysics Data System (ADS)

    Barbosa, Oscar A.; Tomé, Tânia

    2018-06-01

    An out-of-equilibrium Glauber–Ising model, evolving in accordance with an irreversible and stochastic Markovian dynamics, is analyzed in order to improve our understanding of critical behavior and phase transitions in nonequilibrium systems. To this end, a lattice model governed by the competition between two Glauber dynamics acting on interlaced square lattices is proposed. Previous results have shown how the entropy production provides information about irreversibility and criticality. Mean-field approximations and Monte Carlo simulations were used in the analysis. The results obtained here show a continuous phase transition, reflected in the entropy production as a logarithmic divergence of its derivative, which suggests a shared universality class with the irreversible models invariant under the symmetry operations of the Ising model.

  12. Upper entropy axioms and lower entropy axioms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi

    2015-04-15

    The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.

  13. Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.

    PubMed

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and are widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they operate without personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflect the overall state of the gas paths of the gas turbine. In addition, we also extend the entropy to compute the information quantity of features in kernel spaces, which helps to select the informative features for a certain recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. The experiments on real-world data show the effectiveness of the proposed algorithms.
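
    The use of Shannon entropy as a uniformity measure for exhaust temperatures can be sketched as follows. Normalizing the thermocouple readings into a probability distribution is an assumption, since the abstract does not specify the exact construction; the temperature values are invented for illustration.

```python
import math

def exhaust_uniformity_entropy(temps):
    """Shannon entropy (nats) of normalized exhaust temperatures: a sketch of
    measuring uniformity across thermocouples. Equal readings maximize the
    entropy at log(n); a hot or cold spot lowers it."""
    total = sum(temps)
    probs = [t / total for t in temps]
    return -sum(p * math.log(p) for p in probs if p > 0)

# eight hypothetical thermocouple readings around the exhaust annulus
uniform = exhaust_uniformity_entropy([600.0] * 8)            # healthy gas path
faulty = exhaust_uniformity_entropy([600.0] * 7 + [900.0])   # one hot path
# the hot spot makes the distribution less uniform, so entropy drops
```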

  14. Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model

    PubMed Central

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are among the most important devices in power engineering and are widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they operate without personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflect the overall state of the gas paths of the gas turbine. In addition, we also extend the entropy to compute the information quantity of features in kernel spaces, which helps to select the informative features for a certain recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. The experiments on real-world data show the effectiveness of the proposed algorithms. PMID:25258726

  15. Bioinspired Resource Management for Multiple-Sensor Target Tracking Systems

    DTIC Science & Technology

    2011-06-20

    In Section 2, we also present the Rényi α-entropy and α-divergence [13] that are extensively utilized in our information-theoretic approach (cf. [9] and ... gain in information. The Rényi α-entropy provides a general scalar measure of uncertainty [10]: H_α(z_{1:k}) = (1/(1−α)) log ∫ p^α(x_k | z_{1:k}) dx_k (7). It follows that as α approaches unity, the Rényi α-entropy (7) reduces to the Shannon entropy: H_1(z_{1:k}) = lim_{α→1} H_α(z_{1:k}) = −∫ p(x_k | z_{1:k}) log p(x_k | z_{1:k}) dx_k
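
    The discrete analogue of the Rényi α-entropy and its Shannon limit as α approaches 1 can be checked numerically. This is a generic sketch, not code from the report; the example distribution is arbitrary.

```python
import math

def renyi_entropy(p, alpha):
    """Discrete Rényi alpha-entropy in nats; alpha = 1 returns the Shannon
    limit, alpha = 0 gives log of the support size."""
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
h_shannon = renyi_entropy(p, 1)
h_near_1 = renyi_entropy(p, 1.000001)   # approaches Shannon entropy as alpha -> 1
```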

  16. Brain entropy and human intelligence: A resting-state fMRI study

    PubMed Central

    Calderone, Daniel; Morales, Leah J.

    2018-01-01

    Human intelligence comprises comprehension of and reasoning about an infinitely variable external environment. A brain capable of large variability in neural configurations, or states, will more easily understand and predict variable external events. Entropy measures the variety of configurations possible within a system, and recently the concept of brain entropy has been defined as the number of neural states a given brain can access. This study investigates the relationship between human intelligence and brain entropy, to determine whether neural variability as reflected in neuroimaging signals carries information about intellectual ability. We hypothesize that intelligence will be positively associated with entropy in a sample of 892 healthy adults, using resting-state fMRI. Intelligence is measured with the Shipley Vocabulary and WASI Matrix Reasoning tests. Brain entropy was positively associated with intelligence. This relation was most strongly observed in the prefrontal cortex, inferior temporal lobes, and cerebellum. This relationship between high brain entropy and high intelligence indicates an essential role for entropy in brain functioning. It demonstrates that access to variable neural states predicts complex behavioral performance, and specifically shows that entropy derived from neuroimaging signals at rest carries information about intellectual capacity. Future work in this area may elucidate the links between brain entropy in both resting and active states and various forms of intelligence. This insight has the potential to provide predictive information about adaptive behavior and to delineate the subdivisions and nature of intelligence based on entropic patterns. PMID:29432427

  17. Brain entropy and human intelligence: A resting-state fMRI study.

    PubMed

    Saxe, Glenn N; Calderone, Daniel; Morales, Leah J

    2018-01-01

    Human intelligence comprises comprehension of and reasoning about an infinitely variable external environment. A brain capable of large variability in neural configurations, or states, will more easily understand and predict variable external events. Entropy measures the variety of configurations possible within a system, and recently the concept of brain entropy has been defined as the number of neural states a given brain can access. This study investigates the relationship between human intelligence and brain entropy, to determine whether neural variability as reflected in neuroimaging signals carries information about intellectual ability. We hypothesize that intelligence will be positively associated with entropy in a sample of 892 healthy adults, using resting-state fMRI. Intelligence is measured with the Shipley Vocabulary and WASI Matrix Reasoning tests. Brain entropy was positively associated with intelligence. This relation was most strongly observed in the prefrontal cortex, inferior temporal lobes, and cerebellum. This relationship between high brain entropy and high intelligence indicates an essential role for entropy in brain functioning. It demonstrates that access to variable neural states predicts complex behavioral performance, and specifically shows that entropy derived from neuroimaging signals at rest carries information about intellectual capacity. Future work in this area may elucidate the links between brain entropy in both resting and active states and various forms of intelligence. This insight has the potential to provide predictive information about adaptive behavior and to delineate the subdivisions and nature of intelligence based on entropic patterns.

  18. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
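
    The density-matrix construction behind these spectral entropies (ρ = e^{−βL}/Z, with L the graph Laplacian and β an inverse-temperature parameter) can be sketched numerically; the specific graphs below are illustrative choices, not examples from the paper.

```python
import numpy as np

def network_von_neumann_entropy(A, beta=1.0):
    """Von Neumann entropy of a graph from rho = exp(-beta * L) / Z, where
    L is the combinatorial Laplacian of adjacency matrix A. The eigenvalues
    of rho form a probability distribution over Laplacian modes."""
    L = np.diag(A.sum(axis=1)) - A
    w = np.linalg.eigvalsh(L)
    p = np.exp(-beta * w)
    p /= p.sum()                  # spectrum of the density matrix rho
    p = p[p > 1e-15]
    return float(-(p * np.log(p)).sum())

# complete graph vs path graph on 5 nodes
K5 = np.ones((5, 5)) - np.eye(5)
P5 = np.diag(np.ones(4), 1)
P5 = P5 + P5.T

s_k5 = network_von_neumann_entropy(K5)
s_p5 = network_von_neumann_entropy(P5)
# at beta = 1 the sparse path spreads weight over more low-lying Laplacian
# modes than the complete graph, giving it the larger spectral entropy
```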

  19. Permutation auto-mutual information of electroencephalogram in anesthesia

    NASA Astrophysics Data System (ADS)

    Liang, Zhenhu; Wang, Yinghua; Ouyang, Gaoxiang; Voss, Logan J.; Sleigh, Jamie W.; Li, Xiaoli

    2013-04-01

    Objective. The dynamic change of brain activity in anesthesia is an interesting topic for clinical doctors and drug designers. To explore the dynamical features of brain activity in anesthesia, a permutation auto-mutual information (PAMI) method is proposed to measure the information coupling of electroencephalogram (EEG) time series obtained in anesthesia. Approach. The PAMI is developed and applied on EEG data collected from 19 patients under sevoflurane anesthesia. The results are compared with the traditional auto-mutual information (AMI), SynchFastSlow (SFS, derived from the BIS index), permutation entropy (PE), composite PE (CPE), response entropy (RE) and state entropy (SE). Performance of all indices is assessed by pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability. Main results. The PK/PD modeling and prediction probability analysis show that the PAMI index correlates closely with the anesthetic effect. The coefficient of determination R2 between PAMI values and the sevoflurane effect site concentrations, and the prediction probability Pk are higher in comparison with other indices. The information coupling in EEG series can be applied to indicate the effect of the anesthetic drug sevoflurane on the brain activity as well as other indices. The PAMI of the EEG signals is suggested as a new index to track drug concentration change. Significance. The PAMI is a useful index for analyzing the EEG dynamics during general anesthesia.
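
    A plausible sketch of the PAMI idea: symbolize the signal with ordinal (permutation) patterns, then estimate the mutual information between the symbol sequence and a lagged copy of itself. The estimator below is a naive plug-in version for illustration; the paper's exact estimator and parameter choices may differ.

```python
import math
import random
from collections import Counter

def ordinal_patterns(x, m=3):
    """Map each length-m window to its ordinal (permutation) pattern."""
    return [tuple(sorted(range(m), key=lambda k: x[i + k]))
            for i in range(len(x) - m + 1)]

def permutation_auto_mutual_information(x, lag, m=3):
    """Plug-in mutual information (nats) between the ordinal-pattern
    sequence of x and the same sequence shifted by `lag` samples."""
    s = ordinal_patterns(x, m)
    a, b = s[:-lag], s[lag:]
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum((c / n) * math.log(n * c / (pa[u] * pb[v]))
               for (u, v), c in pab.items())

# a structured (sinusoidal) signal retains far more ordinal coupling across
# a 5-sample lag than white noise does
sine = [math.sin(0.5 * i) for i in range(300)]
random.seed(1)
noise = [random.random() for _ in range(300)]
pami_sine = permutation_auto_mutual_information(sine, lag=5)
pami_noise = permutation_auto_mutual_information(noise, lag=5)
```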

  20. Logistic Map for Cancellable Biometrics

    NASA Astrophysics Data System (ADS)

    Supriya, V. G., Dr; Manjunatha, Ramachandra, Dr

    2017-08-01

    This paper presents the design and implementation of a secured biometric template protection system that transforms the biometric template using binary chaotic signals and 3 different key streams to obtain another form of the template. Its efficiency is demonstrated by the results, and its security is investigated through analyses including key space analysis, information entropy and key sensitivity analysis.

  1. Informational temperature concept and the nature of self-organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Shu-Kun

    1996-12-31

    Self-organization phenomena are spontaneous processes. Their behavior should be governed by the second law of thermodynamics. The dissipative structure theory of the Prigogine school of thermodynamics claims that "order out of chaos" arises through "self-organization" and challenges the validity of the second law of thermodynamics. Unfortunately this theory is questionable. Therefore we have to reconsider the related fundamental theoretical problems. Informational entropy (S) and information (I) are related by S = S_max - I, where S_max is the maximum informational entropy. This conforms with the broadly accepted definition that entropy is the information loss. As the informational entropy concept has been proved to be useful, it will be convenient to define an informational temperature, T_I. This can be related to energy E and the informational entropy S. Information registration is a process of ΔI > 0, or ΔS < 0, and involves the energetically excited states (ΔE > 0). Therefore, T_I is negative, and has the opposite sign of the conventional thermodynamic temperature, T. This concept is useful for clarifying the concepts of "order" and "disorder" of static structures and characterizing many typical information loss processes of self-organization.

  2. Entropy-based link prediction in weighted networks

    NASA Astrophysics Data System (ADS)

    Xu, Zhongqi; Pu, Cunlai; Ramiz Sharafat, Rajput; Li, Lunbo; Yang, Jian

    2017-01-01

    Information entropy has been proved to be an effective tool to quantify the structural importance of complex networks. In previous work (Xu et al., 2016), we measured the contribution of a path in link prediction with information entropy. In this paper, we further quantify the contribution of a path with both path entropy and path weight, and propose a weighted prediction index based on the contributions of paths, namely Weighted Path Entropy (WPE), to improve the prediction accuracy in weighted networks. Empirical experiments on six weighted real-world networks show that WPE achieves higher prediction accuracy than three typical weighted indices.

  3. On q-non-extensive statistics with non-Tsallisian entropy

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2016-02-01

    We combine an axiomatics of Rényi with the q-deformed version of the Khinchin axioms to obtain a measure of information (i.e., entropy) which accounts both for systems with embedded self-similarity and for non-extensivity. We show that the entropy thus obtained is uniquely determined in terms of a one-parameter family of information measures. The ensuing maximal-entropy distribution is phrased in terms of a special function known as the Lambert W-function. We analyze the corresponding "high-" and "low-temperature" asymptotics and reveal a non-trivial structure of the parameter space. Salient issues such as concavity and Schur concavity of the new entropy are also discussed.

  4. Information entropy and dark energy evolution

    NASA Astrophysics Data System (ADS)

    Capozziello, Salvatore; Luongo, Orlando

    Here, the information entropy is investigated in the context of early and late cosmology under the hypothesis that distinct phases of universe evolution are entangled with each other. The approach is based on the entangled state ansatz, representing a coarse-grained definition of primordial dark temperature associated to an effective entangled energy density. The dark temperature definition comes from assuming either Von Neumann or linear entropy as sources of cosmological thermodynamics. We interpret the involved information entropies by means of probabilities of forming structures during cosmic evolution. Following this recipe, we propose that quantum entropy is simply associated to the thermodynamical entropy and we investigate the consequences of our approach using the adiabatic sound speed. As byproducts, we analyze two phases of universe evolution: the late and early stages. To do so, we first recover that dark energy reduces to a pure cosmological constant, as zero-order entanglement contribution, and second that inflation is well-described by means of an effective potential. In both cases, we infer numerical limits which are compatible with current observations.

  5. Characterizing Brain Structures and Remodeling after TBI Based on Information Content, Diffusion Entropy

    PubMed Central

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P.; Zhang, Zheng Gang; Lehman, Norman L.; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    Background To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in human brain and in brain remodeling after traumatic brain injury (TBI) in a rat. Methods Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Results Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from in vivo human brain and histologically measured axonal density from post mortem samples of the same brain structures. The MSC treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Conclusions Entropy measurement is more effective in distinguishing axonal remodeling after injury, when compared with FA. Entropy is also more sensitive to axonal density than axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.
PMID:24143186

  6. Characterizing brain structures and remodeling after TBI based on information content, diffusion entropy.

    PubMed

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P; Zhang, Zheng Gang; Lehman, Norman L; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in human brain and in brain remodeling after traumatic brain injury (TBI) in a rat. Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from in vivo human brain and histologically measured axonal density from post mortem samples of the same brain structures. The MSC treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Entropy measurement is more effective in distinguishing axonal remodeling after injury, when compared with FA. Entropy is also more sensitive to axonal density than axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.
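    The core quantity, Shannon's entropy of a sampled orientation distribution, is simple to sketch; the eight-direction ODF weights below are hypothetical and only illustrate why crossing fibers raise entropy relative to a single fiber population:

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()          # normalize to a probability distribution
    p = p[p > eps]
    return float(-(p * np.log2(p)).sum())

# Hypothetical ODF samples over 8 directions: one fiber population vs. a
# 90-degree crossing. The crossing spreads probability mass over two peaks,
# which raises the entropy of the orientation distribution.
single = [0.70, 0.10, 0.05, 0.05, 0.04, 0.03, 0.02, 0.01]
crossing = [0.35, 0.05, 0.35, 0.05, 0.08, 0.04, 0.05, 0.03]
print(shannon_entropy(single), shannon_entropy(crossing))
```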

  7. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of one system can be explained in terms of the information storage of the system itself and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
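    A minimal plug-in transfer-entropy estimator for symbolic series (history length 1, histogram counts) illustrates the quantity being estimated; the paper's actual estimator uses nonuniform embedding and nearest neighbors, which this sketch does not reproduce:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy TE(X -> Y) in bits for symbolic series with
    history length 1: sum over p(y_t+1, y_t, x_t) of
    log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), c in c_yyx.items():
        te += (c / n) * np.log2(c * c_y[yp] / (c_yx[(yp, xp)] * c_yy[(yn, yp)]))
    return te
```

    When y simply copies x with a one-step delay, TE(X -> Y) approaches 1 bit while TE(Y -> X) stays near zero, reproducing the expected directionality.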

  8. Wavelet entropy: a new tool for analysis of short duration brain electrical signals.

    PubMed

    Rosso, O A; Blanco, S; Yordanova, J; Kolev, V; Figliola, A; Schürmann, M; Başar, E

    2001-01-30

    Since traditional electrical brain signal analysis is mostly qualitative, the development of new quantitative methods is crucial for restricting the subjectivity in the study of brain signals. These methods are particularly fruitful when they are strongly correlated with intuitive physical concepts that allow a better understanding of brain dynamics. Here, a new method based on the orthogonal discrete wavelet transform (ODWT) is applied. It takes as a basic element the ODWT of the EEG signal, and defines the relative wavelet energy, the wavelet entropy (WE) and the relative wavelet entropy (RWE). The relative wavelet energy provides information about the relative energy associated with different frequency bands present in the EEG and their corresponding degree of importance. The WE carries information about the degree of order/disorder associated with a multi-frequency signal response, and the RWE measures the degree of similarity between different segments of the signal. In addition, the time evolution of the WE is calculated to give information about the dynamics in the EEG records. Within this framework, the major objective of the present work was to characterize in a quantitative way functional dynamics of order/disorder microstates in short duration EEG signals. For that aim, spontaneous EEG signals under different physiological conditions were analyzed. Further, specific quantifiers were derived to characterize how a stimulus affects electrical events in terms of frequency synchronization (tuning) in the event related potentials.
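    The relative wavelet energy and wavelet entropy can be sketched with a hand-rolled Haar DWT; the paper's ODWT may use other wavelets, and the band layout here (four detail levels plus the residual approximation) is an illustrative assumption:

```python
import numpy as np

def haar_band_energies(x, levels=4):
    """Relative energy per frequency band from a cascaded Haar DWT."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
        energies.append(np.sum(d ** 2))        # energy of this band
        x = a
    energies.append(np.sum(x ** 2))            # residual approximation band
    e = np.array(energies)
    return e / e.sum()

def wavelet_entropy(x, levels=4):
    """Wavelet entropy: Shannon entropy of the relative band energies."""
    p = haar_band_energies(x, levels)
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())
```

    A constant signal concentrates all energy in the approximation band (WE = 0), while broadband noise spreads energy across bands and yields a high WE.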

  9. An Information Theoretic Characterisation of Auditory Encoding

    PubMed Central

    Overath, Tobias; Cusack, Rhodri; Kumar, Sukhbinder; von Kriegstein, Katharina; Warren, Jason D; Grube, Manon; Carlyon, Robert P; Griffiths, Timothy D

    2007-01-01

    The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. By systematically varying the entropy of pitch sequences, we sought brain areas where neural activity and energetic demands increase as a function of entropy. Such a relationship is predicted to occur in an efficient encoding mechanism that uses less computational resource when less information is present in the signal: we specifically tested the hypothesis that such a relationship is present in the planum temporale (PT). In two convergent functional MRI studies, we demonstrated this relationship in PT for encoding, while furthermore showing that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands less computational resource to encode redundant signals than those with high information content. PMID:17958472

  10. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    PubMed

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well-described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
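    A plausible reading of such a measure, the entropy rate of a discretized series treated as a first-order Markov chain, can be sketched as follows; the equal-width binning and state count are illustrative assumptions, not the authors' exact definition:

```python
import numpy as np
from collections import Counter

def markov_entropy(series, n_states=4):
    """Entropy rate (bits/step) of a series modeled as a first-order
    Markov chain over discretized amplitude states:
    H = -sum_ab p(a,b) log2 p(b|a)."""
    x = np.asarray(series, dtype=float)
    # Discretize amplitudes into equal-width bins -> states 0..n_states-1
    edges = np.linspace(x.min(), x.max(), n_states + 1)
    s = np.clip(np.digitize(x, edges[1:-1]), 0, n_states - 1)
    pairs = Counter(zip(s[:-1], s[1:]))
    occ = Counter(s[:-1])
    n = len(s) - 1
    h = 0.0
    for (a, b), c in pairs.items():
        h -= (c / n) * np.log2(c / occ[a])
    return h
```

    A perfectly periodic ramp has fully predictable transitions (entropy rate 0), whereas an irregular, noise-like series has transitions close to maximally unpredictable.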

  11. Multiscale Symbolic Phase Transfer Entropy in Financial Time Series Classification

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    We address the challenge of classifying financial time series via a newly proposed multiscale symbolic phase transfer entropy (MSPTE). Using the MSPTE method, we succeed in quantifying the strength and direction of information flow between financial systems and in classifying financial time series, namely the stock indices from Europe, America and China during the period from 2006 to 2016 and the stocks of banking, aviation industry and pharmacy during the period from 2007 to 2016. The MSPTE analysis shows that the value of symbolic phase transfer entropy (SPTE) among stocks decreases with increasing scale factor. It is demonstrated that the MSPTE method can effectively divide stocks into groups by areas and industries. In addition, it can be concluded that the MSPTE analysis quantifies the similarity among the stock markets. The SPTE between two stocks from the same area is far less than the SPTE between stocks from different areas. The results also indicate that four stocks from America and Europe have a relatively high degree of similarity and that the stocks of the banking and pharmaceutical industries have higher similarity for CA. It is worth mentioning that the pharmaceutical industry has a weaker particular market mechanism than the banking and aviation industries.

  12. Information thermodynamics of near-equilibrium computation

    NASA Astrophysics Data System (ADS)

    Prokopenko, Mikhail; Einav, Itai

    2015-06-01

    In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.

  13. Modeling Loop Entropy

    PubMed Central

    Chirikjian, Gregory S.

    2011-01-01

    Proteins fold from a highly disordered state into a highly ordered one. Traditionally, the folding problem has been stated as one of predicting ‘the’ tertiary structure from sequential information. However, new evidence suggests that the ensemble of unfolded forms may not be as disordered as once believed, and that the native form of many proteins may not be described by a single conformation, but rather an ensemble of its own. Quantifying the relative disorder in the folded and unfolded ensembles as an entropy difference may therefore shed light on the folding process. One issue that clouds discussions of ‘entropy’ is that many different kinds of entropy can be defined: entropy associated with overall translational and rotational Brownian motion, configurational entropy, vibrational entropy, conformational entropy computed in internal or Cartesian coordinates (which can even be different from each other), conformational entropy computed on a lattice; each of the above with different solvation and solvent models; thermodynamic entropy measured experimentally, etc. The focus of this work is the conformational entropy of coil/loop regions in proteins. New mathematical modeling tools for the approximation of changes in conformational entropy during transition from unfolded to folded ensembles are introduced. In particular, models for computing lower and upper bounds on entropy for polymer models of polypeptide coils both with and without end constraints are presented. The methods reviewed here include kinematics (the mathematics of rigid-body motions), classical statistical mechanics and information theory. PMID:21187223

  14. Noise and Complexity in Human Postural Control: Interpreting the Different Estimations of Entropy

    PubMed Central

    Rhea, Christopher K.; Silver, Tobin A.; Hong, S. Lee; Ryu, Joong Hyun; Studenka, Breanna E.; Hughes, Charmayne M. L.; Haddad, Jeffrey M.

    2011-01-01

    Background Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. Methods and Findings The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. Conclusions The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses. PMID:21437281
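    Of the measures compared, Sample Entropy has a compact textbook definition that can be sketched directly; the tolerance r = 0.2 × SD and template length m = 2 below are the conventional defaults, not values taken from this study:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy: -ln of the conditional probability that sequences
    matching for m points (within tolerance r, Chebyshev distance) also
    match for m + 1 points. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(mm):
        # N - m templates of length mm, so counts are comparable
        templates = np.array([x[i:i + mm] for i in range(len(x) - m)])
        total = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

    A regular signal such as a sine wave yields a much lower SampEn than white noise of the same length, the basic contrast exploited in CoP complexity studies.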

  15. A modified belief entropy in Dempster-Shafer framework.

    PubMed

    Zhou, Deyun; Tang, Yongchuan; Jiang, Wen

    2017-01-01

    How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in Dempster-Shafer framework, however, the existing studies mainly focus on the mass function itself, the available information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are somehow not that efficient. In this paper, a modified belief entropy is proposed by considering the scale of FOD and the relative scale of a focal element with respect to FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. What's more, with less information loss, the new measure can overcome the shortage of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.
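    The Deng entropy that inspired the new measure has a simple closed form, E_d(m) = -sum_A m(A) log2[ m(A) / (2^|A| - 1) ]; the modified belief entropy adds FOD-scale terms that this sketch does not reproduce:

```python
import numpy as np

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment: each focal element is a
    frozenset of hypotheses mapped to its mass. The (2^|A| - 1) factor
    penalizes mass spread over larger focal elements."""
    return -sum(m * np.log2(m / (2 ** len(A) - 1))
                for A, m in bba.items() if m > 0)

# A Bayesian BBA (singletons only) reduces Deng entropy to Shannon entropy.
bayesian = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
# Mass on a multi-element focal set carries extra non-specificity.
vague = {frozenset({'a', 'b'}): 1.0}
print(deng_entropy(bayesian), deng_entropy(vague))
```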

  16. A modified belief entropy in Dempster-Shafer framework

    PubMed Central

    Zhou, Deyun; Jiang, Wen

    2017-01-01

    How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in Dempster-Shafer framework, however, the existing studies mainly focus on the mass function itself, the available information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are somehow not that efficient. In this paper, a modified belief entropy is proposed by considering the scale of FOD and the relative scale of a focal element with respect to FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. What’s more, with less information loss, the new measure can overcome the shortage of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method. PMID:28481914

  17. Hidden messenger revealed in Hawking radiation: A resolution to the paradox of black hole information loss

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Cai, Qing-yu; You, Li; Zhan, Ming-sheng

    2009-05-01

    Using standard statistical methods, we discover the existence of correlations among Hawking radiations (of tunneled particles) from a black hole. The information carried by such correlations is quantified by the mutual information between sequential emissions. Through a careful counting of the entropy taken out by the emitted particles, we show that black hole radiation as tunneling is an entropy conservation process. While information is leaked out through the radiation, the total entropy is conserved. Thus, we conclude that the black hole evaporation process is unitary.

  18. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    PubMed

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
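    The Sharma-Mittal family mentioned above can be sketched directly; the two-parameter form below recovers Shannon, Rényi, and Tsallis entropies (in nats) as limiting or special cases, with the limit branches handled as a numerical convenience:

```python
import numpy as np

def sharma_mittal(p, q, r):
    """Sharma-Mittal entropy of order (q, r):
    H = [ (sum_i p_i^q)^((1-r)/(1-q)) - 1 ] / (1 - r).
    Limits: q,r -> 1 gives Shannon; r -> 1 gives Renyi; r = q gives Tsallis."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1) < 1e-12 and abs(r - 1) < 1e-12:
        return float(-(p * np.log(p)).sum())             # Shannon limit
    if abs(r - 1) < 1e-12:
        return float(np.log((p ** q).sum()) / (1 - q))   # Renyi limit
    if abs(q - 1) < 1e-12:
        h = -(p * np.log(p)).sum()
        return float(np.expm1((1 - r) * h) / (1 - r))    # q -> 1 limit
    s = (p ** q).sum()
    return float((s ** ((1 - r) / (1 - q)) - 1) / (1 - r))
```

    For a uniform distribution over four outcomes, sharma_mittal(p, 2, 1) reproduces the Rényi-2 value ln 4 and sharma_mittal(p, 2, 2) the Tsallis-2 value 0.75.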

  19. GABAergic excitation of spider mechanoreceptors increases information capacity by increasing entropy rather than decreasing jitter.

    PubMed

    Pfeiffer, Keram; French, Andrew S

    2009-09-02

    Neurotransmitter chemicals excite or inhibit a range of sensory afferents and sensory pathways. These changes in firing rate or static sensitivity can also be associated with changes in dynamic sensitivity or membrane noise and thus action potential timing. We measured action potential firing produced by random mechanical stimulation of spider mechanoreceptor neurons during long-duration excitation by the GABAA agonist muscimol. Information capacity was estimated from signal-to-noise ratio by averaging responses to repeated identical stimulation sequences. Information capacity was also estimated from the coherence function between input and output signals. Entropy rate was estimated by a data compression algorithm and maximum entropy rate from the firing rate. Action potential timing variability, or jitter, was measured as normalized interspike interval distance. Muscimol increased firing rate, information capacity, and entropy rate, but jitter was unchanged. We compared these data with the effects of increasing firing rate by current injection. Our results indicate that the major increase in information capacity by neurotransmitter action arose from the increased entropy rate produced by increased firing rate, not from reduction in membrane noise and action potential jitter.
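    The compression-based entropy-rate estimate can be sketched crudely with a general-purpose compressor; zlib stream overhead makes this a rough proxy rather than the authors' algorithm, and the Bernoulli maximum below is the standard binomial bound on a binned spike train:

```python
import zlib
import numpy as np

def compression_entropy_rate(spikes):
    """Crude entropy-rate estimate (bits/symbol) of a binary spike train
    from its zlib-compressed size (one symbol per byte)."""
    data = bytes(bytearray(spikes))
    comp = zlib.compress(data, 9)
    return 8.0 * len(comp) / len(data)

def max_entropy_rate(rate_p):
    """Maximum entropy rate (bits/bin) for spike probability p per bin."""
    if rate_p in (0.0, 1.0):
        return 0.0
    return -(rate_p * np.log2(rate_p) + (1 - rate_p) * np.log2(1 - rate_p))
```

    A strictly periodic spike train compresses far better (lower estimated entropy rate) than a random train with the same number of bins, mirroring the entropy-rate contrast measured in the study.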

  20. Monitoring the informational efficiency of European corporate bond markets with dynamical permutation min-entropy

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2016-08-01

    In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the informational efficiency evolution of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes it possible to obtain relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced by the financial crisis, whereas another set of sectors, comprising chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications, has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show a behavior close to a random walk over practically the whole period of analysis, confirming a remarkable immunity against the 2008 financial crisis.
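    Permutation min-entropy itself is short to sketch: it is the min-entropy (-log of the most probable pattern frequency) of the ordinal-pattern distribution, here normalized by log(m!) into [0, 1]; the order m = 4 is an illustrative assumption, not the paper's choice:

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_min_entropy(x, m=4):
    """Normalized permutation min-entropy: -log of the most probable
    ordinal pattern's relative frequency, scaled by log(m!)."""
    pats = Counter(tuple(np.argsort(x[i:i + m]))
                   for i in range(len(x) - m + 1))
    n = sum(pats.values())
    p_max = max(pats.values()) / n
    return -log(p_max) / log(factorial(m))
```

    A fully efficient (random-walk-like) series scores near 1, while a monotone trend collapses onto a single pattern and scores 0, the contrast the sliding-window analysis tracks over time.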

  1. An information theory analysis of spatial decisions in cognitive development

    PubMed Central

    Scott, Nicole M.; Sera, Maria D.; Georgopoulos, Apostolos P.

    2015-01-01

    Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages 5 to 10 years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of “cognitive entropy” were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured “chunking” of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework. PMID:25698915

  2. Efficient optimization of the quantum relative entropy

    NASA Astrophysics Data System (ADS)

    Fawzi, Hamza; Fawzi, Omar

    2018-04-01

    Many quantum information measures can be written as an optimization of the quantum relative entropy between sets of states. For example, the relative entropy of entanglement of a state is the minimum relative entropy to the set of separable states. The various capacities of quantum channels can also be written in this way. We propose a unified framework to numerically compute these quantities using off-the-shelf semidefinite programming solvers, exploiting the approximation method proposed in Fawzi, Saunderson and Parrilo (2017 arXiv: 1705.00812). As a notable application, this method allows us to provide numerical counterexamples for a proposed lower bound on the quantum conditional mutual information in terms of the relative entropy of recovery.
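
For orientation, the quantity being optimized can be evaluated directly for two fixed states by eigendecomposition; this brute-force sketch (assuming a full-rank sigma, and not the semidefinite-programming approximation of the paper) computes S(rho||sigma) = Tr[rho(log rho - log sigma)]:

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """S(rho||sigma) = Tr[rho (log rho - log sigma)] in nats, computed by
    eigendecomposition; assumes Hermitian density matrices, sigma full rank."""
    def logm(a):
        w, v = np.linalg.eigh(a)
        w = np.clip(w, 1e-15, None)   # guard zero eigenvalues of rho
        return v @ np.diag(np.log(w)) @ v.conj().T
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

rho = np.array([[0.7, 0.0], [0.0, 0.3]])
print(quantum_relative_entropy(rho, rho))            # ~0: identical states
print(quantum_relative_entropy(rho, np.eye(2) / 2))  # positive otherwise
```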

  3. Regional Value Analysis at Threat Evaluation

    DTIC Science & Technology

    2014-06-01

targets based on information entropy and fuzzy optimization theory. In Industrial Engineering and Engineering Management (IEEM), 2011 IEEE… Assignment by Virtual Permutation and Tabu Search Heuristics. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2010

  4. Estimating Bayesian Phylogenetic Information Content

    PubMed Central

    Lewis, Paul O.; Chen, Ming-Hui; Kuo, Lynn; Lewis, Louise A.; Fučíková, Karolina; Neupane, Suman; Wang, Yu-Bo; Shi, Daoyuan

    2016-01-01

    Measuring the phylogenetic information content of data has a long history in systematics. Here we explore a Bayesian approach to information content estimation. The entropy of the posterior distribution compared with the entropy of the prior distribution provides a natural way to measure information content. If the data have no information relevant to ranking tree topologies beyond the information supplied by the prior, the posterior and prior will be identical. Information in data discourages consideration of some hypotheses allowed by the prior, resulting in a posterior distribution that is more concentrated (has lower entropy) than the prior. We focus on measuring information about tree topology using marginal posterior distributions of tree topologies. We show that both the accuracy and the computational efficiency of topological information content estimation improve with use of the conditional clade distribution, which also allows topological information content to be partitioned by clade. We explore two important applications of our method: providing a compelling definition of saturation and detecting conflict among data partitions that can negatively affect analyses of concatenated data. [Bayesian; concatenation; conditional clade distribution; entropy; information; phylogenetics; saturation.] PMID:27155008

  5. Information transfer across intra/inter-structure of CDS and stock markets

    NASA Astrophysics Data System (ADS)

    Lim, Kyuseong; Kim, Sehyun; Kim, Soo Yong

    2017-11-01

We investigate the information flow between industrial sectors in the credit default swap (CDS) and stock markets in the United States based on transfer entropy. Both markets have been studied with respect to their dynamics and relations. Our approach considers the intra-structure of each financial market as well as the inter-structure between the two markets, using a moving window to scan the period from 2005 to 2012. We examine the information transfer for different values of k, specifically k = 3, k = 5 and k = 7. The analysis indicates that the cases k = 3 and k = 7 show opposite trends but similar characteristics. Changes in transfer entropy within the intra-structure of the CDS market precede those of the stock market across the entire set of time windows. Abrupt rises and falls in the inter-structural information transfer between the two markets are detected in periods related to the financial crises, which can be regarded as early warnings.
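
The transfer entropy behind this analysis can be sketched with a plain plug-in estimator on symbolized series; k is the history length here, and the toy coupled binary series below is an illustrative assumption, not market data:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y, k=3):
    """Plug-in transfer entropy T_{Y->X} in bits: how much the past k values
    of y reduce uncertainty about the next value of x beyond x's own past."""
    joint, cond, marg, base = Counter(), Counter(), Counter(), Counter()
    for t in range(k - 1, len(x) - 1):
        xp = tuple(x[t - k + 1:t + 1])
        yp = tuple(y[t - k + 1:t + 1])
        joint[(x[t + 1], xp, yp)] += 1
        cond[(xp, yp)] += 1
        marg[(x[t + 1], xp)] += 1
        base[xp] += 1
    n = sum(joint.values())
    te = 0.0
    for (x1, xp, yp), c in joint.items():
        te += (c / n) * math.log2((c / cond[(xp, yp)]) * (base[xp] / marg[(x1, xp)]))
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                      # x simply copies y one step later
print(transfer_entropy(x, y, k=1))    # close to 1 bit: y drives x
print(transfer_entropy(y, x, k=1))    # close to 0: x tells y nothing new
```

In practice k trades resolution of the past against the sample size needed to populate the histograms, which is why results can differ across k = 3, 5, 7.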

  6. Rényi information flow in the Ising model with single-spin dynamics.

    PubMed

    Deng, Zehui; Wu, Jinshan; Guo, Wenan

    2014-12-01

The n-index Rényi mutual information and transfer entropies for the two-dimensional kinetic Ising model with arbitrary single-spin dynamics in the thermodynamic limit are derived as functions of ensemble averages of observables and spin-flip probabilities. Cluster Monte Carlo algorithms with dynamics different from the single-spin dynamics are thus applicable to estimate the transfer entropies. By means of Monte Carlo simulations with the Wolff algorithm, we calculate the information flows in the Ising model with the Metropolis dynamics and the Glauber dynamics, respectively. We find that not only the global Rényi transfer entropy, but also the pairwise Rényi transfer entropy, peaks in the disordered phase.

  7. Characterization of time dynamical evolution of electroencephalographic epileptic records

    NASA Astrophysics Data System (ADS)

Rosso, Osvaldo A.; Mairal, María Liliana

    2002-09-01

Since traditional electrical brain signal analysis is mostly qualitative, the development of new quantitative methods is crucial for restricting the subjectivity in the study of brain signals. These methods are particularly fruitful when they are strongly correlated with intuitive physical concepts that allow a better understanding of the brain dynamics. The processing of information by the brain is reflected in dynamical changes of the electrical activity in time, frequency, and space. Therefore, the concomitant studies require methods capable of describing the variation of the signal in both time and frequency. The entropy defined from the wavelet functions is a measure of the degree of order/disorder present in a time series. In consequence, this entropy evaluated over EEG time series gives information about the underlying dynamical process in the brain, more specifically about the synchrony of the groups of cells involved in the different neural responses. The total wavelet entropy is independent of the signal energy and is therefore a good tool for detecting dynamical changes in the system behavior. In addition, the total wavelet entropy has advantages over the Lyapunov exponents because it is parameter-free and independent of the stationarity of the time series. In this work we compared the results of the time evolution of the chaoticity (Lyapunov exponent as a function of time) with the corresponding time evolution of the total wavelet entropy in two different EEG records, one provided by depth electrodes and the other by scalp electrodes.
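
A minimal sketch of a total wavelet entropy in this spirit, using a hand-rolled Haar decomposition as a stand-in for the wavelet family actually used on EEG; the signals below are synthetic, not clinical recordings:

```python
import math
import random

def haar_step(signal):
    """One Haar analysis step: approximation and detail coefficients."""
    a = [(signal[2*i] + signal[2*i + 1]) / math.sqrt(2) for i in range(len(signal) // 2)]
    d = [(signal[2*i] - signal[2*i + 1]) / math.sqrt(2) for i in range(len(signal) // 2)]
    return a, d

def total_wavelet_entropy(signal, levels=4):
    """Shannon entropy (nats) of the relative wavelet energies per level:
    low for ordered, narrow-band activity, high for disordered activity."""
    energies, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(sum(c * c for c in d))
    energies.append(sum(c * c for c in a))   # leftover approximation energy
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

tone = [math.sin(math.pi * t / 2) for t in range(256)]   # period-4 rhythm
random.seed(1)
noise = [random.gauss(0, 1) for _ in range(256)]
print(total_wavelet_entropy(tone))    # ~log 2: energy in two bands only
print(total_wavelet_entropy(noise))   # higher: energy spread across bands
```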

  8. Breakdown of local information processing may underlie isoflurane anesthesia effects.

    PubMed

    Wollstadt, Patricia; Sellers, Kristin K; Rudelt, Lucas; Priesemann, Viola; Hutt, Axel; Fröhlich, Flavio; Wibral, Michael

    2017-06-01

The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source-such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC)-as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy-suggesting that reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. We suggest that source entropy changes must be considered whenever interpreting changes in information transfer as decoupling.

  9. A two-phase copula entropy-based multiobjective optimization approach to hydrometeorological gauge network design

    NASA Astrophysics Data System (ADS)

    Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin

    2017-12-01

Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures: the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
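
A rough sketch of the copula entropy idea (rank-transform to the empirical copula, then estimate; CE(X, Y) = -I(X; Y)), using a simple histogram plug-in rather than the estimator of the paper; the data are synthetic:

```python
import math
import random

def ranks(v):
    """Map values to (0, 1) by empirical rank (the copula transform)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for pos, i in enumerate(order):
        r[i] = (pos + 0.5) / len(v)
    return r

def copula_entropy(x, y, bins=8):
    """Histogram plug-in copula entropy (nats): CE = -MI of the rank data.
    Near zero for independence, increasingly negative with dependence."""
    n = len(x)
    joint = {}
    for a, b in zip(ranks(x), ranks(y)):
        cell = (min(int(a * bins), bins - 1), min(int(b * bins), bins - 1))
        joint[cell] = joint.get(cell, 0) + 1
    uniform = 1.0 / (bins * bins)   # rank marginals are uniform by design
    mi = sum((c / n) * math.log((c / n) / uniform) for c in joint.values())
    return -mi

random.seed(2)
x = [random.gauss(0, 1) for _ in range(4000)]
dependent = [v + 0.05 * random.gauss(0, 1) for v in x]
independent = [random.gauss(0, 1) for _ in range(4000)]
print(copula_entropy(x, dependent))     # strongly negative: redundant gauges
print(copula_entropy(x, independent))   # near zero: no shared information
```

In a gauge-network setting, strongly negative copula entropy between two stations flags redundancy, which is what the clustering phase exploits.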

  10. Renyi Entropies in Multiparticle Production

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.

    2000-12-01

    Renyi entropies are calculated for some multiparticle systems. Arguments are presented that measurements of Renyi entropies as functions of the average number of particles produced in high energy collisions carry important information on the produced system.
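
For reference, the Rényi entropy of order alpha over a multiplicity distribution can be computed directly from its definition; the distributions below are toy examples, not multiparticle data:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha in nats; the limit alpha -> 1 recovers the
    Shannon entropy, handled here as a special case."""
    if abs(alpha - 1.0) < 1e-9:
        return -sum(q * math.log(q) for q in p if q > 0)
    return math.log(sum(q ** alpha for q in p if q > 0)) / (1.0 - alpha)

uniform = [0.25] * 4
print(renyi_entropy(uniform, 2.0))   # log 4: all orders agree when uniform
skewed = [0.7, 0.2, 0.1]
print(renyi_entropy(skewed, 2.0))    # below the Shannon (alpha = 1) value
```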

  11. Comparison of transfer entropy methods for financial time series

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian

    2017-09-01

There is a certain relationship between the global financial markets, which creates an interactive network of global finance. Transfer entropy, a measure of information transfer, offers a good way to analyse such relationships. In this paper, we analysed the relationships among 9 stock indices from the U.S., Europe and China (from 1995 to 2015) using transfer entropy (TE), effective transfer entropy (ETE), Rényi transfer entropy (RTE) and effective Rényi transfer entropy (ERTE). We compared the four methods in terms of their effectiveness for identifying the relationships between stock markets. Two kinds of information flows are examined. The lagged-current cases reveal that the U.S. takes the leading position, whereas for same-date cases China is the most influential. Among the four methods, ERTE provided superior results.
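
The "effective" variants subtract a surrogate baseline from the raw estimate; a sketch under the assumption of symbolized series (the TE core is the usual plug-in formula, and the toy series are illustrative):

```python
import math
import random
from collections import Counter

def te(x, y, k=1):
    """Compact plug-in transfer entropy T_{Y->X} (bits), history length k."""
    j, cxy, cx1, cxp = Counter(), Counter(), Counter(), Counter()
    for t in range(k - 1, len(x) - 1):
        xp, yp = tuple(x[t - k + 1:t + 1]), tuple(y[t - k + 1:t + 1])
        j[(x[t + 1], xp, yp)] += 1
        cxy[(xp, yp)] += 1
        cx1[(x[t + 1], xp)] += 1
        cxp[xp] += 1
    n = sum(j.values())
    return sum((c / n) * math.log2(c * cxp[xp] / (cxy[(xp, yp)] * cx1[(x1, xp)]))
               for (x1, xp, yp), c in j.items())

def effective_te(x, y, k=1, surrogates=20, seed=0):
    """Effective TE: raw TE minus the mean TE over source-shuffled
    surrogates, which removes most of the finite-sample bias."""
    rng = random.Random(seed)
    baseline = 0.0
    for _ in range(surrogates):
        ys = list(y)
        rng.shuffle(ys)
        baseline += te(x, ys, k)
    return te(x, y, k) - baseline / surrogates

random.seed(3)
src = [random.randint(0, 1) for _ in range(3000)]
dst = [0] + src[:-1]                 # src drives dst with a one-step lag
print(effective_te(dst, src))        # ~1 bit of genuine transfer survives
print(effective_te(src, dst))        # ~0 once the bias is removed
```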

  12. Vergence variability: a key to understanding oculomotor adaptability?

    PubMed

Petrock, Annie Marie; Reisman, S.; Alvarez, T.

    2006-01-01

Vergence eye movements were recorded from three different populations: healthy young (ages 18-35 years), adaptive presbyopic and non-adaptive presbyopic (the presbyopic groups aged above 45 years) to determine how the variability of the eye movements made by the populations differs. The variability was determined using Shannon entropy calculations of wavelet transform coefficients, to yield a non-linear analysis of the vergence movement variability. The data were then fed through a k-means clustering algorithm to classify each subject, with no a priori knowledge of true subject classification. The results indicate a highly significant difference in the total entropy values between the three groups, indicating a difference in the level of information content, and thus hypothetically the oculomotor adaptability, between the three groups. Further, the frequency distribution of the entropy varied across groups.

  13. Study of the cross-market effects of Brexit based on the improved symbolic transfer entropy GARCH model—An empirical analysis of stock–bond correlations

    PubMed Central

    Chen, Xiurong; Zhao, Rubo

    2017-01-01

    In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries in the world. By incorporating information theory, we introduce the time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that under the influence of Brexit, flight-to-quality not only commonly occurs between the stocks and bonds of each country but also simultaneously occurs among different countries. We also find that the accuracy of the time-varying symbolic transfer entropy GARCH model proposed in this paper has been improved compared to the traditional GARCH model, which indicates that it has a certain practical application value. PMID:28817712

  14. Entropy-Based Analysis and Bioinformatics-Inspired Integration of Global Economic Information Transfer

    PubMed Central

    An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis. PMID:23300959

  15. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    PubMed

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  16. [The motive force of evolution based on the principle of organismal adjustment evolution.].

    PubMed

    Cao, Jia-Shu

    2010-08-01

From an analysis of the existing problems of the prevalent theories of evolution, this paper discusses the motive force of evolution based on the principle of organismal adjustment evolution, in order to reach a new understanding of the evolutionary mechanism. Guided by Schrödinger's dictum that "life feeds on negative entropy", the author proposes that the "negative entropy flow" actually includes material flow, energy flow and information flow, and that this negative entropy flow is the motive force for living and development. By modifying the author's earlier principle of organismal adjustment evolution (not adaptation evolution), a new theory of a "regulation system of organismal adjustment evolution involving DNA, RNA and protein interacting with the environment" is proposed. According to the view that phylogenetic development is the "integral" of individual development, the difference in negative entropy flow between organisms and the environment is considered to be a motive force for evolution, which is a new understanding of the mechanism of evolution. On this understanding, evolution is regarded as "a changing process in which one subsystem passes all or part of its genetic information to the next generation in a larger system and, during the adaptation process, produces some new elements, stops some old ones, and thereby persists in the larger system". Some other controversial questions related to evolution are also discussed.

  17. Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety

    ERIC Educational Resources Information Center

    Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.

    2012-01-01

    Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…

  18. Discovery and Entropy in the Revision of Technical Reports.

    ERIC Educational Resources Information Center

    Marder, Daniel

    A useful device in revising technical reports is the metaphor of entropy, which refers to the amount of disorder that is present in a system. Applied to communication theory, high entropy would correspond to increased amounts of unfamiliar or useless information in a text. Since entropy in rhetorical systems increases with the unfamiliarity of…

19. Determining Dynamical Path Distributions using Maximum Relative Entropy

    DTIC Science & Technology

    2015-05-31

entropy to a one-dimensional continuum labeled by a parameter η. The resulting η-entropies are equivalent to those proposed by Renyi [12] or by Tsallis [13…1995). [12] A. Renyi, "On measures of entropy and information," Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 547-561

  20. Transfer Entropy and Transient Limits of Computation

    PubMed Central

    Prokopenko, Mikhail; Lizier, Joseph T.

    2014-01-01

    Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation. PMID:24953547

  1. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

Flow entropy is a measure of the uniformity of pipe flows in water distribution systems. By maximizing flow entropy one can identify reliable layouts or connectivity in networks. In order to overcome the disadvantage of the common definition of flow entropy, which does not consider the impact of pipe diameter on reliability, an extended definition of flow entropy, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed using other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of the two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability. A comparative analysis between the simple flow entropy and the new method is also conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem of WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
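
In simplified form, flow entropy measures how evenly the total flow is shared among pipes; this single-level sketch (with hypothetical flows, not the paper's diameter-sensitive definition or the full node-based network formula) conveys the idea:

```python
import math

def flow_entropy(pipe_flows):
    """Simplified flow-uniformity entropy (nats): Shannon entropy of each
    pipe's share of the total flow. Uniform flows maximise it; flow
    concentrated in few pipes (fewer alternative paths) lowers it."""
    q = sum(pipe_flows)
    return -sum((f / q) * math.log(f / q) for f in pipe_flows if f > 0)

print(flow_entropy([10, 10, 10, 10]))   # log 4: perfectly uniform network
print(flow_entropy([37, 1, 1, 1]))      # concentrated flow: lower entropy
```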

  2. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
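
The PME step, picking the least-biased distribution consistent with the known information, can be illustrated for a discrete support with a fixed mean; the exponential-family form and bisection below are a generic sketch, not the paper's component models:

```python
import math

def maxent_given_mean(support, target_mean, iters=100):
    """PME sketch: over a finite support with only the mean known, the
    maximum-entropy pmf has the form p_i ∝ exp(lam * x_i); find lam by
    bisection, since the mean is monotone increasing in lam."""
    def mean_at(lam):
        w = [math.exp(lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z
    lo, hi = -20.0, 20.0
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_at(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# With the mean at the support midpoint, PME returns the uniform pmf.
p = maxent_given_mean([1, 2, 3, 4, 5, 6], 3.5)
print(p)
# A biased mean tilts the pmf exponentially while staying least-committal.
q = maxent_given_mean([1, 2, 3, 4, 5, 6], 4.5)
print(sum(x * qi for x, qi in zip([1, 2, 3, 4, 5, 6], q)))  # ~4.5
```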

  3. Entropy of orthogonal polynomials with Freud weights and information entropies of the harmonic oscillator potential

    NASA Astrophysics Data System (ADS)

    Van Assche, W.; Yáñez, R. J.; Dehesa, J. S.

    1995-08-01

The information entropy of the harmonic oscillator potential V(x) = (1/2)λx² in both position and momentum spaces can be expressed in terms of the so-called "entropy of Hermite polynomials," i.e., the quantity S_n(H) := -∫_{-∞}^{+∞} H_n²(x) log H_n²(x) e^{-x²} dx. These polynomials are instances of the polynomials orthogonal with respect to the Freud weights w(x) = exp(-|x|^m), m > 0. Here, a very precise and general result on the entropy of Freud polynomials recently established by Aptekarev et al. [J. Math. Phys. 35, 4423-4428 (1994)], specialized to the Hermite kernel (case m = 2), leads to an important refined asymptotic expression for the information entropies of very excited states (i.e., for large n) in both position and momentum spaces, denoted S_ρ and S_γ, respectively. Briefly, it is shown that, for large values of n, S_ρ + (1/2) log λ ≈ log(π√(2n)/e) + o(1) and S_γ - (1/2) log λ ≈ log(π√(2n)/e) + o(1), so that S_ρ + S_γ ≈ log(2π²n/e²) + o(1), in agreement with the generalized indetermination relation of Bialynicki-Birula and Mycielski [Commun. Math. Phys. 44, 129-132 (1975)]. Finally, the rate of convergence of these two information entropies is numerically analyzed. In addition, using a result of Rakhmanov, we describe a totally new proof of the leading term of the entropy of Freud polynomials which, naturally, is just a weak version of the aforementioned general result.

  4. Information Theory to Probe Intrapartum Fetal Heart Rate Dynamics

    NASA Astrophysics Data System (ADS)

    Granero-Belinchon, Carlos; Roux, Stéphane; Abry, Patrice; Doret, Muriel; Garnier, Nicolas

    2017-11-01

Intrapartum fetal heart rate (FHR) monitoring constitutes a reference tool in clinical practice to assess the baby's health status and to detect fetal acidosis. It is usually analyzed by visual inspection grounded on FIGO criteria. Characterization of intrapartum fetal heart rate temporal dynamics remains a challenging task and continuously receives academic research efforts. Complexity measures, often implemented with tools referred to as Approximate Entropy (ApEn) or Sample Entropy (SampEn), have regularly been reported as significant features for intrapartum FHR analysis. We explore how Information Theory, and especially auto mutual information (AMI), is connected to ApEn and SampEn and can be used to probe FHR dynamics. Applied to a large (1404 subjects) and documented database of FHR data, collected in a French academic hospital, it is shown that i) auto mutual information outperforms ApEn and SampEn for acidosis detection in the first stage of labor and continues to yield the best performance in the second stage; ii) Shannon entropy increases as labor progresses, and is always much larger in the second stage; iii) babies suffering from fetal acidosis additionally show more structured temporal dynamics than healthy ones, and this progressive structuration can be used for early acidosis detection.
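
Sample Entropy, one of the complexity measures named above, can be sketched directly from its definition; the toy signals below are illustrative, not FHR data:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the ratio of (m+1)-point to m-point
    template matches (Chebyshev distance < r, self-matches excluded)."""
    def matches(length):
        n = len(x) - length + 1
        count = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + s] - x[j + s]) for s in range(length)) < r:
                    count += 1
        return count
    return -math.log(matches(m + 1) / matches(m))

periodic = [0, 1] * 100                       # perfectly regular signal
random.seed(4)
irregular = [random.randint(0, 1) for _ in range(200)]
print(sample_entropy(periodic))    # near 0: the next point is predictable
print(sample_entropy(irregular))   # near log 2 for a fair-coin sequence
```

AMI instead quantifies the shared information between a signal and a delayed copy of itself, probing the same temporal structure from an information-theoretic angle.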

  5. Flood control project selection using an interval type-2 entropy weight with interval type-2 fuzzy TOPSIS

    NASA Astrophysics Data System (ADS)

    Zamri, Nurnadiah; Abdullah, Lazim

    2014-06-01

Flood control project selection is a complex issue which takes economic, social, environmental and technical attributes into account. Selection of the best flood control project requires the consideration of conflicting quantitative and qualitative evaluation criteria. When decision-makers' judgments are under uncertainty, it is relatively difficult for them to provide exact numerical values. The interval type-2 fuzzy set (IT2FS) is a strong tool for dealing with subjective, incomplete, and vague information. Besides, it helps in situations where the information about criteria weights for alternatives is completely unknown. Therefore, this paper adopts the interval type-2 entropy concept in the weighting process of interval type-2 fuzzy TOPSIS. This entropy weight is believed to effectively balance the influence of uncertainty factors in evaluating attributes. Then, a modified ranking value is proposed in line with the interval type-2 entropy weight. Quantitative and qualitative factors normally linked with flood control projects are considered for ranking. Data in the form of interval type-2 linguistic variables were collected from three authorised personnel of three Malaysian Government agencies. The study covers the whole of Malaysia. The analysis shows that the diversion scheme yielded the highest closeness coefficient, at 0.4807. A ranking can be drawn using the magnitude of the closeness coefficient, indicating that the diversion scheme ranked first among the five alternatives.
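
The crisp (type-1) entropy weighting underlying the approach can be sketched as follows; the decision matrix is hypothetical, and the paper's interval type-2 fuzzy version generalises these scalar scores:

```python
import math

def entropy_weights(matrix):
    """Classic entropy weights for an m x n decision matrix (m alternatives,
    n criteria): a criterion whose scores vary more across alternatives
    carries more discriminating information, hence more weight."""
    m, n = len(matrix), len(matrix[0])
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        shares = [v / s for v in col]
        e = -sum(p * math.log(p) for p in shares if p > 0) / math.log(m)
        raw.append(1.0 - e)           # divergence degree of criterion j
    total = sum(raw)
    return [w / total for w in raw]

# Criterion 1 cannot distinguish the alternatives; criterion 2 can.
w = entropy_weights([[5, 1], [5, 9], [5, 5]])
print(w)   # nearly all the weight goes to criterion 2
```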

  6. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis

    PubMed Central

    Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and of how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli responsible for anomalous behavior. Single-scale traditional entropy measures have yielded contradictory results about the dynamics of real-world time series data of healthy and pathological subjects. Recently, the multiscale entropy (MSE) algorithm was introduced for a more precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantifies the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scale factor τ; it works well for long signals, however. To overcome this drawback, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which the symbolic entropy measure NCSE is used as the entropy estimate instead of sample entropy. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results indicate that MNCSE values are more stable and reliable than original MSE values, and that MNCSE-based features lead to higher classification accuracies than MSE-based features. PMID:29771977
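
The coarse-graining step at the heart of MSE, and the source of the short-signal problem mentioned above (the series shrinks by a factor τ at each scale), looks like:

```python
def coarse_grain(x, tau):
    """MSE coarse-graining at scale tau: means of consecutive
    non-overlapping windows of length tau."""
    return [sum(x[i:i + tau]) / tau for i in range(0, len(x) - tau + 1, tau)]

series = list(range(10))
print(coarse_grain(series, 2))       # [0.5, 2.5, 4.5, 6.5, 8.5]
print(len(coarse_grain(series, 5)))  # 2: only N/tau points survive
```

An entropy estimate (SampEn in the original MSE, NCSE in the proposed MNCSE) is then computed on the coarse-grained series at each scale.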

  7. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis.

    PubMed

    Awan, Imtiaz; Aziz, Wajid; Shah, Imran Hussain; Habib, Nazneen; Alowibdi, Jalal S; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and of how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli responsible for anomalous behavior. Traditional single-scale entropy measures have yielded contradictory results about the dynamics of real-world time series data from healthy and pathological subjects. Recently, the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantifies the complexity of coarse-grained time series using sample entropy. While MSE works well for long signals, it may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scale factor τ. To overcome this drawback, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which the symbolic entropy measure NCSE is used as the entropy estimate instead of sample entropy. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than the original MSE values, and that MNCSE-based features lead to higher classification accuracies than MSE-based features.
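    The MSE pipeline this record builds on (coarse-grain, then compute sample entropy at each scale) can be sketched as follows. This is a minimal illustration of the original MSE, not the authors' MNCSE code; the parameters m = 2 and tolerance r = 0.2·SD are conventional defaults, not taken from the paper.

```python
import numpy as np

def coarse_grain(x, tau):
    """MSE coarse-graining: average non-overlapping windows of length tau."""
    x = np.asarray(x, dtype=float)
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -log of the conditional probability that templates
    matching for m points (Chebyshev distance < r) also match for m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d < r))
        return c
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2):
    """Sample entropy of the coarse-grained series at each scale factor,
    with the tolerance r fixed from the original series (as in MSE)."""
    r = 0.2 * np.std(x)
    return [sample_entropy(coarse_grain(x, tau), m, r) for tau in scales]
```

    Note how the coarse-grained series shrinks by a factor of τ, which is exactly why short recordings become a problem at large scales.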

  8. Land quality, sustainable development and environmental degradation in agricultural districts: A computational approach based on entropy indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zambon, Ilaria, E-mail: ilaria.zambon@unitus.it; Colantoni, Andrea; Carlucci, Margherita

    Land Degradation (LD) in socio-environmental systems negatively impacts sustainable development paths. This study proposes a framework for LD evaluation based on indicators of diversification in the spatial distribution of sensitive land. We hypothesize that spatial heterogeneity in a composite index of land sensitivity is more frequently associated with areas prone to LD than spatial homogeneity. Spatial heterogeneity is supposed to be associated with degraded areas that act as hotspots for future degradation processes. A diachronic analysis (1960–2010) was performed at the Italian agricultural district scale to identify environmental factors associated with spatial heterogeneity in the degree of land sensitivity to degradation, based on the Environmentally Sensitive Area Index (ESAI). In 1960, diversification in the level of land sensitivity, measured using two common entropy indexes (Shannon's diversity and Pielou's evenness), increased significantly with the ESAI, indicating a high level of land sensitivity to degradation. In 2010, the surface area classified as "critical" to LD was highest in districts with diversification in the spatial distribution of ESAI values, confirming the hypothesis formulated above. Entropy indexes, based on the observed alignment with the concept of LD, constitute a valuable base to inform mitigation strategies against desertification. - Highlights: • Spatial heterogeneity is supposed to be associated with degraded areas. • Entropy indexes can inform mitigation strategies against desertification. • Assessing spatial diversification in the degree of land sensitivity to degradation. • Mediterranean rural areas have an evident diversity in agricultural systems. • A diachronic analysis carried out at the Italian agricultural district scale.
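    The two entropy indexes named in this record have standard closed forms; a minimal sketch (the ESAI itself and the district-level aggregation are beyond this illustration):

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon's diversity H' = -sum p_i ln p_i over non-empty classes."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def pielou_evenness(counts):
    """Pielou's evenness J = H' / ln(S): observed diversity relative to the
    maximum possible with S classes (J = 1 means perfectly even)."""
    s = np.count_nonzero(counts)
    return shannon_diversity(counts) / np.log(s) if s > 1 else 0.0
```

    Applied to counts of land-sensitivity classes within a district, J close to 1 indicates a diversified (heterogeneous) spatial distribution.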

  9. The smooth entropy formalism for von Neumann algebras

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Furrer, Fabian; Scholz, Volkher B.

    2016-01-01

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.

  10. Parametric scaling from species relative abundances to absolute abundances in the computation of biological diversity: a first proposal using Shannon's entropy.

    PubMed

    Ricotta, Carlo

    2003-01-01

    Traditional diversity measures such as the Shannon entropy are generally computed from the species' relative abundance vector of a given community to the exclusion of species' absolute abundances. In this paper, I first mention some examples where the total information content associated with a given community may be more adequate than Shannon's average information content for a better understanding of ecosystem functioning. Next, I propose a parametric measure of statistical information that contains both Shannon's entropy and total information content as special cases of this more general function.
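    The distinction drawn above can be made concrete with a hedged sketch: Shannon's entropy depends only on relative abundances, while a total information content that scales with community size also reflects absolute abundances. Taking the total as N·H is one common choice and an assumption here, not necessarily Ricotta's parametric function.

```python
import numpy as np

def shannon_entropy(abundances):
    """Average information per individual (nats); depends only on the
    relative abundance vector."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def total_information(abundances):
    """Total information content N * H: scales with absolute abundances, so
    it distinguishes communities with equal proportions but different sizes."""
    return float(sum(abundances)) * shannon_entropy(abundances)
```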

  11. The smooth entropy formalism for von Neumann algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berta, Mario, E-mail: berta@caltech.edu; Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp; Scholz, Volkher B., E-mail: scholz@phys.ethz.ch

    2016-01-15

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.

  12. Entropy perspective on the thermal crossover in a fermionic Hubbard chain

    NASA Astrophysics Data System (ADS)

    Bonnes, Lars; Pichler, Hannes; Läuchli, Andreas M.

    2013-10-01

    We study the Renyi entropy in the finite-temperature crossover regime of a Hubbard chain using quantum Monte Carlo. The ground-state entropy has characteristic features such as a logarithmic divergence with block size and 2kF oscillations that are a hallmark of its Luttinger liquid nature. The interplay between the (extensive) thermal entropy and the ground-state features is studied and we analyze the temperature-induced decay of the amplitude of the oscillations as well as the scaling of the purity. Furthermore, we show how the spin and charge velocities can be extracted from the temperature dependence of the Renyi entropy, bridging our findings to recent experimental proposals on how to implement the measurement of Renyi entropies in the cold atom system. Studying the Renyi mutual information, we also demonstrate how constraints such as particle number conservation can induce persistent correlations visible in the mutual information even at high temperature.
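    For reference, the classical Rényi entropy underlying the quantities studied here has a simple closed form for a probability distribution; the quantum Monte Carlo estimation of the paper is not reproduced, and this sketch is only the commutative special case.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Classical Rényi entropy H_a(p) = log(sum_i p_i^a) / (1 - a) in nats;
    the limit a -> 1 recovers the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0] / p.sum()
    if np.isclose(alpha, 1.0):
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** alpha).sum()) / (1.0 - alpha))
```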

  13. Studies on entanglement entropy for Hubbard model with hole-doping and external magnetic field

    NASA Astrophysics Data System (ADS)

    Yao, K. L.; Li, Y. C.; Sun, X. Z.; Liu, Q. M.; Qin, Y.; Fu, H. H.; Gao, G. Y.

    2005-10-01

    By using the density matrix renormalization group (DMRG) method for the one-dimensional (1D) Hubbard model, we have studied the von Neumann entropy of a quantum system, which describes the entanglement of the system block and the rest of the chain. It is found that there is a close relation between the entanglement entropy and properties of the system. Hole-doping can alter the charge-charge and spin-spin interactions, resulting in charge polarization along the chain. By comparing the results before and after the doping, we find that doping favors an increase of the von Neumann entropy and thus also favors the exchange of information along the chain. Furthermore, we calculated the spin and entropy distribution in an external magnetic field. It is confirmed that both the charge-charge and the spin-spin interactions affect the exchange of information along the chain, making the entanglement entropy redistribute.

  14. Mathematical and information-geometrical entropy for phenomenological Fourier and non-Fourier heat conduction

    NASA Astrophysics Data System (ADS)

    Li, Shu-Nan; Cao, Bing-Yang

    2017-09-01

    The second law of thermodynamics governs the direction of heat transport, which provides the foundational definition of thermodynamic Clausius entropy. The definitions of entropy are further generalized for phenomenological heat transport models in the frameworks of classical irreversible thermodynamics and extended irreversible thermodynamics (EIT). In this work, entropic functions from mathematics are combined with phenomenological heat conduction models and connected to several information-geometrical conceptions. The long-time behaviors of these mathematical entropies exhibit a wide diversity of physical pictures in phenomenological heat conduction, including the tendency toward thermal equilibrium and the exponential decay of nonequilibrium asymptotics, which builds a bridge between macroscopic and microscopic modeling. In contrast with the EIT entropies, the mathematical entropies expressed in terms of the internal energy function can avoid the singularity, paired with nonpositive local absolute temperature, caused by non-Fourier heat conduction models.

  15. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix, and as such its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement simplifies to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This increases the upper limit on the capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.

  16. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series, and it is significant to quantify the correlation between financial sequences, since the financial market is a complex evolving dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE), to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and the sensitivity of patterns close to the noise floor. It shows more stable and reliable results than CPE does when applied to spiky data and AR(1) processes. Besides, we adapt the CPE method to infer the complexity of short-length time series by freely changing the time delay, and test it with Gaussian random series and random walks. The modified method shows advantages in reducing deviations of the entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
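    The ordinal-pattern machinery that CPE and WCPE build on is standard; a minimal single-series (unweighted) permutation entropy sketch, not the authors' cross-series estimator:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: Shannon entropy of the ordinal-pattern
    frequencies, divided by log(order!) so the result lies in [0, 1]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        key = tuple(np.argsort(window))      # rank pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), dtype=float) / n
    h = float(-(p * np.log(p)).sum())
    return h / log(factorial(order))
```

    A monotone series yields a single pattern (entropy 0), while white noise visits all order! patterns nearly uniformly (entropy near 1).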

  17. Fisher Information, Entropy, and the Second and Third Laws of Thermodynamics

    EPA Science Inventory

    We propose Fisher Information as a new calculable thermodynamic property that can be shown to follow the Second and the Third Laws of Thermodynamics. Fisher Information is, however, qualitatively different from entropy and potentially possesses a great deal more structure. Hence...

  18. Anorexia Nervosa: Analysis of Trabecular Texture with CT

    PubMed Central

    Tabari, Azadeh; Torriani, Martin; Miller, Karen K.; Klibanski, Anne; Kalra, Mannudeep K.

    2017-01-01

    Purpose To determine indexes of skeletal integrity by using computed tomographic (CT) trabecular texture analysis of the lumbar spine in patients with anorexia nervosa and normal-weight control subjects and to determine body composition predictors of trabecular texture. Materials and Methods This cross-sectional study was approved by the institutional review board and compliant with HIPAA. Written informed consent was obtained. The study included 30 women with anorexia nervosa (mean age ± standard deviation, 26 years ± 6) and 30 normal-weight age-matched women (control group). All participants underwent low-dose single-section quantitative CT of the L4 vertebral body with use of a calibration phantom. Trabecular texture analysis was performed by using software. Skewness (asymmetry of gray-level pixel distribution), kurtosis (pointiness of pixel distribution), entropy (inhomogeneity of pixel distribution), and mean value of positive pixels (MPP) were assessed. Bone mineral density and abdominal fat and paraspinal muscle areas were quantified with quantitative CT. Women with anorexia nervosa and normal-weight control subjects were compared by using the Student t test. Linear regression analyses were performed to determine associations between trabecular texture and body composition. Results Women with anorexia nervosa had higher skewness and kurtosis, lower MPP (P < .001), and a trend toward lower entropy (P = .07) compared with control subjects. Bone mineral density, abdominal fat area, and paraspinal muscle area were inversely associated with skewness and kurtosis and positively associated with MPP and entropy. Texture parameters, but not bone mineral density, were associated with lowest lifetime weight and duration of amenorrhea in anorexia nervosa. Conclusion Patients with anorexia nervosa had increased skewness and kurtosis and decreased entropy and MPP compared with normal-weight control subjects. 
These parameters were associated with lowest lifetime weight and duration of amenorrhea, but there were no such associations with bone mineral density. These findings suggest that trabecular texture analysis might contribute information about bone health in anorexia nervosa that is independent of that provided with bone mineral density. © RSNA, 2016 PMID:27797678
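    The four texture parameters named in this record are first-order statistics of the pixel distribution. Since the study used dedicated software, the formulas below (moment-based skewness and non-excess kurtosis, 64-bin histogram entropy) are conventional assumptions rather than the authors' exact implementation.

```python
import numpy as np

def texture_parameters(pixels, bins=64):
    """First-order statistics of a gray-level pixel distribution: skewness,
    kurtosis (non-excess), histogram entropy in bits, and the mean value of
    positive pixels (MPP)."""
    x = np.asarray(pixels, dtype=float).ravel()
    z = (x - x.mean()) / x.std()
    hist, _ = np.histogram(x, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return {
        "skewness": float((z ** 3).mean()),   # asymmetry of the distribution
        "kurtosis": float((z ** 4).mean()),   # pointiness (normal ~ 3)
        "entropy": float(-(p * np.log2(p)).sum()),  # inhomogeneity
        "mpp": float(x[x > 0].mean()),
    }
```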

  19. Anorexia Nervosa: Analysis of Trabecular Texture with CT.

    PubMed

    Tabari, Azadeh; Torriani, Martin; Miller, Karen K; Klibanski, Anne; Kalra, Mannudeep K; Bredella, Miriam A

    2017-04-01

    Purpose To determine indexes of skeletal integrity by using computed tomographic (CT) trabecular texture analysis of the lumbar spine in patients with anorexia nervosa and normal-weight control subjects and to determine body composition predictors of trabecular texture. Materials and Methods This cross-sectional study was approved by the institutional review board and compliant with HIPAA. Written informed consent was obtained. The study included 30 women with anorexia nervosa (mean age ± standard deviation, 26 years ± 6) and 30 normal-weight age-matched women (control group). All participants underwent low-dose single-section quantitative CT of the L4 vertebral body with use of a calibration phantom. Trabecular texture analysis was performed by using software. Skewness (asymmetry of gray-level pixel distribution), kurtosis (pointiness of pixel distribution), entropy (inhomogeneity of pixel distribution), and mean value of positive pixels (MPP) were assessed. Bone mineral density and abdominal fat and paraspinal muscle areas were quantified with quantitative CT. Women with anorexia nervosa and normal-weight control subjects were compared by using the Student t test. Linear regression analyses were performed to determine associations between trabecular texture and body composition. Results Women with anorexia nervosa had higher skewness and kurtosis, lower MPP (P < .001), and a trend toward lower entropy (P = .07) compared with control subjects. Bone mineral density, abdominal fat area, and paraspinal muscle area were inversely associated with skewness and kurtosis and positively associated with MPP and entropy. Texture parameters, but not bone mineral density, were associated with lowest lifetime weight and duration of amenorrhea in anorexia nervosa. Conclusion Patients with anorexia nervosa had increased skewness and kurtosis and decreased entropy and MPP compared with normal-weight control subjects.
These parameters were associated with lowest lifetime weight and duration of amenorrhea, but there were no such associations with bone mineral density. These findings suggest that trabecular texture analysis might contribute information about bone health in anorexia nervosa that is independent of that provided with bone mineral density. © RSNA, 2016.

  20. A Maximal Entropy Distribution Derivation of the Sharma-Taneja-Mittal Entropic Form

    NASA Astrophysics Data System (ADS)

    Scarfone, Antonio M.

    In this letter we derive the distribution maximizing the Sharma-Taneja-Mittal entropy under certain constraints by using an information inequality satisfied by the Bregman divergence associated with this entropic form. The resulting maximal entropy distribution coincides with the one derived from the variational calculus according to the maximal entropy principle à la Jaynes.

  1. On variational definition of quantum entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belavkin, Roman V.

    Entropy of a distribution P can be defined in at least three different ways: 1) as the expectation of the Kullback-Leibler (KL) divergence of P from elementary δ-measures (in this case, it is interpreted as expected surprise); 2) as a negative KL-divergence of some reference measure ν from the probability measure P; 3) as the supremum of Shannon's mutual information taken over all channels such that P is the output probability, in which case it is the dual of some transportation problem. In classical (i.e. commutative) probability, all three definitions lead to the same quantity, providing only different interpretations of entropy. In non-commutative (i.e. quantum) probability, however, these definitions are not equivalent. In particular, the third definition, where the supremum is taken over all entanglements of two quantum systems with P being the output state, leads to a quantity that can be twice the von Neumann entropy. It was proposed originally by V. Belavkin and Ohya [1] and called the proper quantum entropy, because it allows one to define a quantum conditional entropy that is always non-negative. Here we extend these ideas to define also a quantum counterpart of proper cross-entropy and cross-information. We also show an inequality for the values of classical and quantum information.
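    In the classical (commutative) case the first two definitions are easy to verify numerically. The sketch below takes the counting measure as the reference ν, so that −D_KL(P‖ν) equals the Shannon entropy exactly; this choice of ν is an illustrative assumption.

```python
import numpy as np

def entropy_expected_surprise(p):
    """Definition 1: H(P) as the P-expectation of D_KL(delta_i || P),
    i.e. the expected surprise E_P[-log p_i]."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def entropy_negative_kl(p):
    """Definition 2: H(P) = -D_KL(P || nu) with the counting measure as the
    reference nu (nu_i = 1), since D(P || nu) = sum_i p_i log(p_i / 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    nu = np.ones_like(p)
    return float(-(p * np.log(p / nu)).sum())
```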

  2. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still has low evaluation accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. We first establish a new weight assignment model founded on the compatible matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: when the compatible matrix analysis achieves the consistency requirements and differences remain between the subjective and objective weights, both proportions are moderately adjusted; on this basis, a fuzzy evaluation matrix is then constructed for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatible matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.
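    The "entropy value method" referenced here is the standard entropy-weight scheme for multi-criteria evaluation; a minimal sketch of that objective-weighting step (the AHP and fuzzy-evaluation stages are omitted):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: objective criterion weights from a decision
    matrix X (rows = alternatives, columns = criteria). Criteria whose
    values vary more across alternatives (lower entropy) get more weight."""
    X = np.asarray(X, dtype=float)
    m = X.shape[0]
    f = X / X.sum(axis=0)                     # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(f > 0, f * np.log(f), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)        # entropy of each criterion
    d = 1.0 - e                               # degree of divergence
    return d / d.sum()
```

    A criterion that is identical across all alternatives carries no discriminating information (entropy 1, weight 0), which is the intuition the improved method then blends with AHP's subjective weights.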

  3. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    PubMed

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.

  4. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
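    Under the record's (matrix) normal assumption, entropy criteria reduce to Gaussian differential entropies. Below is a much-simplified sketch of a greedy redundancy check in that spirit; the hierarchical Bayesian and multi-attribute machinery is omitted, and `most_redundant_station` is an illustrative name, not from the paper.

```python
import numpy as np

def gaussian_joint_entropy(cov):
    """Differential entropy (nats) of a multivariate normal:
    H = 0.5 * log((2*pi*e)^k * det(cov))."""
    cov = np.asarray(cov, dtype=float)
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def most_redundant_station(cov):
    """Greedy network-reduction step: the station whose removal loses the
    least joint entropy, i.e. the one best predicted by the others."""
    k = cov.shape[0]
    full = gaussian_joint_entropy(cov)
    losses = []
    for i in range(k):
        keep = [j for j in range(k) if j != i]
        losses.append(full - gaussian_joint_entropy(cov[np.ix_(keep, keep)]))
    return int(np.argmin(losses))
```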

  5. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series

    PubMed Central

    Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C.; Golino, Caroline A.; Kemper, Peter; Saha, Margaret S.

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our measure and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and to a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features. PMID:27977764
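    A minimal version of this idea, fitting a first-order Markov chain to a discretized series and reporting its entropy rate, can be sketched as follows. The equal-frequency discretization and the state count are illustrative choices, not necessarily the authors'.

```python
import numpy as np

def markov_entropy(series, n_states=3):
    """Entropy rate (bits) of a first-order Markov chain fitted to a
    discretized series: H = -sum_i pi_i sum_j P_ij log2 P_ij. Low values
    mean highly predictable state transitions."""
    x = np.asarray(series, dtype=float)
    edges = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(x, edges)            # equal-frequency discretization
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    P = counts / np.where(rows == 0, 1.0, rows)   # transition probabilities
    pi = counts.sum(axis=1) / counts.sum()        # empirical state occupancy
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log2(P), 0.0)
    return float(-(pi * plogp.sum(axis=1)).sum())
```

    A perfectly periodic series scores 0 (fully predictable transitions), while white noise approaches log2(n_states).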

  6. Interictal cardiorespiratory variability in temporal lobe and absence epilepsy in childhood.

    PubMed

    Varon, Carolina; Montalto, Alessandro; Jansen, Katrien; Lagae, Lieven; Marinazzo, Daniele; Faes, Luca; Van Huffel, Sabine

    2015-04-01

    It is well known that epilepsy has a profound effect on the autonomic nervous system, especially on the autonomic control of heart rate and respiration. This effect has been widely studied during seizure activity, but less attention has been given to interictal (i.e. seizure-free) activity. The studies that have been done on this topic showed that heart rate and respiration can be affected individually, even without the occurrence of seizures. In this work, the interactions between these two physiological variables are analysed during interictal activity in temporal lobe and absence epilepsy in childhood. These interactions are assessed by decomposing the predictive information about heart rate variability into different components, such as the transfer entropy, cross-entropy, self-entropy, and conditional self-entropy. Each one of these components quantifies a different type of shared information. When using the cross-entropy and the conditional self-entropy, it is possible to split the information carried by the heart rate into two main components, one related to respiration and one related to different mechanisms, like sympathetic activation. This can be done after assuming a directional link going from respiration to heart rate. After analysing all the entropy components, it is shown that in subjects with absence epilepsy the information shared by respiration and heart rate is significantly lower than in normal subjects. A more remarkable finding indicates that this type of epilepsy seems to have a long-term effect on the cardiac and respiratory control mechanisms of the autonomic nervous system.
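    For discrete (symbolized) series, the transfer entropy named above has a direct plug-in estimator; a minimal sketch with history length 1 (the paper's full decomposition into self- and cross-entropy terms is not reproduced):

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy TE(source -> target), in bits, for integer
    (symbolized) series with history length 1: the information the source's
    past adds about the target's next value beyond the target's own past."""
    s, t = np.asarray(source), np.asarray(target)
    n = len(t) - 1
    triples = Counter(zip(t[1:], t[:-1], s[:-1]))   # (x_next, x_past, y_past)
    pairs_xy = Counter(zip(t[:-1], s[:-1]))
    pairs_xx = Counter(zip(t[1:], t[:-1]))
    singles = Counter(t[:-1].tolist())
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_xy = c / pairs_xy[(x0, y0)]          # p(x_next | x_past, y_past)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0] # p(x_next | x_past)
        te += (c / n) * np.log2(p_cond_xy / p_cond_x)
    return float(te)
```

    On a series driven by a lagged copy of another, TE is large in the driving direction and near zero in the reverse direction.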

  7. Thermodynamic and Differential Entropy under a Change of Variables

    PubMed Central

    Hnizdo, Vladimir; Gilson, Michael K.

    2013-01-01

    The differential Shannon entropy of information theory can change under a change of variables (coordinates), but the thermodynamic entropy of a physical system must be invariant under such a change. This difference is puzzling, because the Shannon and Gibbs entropies have the same functional form. We show that a canonical change of variables can, indeed, alter the spatial component of the thermodynamic entropy just as it alters the differential Shannon entropy. However, there is also a momentum part of the entropy, which turns out to undergo an equal and opposite change when the coordinates are transformed, so that the total thermodynamic entropy remains invariant. We furthermore show how one may correctly write the change in total entropy for an isothermal physical process in any set of spatial coordinates. PMID:24436633
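    The argument above can be stated compactly. For an invertible change of variables $y = f(x)$, and for a canonical transformation in phase space (a sketch of the standard identities, not the paper's full derivation):

```latex
p_Y(y) = \frac{p_X(x)}{\lvert f'(x)\rvert}\bigg|_{x=f^{-1}(y)}
\quad\Longrightarrow\quad
h(Y) = -\int p_Y \log p_Y \,\mathrm{d}y
     = h(X) + \mathbb{E}\bigl[\log\lvert f'(X)\rvert\bigr].
```

```latex
% Canonical transformation (q, p) -> (Q, P): Liouville's theorem gives a
% unit phase-space Jacobian, so the total entropy is invariant even though
% the configuration-space (spatial) part alone can change:
h(Q, P) = h(q, p) + \mathbb{E}\bigl[\log\lvert J\rvert\bigr] = h(q, p),
\qquad \lvert J\rvert = \left\lvert \frac{\partial(Q, P)}{\partial(q, p)} \right\rvert = 1 .
```

    The first identity is the usual non-invariance of differential entropy; the second shows why the spatial change is compensated by an equal and opposite momentum change.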

  8. Parabolic replicator dynamics and the principle of minimum Tsallis information gain

    PubMed Central

    2013-01-01

    Background Non-linear, parabolic (sub-exponential) and hyperbolic (super-exponential) models of prebiological evolution of molecular replicators have been proposed and extensively studied. The parabolic models appear to be the most realistic approximations of real-life replicator systems due primarily to product inhibition. Unlike the more traditional exponential models, the distribution of individual frequencies in an evolving parabolic population is not described by the Maximum Entropy (MaxEnt) Principle in its traditional form, whereby the distribution with the maximum Shannon entropy is chosen among all the distributions that are possible under the given constraints. We sought to identify a more general form of the MaxEnt principle that would be applicable to parabolic growth. Results We consider a model of a population that reproduces according to the parabolic growth law and show that the frequencies of individuals in the population minimize the Tsallis relative entropy (non-additive information gain) at each time moment. Next, we consider a model of a parabolically growing population that maintains a constant total size and provide an “implicit” solution for this system. We show that in this case, the frequencies of the individuals in the population also minimize the Tsallis information gain at each moment of the “internal time” of the population. Conclusions The results of this analysis show that the general MaxEnt principle is the underlying law for the evolution of a broad class of replicator systems including not only exponential but also parabolic and hyperbolic systems. The choice of the appropriate entropy (information) function depends on the growth dynamics of a particular class of systems. The Tsallis entropy is non-additive for independent subsystems, i.e. the information on the subsystems is insufficient to describe the system as a whole.
In the context of prebiotic evolution, this “non-reductionist” nature of parabolic replicator systems might reflect the importance of group selection and competition between ensembles of cooperating replicators. Reviewers This article was reviewed by Viswanadham Sridhara (nominated by Claus Wilke), Purushottam Dixit (nominated by Sergei Maslov), and Nick Grishin. For the complete reviews, see the Reviewers’ Reports section. PMID:23937956
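The quantity being minimized can be made concrete in a few lines. The following is a generic sketch of the Tsallis relative entropy (non-additive information gain), not the paper's replicator model; the function name is illustrative:

```python
import numpy as np

def tsallis_relative_entropy(p, r, q):
    """Tsallis relative entropy D_q(p || r) between distributions p and r.

    Reduces to the Kullback-Leibler divergence in the limit q -> 1.
    """
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    if np.isclose(q, 1.0):
        return float(np.sum(p * np.log(p / r)))
    return float((np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0))
```

For q near 1 this recovers the additive (Shannon/KL) information gain; the non-additivity for independent subsystems discussed in the conclusions appears only for q ≠ 1.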

  9. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolf, David R.

    2004-01-01

The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to the quantification of the decomposition of the information carried among a set of variables or agents, and its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a powerful toolkit for the analysis of the complexity structure of complex many agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
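A minimal example of how such functions extend entropy and mutual information: for the XOR distribution (Z = X xor Y with independent uniform inputs), every pairwise mutual information vanishes while the three-variable term (the co-information) is -1 bit, exposing purely synergistic structure. The helper names are illustrative:

```python
import math
from collections import Counter

def entropy(joint, axes):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[a] for a in axes)] += p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# XOR joint distribution: Z = X xor Y, X and Y uniform and independent.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

H = lambda *axes: entropy(joint, axes)
mi_xy = H(0) + H(1) - H(0, 1)            # I(X;Y) = 0: no pairwise dependence
co_info = (H(0) + H(1) + H(2)
           - H(0, 1) - H(0, 2) - H(1, 2)
           + H(0, 1, 2))                 # three-variable term = -1 bit
```

The negative three-variable value is exactly the kind of synergistic component the hierarchy is designed to quantify.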

  10. Entropy for the Complexity of Physiological Signal Dynamics.

    PubMed

    Zhang, Xiaohua Douglas

    2017-01-01

Recently, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial to capture individual characteristics of biological dynamics. The wearable noninvasive medical devices and the analysis/management of related digital medical data will revolutionize the management and treatment of diseases, subsequently resulting in the establishment of a new healthcare system. One of the key features that can be extracted from the data obtained by wearable noninvasive medical devices is the complexity of physiological signals, which can be represented by entropy of biological dynamics contained in the physiological signals measured by these continuous monitoring medical devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics. The concepts include Shannon entropy, Kolmogorov entropy, Renyi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy for the complexity of glucose dynamics.
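Of the entropies listed, sample entropy is among the most widely used for physiological series. A minimal sketch (not the chapter's code): count template matches of length m and m+1 under Chebyshev distance r, excluding self-matches, and return -log of their ratio:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()          # conventional default tolerance

    def count(length):
        # Number of template pairs within tolerance r (self-matches excluded).
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        c = 0
        for i in range(len(templates)):
            d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
            c += np.sum(d < r)
        return c

    B, A = count(m), count(m + 1)
    return -np.log(A / B)
```

Regular (e.g., strictly periodic) signals yield values near zero; irregular signals yield larger values, which is what makes the measure a complexity index for glucose or heart-rate dynamics.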

  11. Shannon entropies and Fisher information of K-shell electrons of neutral atoms

    NASA Astrophysics Data System (ADS)

    Sekh, Golam Ali; Saha, Aparna; Talukdar, Benoy

    2018-02-01

We represent the two K-shell electrons of neutral atoms by a Hylleraas-type wave function that satisfies the exact behavior at the electron-electron and electron-nucleus coalescence points, and derive a simple method to construct expressions for single-particle position- and momentum-space charge densities, ρ (r) and γ (p) respectively. We make use of the results for ρ (r) and γ (p) to critically examine the effect of correlation on bare (uncorrelated) values of Shannon information entropies (S) and of Fisher information (F) for the K-shell electrons of atoms from helium to neon. Due to inter-electronic repulsion the values of the uncorrelated Shannon position-space entropies are augmented while those of the momentum-space entropies are reduced. The corresponding Fisher information exhibits the opposite behavior in this respect. Attempts are made to provide some plausible explanation for the observed response of S and F to electronic correlation.
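The position-space Shannon entropy S = -∫ ρ ln ρ d³r can be checked numerically in the simplest case: a hydrogenic 1s density (a one-electron stand-in, not the paper's correlated Hylleraas density), whose analytic value in atomic units is 3 + ln π:

```python
import numpy as np
from scipy.integrate import quad

# Position-space density of a hydrogenic 1s electron (atomic units, Z=1):
# rho(r) = exp(-2r)/pi, normalized so that the radial integral is 1.
rho = lambda r: np.exp(-2.0 * r) / np.pi

# Shannon position entropy S_r = -∫ rho ln(rho) d^3r, done as a radial integral.
integrand = lambda r: -rho(r) * np.log(rho(r)) * 4.0 * np.pi * r**2
S_r, _ = quad(integrand, 0, 50)
# Analytic value: 3 + ln(pi) ≈ 4.1447
```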

  12. Self-organization and entropy reduction in a living cell.

    PubMed

    Davies, Paul C W; Rieper, Elisabeth; Tuszynski, Jack A

    2013-01-01

    In this paper we discuss the entropy and information aspects of a living cell. Particular attention is paid to the information gain on assembling and maintaining a living state. Numerical estimates of the information and entropy reduction are given and discussed in the context of the cell's metabolic activity. We discuss a solution to an apparent paradox that there is less information content in DNA than in the proteins that are assembled based on the genetic code encrypted in DNA. When energy input required for protein synthesis is accounted for, the paradox is clearly resolved. Finally, differences between biological information and instruction are discussed. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. A Joint Multitarget Estimator for the Joint Target Detection and Tracking Filter

    DTIC Science & Technology

    2015-06-27

    function is the information theoretic part of the problem and aims for entropy maximization, while the second one arises from the constraint in the...objective functions in conflict. The first objective function is the information theo- retic part of the problem and aims for entropy maximization...theory. For the sake of completeness and clarity, we also summarize how each concept is utilized later. Entropy : A random variable is statistically

  14. Assessment of risk of femoral neck fracture with radiographic texture parameters: a retrospective study.

    PubMed

    Thevenot, Jérôme; Hirvasniemi, Jukka; Pulkkinen, Pasi; Määttä, Mikko; Korpelainen, Raija; Saarakkala, Simo; Jämsä, Timo

    2014-07-01

    To investigate whether femoral neck fracture can be predicted retrospectively on the basis of clinical radiographs by using the combined analysis of bone geometry, textural analysis of trabecular bone, and bone mineral density (BMD). Formal ethics committee approval was obtained for the study, and all participants gave informed written consent. Pelvic radiographs and proximal femur BMD measurements were obtained in 53 women aged 79-82 years in 2006. By 2012, 10 of these patients had experienced a low-impact femoral neck fracture. A Laplacian-based semiautomatic custom algorithm was applied to the radiographs to calculate the texture parameters along the trabecular fibers in the lower neck area for all subjects. Intra- and interobserver reproducibility was calculated by using the root mean square average coefficient of variation to evaluate the robustness of the method. The best predictors of hip fracture were entropy (P = .007; reproducibility coefficient of variation < 1%), the neck-shaft angle (NSA) (P = .017), and the BMD (P = .13). For prediction of fracture, the area under the receiver operating characteristic curve was 0.753 for entropy, 0.608 for femoral neck BMD, and 0.698 for NSA. The area increased to 0.816 when entropy and NSA were combined and to 0.902 when entropy, NSA, and BMD were combined. Textural analysis of pelvic radiographs enables discrimination of patients at risk for femoral neck fracture, and our results show the potential of this conventional imaging method to yield better prediction than that achieved with dual-energy x-ray absorptiometry-based BMD. The combination of the entropy parameter with NSA and BMD can further enhance predictive accuracy. © RSNA, 2014.
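In its simplest form, the entropy texture parameter is the Shannon entropy of a patch's gray-level histogram. The sketch below is generic (the study's Laplacian-based algorithm evaluates texture along the trabecular fibers, which is not reproduced here):

```python
import numpy as np

def histogram_entropy(patch, bins=256):
    """Shannon entropy (bits) of an 8-bit image patch's gray-level histogram."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

A uniform patch scores 0 bits; richer trabecular texture yields higher values, which is the direction of the association with fracture risk reported above.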

  15. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.

    PubMed

    Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L

    2008-06-01

Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools--multiscale entropy and multiscale time irreversibility--are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
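Multiscale entropy works by coarse-graining the interbeat-interval series at successive scales and evaluating sample entropy at each scale. The coarse-graining step can be sketched as follows (an illustrative helper, not the authors' implementation):

```python
import numpy as np

def coarse_grain(x, scale):
    """Step 1 of multiscale entropy: average the series over
    non-overlapping windows of length `scale` (trailing remainder dropped)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)
```

Sample entropy (or a time-irreversibility index) is then computed on coarse_grain(x, s) for each scale s, and the profile across scales, rather than any single value, carries the multiscale information.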

  16. Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures

    PubMed Central

    Peng, Chung-Kang; Goldberger, Ary L.

    2016-01-01

Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools—multiscale entropy and multiscale time irreversibility—are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763

  17. The epidemic spreading model and the direction of information flow in brain networks.

    PubMed

    Meier, J; Zhou, X; Hillebrand, A; Tewarie, P; Stam, C J; Van Mieghem, P

    2017-05-15

    The interplay between structural connections and emerging information flow in the human brain remains an open research problem. A recent study observed global patterns of directional information flow in empirical data using the measure of transfer entropy. For higher frequency bands, the overall direction of information flow was from posterior to anterior regions whereas an anterior-to-posterior pattern was observed in lower frequency bands. In this study, we applied a simple Susceptible-Infected-Susceptible (SIS) epidemic spreading model on the human connectome with the aim to reveal the topological properties of the structural network that give rise to these global patterns. We found that direct structural connections induced higher transfer entropy between two brain regions and that transfer entropy decreased with increasing distance between nodes (in terms of hops in the structural network). Applying the SIS model, we were able to confirm the empirically observed opposite information flow patterns and posterior hubs in the structural network seem to play a dominant role in the network dynamics. For small time scales, when these hubs acted as strong receivers of information, the global pattern of information flow was in the posterior-to-anterior direction and in the opposite direction when they were strong senders. Our analysis suggests that these global patterns of directional information flow are the result of an unequal spatial distribution of the structural degree between posterior and anterior regions and their directions seem to be linked to different time scales of the spreading process. Copyright © 2017 Elsevier Inc. All rights reserved.
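A minimal synchronous-update SIS simulation on an adjacency-list network, of the kind applied to the connectome here (the network, parameters, and update scheme below are illustrative, not the study's settings):

```python
import random

def sis_step(adj, infected, beta, delta, rng):
    """One synchronous SIS step: infected nodes recover with probability
    delta; a susceptible node is infected by each infected neighbor with
    probability beta."""
    new = set()
    for node in range(len(adj)):
        if node in infected:
            if rng.random() > delta:      # fails to recover, stays infected
                new.add(node)
        else:
            if any(nb in infected and rng.random() < beta for nb in adj[node]):
                new.add(node)
    return new
```

Iterating this map from a seed set and logging node states yields the activity time series from which quantities such as transfer entropy between regions can then be estimated.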

  18. Applying Factor Analysis Combined with Kriging and Information Entropy Theory for Mapping and Evaluating the Stability of Groundwater Quality Variation in Taiwan

    PubMed Central

    Shyu, Guey-Shin; Cheng, Bai-You; Chiang, Chi-Ting; Yao, Pei-Hsuan; Chang, Tsun-Kuo

    2011-01-01

    In Taiwan many factors, whether geological parent materials, human activities, and climate change, can affect the groundwater quality and its stability. This work combines factor analysis and kriging with information entropy theory to interpret the stability of groundwater quality variation in Taiwan between 2005 and 2007. Groundwater quality demonstrated apparent differences between the northern and southern areas of Taiwan when divided by the Wu River. Approximately 52% of the monitoring wells in southern Taiwan suffered from progressing seawater intrusion, causing unstable groundwater quality. Industrial and livestock wastewaters also polluted 59.6% of the monitoring wells, resulting in elevated EC and TOC concentrations in the groundwater. In northern Taiwan, domestic wastewaters polluted city groundwater, resulting in higher NH3-N concentration and groundwater quality instability was apparent among 10.3% of the monitoring wells. The method proposed in this study for analyzing groundwater quality inspects common stability factors, identifies potential areas influenced by common factors, and assists in elevating and reinforcing information in support of an overall groundwater management strategy. PMID:21695030

  19. Information measures for a local quantum phase transition: Lattice fermions in a one-dimensional harmonic trap

    NASA Astrophysics Data System (ADS)

    Zhang, Yicheng; Vidmar, Lev; Rigol, Marcos

    2018-02-01

    We use quantum information measures to study the local quantum phase transition that occurs for trapped spinless fermions in one-dimensional lattices. We focus on the case of a harmonic confinement. The transition occurs upon increasing the characteristic density and results in the formation of a band-insulating domain in the center of the trap. We show that the ground-state bipartite entanglement entropy can be used as an order parameter to characterize this local quantum phase transition. We also study excited eigenstates by calculating the average von Neumann and second Renyi eigenstate entanglement entropies, and compare the results with the thermodynamic entropy and the mutual information of thermal states at the same energy density. While at low temperatures we observe a linear increase of the thermodynamic entropy with temperature at all characteristic densities, the average eigenstate entanglement entropies exhibit a strikingly different behavior as functions of temperature below and above the transition. They are linear in temperature below the transition but exhibit activated behavior above it. Hence, at nonvanishing energy densities above the ground state, the average eigenstate entanglement entropies carry fingerprints of the local quantum phase transition.
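The bipartite entanglement entropy used as the order parameter is the von Neumann entropy of a subsystem's reduced density matrix. The two-qubit Bell state gives the textbook value ln 2 (a minimal illustration, far smaller than the lattice-fermion systems studied above):

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) of a two-qubit system A x B.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)   # indices (a, b, a', b')
rho_A = np.einsum('ikjk->ij', rho)                    # partial trace over B
eig = np.linalg.eigvalsh(rho_A)
S = -sum(e * np.log(e) for e in eig if e > 1e-12)     # von Neumann entropy = ln 2
```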

  20. Information and Entropy

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2007-11-01

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
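A standard worked example of entropic updating is the Brandeis die (illustrative, not taken from this paper): starting from a uniform prior and imposing a mean constraint, the updated distribution is the exponential tilt of the prior, with the Lagrange multiplier fixed by the constraint:

```python
import numpy as np
from scipy.optimize import brentq

# Die faces and uniform prior.
x = np.arange(1, 7)
q = np.full(6, 1 / 6)

def mean_at(lam):
    # Exponentially tilted distribution p_i ∝ q_i * exp(lam * x_i).
    w = q * np.exp(lam * x)
    p = w / w.sum()
    return p @ x

# Solve for the multiplier enforcing the constraint <x> = 4.5.
lam = brentq(lambda l: mean_at(l) - 4.5, -5, 5)
p = q * np.exp(lam * x)
p /= p.sum()
```

Because the prior here is uniform, minimizing relative entropy coincides with MaxEnt; with a non-uniform prior the same tilt construction implements the more general ME updating described above.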

  1. Information Flows? A Critique of Transfer Entropies

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Barnett, Nix; Crutchfield, James P.

    2016-06-01

    A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
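For reference, the quantity under critique with history length 1 is TE(X→Y) = H(Y_t | Y_{t-1}) − H(Y_t | Y_{t-1}, X_{t-1}). A plug-in estimator for discrete series (a sketch, not the authors' code):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))          # (y_t, y_{t-1}, x_{t-1})
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((yt, yp) for yt, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yt, yp, xp), c in c_yyx.items():
        # p(y_t, y_{t-1}, x_{t-1}) * log[ p(y_t|y_{t-1},x_{t-1}) / p(y_t|y_{t-1}) ]
        te += (c / n) * np.log2((c / c_yx[yp, xp]) * (c_y[yp] / c_yy[yt, yp]))
    return te
```

On a lag-1 copy (y_t = x_{t-1}) this estimator reads a full bit of "flow"; the paper's point is that such a value need not correspond to information actually being transported.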

  2. Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi

    2008-05-01

    We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
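The time-dependent pattern entropy can be sketched as follows: binarize the EEG around its median and track the Shannon entropy of short binary patterns inside a sliding window (the word, window, and step lengths here are illustrative, not the paper's parameters):

```python
import numpy as np
from collections import Counter

def pattern_entropy(signal, word=3, window=200, step=50):
    """Entropy (bits) of `word`-bit binary patterns in each sliding window."""
    s = (np.asarray(signal) > np.median(signal)).astype(int)
    out = []
    for start in range(0, len(s) - window + 1, step):
        w = s[start:start + window]
        counts = Counter(tuple(w[i:i + word]) for i in range(window - word + 1))
        total = sum(counts.values())
        h = -sum(c / total * np.log2(c / total) for c in counts.values())
        out.append(h)
    return out
```

A flat (deep-sleep-like) segment collapses onto few patterns and scores low; an irregular (wake-like) segment uses the full pattern repertoire and scores near the maximum of `word` bits, matching the stage-dependence reported above.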

  3. Regional entropy of functional imaging signals varies differently in sensory and cognitive systems during propofol-modulated loss and return of behavioral responsiveness.

    PubMed

    Liu, Xiaolin; Lauer, Kathryn K; Ward, B Douglas; Roberts, Christopher J; Liu, Suyan; Gollapudy, Suneeta; Rohloff, Robert; Gross, William; Xu, Zhan; Chen, Shanshan; Wang, Lubin; Yang, Zheng; Li, Shi-Jiang; Binder, Jeffrey R; Hudetz, Anthony G

    2018-05-08

    The level and richness of consciousness depend on information integration in the brain. Altered interregional functional interactions may indicate disrupted information integration during anesthetic-induced unconsciousness. How anesthetics modulate the amount of information in various brain regions has received less attention. Here, we propose a novel approach to quantify regional information content in the brain by the entropy of the principal components of regional blood oxygen-dependent imaging signals during graded propofol sedation. Fifteen healthy individuals underwent resting-state scans in wakeful baseline, light sedation (conscious), deep sedation (unconscious), and recovery (conscious). Light sedation characterized by lethargic behavioral responses was associated with global reduction of entropy in the brain. Deep sedation with completely suppressed overt responsiveness was associated with further reductions of entropy in sensory (primary and higher sensory plus orbital prefrontal cortices) but not high-order cognitive (dorsal and medial prefrontal, cingulate, parietotemporal cortices and hippocampal areas) systems. Upon recovery of responsiveness, entropy was restored in the sensory but not in high-order cognitive systems. These findings provide novel evidence for a reduction of information content of the brain as a potential systems-level mechanism of reduced consciousness during propofol anesthesia. The differential changes of entropy in the sensory and high-order cognitive systems associated with losing and regaining overt responsiveness are consistent with the notion of "disconnected consciousness", in which a complete sensory-motor disconnection from the environment occurs with preserved internal mentation.
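One reading of the proposed regional measure, entropy of the principal components of a region's signals, is the Shannon entropy of the normalized PCA eigenvalue spectrum. The sketch below follows that interpretation and is not the authors' exact estimator:

```python
import numpy as np

def pc_entropy(data):
    """Shannon entropy (nats) of the normalized PCA eigenvalue spectrum of a
    (timepoints x voxels) data matrix: low when one component dominates,
    high when variance is spread across components."""
    X = data - data.mean(axis=0)
    eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    p = np.clip(eig, 0, None)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```

Under this reading, the reported entropy reductions under propofol correspond to regional signals collapsing onto fewer effective components.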

  4. Complexity and Entropy Analysis of DNMT1 Gene

    USDA-ARS?s Scientific Manuscript database

Background: The application of complexity information on DNA sequence and protein in biological processes is well established in this study. Available sequences for the DNMT1 gene, which is a maintenance methyltransferase responsible for copying DNA methylation patterns to the daughter strands durin...

  5. Differential Entropy Preserves Variational Information of Near-Infrared Spectroscopy Time Series Associated With Working Memory.

    PubMed

Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi

    2018-01-01

Neuroscience research shows a growing interest in the application of Near-Infrared Spectroscopy (NIRS) in analysis and decoding of the brain activity of human subjects. Given the correlation that is observed between the Blood Oxygen Level Dependent (BOLD) responses that are exhibited by the time series data of functional Magnetic Resonance Imaging (fMRI) and the hemoglobin oxy/deoxy-genation that is captured by NIRS, linear models play a central role in these applications. This, in turn, results in adaptation of the feature extraction strategies that are well-suited for discretization of data that exhibit a high degree of linearity, namely, slope and the mean as well as their combination, to summarize the informational contents of the NIRS time series. In this article, we demonstrate that these features are inefficient in capturing the variational information of NIRS data, limiting the reliability and adequacy of conclusions drawn from their results. Alternatively, we propose the linear estimate of differential entropy of these time series as a natural representation of such information. We provide evidence for our claim through comparative analysis of the application of these features on NIRS data pertinent to several working memory tasks as well as naturalistic conversational stimuli.
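The linear (Gaussian) estimate of differential entropy is closed-form: h = ½ ln(2πe σ²), i.e., a monotone function of the sample variance, which is why it preserves the variational information that slope and mean discard. A minimal sketch:

```python
import numpy as np

def gaussian_differential_entropy(x):
    """Linear (Gaussian) estimate of differential entropy in nats:
    h = 0.5 * ln(2 * pi * e * var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x, ddof=1))
```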

  6. EEG entropy measures in anesthesia

    PubMed Central

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate Entropy and Sample Entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), Fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia respectively were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA) as a non-entropy measure was compared. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best.
Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
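Among the best performers here, (Shannon) permutation entropy has a particularly compact definition: the entropy of the ordinal patterns of the embedded series. A generic sketch, not the study's implementation:

```python
import numpy as np
from collections import Counter
from math import factorial

def permutation_entropy(x, order=3, normalize=True):
    """Shannon permutation entropy: entropy (bits) of the distribution of
    ordinal patterns of length `order`, optionally normalized to [0, 1]."""
    x = np.asarray(x)
    patterns = Counter(
        tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum(c / total * np.log2(c / total) for c in patterns.values())
    return h / np.log2(factorial(order)) if normalize else h
```

Because it depends only on rank order, the measure is robust to amplitude drift, one reason the PE family shows the low baseline variability reported above.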

  7. On determining absolute entropy without quantum theory or the third law of thermodynamics

    NASA Astrophysics Data System (ADS)

    Steane, Andrew M.

    2016-04-01

    We employ classical thermodynamics to gain information about absolute entropy, without recourse to statistical methods, quantum mechanics or the third law of thermodynamics. The Gibbs-Duhem equation yields various simple methods to determine the absolute entropy of a fluid. We also study the entropy of an ideal gas and the ionization of a plasma in thermal equilibrium. A single measurement of the degree of ionization can be used to determine an unknown constant in the entropy equation, and thus determine the absolute entropy of a gas. It follows from all these examples that the value of entropy at absolute zero temperature does not need to be assigned by postulate, but can be deduced empirically.
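For comparison, the value such an empirical determination should reproduce is the conventional Sackur-Tetrode absolute entropy of a monatomic ideal gas. Argon at 298.15 K and 1 bar is chosen here as an illustrative case (not a worked example from the paper):

```python
import numpy as np

# Physical constants (SI): Planck, Boltzmann, Avogadro, gas constant.
h, kB, NA, R = 6.62607015e-34, 1.380649e-23, 6.02214076e23, 8.314462618
T, p, m = 298.15, 1.0e5, 39.948e-3 / NA        # argon atomic mass in kg

lam = h / np.sqrt(2 * np.pi * m * kB * T)      # thermal de Broglie wavelength
v = kB * T / p                                 # volume per particle
S_molar = R * (np.log(v / lam**3) + 2.5)       # Sackur-Tetrode, J/(mol K)
# ≈ 154.8 J/(mol K), matching the tabulated standard entropy of argon.
```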

  8. Predicting protein β-sheet contacts using a maximum entropy-based correlated mutation measure.

    PubMed

    Burkoff, Nikolas S; Várnai, Csilla; Wild, David L

    2013-03-01

    The problem of ab initio protein folding is one of the most difficult in modern computational biology. The prediction of residue contacts within a protein provides a more tractable immediate step. Recently introduced maximum entropy-based correlated mutation measures (CMMs), such as direct information, have been successful in predicting residue contacts. However, most correlated mutation studies focus on proteins that have large good-quality multiple sequence alignments (MSA) because the power of correlated mutation analysis falls as the size of the MSA decreases. However, even with small autogenerated MSAs, maximum entropy-based CMMs contain information. To make use of this information, in this article, we focus not on general residue contacts but contacts between residues in β-sheets. The strong constraints and prior knowledge associated with β-contacts are ideally suited for prediction using a method that incorporates an often noisy CMM. Using contrastive divergence, a statistical machine learning technique, we have calculated a maximum entropy-based CMM. We have integrated this measure with a new probabilistic model for β-contact prediction, which is used to predict both residue- and strand-level contacts. Using our model on a standard non-redundant dataset, we significantly outperform a 2D recurrent neural network architecture, achieving a 5% improvement in true positives at the 5% false-positive rate at the residue level. At the strand level, our approach is competitive with the state-of-the-art single methods achieving precision of 61.0% and recall of 55.4%, while not requiring residue solvent accessibility as an input. http://www2.warwick.ac.uk/fac/sci/systemsbiology/research/software/
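Maximum entropy-based measures such as direct information refine the simplest correlated-mutation measure, the mutual information between alignment columns, by correcting for transitive correlations. That baseline MI is only a few lines (the toy alignment below is illustrative):

```python
import math
from collections import Counter

def column_mi(msa, i, j):
    """Mutual information (bits) between columns i and j of a multiple
    sequence alignment given as a list of equal-length strings."""
    n = len(msa)
    pairs = Counter((s[i], s[j]) for s in msa)
    a = Counter(s[i] for s in msa)
    b = Counter(s[j] for s in msa)
    return sum(c / n * math.log2(c * n / (a[x] * b[y]))
               for (x, y), c in pairs.items())
```

Columns that covary perfectly score high MI whether the coupling is direct or transitive; disentangling the two is exactly what the maximum entropy-based CMMs used in this paper provide.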

  9. Autonomous entropy-based intelligent experimental design

    NASA Astrophysics Data System (ADS)

    Malakar, Nabin Kumar

    2011-07-01

    The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. 
This is a major achievement of this thesis, as it allows information-based collaboration between two robotic units toward a common goal in an automated fashion.
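The selection principle, choose the experiment whose distribution of predicted outcomes has maximum entropy, can be illustrated on a toy threshold-finding problem; brute-force search over candidates stands in here for nested entropy sampling:

```python
import numpy as np

def predictive_entropy(prior, likelihoods):
    """Entropy (bits) of the predicted outcome distribution
    p(d) = sum_theta p(d|theta) p(theta); rows of `likelihoods` index theta."""
    p = likelihoods.T @ prior
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Toy design problem (illustrative): unknown threshold theta in {0..9} with a
# uniform prior; probing location x returns outcome 1 iff x >= theta.
prior = np.full(10, 0.1)

def likelihoods_at(x):
    # Row per theta: [p(outcome=1 | theta), p(outcome=0 | theta)].
    return np.array([[1.0, 0.0] if x >= th else [0.0, 1.0] for th in range(10)])

best = max(range(10), key=lambda x: predictive_entropy(prior, likelihoods_at(x)))
```

The winning probe is the one whose outcome is least predictable (here a 50/50 split), which is precisely the experiment with the largest expected information gain.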

  10. The Effects of Aging and Dual Tasking on Human Gait Complexity During Treadmill Walking: A Comparative Study Using Quantized Dynamical Entropy and Sample Entropy.

    PubMed

    Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony

    2018-01-01

Quantized dynamical entropy (QDE) has recently been proposed as a new measure to quantify the complexity of dynamical systems, with the purpose of offering better computational efficiency. This paper further investigates the viability of this method using five different human gait signals. These signals are recorded during normal walking and while performing secondary tasks in two age groups (young and older). The results are compared with the outcomes of the previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented, spatially and temporally normalized signals differs from analyzing the whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in the mediolateral direction is the best signal for showing the gait changes. Moreover, the results suggest that by segmenting data, more information about intrastride dynamical features is obtained. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.

  11. An Extension to Deng's Entropy in the Open World Assumption with an Application in Sensor Data Fusion.

    PubMed

    Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S

    2018-06-11

Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and even a blank field for the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension to Deng’s entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng’s entropy; it degenerates to the Deng entropy in the closed world whenever necessary. To test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues still remain: the necessary properties for a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and the most appropriate fusion frame for sensor data fusion under uncertainty.
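For reference, the closed-world quantity being extended, the Deng entropy of a Dempster-Shafer mass function, is shown below; the open-world extension additionally handles mass assigned to the empty set, which this sketch does not:

```python
import math

def deng_entropy(mass):
    """Deng entropy (bits) of a mass function given as a dict mapping
    frozenset focal elements to mass values:
    E_d = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * math.log2(m / (2**len(A) - 1))
                for A, m in mass.items() if m > 0)
```

When all focal elements are singletons the measure reduces to Shannon entropy; mass on larger sets contributes extra uncertainty through the 2^|A| - 1 term.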

  12. Approximate reversibility in the context of entropy gain, information gain, and complete positivity

    NASA Astrophysics Data System (ADS)

    Buscemi, Francesco; Das, Siddhartha; Wilde, Mark M.

    2016-06-01

    There are several inequalities in physics which limit how well we can process physical systems to achieve some intended goal, including the second law of thermodynamics, entropy bounds in quantum information theory, and the uncertainty principle of quantum mechanics. Recent results provide physically meaningful enhancements of these limiting statements, determining how well one can attempt to reverse an irreversible process. In this paper, we apply and extend these results to give strong enhancements to several entropy inequalities, having to do with entropy gain, information gain, entropic disturbance, and complete positivity of open quantum systems dynamics. Our first result is a remainder term for the entropy gain of a quantum channel. This result implies that a small increase in entropy under the action of a subunital channel is a witness to the fact that the channel's adjoint can be used as a recovery map to undo the action of the original channel. We apply this result to pure-loss, quantum-limited amplifier, and phase-insensitive quantum Gaussian channels, showing how a quantum-limited amplifier can serve as a recovery from a pure-loss channel and vice versa. Our second result regards the information gain of a quantum measurement, both without and with quantum side information. We find here that a small information gain implies that it is possible to undo the action of the original measurement if it is efficient. The result also has operational ramifications for the information-theoretic tasks known as measurement compression without and with quantum side information. Our third result shows that the loss of Holevo information caused by the action of a noisy channel on an input ensemble of quantum states is small if and only if the noise can be approximately corrected on average. 
We finally establish that the reduced dynamics of a system-environment interaction are approximately completely positive and trace preserving if and only if the data processing inequality holds approximately.

  13. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy, or frequency-domain quantities obtained through spectral analysis techniques. The object is to investigate an alternate approach, entirely different from traditional signal processing: to utilize the Shannon entropy as a tool for processing sonar signals, with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, or incoherently, depending upon a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique is based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process with a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true, and decreases when a correlated signal is present (H1). Therefore, the detection strategy is: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) make a decision: H1 is assumed if the difference is large compared to a pre-assigned threshold, and H0 is assumed otherwise. The test statistics differ between the entropies under H0 and H1. We show simulated results for detecting stationary and non-stationary signals in noise, and results on detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
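    The three-step decision rule in this abstract can be sketched numerically. This is an illustrative toy (histogram entropy of the amplitude distribution, fixed shared bins, invented signal parameters and threshold), not the paper's test statistic.

```python
import numpy as np

def histogram_entropy(x, edges):
    """Shannon entropy (nats) of a signal's amplitude histogram.
    Fixed bin edges keep entropies comparable across signals."""
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
n = 4096
edges = np.linspace(-4.0, 4.0, 33)   # shared bins for unit-variance data

# H0: noise alone -- the maximum-entropy case under the energy constraint.
noise = rng.standard_normal(n)
h0 = histogram_entropy(noise / noise.std(), edges)

# H1: a correlated (sinusoidal) component in noise, normalized to the
# same total energy; its amplitude distribution is less spread out.
t = np.arange(n)
mixed = 2.0 * np.sin(0.05 * t) + 0.5 * rng.standard_normal(n)
h1 = histogram_entropy(mixed / mixed.std(), edges)

# Steps (I)-(III): entropy, compare with the H0 maximum, threshold.
threshold = 0.1
signal_present = (h0 - h1) > threshold
```
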

  14. Theoretical information measurement in nonrelativistic time-dependent approach

    NASA Astrophysics Data System (ADS)

    Najafizade, S. A.; Hassanabadi, H.; Zarrinkamar, S.

    2018-02-01

    The information-theoretic measures of the time-dependent Schrödinger equation are investigated via the Shannon information entropy, variance, and local Fisher quantities. In our calculations, we consider the first two states n = 0, 1 and obtain the position Sx(t) and momentum Sp(t) Shannon entropies as well as the Fisher information Ix(t) in position space and Ip(t) in momentum space. Using the Fourier-transformed wave function, we obtain the results in momentum space. Some interesting features of the information entropy densities ρs(x,t) and γs(p,t), as well as the probability densities ρ(x,t) and γ(p,t), for time-dependent states are demonstrated. We establish a general relation between variance and Fisher information. The Bialynicki-Birula-Mycielski inequality is tested and verified for the states n = 0, 1.
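    The Bialynicki-Birula-Mycielski (BBM) inequality mentioned here states, in one dimension and natural units, that Sx + Sp ≥ 1 + ln π, with equality for Gaussian states. A minimal numerical check for the harmonic-oscillator ground state (my choice of example system, with ħ = m = ω = 1):

```python
import numpy as np

def differential_entropy(rho, dx):
    """Differential Shannon entropy (nats) of a density sampled on a
    uniform grid, via a simple Riemann sum."""
    rho = rho / (rho.sum() * dx)         # renormalize on the grid
    mask = rho > 0
    return float(-np.sum(rho[mask] * np.log(rho[mask])) * dx)

x = np.linspace(-12.0, 12.0, 4001)
dx = x[1] - x[0]

# Ground state n = 0: |psi_0(x)|^2 is Gaussian, and its Fourier
# transform gives the same Gaussian momentum density |phi_0(p)|^2.
rho_x = np.pi ** -0.5 * np.exp(-x ** 2)
gamma_p = np.pi ** -0.5 * np.exp(-x ** 2)

S_x = differential_entropy(rho_x, dx)
S_p = differential_entropy(gamma_p, dx)
bbm_bound = 1.0 + np.log(np.pi)   # 1D BBM lower bound on S_x + S_p
```

    The Gaussian ground state saturates the bound, so S_x + S_p should equal 1 + ln π to numerical precision.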

  15. Effect of electrode contact area on the information content of the recorded electrogastrograms: An analysis based on Rényi entropy and Teager-Kaiser Energy

    NASA Astrophysics Data System (ADS)

    Alagumariappan, Paramasivam; Krishnamurthy, Kamalanand; Kandiah, Sundravadivelu; Ponnuswamy, Mannar Jawahar

    2017-06-01

    Electrogastrograms (EGG) are electrical signals originating from the digestive system, which are closely correlated with its mechanical activity. Electrogastrography is an efficient non-invasive method for examining the physiological and pathological states of the human digestive system. Several factors, such as fat conductivity, abdominal thickness, and changes in electrode surface area, affect the quality of the recorded EGG signals. In this work, the effect of variations in the contact area of surface electrodes on the information content of the measured electrogastrograms is analyzed using Rényi entropy and Teager-Kaiser Energy (TKE). Two different circular cutaneous electrodes, with approximate contact areas of 201.14 mm² and 283.64 mm², were adopted, and EGG signals were acquired using the standard three-electrode protocol. Further, the information content of the measured EGG signals was analyzed using the computed values of entropy and energy. Results demonstrate that the information content of the measured EGG signals increases by 6.72% for an increase of 29.09% in the contact area of the surface electrode. Further, it was observed that the average energy increases with contact surface area. This work appears to be of high clinical significance, since the accurate measurement of EGG signals without loss of information content is highly useful for the design of diagnostic assistance tools for automated diagnosis and mass screening of digestive disorders.
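    The two quantities used in this record have compact standard definitions. A hedged sketch (the order alpha, bin count, and test signal are assumptions; the paper's exact estimation settings are not given in the abstract):

```python
import numpy as np

def renyi_entropy(signal, alpha=2.0, bins=64):
    """Renyi entropy of order alpha of a signal's amplitude histogram:
    R_alpha = log(sum p_i^alpha) / (1 - alpha), Shannon as alpha -> 1."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if alpha == 1.0:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def teager_kaiser_energy(signal):
    """Discrete Teager-Kaiser energy operator:
    psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    x = np.asarray(signal, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# For a pure sinusoid A*sin(w*n), the TKE operator is the constant
# A^2 * sin^2(w): it tracks amplitude and frequency jointly.
n = np.arange(1000)
x = 1.5 * np.sin(0.2 * n)
tke = teager_kaiser_energy(x)
```
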

  16. Interaction entropy for protein-protein binding

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoxi; Yan, Yu N.; Yang, Maoyou; Zhang, John Z. H.

    2017-03-01

    Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machines in biology. Highly specific protein-protein binding is quantitatively characterized by the binding free energy, whose accurate calculation from first principles is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. An explicit theoretical derivation of the interaction entropy approach for protein-protein interaction systems is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach, and the results for these systems are compared with those from the standard normal mode method. The applicability of the present method to protein-protein binding, as well as its limitations in numerical computation, are discussed. Our study and analysis of the results provide useful information for extracting the correct entropic contribution in protein-protein binding from molecular dynamics simulations.
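    The interaction entropy approach estimates the entropic term from fluctuations of the interaction energy along a trajectory, via -TΔS = kT ln⟨exp(β ΔE_int)⟩ with ΔE_int the deviation from the mean. A minimal numerical sketch, with synthetic Gaussian samples standing in for MD interaction energies (the distribution, temperature, and sample size are assumptions for the demo):

```python
import numpy as np

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def interaction_entropy(e_int, T=300.0):
    """Entropic contribution -T*dS (kcal/mol) from the interaction-entropy
    formula: -T dS = kT * ln< exp(beta * dE_int) >, where dE_int is the
    fluctuation of the interaction energy about its trajectory mean."""
    beta = 1.0 / (KB * T)
    de = e_int - e_int.mean()
    return KB * T * np.log(np.mean(np.exp(beta * de)))

# Synthetic stand-in for MD samples of the interaction energy (kcal/mol).
rng = np.random.default_rng(7)
e_int = rng.normal(loc=-50.0, scale=1.0, size=200000)
tds = interaction_entropy(e_int)
# For Gaussian fluctuations of std sigma, -T dS -> sigma^2 / (2 kT).
```

    Note the exponential average converges slowly when fluctuations are large relative to kT, which is one of the numerical limitations the abstract alludes to.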

  17. A novel rail defect detection method based on undecimated lifting wavelet packet transform and Shannon entropy-improved adaptive line enhancer

    NASA Astrophysics Data System (ADS)

    Hao, Qiushi; Zhang, Xin; Wang, Yan; Shen, Yi; Makis, Viliam

    2018-07-01

    Acoustic emission (AE) technology is sensitive to subliminal rail defects; however, strong wheel-rail rolling contact noise under high-speed conditions has severely impeded the detection of rail defects with traditional denoising methods. In this context, the paper develops an adaptive detection method for rail cracks, which combines multiresolution analysis with an improved adaptive line enhancer (ALE). To obtain detailed multiresolution information on transient crack signals at low computational cost, a lifting scheme-based undecimated wavelet packet transform is adopted. To capture the impulsive character of crack signals, a Shannon entropy-improved ALE is proposed as a signal enhancement approach, in which Shannon entropy is introduced to improve the cost function. A rail defect detection plan based on the proposed method for high-speed conditions is then put forward. Theoretical analysis and experimental verification demonstrate that the proposed method has superior performance in enhancing the rail defect AE signal and reducing the strong background noise, offering an effective multiresolution approach for rail defect detection under high-speed, strong-noise conditions.

  18. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of an image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, thresholding, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence to determine the cloud statistics. For post-processing analysis, the box-counting fractal method is implemented. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 imagery for accurate cloud statistic estimation.
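    Otsu's method, the thresholding technique this record ultimately selects, can be sketched in a few lines: it picks the threshold that maximizes between-class variance of the gray-level histogram. The bimodal test "image" below is synthetic, standing in for dark ground versus bright cloud pixels.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: the threshold maximizing between-class variance
    w0*w1*(m0 - m1)^2 of the gray-level histogram (equivalent to
    minimizing within-class variance)."""
    hist, edges = np.histogram(gray, bins=256)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = np.cumsum(p)                # class-0 weight up to each bin
    mu = np.cumsum(p * centers)     # class-0 first moment
    mu_total = mu[-1]
    best_t, best_var = centers[0], -1.0
    for k in range(1, 256):
        w0, w1 = w[k - 1], 1.0 - w[k - 1]
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = mu[k - 1] / w0
        m1 = (mu_total - mu[k - 1]) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k - 1]
    return best_t

# Synthetic bimodal image: dark ground pixels vs bright cloud pixels.
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(60, 10, 5000),    # ground
                      rng.normal(200, 12, 3000)])  # cloud
t = otsu_threshold(img)
cloud_fraction = (img > t).mean()
```
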

  19. Maximally Informative Hierarchical Representations of High-Dimensional Data

    DTIC Science & Technology

    2015-05-11

    will be considered discrete but the domain of the X_i's is not restricted. Entropy is defined in the usual way as H(X) ≡ E_X[log 1/p(x)]. We use...natural logarithms so that the unit of information is nats. Higher-order entropies can be constructed in various ways from this standard definition. For...sense, not truly high-dimensional and can be characterized separately. On the other hand, the entropy of X, H(X), can naively be considered the

  20. Analysis of the division of the urban-rural ecotone in the city of Zhuhai

    NASA Astrophysics Data System (ADS)

    Cui, Nan; Zhou, Sulong; Guo, Luo

    2018-02-01

    In this study, a high-resolution remote sensing image of downtown Zhuhai (2010) was used to analyze the division of the urban-rural ecotone. Based on information entropy theory, the study analyzed the characteristics of the ecotone's land use and entropy value distribution. The break entropy values of the inner and outer boundary, as determined by mutation detection, were 0.51 and 0.46, respectively, providing a range for a rough classification of the urban-rural ecotone. The results showed that the boundaries of the ecotone were dynamic and that the landscape turbulence of the urban fringe in the section between rural and urban areas was greater than that of the core area and imagery area of Zhuhai city. We conclude that this study provides technical support for urban planning and administration in the city of Zhuhai.
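    The entropy underlying this kind of ecotone analysis is the Shannon entropy of the land-use composition: mixed-use fringe zones score higher than single-use cores. A minimal sketch with invented area proportions (the paper's actual land-use classes and values are not given in the abstract):

```python
import math

def landuse_entropy(areas):
    """Shannon information entropy (nats) of a land-use composition.
    Mixed-use zones score higher than zones dominated by one class."""
    total = sum(areas)
    return -sum(a / total * math.log(a / total) for a in areas if a > 0)

core_urban = [95, 3, 2]        # built-up area dominates
ecotone = [35, 30, 20, 15]     # mixed residential/farmland/industry/green
```

    Scanning this entropy along an urban-to-rural transect and detecting abrupt changes ("mutation detection") is what yields break values like 0.51 and 0.46 in the study.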

  1. MuTE: a MATLAB toolbox to compare established and novel estimators of the multivariate transfer entropy.

    PubMed

    Montalto, Alessandro; Faes, Luca; Marinazzo, Daniele

    2014-01-01

    A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are in general not completely known. It is therefore safer if the method chosen to analyze these interactions relies on no model or assumption about the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different approaches to evaluate transfer entropy, some of them already proposed, some novel, and present their implementation in a freeware MATLAB toolbox. Applications to simulated and real data are presented.
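    The MuTE toolbox is written in MATLAB; for orientation, the quantity its estimators target can be sketched with the simplest plug-in (histogram) estimator at unit lags: TE(X→Y) = Σ p(y₁, y₀, x₀) log[ p(y₁|y₀, x₀) / p(y₁|y₀) ]. The coupled test pair below is synthetic, and this sketch is not the toolbox's code.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in histogram estimator of transfer entropy TE(X -> Y) with
    unit lags, i.e. the conditional mutual information I(y1; x0 | y0)."""
    x0, y0, y1 = x[:-1], y[:-1], y[1:]
    joint, _ = np.histogramdd(np.column_stack([y1, y0, x0]), bins=bins)
    p_yyx = joint / joint.sum()        # p(y1, y0, x0)
    p_yy = p_yyx.sum(axis=2)           # p(y1, y0)
    p_yx = p_yyx.sum(axis=0)           # p(y0, x0)
    p_y = p_yy.sum(axis=0)             # p(y0)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_yyx)):
        num = p_yyx[i, j, k] * p_y[j]
        den = p_yy[i, j] * p_yx[j, k]
        te += p_yyx[i, j, k] * np.log(num / den)
    return te

# Coupled pair: y follows x with one step of lag, so information flows
# predominantly from X to Y.
rng = np.random.default_rng(5)
n = 20000
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = 0.0
y[1:] = 0.8 * x[:-1] + 0.2 * rng.standard_normal(n - 1)
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

    Binned estimators like this one are biased for short series, which is precisely why MuTE compares several more sophisticated estimators.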

  2. MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy

    PubMed Central

    Montalto, Alessandro; Faes, Luca; Marinazzo, Daniele

    2014-01-01

    A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are in general not completely known. It is therefore safer if the method chosen to analyze these interactions relies on no model or assumption about the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different approaches to evaluate transfer entropy, some of them already proposed, some novel, and present their implementation in a freeware MATLAB toolbox. Applications to simulated and real data are presented. PMID:25314003

  3. Mixture models with entropy regularization for community detection in networks

    NASA Astrophysics Data System (ADS)

    Chang, Zhenhai; Yin, Xianjun; Jia, Caiyan; Wang, Xiaoyang

    2018-04-01

    Community detection is a key exploratory tool in network analysis and has received much attention in recent years. NMM (Newman's mixture model) is one of the best models for exploring a range of network structures, including community, bipartite, and core-periphery structures. However, NMM needs to know the number of communities in advance. Therefore, in this study, we propose an entropy regularized mixture model (called EMM), which is capable of simultaneously inferring the number of communities and identifying the network structure contained in a network. In the model, by minimizing the entropy of the mixing coefficients of NMM using an EM (expectation-maximization) solution, small clusters containing little information can be discarded step by step. The empirical study on both synthetic and real networks has shown that the proposed model EMM is superior to state-of-the-art methods.

  4. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
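    The irreversibility of a Markov chain discussed here is commonly quantified by the information entropy production rate, which vanishes exactly when the chain satisfies detailed balance. A minimal sketch with invented 3-state transition matrices (not the paper's spike-train models):

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic matrix P
    (left eigenvector of eigenvalue 1, normalized)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def entropy_production_rate(P):
    """Entropy production rate (nats/step) of a stationary Markov chain:
    sum_ij pi_i P_ij * ln( pi_i P_ij / (pi_j P_ji) ).
    Zero iff the chain is reversible (detailed balance)."""
    pi = stationary(P)
    ep = 0.0
    for i in range(len(pi)):
        for j in range(len(pi)):
            f, b = pi[i] * P[i, j], pi[j] * P[j, i]
            if f > 0 and b > 0:
                ep += f * np.log(f / b)
    return ep

# A symmetric chain is reversible (EP = 0); a cyclically biased chain
# carries a probability current and produces entropy.
P_rev = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])
P_irrev = np.array([[0.1, 0.8, 0.1],
                    [0.1, 0.1, 0.8],
                    [0.8, 0.1, 0.1]])
```
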

  5. Information-Theoretical Complexity Analysis of Selected Elementary Chemical Reactions

    NASA Astrophysics Data System (ADS)

    Molina-Espíritu, M.; Esquivel, R. O.; Dehesa, J. S.

    We investigate the complexity of selected elementary chemical reactions (namely, the hydrogenic-abstraction reaction and the identity SN2 exchange reaction) by means of the following single and composite information-theoretic measures: disequilibrium (D), exponential entropy (L), Fisher information (I), power entropy (J), the I-D, D-L and I-J planes, and the Fisher-Shannon (FS) and Lopez-Mancini-Calbet (LMC) shape complexities. These quantities, which are functionals of the one-particle density, are computed in both position (r) and momentum (p) spaces. The analysis revealed that the chemically significant regions of these reactions can be identified through most of the single information-theoretic measures and the two-component planes: not only those commonly revealed by the energy, such as the reactant/product (R/P) and the transition state (TS), but also those not present in the energy profile, such as the bond cleavage energy region (BCER), the bond breaking/forming regions (B-B/F) and the charge transfer process (CT). The analysis of the complexities shows that the energy profile of the abstraction reaction bears the same information-theoretical features as the LMC and FS measures, whereas the identity SN2 exchange reaction does not show a simple behavior with respect to them. Most of the chemical features of interest (BCER, B-B/F and CT) are only revealed when particular information-theoretic aspects of localizability (L or J), uniformity (D) and disorder (I) are considered.

  6. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

    As a fundamental concept for describing complex systems, entropy measures have been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, and permutation entropy. This paper proposes a new two-index entropy Sq,δ, and we find that the new two-index entropy is applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexities of simulated series and effectively classifies several financial markets in various regions of the world.

  7. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

    DTIC Science & Technology

    2017-08-21

    distributions, and we discuss some applications for engineered and biological information transmission systems. Keywords: information theory; minimum...of its interpretation as a measure of the amount of information communicable by a neural system to groups of downstream neurons. Previous authors...of the maximum entropy approach. Our results also have relevance for engineered information transmission systems. We show that empirically measured

  8. An improved method for predicting the evolution of the characteristic parameters of an information system

    NASA Astrophysics Data System (ADS)

    Dushkin, A. V.; Kasatkina, T. I.; Novoseltsev, V. I.; Ivanov, S. V.

    2018-03-01

    The article proposes a forecasting method that allows one, given values of entropy and of the error levels of the first and second kind, to determine the allowable horizon for forecasting the development of the characteristic parameters of a complex information system. The main feature of the method is that changes in the characteristic parameters of the information system's development are expressed as increments in its entropy ratios. When the prediction error ratio, that is, the entropy of the system, reaches a predetermined value, the characteristic parameters of the system and the prediction depth in time are estimated. The resulting values of the characteristics will be optimal, since at that moment the system possesses the best ratio of entropy as a measure of the degree of organization and orderliness of its structure. To construct a method for estimating the prediction depth, it is expedient to use the principle of maximum entropy.

  9. Entropy of dynamical social networks

    NASA Astrophysics Data System (ADS)

    Zhao, Kun; Karsai, Marton; Bianconi, Ginestra

    2012-02-01

    Dynamical social networks are evolving rapidly and are highly adaptive. Characterizing the information encoded in social networks is essential to gain insight into their structure, evolution, adaptability and dynamics. Recently, entropy measures have been used to quantify the information in email correspondence, static networks and mobility patterns. Nevertheless, we still lack methods to quantify the information encoded in time-varying dynamical social networks. In this talk we present a model to quantify the entropy of dynamical social networks and use this model to analyze phone-call communication data. We show evidence that the entropy of the phone-call interaction network changes according to circadian rhythms. Moreover, we show that social networks are extremely adaptive and are modified by the use of technologies such as mobile-phone communication. Indeed, the statistics of phone-call duration are described by a Weibull distribution and differ significantly from the distribution of the duration of face-to-face interactions in a conference. Finally, we investigate how much the entropy of dynamical social networks changes in realistic models of phone-call or face-to-face interactions, characterizing in this way different types of human social behavior.
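    The Weibull claim in this record can be checked on duration data with a simple rank-regression fit of the linearized Weibull CDF, ln(-ln(1 - F(t))) = k ln t - k ln λ. The "call durations" below are synthetic samples with an assumed shape of 0.7 and scale of 120 s; this is an illustrative estimator, not the authors' fitting procedure.

```python
import numpy as np

def fit_weibull(samples):
    """Least-squares Weibull fit via the linearized CDF, using median
    ranks for the empirical CDF. Returns (shape k, scale lam)."""
    t = np.sort(np.asarray(samples, dtype=float))
    n = len(t)
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)  # median-rank estimate of F
    X = np.log(t)
    Y = np.log(-np.log(1.0 - f))
    k, c = np.polyfit(X, Y, 1)    # slope k, intercept c = -k*ln(lam)
    return k, np.exp(-c / k)

# Synthetic "call durations": Weibull, shape 0.7 (heavy-tailed, unlike
# the shape k = 1 of an exponential/memoryless termination process).
rng = np.random.default_rng(11)
durations = rng.weibull(0.7, size=20000) * 120.0   # scale = 120 s
k_hat, lam_hat = fit_weibull(durations)
```

    A fitted shape parameter well below 1 indicates a bursty, heavy-tailed duration distribution, which is the signature distinguishing phone calls from the face-to-face data mentioned above.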

  10. Entropy and chemical change. 1: Characterization of product (and reactant) energy distributions in reactive molecular collisions: Information and entropy deficiency

    NASA Technical Reports Server (NTRS)

    Bernstein, R. B.; Levine, R. D.

    1972-01-01

    Optimal means of characterizing the distribution of product energy states resulting from reactive collisions of molecules with restricted distributions of initial states are considered, along with those for characterizing the particular reactant state distribution which yields a given set of product states at a specified total energy. It is suggested to represent the energy-dependence of global-type results in the form of square-faced bar plots, and of data for specific-type experiments as triangular-faced prismatic plots. The essential parameters defining the internal state distribution are isolated, and the information content of such a distribution is put on a quantitative basis. The relationship between the information content, the surprisal, and the entropy of the continuous distribution is established. The concept of an entropy deficiency, which characterizes the specificity of product state formation, is suggested as a useful measure of the deviance from statistical behavior. The degradation of information by experimental averaging is considered, leading to bounds on the entropy deficiency.
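    The surprisal and entropy deficiency defined in this record have simple closed forms: the surprisal of a product channel is -ln(P/P⁰) relative to the statistical prior P⁰, and the entropy deficiency is its average, i.e. the Kullback-Leibler divergence of P from P⁰. A sketch with invented four-channel distributions:

```python
import numpy as np

def surprisal(p, p0):
    """Surprisal of each product channel: I = -ln(P / P0), where P0 is
    the statistical (prior) distribution at the same total energy."""
    return -np.log(p / p0)

def entropy_deficiency(p, p0):
    """Entropy deficiency dS = sum_n P(n) ln(P(n)/P0(n)): the mean
    surprisal, i.e. the KL divergence of P from the prior. It is zero
    only for purely statistical product-state distributions."""
    return float(np.sum(p * np.log(p / p0)))

p0 = np.array([0.4, 0.3, 0.2, 0.1])            # statistical prior
p_specific = np.array([0.05, 0.15, 0.3, 0.5])  # specific (inverted) populations
```

    Being a KL divergence, the entropy deficiency is non-negative, which is what makes it a well-behaved measure of deviation from statistical behavior.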

  11. Reconstructing Historical Changes in Watersheds from Environmental Records: An Information Theory Approach

    NASA Astrophysics Data System (ADS)

    Guerrero, F. J.; Hatten, J. A.; Ruddell, B.; Penaranda, V.; Murillo, P.

    2015-12-01

    About 20% of the world's population lives in watersheds that suffer from water shortage. This situation has complex causes associated with historical changes in watersheds. However, disentangling the roles of key drivers of water availability, such as climate change or land use practices, is challenging. Part of the difficulty resides in the fact that historical analysis is basically a process of empirical reconstruction from available environmental records (e.g. sediment cores or long-term hydrologic time series). We developed a mathematical approach, based on information theory, for historical reconstruction in watersheds. We analyze spectral entropies calculated directly from sediment cores or indirectly from long-term hydrologic time series. Spectral entropy measures changes in the Shannon information of natural patterns (e.g. particle size distributions in lake bottoms or streamflow regimes) as they respond to different drivers. We illustrate the application of our approach with two case studies: the reconstruction of a time series of historical changes from a sediment core, and the detection of hydrologic alterations in watersheds associated with climate and forestry activities. In the first case we calculated spectral entropies from 700 sediment layers encompassing 1500 years of history in Loon Lake (southern Oregon). In the second case, we calculated annual spectral entropies from daily discharge over the last 45 years in two experimental watersheds at the H. J. Andrews LTER site (Oregon Cascades). In Loon Lake, our approach separated, without supervision, earthquakes from landslides and floods. It can also help to improve age models for sedimentary layers. At the H. J. Andrews sites, our approach identified hydrological alterations following a complete clear cut in 1975. It is also helpful for identifying potential long-term impacts of these forestry activities, enhanced by climate change.
    Our results suggest that spectral entropy is central to relating historical structural changes in natural patterns to their timing and relevance in watershed history. A more robust reconstruction of watershed history, and a better identification of the drivers of pressing environmental issues, therefore seem possible within the framework of Shannon's information theory.
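    Spectral entropy of a hydrologic series is the Shannon entropy of its normalized power spectrum: low when variance is concentrated in a few frequencies (a strong seasonal regime), high when spread across the spectrum (a noisy or disturbed regime). A minimal sketch with a synthetic seasonal "streamflow" record (the analysis choices here are illustrative, not the authors'):

```python
import numpy as np

def spectral_entropy(x, normalize=True):
    """Shannon entropy (nats) of the normalized power spectrum of x,
    optionally scaled to [0, 1] by the maximum log(number of bins)."""
    x = np.asarray(x, dtype=float)
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    h = -np.sum(p * np.log(p))
    return h / np.log(len(psd)) if normalize else h

# A strongly seasonal record concentrates spectral power at the annual
# frequency; a white-noise record spreads it out.
rng = np.random.default_rng(2)
t = np.arange(365)
seasonal = 10 + 5 * np.sin(2 * np.pi * t / 365) + 0.5 * rng.standard_normal(365)
noisy = 10 + 5 * rng.standard_normal(365)
se_seasonal = spectral_entropy(seasonal)
se_noisy = spectral_entropy(noisy)
```

    Computed year by year, a jump in this quantity is the kind of signal the study uses to flag disturbances such as the 1975 clear cut.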

  12. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.

  13. Influence of measurement error on Maxwell's demon

    NASA Astrophysics Data System (ADS)

    Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.

    2017-06-01

    In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that incorrect feedback is applied, which further increases the entropy production if a protocol adapted to the expected error rate is not used. We consider the effect of measurement error on a realistic single-electron-box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error ε.
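    The entropy bookkeeping in this abstract can be made concrete for a binary Szilard engine. This is a simple textbook-style accounting under the assumption of an error-adapted protocol, not the paper's single-electron-box analysis: a measurement with error rate ε yields mutual information I = ln 2 - H(ε), while erasing the memory still costs the full k T ln 2, leaving a minimum dissipation of k T H(ε) per cycle.

```python
import math

def binary_entropy(eps):
    """H(eps) = -eps*ln(eps) - (1-eps)*ln(1-eps), in nats."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log(eps) - (1.0 - eps) * math.log(1.0 - eps)

def mutual_information(eps):
    """Information gained by a binary measurement with error rate eps:
    I = ln(2) - H(eps). This bounds the entropy decrease of the system."""
    return math.log(2.0) - binary_entropy(eps)

def dissipated_work(eps, kT=1.0):
    """Minimum irreversible dissipation per cycle (units of kT) under an
    error-adapted protocol: W_diss = kT * (ln 2 - I) = kT * H(eps)."""
    return kT * binary_entropy(eps)
```

    At ε = 0 the cycle can be reversible (I = ln 2 exactly offsets Landauer erasure); at ε = 0.5 the measurement carries no information and the full erasure cost is dissipated.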

  14. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  15. Teaching Entropy Analysis in the First-Year High School Course and Beyond

    ERIC Educational Resources Information Center

    Bindel, Thomas H.

    2004-01-01

    A method is presented that educates and empowers teachers, assists them in incorporating entropy analysis into their curricula, and provides an entropy-analysis unit that can be used in classrooms. The topics teachers can cover, depending on student ability and the teacher's comfort level, are included.

  16. From Ecology to Finance (and Back?): A Review on Entropy-Based Null Models for the Analysis of Bipartite Networks

    NASA Astrophysics Data System (ADS)

    Straka, Mika J.; Caldarelli, Guido; Squartini, Tiziano; Saracco, Fabio

    2018-04-01

    Bipartite networks provide an insightful representation of many systems, ranging from mutualistic networks of species interactions to investment networks in finance. The analyses of their topological structures have revealed the ubiquitous presence of properties which seem to characterize many—apparently different—systems. Nestedness, for example, has been observed in biological plant-pollinator as well as in country-product exportation networks. Due to the interdisciplinary character of complex networks, tools developed in one field, for example ecology, can greatly enrich other areas of research, such as economics and finance, and vice versa. With this in mind, we briefly review several entropy-based bipartite null models that have been recently proposed and discuss their application to real-world systems. The focus on these models is motivated by the fact that they show three very desirable features: analytical character, general applicability, and versatility. In this respect, entropy-based methods have been proven to perform satisfactorily both in providing benchmarks for testing evidence-based null hypotheses and in reconstructing unknown network configurations from partial information. Furthermore, entropy-based models have been successfully employed to analyze ecological as well as economic systems. As an example, the application of entropy-based null models has detected early-warning signals, both in economic and financial systems, of the 2007-2008 world crisis. Moreover, they have revealed a statistically-significant export specialization phenomenon of country export baskets in international trade, a result that seems to reconcile Ricardo's hypothesis in classical economics with recent findings on the (empirical) diversification of industrial production at the national level. Finally, these null models have shown that the information contained in nestedness is already accounted for by the degree sequence of the corresponding graphs.

  17. Breakdown of local information processing may underlie isoflurane anesthesia effects

    PubMed Central

    Sellers, Kristin K.; Priesemann, Viola; Hutt, Axel

    2017-01-01

    The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source—such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC)—as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy—suggesting reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. 
We suggest that source entropy changes must be considered whenever interpreting changes in information transfer as decoupling. PMID:28570661

  18. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.

  19. Shannon entropy and particle decays

    NASA Astrophysics Data System (ADS)

    Carrasco Millán, Pedro; García-Ferrero, M. Ángeles; Llanes-Estrada, Felipe J.; Porras Riojano, Ana; Sánchez García, Esteban M.

    2018-05-01

    We deploy Shannon's information entropy to the distribution of branching fractions in a particle decay. This serves to quantify how important a given new reported decay channel is, from the point of view of the information that it adds to the already known ones. Because the entropy is additive, one can subdivide the set of channels and discuss, for example, how much information the discovery of a new decay branching would add; or subdivide the decay distribution down to the level of individual quantum states (which can be quickly counted by the phase space). We illustrate the concept with some examples of experimentally known particle decay distributions.
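    The additivity property invoked in this abstract can be checked numerically with a short sketch. The branching fractions below are invented for illustration; the point is that resolving one channel into sub-channels increases the entropy by exactly the channel's weight times the entropy of the sub-split.

```python
import math

def shannon_entropy(probs):
    """Shannon information entropy H = -sum p ln p (in nats),
    ignoring zero-probability channels."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical branching fractions of a decay (must sum to 1).
branchings = [0.6, 0.3, 0.1]
h_before = shannon_entropy(branchings)

# Discovering that the 0.1 channel actually splits 70/30 into two
# sub-channels adds information; by additivity the gain is
# 0.1 * H([0.7, 0.3]).
resolved = [0.6, 0.3, 0.07, 0.03]
h_after = shannon_entropy(resolved)
print(h_after - h_before)  # information (nats) added by the new resolution
```

    The same bookkeeping extends down to individual quantum states, as the abstract notes, since each subdivision contributes its weighted sub-entropy.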

  20. Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Barrett, Adam B.; Seth, Anil K.

    2009-12-01

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
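    The equivalence can be demonstrated numerically: for Gaussian variables, transfer entropy equals half the Granger causality F = ln(residual variance of the restricted model / residual variance of the full model). The sketch below is illustrative, not the authors' code; it simulates a bivariate AR(1) process in which x drives y and estimates both quantities with ordinary least squares.

```python
import math
import random

def ols_resid_var(X, y):
    """Residual variance of the least-squares fit y ~ X, where X is a
    list of regressor columns, solved via the normal equations."""
    k, n = len(X), len(y)
    A = [[sum(X[i][t] * X[j][t] for t in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[i][t] * y[t] for t in range(n)) for i in range(k)]
    for i in range(k):                       # Gaussian elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            for m in range(i, k):
                A[j][m] -= f * A[i][m]
            b[j] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):             # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    resid = [y[t] - sum(beta[i] * X[i][t] for i in range(k)) for t in range(n)]
    return sum(r * r for r in resid) / n

random.seed(1)
n = 20000
x, y = [0.0], [0.0]
for _ in range(n):                           # x(t) influences y(t) with lag 1
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.4 * y[-1] + 0.3 * x[-2] + random.gauss(0, 1))

ynow, ylag, xlag = y[1:], y[:-1], x[:-1]
var_restricted = ols_resid_var([ylag], ynow)        # y's own past only
var_full = ols_resid_var([ylag, xlag], ynow)        # plus x's past
gc = math.log(var_restricted / var_full)            # Granger causality F(x -> y)
te = 0.5 * gc                                       # transfer entropy (nats)
print(gc, te)
```

    Adding x's past shrinks the prediction error of y, so F is positive; under Gaussian assumptions the information-theoretic and autoregressive measures differ only by the factor of one half.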

  1. Combined Population Dynamics and Entropy Modelling Supports Patient Stratification in Chronic Myeloid Leukemia

    NASA Astrophysics Data System (ADS)

    Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas

    2016-04-01

    Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution we show that combined population dynamic modelling and CML patient biopsy genomic analysis enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient data informed model enables quantitative approximation of individual patients’ disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent and complementary to conventional measures of CML disease burden and prognosis.

  2. Entropy and complexity analysis of hydrogenic Rydberg atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Rosa, S. (Departamento de Fisica Aplicada II, Universidad de Sevilla, 41012 Sevilla); Toranzo, I. V.

    The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramer-Rao, Fisher-Shannon, and Lopez Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l = 0), circular (l = n - 1), and quasicircular (l = n - 2) states is explicitly done.

  3. Affine Isoperimetry and Information Theoretic Inequalities

    ERIC Educational Resources Information Center

    Lv, Songjun

    2012-01-01

    There are essential connections between the isoperimetric theory and information theoretic inequalities. In general, the Brunn-Minkowski inequality and the entropy power inequality, as well as the classical isoperimetric inequality and the classical entropy-moment inequality, turn out to be equivalent in some certain sense, respectively. Based on…

  4. Application of online measures to monitor and evaluate multiplatform fusion performance

    NASA Astrophysics Data System (ADS)

    Stubberud, Stephen C.; Kowalski, Charlene; Klamer, Dale M.

    1999-07-01

    A primary concern of multiplatform data fusion is assessing the quality and utility of data shared among platforms. Constraints such as platform and sensor capability and task load necessitate development of an on-line system that computes a metric to determine which other platform can provide the best data for processing. To determine data quality, we are implementing an approach based on entropy coupled with intelligent agents. Entropy measures quality of processed information such as localization, classification, and ambiguity in measurement-to-track association. Lower entropy scores imply less uncertainty about a particular target. When new information is provided, we compute the level of improvement a particular track obtains from one measurement to another. The measure permits us to evaluate the utility of the new information. We couple entropy with intelligent agents that provide two main data gathering functions: estimation of another platform's performance and evaluation of the new measurement data's quality. Both functions result from the entropy metric. The intelligent agent on a platform makes an estimate of another platform's measurement and provides it to its own fusion system, which can then incorporate it, for a particular target. A resulting entropy measure is then calculated and returned to its own agent. From this metric, the agent determines a perceived value of the offboard platform's measurement. If the value is satisfactory, the agent requests the measurement from the other platform, usually by interacting with the other platform's agent. Once the actual measurement is received, again entropy is computed and the agent assesses its estimation process and refines it accordingly.
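    The entropy scoring described in this abstract can be sketched as a Bayesian update on a track's classification distribution, with the drop in entropy serving as the measure of the new data's utility. All numbers and names below are assumptions for illustration, not the authors' system.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a classification distribution over target types."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bayes_update(prior, likelihood):
    """Posterior over target classes after incorporating a measurement
    with the given per-class likelihoods."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

# Hypothetical track: three candidate target classes, currently ambiguous.
prior = [0.4, 0.35, 0.25]
# Likelihood of the offered off-board measurement under each class (assumed).
likelihood = [0.8, 0.15, 0.05]

posterior = bayes_update(prior, likelihood)
gain = entropy(prior) - entropy(posterior)   # utility of the new data
print(round(gain, 3))   # positive: the measurement reduced ambiguity
```

    An agent estimating an off-board platform's measurement would compute this expected gain first and request the actual data only when the gain is worth the communication cost.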

  5. How to Calculate Renyi Entropy from Heart Rate Variability, and Why it Matters for Detecting Cardiac Autonomic Neuropathy.

    PubMed

    Cornforth, David J; Tarvainen, Mika P; Jelinek, Herbert F

    2014-01-01

    Cardiac autonomic neuropathy (CAN) is a disease that involves nerve damage leading to an abnormal control of heart rate. An open question is to what extent this condition is detectable from heart rate variability (HRV), which provides information only on successive intervals between heart beats, yet is non-invasive and easy to obtain from a three-lead ECG recording. A variety of measures may be extracted from HRV, including time domain, frequency domain, and more complex non-linear measures. Among the latter, Renyi entropy has been proposed as a suitable measure that can be used to discriminate CAN from controls. However, all entropy methods require estimation of probabilities, and there are a number of ways in which this estimation can be made. In this work, we calculate Renyi entropy using several variations of the histogram method and a density method based on sequences of RR intervals. In all, we calculate Renyi entropy using nine methods and compare their effectiveness in separating the different classes of participants. We found that the histogram method using single RR intervals yields an entropy measure that is either incapable of discriminating CAN from controls, or provides little information that could not be gained from the SD of the RR intervals. In contrast, probabilities calculated using a density method based on sequences of RR intervals yield an entropy measure that provides good separation between groups of participants and provides information not available from the SD. The main contribution of this work is that different approaches to calculating probability may affect the success of detecting disease. Our results bring new clarity to the methods used to calculate the Renyi entropy in general, and in particular, to the successful detection of CAN.
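    The single-interval histogram method that the authors found least informative can be sketched in a few lines; the RR values and bin width below are invented for illustration. Note the general property that Renyi entropy is non-increasing in its order alpha.

```python
import math
from collections import Counter

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha (nats); alpha = 1 recovers Shannon."""
    if alpha == 1.0:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def histogram_probs(values, bin_width):
    """Estimate probabilities by binning values into fixed-width bins."""
    counts = Counter(int(v / bin_width) for v in values)
    n = len(values)
    return [c / n for c in counts.values()]

# Hypothetical RR intervals in ms (single-interval histogram method).
rr = [812, 795, 801, 830, 786, 805, 799, 822, 808, 791]
probs = histogram_probs(rr, bin_width=10)
for a in (0.5, 1.0, 2.0):
    print(a, renyi_entropy(probs, a))
```

    The density method the paper favours replaces the single-interval histogram with probabilities estimated over short sequences of RR intervals, which captures temporal structure a marginal histogram discards.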

  6. How to Calculate Renyi Entropy from Heart Rate Variability, and Why it Matters for Detecting Cardiac Autonomic Neuropathy

    PubMed Central

    Cornforth, David J.;  Tarvainen, Mika P.; Jelinek, Herbert F.

    2014-01-01

    Cardiac autonomic neuropathy (CAN) is a disease that involves nerve damage leading to an abnormal control of heart rate. An open question is to what extent this condition is detectable from heart rate variability (HRV), which provides information only on successive intervals between heart beats, yet is non-invasive and easy to obtain from a three-lead ECG recording. A variety of measures may be extracted from HRV, including time domain, frequency domain, and more complex non-linear measures. Among the latter, Renyi entropy has been proposed as a suitable measure that can be used to discriminate CAN from controls. However, all entropy methods require estimation of probabilities, and there are a number of ways in which this estimation can be made. In this work, we calculate Renyi entropy using several variations of the histogram method and a density method based on sequences of RR intervals. In all, we calculate Renyi entropy using nine methods and compare their effectiveness in separating the different classes of participants. We found that the histogram method using single RR intervals yields an entropy measure that is either incapable of discriminating CAN from controls, or provides little information that could not be gained from the SD of the RR intervals. In contrast, probabilities calculated using a density method based on sequences of RR intervals yield an entropy measure that provides good separation between groups of participants and provides information not available from the SD. The main contribution of this work is that different approaches to calculating probability may affect the success of detecting disease. Our results bring new clarity to the methods used to calculate the Renyi entropy in general, and in particular, to the successful detection of CAN. PMID:25250311

  7. Generating intrinsically disordered protein conformational ensembles from a Markov chain

    NASA Astrophysics Data System (ADS)

    Cukier, Robert I.

    2018-03-01

    Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies: MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without very large entropy that might lead to a too high entropy penalty.
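    The singlet entropy and pair mutual information of a dichotomic Markov chain can be computed in closed form for two states. The transition probabilities below are invented for illustration and are not taken from the paper.

```python
import math

# Dichotomic (two-rotamer) Markov chain: T[i][j] = P(next = j | current = i).
# Values are illustrative assumptions.
T = [[0.8, 0.2],
     [0.3, 0.7]]

# Stationary (singlet) distribution of a two-state chain, in closed form.
pi0 = T[1][0] / (T[0][1] + T[1][0])
pi = [pi0, 1.0 - pi0]

def h(probs):
    """Shannon entropy (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Pair mutual information between neighbouring residues:
# MI = sum_ij pi_i T_ij ln( T_ij / pi_j ); zero iff neighbours are independent.
mi = sum(pi[i] * T[i][j] * math.log(T[i][j] / pi[j])
         for i in range(2) for j in range(2))
print(h(pi), mi)   # singlet entropy and neighbour dependence
```

    In the paper's terminology, a MoRF-like chain would combine a large singlet entropy with a large MI; setting each row of T equal to pi removes the dependence and drives MI to zero.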

  8. Expected Shannon Entropy and Shannon Differentiation between Subpopulations for Neutral Genes under the Finite Island Model.

    PubMed

    Chao, Anne; Jost, Lou; Hsieh, T C; Ma, K H; Sherwin, William B; Rollins, Lee Ann

    2015-01-01

    Shannon entropy H and related measures are increasingly used in molecular ecology and population genetics because (1) unlike measures based on heterozygosity or allele number, these measures weigh alleles in proportion to their population fraction, thus capturing a previously-ignored aspect of allele frequency distributions that may be important in many applications; (2) these measures connect directly to the rich predictive mathematics of information theory; (3) Shannon entropy is completely additive and has an explicitly hierarchical nature; and (4) Shannon entropy-based differentiation measures obey strong monotonicity properties that heterozygosity-based measures lack. We derive simple new expressions for the expected values of the Shannon entropy of the equilibrium allele distribution at a neutral locus in a single isolated population under two models of mutation: the infinite allele model and the stepwise mutation model. Surprisingly, this complex stochastic system for each model has an entropy expressible as a simple combination of well-known mathematical functions. Moreover, entropy- and heterozygosity-based measures for each model are linked by simple relationships that are shown by simulations to be approximately valid even far from equilibrium. We also identify a bridge between the two models of mutation. We apply our approach to subdivided populations which follow the finite island model, obtaining the Shannon entropy of the equilibrium allele distributions of the subpopulations and of the total population. We also derive the expected mutual information and normalized mutual information ("Shannon differentiation") between subpopulations at equilibrium, and identify the model parameters that determine them. We apply our measures to data from the common starling (Sturnus vulgaris) in Australia. Our measures provide a test for neutrality that is robust to violations of equilibrium assumptions, as verified on real-world data from starlings.
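    The mutual information between allele identity and subpopulation label follows from the standard entropy decomposition: pooled entropy minus the weighted within-subpopulation entropies. The allele frequencies below are invented; this sketch illustrates the decomposition, not the paper's equilibrium expressions.

```python
import math

def shannon(probs):
    """Shannon entropy (nats) of an allele frequency distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical allele frequencies in two equal-sized subpopulations.
sub1 = [0.7, 0.2, 0.1]
sub2 = [0.1, 0.3, 0.6]
pooled = [(a + b) / 2 for a, b in zip(sub1, sub2)]

# Mutual information between allele identity and subpopulation label:
# I = H(pooled) - mean within-subpopulation entropy (equal weights here).
mi = shannon(pooled) - 0.5 * (shannon(sub1) + shannon(sub2))

# Normalized mutual information ("Shannon differentiation"): divide by the
# entropy of the subpopulation label, ln 2 for two equal subpopulations.
norm_mi = mi / math.log(2)
print(mi, norm_mi)
```

    By concavity of the entropy, the mutual information is non-negative and vanishes only when the subpopulations share identical allele frequencies, which is what makes it usable as a differentiation measure.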

  9. Discrete-time entropy formulation of optimal and adaptive control problems

    NASA Technical Reports Server (NTRS)

    Tsai, Yweting A.; Casiello, Francisco A.; Loparo, Kenneth A.

    1992-01-01

    The discrete-time version of the entropy formulation of optimal control problems developed by G. N. Saridis (1988) is discussed. Given a dynamical system, the uncertainty in the selection of the control is characterized by the probability distribution (density) function which maximizes the total entropy. The equivalence between the optimal control problem and the optimal entropy problem is established, and the total entropy is decomposed into a term associated with the certainty equivalent control law, the entropy of estimation, and the so-called equivocation of the active transmission of information from the controller to the estimator. This provides a useful framework for studying the certainty equivalent and adaptive control laws.

  10. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    PubMed Central

    Weiss, Brandi A.; Dardick, William

    2015-01-01

    This article introduces an entropy-based measure of data–model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data–model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data–model fit to assess how well logistic regression models classify cases into observed categories. PMID:29795897
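    The kind of entropy index used in mixture modeling, which the article adapts to logistic regression, can be sketched as follows. This is an illustrative version of the standard relative-entropy index, not necessarily the article's exact measure; the posterior values are invented.

```python
import math

def classification_entropy(posteriors):
    """Mixture-model-style relative entropy index:
    E = 1 + (1 / (n ln K)) * sum_i sum_k p_ik ln p_ik,
    for n cases and K classes. E approaches 1 for crisp classification
    and 0 for maximal fuzziness. (Illustrative; the article's exact
    measure may differ.)"""
    n = len(posteriors)
    k = len(posteriors[0])
    s = sum(p * math.log(p) for row in posteriors for p in row if p > 0)
    return 1.0 + s / (n * math.log(k))

# Hypothetical predicted class probabilities from two fitted models.
crisp = [[0.99, 0.01], [0.02, 0.98], [0.95, 0.05]]
fuzzy = [[0.55, 0.45], [0.48, 0.52], [0.60, 0.40]]
print(classification_entropy(crisp))  # close to 1: well-separated groups
print(classification_entropy(fuzzy))  # close to 0: poor separation
```

    Used alongside conventional fit statistics, a low index flags a model whose predicted probabilities hover near the decision boundary even if overall accuracy looks acceptable.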

  11. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression.

    PubMed

    Weiss, Brandi A; Dardick, William

    2016-12-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data-model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data-model fit to assess how well logistic regression models classify cases into observed categories.

  12. Fisher information and Rényi entropies in dynamical systems.

    PubMed

    Godó, B; Nagy, Á

    2017-07-01

    The link between the Fisher information and Rényi entropies is explored. The relationship rests on a thermodynamical formalism built on the Fisher information with a parameter, β, interpreted as the inverse temperature. The Fisher heat capacity is defined and found to be sensitive to changes of higher order than the analogous quantity in the conventional formulation.

  13. Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms - Examples of Entropy Increase? Nonsense!

    NASA Astrophysics Data System (ADS)

    Lambert, Frank L.

    1999-10-01

    The order of presentation in this article is unusual; its conclusion is first. This is done because the title entails text and lecture examples so familiar to all teachers that most may find a preliminary discussion redundant. Conclusion The dealer shuffling cards in Monte Carlo or Las Vegas, the professor who mixes the papers and books on a desk, the student who tosses clothing about his or her room, the fuel for the huge cranes and trucks that would be necessary to move the nonbonded stones of the Great Pyramid of Cheops all across Egypt: each undergoes physical, thermodynamic entropy increase in these specific processes. The thermodynamic entropy change from human-defined order to disorder in the giant Egyptian stones themselves, in the clothing and books in a room or papers on a desk, and in the millions of cards in the world's casinos is precisely the same: Zero. K. G. Denbigh succinctly summarizes the case against identifying changes in position in one macro object or in a group with physical entropy change (1): If one wishes to substantiate a claim or a guess that some particular process involves a change of thermodynamic or statistical entropy, one should ask oneself whether there exists a reversible heat effect, or a change in the number of accessible energy eigenstates, pertaining to the process in question. If not, there has been no change of physical entropy (even though there may have been some change in our "information"). Thus, simply changing the location of everyday macro objects from an arrangement that we commonly judge as orderly (relatively singular) to one that appears disorderly (relatively probable) is a "zero change" in the thermodynamic entropy of the objects because the number of accessible energetic microstates in any of them has not been changed. Finally, although it may appear obvious, a collection of ordinary macro things does not constitute a thermodynamic system as does a group of microparticles. 
The crucial difference is that such things are not ceaselessly colliding and exchanging energy under the thermal dominance of their environment as are microparticles. A postulate can be derived from this fundamental criterion: The movement of macro objects from one location to another by an external agent involves no change in the objects' physical (thermodynamic) entropy. The agent of movement undergoes a thermodynamic entropy increase in the process. A needed corollary, considering the number of erroneous statements in print, is: There is no spontaneous tendency in groups of macro objects to become disorderly or randomly scattered. The tendency in nature toward increased entropy does not reside in the arrangement of any chemically unchanging objects but rather in the external agent moving them. It is the sole cause of their transport toward more probable locations. The Error There is no more widespread error in chemistry and physics texts than the identification of a thermodynamic entropy increase with a change in the pattern of a group of macro objects. The classic example is that of playing cards. Shuffling a new deck is widely said to result in an increase in entropy in the cards. This erroneous impression is often extended to all kinds of things when they are changed from humanly designated order to what is commonly considered disorder: a group of marbles to scattered marbles, racked billiard balls to a broken rack, neat groups of papers on a desk to the more usual disarray. In fact, there is no thermodynamic entropy change in the objects in the "after" state compared to the "before". Further, such alterations in arrangement have been used in at least one text to support a "law" that is stated, "things move spontaneously in the direction of maximum chaos or disorder".1 The foregoing examples and "law" seriously mislead the student by focusing on macro objects that are only a passive part of a system. 
They are deceptive in omitting the agent that actually is changed in entropy as it follows the second law: that is, whatever energy source is involved in the process of moving the static macro objects to more probable random locations. Entropy is increased in the shuffler's and in the billiard cue holder's muscles, in the tornado's wind and the earthquake's stress, not in the objects shifted. Chemically unchanged macro things do not spontaneously, by some innate tendency, leap or even slowly lurch toward visible disorder. Energy concentrated in the ATP of a person's muscles or in wind or in earth-stress is ultimately responsible for moving objects and is partly degraded to diffuse thermal energy as a result. Discussion To discover the origin of this text and lecture error, a brief review of some aspects of physical entropy is useful. Of course, the original definition of Clausius, dS = δq(rev)/T, applies to a system plus its surroundings, and the Gibbsian relation, ΔS = (ΔH − ΔG)/T, pertains to a system at constant pressure and constant temperature. Only in the present discussion (where an unfortunate term, information "entropy", must be dealt with) would it be necessary to emphasize that temperature is integral to any physical thermodynamic entropy change described via Clausius or Gibbs. In our era we are surer even than they could be that temperature is indispensable in understanding thermodynamic entropy because it indicates the thermal environment of microparticles in a system. That environment sustains the intermolecular motions whereby molecules continuously interchange energy and are able to access the wide range of energetic microstates available to them. It is this ever-present thermal motion that makes spontaneous change possible, even at constant temperature and in the absence of chemical reaction, because it is the mechanism whereby molecules can occupy new energetic microstates if the boundaries of a system are altered. 
Prime examples of such spontaneous change are diffusion in fluids and the expansion of gases into vacua, both fundamentally due to the additional translational energetic microstates in the enlarged systems. (Of course, spontaneous endothermic processes ranging from phase changes to chemical reactions are also due to mobile energy-transferring microparticles that can access new rotational and vibrational as well as translational energetic microstates, in the thermal surroundings as well as in the chemical system.) Misinterpretation of the Boltzmann equation for entropy change, ΔS = R ln(number of energetic microstates after change/number of energetic microstates before change), is the source of much of the confusion regarding the behavior of macro objects. R, the gas constant, embeds temperature in Boltzmann's entropy as integrally as in the Clausius or Gibbs relation and, to repeat, the environment's temperature indicates the degree of energy dispersion that makes access to available energy microstates possible. The Boltzmann equation is revelatory in uniting the macrothermodynamics of classic Clausian entropy with what has been described above as the behavior of a system of microparticles occupying energetic microstates. In discussing how probability enters the Boltzmann equation (i.e., the number of possible energetic microstates and their occupancy by microparticles), texts and teachers often enumerate the many ways a few symbolic molecules can be distributed on lines representing energy levels, or in similar cells or boxes, or with combinations of playing cards. Of course these are good analogs for depicting an energetic microsystem. However, even if there are warnings by the instructor, the use of playing cards as a model is probably intellectually hazardous; these objects are so familiar that the student can too easily warp this macro analog of a microsystem into an example of actual entropic change in the cards. 
Another major source of confusion about entropy change as the result of simply rearranging macro objects comes from information theory "entropy".2 Claude E. Shannon's 1948 paper began the era of quantification of information and in it he adopted the word "entropy" to name the quantity that his equation defined (2). This occurred because a friend, the brilliant mathematician John von Neumann, told him "call it entropy. No one knows what entropy really is, so in a debate you will always have the advantage" (3). Wryly funny for that moment, Shannon's unwise acquiescence has produced enormous scientific confusion due to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications (4, 5). Certainly most non-experts hearing of the widely touted information "entropy" would assume its overlap with thermodynamic entropy. However, the great success of information "entropy" has been in areas totally divorced from experimental chemistry, whose objective macro results are dependent on the behavior of energetic microparticles. Nevertheless, many instructors in chemistry have the impression that information "entropy" is not only relevant to the calculations and conclusions of thermodynamic entropy but may change them. This is not true. There is no invariant function corresponding to energy embedded in each of the hundreds of equations of information "entropy" and thus no analog of temperature universally present in them. In contrast, inherent in all thermodynamic entropy, temperature is the objective indicator of a system's energetic state. Probability distributions in information "entropy" represent human selections; therefore information "entropy" is strongly subjective. Probability distributions in thermodynamic entropy are dependent on the microparticulate and physicochemical nature of the system; limited thereby, thermodynamic entropy is strongly objective. 
This is not to say that the extremely general mathematics of information theory cannot be modified ad hoc and further specifically constrained to yield results that are identical to Gibbs' or Boltzmann's relations (6). This may be important theoretically but it is totally immaterial here; such a modification simply supports conventional thermodynamic results without changing them: no lesser nor any greater thermodynamic entropy. The point is that information "entropy" in all of its myriad nonphysicochemical forms as a measure of information or abstract communication has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects because such information "entropy" does not deal with microparticles whose perturbations are related to temperature.3 Even those who are very competent chemists and physicists have become confused when they have melded or mixed information "entropy" in their consideration of physical thermodynamic entropy. This is shown by the results in textbooks and by the lectures of professors found on the Internet.1 Overall then, how did such an error (concerning entropy changes in macro objects that are simply moved) become part of mainstream instruction, being repeated in print even by distinguished physicists and chemists? The modern term for distorting a photograph, morphing, is probably the best answer. Correct statements of statistical thermodynamics have been progressively altered so that their dependence on the energetics of atoms and molecules is obliterated for the nonprofessional reader and omitted by some author-scientists. The morphing process can be illustrated by the sequence of statements 1 to 4 below.

    1. Isolated systems of atoms and molecules spontaneously tend to occupy all available energetic microstates thermally accessible to them and tend to change to any arrangement or macro state that provides more such microstates. Thus, spontaneous change is toward a condition of greater probability of energy dispersion. After a spontaneous change, the logarithm of the ratio of the number of available microstates to those in the prior state is related to the system's increase in entropy by a constant, R/N per mole. It is the presence of temperature in R that distinguishes physical entropy from all information "entropy".
    2. Systems of atoms and molecules spontaneously tend to go from a less probable state in which they are relatively "orderly" (few microstates, low entropy) to one that is more probable in which they are "disorderly" (many microstates, high entropy).
    3. Spontaneous (natural) processes go from order to disorder and entropy increases. Order is what we see as neat, patterned. Disorder is what we see as messy, random.
    4. Things naturally become disorderly.
    Most chemists would read statements 3 and 4 with the implications from statement 1 or 2 automatically present in their thoughts. Undoubtedly, a majority are aware that 3 really applies only to atomic and molecular order and disorder. However, most students and nonscientists lack such a background. As is evident from their writing, some physicists err because they ignore or forget the dependence of physical thermodynamic entropy upon atomic and molecular energetic states. The following recent quote from a distinguished physicist is in the middle of a discussion of the arrangement of books in a young person's room: "The subjective terms 'order' and 'disorder' are quantified by association with probability, and identified, respectively, with low and high entropy." He then informs his readers that "in the natural course of events the room has a tendency to become more disordered."1 (Italics added.) The phrase "in the natural course of events" implies to a chemist that energy from some source (the internal energy of a substance in a chemical process, or the external energy involved as an agent transports a solid object) can powerfully affect macro things in a room, but is this true for most readers? "Naturally" to many students and nonscientists even has the inappropriate connotation "of course" or "as would be expected". Certainly, it does not properly imply a truly complex set of conditions, such as "in nature, where objects can be pushed around by people or windstorms or hail or quakes and where the substances from which they are made can change if their activation energies are exceeded"! 
Thus, errors in texts and lectures have arisen because of two types of category slippage: (i) misapplying thermodynamic entropy evaluations (proper in the domain of energetic atoms and molecules) to visibly static macro objects that are unaltered packages of such microparticles, and (ii) misinterpretation of words such as natural (whose common meaning lacks a sense of the external energy needed for any agent to move large visible things). Why is there no permanent thermodynamic entropy change in a macro object after it has been transported from one location to another or when a group of them is scattered randomly? Thermodynamic entropy changes are dependent on changes in the dispersal of energy in the microstates of atoms and molecules. A playing card or a billiard ball or a blue sock is a package, a sealed closed system, of energetic microstates whose numbers and types are not changed when the package is transported to a new site from a starting place. All macro objects are like this. Their relocation to different sites does not create any permanent additional energetic microstates within them. (Any temporary heating effects due to the initiation and cessation of the movement are lost to the environment.) Thus, there is a zero change in their physical entropy as a result of being moved.

Acknowledgments

I thank Norman C. Craig and the reviewers for invaluable criticism of the original manuscript.

Notes
    1. Singling out individual authors from many could appear invidious. Thus, references to quotations or errors are not listed.
    2. It is important that information "entropy" always be in quotes whenever thermodynamic entropy is mentioned in the same article or book. Otherwise, the unfortunate confusion of the past half-century is amplified rather than attenuated.
    3. It has been said that an information "entropy" equation, compared to those for thermodynamic entropy, may look like a duck but, without any empowering thermal energy, it can't quack like a duck or walk like a duck.
    Literature Cited
    1. Denbigh, K. G. Br. J. Philos. Sci. 1989, 40, 323-332.
    2. Shannon, C. E. Bell System Tech. J. 1948, 27, 329-423, 623-656.
    3. Tribus, M.; McIrvine, E. C. Sci. Am. 1971, 225, 180.
    4. Including: Golembiewski, R. T. Handbook of Organizational Behavior; Dekker: New York, 1993.
    5. Hillman, C. Entropy on the World Wide Web; http://www.math.washington.edu/~hillman/entropy.html. Extensive references to print and WWW sites, primarily information "entropy" but thermodynamic entropy in the physical sciences in http://www.math.washington.edu/~hillman/Entropy/phys.html (The "e" in entropy is case sensitive in these two URLs.). A European mirror site (via China) is at http://www.unibas.ch/mdpi/entropy (accessed June 1999).
    6. Tribus, M. Am. Sci. 1966, 54, 201-210.

  14. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN (translated): We present a methodology for analyzing homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS

  15. Shannon information entropy in the canonical genetic code.

    PubMed

    Nemzer, Louis R

    2017-02-21

    The Shannon entropy measures the expected information value of messages. As with thermodynamic entropy, the Shannon entropy is only defined within a system that identifies at the outset the collections of possible messages, analogous to microstates, that will be considered indistinguishable macrostates. This fundamental insight is applied here for the first time to amino acid alphabets, which group the twenty common amino acids into families based on chemical and physical similarities. To evaluate these schemas objectively, a novel quantitative method is introduced based on the inherent redundancy in the canonical genetic code. Each alphabet is taken as a separate system that partitions the 64 possible RNA codons, the microstates, into families, the macrostates. By calculating the normalized mutual information, which measures the reduction in Shannon entropy conveyed by single-nucleotide messages, groupings that best leverage this aspect of fault tolerance in the code are identified. The relative importance of properties related to protein folding - like hydropathy and size - and function, including side-chain acidity, can also be estimated. This approach allows the quantification of the average information value of nucleotide positions, which can shed light on the coevolution of the canonical genetic code with the tRNA-protein translation mechanism. Copyright © 2016 Elsevier Ltd. All rights reserved.
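
As a hedged sketch of the kind of calculation this abstract describes (the codon observations and two-family alphabet below are toy stand-ins, not the paper's actual schemas), mutual information can be computed from observed (nucleotide, family) pairs as I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a list of (x, y) observations."""
    xs = Counter(x for x, _ in pairs)
    ys = Counter(y for _, y in pairs)
    xy = Counter(pairs)
    return (shannon_entropy(list(xs.values()))
            + shannon_entropy(list(ys.values()))
            - shannon_entropy(list(xy.values())))

# Toy example: first-position nucleotide vs. a hypothetical two-family
# amino-acid alphabet (illustrative only).
observations = [("A", "family1"), ("A", "family1"),
                ("G", "family2"), ("G", "family2")]
mi = mutual_information(observations)  # 1.0 bit: the base determines the family
```

Normalizing mi by the entropy of the family variable would give the normalized mutual information the abstract refers to.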

  16. Exploring stability of entropy analysis for signal with different trends

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Li, Jin; Wang, Jun

    2017-03-01

    Because of environmental disturbances and instrumentation effects, measured signals usually carry trends, which make it difficult to accurately capture signal complexity; choosing stable and effective analysis methods is therefore important. In this paper, we applied two entropy measures, base-scale entropy and approximate entropy, to analyze signal complexity, and studied the effect of trends on ideal signals and on heart rate variability (HRV) signals; specifically, the linear, periodic, and power-law trends that are likely to occur in actual signals. The results show that approximate entropy is unstable when different trends are embedded into the signals, so it is not suitable for analyzing signals with trends. By contrast, base-scale entropy has preferable stability and accuracy for signals with different trends, making it an effective method for analyzing actual signals.
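
A minimal sketch of approximate entropy (Pincus' ApEn, one of the two measures this abstract compares; base-scale entropy is not sketched here, and the parameters m and r below are illustrative choices):

```python
import math

def approx_entropy(series, m=2, r=0.2):
    """Approximate entropy ApEn(m, r): a regularity statistic.
    Low values indicate a highly regular (predictable) series."""
    n = len(series)
    def phi(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        total = 0.0
        for t1 in templates:
            # Count templates within Chebyshev distance r (self-match included).
            c = sum(1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            total += math.log(c / len(templates))
        return total / len(templates)
    return phi(m) - phi(m + 1)

# A strictly periodic series is maximally regular (ApEn near 0); embedding a
# linear trend, as the abstract warns, shifts the estimate even though the
# underlying fluctuations are unchanged.
regular = [0, 1] * 50
trended = [x + 0.05 * i for i, x in enumerate(regular)]
```

In practice r is usually taken as a fraction (e.g. 0.2) of the series' standard deviation, which is one reason trends distort the estimate.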

  17. The gravity dual of Rényi entropy.

    PubMed

    Dong, Xi

    2016-08-12

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.

  19. Nonparametric entropy estimation using kernel densities.

    PubMed

    Lake, Douglas E

    2009-01-01

    The entropy of experimental data from the biological and medical sciences provides additional information over summary statistics. Calculating entropy involves estimates of probability density functions, which can be effectively accomplished using kernel density methods. Kernel density estimation has been widely studied and a univariate implementation is readily available in MATLAB. The traditional definition of Shannon entropy is part of a larger family of statistics, called Renyi entropy, which are useful in applications that require a measure of the Gaussianity of data. Of particular note is the quadratic entropy which is related to the Friedman-Tukey (FT) index, a widely used measure in the statistical community. One application where quadratic entropy is very useful is the detection of abnormal cardiac rhythms, such as atrial fibrillation (AF). Asymptotic and exact small-sample results for optimal bandwidth and kernel selection to estimate the FT index are presented and lead to improved methods for entropy estimation.
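
A minimal sketch of the plug-in approach this abstract describes: estimate the density with a Gaussian kernel, then average -log f_hat at the sample points (the resubstitution estimator; the bandwidth below is an illustrative choice, not an optimal one from the paper):

```python
import math
import random

def kde_entropy(samples, bandwidth):
    """Resubstitution differential entropy estimate, -mean(log f_hat(x_i)),
    with a Gaussian kernel density estimate (evaluated point left in)."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
    def f_hat(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return -sum(math.log(f_hat(x)) for x in samples) / n

random.seed(0)
data = [random.gauss(0, 1) for _ in range(500)]
# For N(0, 1) the true differential entropy is 0.5*ln(2*pi*e), about 1.42 nats;
# kernel smoothing biases the estimate slightly.
h = kde_entropy(data, bandwidth=0.4)
```

The quadratic (Renyi order-2) entropy mentioned in the abstract replaces the log-average with -log of the mean estimated density, but the kernel machinery is the same.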

  20. Heart rate variability analysis based on time-frequency representation and entropies in hypertrophic cardiomyopathy patients.

    PubMed

    Clariá, F; Vallverdú, M; Baranowski, R; Chojnowska, L; Caminal, P

    2008-03-01

    In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning. Furthermore, classification for sudden cardiac death on patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. In this way, the suitability of linear and nonlinear measures was studied to assess the HRV. These measures were based on time-frequency representation (TFR) and on Shannon and Rényi entropies, and compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high risk patients, after aborted sudden cardiac death (SCD); 51 low risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in three frequency bands: very low frequency band (VLF, 0-0.04 Hz), low frequency band (LF, 0.04-0.15 Hz) and high frequency band (HF, 0.15-0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV better classified the groups of subjects than traditional HRV parameters, and nonlinear measures further improved group classification. It was observed that entropies calculated in the HF band showed the highest statistically significant levels comparing the HCM group and the control group, p-value < 0.0005. The values of entropy measures calculated in the HCM group presented lower values, indicating a decreasing of complexity, than those calculated from the control group. 
Moreover, similar behavior was observed comparing high and low risk of premature death, the values of the entropy being lower in high risk patients, p-value < 0.05, indicating an increase of predictability. Furthermore, measures from information entropy, but not from TFR, seem to be useful for enhanced risk stratification in HCM patients with an increased risk of sudden cardiac death.

  1. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia

    NASA Astrophysics Data System (ADS)

    Li, Duan; Li, Xiaoli; Liang, Zhenhu; Voss, Logan J.; Sleigh, Jamie W.

    2010-08-01

    Electroencephalogram (EEG) monitoring of the effect of anesthetic drugs on the central nervous system has long been used in anesthesia research. Several methods based on nonlinear dynamics, such as permutation entropy (PE), have been proposed to analyze EEG series during anesthesia. However, these measures are still single-scale based and may not completely describe the dynamical characteristics of complex EEG series. In this paper, a novel measure combining multiscale PE information, called CMSPE (composite multi-scale permutation entropy), was proposed for quantifying the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were used to select the parameters for the multiscale PE analysis: embedding dimension m, lag τ and scales to be integrated into the CMSPE index. Then, the CMSPE index and raw single-scale PE index were applied to EEG recordings from 18 patients who received sevoflurane anesthesia. Pharmacokinetic/pharmacodynamic (PKPD) modeling was used to relate the measured EEG indices and the anesthetic drug concentration. Prediction probability (Pk) statistics and correlation analysis with the response entropy (RE) index, derived from the spectral entropy (M-entropy module; GE Healthcare, Helsinki, Finland), were investigated to evaluate the effectiveness of the new proposed measure. It was found that raw single-scale PE was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the raw PE, with the absolute slopes of linearly fitted response versus time plots of 0.12 (0.09-0.15) and 0.10 (0.06-0.13), respectively. The prediction probability Pk of 0.86 (0.85-0.88) and 0.85 (0.80-0.86) for CMSPE and raw PE indicated that the CMSPE index correlated well with the underlying anesthetic effect. 
The correlation coefficient for the comparison between the CMSPE index and RE index of 0.84 (0.80-0.88) was significantly higher than the raw PE index of 0.75 (0.66-0.84). The results show that the CMSPE outperforms the raw single-scale PE in reflecting the sevoflurane drug effect on the central nervous system.
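
Single-scale permutation entropy, the building block of the CMSPE index, can be sketched as follows (a hedged illustration; the paper's composite multiscale variant additionally coarse-grains the EEG over several scales and combines the per-scale values):

```python
import math
from collections import Counter

def permutation_entropy(series, m=3, lag=1):
    """Normalized permutation entropy: Shannon entropy of ordinal patterns
    of embedding dimension m, divided by log(m!) so the result is in [0, 1]."""
    patterns = Counter()
    for i in range(len(series) - (m - 1) * lag):
        window = series[i:i + m * lag:lag]
        # Ordinal pattern: the ranks of the m values in the window.
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))

# A monotone ramp has a single ordinal pattern (entropy 0); irregular data
# approaches 1.
pe = permutation_entropy(list(range(100)))  # 0.0
```

The embedding dimension m and lag correspond to the parameters the paper selects from simulated awake, light, and deep anesthesia EEG.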

  2. Entropy, energy, and entanglement of localized states in bent triatomic molecules

    NASA Astrophysics Data System (ADS)

    Yuan, Qiang; Hou, Xi-Wen

    2017-05-01

    The dynamics of quantum entropy, energy, and entanglement is studied for various initial states in an important spectroscopic Hamiltonian of the bent triatomic molecules H2O, D2O, and H2S. The total quantum correlation is quantified in terms of the mutual information, and the entanglement by the concurrence, borrowed from the theory of quantum information. The Pauli entropy and the intramolecular energy usually used in the theory of molecules are calculated to establish a possible relationship between both theories. Sections of two quantities among these four quantities are introduced to visualize this relationship. Analytic and numerical simulations demonstrate that if an initial state is taken to be the stretch- or the bend-vibrationally localized state, the mutual information, the Pauli entropy, and the concurrence are dominantly positively correlated, while they are dominantly anti-correlated with the interacting energy among three anharmonic vibrational modes. In particular, such correlation is more distinct for the localized state with high excitations in the bending mode. The nice quasi-periodicity of those quantities in the D2O molecule reveals that this molecule, prepared in the localized state in the stretching or the bending mode, may be better suited for molecular quantum computation. However, the dynamical correlations of those quantities behave irregularly for the dislocalized states. Moreover, the hierarchy of the mutual information and the Pauli entropy is explicitly proved. Quantum entropy and energy in every vibrational mode are investigated. Thereby, the relation between bipartite and tripartite entanglements is discussed as well. Those are useful for the understanding of quantum correlations in high-dimensional states in polyatomic molecules from quantum information and intramolecular dynamics.

  3. Analysis of interacting entropy-corrected holographic and new agegraphic dark energies

    NASA Astrophysics Data System (ADS)

    Ranjit, Chayan; Debnath, Ujjal

    In the present work, we assume a flat FRW universe filled with dark matter and dark energy that interact with each other. For the dark energy model, we consider the entropy-corrected HDE (ECHDE) model and the entropy-corrected NADE (ECNADE) model. For the entropy-corrected models, we assume logarithmic and power-law corrections. For the ECHDE model, the length scale L is taken to be the Hubble horizon and the future event horizon. The ωde-ωde‧ analysis for these different horizons is discussed.

  4. Entropy in statistical energy analysis.

    PubMed

    Le Bot, Alain

    2009-03-01

    In this paper, the second principle of thermodynamics is discussed in the framework of statistical energy analysis (SEA). It is shown that the "vibrational entropy" and the "vibrational temperature" of sub-systems only depend on the vibrational energy and the number of resonant modes. A SEA system can be described as a thermodynamic system slightly out of equilibrium. In steady-state condition, the entropy exchanged with exterior by sources and dissipation exactly balances the production of entropy by irreversible processes at interface between SEA sub-systems.

  5. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior estimates, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than conventional methods. Forecasted streamflow is compared with predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
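
The objective that minimum-relative-entropy methods minimize (subject to autocorrelation constraints) is the Kullback-Leibler divergence between the estimated distribution and the prior; a minimal sketch, with toy distributions rather than streamflow spectra:

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats between two discrete
    distributions given as probability lists of equal length."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A uniform prior (as in one of the abstract's prior assumptions) versus a
# peaked estimate: the divergence measures the information gained.
uniform = [0.25] * 4
peaked = [0.7, 0.1, 0.1, 0.1]
d = relative_entropy(peaked, uniform)
```

Choosing the exponential or Gaussian priors mentioned in the abstract simply changes q in this expression.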

  6. The effect of orthostatic stress on multiscale entropy of heart rate and blood pressure.

    PubMed

    Turianikova, Zuzana; Javorka, Kamil; Baumert, Mathias; Calkovska, Andrea; Javorka, Michal

    2011-09-01

    Cardiovascular control acts over multiple time scales, which introduces a significant amount of complexity to heart rate and blood pressure time series. Multiscale entropy (MSE) analysis has been developed to quantify the complexity of a time series over multiple time scales. In previous studies, MSE analyses identified impaired cardiovascular control and increased cardiovascular risk in various pathological conditions. Despite the increasing acceptance of the MSE technique in clinical research, information underpinning the involvement of the autonomic nervous system in the MSE of heart rate and blood pressure is lacking. The objective of this study is to investigate the effect of orthostatic challenge on the MSE of heart rate and blood pressure variability (HRV, BPV) and the correlation between MSE (complexity measures) and traditional linear (time and frequency domain) measures. MSE analysis of HRV and BPV was performed in 28 healthy young subjects on 1000 consecutive heart beats in the supine and standing positions. Sample entropy values were assessed on scales of 1-10. We found that MSE of heart rate and blood pressure signals is sensitive to changes in autonomic balance caused by postural change from the supine to the standing position. The effect of orthostatic challenge on heart rate and blood pressure complexity depended on the time scale under investigation. Entropy values did not correlate with the mean values of heart rate and blood pressure and showed only weak correlations with linear HRV and BPV measures. In conclusion, the MSE analysis of heart rate and blood pressure provides a sensitive tool to detect changes in autonomic balance as induced by postural change.
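
The MSE procedure this abstract applies (coarse-grain the series, then compute sample entropy at each scale) can be sketched as follows; m = 2 and scales 1-10 mirror the abstract, while r = 0.2 is an illustrative fixed tolerance (in practice r is usually a fraction of the original series' standard deviation):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy SampEn(m, r) = -ln(A/B), counting template pairs
    within Chebyshev distance r and excluding self-matches."""
    n = len(series)
    def matches(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b)
                       for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(series, scale):
    """Non-overlapping window averages: step 1 of multiscale entropy."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def multiscale_entropy(series, scales=range(1, 11), m=2, r=0.2):
    """SampEn of the coarse-grained series at each scale."""
    return [sample_entropy(coarse_grain(series, s), m, r) for s in scales]
```

Applied to 1000-beat RR series, as in the study, this yields one complexity value per scale rather than a single number.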

  7. Differential Entropy Preserves Variational Information of Near-Infrared Spectroscopy Time Series Associated With Working Memory

    PubMed Central

    Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi

    2018-01-01

    Neuroscience research shows a growing interest in the application of Near-Infrared Spectroscopy (NIRS) in analysis and decoding of the brain activity of human subjects. Given the correlation that is observed between the Blood Oxygen Level Dependent (BOLD) responses that are exhibited by the time series data of functional Magnetic Resonance Imaging (fMRI) and the hemoglobin oxy/deoxy-genation that is captured by NIRS, linear models play a central role in these applications. This, in turn, results in adaptation of the feature extraction strategies that are well-suited for discretization of data that exhibit a high degree of linearity, namely, slope and the mean as well as their combination, to summarize the informational contents of the NIRS time series. In this article, we demonstrate that these features are inefficient in capturing the variational information of NIRS data, limiting the reliability and the adequacy of the conclusion on their results. Alternatively, we propose the linear estimate of differential entropy of these time series as a natural representation of such information. We provide evidence for our claim through comparative analysis of the application of these features on NIRS data pertinent to several working memory tasks as well as naturalistic conversational stimuli. PMID:29922144
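
The proposed feature, a linear (Gaussian) estimate of differential entropy, has a closed form in the sample variance; a minimal sketch of that estimator (an illustration consistent with the abstract, not the authors' exact pipeline):

```python
import math

def gaussian_differential_entropy(samples):
    """Linear (Gaussian) estimate of differential entropy in nats:
    H = 0.5 * ln(2*pi*e*var), using the unbiased sample variance."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return 0.5 * math.log(2 * math.pi * math.e * var)
```

Unlike slope or mean features, this value grows with the spread of the signal, which is the "variational information" the article argues those features miss.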

  8. Analysis of informational redundancy in the protein-assembling machinery

    NASA Astrophysics Data System (ADS)

    Berkovich, Simon

    2004-03-01

    Entropy analysis of the DNA structure does not reveal a significant departure from randomness, indicating lack of informational redundancy. This signifies the absence of a hidden meaning in the genome text and supports the 'barcode' interpretation of DNA given in [1]. Lack of informational redundancy is a characteristic property of an identification label rather than of a message of instructions. Yet randomness of DNA has to induce non-random structures of the proteins. Protein synthesis is a two-step process: transcription into RNA with gene splicing, and formation of a structure of amino acids. Entropy estimations, performed by A. Djebbari, show typical values of redundancy of the biomolecules along these pathways: DNA gene about 4%, proteins 15-40%. In gene expression, the RNA copy carries the same information as the original DNA template. Randomness is essentially eliminated only at the step of the protein creation by a degenerate code. According to [1], the significance of the substitution of U for T with a subsequent gene splicing is that these transformations result in a different pattern of RNA oscillations, so the vital DNA communications are protected against extraneous noise coming from the protein making activities. 1. S. Berkovich, "On the 'barcode' functionality of DNA, or the Phenomenon of Life in the Physical Universe", Dorrance Publishing Co., Pittsburgh, 2003
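
First-order informational redundancy of a sequence, the quantity this abstract's entropy analysis estimates, can be sketched as 1 - H/H_max (the toy sequences below are illustrative, not genomic data):

```python
import math
from collections import Counter

def redundancy(sequence, alphabet_size=None):
    """Informational redundancy 1 - H/H_max, where H is the first-order
    Shannon entropy of symbol frequencies and H_max = log2(alphabet size)."""
    counts = Counter(sequence)
    k = alphabet_size or len(counts)
    total = len(sequence)
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return 1 - h / math.log2(k)

# A maximally mixed stretch has redundancy 0; a repetitive one is much higher.
r_random = redundancy("ACGTACGTACGT")                      # 0.0
r_repeat = redundancy("AAAAAAAACGTA", alphabet_size=4)     # well above 0
```

Higher-order (block) entropies would also capture correlations between neighboring bases, which single-symbol frequencies miss.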

  9. Probing the History of Galaxy Clusters with Metallicity and Entropy Measurements

    NASA Astrophysics Data System (ADS)

    Elkholy, Tamer Yohanna

    Galaxy clusters are the largest gravitationally bound objects found today in our Universe. The gas they contain, the intra-cluster medium (ICM), is heated to temperatures in the approximate range of 1 to 10 keV, and thus emits X-ray radiation. Studying the ICM through the spatial and spectral analysis of its emission returns the richest information about both the overall cosmological context which governs the formation of clusters, as well as the physical processes occurring within. The aim of this thesis is to learn about the history of the physical processes that drive the evolution of galaxy clusters, through careful, spatially resolved measurements of their metallicity and entropy content. A sample of 45 nearby clusters observed with Chandra is analyzed to produce radial density, temperature, entropy and metallicity profiles. The entropy profiles are computed to larger radial extents than in previous Chandra analyses. The results of this analysis are made available to the scientific community in an electronic database. Comparing metallicity and entropy in the outskirts of clusters, we find no signature on the entropy profiles of the ensemble of supernovae that produced the observed metals. In the centers of clusters, we find that the metallicities of high-mass clusters are much less dispersed than those of low-mass clusters. A comparison of metallicity with the regularity of the X-ray emission morphology suggests that metallicities in low-mass clusters are more susceptible to increase from violent events such as mergers. We also find that the variation in the stellar-to-gas mass ratio as a function of cluster mass can explain the variation of central metallicity with cluster mass, only if we assume that there is a constant level of metallicity for clusters of all masses, above which the observed galaxies add more metals in proportion to their mass. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs)

  10. Quantification of knee vibroarthrographic signal irregularity associated with patellofemoral joint cartilage pathology based on entropy and envelope amplitude measures.

    PubMed

    Wu, Yunfeng; Chen, Pinnan; Luo, Xin; Huang, Hui; Liao, Lifang; Yao, Yuchen; Wu, Meihong; Rangayyan, Rangaraj M

    2016-07-01

    Injury of knee joint cartilage may result in pathological vibrations between the articular surfaces during extension and flexion motions. The aim of this paper is to analyze and quantify vibroarthrographic (VAG) signal irregularity associated with articular cartilage degeneration and injury in the patellofemoral joint. The symbolic entropy (SyEn), approximate entropy (ApEn), fuzzy entropy (FuzzyEn), and the mean, standard deviation, and root-mean-squared (RMS) values of the envelope amplitude, were utilized to quantify the signal fluctuations associated with articular cartilage pathology of the patellofemoral joint. The quadratic discriminant analysis (QDA), generalized logistic regression analysis (GLRA), and support vector machine (SVM) methods were used to perform signal pattern classifications. The experimental results showed that the patients with cartilage pathology (CP) exhibit larger SyEn and ApEn, but smaller FuzzyEn, than the healthy subjects (HS), with statistical significance under the Wilcoxon rank-sum test (p<0.01). The mean, standard deviation, and RMS values computed from the amplitude difference between the upper and lower signal envelopes are also consistently and significantly larger (p<0.01) for the group of CP patients than for the HS group. The SVM based on the entropy and envelope amplitude features can provide superior classification performance as compared with QDA and GLRA, with an overall accuracy of 0.8356, sensitivity of 0.9444, specificity of 0.8, Matthews correlation coefficient of 0.6599, and an area of 0.9212 under the receiver operating characteristic curve. The SyEn, ApEn, and FuzzyEn features can provide useful information about pathological VAG signal irregularity based on different entropy metrics. The statistical parameters of signal envelope amplitude can be used to characterize the temporal fluctuations related to the cartilage pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Entropy studies on beam distortion by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-09-01

    When a beam propagates through atmospheric turbulence over a known distance, the target beam profile deviates from the projected profile of the beam on the receiver. Intuitively, the unwanted distortion provides information about the atmospheric turbulence. This information is crucial for guiding adaptive optics (AO) systems and improving beam propagation results. In this paper, we propose an entropy study based on the image from a plenoptic sensor to provide a measure of information content of atmospheric turbulence. In general, lower levels of atmospheric turbulence will have a smaller information size while higher levels of atmospheric turbulence will cause significant expansion of the information size, which may exceed the maximum capacity of a sensing system and jeopardize the reliability of an AO system. Therefore, the entropy function can be used to analyze the turbulence distortion and evaluate performance of AO systems. In fact, it serves as a metric that can quantify the improvement in beam correction at each iteration step. In addition, it points out the limitation of an AO system at optimized correction as well as the minimum information needed for wavefront sensing to achieve certain levels of correction. In this paper, we will demonstrate the definition of the entropy function and how it relates to evaluating the information (randomness) carried by atmospheric turbulence.

  12. Image encryption based on a delayed fractional-order chaotic logistic system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na

    2012-05-01

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
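
    The information entropy analysis mentioned above is commonly computed as the Shannon entropy of the cipher image's pixel histogram, which should approach the maximum of 8 bits per pixel for an 8-bit image. A minimal sketch of that check (not the authors' code; function and variable names are illustrative):

```python
import numpy as np

def byte_entropy(img: np.ndarray) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit image."""
    counts = np.bincount(img.ravel().astype(np.uint8), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty histogram bins: 0*log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
cipher_like = rng.integers(0, 256, size=(256, 256))  # ideal cipher image: uniform bytes
plain_like = np.full((256, 256), 128)                # constant plain image

print(byte_entropy(cipher_like))  # near the 8-bit maximum
print(byte_entropy(plain_like))   # zero: a constant image carries no information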

  13. To Control False Positives in Gene-Gene Interaction Analysis: Two Novel Conditional Entropy-Based Approaches

    PubMed Central

    Lin, Meihua; Li, Haoli; Zhao, Xiaolei; Qin, Jiheng

    2013-01-01

    Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue to identify the missing genetic components that cannot be detected using current single-point association analysis. Recently, several model-free methods (e.g. the commonly used information based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they potentially risk inflated false positive error rates, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to challenge this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, could maintain a consistently correct false positive rate. In the scenarios for a common disease, our proposed metrics achieved better or comparable control of false positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics in a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising as alternatives to current model-based approaches for detecting genuine epistatic effects. PMID:24339984

  14. Entanglement entropy of electronic excitations.

    PubMed

    Plasser, Felix

    2016-05-21

    A new perspective on correlation effects in electronically excited states is provided through quantum information theory. The entanglement between the electron and hole quasiparticles is examined, and it is shown that the related entanglement entropy can be computed from the eigenvalue spectrum of the well-known natural transition orbital (NTO) decomposition. Non-vanishing entanglement is obtained whenever more than one NTO pair is involved, i.e., in the case of a multiconfigurational or collective excitation. An important implication is that in the case of entanglement it is not possible to gain a complete description of the state character from the orbitals alone, but more specific analysis methods are required to decode the mutual information between the electron and hole. Moreover, the newly introduced number of entangled states is an important property by itself, giving information about excitonic structure. The utility of the formalism is illustrated in the cases of the excited states of two interacting ethylene molecules, the conjugated polymer para-phenylene vinylene, and the naphthalene molecule.
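
    As the abstract states, the entanglement entropy follows directly from the NTO eigenvalue spectrum: for normalized NTO weights λ_i, S = -Σ λ_i log λ_i. A sketch under that assumption (the weight values below are made up for illustration):

```python
import math

def nto_entanglement_entropy(weights):
    """Electron-hole entanglement entropy (bits) from the NTO eigenvalue
    spectrum; `weights` are the squared NTO amplitudes, normalized here."""
    total = sum(weights)
    lam = [w / total for w in weights if w > 0]
    return -sum(l * math.log2(l) for l in lam)

print(nto_entanglement_entropy([1.0]))       # single NTO pair: no entanglement
print(nto_entanglement_entropy([0.5, 0.5]))  # two equal NTO pairs: 1 bit
```

    A state dominated by a single NTO pair gives zero entropy; entanglement appears only once several NTO pairs contribute, matching the abstract's statement.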

  15. Alternate entropy measure for assessing volatility in financial markets.

    PubMed

    Bose, Ranjan; Hamacher, Kay

    2012-11-01

    We propose two alternate information theoretical approaches to assess non-Gaussian fluctuations in the return dynamics of financial markets. Specifically, we use superinformation, which is a measure of the disorder of the entropy of time series. We argue on theoretical grounds for its usefulness and show that it can be applied effectively for analyzing returns. A study of stock market data for over five years has been carried out using this approach. We show how superinformation helps to identify and classify important signals in the time series. The financial crisis of 2008 comes out very clearly in the superinformation plots. In addition, we introduce the super mutual information. Distinct super mutual information signatures are observed that might be used to mitigate idiosyncratic risk. The universality of our approach has been tested by carrying out the analysis for the 100 stocks listed in the S&P100 index. The average superinformation values for the S&P100 stocks correlate very well with the VIX.

  16. Alternate entropy measure for assessing volatility in financial markets

    NASA Astrophysics Data System (ADS)

    Bose, Ranjan; Hamacher, Kay

    2012-11-01

    We propose two alternate information theoretical approaches to assess non-Gaussian fluctuations in the return dynamics of financial markets. Specifically, we use superinformation, which is a measure of the disorder of the entropy of time series. We argue on theoretical grounds for its usefulness and show that it can be applied effectively for analyzing returns. A study of stock market data for over five years has been carried out using this approach. We show how superinformation helps to identify and classify important signals in the time series. The financial crisis of 2008 comes out very clearly in the superinformation plots. In addition, we introduce the super mutual information. Distinct super mutual information signatures are observed that might be used to mitigate idiosyncratic risk. The universality of our approach has been tested by carrying out the analysis for the 100 stocks listed in the S&P100 index. The average superinformation values for the S&P100 stocks correlate very well with the VIX.

  17. Entanglement entropy of dispersive media from thermodynamic entropy in one higher dimension.

    PubMed

    Maghrebi, M F; Reid, M T H

    2015-04-17

    A dispersive medium becomes entangled with zero-point fluctuations in the vacuum. We consider an arbitrary array of material bodies weakly interacting with a quantum field and compute the quantum mutual information between them. It is shown that the mutual information in D dimensions can be mapped to classical thermodynamic entropy in D+1 dimensions. As a specific example, we compute the mutual information both analytically and numerically for a range of separation distances between two bodies in D=2 dimensions and find a logarithmic correction to the area law at short separations. A key advantage of our method is that it allows the strong subadditivity property to be easily verified.

  18. Topological entanglement Rényi entropy and reduced density matrix structure.

    PubMed

    Flammia, Steven T; Hamma, Alioscia; Hughes, Taylor L; Wen, Xiao-Gang

    2009-12-31

    We generalize the topological entanglement entropy to a family of topological Rényi entropies parametrized by a parameter alpha, in an attempt to find new invariants for distinguishing topologically ordered phases. We show that, surprisingly, all topological Rényi entropies are the same, independent of alpha for all nonchiral topological phases. This independence shows that topologically ordered ground-state wave functions have reduced density matrices with a certain simple structure, and no additional universal information can be extracted from the entanglement spectrum.

  19. Topological Entanglement Rényi Entropy and Reduced Density Matrix Structure

    NASA Astrophysics Data System (ADS)

    Flammia, Steven T.; Hamma, Alioscia; Hughes, Taylor L.; Wen, Xiao-Gang

    2009-12-01

    We generalize the topological entanglement entropy to a family of topological Rényi entropies parametrized by a parameter α, in an attempt to find new invariants for distinguishing topologically ordered phases. We show that, surprisingly, all topological Rényi entropies are the same, independent of α for all nonchiral topological phases. This independence shows that topologically ordered ground-state wave functions have reduced density matrices with a certain simple structure, and no additional universal information can be extracted from the entanglement spectrum.

  20. Transmitting Information by Propagation in an Ocean Waveguide: Computation of Acoustic Field Capacity

    DTIC Science & Technology

    2015-06-17

    progress, Eq. (4) is evaluated in terms of the differential entropy h. The integrals can be identified as differential entropy terms by expanding the log...all random vectors p with a given covariance matrix, the entropy of p is maximized when p is ZMCSCG since a normal distribution maximizes the... entropy over all distributions with the same covariance [9, 18], implying that this is the optimal distribution on s as well. In addition, of all the

  1. Entropy and econophysics

    NASA Astrophysics Data System (ADS)

    Rosser, J. Barkley

    2016-12-01

    Entropy is a central concept of statistical mechanics, which is the main branch of physics that underlies econophysics, the application of physics concepts to understand economic phenomena. It enters econophysics in two ways: ontologically, through the Second Law of Thermodynamics, which drives the world economy from its ecological foundations as solar energy passes through food chains in a dissipative process of rising entropy, with production fundamentally involving the replacement of lower-entropy energy states with higher-entropy ones; and mathematically, as the entropy of information theory becomes the basis for modeling financial market dynamics as well as income and wealth distribution dynamics. It also provides the basis for an alternative view of stochastic price equilibria in economics, as well as providing a crucial link between econophysics and sociophysics, keeping in mind the essential unity of the various concepts of entropy.

  2. Universal Entropy of Word Ordering Across Linguistic Families

    PubMed Central

    Montemurro, Marcelo A.; Zanette, Damián H.

    2011-01-01

    Background The language faculty is probably the most distinctive feature of our species, and endows us with a unique ability to exchange highly structured information. In written language, information is encoded by the concatenation of basic symbols under grammatical and semantic constraints. As is also the case in other natural information carriers, the resulting symbolic sequences show a delicate balance between order and disorder. That balance is determined by the interplay between the diversity of symbols and their specific ordering in the sequences. Here we used entropy to quantify the contribution of different organizational levels to the overall statistical structure of language. Methodology/Principal Findings We computed a relative entropy measure to quantify the degree of ordering in word sequences from languages belonging to several linguistic families. While a direct estimation of the overall entropy of language yielded values that varied for the different families considered, the relative entropy quantifying word ordering presented an almost constant value for all those families. Conclusions/Significance Our results indicate that despite the differences in the structure and vocabulary of the languages analyzed, the impact of word ordering in the structure of language is a statistical linguistic universal. PMID:21603637
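
    The study's relative entropy of word ordering relies on careful estimators; as a rough illustrative proxy (not the authors' estimator), one can compare the unigram entropy of a word sequence with the conditional entropy of a word given its predecessor, and see how shuffling destroys the ordering contribution:

```python
import math
import random
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a Counter of occurrence counts."""
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def ordering_information(words):
    """Crude proxy for word-ordering information: unigram entropy minus
    the bigram-model conditional entropy H(W_t | W_{t-1})."""
    h_cond = entropy(Counter(zip(words, words[1:]))) - entropy(Counter(words[:-1]))
    return entropy(Counter(words)) - h_cond

text = ("the cat sat on the mat and the dog sat on the rug " * 20).split()
shuffled = text[:]
random.Random(0).shuffle(shuffled)

print(ordering_information(text))      # ordering carries information
print(ordering_information(shuffled))  # shuffling destroys most of it
```

    On short samples the plug-in estimator is biased, which is why the paper contrasts each text with its shuffled counterpart rather than reading the raw entropy directly.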

  3. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition and authorship attribution, etc. Different complexity measures proposed in the literature like Shannon entropy, Relative entropy, Lempel-Ziv, Kolmogorov and Algorithmic complexity are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with Lyapunov exponent than Shannon entropy even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
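
    A minimal sketch of the ETC idea, counting iterations of pair substitution until the sequence becomes constant (pair counting and tie-breaking details may differ from the NSRPS variant used by the authors):

```python
from collections import Counter

def etc(seq):
    """Effort To Compress: iterations of NSRPS-style pair substitution
    needed to reduce an integer sequence to a constant sequence."""
    s = list(seq)
    fresh = max(s) + 1 if s else 0  # next unused symbol for substituted pairs
    steps = 0
    while len(set(s)) > 1:
        pairs = Counter(zip(s, s[1:]))  # most frequent adjacent pair
        best = max(pairs, key=pairs.get)
        out, i = [], 0
        while i < len(s):  # substitute non-overlapping occurrences, left to right
            if i + 1 < len(s) and (s[i], s[i + 1]) == best:
                out.append(fresh)
                i += 2
            else:
                out.append(s[i])
                i += 1
        s, fresh, steps = out, fresh + 1, steps + 1
    return steps

print(etc([1, 1, 1, 1]))        # 0: already constant
print(etc([1, 2, 1, 2, 1, 2]))  # 1: substituting (1,2) yields a constant sequence
```

    Each iteration shortens the sequence by at least one symbol, so the procedure always terminates; more structured sequences compress to a constant in fewer steps.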

  4. Classification of Partial Discharge Signals by Combining Adaptive Local Iterative Filtering and Entropy Features

    PubMed Central

    Morison, Gordon; Boreham, Philip

    2018-01-01

    Electromagnetic Interference (EMI) is a technique for capturing Partial Discharge (PD) signals in High-Voltage (HV) power plant apparatus. EMI signals can be non-stationary which makes their analysis difficult, particularly for pattern recognition applications. This paper elaborates upon a previously developed software condition-monitoring model for improved EMI events classification based on time-frequency signal decomposition and entropy features. The idea of the proposed method is to map multiple discharge source signals captured by EMI and labelled by experts, including PD, from the time domain to a feature space, which aids in the interpretation of subsequent fault information. Here, instead of using only one permutation entropy measure, a more robust measure, called Dispersion Entropy (DE), is added to the feature vector. Multi-Class Support Vector Machine (MCSVM) methods are utilized for classification of the different discharge sources. Results show an improved classification accuracy compared to previously proposed methods. This leads to the successful development of an expert-knowledge-based intelligent system. Since this method is demonstrated to be successful with real field data, it brings the benefit of possible real-world application for EMI condition monitoring. PMID:29385030

  5. Multichannel interictal spike activity detection using time-frequency entropy measure.

    PubMed

    Thanaraj, Palani; Parvathavarthini, B

    2017-06-01

    Localization of interictal spikes is an important clinical step in the pre-surgical assessment of pharmacoresistant epileptic patients. The manual selection of interictal spike periods is cumbersome and involves a considerable amount of analysis workload for the physician. The primary focus of this paper is to automate the detection of interictal spikes for clinical applications in epilepsy localization. The epilepsy localization procedure involves detection of spikes in a multichannel EEG epoch. Therefore, a multichannel Time-Frequency (T-F) entropy measure is proposed to extract features related to the interictal spike activity. Least squares support vector machine is used to train the proposed feature to classify the EEG epochs as either normal or interictal spike period. The proposed T-F entropy measure, when validated with epilepsy dataset of 15 patients, shows an interictal spike classification accuracy of 91.20%, sensitivity of 100% and specificity of 84.23%. Moreover, the area under the curve of Receiver Operating Characteristics plot of 0.9339 shows the superior classification performance of the proposed T-F entropy measure. The results of this paper show a good spike detection accuracy without any prior information about the spike morphology.

  6. Sample entropy and regularity dimension in complexity analysis of cortical surface structure in early Alzheimer's disease and aging.

    PubMed

    Chen, Ying; Pham, Tuan D

    2013-05-15

    We apply for the first time the sample entropy (SampEn) and regularity dimension model for measuring signal complexity to quantify the structural complexity of the brain on MRI. The concept of the regularity dimension is based on the theory of chaos for studying nonlinear dynamical systems, where power laws and entropy measure are adopted to develop the regularity dimension for modeling a mathematical relationship between the frequencies with which information about signal regularity changes in various scales. The sample entropy and regularity dimension of MRI-based brain structural complexity are computed for early Alzheimer's disease (AD) elder adults and age- and gender-matched non-demented controls, as well as for a wide range of ages from young people to elder adults. A significantly higher global cortical structure complexity is detected in AD individuals (p<0.001). Increases in SampEn and the regularity dimension are also found to accompany aging, which might indicate an age-related exacerbation of cortical structural irregularity. The provided model can be potentially used as an imaging bio-marker for early prediction of AD and age-related cognitive decline. Copyright © 2013 Elsevier B.V. All rights reserved.
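
    Sample entropy, as used here, is defined as SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates matching within tolerance r and A the corresponding length-(m+1) counts. A brute-force sketch (parameters and test series are illustrative, not the paper's MRI data):

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): A and B count template matches of length
    m+1 and m (Chebyshev distance <= r, self-matches excluded)."""
    n = len(x)
    def matches(length):
        count = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

regular = [i % 2 for i in range(200)]           # strictly periodic signal
rng = random.Random(1)
irregular = [rng.random() for _ in range(200)]  # white noise

print(sample_entropy(regular))    # near 0: highly predictable
print(sample_entropy(irregular))  # much larger: irregular
```

    In practice r is usually set relative to the signal's standard deviation, and faster O(n log n) implementations exist; the quadratic version above just makes the definition explicit.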

  7. Analysis of neuronal cells of dissociated primary culture on high-density CMOS electrode array

    PubMed Central

    Matsuda, Eiko; Mita, Takeshi; Hubert, Julien; Bakkum, Douglas; Frey, Urs; Hierlemann, Andreas; Takahashi, Hirokazu; Ikegami, Takashi

    2017-01-01

    Spontaneous development of neuronal cells was recorded over 4–34 days in vitro (DIV) with a high-density CMOS array, which enables detailed study of the spatio-temporal activity of a neuronal culture. We used the CMOS array to characterize the evolution of the inter-spike interval (ISI) distribution from putative single neurons, and to estimate the network structure based on transfer entropy analysis, where each node corresponds to a single neuron. We observed that the ISI distributions gradually obeyed the power law with maturation of the network. The amount of information transferred between neurons increased at the early stage of development, but decreased as the network matured. These results suggest that both ISI and transfer entropy are very useful for characterizing the dynamic development of cultured neural cells over a few weeks. PMID:24109870

  8. Crowd macro state detection using entropy model

    NASA Astrophysics Data System (ADS)

    Zhao, Ying; Yuan, Mengqi; Su, Guofeng; Chen, Tao

    2015-08-01

    In crowd security research, a primary concern is to identify the macro state of crowd behaviors in order to prevent disasters and to supervise the crowd. In physics, entropy is used to describe the macro state of a self-organizing system; a change in entropy indicates a change in the system's macro state. This paper provides a method to construct crowd behavior microstates, and the corresponding probability distribution, using the individuals' velocity information (magnitude and direction). An entropy model was then built to describe the crowd behavior macro state. Simulation experiments and video detection experiments were conducted. It was verified that in the disordered state the crowd behavior entropy is close to the theoretical maximum entropy, while in the ordered state the entropy is much lower than half of the theoretical maximum. A sudden change in the crowd's macro state leads to a change in entropy. The proposed entropy model is more applicable than the order parameter model in crowd behavior detection. By recognizing entropy mutations, it is possible to detect the crowd behavior macro state automatically using cameras. The results provide data support for crowd emergency prevention and for manual emergency intervention.
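
    The microstate construction described above can be sketched as follows: each individual's velocity is binned by speed and direction, and the Shannon entropy of the resulting microstate distribution is compared with the theoretical maximum log2(number of microstates). Bin counts and the simulated crowds below are illustrative, not the paper's settings:

```python
import math
import random

def crowd_entropy(velocities, speed_bins=4, dir_bins=8, v_max=2.0):
    """Shannon entropy (bits) of crowd microstates, a microstate being an
    individual's (speed bin, direction bin) pair."""
    counts = {}
    for vx, vy in velocities:
        speed = min(math.hypot(vx, vy), v_max * (1 - 1e-9))
        angle = math.atan2(vy, vx) % (2 * math.pi)
        state = (int(speed / v_max * speed_bins),
                 int(angle / (2 * math.pi) * dir_bins))
        counts[state] = counts.get(state, 0) + 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

h_max = math.log2(4 * 8)  # theoretical maximum: all 32 microstates equally likely

rng = random.Random(0)
# Disordered crowd: uniformly random speeds and headings.
disordered = [(r * math.cos(t), r * math.sin(t))
              for r, t in ((rng.uniform(0, 2), rng.uniform(0, 2 * math.pi))
                           for _ in range(2000))]
# Ordered crowd: everyone moving the same way at nearly the same speed.
ordered = [(1.0 + rng.gauss(0, 0.01), 0.5 + rng.gauss(0, 0.01))
           for _ in range(2000)]

print(crowd_entropy(disordered), h_max)  # close to the theoretical maximum
print(crowd_entropy(ordered))            # far below half the maximum
```

    This reproduces the qualitative finding quoted in the abstract: disordered motion drives the entropy toward its maximum, while ordered motion collapses it well below half that value.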

  9. The minimal work cost of information processing

    NASA Astrophysics Data System (ADS)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes improvements in their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.

  10. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y , i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to the relative-entropy conservation for arbitrary states. The general theory on the relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, photon counting, quantum counting, and homodyne and heterodyne measurements. These examples, except for the nondemolition and photon-counting measurements, do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.

  11. Measurement-induced randomness and state-merging

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Indranil; Deshpande, Abhishek; Chatterjee, Sourav

    In this work we introduce randomness that is truly quantum mechanical in nature, arising from the act of measurement. For a composite classical system, the joint entropy quantifies the randomness present in the total system, and it equals the sum of the entropy of one subsystem and the conditional entropy of the other subsystem given the first. The same analogy carries over to the quantum setting by replacing the Shannon entropy with the von Neumann entropy. However, if we replace the conditional von Neumann entropy by the average conditional entropy due to measurement, we find that it differs from the joint entropy of the system. We call this difference Measurement Induced Randomness (MIR) and argue that it is unique to quantum mechanical systems, with no classical counterpart. In other words, the joint von Neumann entropy gives only the total randomness that arises from the heterogeneity of the mixture, and we show that it is not the total randomness that can be generated in the composite system. We generalize this quantity to N-qubit systems and show that it reduces to quantum discord for two-qubit systems. Further, we show that it is exactly equal to the change in the cost of quantum state merging that arises because of the measurement. We argue that for quantum information processing tasks like state merging, the change in cost resulting from discarding prior information can also be viewed as a rise in randomness due to measurement.

  12. Entropy in Postmerger and Acquisition Integration from an Information Technology Perspective

    ERIC Educational Resources Information Center

    Williams, Gloria S.

    2012-01-01

    Mergers and acquisitions have historically experienced failure rates from 50% to more than 80%. Successful integration of information technology (IT) systems can be the difference between postmerger success or failure. The purpose of this phenomenological study was to explore the entropy phenomenon during postmerger IT integration. To that end, a…

  13. [Study on once sampling quantitation based on information entropy of ISSR amplified bands of Houttuynia cordata].

    PubMed

    Wang, Haiqin; Liu, Wenlong; He, Fuyuan; Chen, Zuohong; Zhang, Xili; Xie, Xianggui; Zeng, Jiaoli; Duan, Xiaopeng

    2012-02-01

    To explore the once sampling quantitation of Houttuynia cordata through the information entropy carried by its polymorphic DNA bands, as another form of expressing the polymorphism, namely genetic polymorphism, of traditional Chinese medicine. The inter simple sequence repeat (ISSR) technique was applied to analyze the genetic polymorphism of H. cordata samples from the same GAP production area; the DNA bands were transformed into information entropy, and the minimum once sampling quantitation was calculated with a mathematical model. One hundred and thirty-four DNA bands were obtained by using 9 screened ISSR primers to amplify DNA samples from 46 strains of H. cordata from the same GAP area; the information entropy was H = 0.3656-0.9786, with an RSD of 14.75%. The once sampling quantitation was W = 11.22 kg (863 strains). The "once minimum sampling quantitation" was thus calculated from the standpoint of the genetic polymorphism of H. cordata, and a great difference was found between this amount and the amount derived from fingerprint analysis.

  14. Symbolic phase transfer entropy method and its application

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-10-01

    In this paper, we introduce symbolic phase transfer entropy (SPTE) to infer the direction and strength of information flow among systems. The advantages of the proposed method are investigated by simulations on synthetic signals and real-world data. We demonstrate that symbolic phase transfer entropy is a robust and efficient tool to infer the information flow between complex systems. Based on the study of the synthetic data, we find a significant advantage of SPTE is its reduced sensitivity to noise. In addition, SPTE requires less data than symbolic transfer entropy (STE). We analyze the direction and strength of information flow between six stock markets during the period from 2006 to 2016. The results indicate that the information flow among stocks varies over different periods. We also find that the interaction network pattern among stocks undergoes hierarchical reorganization with the transition from one period to another. It is shown that the clusters are mainly classified according to period, and then by region. The stocks during the same time period are shown to fall into the same cluster.
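
    SPTE symbolizes the instantaneous phase series before computing transfer entropy. As a rough sketch of the symbolic-transfer-entropy core only (applying ordinal patterns directly to the raw series and omitting the phase-extraction step of SPTE), a plug-in estimator might look like:

```python
import math
import random
from collections import Counter

def symbolize(x, m=3):
    """Ordinal-pattern symbols: the rank order of each length-m window."""
    return [tuple(sorted(range(m), key=lambda k: x[i + k]))
            for i in range(len(x) - m + 1)]

def transfer_entropy(sym_x, sym_y):
    """Plug-in estimate (bits) of T_{Y->X} from symbol sequences."""
    n = len(sym_x) - 1
    j3 = Counter((sym_x[t + 1], sym_x[t], sym_y[t]) for t in range(n))
    j2 = Counter((sym_x[t], sym_y[t]) for t in range(n))
    jx = Counter((sym_x[t + 1], sym_x[t]) for t in range(n))
    px = Counter(sym_x[t] for t in range(n))
    return sum(c / n * math.log2((c / j2[x0, y0]) / (jx[x1, x0] / px[x0]))
               for (x1, x0, y0), c in j3.items())

# Y drives X with a one-step lag, so information should flow mainly Y -> X.
rng = random.Random(3)
y = [rng.random() for _ in range(3000)]
x = [0.0] * 3000
for t in range(1, 3000):
    x[t] = 0.8 * y[t - 1] + 0.2 * rng.random()

sx, sy = symbolize(x), symbolize(y)
print(transfer_entropy(sx, sy))  # Y -> X: substantial
print(transfer_entropy(sy, sx))  # X -> Y: near zero (estimator bias only)
```

    The asymmetry between the two directions is what the paper exploits to infer both the direction and the strength of information flow.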

  15. Entropy Analyses of Four Familiar Processes.

    ERIC Educational Resources Information Center

    Craig, Norman C.

    1988-01-01

    Presents entropy analysis of four processes: a chemical reaction, a heat engine, the dissolution of a solid, and osmosis. Discusses entropy, the second law of thermodynamics, and the Gibbs free energy function. (MVL)

  16. Beyond the classical theory of heat conduction: a perspective view of future from entropy

    PubMed Central

    Lai, Xiang; Zhu, Pingan

    2016-01-01

    Energy is conserved by the first law of thermodynamics; its quality degrades constantly due to entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation, both with regard to ways of reducing its magnitude and with regard to whether it remains bounded as time tends to infinity. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also builds up the relation between the second law of thermodynamics and mathematical inequalities by developing the latter, of either new or classical nature. A concise review of entropy is also included, in the interest of performing the analysis in this work and similar analyses of other processes in the future. PMID:27843400

  17. Optimal attacks on qubit-based Quantum Key Recycling

    NASA Astrophysics Data System (ADS)

    Leermakers, Daan; Škorić, Boris

    2018-03-01

    Quantum Key Recycling (QKR) is a quantum cryptographic primitive that allows one to reuse keys in an unconditionally secure way. By removing the need to repeatedly generate new keys, it improves communication efficiency. Škorić and de Vries recently proposed a QKR scheme based on 8-state encoding (four bases). It does not require quantum computers for encryption/decryption but only single-qubit operations. We provide a missing ingredient in the security analysis of this scheme in the case of noisy channels: accurate upper bounds on the required amount of privacy amplification. We determine optimal attacks against the message and against the key, for 8-state encoding as well as 4-state and 6-state conjugate coding. We provide results in terms of min-entropy loss as well as accessible (Shannon) information. We show that the Shannon entropy analysis for 8-state encoding reduces to the analysis of quantum key distribution, whereas 4-state and 6-state suffer from additional leaks that make them less effective. From the optimal attacks we compute the required amount of privacy amplification and hence the achievable communication rate (useful information per qubit) of qubit-based QKR. Overall, 8-state encoding yields the highest communication rates.

  18. Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways

    PubMed Central

    Galinsky, Vitaly L.; Frank, Lawrence R.

    2015-01-01

    We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution, for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes, guided by the global structure of the entropy spectrum coupled with small-scale local diffusion. The intervoxel diffusion is sampled by multi-b-shell, multi-q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in a scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167

  19. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  20. Combined Population Dynamics and Entropy Modelling Supports Patient Stratification in Chronic Myeloid Leukemia

    PubMed Central

    Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas

    2016-01-01

    Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution we show that combined population dynamic modelling and CML patient biopsy genomic analysis enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient data informed model enables quantitative approximation of individual patients’ disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent and complementary to conventional measures of CML disease burden and prognosis. PMID:27048866

  1. Gas-water two-phase flow characterization with Electrical Resistance Tomography and Multivariate Multiscale Entropy analysis.

    PubMed

    Tan, Chao; Zhao, Jia; Dong, Feng

    2015-03-01

    Flow behavior characterization is important to understand gas-liquid two-phase flow mechanics and further establish its description model. Electrical Resistance Tomography (ERT) provides information regarding flow conditions in the different directions where the sensing electrodes are installed. We extracted the multivariate sample entropy (MSampEn) by treating ERT data as a multivariate time series. The dynamic experimental results indicate that MSampEn is sensitive to complexity changes of flow patterns including bubbly flow, stratified flow, plug flow and slug flow. MSampEn can characterize the flow behavior in different directions of two-phase flow, and reveal the transition between flow patterns when the flow velocity changes. The proposed method is effective for analyzing two-phase flow pattern transitions by incorporating information from different scales and different spatial directions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
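
    The multivariate sample entropy used here extends ordinary sample entropy to several channels at once; a univariate sketch conveys the core computation. (The names, the fixed tolerance r, and the toy signals below are illustrative assumptions, not the paper's implementation; r is usually scaled to the signal's standard deviation.)

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts template pairs of length m matching within
    tolerance r (Chebyshev distance); A counts the same for length m + 1."""
    def count_matches(k):
        templates = [x[i:i + k] for i in range(len(x) - k + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(1)
regular = [0.0, 1.0] * 100                      # perfectly periodic signal
noisy = [random.random() for _ in range(200)]   # irregular signal
```

    The regular signal scores near zero while the irregular one scores much higher, which is the contrast MSampEn exploits across flow regimes.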

  2. Entropy generation across Earth's collisionless bow shock.

    PubMed

    Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A

    2012-02-10

    Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulences.

  3. Quench action and Rényi entropies in integrable systems

    NASA Astrophysics Data System (ADS)

    Alba, Vincenzo; Calabrese, Pasquale

    2017-09-01

    Entropy is a fundamental concept in equilibrium statistical mechanics, yet its origin in the nonequilibrium dynamics of isolated quantum systems is not fully understood. A strong consensus is emerging around the idea that the stationary thermodynamic entropy is the von Neumann entanglement entropy of a large subsystem embedded in an infinite system. Also motivated by cold-atom experiments, here we consider the generalization to Rényi entropies. We develop a new technique to calculate the diagonal Rényi entropy in the quench action formalism. In the spirit of the replica treatment for the entanglement entropy, the diagonal Rényi entropies are generalized free energies evaluated over a thermodynamic macrostate which depends on the Rényi index and, in particular, is not the same state describing von Neumann entropy. The technical reason for this perhaps surprising result is that the evaluation of the moments of the diagonal density matrix shifts the saddle point of the quench action. An interesting consequence is that different Rényi entropies encode information about different regions of the spectrum of the postquench Hamiltonian. Our approach provides a very simple proof of the long-standing issue that, for integrable systems, the diagonal entropy is half of the thermodynamic one and it allows us to generalize this result to the case of arbitrary Rényi entropy.

  4. An adaptive technique to maximize lossless image data compression of satellite images

    NASA Technical Reports Server (NTRS)

    Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe

    1994-01-01

    Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques on regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost effectively and at acceptable performance rates with a combination of techniques which are selected based on image contextual information.
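
    The "remapping into a lower entropy state" step can be illustrated with first-order DPCM on a smooth synthetic scanline (a toy sketch; the paper's actual predictors and arithmetic coder are not reproduced here). Smooth data spreads probability over many pixel values, while its successive differences concentrate on a few values, lowering the empirical entropy and hence the lossless-coding bound.

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Empirical Shannon entropy (bits/symbol): a lower bound for lossless coding."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def dpcm(samples):
    """First-order DPCM remap: keep the first sample, then successive differences."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

ramp = [i // 4 for i in range(256)]   # smooth toy "scanline" with 64 gray levels
print(entropy_bits(ramp), entropy_bits(dpcm(ramp)))
```

    Here the raw ramp needs 6 bits/symbol while the remapped stream needs under 1, which is the kind of gain the adaptive rule base selects among techniques to maximize.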

  5. Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.

    PubMed

    Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir

    2017-01-01

    Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponential. Detecting and extracting their components may help clinicians to localize brain neurological dysfunctionalities for patients with motor control disorders, due to the fact that movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for detecting EEG signal components from the signal's time-frequency distribution (TFD) is proposed in this paper. The algorithm utilizes a modification of the Rényi entropy-based technique for estimating the number of components, called short-term Rényi entropy (STRE), upgraded by an iterative algorithm which was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to the analysis of limb-movement EEG signals in both noise-free and noisy environments, and was shown to be an efficient technique providing a spectral description of brain activities at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs shows potential to enhance diagnostics and treatment of neurological disorders for patients with motor control illnesses. Copyright © 2016 Elsevier Ltd. All rights reserved.
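
    The Rényi entropy behind STRE, and the way an entropy difference counts signal components, can be sketched on toy distributions. (α = 3 is a commonly used odd order in TFD component counting; the discrete estimator and toy "TFD slices" below are assumptions for illustration, not the paper's algorithm.)

```python
import math

def renyi_entropy(probs, alpha=3):
    """Rényi entropy H_a = log2(sum_i p_i^a) / (1 - a); a -> 1 recovers Shannon."""
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# A slice with two identical, disjoint components carries one extra bit relative
# to a single-component reference, so 2^(H - H_ref) estimates the component count.
one_component = [0.25] * 4
two_components = [0.125] * 8
n_est = 2 ** (renyi_entropy(two_components) - renyi_entropy(one_component))
print(n_est)
```

    This entropy-difference counting, applied over short TFD windows, is what makes the estimate "short-term" and lets the component count track time.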

  6. Information entropy method and the description of echo hologram formation in gaseous media

    NASA Astrophysics Data System (ADS)

    Garnaeva, G. I.; Nefediev, L. A.; Akhmedshina, E. N.

    2018-02-01

    The effect of collisions that change the velocity of gas particles on the value of information entropy is considered in relation to the spectral structure of the echo hologram's response and its temporal form. It is shown that velocity-changing collisions increase the 'parasitic' information, against the background of which the information contained in the temporal shape of the object laser pulse is lost.

  7. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin.

    PubMed

    Hacisuleyman, Aysima; Erman, Burak

    2017-01-01

    It has recently been proposed by Gunasakaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600 nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota and evaluate entropy transfer between all pairs of residues of Ubiquitin and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that time delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins.
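
    The time-delayed correlation idea in the closing sentence can be sketched directly: if fluctuations of one residue drive another, the correlation of the driver now with the driven later exceeds the correlation in the reverse direction. (The synthetic series and names below are illustrative, not MD data.)

```python
import math
import random

def delayed_corr(a, b, tau):
    """Pearson correlation of a(t) with b(t + tau)."""
    x, y = a[:-tau], b[tau:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((u - mx) * (v - my) for u, v in zip(x, y))
    sx = math.sqrt(sum((u - mx) ** 2 for u in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return cov / (sx * sy)

random.seed(2)
driver = [random.gauss(0.0, 1.0) for _ in range(400)]
driven = [0.0, 0.0] + driver[:-2]   # responds to the driver with a 2-step delay
```

    The asymmetry between the two delayed correlations identifies which residue controls the interaction and which one is controlled.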

  8. Data Mining in Earth System Science (DMESS 2011)

    Treesearch

    Forrest M. Hoffman; J. Walter Larson; Richard Tran Mills; Bhorn-Gustaf Brooks; Auroop R. Ganguly; William Hargrove; et al

    2011-01-01

    From field-scale measurements to global climate simulations and remote sensing, the growing body of very large and long time series Earth science data are increasingly difficult to analyze, visualize, and interpret. Data mining, information theoretic, and machine learning techniques—such as cluster analysis, singular value decomposition, block entropy, Fourier and...

  9. Recoverability in quantum information theory

    NASA Astrophysics Data System (ADS)

    Wilde, Mark

    The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.

  10. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering, the entropy production has been verified to approach its maximum rate asymptotically, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. The latter quantity is obtained from the entropy balance for open systems, considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of open irreversible systems. The stationary conditions of open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved, and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  11. Information theory-based decision support system for integrated design of multivariable hydrometric networks

    NASA Astrophysics Data System (ADS)

    Keum, Jongho; Coulibaly, Paulin

    2017-07-01

    Adequate and accurate hydrologic information from optimal hydrometric networks is an essential part of effective water resources management. Although the key hydrologic processes in the water cycle are interconnected, hydrometric networks (e.g., streamflow, precipitation, groundwater level) have been routinely designed individually. A decision support framework is proposed for integrated design of multivariable hydrometric networks. The proposed method is applied to design optimal precipitation and streamflow networks simultaneously. The epsilon-dominance hierarchical Bayesian optimization algorithm was combined with Shannon entropy of information theory to design and evaluate hydrometric networks. Specifically, the joint entropy from the combined networks was maximized to provide the most information, and the total correlation was minimized to reduce redundant information. To further optimize the efficiency between the networks, they were designed by maximizing the conditional entropy of the streamflow network given the information of the precipitation network. Compared to the traditional individual variable design approach, the integrated multivariable design method was able to determine more efficient optimal networks by avoiding the redundant stations. Additionally, four quantization cases were compared to evaluate their effects on the entropy calculations and the determination of the optimal networks. The evaluation results indicate that the quantization methods should be selected after careful consideration for each design problem since the station rankings and the optimal networks can change accordingly.
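
    The maximize-joint-entropy part of the design can be sketched with a greedy selection over quantized station records (a toy stand-in for the paper's evolutionary optimizer; the station names and data are invented). A station that duplicates an already-chosen one adds no joint entropy and is skipped, which is exactly the redundancy-avoidance behavior described above.

```python
import math
from collections import Counter

def joint_entropy(columns):
    """Shannon joint entropy (bits) of quantized series observed jointly."""
    counts = Counter(zip(*columns))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def greedy_network(stations, k):
    """Greedily add the station whose inclusion most increases joint entropy."""
    chosen = []
    while len(chosen) < k:
        best = max((s for s in stations if s not in chosen),
                   key=lambda s: joint_entropy([stations[c] for c in chosen] + [stations[s]]))
        chosen.append(best)
    return chosen

stations = {"A": [0, 1, 0, 1, 1, 0],
            "B": [1, 1, 0, 0, 1, 0],
            "C": [0, 1, 0, 1, 1, 0]}   # C duplicates A: pure redundancy
print(greedy_network(stations, 2))
```

    The quantization of the records (here already binary) matters, which is why the paper compares four quantization cases before trusting the rankings.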

  12. Entropy vs. energy waveform processing: A comparison based on the heat equation

    DOE PAGES

    Hughes, Michael S.; McCarthy, John E.; Bruillard, Paul J.; ...

    2015-05-25

    Virtually all modern imaging devices collect electromagnetic or acoustic waves and use the energy carried by these waves to determine pixel values to create what is basically an “energy” picture. However, waves also carry “information”, as quantified by some form of entropy, and this may also be used to produce an “information” image. Numerous published studies have demonstrated the advantages of entropy, or “information imaging”, over conventional methods. The most sensitive information measure appears to be the joint entropy of the collected wave and a reference signal. The sensitivity of repeated experimental observations of a slowly-changing quantity may be defined as the mean variation (i.e., observed change) divided by mean variance (i.e., noise). Wiener integration permits computation of the required mean values and variances as solutions to the heat equation, permitting estimation of their relative magnitudes. There always exists a reference, such that joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, matching observations of several studies. Moreover, a general prescription for finding an “optimal” reference for the joint entropy emerges, which also has been validated in several studies.

  13. Information-Theoretical Quantifier of Brain Rhythm Based on Data-Driven Multiscale Representation

    PubMed Central

    2015-01-01

    This paper presents a data-driven multiscale entropy measure to reveal the scale-dependent information quantity of electroencephalogram (EEG) recordings. This work is motivated by previous observations on the nonlinear and nonstationary nature of EEG over multiple time scales. Here, a new framework of entropy measures considering changing dynamics over multiple oscillatory scales is presented. First, to deal with nonstationarity over multiple scales, the EEG recording is decomposed by applying empirical mode decomposition (EMD), which is known to be effective for extracting the constituent narrowband components without a predetermined basis. Calculation of the Rényi entropy of the probability distributions of the intrinsic mode functions extracted by EMD then leads to a data-driven multiscale Rényi entropy. To validate the performance of the proposed entropy measure, actual EEG recordings from rats (n = 9) experiencing 7 min cardiac arrest followed by resuscitation were analyzed. Simulation and experimental results demonstrate that the use of the multiscale Rényi entropy leads to better discriminative capability of the injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, thus suggesting an effective diagnostic and prognostic tool. PMID:26380297

  14. Entanglement entropy for (3+1)-dimensional topological order with excitations

    NASA Astrophysics Data System (ADS)

    Wen, Xueda; He, Huan; Tiwari, Apoorv; Zheng, Yunqin; Ye, Peng

    2018-02-01

    Excitations in (3+1)-dimensional [(3+1)D] topologically ordered phases have very rich structures. (3+1)D topological phases support both pointlike and stringlike excitations, and in particular the loop (closed string) excitations may admit knotted and linked structures. In this work, we ask the following question: How do different types of topological excitations contribute to the entanglement entropy or, alternatively, can we use the entanglement entropy to detect the structure of excitations, and further obtain the information of the underlying topological order? We are mainly interested in (3+1)D topological order that can be realized in Dijkgraaf-Witten (DW) gauge theories, which are labeled by a finite group G and its group 4-cocycle ω ∈ H^4[G; U(1)] up to group automorphisms. We find that each topological excitation contributes a universal constant ln d_i to the entanglement entropy, where d_i is the quantum dimension that depends on both the structure of the excitation and the data (G, ω). The entanglement entropy of the excitations of the linked/unlinked topology can capture different information of the DW theory (G, ω). In particular, the entanglement entropy introduced by Hopf-link loop excitations can distinguish certain group 4-cocycles ω from the others.

  15. Spatiotemporal Dependency of Age-Related Changes in Brain Signal Variability

    PubMed Central

    McIntosh, A. R.; Vakorin, V.; Kovacevic, N.; Wang, H.; Diaconescu, A.; Protzner, A. B.

    2014-01-01

    Recent theoretical and empirical work has focused on the variability of network dynamics in maturation. Such variability seems to reflect the spontaneous formation and dissolution of different functional networks. We sought to extend these observations into healthy aging. Two different data sets, one EEG (total n = 48, ages 18–72) and one magnetoencephalography (n = 31, ages 20–75) were analyzed for such spatiotemporal dependency using multiscale entropy (MSE) from regional brain sources. In both data sets, the changes in MSE were timescale dependent, with higher entropy at fine scales and lower at more coarse scales with greater age. The signals were parsed further into local entropy, related to information processed within a regional source, and distributed entropy (information shared between two sources, i.e., functional connectivity). Local entropy increased for most regions, whereas the dominant change in distributed entropy was age-related reductions across hemispheres. These data further the understanding of changes in brain signal variability across the lifespan, suggesting an inverted U-shaped curve, but with an important qualifier. Unlike earlier in maturation, where the changes are more widespread, changes in adulthood show strong spatiotemporal dependence. PMID:23395850
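
    Multiscale entropy coarse-grains the signal at each timescale before computing an entropy measure, which is how "fine" and "coarse" scales are separated above. A compact sketch (using permutation entropy per scale for brevity, where the study's MSE uses sample entropy; names and parameters are illustrative):

```python
import math
import random
from collections import Counter

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (the timescale of MSE)."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def permutation_entropy(x, m=3):
    """Shannon entropy (bits) of the ordinal patterns of order m."""
    patterns = Counter(tuple(sorted(range(m), key=lambda k: x[i + k]))
                       for i in range(len(x) - m + 1))
    n = sum(patterns.values())
    return -sum((c / n) * math.log2(c / n) for c in patterns.values())

def multiscale_entropy(x, scales=(1, 2, 4, 8)):
    return [permutation_entropy(coarse_grain(x, s)) for s in scales]

random.seed(3)
noise = [random.gauss(0.0, 1.0) for _ in range(2000)]
mse_curve = multiscale_entropy(noise)
```

    The shape of this entropy-versus-scale curve, rather than any single value, is what distinguishes the age groups in the study.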

  16. Extension and Application of High-Speed Digital Imaging Analysis Via Spatiotemporal Correlation and Eigenmode Analysis of Vocal Fold Vibration Before and After Polyp Excision.

    PubMed

    Wang, Jun-Sheng; Olszewski, Emily; Devine, Erin E; Hoffman, Matthew R; Zhang, Yu; Shao, Jun; Jiang, Jack J

    2016-08-01

    To evaluate the spatiotemporal correlation of vocal fold vibration using eigenmode analysis before and after polyp removal, and to explore the potential clinical relevance of spatiotemporal analysis of correlation length and entropy as quantitative voice parameters. We hypothesized that increased order in the vibrating signal after surgical intervention would decrease the eigenmode-based entropy and increase the correlation length. Prospective case series. Forty subjects (23 males, 17 females) with unilateral (n = 24) or bilateral (n = 16) polyps underwent polyp removal. High-speed videoendoscopy was performed preoperatively and 2 weeks postoperatively. Spatiotemporal analysis was performed to determine entropy, a quantification of signal disorder, and correlation length, the size of the spatially ordered structure of vocal fold vibration in comparison to full spatial consistency. The signal analyzed consists of the vibratory pattern in space and time derived from the high-speed video glottal area contour. Entropy decreased (Z = -3.871, P < .001) and correlation length increased (t = -8.913, P < .001) following polyp excision. The intraclass correlation coefficients (ICC) for correlation length and entropy were 0.84 and 0.93. Correlation length and entropy are sensitive to mass lesions. These parameters could potentially be used to augment subjective visualization after polyp excision when evaluating procedural efficacy. © The Author(s) 2016.

  17. A Study of Turkish Chemistry Undergraduates' Understandings of Entropy

    ERIC Educational Resources Information Center

    Sozbilir, Mustafa; Bennett, Judith M.

    2007-01-01

    Entropy is the fundamental concept of chemical thermodynamics that explains the natural tendency of matter and energy in the Universe. The analysis presents descriptions of entropy as understood by Turkish chemistry undergraduates.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Xi

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Rényi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometric prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Rényi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Rényi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.

  19. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
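
    The min-entropy that bounds extractable randomness is simply the negative log of the most likely outcome's probability. A minimal empirical sketch (the counting estimator here is an illustration only, not the paper's quantum tomographic bound, which does not rely on observed frequencies):

```python
import math
from collections import Counter

def min_entropy(samples):
    """Min-entropy per symbol, H_min = -log2(max_b p(b)): the pessimistic bound
    a randomness extractor must respect, never exceeding Shannon entropy."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

unbiased = [0, 1] * 8         # p_max = 1/2 -> 1 bit per sample
biased = [0, 0, 0, 1] * 4     # p_max = 3/4 -> about 0.415 bits per sample
```

    Distillation then compresses n raw samples to roughly n times H_min truly random bits.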

  20. Rogue waves and entropy consumption

    NASA Astrophysics Data System (ADS)

    Hadjihoseini, Ali; Lind, Pedro G.; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim

    2017-11-01

    Based on data from the Sea of Japan and the North Sea, the occurrence of rogue waves is analyzed by a scale-dependent stochastic approach, which interlinks fluctuations of waves for different spacings. With this approach we are able to determine a stochastic cascade process, which provides information on the general multipoint statistics. Furthermore, the evolution in scale of single trajectories, which characterize wave-height fluctuations in the surroundings of a chosen location, can be determined. The explicit knowledge of the stochastic process enables us to assign entropy values to all wave events. We show that for these entropies the integral fluctuation theorem, a basic law of non-equilibrium thermodynamics, is valid. This implies that positive and negative entropy events must occur. Extreme events like rogue waves are characterized as negative entropy events. The statistics of these entropy fluctuations changes with the wave state; thus, for the Sea of Japan the statistics of the entropies has a more pronounced tail for negative entropy values, indicating a higher probability of rogue waves.
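    The integral fluctuation theorem invoked above takes the form ⟨exp(-ΔS)⟩ = 1 over all events. A toy sketch (an assumed Gaussian entropy-production model, not the paper's wave data) shows how the theorem pins the mean to half the variance while still demanding rare negative-entropy events:

```python
import math
import random

random.seed(0)

# Toy model: Gaussian-distributed entropy production dS.
# The integral fluctuation theorem <exp(-dS)> = 1 holds exactly for a
# Gaussian when mean = variance / 2.
sigma2 = 4.0                       # variance of entropy production
mean = sigma2 / 2.0                # mean pinned by the fluctuation theorem
n = 200_000

samples = [random.gauss(mean, math.sqrt(sigma2)) for _ in range(n)]
avg = sum(math.exp(-s) for s in samples) / n
neg_frac = sum(1 for s in samples if s < 0) / n

print(f"<exp(-dS)> = {avg:.3f}")       # close to 1
print(f"P(dS < 0)  = {neg_frac:.3f}")  # rare negative-entropy (rogue) events
```

    The negative-ΔS samples play the role of the rogue-wave events described in the abstract: they must exist for the average of exp(-ΔS) to reach 1, but they are exponentially suppressed.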

  1. Fundamental limits on quantum dynamics based on entropy change

    NASA Astrophysics Data System (ADS)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.
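    As a minimal illustration of the unital case (an assumed qubit example, not taken from the paper), pure dephasing is a unital channel, and the von Neumann entropy is indeed non-decreasing along it:

```python
import math

def qubit_entropy(p, c):
    """Von Neumann entropy (nats) of rho = [[p, c], [conj(c), 1 - p]]."""
    r = math.sqrt((2 * p - 1) ** 2 + 4 * abs(c) ** 2)   # Bloch-vector length
    eigs = [(1 + r) / 2, (1 - r) / 2]
    return -sum(x * math.log(x) for x in eigs if x > 0)

# Dephasing (unital: it maps the identity to itself) shrinks the
# off-diagonal coherence c -> (1 - g) * c and leaves the populations fixed.
p, c = 0.7, 0.4j
entropies = [qubit_entropy(p, (1 - g) * c) for g in (0.0, 0.3, 0.6, 1.0)]
print(entropies)   # non-decreasing, as the unital case requires
```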

  2. The gravity dual of Rényi entropy

    DOE PAGES

    Dong, Xi

    2016-08-12

    A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement entropy, whereas a full understanding of a quantum state requires Re´nyi entropies. Here we show that all Rényi entropies satisfy a similar area law in holographic theories and are given by the areas of dual cosmic branes. This geometricmore » prescription is a one-parameter generalization of the minimal surface prescription for entanglement entropy. Applying this we provide the first holographic calculation of mutual Re´nyi information between two disks of arbitrary dimension. Our results provide a framework for efficiently studying Re´nyi entropies and understanding entanglement structures in strongly coupled systems and quantum gravity.« less

  3. State fusion entropy for continuous and site-specific analysis of landslide stability changing regularities

    NASA Astrophysics Data System (ADS)

    Liu, Yong; Qin, Zhimeng; Hu, Baodan; Feng, Shuai

    2018-04-01

    Stability analysis is of great significance to landslide hazard prevention, especially analysis of dynamic stability. However, many existing stability analysis methods struggle to analyse continuous landslide stability and its changing regularities under a uniform criterion, owing to site-specific landslide geological conditions. Based on the relationship between displacement monitoring data, deformation states and landslide stability, a state fusion entropy method is herein proposed to derive landslide instability through a comprehensive multi-attribute entropy analysis of deformation states, which are defined by a proposed joint clustering method combining K-means and a cloud model. In a detailed case study of the Xintan landslide, the cumulative state fusion entropy shows a clear increasing trend after the landslide entered its accelerative deformation stage, and its historical maxima correspond closely to macroscopic deformation behaviours at key points in time. Reasonable results are also obtained in applications to several other landslides in the Three Gorges Reservoir area of China. Combined with field surveys, state fusion entropy may serve for assessing landslide stability and judging landslide evolutionary stages.

  4. The Holographic Entropy Cone

    DOE PAGES

    Bao, Ning; Nezami, Sepehr; Ooguri, Hirosi; ...

    2015-09-21

    We initiate a systematic enumeration and classification of entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries. For 2, 3, and 4 regions, we prove that the strong subadditivity and the monogamy of mutual information give the complete set of inequalities. This is in contrast to the situation for generic quantum systems, where a complete set of entropy inequalities is not known for 4 or more regions. We also find an infinite new family of inequalities applicable to 5 or more regions. The set of all holographic entropy inequalities bounds the phase space of Ryu-Takayanagi entropies, defining the holographic entropy cone. We characterize this entropy cone by reducing geometries to minimal graph models that encode the possible cutting and gluing relations of minimal surfaces. We find that, for a fixed number of regions, there are only finitely many independent entropy inequalities. To establish new holographic entropy inequalities, we introduce a combinatorial proof technique that may also be of independent interest in Riemannian geometry and graph theory.

  5. An efficient algorithm for automatic phase correction of NMR spectra based on entropy minimization

    NASA Astrophysics Data System (ADS)

    Chen, Li; Weng, Zhiqiang; Goh, LaiYoong; Garland, Marc

    2002-09-01

    A new algorithm for automatic phase correction of NMR spectra based on entropy minimization is proposed. The optimal zero-order and first-order phase corrections for an NMR spectrum are determined by minimizing entropy. The objective function is constructed using a Shannon-type information entropy measure, with entropy computed from the normalized first derivative of the NMR spectral data. The algorithm has been successfully applied to experimental 1H NMR spectra. The results of automatic phase correction are found to be comparable to, or perhaps better than, manual phase correction. The advantages of this automatic phase correction algorithm include its simple mathematical basis and the straightforward, reproducible, and efficient optimization procedure. The algorithm is implemented in the Matlab program ACME—Automated phase Correction based on Minimization of Entropy.

  6. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Treesearch

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...

  7. Isobaric yield ratio difference and Shannon information entropy

    NASA Astrophysics Data System (ADS)

    Ma, Chun-Wang; Wei, Hui-Ling; Wang, Shan-Shan; Ma, Yu-Gang; Wada, Ryoichi; Zhang, Yan-Li

    2015-03-01

    The Shannon information entropy theory is used to explain the recently proposed isobaric yield ratio difference (IBD) probe, which aims to determine the nuclear symmetry energy. Theoretically, the difference between the Shannon uncertainties carried by isobars in two different reactions (ΔIn21) is found to be equivalent to the difference between the chemical potentials of protons and neutrons of the reactions [the IBD probe Δ(βμ)21, with β the inverse temperature]. From the viewpoint of Shannon information entropy, the physical meaning of the above chemical potential difference is interpreted via ΔIn21 as denoting the nuclear symmetry energy or the density difference between neutrons and protons in the reactions, more concisely than in the statistical abrasion-ablation model.

  8. Shannon entropy and Fisher information of the one-dimensional Klein-Gordon oscillator with energy-dependent potential

    NASA Astrophysics Data System (ADS)

    Boumali, Abdelmalek; Labidi, Malika

    2018-02-01

    In this paper we first studied the influence of energy-dependent potentials on the one-dimensional Klein-Gordon oscillator. Then the Shannon entropy and Fisher information of this system are investigated. The position and momentum information entropies for the low-lying states n = 0, 1, 2 are calculated. Some interesting features of both the Fisher and Shannon densities, as well as the probability densities, are demonstrated. Finally, the Stam, Cramer-Rao and Bialynicki-Birula-Mycielski (BBM) inequalities have been checked, and comparisons with the corresponding results are reported. We showed that the BBM inequality remains valid in the form Sx + Sp ≥ 1 + ln π, as in ordinary quantum mechanics.
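    The BBM bound quoted above, Sx + Sp ≥ 1 + ln π (entropies in nats), is saturated by the Gaussian ground state of the ordinary harmonic oscillator. A quick numerical check of that limiting case (natural units, simple Riemann sum; not the paper's energy-dependent model):

```python
import math

def shannon_entropy(density, dx):
    """Differential Shannon entropy of a sampled probability density."""
    return -sum(d * math.log(d) * dx for d in density if d > 1e-300)

dx = 0.001
xs = [i * dx - 10.0 for i in range(20001)]
rho = [math.exp(-x * x) / math.sqrt(math.pi) for x in xs]   # |psi_0(x)|^2

Sx = shannon_entropy(rho, dx)
Sp = Sx    # the ground-state momentum density is the same Gaussian
print(Sx + Sp, 1 + math.log(math.pi))   # nearly equal: the bound is saturated
```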

  9. Wang-Landau method for calculating Rényi entropies in finite-temperature quantum Monte Carlo simulations.

    PubMed

    Inglis, Stephen; Melko, Roger G

    2013-01-01

    We implement a Wang-Landau sampling technique in quantum Monte Carlo (QMC) simulations for the purpose of calculating the Rényi entanglement entropies and associated mutual information. The algorithm converges an estimate for an analog to the density of states for stochastic series expansion QMC, allowing a direct calculation of Rényi entropies without explicit thermodynamic integration. We benchmark results for the mutual information on two-dimensional (2D) isotropic and anisotropic Heisenberg models, a 2D transverse field Ising model, and a three-dimensional Heisenberg model, confirming a critical scaling of the mutual information in cases with a finite-temperature transition. We discuss the benefits and limitations of broad sampling techniques compared to standard importance sampling methods.

  10. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on similarity distance, matches signal patterns in a different way and reveals distinct behaviors of complexity. Simulations are conducted over synthetic data and traffic signals to provide a comparative study that shows the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation on the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems via the similarity measure.
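    For reference, the classic (unmodified) sample entropy that the paper generalizes can be sketched as follows, using the usual Chebyshev distance and a tolerance r given as a fraction of the standard deviation (the parameter values are conventional defaults, not the paper's choices):

```python
import math
import random

def sample_entropy(series, m=2, r_frac=0.2):
    """Classic sample entropy: -ln(A/B), with B the number of matching
    template pairs of length m and A those of length m + 1."""
    n = len(series)
    mu = sum(series) / n
    r = r_frac * math.sqrt(sum((x - mu) ** 2 for x in series) / n)

    def matches(mm):
        templates = [series[i:i + mm] for i in range(n - mm + 1)]
        return sum(
            1
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
        )

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

random.seed(1)
se_regular = sample_entropy([math.sin(0.1 * i) for i in range(300)])
se_noise = sample_entropy([random.gauss(0, 1) for _ in range(300)])
print(se_regular, se_noise)   # the regular signal scores much lower
```

    The generalized method of the paper replaces the Chebyshev match criterion with other similarity measures; the skeleton above stays the same.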

  11. ENTROPY VS. ENERGY WAVEFORM PROCESSING: A COMPARISON ON THE HEAT EQUATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Michael S.; McCarthy, John; Bruillard, Paul J.

    2015-05-25

    Virtually all modern imaging devices function by collecting either electromagnetic or acoustic backscattered waves and using the energy carried by these waves to determine pixel values that build up what is basically an "energy" picture. However, waves also carry "information" that may likewise be used to compute the pixel values in an image. We have employed several measures of information, all of which are based on different forms of entropy. Numerous published studies have demonstrated the advantages of entropy, or "information imaging", over conventional methods for materials characterization and medical imaging. Similar results also have been obtained with microwaves. The most sensitive information measure appears to be the joint entropy of the backscattered wave and a reference signal. A typical study comprises repeated acquisition of backscattered waves from a specimen that is changing slowly with acquisition time or location. The sensitivity of repeated experimental observations of such a slowly changing quantity may be defined as the mean variation (i.e., observed change) divided by the mean variance (i.e., observed noise). We compute the sensitivity for joint entropy and signal energy measurements assuming that the noise is Gaussian and using Wiener integration to compute the required mean values and variances. These can be written as solutions to the heat equation, which permits estimation of their magnitudes. There always exists a reference such that joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, matching observations of several studies. Moreover, a general prescription for finding an "optimal" reference for the joint entropy emerges, which also has been validated in several studies.

  12. Quantum darwinism in a mixed environment.

    PubMed

    Zwolak, Michael; Quan, H T; Zurek, Wojciech H

    2009-09-11

    Quantum Darwinism recognizes that we-the observers-acquire our information about the "systems of interest" indirectly from their imprints on the environment. Here, we show that information about a system can be acquired from a mixed-state, or hazy, environment, but the storage capacity of an environment fragment is suppressed by its initial entropy. In the case of good decoherence, the mutual information between the system and the fragment is given solely by the fragment's entropy increase. For fairly mixed environments, this means a reduction by a factor 1-h, where h is the haziness of the environment, i.e., the initial entropy of an environment qubit. Thus, even such hazy environments eventually reveal the state of the system, although now the intercepted environment fragment must be larger by a factor of approximately (1-h)^(-1) to gain the same information about the system.

  13. Quantum Darwinism in a Mixed Environment

    NASA Astrophysics Data System (ADS)

    Zwolak, Michael; Quan, H. T.; Zurek, Wojciech H.

    2009-09-01

    Quantum Darwinism recognizes that we—the observers—acquire our information about the “systems of interest” indirectly from their imprints on the environment. Here, we show that information about a system can be acquired from a mixed-state, or hazy, environment, but the storage capacity of an environment fragment is suppressed by its initial entropy. In the case of good decoherence, the mutual information between the system and the fragment is given solely by the fragment’s entropy increase. For fairly mixed environments, this means a reduction by a factor 1-h, where h is the haziness of the environment, i.e., the initial entropy of an environment qubit. Thus, even such hazy environments eventually reveal the state of the system, although now the intercepted environment fragment must be larger by approximately (1-h)^(-1) to gain the same information about the system.

  14. Symmetric polynomials in information theory: Entropy and subentropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jozsa, Richard; Mitchison, Graeme

    2015-06-15

    Entropy and other fundamental quantities of information theory are customarily expressed and manipulated as functions of probabilities. Here we study the entropy H and subentropy Q as functions of the elementary symmetric polynomials in the probabilities and reveal a series of remarkable properties. Derivatives of all orders are shown to satisfy a complete monotonicity property. H and Q themselves become multivariate Bernstein functions and we derive the density functions of their Lévy-Khintchine representations. We also show that H and Q are Pick functions in each symmetric polynomial variable separately. Furthermore, we see that H and the intrinsically quantum informational quantity Q become surprisingly closely related in functional form, suggesting a special significance for the symmetric polynomials in quantum information theory. Using the symmetric polynomials, we also derive a series of further properties of H and Q.

  15. Eigensolutions, Shannon entropy and information energy for modified Tietz-Hua potential

    NASA Astrophysics Data System (ADS)

    Onate, C. A.; Onyeaju, M. C.; Ituen, E. E.; Ikot, A. N.; Ebomwonyi, O.; Okoro, J. O.; Dopamu, K. O.

    2018-04-01

    The Tietz-Hua potential is modified by adding a term of the form D_e [(c_h - 1)/(1 - c_h e^(-b_h(r - r_e)))] b e^(-b_h(r - r_e)) to the Tietz-Hua potential model, since a potential of this type describes the vibrational energy levels of diatomic molecules very well. The energy eigenvalues and the corresponding eigenfunctions are obtained explicitly using the parametric Nikiforov-Uvarov method. On setting the potential parameter b = 0, the modified Tietz-Hua potential reduces to the Tietz-Hua potential. To show further applications of our work, we have computed the Shannon entropy and information energy under the modified Tietz-Hua potential. The computation of the Shannon entropy and information energy extends the work of Falaye et al., who computed only the Fisher information under the Tietz-Hua potential.

  16. Entropy inequality and hydrodynamic limits for the Boltzmann equation.

    PubMed

    Saint-Raymond, Laure

    2013-12-28

    Boltzmann brought a fundamental contribution to the understanding of the notion of entropy, by giving a microscopic formulation of the second principle of thermodynamics. His ingenious idea, motivated by the works of his contemporaries on the atomic nature of matter, consists of describing gases as huge systems of identical and indistinguishable elementary particles. The state of a gas can therefore be described in a statistical way. The evolution, which introduces couplings, loses part of the information, which is expressed by the decay of the so-called mathematical entropy (the opposite of physical entropy!).

  17. Entropic manifestations of topological order in three dimensions

    NASA Astrophysics Data System (ADS)

    Bullivant, Alex; Pachos, Jiannis K.

    2016-03-01

    We evaluate the entanglement entropy of exactly solvable Hamiltonians corresponding to general families of three-dimensional topological models. We show that the modification to the entropic area law due to three-dimensional topological properties is richer than the two-dimensional case. In addition to the reduction of the entropy caused by a nonzero vacuum expectation value of contractible loop operators, a topological invariant emerges that increases the entropy if the model consists of nontrivially braiding anyons. As a result the three-dimensional topological entanglement entropy provides only partial information about the two entropic topological invariants.

  18. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    PubMed

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
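    A pocket-sized Gaussian example in the same spirit (three correlated dofs rather than the paper's m > 100): the exact entropy of a multivariate normal is ½ ln((2πe)^m det Σ), so the first-order MIE estimate, the sum of marginal entropies, overshoots it by exactly the total correlation:

```python
import math

def det3(a):
    """Determinant of a 3x3 matrix (expansion along the first row)."""
    return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
            - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
            + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

rho = 0.5   # correlation decaying exponentially with index distance
Sigma = [[rho ** abs(i - j) for j in range(3)] for i in range(3)]

exact = 0.5 * math.log((2 * math.pi * math.e) ** 3 * det3(Sigma))
order1 = sum(0.5 * math.log(2 * math.pi * math.e * Sigma[i][i])
             for i in range(3))   # first-order MIE: sum of marginal entropies

print(order1 - exact)   # total correlation: positive, so order 1 overestimates
```

    Higher MIE orders subtract pairwise (and then higher-order) mutual informations from this sum; the paper's point is that whether those corrections converge depends on how fast the correlations decay.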

  19. Beyond the Shannon–Khinchin formulation: The composability axiom and the universal-group entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tempesta, Piergiulio, E-mail: p.tempesta@fis.ucm.es

    2016-02-15

    The notion of entropy is ubiquitous both in natural and social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann–Gibbs (BG) entropy and could be applicable in thermodynamics, quantum mechanics and information theory. In Khinchin (1957), by extending previous ideas of Shannon (1948) and Shannon and Weaver (1949), Khinchin proposed a characterization of the BG entropy, based on four requirements, nowadays known as the Shannon–Khinchin (SK) axioms. The purpose of this paper is twofold. First, we show that there exists an intrinsic group-theoretical structure behind the notion of entropy. It comes from the requirement of composability of an entropy with respect to the union of two statistically independent systems, that we propose in an axiomatic formulation. Second, we show that there exists a simple universal family of trace-form entropies. This class contains many well known examples of entropies and infinitely many new ones, a priori multi-parametric. Due to its specific relation with Lazard’s universal formal group of algebraic topology, the new general entropy introduced in this work will be called the universal-group entropy. A new example of multi-parametric entropy is explicitly constructed.

  20. Formal groups and Z-entropies

    PubMed Central

    2016-01-01

    We shall prove that the celebrated Rényi entropy is the first example of a new family of infinitely many multi-parametric entropies. We shall call them the Z-entropies. Each of them, under suitable hypotheses, generalizes the celebrated entropies of Boltzmann and Rényi. A crucial aspect is that every Z-entropy is composable (Tempesta 2016 Ann. Phys. 365, 180–197. (doi:10.1016/j.aop.2015.08.013)). This property means that the entropy of a system which is composed of two or more independent systems depends, in all the associated probability space, on the choice of the two systems only. Further properties are also required to describe the composition process in terms of a group law. The composability axiom, introduced as a generalization of the fourth Shannon–Khinchin axiom (postulating additivity), is a highly non-trivial requirement. Indeed, in the trace-form class, the Boltzmann entropy and Tsallis entropy are the only known composable cases. However, in the non-trace form class, the Z-entropies arise as new entropic functions possessing the mathematical properties necessary for information-theoretical applications, in both classical and quantum contexts. From a mathematical point of view, composability is intimately related to formal group theory of algebraic topology. The underlying group-theoretical structure determines crucially the statistical properties of the corresponding entropies. PMID:27956871
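    The simplest instance of the composability property described above is plain additivity: for two statistically independent systems, the Rényi entropy of the joint distribution equals the sum of the marginal Rényi entropies. A quick check with arbitrary example distributions:

```python
import math
from itertools import product

def renyi(p, alpha):
    """Rényi entropy H_alpha(p) in nats, for alpha != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.3, 0.2]                        # arbitrary example distributions
q = [0.6, 0.4]
joint = [a * b for a, b in product(p, q)]  # independent joint distribution

alpha = 2.0
print(renyi(joint, alpha), renyi(p, alpha) + renyi(q, alpha))   # equal
```

    The Z-entropies generalize this: the joint entropy is still a function of the two marginal entropies alone, but the composition law is a group law more general than simple addition.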

  1. A deeper look at two concepts of measuring gene-gene interactions: logistic regression and interaction information revisited.

    PubMed

    Mielniczuk, Jan; Teisseyre, Paweł

    2018-03-01

    Detection of gene-gene interactions is one of the most important challenges in genome-wide case-control studies. Besides traditional logistic regression analysis, entropy-based methods have recently attracted significant attention. Among entropy-based methods, interaction information is one of the most promising measures, having many desirable properties. Although both logistic regression and interaction information have been used in several genome-wide association studies, the relationship between them has not been thoroughly investigated theoretically. The present paper attempts to fill this gap. We show that although certain connections between the two methods exist, in general they refer to two different concepts of dependence, and looking for interactions in those two senses leads to different approaches to interaction detection. We introduce an ordering between interaction measures and specify conditions for independent and dependent genes under which interaction information is a more discriminative measure than logistic regression. Moreover, we show that for so-called perfect distributions these measures are equivalent. The numerical experiments illustrate the theoretical findings, indicating that interaction information and its modified version are more universal tools for detecting various types of interaction than logistic regression and linkage disequilibrium measures. © 2017 WILEY PERIODICALS, INC.

  2. Entanglement evaluation with atomic Fisher information

    NASA Astrophysics Data System (ADS)

    Obada, A.-S. F.; Abdel-Khalek, S.

    2010-02-01

    In this paper, the concept of atomic Fisher information (AFI) is introduced. The marginal distributions of the AFI are defined. This quantity is used as a parameter of entanglement and compared with the linear and atomic Wehrl entropies of the two-level atom. The evolution of the atomic Fisher information and atomic Wehrl entropy is analyzed for the pure-state (dissipation-free) case of the Jaynes-Cummings model. We demonstrate the connections between these measures.

  3. Supersymmetric Renyi entropy in CFT2 and AdS3

    DOE PAGES

    Giveon, Amit; Kutasov, David

    2016-01-01

    We show that in any two dimensional conformal field theory with (2, 2) supersymmetry one can define a supersymmetric analog of the usual Renyi entropy of a spatial region A. It differs from the Renyi entropy by a universal function (which we compute) of the central charge, Renyi parameter n and the geometric parameters of A. In the limit n → 1 it coincides with the entanglement entropy. Thus, it contains the same information as the Renyi entropy but its computation only involves correlation functions of chiral and anti-chiral operators. We also show that this quantity appears naturally in string theory on AdS3.

  4. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

    Background We introduce approximate entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene expression patterns resulting from a clinical sample set. Since entropy is a measure of disorder in a system, we believe that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set we will be able to distinguish those genes which display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify genes of interest with the highest probability of producing an accurate, predictive classification model from our data set. Results In the development of a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells. This may be measured in terms of the ApSE when compared to normal tissue. We have validated 10 of these genes on 10 normal and 20 cancer and CIN3 samples. We report that the predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity from a biological system with such a relatively small sample set, and in extracting biologically relevant genes of interest, holds great promise. PMID:19232110
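    The standard approximate entropy statistic underlying the paper's ApSE analysis can be sketched as follows (a generic implementation with invented stand-in data, not the study's expression values): a constant "normal" profile scores zero, while an erratic "malignant" profile scores higher.

```python
import math
import random

def approx_entropy(series, m=2, r=0.2):
    """Approximate entropy (ApEn): phi(m) - phi(m + 1), where phi averages
    the log-frequency of template matches (self-matches included)."""
    n = len(series)

    def phi(mm):
        templates = [series[i:i + mm] for i in range(n - mm + 1)]
        logs = []
        for t in templates:
            c = sum(1 for u in templates
                    if max(abs(a - b) for a, b in zip(t, u)) <= r)
            logs.append(math.log(c / len(templates)))
        return sum(logs) / len(logs)

    return phi(m) - phi(m + 1)

random.seed(7)
ap_steady = approx_entropy([1.0] * 50)                        # unchanging gene
ap_erratic = approx_entropy([random.uniform(0, 2) for _ in range(50)])
print(ap_steady, ap_erratic)   # 0 for the constant signal, larger for noise
```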

  5. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
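    In its textbook form (an illustrative ergodic example, not the paper's generalized setting), the MEP with a mean constraint yields the exponential (Boltzmann-Gibbs) family, with the Lagrange multiplier fixed by matching the constraint:

```python
import math

outcomes = range(6)      # states 0..5 of a toy system
target_mean = 1.5        # observed constraint: <k> = 1.5

def mean_for(lam):
    """Mean of the Gibbs distribution p_k ~ exp(-lam * k)."""
    w = [math.exp(-lam * k) for k in outcomes]
    z = sum(w)
    return sum(k * wk for k, wk in zip(outcomes, w)) / z

# mean_for is strictly decreasing in lam, so bisection finds the multiplier.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

w = [math.exp(-lam * k) for k in outcomes]
z = sum(w)
p = [wk / z for wk in w]   # the maximum-entropy distribution
print([round(x, 4) for x in p], round(mean_for(lam), 6))
```

    The paper's argument concerns when this recipe still makes sense once the Boltzmann-Gibbs-Shannon entropy is replaced by a generalized (c,d)-entropy for non-ergodic, path-dependent processes.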

  6. Entropy Analysis of Kinetic Flux Vector Splitting Schemes for the Compressible Euler Equations

    NASA Technical Reports Server (NTRS)

    Shiuhong, Lui; Xu, Jun

    1999-01-01

    Flux Vector Splitting (FVS) schemes form one group of approximate Riemann solvers for the compressible Euler equations. In this paper, the discretized entropy condition of the Kinetic Flux Vector Splitting (KFVS) scheme based on gas-kinetic theory is proved. The proof of the entropy condition involves the difference in the entropy definition between distinguishable and indistinguishable particles.

  7. PREFACE: Mathematical Aspects of Generalized Entropies and their Applications

    NASA Astrophysics Data System (ADS)

    Suyari, Hiroki; Ohara, Atsumi; Wada, Tatsuaki

    2010-01-01

Amid the recent increasing interest in power-law behaviors beyond the usual exponential ones, there have been concrete attempts in statistical physics to generalize the standard Boltzmann-Gibbs statistics. Among such generalizations, nonextensive statistical mechanics has been well studied for about the last two decades, with many modifications and refinements. The generalization has provided not only a theoretical framework but also many applications, such as chaos, multifractals, complex systems, nonequilibrium statistical mechanics, biophysics, econophysics, information theory and so on. Alongside the developments in the generalization of statistical mechanics, the corresponding mathematical structures have also been required and uncovered. In particular, some deep connections to mathematical sciences such as q-analysis, information geometry, information theory and quantum probability theory have been revealed recently. These results clearly indicate the existence of a generalized mathematical structure that includes the mathematical framework for the exponential family as a special case, but the whole structure is still unclear. To provide an opportunity for scientists in many fields to discuss the mathematical structure induced from generalized entropies, the international workshop 'Mathematical Aspects of Generalized Entropies and their Applications' was held on 7-9 July 2009 at Kyoto TERRSA, Kyoto, Japan. This volume is the proceedings of the workshop, which consisted of 6 invited speakers, 14 oral presenters, 7 poster presenters and 63 other participants. The topics of the workshop cover nonextensive statistical mechanics, chaos, cosmology, information geometry, divergence theory, econophysics, materials engineering, molecular dynamics and entropy theory, information theory and so on. The workshop was organized as the first attempt to discuss these mathematical aspects with leading experts in each area. 
We would like to express special thanks to all the invited speakers, the contributors and the participants at the workshop. We are also grateful to RIMS (Research Institute for Mathematical Science) in Kyoto University and the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (B), 18300003, 2009 for their support. Organizing Committee Editors of the Proceedings Hiroki Suyari (Chiba University, Japan) Atsumi Ohara (Osaka University, Japan) Tatsuaki Wada (Ibaraki University, Japan) Conference photograph

  8. A perspective on two chemometrics tools: PCA and MCR, and introduction of a new one: Pattern recognition entropy (PRE), as applied to XPS and ToF-SIMS depth profiles of organic and inorganic materials

    NASA Astrophysics Data System (ADS)

    Chatterjee, Shiladitya; Singh, Bhupinder; Diwan, Anubhav; Lee, Zheng Rong; Engelhard, Mark H.; Terry, Jeff; Tolley, H. Dennis; Gallagher, Neal B.; Linford, Matthew R.

    2018-03-01

    X-ray photoelectron spectroscopy (XPS) and time-of-flight secondary ion mass spectrometry (ToF-SIMS) are much used analytical techniques that provide information about the outermost atomic and molecular layers of materials. In this work, we discuss the application of multivariate spectral techniques, including principal component analysis (PCA) and multivariate curve resolution (MCR), to the analysis of XPS and ToF-SIMS depth profiles. Multivariate analyses often provide insight into data sets that is not easily obtained in a univariate fashion. Pattern recognition entropy (PRE), which has its roots in Shannon's information theory, is also introduced. This approach is not the same as the mutual information/entropy approaches sometimes used in data processing. A discussion of the theory of each technique is presented. PCA, MCR, and PRE are applied to four different data sets obtained from: a ToF-SIMS depth profile through ca. 100 nm of plasma polymerized C3F6 on Si, a ToF-SIMS depth profile through ca. 100 nm of plasma polymerized PNIPAM (poly (N-isopropylacrylamide)) on Si, an XPS depth profile through a film of SiO2 on Si, and an XPS depth profile through a film of Ta2O5 on Ta. PCA, MCR, and PRE reveal the presence of interfaces in the films, and often indicate that the first few scans in the depth profiles are different from those that follow. PRE and backward difference PRE provide this information in a straightforward fashion. Rises in the PRE signals at interfaces suggest greater complexity to the corresponding spectra. Results from PCA, especially for the higher principal components, were sometimes difficult to understand. MCR analyses were generally more interpretable.
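In its simplest reading, PRE tracks the Shannon entropy of each normalized spectrum through the depth profile, with backward differences highlighting where consecutive scans change. The sketch below uses our own naming and toy data; it is not the authors' implementation.

```python
import math

def spectrum_entropy(intensities):
    """Shannon entropy H = -sum(p_i * log p_i) of one normalized spectrum.

    Minimal sketch of the idea behind pattern recognition entropy (PRE):
    each scan of a depth profile is treated as a probability distribution
    over its channels, and H is tracked scan by scan. Rises in H flag
    scans whose spectra are more complex, e.g. at buried interfaces.
    """
    total = sum(intensities)
    p = [v / total for v in intensities if v > 0]
    return -sum(pi * math.log(pi) for pi in p)

# Illustrative scans: entropy rises as the spectrum becomes less peaked.
scans = [[4, 1, 0, 0], [2, 2, 1, 1], [1, 1, 1, 1]]
h = [spectrum_entropy(s) for s in scans]
dpre = [h[k] - h[k - 1] for k in range(1, len(h))]  # backward-difference PRE
```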

  9. Intrasubject multimodal groupwise registration with the conditional template entropy.

    PubMed

    Polfliet, Mathias; Klein, Stefan; Huizinga, Wyke; Paulides, Margarethus M; Niessen, Wiro J; Vandemeulebroucke, Jef

    2018-05-01

    Image registration is an important task in medical image analysis. Whereas most methods are designed for the registration of two images (pairwise registration), there is an increasing interest in simultaneously aligning more than two images using groupwise registration. Multimodal registration in a groupwise setting remains difficult, due to the lack of generally applicable similarity metrics. In this work, a novel similarity metric for such groupwise registration problems is proposed. The metric calculates the sum of the conditional entropy between each image in the group and a representative template image constructed iteratively using principal component analysis. The proposed metric is validated in extensive experiments on synthetic and intrasubject clinical image data. These experiments showed equivalent or improved registration accuracy compared to other state-of-the-art (dis)similarity metrics and improved transformation consistency compared to pairwise mutual information. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
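The metric described above sums conditional entropies between each image in the group and the template. One such term can be estimated from a discretized joint histogram via H(I | T) = H(I, T) - H(T); the binning scheme and names below are illustrative assumptions, not the authors' implementation.

```python
import math
from collections import Counter

def conditional_entropy(image, template, bins=8, lo=0.0, hi=1.0):
    """H(image | template) estimated from a discretized joint histogram.

    Intensities (given as flat lists of corresponding values) are binned,
    the joint distribution p(i, t) is estimated by counting co-occurring
    pairs, and H(I | T) = H(I, T) - H(T).
    """
    def bin_of(v):
        k = int((v - lo) / (hi - lo) * bins)
        return min(max(k, 0), bins - 1)

    joint = Counter((bin_of(a), bin_of(b)) for a, b in zip(image, template))
    n = sum(joint.values())
    marg_t = Counter()
    for (_, t), c in joint.items():
        marg_t[t] += c
    h_joint = -sum(c / n * math.log(c / n) for c in joint.values())
    h_t = -sum(c / n * math.log(c / n) for c in marg_t.values())
    return h_joint - h_t

img = [0.05, 0.15, 0.35, 0.65, 0.95]
ce_same = conditional_entropy(img, img)  # identical images: no residual uncertainty
```

In a groupwise setting, this term would be summed over all images against the iteratively updated template.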

  10. Copula Entropy coupled with Wavelet Neural Network Model for Hydrological Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Yin; Yue, JiGuang; Liu, ShuGuang; Wang, Li

    2018-02-01

Artificial neural networks (ANNs) have been widely used in hydrological forecasting. In this paper, an attempt has been made to find an alternative method for hydrological prediction by combining Copula Entropy (CE) with a Wavelet Neural Network (WNN). CE theory permits the calculation of mutual information (MI) for input variable selection, which avoids the limitations of traditional linear correlation coefficient (LCC) analysis. Wavelet analysis can locate exactly any changes in the dynamical patterns of the sequence and, coupled with the strong nonlinear fitting ability of ANNs, the WNN model was able to provide a good fit to the hydrological data. Finally, the hybrid model (CE+WNN) has been applied to daily water levels of the Taihu Lake Basin and compared with CE+ANN, LCC+WNN and LCC+ANN. Results showed that the hybrid model produced better estimates of the hydrograph properties than the other models.

  11. Multiscale Shannon entropy and its application in the stock market

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao

    2017-10-01

In this paper, we perform a multiscale entropy analysis on the Dow Jones Industrial Average Index using the Shannon entropy. The stock index shows a characteristic multiscale entropy structure caused by noise in the market. The entropy is demonstrated to have significant predictive ability for the stock index over both the long term and the short term, and empirical results verify that noise does exist in the market and can affect stock prices. This has important implications for market participants such as noise traders.
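A minimal sketch of the coarse-graining step behind such a multiscale Shannon entropy analysis: average the series over non-overlapping windows at each scale, then take the Shannon entropy of the histogram of values. The bin count, window scheme, and naming are our assumptions, not the paper's procedure.

```python
import math

def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale`: the standard
    coarse-graining step of multiscale entropy analysis."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def shannon_entropy(series, bins=10):
    """Shannon entropy of the empirical (histogram) distribution of values."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0  # guard against a constant series
    counts = [0] * bins
    for v in series:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(series)
    return -sum(c / n * math.log(c / n) for c in counts if c)

# Entropy profile across time scales for an illustrative signal.
series = [math.sin(0.1 * t) for t in range(1000)]
profile = [shannon_entropy(coarse_grain(series, s)) for s in (1, 2, 4, 8)]
```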

  12. Entropy analysis of frequency and shape change in horseshoe bat biosonar

    NASA Astrophysics Data System (ADS)

    Gupta, Anupam K.; Webster, Dane; Müller, Rolf

    2018-06-01

Echolocating bats use ultrasonic pulses to collect information about their environments. Some of this information is encoded at the baffle structures—noseleaves (emission) and pinnae (reception)—that act as interfaces between the bats' biosonar systems and the external world. The baffle beam patterns encode the direction-dependent sensory information as a function of frequency and hence represent a view of the environment. To generate diverse views of the environment, the bats can vary beam patterns by changing (1) the wavelengths of the pulses or (2) the baffle geometries. Here we compare the variability in sensory information encoded by the use of frequency dynamics alone versus baffle shape dynamics in horseshoe bats. For this, we use digital and physical prototypes of both noseleaves and pinnae. The beam patterns for all prototypes were either measured or numerically predicted. Entropy was used to compare variability as a measure of sensory information encoding capacity. It was found that new information was acquired as a result of shape dynamics. Furthermore, the overall variability available for information encoding was similar for frequency and shape dynamics. Thus, shape dynamics allows horseshoe bats to generate diverse views of the environment in the absence of broadband biosonar signals.

  13. Information loss in effective field theory: Entanglement and thermal entropies

    NASA Astrophysics Data System (ADS)

    Boyanovsky, Daniel

    2018-03-01

Integrating out high energy degrees of freedom to yield a low energy effective field theory leads to a loss of information with a concomitant increase in entropy. We obtain the effective field theory of a light scalar field interacting with heavy fields after tracing out the heavy degrees of freedom from the time evolved density matrix. The initial density matrix describes the light field in its ground state and the heavy fields in equilibrium at a common temperature T. For T = 0, we obtain the reduced density matrix in a perturbative expansion; it reveals an emergent mixed state as a consequence of the entanglement between light and heavy fields. We obtain the effective action that determines the time evolution of the reduced density matrix for the light field in a nonperturbative Dyson resummation of one-loop correlations of the heavy fields. The von Neumann entanglement entropy associated with the reduced density matrix is obtained for the nonresonant and resonant cases in the asymptotic long time limit. In the nonresonant case the reduced density matrix displays an incipient thermalization, albeit with a wave-vector-, time- and coupling-dependent effective temperature, as a consequence of memory of initial conditions. The entanglement entropy is time independent and is the thermal entropy for this effective, nonequilibrium temperature. In the resonant case the light field fully thermalizes with the heavy fields, the reduced density matrix loses memory of the initial conditions and the entanglement entropy becomes the thermal entropy of the light field. We discuss the relation between the entanglement entropy ultraviolet divergences and renormalization.

  14. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin

    PubMed Central

    2017-01-01

It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota, evaluate entropy transfer between all pairs of residues of Ubiquitin, and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the new concepts required for explaining allosteric communication in proteins. PMID:28095404
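Schreiber's transfer entropy, on which the method above is based, can be sketched for discrete (symbolized) series using plug-in probability estimates with history length 1. The example is illustrative only and is not the authors' molecular dynamics pipeline; names and test data are our own.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Schreiber transfer entropy T_{Y->X} for discrete series:

        T = sum over (x1, x0, y0) of p(x1, x0, y0) *
            log[ p(x1 | x0, y0) / p(x1 | x0) ]

    estimated with plug-in (empirical) probabilities and history length 1.
    """
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    t = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        t += (c / n) * math.log(p_cond_xy / p_cond_x)
    return t

# x copies y with a one-step lag, so knowing y0 removes the uncertainty
# in x1, and T_{Y->X} should approach H(Y) = log 2 for fair random bits.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]
te = transfer_entropy(x, y)
```

Because it is an empirical conditional mutual information, the plug-in estimate is non-negative, with a small positive bias for finite samples.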

  15. Information, entropy, and fidelity in visual communication

    NASA Astrophysics Data System (ADS)

    Huck, Friedrich O.; Fales, Carl L.; Alter-Gartenberg, Rachel; Rahman, Zia-ur

    1992-10-01

This paper presents an assessment of visual communication that integrates the critical limiting factors of image gathering and display with the digital processing that is used to code and restore images. The approach focuses on two mathematical criteria, information and fidelity, and on their relationships to the entropy of the encoded data and to the visual quality of the restored image.

  16. Information, entropy and fidelity in visual communication

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Alter-Gartenberg, Rachel; Rahman, Zia-Ur

    1992-01-01

    This paper presents an assessment of visual communication that integrates the critical limiting factors of image gathering and display with the digital processing that is used to code and restore images. The approach focuses on two mathematical criteria, information and fidelity, and on their relationships to the entropy of the encoded data and to the visual quality of the restored image.

  17. Rényi squashed entanglement, discord, and relative entropy differences

    NASA Astrophysics Data System (ADS)

    Seshadreesan, Kaushik P.; Berta, Mario; Wilde, Mark M.

    2015-10-01

The squashed entanglement quantifies the amount of entanglement in a bipartite quantum state, and it satisfies all of the axioms desired for an entanglement measure. The quantum discord is a measure of quantum correlations that are different from those due to entanglement. What these two measures have in common is that they are both based upon the conditional quantum mutual information. In Berta et al (2015 J. Math. Phys. 56 022205), we recently proposed Rényi generalizations of the conditional quantum mutual information of a tripartite state on ABC (with C being the conditioning system), which were shown to satisfy some properties that hold for the original quantity, such as non-negativity, duality, and monotonicity with respect to local operations on the system B (with it being left open to show that the Rényi quantity is monotone with respect to local operations on system A). Here we define a Rényi squashed entanglement and a Rényi quantum discord based on a Rényi conditional quantum mutual information and investigate these quantities in detail. Taking as a conjecture that the Rényi conditional quantum mutual information is monotone with respect to local operations on both systems A and B, we prove that the Rényi squashed entanglement and the Rényi quantum discord satisfy many of the properties of the respective original von Neumann entropy based quantities. In our prior work (Berta et al 2015 Phys. Rev. A 91 022333), we also detailed a procedure to obtain Rényi generalizations of any quantum information measure that is equal to a linear combination of von Neumann entropies with coefficients chosen from the set {-1, 0, 1}. Here, we extend this procedure to include differences of relative entropies. 
Using the extended procedure and a conjectured monotonicity of the Rényi generalizations in the Rényi parameter, we discuss potential remainder terms for well known inequalities such as monotonicity of the relative entropy, joint convexity of the relative entropy, and the Holevo bound.

  18. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

We present a novel image encryption algorithm using a Chebyshev polynomial (for permutation and substitution) and a Duffing map (for substitution). Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity testing, and speed testing. The study demonstrates that the proposed image encryption algorithm has a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
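The information entropy test mentioned above is commonly computed over byte (pixel-value) frequencies; a minimal sketch, assuming 8-bit data so that the ideal value for well-encrypted output is 8 bits per byte:

```python
import math
from collections import Counter

def byte_entropy(data):
    """Information entropy in bits per byte: H = -sum p_b * log2(p_b) over
    the observed byte values. For well-encrypted 8-bit data, H should
    approach the ideal value of 8."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

h_uniform = byte_entropy(bytes(range(256)))  # perfectly flat histogram
```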

  19. Entropy production in a photovoltaic cell

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad H.

    2017-05-01

We evaluate entropy production in a photovoltaic cell that is modeled by four electronic levels resonantly coupled to thermally populated field modes at different temperatures. We use a recently proposed formalism, the so-called multiple parallel worlds, to consistently address the nonlinearity of entropy in terms of the density matrix. Our result shows that entropy production is the difference between two flows: a semiclassical flow that depends linearly on occupational probabilities, and another flow that depends nonlinearly on quantum coherence and has no semiclassical analog. We show that entropy production in the cells depends on environmentally induced decoherence time and energy detuning. We characterize regimes where a reversed flow of information takes place, from a cold to a hot bath. Interestingly, we identify a lower bound on entropy production, which sets limitations on the statistics of dissipated heat in the cells.

  20. Detecting spatio-temporal modes in multivariate data by entropy field decomposition

    NASA Astrophysics Data System (ADS)

    Frank, Lawrence R.; Galinsky, Vitaly L.

    2016-09-01

A new data analysis method that addresses the general problem of detecting spatio-temporal variations in multivariate data is presented. The method utilizes two recent and complementary general approaches to data analysis, information field theory (IFT) and entropy spectrum pathways (ESP). Both methods reformulate and incorporate Bayesian theory, and thus use prior information to uncover the underlying structure of the unknown signal. Unification of ESP and IFT creates an approach that is non-Gaussian and nonlinear by construction and is found to produce unique spatio-temporal modes of signal behavior that can be ranked according to their significance, from which space-time trajectories of parameter variations can be constructed and quantified. Two brief examples of real-world applications of the theory to the analysis of data of completely different, unrelated natures, lacking any underlying similarity, are also presented. The first example provides an analysis of resting-state functional magnetic resonance imaging data that allowed us to create an efficient and accurate computational method for assessing and categorizing brain activity. The second example demonstrates the potential of the method in application to the analysis of a strong atmospheric storm circulation system during the complicated stage of tornado development and formation, using data recorded by a mobile Doppler radar. A reference implementation of the method will be made available as part of the QUEST toolkit that is currently under development at the Center for Scientific Computation in Imaging.

  1. Maximum entropy modeling of invasive plants in the forests of Cumberland Plateau and Mountain Region

    Treesearch

    Dawn Lemke; Philip Hulme; Jennifer Brown; Wubishet. Tadesse

    2011-01-01

    As anthropogenic influences on the landscape change the composition of 'natural' areas, it is important that we apply spatial technology in active management to mitigate human impact. This research explores the integration of geographic information systems (GIS) and remote sensing with statistical analysis to assist in modeling the distribution of invasive...

  2. Using quantum erasure to exorcize Maxwell's demon: I. Concepts and context

    NASA Astrophysics Data System (ADS)

    Scully, Marlan O.; Rostovtsev, Yuri; Sariyanni, Zoe-Elizabeth; Suhail Zubairy, M.

    2005-10-01

    Szilard [L. Szilard, Zeitschrift für Physik, 53 (1929) 840] made a decisive step toward solving the Maxwell demon problem by introducing and analyzing the single atom heat engine. Bennett [Sci. Am. 255 (1987) 107] completed the solution by pointing out that there must be an entropy, ΔS=kln2, generated as the result of information erased on each cycle. Nevertheless, others have disagreed. For example, philosophers such as Popper “have found the literature surrounding Maxwell's demon deeply problematic.” We propose and analyze a new kind of single atom quantum heat engine which allows us to resolve the Maxwell demon paradox simply, and without invoking the notions of information or entropy. The energy source of the present quantum engine [Scully, Phys. Rev. Lett. 87 (2001) 22601] is a Stern-Gerlach apparatus acting as a demonesque heat sorter. An isothermal compressor acts as the entropy sink. In order to complete a thermodynamic cycle, an energy of ΔW=kTln2 must be expended. This energy is essentially a “reset” or “eraser” energy. Thus Bennett's entropy ΔS=ΔW/T emerges as a simple consequence of the quantum thermodynamics of our heat engine. It would seem that quantum mechanics contains the kernel of information entropy at its very core. That is the concept of information erasure as it appears in quantum mechanics [Scully and Drühl, Phys. Rev. A 25 (1982) 2208] and the present quantum heat engine have a deep common origin.

  3. Ultrascalable Techniques Applied to the Global Intelligence Community Information Awareness Common Operating Picture (IA COP)

    DTIC Science & Technology

    2005-11-01

    more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and... analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one...traffic—for example, time series of flux at given nodes and mean path length Outputs the time series from any node queried Calculates

  4. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

Earthquakes exhibit complex phenomenology that is revealed in their fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate for describing the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics (NESP) to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with important continental N-S extension and high seismicity rates. The NESP concept refers to the non-additive Tsallis entropy Sq, which includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrence of earthquakes. Furthermore, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution through different time windows along the evolution of the seismic activity at the west Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments: GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).

  5. Information Theory Broadens the Spectrum of Molecular Ecology and Evolution.

    PubMed

    Sherwin, W B; Chao, A; Jost, L; Smouse, P E

    2017-12-01

Information or entropy analysis of diversity is used extensively in community ecology, and has recently been exploited for prediction and analysis in molecular ecology and evolution. Information measures belong to a spectrum (or q profile) of measures whose contrasting properties provide a rich summary of diversity, including allelic richness (q=0), Shannon information (q=1), and heterozygosity (q=2). We present the merits of information measures for describing and forecasting molecular variation within and among groups, comparing forecasts with data, and evaluating underlying processes such as dispersal. Importantly, information measures directly link causal processes and divergence outcomes, have a straightforward relationship to allele frequency differences (including a monotonicity that q=2 lacks), and show additivity across hierarchical layers such as ecology, behaviour, cellular processes, and nongenetic inheritance. Copyright © 2017 Elsevier Ltd. All rights reserved.
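The q profile described above corresponds to Hill numbers (effective numbers of alleles) evaluated along the order q. A small sketch, assuming a list of allele frequencies as input (our own naming, not from the paper):

```python
import math

def hill_number(p, q):
    """Effective number of alleles of order q (Hill number): q=0 gives
    allelic richness, q->1 the exponential of Shannon information, and
    q=2 the inverse Simpson concentration (related to heterozygosity)."""
    p = [pi for pi in p if pi > 0]
    if abs(q - 1.0) < 1e-12:  # q=1 is defined by the limit
        return math.exp(-sum(pi * math.log(pi) for pi in p))
    return sum(pi ** q for pi in p) ** (1.0 / (1.0 - q))

# Diversity profile for a skewed allele frequency distribution:
profile = [hill_number([0.7, 0.1, 0.1, 0.1], q) for q in (0, 1, 2)]
```

For a uniform distribution all orders agree; for skewed frequencies the profile decreases with q, since higher orders weight common alleles more heavily.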

  6. Quantum Darwinism for mixed-state environment

    NASA Astrophysics Data System (ADS)

    Quan, Haitao; Zwolak, Michael; Zurek, Wojciech

    2009-03-01

We examine quantum Darwinism when a system is in the presence of a mixed environment, and we find a general relation between the mutual information for the mixed-state environment and the change in the entropy of the fraction of the environment. We then look at a particular solvable model and numerically examine the time evolution of the mutual information for a large environment. Finally, we discuss the exact expressions for all entropies and the mutual information at special times.

  7. Entropy Inequalities for Stable Densities and Strengthened Central Limit Theorems

    NASA Astrophysics Data System (ADS)

    Toscani, Giuseppe

    2016-10-01

We consider the central limit theorem for stable laws in the case of the standardized sum of independent and identically distributed random variables with regular probability density function. By showing decay of different entropy functionals along the sequence, we prove convergence with explicit rate in various norms to a Lévy centered density of parameter λ > 1. This introduces a new information-theoretic approach to the central limit theorem for stable laws, in which the main argument is shown to be the relative fractional Fisher information, recently introduced in Toscani (Ricerche Mat 65(1):71-91, 2016). In particular, it is proven that, with respect to the relative fractional Fisher information, the Lévy density satisfies an analog of the logarithmic Sobolev inequality, which allows one to pass from the monotonicity and decay to zero of the relative fractional Fisher information in the standardized sum to the decay to zero in relative entropy with an explicit decay rate.

  8. Entropy of electromyography time series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.

    2007-12-01

    A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.

  9. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method

    PubMed Central

    Roux, Benoît; Weare, Jonathan

    2013-01-01

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140

  10. Analysis of the horizontal structure of a measurement and control geodetic network based on entropy

    NASA Astrophysics Data System (ADS)

    Mrówczyńska, Maria

    2013-06-01

The paper attempts to determine an optimum structure of a directional measurement and control network intended for investigating horizontal displacements. For this purpose it uses the notion of entropy as a logarithmic measure of the probability of the state of a particular observation system. The optimum number of observations follows from the difference in the entropy of the parameter vector, ΔH(X̂), corresponding to one extra observation. An increment of entropy, interpreted as an increment in the amount of information about the state of the system, determines the adoption or rejection of another extra observation.

  11. Characterizing Protease Specificity: How Many Substrates Do We Need?

    PubMed Central

    Schauperl, Michael; Fuchs, Julian E.; Waldner, Birgit J.; Huber, Roland G.; Kramer, Christian; Liedl, Klaus R.

    2015-01-01

Calculation of cleavage entropies allows one to quantify, map and compare protease substrate specificity via an information-entropy-based approach. The metric intrinsically depends on the number of experimentally determined substrates (data points). Thus a statistical analysis of its numerical stability is crucial to estimate the systematic error made by estimating specificity from a limited number of substrates. In this contribution, we show the mathematical basis for estimating the uncertainty in cleavage entropies. Sets of cleavage entropies are calculated using experimental cleavage data and modeled extreme cases. By analyzing the underlying mathematics and applying statistical tools, a linear dependence of the metric with respect to 1/n was found. This allows us to extrapolate the values to an infinite number of samples and to estimate the errors. Analyzing the errors, a minimum of 30 substrates was found to be necessary to characterize the substrate specificity of a protease (S4-S4'), in terms of amino acid variability, with an uncertainty of 5 percent. Therefore, we encourage experimental researchers in the protease field to record specificity profiles of novel proteases, aiming to identify at least 30 peptide substrates of maximum sequence diversity. We expect a full characterization of protease specificity to help rationalize the biological functions of proteases and to assist rational drug design. PMID:26559682
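The 1/n dependence described above suggests a simple extrapolation: fit the entropy against 1/n by least squares and read off the intercept as the infinite-sample value. A sketch with synthetic data and our own naming (not the authors' code):

```python
def extrapolate_entropy(ns, entropies):
    """Least-squares fit of H(n) = H_inf + slope * (1/n); returns H_inf,
    the intercept, i.e. the entropy extrapolated to an infinite number
    of substrates (the 1/n dependence described above)."""
    xs = [1.0 / n for n in ns]
    m = len(xs)
    mx = sum(xs) / m
    my = sum(entropies) / m
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (h - my) for x, h in zip(xs, entropies))
    slope = sxy / sxx
    return my - slope * mx  # value of the fitted line at 1/n -> 0

# Synthetic check: data generated exactly as H = 2.0 + 3.0 / n.
ns = [10, 20, 40, 80]
h_inf = extrapolate_entropy(ns, [2.0 + 3.0 / n for n in ns])
```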

  12. Connectivity in the human brain dissociates entropy and complexity of auditory inputs☆

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493

  13. Whole-Lesion Apparent Diffusion Coefficient-Based Entropy-Related Parameters for Characterizing Cervical Cancers: Initial Findings.

    PubMed

    Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-12-01

    This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess the intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy prospectively underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm². ADC-based entropy-related parameters, including first-order entropy and second-order entropies, were derived from the whole tumor volume as well as from adjacent normal cervical tissues. The intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)_0, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean were significantly higher, whereas entropy(H)_range and entropy(H)_std were significantly lower, in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). The Kruskal-Wallis test showed no significant differences among the values of the various second-order entropies, including entropy(H)_0, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean. All second-order entropies had a larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean had the same largest area under the receiver operating characteristic curve, 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully and showed initial potential for characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues.
Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
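
    To illustrate the distinction between first-order and second-order entropy used above, the following Python sketch computes a histogram-based first-order entropy and a directional second-order entropy from a gray-level co-occurrence matrix; the offsets (0, 1), (1, 1), (1, 0) and (1, -1) correspond to the 0/45/90/135-degree variants. The binning, level count and synthetic images are illustrative assumptions, not the study's ADC pipeline.

```python
import numpy as np

def first_order_entropy(img, bins=8):
    """Shannon entropy (bits) of the image's gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(img.min(), img.max() + 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def glcm_entropy(img, offset=(0, 1), levels=8):
    """Second-order entropy from a gray-level co-occurrence matrix."""
    # Quantize to a small number of gray levels.
    q = np.minimum((img * levels / (img.max() + 1)).astype(int), levels - 1)
    dr, dc = offset
    glcm = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                glcm[q[r, c], q[r2, c2]] += 1
    p = glcm / glcm.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
uniform = np.full((32, 32), 5.0)                      # homogeneous "tissue"
noisy = rng.integers(0, 256, (32, 32)).astype(float)  # heterogeneous "tissue"
```

    On a homogeneous region both measures vanish; heterogeneous regions score higher, which is the sense in which these parameters index intratumoral heterogeneity.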

  14. Device-Independent Tests of Entropy

    NASA Astrophysics Data System (ADS)

    Chaves, Rafael; Brask, Jonatan Bohr; Brunner, Nicolas

    2015-09-01

    We show that the entropy of a message can be tested in a device-independent way. Specifically, we consider a prepare-and-measure scenario with classical or quantum communication, and develop two different methods for placing lower bounds on the communication entropy, given observable data. The first method is based on the framework of causal inference networks. The second technique, based on convex optimization, shows that quantum communication provides an advantage over classical communication, in the sense of requiring a lower entropy to reproduce given data. These ideas may serve as a basis for novel applications in device-independent quantum information processing.

  15. Neuronal Entropy-Rate Feature of Entopeduncular Nucleus in Rat Model of Parkinson's Disease.

    PubMed

    Darbin, Olivier; Jin, Xingxing; Von Wrangel, Christof; Schwabe, Kerstin; Nambu, Atsushi; Naritoku, Dean K; Krauss, Joachim K; Alam, Mesbah

    2016-03-01

    The function of the nigro-striatal pathway on neuronal entropy in the basal ganglia (BG) output nucleus, i.e. the entopeduncular nucleus (EPN), was investigated in the unilaterally 6-hydroxydopamine (6-OHDA)-lesioned rat model of Parkinson's disease (PD). In both control subjects and subjects with a 6-OHDA lesion of the dopaminergic (DA) nigro-striatal pathway, a histological hallmark of parkinsonism, neuronal entropy in the EPN was maximal in neurons with firing rates between 15 and 25 Hz. In 6-OHDA-lesioned rats, neuronal entropy in the EPN was specifically higher in neurons with firing rates above 25 Hz. Our data establish that the nigro-striatal pathway controls neuronal entropy in the motor circuitry and that the parkinsonian condition is associated with an abnormal relationship between firing rate and neuronal entropy in the BG output nuclei. The firing rate and entropy relationship provides putatively relevant electrophysiological information for investigating sensory-motor processing in the normal condition and in conditions such as movement disorders.

  16. RELATIONSHIP BETWEEN ENTROPY OF SPIKE TIMING AND FIRING RATE IN ENTOPEDUNCULAR NUCLEUS NEURONS IN ANESTHETIZED RATS: FUNCTION OF THE NIGRO-STRIATAL PATHWAY

    PubMed Central

    Darbin, Olivier; Jin, Xingxing; von Wrangel, Christof; Schwabe, Kerstin; Nambu, Atsushi; Naritoku, Dean K; Krauss, Joachim K.; Alam, Mesbah

    2016-01-01

    The function of the nigro-striatal pathway on neuronal entropy in the basal ganglia (BG) output nucleus (entopeduncular nucleus, EPN) was investigated in the unilaterally 6-hydroxydopamine (6-OHDA)-lesioned rat model of Parkinson's disease (PD). In both control subjects and subjects with a 6-OHDA lesion of the nigro-striatal pathway, a histological hallmark of parkinsonism, neuronal entropy in the EPN was maximal in neurons with firing rates between 15 Hz and 25 Hz. In 6-OHDA-lesioned rats, neuronal entropy in the EPN was specifically higher in neurons with firing rates above 25 Hz. Our data establish that the nigro-striatal pathway controls neuronal entropy in the motor circuitry and that the parkinsonian condition is associated with an abnormal relationship between firing rate and neuronal entropy in the BG output nuclei. The firing rate and entropy relationship provides putatively relevant electrophysiological information for investigating sensory-motor processing in the normal condition and in conditions with movement disorders. PMID:26711712

  17. Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis

    PubMed Central

    Ré, Miguel A.; Azad, Rajeev K.

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including generalization to any number of probability distributions and association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms. PMID:24728338
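
    The basic (non-Tsallis, non-Markovian) form of the generalized JSD, with arbitrary weights and any number of distributions, can be sketched as follows; the function names are illustrative and the example distributions are toy data, not genomic compositions.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def jsd(dists, weights=None):
    """Generalized Jensen-Shannon divergence:
    JSD = H(sum_i w_i P_i) - sum_i w_i H(P_i)."""
    dists = [np.asarray(d, dtype=float) for d in dists]
    if weights is None:
        weights = np.full(len(dists), 1.0 / len(dists))
    mix = sum(w * d for w, d in zip(weights, dists))
    return shannon_entropy(mix) - sum(w * shannon_entropy(d)
                                      for w, d in zip(weights, dists))

p = [1.0, 0.0]   # two maximally different distributions
q = [0.0, 1.0]
```

    With equal weights, two disjoint distributions give the maximal JSD of 1 bit, while identical distributions give 0; this is the quantity whose Tsallis and Markovian extensions the paper studies.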

  18. Visual wetness perception based on image color statistics.

    PubMed

    Sawayama, Masataka; Adelson, Edward H; Nishida, Shin'ya

    2017-05-01

    Color vision provides humans and animals with the abilities to discriminate colors based on the wavelength composition of light and to determine the location and identity of objects of interest in cluttered scenes (e.g., ripe fruit among foliage). However, we argue that color vision can inform us about much more than color alone. Since a trichromatic image carries more information about the optical properties of a scene than a monochromatic image does, color can help us recognize complex material qualities. Here we show that human vision uses color statistics of an image for the perception of an ecologically important surface condition (i.e., wetness). Psychophysical experiments showed that overall enhancement of chromatic saturation, combined with a luminance tone change that increases the darkness and glossiness of the image, tended to make dry scenes look wetter. Theoretical analysis along with image analysis of real objects indicated that our image transformation, which we call the wetness enhancing transformation, is consistent with actual optical changes produced by surface wetting. Furthermore, we found that the wetness enhancing transformation operator was more effective for the images with many colors (large hue entropy) than for those with few colors (small hue entropy). The hue entropy may be used to separate surface wetness from other surface states having similar optical properties. While surface wetness and surface color might seem to be independent, there are higher order color statistics that can influence wetness judgments, in accord with the ecological statistics. The present findings indicate that the visual system uses color image statistics in an elegant way to help estimate the complex physical status of a scene.
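
    The hue entropy referred to above is simply the Shannon entropy of an image's hue histogram. A minimal sketch, assuming hues given in degrees and an illustrative 36-bin histogram; the bin count and sample data are assumptions, not the authors' parameters.

```python
import numpy as np

def hue_entropy(hues, bins=36):
    """Shannon entropy (bits) of the hue histogram (hues in [0, 360))."""
    hist, _ = np.histogram(hues, bins=bins, range=(0.0, 360.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
colorful = rng.uniform(0.0, 360.0, 10_000)  # many hues (large hue entropy)
monochrome = np.full(10_000, 120.0)         # a single green hue (zero entropy)
```

    Per the finding above, the wetness enhancing transformation would be expected to work better on the `colorful` case than on the `monochrome` one.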

  19. Generalization of entropy based divergence measures for symbolic sequence analysis.

    PubMed

    Ré, Miguel A; Azad, Rajeev K

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.

  20. Coupled-Double-Quantum-Dot Environmental Information Engines: A Numerical Analysis

    NASA Astrophysics Data System (ADS)

    Tanabe, Katsuaki

    2016-06-01

    We conduct numerical simulations of an autonomous information engine comprising a set of coupled double quantum dots, using a simple model. The steady-state entropy production rate in each component and the heat and electron transfer rates are calculated via the probability distribution of the four electronic states from the master transition-rate equations. We define an information-engine efficiency based on the entropy change of the reservoir, pointing toward power generators that employ environmental order as a new energy resource. We obtain device-design principles for the realization of corresponding practical energy converters, including that (1) energy levels of the detector-side reservoir higher than those of the detector dot provide significantly higher work production rates through faster circulation of the states, (2) the efficiency depends strongly on the relative temperatures of the detector and system sides and becomes high in a particular region of Coulomb-interaction strength between the quantum dots, and (3) the efficiency depends little on the system dot's energy level relative to its reservoir but strongly on the antisymmetric relative amplitudes of the electronic tunneling rates.
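
    The steady-state probability distribution of a four-state master equation, from which such transfer rates follow, can be obtained by linear algebra. A minimal sketch with made-up rates, not the paper's model:

```python
import numpy as np

def steady_state(W):
    """Steady-state distribution of a master equation dp/dt = W p,
    where W[i, j] is the rate from state j to state i (columns sum to 0)."""
    n = W.shape[0]
    # Replace one (redundant) balance equation with normalization sum(p) = 1.
    A = W.copy()
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Illustrative 4-state transition rates (off-diagonal, column j -> row i);
# the values are made up for demonstration only.
rates = np.array([[0.0, 1.0, 0.5, 0.0],
                  [2.0, 0.0, 0.0, 0.5],
                  [1.0, 0.0, 0.0, 1.0],
                  [0.0, 2.0, 1.0, 0.0]])
W = rates - np.diag(rates.sum(axis=0))  # diagonal enforces probability conservation
p = steady_state(W)
```

    Heat, electron and entropy production rates are then sums of the individual transition rates weighted by these steady-state occupations.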

  1. Information analysis of immune and endocrine organs. Morphological changes in the course of infection.

    PubMed

    Avtandilov, G G; Barsukov, V S

    1992-11-01

    Morphological and morphometric studies were conducted on the lymphoid and endocrine organs of 259 human adults and infants with pyoinflammatory diseases (PID) and of 300 experimental mice. Information and correlation analyses of the data thus recorded provided evidence that, in the course of an infection process, adaptation and compensation responses are characterized by intensified exchange of information within the immune-endocrine system (IES). Septic courses of PID were found to be accompanied by impairment of inter-organ correlations, an increase in information entropy and progressive structural disorganization of the IES.

  2. Steganography on quantum pixel images using Shannon entropy

    NASA Astrophysics Data System (ADS)

    Laurel, Carlos Ortega; Dong, Shi-Hai; Cruz-Irisson, M.

    2016-07-01

    This paper presents a steganographic algorithm based on the least significant bit (LSB) taken from the most significant bit information (MSBI) and on the equivalence of a bit pixel image to a quantum pixel image, which permits information to be embedded secretly in quantum pixel images for secure transmission through insecure channels. The algorithm offers higher security since it exploits the Shannon entropy of an image.
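
    The classical-image core of LSB steganography, together with the Shannon entropy of the carrier image, can be sketched as follows; the quantum-pixel encoding itself is not shown, and the function names are illustrative, not the paper's algorithm.

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def embed_lsb(img, bits):
    """Hide a bit sequence in the least significant bits of the pixels."""
    flat = img.ravel().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=img.dtype)
    return flat.reshape(img.shape)

def extract_lsb(img, n):
    """Recover the first n hidden bits."""
    return (img.ravel()[:n] & 1).tolist()

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, (16, 16), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, message)
```

    Because embedding changes each touched pixel by at most 1, the carrier's histogram, and hence its Shannon entropy, is barely perturbed, which is the property a detector (or a security analysis) must reason about.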

  3. Transfer entropy in physical systems and the arrow of time

    NASA Astrophysics Data System (ADS)

    Spinney, Richard E.; Lizier, Joseph T.; Prokopenko, Mikhail

    2016-08-01

    Recent developments have cemented the realization that many concepts and quantities in thermodynamics and information theory are shared. In this paper, we consider a highly relevant quantity in information theory and complex systems, the transfer entropy, and explore its thermodynamic role by considering the implications of time reversal upon it. By doing so we highlight the role of information dynamics on the nuanced question of observer perspective within thermodynamics by relating the temporal irreversibility in the information dynamics to the configurational (or spatial) resolution of the thermodynamics. We then highlight its role in perhaps the most enduring paradox in modern physics, the manifestation of a (thermodynamic) arrow of time. We find that for systems that process information such as those undergoing feedback, a robust arrow of time can be formulated by considering both the apparent physical behavior which leads to conventional entropy production and the information dynamics which leads to a quantity we call the information theoretic arrow of time. We also offer an interpretation in terms of optimal encoding of observed physical behavior.
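
    Transfer entropy itself is a plug-in estimate over joint symbol counts. A minimal sketch for binary sequences with history length 1, using toy data in which the target copies the source with a one-step lag, so roughly one bit per symbol is transferred; the names and data are illustrative, not the paper's formalism.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """Transfer entropy T_{source->target} in bits, history length 1:
    T = sum p(t1, t0, s0) * log2[ p(t1 | t0, s0) / p(t1 | t0) ]."""
    triples = Counter()   # (t_{n+1}, t_n, s_n)
    pairs_ts = Counter()  # (t_n, s_n)
    pairs_tt = Counter()  # (t_{n+1}, t_n)
    singles = Counter()   # t_n
    n = len(target) - 1
    for i in range(n):
        triples[(target[i + 1], target[i], source[i])] += 1
        pairs_ts[(target[i], source[i])] += 1
        pairs_tt[(target[i + 1], target[i])] += 1
        singles[target[i]] += 1
    te = 0.0
    for (t1, t0, s0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_ts[(t0, s0)]
        p_cond_hist = pairs_tt[(t1, t0)] / singles[t0]
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

random.seed(0)
src = [random.randint(0, 1) for _ in range(5000)]
tgt = [0] + src[:-1]   # target copies source with a one-step lag
```

    The asymmetry between the forward and reverse estimates is exactly the kind of directed, time-irreversible quantity the article relates to the thermodynamic arrow of time.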

  4. RED: a set of molecular descriptors based on Renyi entropy.

    PubMed

    Delgado-Soler, Laura; Toral, Raul; Tomás, M Santos; Rubio-Martinez, Jaime

    2009-11-01

    New molecular descriptors, RED (Renyi entropy descriptors), based on the generalized entropies introduced by Renyi are presented. Topological descriptors based on molecular features have proven to be useful for describing molecular profiles. Renyi entropy is used as a variability measure to contract a feature-pair distribution composing the descriptor vector. The performance of RED descriptors was tested for the analysis of different sets of molecular distances, virtual screening, and pharmacological profiling. A free parameter of the Renyi entropy has been optimized for all the considered applications.
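
    The Rényi entropy at the heart of such descriptors generalizes Shannon entropy through the free order parameter alpha. A minimal sketch; the distributions are toy data, not molecular feature-pair distributions.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy (bits) of order alpha; alpha -> 1 recovers Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(alpha - 1.0) < 1e-12:
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally variable distribution
peaked = [0.85, 0.05, 0.05, 0.05]    # concentrated distribution
```

    For a uniform distribution the Rényi entropy is independent of alpha; for non-uniform distributions it decreases with alpha, which is what makes alpha a tunable emphasis parameter worth optimizing per application, as the abstract describes.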

  5. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    NASA Technical Reports Server (NTRS)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.

  6. Differences between state entropy and bispectral index during analysis of identical electroencephalogram signals: a comparison with two randomised anaesthetic techniques.

    PubMed

    Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard

    2015-05-01

    It is claimed that bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. To compare BIS and state entropy during analysis of identical EEG data. Inspection of raw EEG input to detect potential causes of erroneous index calculation. Offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor. Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia. Blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil). Isolated forearm technique for differentiation between consciousness and unconsciousness. Prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness. Correlation and agreement between state entropy and BIS from deep to light hypnosis. Analysis of raw EEG compared with index values that are in conflict with clinical examination, with frequency measures (frequency bands/Spectral Edge Frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG and technical artefacts. PK of state entropy was 0.80 and of BIS 0.84; correlation coefficient of state entropy with BIS 0.78. Nine percent BIS and 14% state entropy values disagreed with clinical examination. Highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. 
EEG sequences which led to false 'conscious' index values often showed high-frequency signals and eye blinks. High-frequency EEG/electromyogram signals were pooled because a separation into EEG and fast electro-oculogram, for example eye fluttering or saccades, on the basis of a single EEG channel may not be very reliable. These signals led to higher Spectral Edge Frequency 95 and ratio of relative beta and gamma band power than EEG signals, indicating adequate unconscious classification. The frequency of other artefacts that were assignable, for example technical artefacts, movement artefacts, was negligible and they were excluded from analysis. High-frequency signals and eye blinks may account for index values that falsely indicate consciousness. Compared with BIS, state entropy showed more false classifications of the clinical state at transition between consciousness and unconsciousness.

  7. Conditional Entropy-Constrained Residual VQ with Application to Image Coding

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.

    1996-01-01

    This paper introduces an extension of entropy-constrained residual vector quantization (VQ) where intervector dependencies are exploited. The method, which we call conditional entropy-constrained residual VQ, employs a high-order entropy conditioning strategy that captures local information in the neighboring vectors. When applied to coding images, the proposed method is shown to achieve better rate-distortion performance than that of entropy-constrained residual vector quantization with less computational complexity and lower memory requirements. Moreover, it can be designed to support progressive transmission in a natural way. It is also shown to outperform some of the best predictive and finite-state VQ techniques reported in the literature. This is due partly to the joint optimization between the residual vector quantizer and a high-order conditional entropy coder as well as the efficiency of the multistage residual VQ structure and the dynamic nature of the prediction.

  8. Use of Raman microscopy and band-target entropy minimization analysis to identify dyes in a commercial stamp. Implications for authentication and counterfeit detection.

    PubMed

    Widjaja, Effendi; Garland, Marc

    2008-02-01

    Raman microscopy was used in mapping mode to collect more than 1000 spectra in a 100 microm x 100 microm area from a commercial stamp. Band-target entropy minimization (BTEM) was then employed to unmix the mixture spectra in order to extract the pure component spectra of the samples. Three pure component spectral patterns with good signal-to-noise ratios were recovered, and their spatial distributions were determined. The three pure component spectral patterns were then identified as copper phthalocyanine blue, calcite-like material, and yellow organic dye material by comparison to known spectral libraries. The present investigation, consisting of (1) advanced curve resolution (blind-source separation) followed by (2) spectral database matching, readily suggests extensions to authenticity and counterfeit studies of other types of commercial objects. The presence or absence of specific observable components forms the basis for assessment. The present spectral analysis (BTEM) is applicable to highly overlapping spectral information. Since a priori information such as the number of components present and spectral libraries is not needed in BTEM, and since minor signals arising from trace components can be reconstructed, this analysis offers a robust approach to a wide variety of material problems involving authenticity and counterfeit issues.

  9. Parameters Selection for Bivariate Multiscale Entropy Analysis of Postural Fluctuations in Fallers and Non-Fallers Older Adults.

    PubMed

    Ramdani, Sofiane; Bonnet, Vincent; Tallon, Guillaume; Lagarde, Julien; Bernard, Pierre Louis; Blain, Hubert

    2016-08-01

    Entropy measures are often used to quantify the regularity of postural sway time series. Recent methodological developments have provided both multivariate and multiscale approaches allowing the extraction of complexity features from physiological signals; see "Dynamical complexity of human responses: A multivariate data-adaptive framework," in Bulletin of Polish Academy of Science and Technology, vol. 60, p. 433, 2012. The resulting entropy measures are good candidates for the analysis of bivariate postural sway signals exhibiting nonstationarity and multiscale properties. These methods are dependent on several input parameters such as embedding parameters. Using two data sets collected from institutionalized frail older adults, we numerically investigate the behavior of a recent multivariate and multiscale entropy estimator; see "Multivariate multiscale entropy: A tool for complexity analysis of multichannel data," Physics Review E, vol. 84, p. 061918, 2011. We propose criteria for the selection of the input parameters. Using these optimal parameters, we statistically compare the multivariate and multiscale entropy values of postural sway data of non-faller subjects to those of fallers. The two groups are discriminated by the resulting measures over multiple time scales. We also demonstrate that the typical parameter settings proposed in the literature lead to entropy measures that do not distinguish the two groups. This last result confirms the importance of selecting appropriate input parameters.
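
    The multiscale entropy family discussed here rests on coarse-graining a signal and computing sample entropy at each scale, with the matching tolerance fixed from the original series. A univariate sketch with toy data; the multivariate extension and the paper's parameter-selection criteria are not reproduced.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def sample_entropy(x, m=2, tol=None, r=0.2):
    """Sample entropy: -ln of the conditional probability that sequences
    matching for m points (within tolerance) also match for m+1 points."""
    x = np.asarray(x, dtype=float)
    if tol is None:
        tol = r * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
            count += (d <= tol).sum()
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return float(np.log(b / a)) if a > 0 else np.inf

rng = np.random.default_rng(3)
noise = rng.standard_normal(1000)
tol0 = 0.2 * noise.std()  # tolerance fixed from the original series
mse = [sample_entropy(coarse_grain(noise, s), tol=tol0) for s in (1, 2, 4)]
```

    For white noise the curve falls with scale, the classic multiscale-entropy signature; the input parameters (`m`, `r`, scale range) are exactly the ones whose selection the paper shows to matter.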

  10. Temporal Correlations and Neural Spike Train Entropy

    NASA Astrophysics Data System (ADS)

    Schultz, Simon R.; Panzeri, Stefano

    2001-06-01

    Sampling considerations limit the experimental conditions under which information theoretic analyses of neurophysiological data yield reliable results. We develop a procedure for computing the full temporal entropy and information of ensembles of neural spike trains, which performs reliably for limited samples of data. This approach also yields insight into the role of correlations between spikes in temporal coding mechanisms. The method, when applied to recordings from complex cells of the monkey primary visual cortex, results in lower rms error information estimates in comparison to a "brute force" approach.

  11. Design of high entropy alloys based on the experience from commercial superalloys

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Huang, Y.; Wang, J.; Liu, C. T.

    2015-01-01

    High entropy alloys (HEAs) have been drawing increasing attention recently, and gratifying results have been obtained. However, the existing metallurgical rules for HEAs cannot provide specific guidance for selecting candidate alloys for structural applications. Our brief survey reveals that many commercial superalloys have medium to high configurational entropies. The experience with commercial superalloys thus provides a clue to help in the development of HEAs for structural applications.
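
    The configurational entropy used to classify such alloys is the ideal-mixing term ΔS_conf = -R Σ c_i ln c_i. A minimal sketch with illustrative compositions; the Ni-rich mole fractions are an assumption for demonstration, not a specific commercial superalloy.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def config_entropy(fractions):
    """Ideal configurational entropy dS_conf = -R * sum(c_i * ln c_i)."""
    assert abs(sum(fractions) - 1.0) < 1e-9  # mole fractions must sum to 1
    return -R * sum(c * math.log(c) for c in fractions if c > 0)

# Equiatomic five-component alloy: dS_conf = R ln 5 ~ 1.61 R, above the
# usual "high entropy" threshold of 1.5 R.
equiatomic5 = [0.2] * 5
# An illustrative Ni-rich, superalloy-like composition falls in the
# "medium entropy" band between 1 R and 1.5 R.
ni_rich = [0.6, 0.15, 0.1, 0.1, 0.05]
```

    Computing this term for published superalloy compositions is the "brief survey" step the abstract describes.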

  12. Exact Test of Independence Using Mutual Information

    DTIC Science & Technology

    2014-05-23

    The permutation test, which does not preserve Markov order, resulted in 489 Type I errors, where only about 1000 × 0.05 = 50 would be expected at the 0.05 significance level. Published in Entropy, Vol. 16, No. 7 (2014), 2839-2849; doi:10.3390
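
    The kind of test this report concerns can be sketched as a plug-in mutual-information statistic with a permutation null. The sketch below assumes i.i.d. samples; as the report notes, naive shuffling destroys Markov order and badly inflates Type I errors for temporally dependent data. All names and data here are illustrative.

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

def permutation_pvalue(xs, ys, n_perm=200, seed=0):
    """P-value of the observed MI under independence, by shuffling ys.
    Only valid for i.i.d. samples: shuffling destroys Markov structure,
    which is exactly the failure mode the report highlights."""
    rng = random.Random(seed)
    observed = mutual_information(xs, ys)
    ys = list(ys)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(ys)
        if mutual_information(xs, ys) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

random.seed(1)
xs = [random.randint(0, 1) for _ in range(400)]
dependent = xs[:]                                  # y = x: maximal dependence
independent = [random.randint(0, 1) for _ in range(400)]
```
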

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flammia, Steven T.; Hamma, Alioscia; Hughes, Taylor L.

    We generalize the topological entanglement entropy to a family of topological Renyi entropies parametrized by a parameter alpha, in an attempt to find new invariants for distinguishing topologically ordered phases. We show that, surprisingly, all topological Renyi entropies are the same, independent of alpha for all nonchiral topological phases. This independence shows that topologically ordered ground-state wave functions have reduced density matrices with a certain simple structure, and no additional universal information can be extracted from the entanglement spectrum.

  14. The Informational Patterns of Laughter

    NASA Astrophysics Data System (ADS)

    Bea, José A.; Marijuán, Pedro C.

    2003-06-01

    Laughter is one of the most characteristic, and enigmatic, communicational traits of human individuals. Its analysis has to take into account a variety of densely interconnected emotional, social, cognitive, and communicational factors. In this article we study laughter simply as an auditory signal (a 'neutral' information carrier), and we compare its structure with the regular traits of linguistic signals. In the experimental recordings of human laughter that we have performed, the most noticeable trait is the disordered content of frequencies. The information content of a vowel sonogram appears as a characteristic, regular function of the first vibration modes of the dynamic system formed, for each vowel, by the vocal cords and the accompanying resonance of the vocalization apparatus; the sonograms of laughter, by contrast, are highly irregular. Episodes of laughter show a highly random frequency content, and so laughter cannot be considered a genuine codification of patterned information as in linguistic signals. In order to numerically gauge the disorder content of laughter frequencies, we have performed several "entropy" measures of the spectra, trying to unambiguously distinguish spontaneous laughter from "faked", articulated laughter. Interestingly, Shannon's entropy (the most natural candidate) performs rather poorly.
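
    The disorder content of a spectrum can be gauged with a normalized spectral entropy: near 0 for a vowel-like harmonic signal, near 1 for a noise-like signal. A minimal sketch with synthetic signals, not the article's recordings; the normalization choice is ours.

```python
import numpy as np

def spectral_entropy(signal):
    """Normalized Shannon entropy of the power spectrum:
    0 for a single spectral line, 1 for a flat, noise-like spectrum."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    psd = psd[1:]                 # drop the DC component
    p = psd / psd.sum()
    p = p[p > 0]
    h = -(p * np.log2(p)).sum()
    return float(h / np.log2(len(psd)))

t = np.arange(0, 1, 1 / 1024)
# Vowel-like signal: a fundamental plus one harmonic (regular spectrum).
vowel_like = np.sin(2 * np.pi * 110 * t) + 0.5 * np.sin(2 * np.pi * 220 * t)
# Laughter-like stand-in: broadband noise (disordered spectrum).
rng = np.random.default_rng(4)
laugh_like = rng.standard_normal(t.size)
```

    As the article reports, a single scalar like this separates harmonic from disordered spectra easily but distinguishes spontaneous from faked laughter only poorly.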

  15. Multiscale permutation entropy analysis of electrocardiogram

    NASA Astrophysics Data System (ADS)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    Multiscale permutation entropy (MPE) was applied to ECG feature extraction in order to make a comprehensive nonlinear analysis of ECG. Three kinds of ECG from the PhysioNet database are used in this paper: congestive heart failure (CHF) patients, and healthy young and elderly subjects. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. When the scale factor is between 10 and 32, the complexities of the three ECG signals differ most: on average, the entropy of the elderly is 0.146 less than that of the CHF patients and 0.025 larger than that of the healthy young, in line with normal physiological characteristics. The results show that MPE can be effectively applied to ECG nonlinear analysis and can effectively distinguish different ECG signals.
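
    Permutation entropy, the single-scale building block of MPE, is the Shannon entropy of the distribution of ordinal patterns in the signal. A minimal sketch with toy data, using embedding dimension 3 rather than the paper's 4 for brevity; the multiscale version applies this after coarse-graining at each scale factor.

```python
import math
import random
from collections import Counter

def permutation_entropy(x, m=3):
    """Normalized permutation entropy: Shannon entropy (base 2) of the
    ordinal-pattern distribution of length m, divided by log2(m!)."""
    patterns = Counter()
    for i in range(len(x) - m + 1):
        window = x[i:i + m]
        # The ordinal pattern is the argsort of the window.
        patterns[tuple(sorted(range(m), key=window.__getitem__))] += 1
    n = sum(patterns.values())
    h = -sum((c / n) * math.log2(c / n) for c in patterns.values())
    return h / math.log2(math.factorial(m))

random.seed(5)
noisy = [random.random() for _ in range(2000)]  # all patterns equally likely
monotone = list(range(2000))                    # one ascending pattern only
```
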

  16. A retrospective analysis of the effect of blood transfusion on cerebral oximetry entropy and acute kidney injury.

    PubMed

    Engoren, Milo; Brown, Russell R; Dubovoy, Anna

    2017-01-01

    Acute anemia is associated with both cerebral dysfunction and acute kidney injury and is often treated with red blood cell transfusion. We sought to determine whether blood transfusion changed the cerebral oximetry entropy, a measure of the complexity or irregularity of the oximetry values, and whether this change was associated with subsequent acute kidney injury. This was a retrospective, case-control study of patients undergoing cardiac surgery with cardiopulmonary bypass at a tertiary care hospital, comparing those who received a red blood cell transfusion to those who did not. Acute kidney injury was defined as a perioperative increase in serum creatinine by ⩾26.4 μmol/L or by ⩾50%. Entropy was measured using approximate entropy, sample entropy, forbidden word entropy and basescale4 entropy in 500-point sets. Forty-four transfused patients were matched to 88 randomly selected non-transfused patients. All measures of entropy changed little in the transfused group but increased in the non-transfused group (p<0.05 for all comparisons). Thirty-five of 132 patients (27%) suffered acute kidney injury. Based on preoperative factors, patients who suffered kidney injury were similar to those who did not, including baseline cerebral oximetry levels. After analysis with hierarchical logistic regression, the change in basescale4 entropy (odds ratio = 1.609, 95% confidence interval = 1.057-2.450, p = 0.027) and the interaction between basescale entropy and transfusion were significantly associated with subsequent development of acute kidney injury. The transfusion of red blood cells was associated with a smaller rise in entropy values compared to non-transfused patients, suggesting a change in the regulation of cerebral oxygenation; these changes in cerebral oxygenation were also associated with acute kidney injury.
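
    Of the entropy measures listed, sample entropy has a compact standard definition that can be sketched directly. This is a generic implementation with the common r = 0.2·SD tolerance, run on synthetic series; it is not the study's own code.

```python
import math
import random

def sample_entropy(series, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts pairs of templates of length m
    within tolerance r (Chebyshev distance) and A the same for length m + 1."""
    if r is None:
        mean = sum(series) / len(series)
        sd = (sum((v - mean) ** 2 for v in series) / len(series)) ** 0.5
        r = 0.2 * sd  # common default tolerance
    def matches(length):
        templates = [series[i:i + length] for i in range(len(series) - length)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = matches(m), matches(m + 1)
    return float('inf') if a == 0 or b == 0 else -math.log(a / b)

# A periodic series is far more predictable than white noise.
random.seed(2)
regular = [math.sin(0.5 * t) for t in range(300)]
irregular = [random.gauss(0, 1) for _ in range(300)]
se_regular = sample_entropy(regular)
se_irregular = sample_entropy(irregular)
print(se_regular, se_irregular)
```

    Lower values mean more regularity, which is the sense in which a smaller entropy rise after transfusion suggests more tightly regulated cerebral oxygenation.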

  17. Scaling behaviour of Fisher and Shannon entropies for the exponential-cosine screened coulomb potential

    NASA Astrophysics Data System (ADS)

    Abdelmonem, M. S.; Abdel-Hady, Afaf; Nasser, I.

    2017-07-01

    Scaling laws are given for information-theoretic entropies, including Shannon's entropy, its power, Fisher's information and the Fisher-Shannon product, using the exponential-cosine screened Coulomb potential. The scaling laws are specified, in the r-space, as a function of |μ - μc,nℓ|, where μ is the screening parameter and μc,nℓ its critical value for the specific quantum numbers n and ℓ. Scaling laws for other physical quantities, such as energy eigenvalues, the moments, static polarisability, transition probabilities, etc., are also given. Some of these are reported for the first time. The outcome is compared with results available in the literature.

  18. Information-Based Analysis of Data Assimilation (Invited)

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.; Gupta, H. V.; Crow, W. T.; Gong, W.

    2013-12-01

    Data assimilation is defined as the Bayesian conditioning of uncertain model simulations on observations for the purpose of reducing uncertainty about model states. Practical data assimilation methods make the application of Bayes' law tractable either by employing assumptions about the prior, posterior and likelihood distributions (e.g., the Kalman family of filters) or by using resampling methods (e.g., bootstrap filter). We propose to quantify the efficiency of these approximations in an OSSE setting using information theory and, in an OSSE or real-world validation setting, to measure the amount - and more importantly, the quality - of information extracted from observations during data assimilation. To analyze DA assumptions, uncertainty is quantified as the Shannon-type entropy of a discretized probability distribution. The maximum amount of information that can be extracted from observations about model states is the mutual information between states and observations, which is equal to the reduction in entropy in our estimate of the state due to Bayesian filtering. The difference between this potential and the actual reduction in entropy due to Kalman (or other type of) filtering measures the inefficiency of the filter assumptions. Residual uncertainty in DA posterior state estimates can be attributed to three sources: (i) non-injectivity of the observation operator, (ii) noise in the observations, and (iii) filter approximations. The contribution of each of these sources is measurable in an OSSE setting. The amount of information extracted from observations by data assimilation (or system identification, including parameter estimation) can also be measured by Shannon's theory. Since practical filters are approximations of Bayes' law, it is important to know whether the information that is extracted from observations by a filter is reliable.
We define information as either good or bad, and propose to measure these two types of information using partial Kullback-Leibler divergences. Defined this way, good and bad information sum to total information. This segregation of information into good and bad components requires a validation target distribution; in a DA OSSE setting, this can be the true Bayesian posterior, but in a real-world setting the validation target might be determined by a set of in situ observations.
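
    The central quantities here, the entropy of a discretized distribution and the mutual information that caps what observations can tell a filter about the state, can be illustrated on a toy discrete channel. The state/observation model below is invented for illustration only.

```python
import math
import random
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given as counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from (state, observation) samples."""
    joint = Counter(pairs)
    x_marginal = Counter(a for a, _ in pairs)
    y_marginal = Counter(b for _, b in pairs)
    return entropy(x_marginal) + entropy(y_marginal) - entropy(joint)

# Toy model: the observation reproduces the state 80% of the time and is
# otherwise a uniformly random symbol.
random.seed(3)
states = [random.randrange(4) for _ in range(20000)]
observations = [s if random.random() < 0.8 else random.randrange(4) for s in states]
prior_uncertainty = entropy(Counter(states))                  # H(state), in bits
extractable = mutual_information(list(zip(states, observations)))
print(prior_uncertainty, extractable)
```

    No filter, exact or approximate, can reduce the entropy of the state estimate by more than `extractable` bits; the shortfall of an actual filter against this ceiling is the inefficiency the abstract describes.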

  19. SUPERMODEL ANALYSIS OF A1246 AND J255: ON THE EVOLUTION OF GALAXY CLUSTERS FROM HIGH TO LOW ENTROPY STATES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fusco-Femiano, R.; Lapi, A., E-mail: roberto.fuscofemiano@iaps.inaf.it

    2015-02-10

    We present an analysis of high-quality X-ray data out to the virial radius for the two galaxy clusters A1246 and GMBCG J255.34805+64.23661 (J255) by means of our entropy-based SuperModel. For A1246 we find that the spherically averaged entropy profile of the intracluster medium (ICM) progressively flattens outward, and that a nonthermal pressure component amounting to ≈20% of the total is required to support hydrostatic equilibrium in the outskirts; there we also estimate a modest value C ≈ 1.6 of the ICM clumping factor. These findings agree with previous analyses on other cool-core, relaxed clusters, and lend further support to the picture by Lapi et al. that relates the entropy flattening, the development of the nonthermal pressure component, and the azimuthal variation of ICM properties to weakening boundary shocks. In this scenario clusters are born in a high-entropy state throughout, and are expected to develop on similar timescales a low-entropy state both at the center due to cooling, and in the outskirts due to weakening shocks. However, the analysis of J255 testifies how such a typical evolutionary course can be interrupted or even reversed by merging especially at intermediate redshift, as predicted by Cavaliere et al. In fact, a merger has rejuvenated the ICM of this cluster at z ≈ 0.45 by reestablishing a high-entropy state in the outskirts, while leaving intact or erasing only partially the low-entropy, cool core at the center.

  20. Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory

    PubMed Central

    Zhang, Lichuan; Wang, Tonghao; Xu, Demin

    2017-01-01

    Cooperative localization (CL) is considered a promising method for underwater localization of multiple autonomous underwater vehicles (multi-AUVs). In this paper, we propose a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the followers carry lower-cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers’ observations, including both measurements and clutter. The PHD filters are run on the leaders and the results are communicated to the followers. The followers then perform a weighted summation over all received messages to obtain a final positioning result. Based on information entropy theory and the PHD filter, the follower is able to acquire precise knowledge of its position. PMID:28991191

  1. Using constrained information entropy to detect rare adverse drug reactions from medical forums.

    PubMed

    Yi Zheng; Chaowang Lan; Hui Peng; Jinyan Li

    2016-08-01

    Adverse drug reaction (ADR) detection is critical to avoid malpractice, yet challenging due to uncertainty in pre-marketing review and underreporting in post-marketing surveillance. To overcome this predicament, social-media-based ADR detection methods have been proposed recently. However, existing research is mostly based on co-occurrence methods and faces several issues, in particular leaving out rare ADRs and being unable to distinguish irrelevant ADRs. In this work, we introduce a constrained information entropy (CIE) method to solve these problems. CIE first recognizes the drug-related adverse reactions using a predefined keyword dictionary and then captures high- and low-frequency (rare) ADRs by information entropy. Extensive experiments on a medical forums dataset demonstrate that CIE outperforms the state-of-the-art co-occurrence based methods, especially in rare ADR detection.

  2. ECG contamination of EEG signals: effect on entropy.

    PubMed

    Chakrabarti, Dhritiman; Bansal, Sonia

    2016-02-01

    Entropy™ is a proprietary algorithm which uses spectral entropy analysis of electroencephalographic (EEG) signals to produce indices used as a measure of depth of hypnosis. We describe a report of electrocardiographic (ECG) contamination of EEG signals leading to fluctuating, erroneous Entropy values. An explanation is provided for the mechanism behind this observation by describing the spread of ECG signals in the head and neck and their influence on EEG/Entropy, correlating the observation with the published Entropy algorithm. While the Entropy algorithm has been well conceived, there are still instances in which it can produce erroneous values. Such erroneous values and their cause may be identified by close scrutiny of the EEG waveform if Entropy values seem out of sync with those expected at given anaesthetic levels.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Maoyuan; Besford, Quinn Alexander; Mulvaney, Thomas

    The entropy of hydrophobic solvation has been explained as the result of ordered solvation structures, of hydrogen bonds, of the small size of the water molecule, of dispersion forces, and of solvent density fluctuations. We report a new approach to the calculation of the entropy of hydrophobic solvation, along with tests of and comparisons to several other methods. The methods are assessed in the light of the available thermodynamic and spectroscopic information on the effects of temperature on hydrophobic solvation. Five model hydrophobes in SPC/E water give benchmark solvation entropies via Widom’s test-particle insertion method, and other methods and models are tested against these particle-insertion results. Entropies associated with distributions of tetrahedral order, of electric field, and of solvent dipole orientations are examined. We find these contributions are small compared to the benchmark particle-insertion entropy. Competitive with or better than other theories in accuracy, but with no free parameters, is the new estimate of the entropy contributed by correlations between dipole moments. Dipole correlations account for most of the hydrophobic solvation entropy for all models studied and capture the distinctive temperature dependence seen in thermodynamic and spectroscopic experiments. Entropies based on pair and many-body correlations in number density approach the correct magnitudes but fail to describe temperature and size dependences, respectively. Hydrogen-bond definitions and free energies that best reproduce entropies from simulations are reported, but it is difficult to choose one hydrogen bond model that fits a variety of experiments. The use of information theory, scaled-particle theory, and related methods is discussed briefly.
Our results provide a test of the Frank-Evans hypothesis that the negative solvation entropy is due to structured water near the solute, complement the spectroscopic detection of that solvation structure by identifying the structural feature responsible for the entropy change, and point to a possible explanation for the observed dependence on length scale. Our key results are that the hydrophobic effect, i.e. the signature, temperature-dependent, solvation entropy of nonpolar molecules in water, is largely due to a dispersion force arising from correlations between rotating permanent dipole moments, that the strength of this force depends on the Kirkwood g-factor, and that the strength of this force may be obtained exactly without simulation.

  4. The Entropy of Non-Ergodic Complex Systems — a Derivation from First Principles

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Hanel, Rudolf

    In information theory the 4 Shannon-Khinchin (SK) axioms determine Boltzmann-Gibbs entropy, S = -∑i pi log pi, as the unique entropy. Physics is different from information in the sense that physical systems can be non-ergodic or non-Markovian. To characterize such strongly interacting, statistical systems - complex systems in particular - within a thermodynamical framework it might be necessary to introduce generalized entropies. A series of such entropies have been proposed in the past decades. Until now the understanding of their fundamental origin and their deeper relations to complex systems remains unclear. To clarify the situation we note that non-ergodicity explicitly violates the fourth SK axiom. We show that by relaxing this axiom the entropy generalizes to S = ∑i Γ(d + 1, 1 - c log pi), where Γ is the incomplete Gamma function, and c and d are scaling exponents. All recently proposed entropies compatible with the first 3 SK axioms appear to be special cases. We prove that each statistical system is uniquely characterized by the pair of the two scaling exponents (c, d), which defines equivalence classes for all systems. The corresponding distribution functions are special forms of Lambert-W exponentials containing, as special cases, Boltzmann, stretched exponential and Tsallis distributions (power-laws) - all widely abundant in nature. This derivation is the first ab initio justification for generalized entropies. We next show how the phasespace volume of a system is related to its generalized entropy, and provide a concise criterion when it is not of Boltzmann-Gibbs type but assumes a generalized form. We show that generalized entropies only become relevant when the dynamically (statistically) relevant fraction of degrees of freedom in a system vanishes in the thermodynamic limit. These are systems where the bulk of the degrees of freedom is frozen.
Systems governed by generalized entropies are therefore systems whose phasespace volume effectively collapses to a lower-dimensional 'surface'. We explicitly illustrate the situation for accelerating random walks, and a spin system on a constant-connectancy network. We argue that generalized entropies should be relevant for self-organized critical systems such as sand piles, for spin systems which form meta-structures such as vortices, domains, instantons, etc., and for problems associated with anomalous diffusion.
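
    The generalized entropy S = ∑i Γ(d + 1, 1 - c log pi) can be checked numerically at the Boltzmann-Gibbs point (c, d) = (1, 1): there Γ(2, x) = (x + 1)e^(-x) in closed form, which gives S = (2 + H)/e, i.e. Shannon entropy H up to an affine transformation. A small verification of that identity (assuming natural logarithms):

```python
import math

def upper_incomplete_gamma_2(x):
    """Closed form of Gamma(2, x) = (x + 1) * exp(-x)."""
    return (x + 1) * math.exp(-x)

def generalized_entropy_c1_d1(probs):
    """S_{c,d} = sum_i Gamma(d + 1, 1 - c*ln p_i) at the point (c, d) = (1, 1)."""
    return sum(upper_incomplete_gamma_2(1 - math.log(p)) for p in probs)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs)

probs = [0.5, 0.25, 0.125, 0.125]
s_cd = generalized_entropy_c1_d1(probs)
h = shannon_entropy(probs)
# At (c, d) = (1, 1): each term is (2 - ln p) * p / e, so S = (2 + H) / e.
print(s_cd, (2 + h) / math.e)
```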

  5. SD-MSAEs: Promoter recognition in human genome based on deep feature extraction.

    PubMed

    Xu, Wenxuan; Zhang, Li; Lu, Yaping

    2016-06-01

    The prediction and recognition of promoters in the human genome play an important role in DNA sequence analysis. Entropy, in the Shannon sense, is a versatile tool of information theory for detailed bioinformatic analysis. Relative entropy estimator methods based on statistical divergence (SD) are used to extract meaningful features to distinguish different regions of DNA sequences. In this paper, we choose context features and use a set of SD methods to select the n-mers that most effectively distinguish promoter regions from other DNA regions in the human genome. From the total possible combinations of n-mers, we obtain four sparse distributions based on promoter and non-promoter training samples. The informative n-mers are selected by optimizing the differentiating extents of these distributions. Specifically, we combine the advantages of statistical divergence and multiple sparse auto-encoders (MSAEs) in deep learning to extract deep features for promoter recognition. We then apply multiple SVMs and a decision model to construct a human promoter recognition method called SD-MSAEs. The framework is flexible in that it can freely integrate new feature extraction or new classification models. Experimental results show that our method has high sensitivity and specificity. Copyright © 2016 Elsevier Inc. All rights reserved.
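
    The statistical-divergence step can be sketched with relative entropy (Kullback-Leibler divergence) between n-mer distributions. The sequences and the ranking heuristic below are toy stand-ins for the promoter/non-promoter training samples, not the paper's datasets or exact estimator.

```python
import math
from collections import Counter

def nmer_distribution(sequences, n):
    """Frequency distribution of all n-mers occurring in a set of DNA sequences."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - n + 1):
            counts[seq[i:i + n]] += 1
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def kl_divergence(p, q, eps=1e-9):
    """D(P || Q), with a small floor so n-mers absent from one side stay finite."""
    keys = set(p) | set(q)
    return sum(p.get(k, eps) * math.log(p.get(k, eps) / q.get(k, eps)) for k in keys)

# Toy sequences standing in for promoter vs. non-promoter training samples.
promoters = ["TATAATGCGTATAAT", "GGTATAATCCTATAA"]
background = ["GCGCGCATGCGCGCA", "ATGCATGCATGCATG"]
p = nmer_distribution(promoters, 3)
q = nmer_distribution(background, 3)
# n-mers contributing most to D(P||Q) are the most discriminative features.
ranked = sorted(p, key=lambda k: p[k] * math.log(p[k] / q.get(k, 1e-9)), reverse=True)
print(kl_divergence(p, q), ranked[:3])
```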

  6. Entropy change of biological dynamics in COPD.

    PubMed

    Jin, Yu; Chen, Chang; Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and analyze large amounts of data in human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the entropy change depends on the type of physiological signal under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, entropy increases with the severity of COPD. These results should give valuable guidance for the use of entropy on physiological signals measured by wearable medical devices, as well as for further research on entropy in COPD.

  7. Thermodynamic criteria analysis for the use of taro starch spherical aggregates as microencapsulant matrix.

    PubMed

    Hoyos-Leyva, Javier D; Bello-Pérez, Luis A; Alvarez-Ramirez, J

    2018-09-01

    Spherical aggregates can be obtained from taro starch by spray-drying without using bonding agents. Accurate information about thermal issues of spherical aggregates can provide valuable information for assessing the application as encapsulant. Spherical aggregates of taro starch were obtained by spray-drying and analyzed using dynamic vapour sorption. The use of the Guggenheim, Anderson and de Boer (GAB) model indicated a Type II isotherm pattern with weaker interactions in the multilayer region. Differential enthalpy and entropy estimates reflected a mesoporous microstructure, implying that energetic mechanisms dominate over transport mechanisms in the sorption process. The limitation by energetic mechanisms was corroborated with enthalpy-entropy compensation estimates. The diffusivity coefficient was of the order of 10⁻⁸ m²·s⁻¹, which is in line with results obtained for common materials used for encapsulation purposes. The thermodynamic properties and the lack of a bonding agent indicated the viability of spherical aggregates of taro starch for encapsulation of biocompounds. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Redundant imprinting of information in nonideal environments: Objective reality via a noisy channel

    NASA Astrophysics Data System (ADS)

    Zwolak, Michael; Quan, H. T.; Zurek, Wojciech H.

    2010-06-01

    Quantum Darwinism provides an information-theoretic framework for the emergence of the objective, classical world from the quantum substrate. The key to this emergence is the proliferation of redundant information throughout the environment where observers can then intercept it. We study this process for a purely decohering interaction when the environment, E, is in a nonideal (e.g., mixed) initial state. In the case of good decoherence, that is, after the pointer states have been unambiguously selected, the mutual information between the system, S, and an environment fragment, F, is given solely by F’s entropy increase. This demonstrates that the environment’s capacity for recording the state of S is directly related to its ability to increase its entropy. Environments that remain nearly invariant under the interaction with S, either because they have a large initial entropy or a misaligned initial state, therefore have a diminished ability to acquire information. To elucidate the concept of good decoherence, we show that, when decoherence is not complete, the deviation of the mutual information from F’s entropy change is quantified by the quantum discord, i.e., the excess mutual information between S and F is information regarding the initial coherence between pointer states of S. In addition to illustrating these results with a single-qubit system interacting with a multiqubit environment, we find scaling relations for the redundancy of information acquired by the environment that display a universal behavior independent of the initial state of S. Our results demonstrate that Quantum Darwinism is robust with respect to nonideal initial states of the environment: the environment almost always acquires redundant information about the system but its rate of acquisition can be reduced.

  9. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons regardless of data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods, including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on simulated biophysically realistic synapses, an Izhikevich cortical network based on the neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and identify spurious causality in a network of three neurons exactly. We conclude that the proposed method can obtain a more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details of neural coding. PMID:23940662
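
    The transfer-entropy ingredient of NPTE can be sketched in its simplest discrete form (history length 1, plug-in probabilities), without the ordinal-pattern symbolization or normalization of the full method. The two sequences are synthetic, with information flowing from x to y by construction.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """TE(X -> Y) in bits with history length 1, from plug-in probabilities."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_prev, x_prev)
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    singles_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / pairs_yx[(y0, x0)]          # p(y_next | y_prev, x_prev)
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y_next | y_prev)
        te += (c / n) * math.log2(p_cond_full / p_cond_self)
    return te

# Driver x and follower y: y copies x with one step of delay plus 10% flips,
# so information flows x -> y but not the other way.
random.seed(4)
x = [random.randrange(2) for _ in range(5000)]
y = [0] + [xi if random.random() < 0.9 else 1 - xi for xi in x[:-1]]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy, te_yx)
```

    The asymmetry te_xy >> te_yx is what lets transfer-entropy-style estimators recover the direction of coupling between spike trains.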

  10. Relations between work and entropy production for general information-driven, finite-state engines

    NASA Astrophysics Data System (ADS)

    Merhav, Neri

    2017-02-01

    We consider a system model of a general finite-state machine (ratchet) that simultaneously interacts with three kinds of reservoirs: a heat reservoir, a work reservoir, and an information reservoir, the latter being taken to be a running digital tape whose symbols interact sequentially with the machine. As has been shown in earlier work, this finite-state machine can act as a demon (with memory), which creates a net flow of energy from the heat reservoir into the work reservoir (thus extracting useful work) at the price of increasing the entropy of the information reservoir. Under very few assumptions, we propose a simple derivation of a family of inequalities that relate the work extraction with the entropy production. These inequalities can be seen as either upper bounds on the extractable work or as lower bounds on the entropy production, depending on the point of view. Many of these bounds are relatively easy to calculate and they are tight in the sense that equality can be approached arbitrarily closely. In their basic forms, these inequalities are applicable to any finite number of cycles (and not only asymptotically), and for a general input information sequence (possibly correlated), which is not necessarily assumed even stationary. Several known results are obtained as special cases.

  11. Entropy as an indicator of cerebral perfusion in patients with increased intracranial pressure.

    PubMed

    Khan, James; Mariappan, Ramamani; Venkatraghavan, Lashmi

    2014-07-01

    Changes in electroencephalogram (EEG) patterns correlate well with changes in cerebral perfusion pressure (CPP), and hence entropy and bispectral index values may also correlate with CPP. To highlight the potential application of entropy, an EEG-based anesthetic depth monitor, in indicating cerebral perfusion in patients with increased intracranial pressure (ICP), we report two cases of emergency neurosurgical procedures in patients with raised ICP in which anesthesia was titrated to entropy values and the entropy values suddenly increased after cranial decompression, reflecting the increase in CPP. Maintaining systemic blood pressure in order to maintain the CPP is the anesthetic goal when managing patients with raised ICP. EEG-based anesthetic depth monitors may hold valuable information for guiding anesthetic management in patients with decreased CPP toward better neurological outcomes.

  12. Elements of the cognitive universe

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2017-06-01

    "The least biased inference, taking available information into account, is the one with maximum entropy". So we are taught by Jaynes. The many followers from a broad spectrum of the natural and social sciences point to the wisdom of this principle, the maximum entropy principle, MaxEnt. But "entropy" need not be tied only to classical entropy and thus to probabilistic thinking. In fact, the arguments found in Jaynes' writings and elsewhere can, as we shall attempt to demonstrate, profitably be revisited, elaborated and transformed to apply in a much more general abstract setting. The approach is based on game theoretical thinking. Philosophical considerations dealing with notions of cognition - basically truth and belief - lie behind. Quantitative elements are introduced via a concept of description effort. An interpretation of Tsallis Entropy is indicated.
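
    Jaynes' MaxEnt principle itself admits a compact numerical illustration: among all distributions consistent with the available information, choose the one of maximum entropy, which for a mean constraint is an exponential family whose multiplier can be found by bisection. The dice example is the classic one; the code is a sketch written for this record, not taken from the article.

```python
import math

def maxent_distribution(values, mean_target, tol=1e-10):
    """Maximum-entropy distribution on `values` subject to a fixed mean:
    p_i is proportional to exp(lam * x_i), with lam found by bisection."""
    def mean_for(lam):
        weights = [math.exp(lam * v) for v in values]
        z = sum(weights)
        return sum(v * w for v, w in zip(values, weights)) / z
    lo, hi = -50.0, 50.0   # mean_for is monotone increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < mean_target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * v) for v in values]
    z = sum(weights)
    return [w / z for w in weights]

# Classic loaded-die problem: faces 1..6 constrained to average 4.5.
p = maxent_distribution([1, 2, 3, 4, 5, 6], 4.5)
print([round(x, 4) for x in p])
```

    The result is the least biased assignment consistent with the constraint: probabilities rise geometrically toward the higher faces, and no additional structure is assumed beyond the stated mean.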

  13. Human vision is determined based on information theory.

    PubMed

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-03

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.

  14. Human vision is determined based on information theory

    NASA Astrophysics Data System (ADS)

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.

  15. Human vision is determined based on information theory

    PubMed Central

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-01-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition. PMID:27808236

  16. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. Copyright © 2014. Published by Elsevier Inc.

  17. Information-Theoretic Uncertainty of SCFG-Modeled Folding Space of The Non-coding RNA

    PubMed Central

    Manzourolajdad, Amirhossein; Wang, Yingfeng; Shaw, Timothy I.; Malmberg, Russell L.

    2012-01-01

    RNA secondary structure ensembles define probability distributions for alternative equilibrium secondary structures of an RNA sequence. Shannon’s entropy is a measure of the amount of diversity present in any ensemble. In this work, Shannon’s entropy of the SCFG ensemble on an RNA sequence is derived and implemented in polynomial time for both structurally ambiguous and unambiguous grammars. MicroRNA sequences generally have low folding entropy, as previously discovered. Surprisingly, signs of significantly high folding entropy were observed in certain ncRNA families. More effective models coupled with targeted randomization tests can lead to better insight into the folding features of these families. PMID:23160142
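
    For a concrete sense of the quantity involved, Shannon's entropy of a structure ensemble reduces to the usual formula over the ensemble's probability distribution. A minimal sketch (the probabilities below are invented for illustration; the paper's contribution is computing this in polynomial time over an SCFG ensemble rather than by enumeration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy ensemble: four alternative secondary structures with hypothetical
# equilibrium probabilities (illustrative numbers only).
ensemble = [0.70, 0.20, 0.05, 0.05]
print(round(shannon_entropy(ensemble), 3))
```

    A low value indicates the ensemble is dominated by one structure; the maximum, log2 of the number of structures, indicates maximal folding diversity.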

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giveon, Amit; Kutasov, David

    We show that in any two-dimensional conformal field theory with (2, 2) supersymmetry one can define a supersymmetric analog of the usual Renyi entropy of a spatial region A. It differs from the Renyi entropy by a universal function (which we compute) of the central charge, Renyi parameter n and the geometric parameters of A. In the limit n → 1 it coincides with the entanglement entropy. Thus, it contains the same information as the Renyi entropy but its computation only involves correlation functions of chiral and anti-chiral operators. We also show that this quantity appears naturally in string theory on AdS3.

  19. Continuous time wavelet entropy of auditory evoked potentials.

    PubMed

    Cek, M Emre; Ozgoren, Murat; Savaci, F Acar

    2010-01-01

    In this paper, the continuous time wavelet entropy (CTWE) of auditory evoked potentials (AEP) has been characterized by evaluating the relative wavelet energies (RWE) in specified EEG frequency bands. Thus, rapid variations of CTWE due to auditory stimulation could be detected in the post-stimulus time interval. This approach reduces the risk of missing information hidden in short time intervals. The discrete-time and continuous-time wavelet entropy variations were compared on non-target and target AEP data. It was observed that CTWE can also be an alternative method to analyze entropy as a function of time. 2009 Elsevier Ltd. All rights reserved.
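
    The RWE-then-entropy pipeline can be sketched as follows. For brevity this stand-in uses FFT band energies in the classical EEG bands in place of a wavelet decomposition, and the signal is synthetic, so it illustrates only the entropy bookkeeping, not CTWE itself.

```python
import numpy as np

def band_relative_energies(x, fs, bands):
    """Relative spectral energy per band. A plain FFT band split stands in
    here for the wavelet decomposition used to compute RWE in the paper."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    energies = np.array([spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])
    return energies / energies.sum()

def wavelet_entropy(p):
    """Shannon entropy (nats) of the relative energies."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Synthetic 1-s "EEG" epoch: a 10 Hz (alpha) tone plus noise, 256 Hz sampling.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)
bands = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 50)]  # delta..gamma (Hz)
p = band_relative_energies(x, fs, bands)
print(p.argmax(), round(wavelet_entropy(p), 3))  # alpha band dominates
```

    A single dominant band yields low entropy; energy spread evenly across bands yields entropy near ln(number of bands).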

  20. Direct 4D reconstruction of parametric images incorporating anato-functional joint entropy.

    PubMed

    Tang, Jing; Kuwabara, Hiroto; Wong, Dean F; Rahmim, Arman

    2010-08-07

    We developed an anatomy-guided 4D closed-form algorithm to directly reconstruct parametric images from projection data for (nearly) irreversible tracers. Conventional methods consist of individually reconstructing 2D/3D PET data, followed by graphical analysis on the sequence of reconstructed image frames. The proposed direct reconstruction approach maintains the simplicity and accuracy of the expectation-maximization (EM) algorithm by extending the system matrix to include the relation between the parametric images and the measured data. A closed-form solution was achieved using a different hidden complete-data formulation within the EM framework. Furthermore, the proposed method was extended to maximum a posterior reconstruction via incorporation of MR image information, taking the joint entropy between MR and parametric PET features as the prior. Using realistic simulated noisy [(11)C]-naltrindole PET and MR brain images/data, the quantitative performance of the proposed methods was investigated. Significant improvements in terms of noise versus bias performance were demonstrated when performing direct parametric reconstruction, and additionally upon extending the algorithm to its Bayesian counterpart using the MR-PET joint entropy measure.

  1. Maximum Entropy Production As a Framework for Understanding How Living Systems Evolve, Organize and Function

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.

    2014-12-01

    The maximum entropy production (MEP) principle holds that non-equilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic or biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information acquired via evolution and curated by natural selection in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and can orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.

  2. On variational expressions for quantum relative entropies

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Fawzi, Omar; Tomamichel, Marco

    2017-12-01

    Distance measures between quantum states like the trace distance and the fidelity can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statistics, is strictly smaller than Umegaki's quantum relative entropy whenever the states do not commute. We extend this result in two ways. First, we show that Petz' conclusion remains true if we allow general positive operator-valued measures. Second, we extend the result to Rényi relative entropies and show that for non-commuting states the sandwiched Rényi relative entropy is strictly larger than the measured Rényi relative entropy for α ∈ (1/2, ∞) and strictly smaller for α ∈ [0, 1/2). The latter statement provides counterexamples for the data processing inequality of the sandwiched Rényi relative entropy for α < 1/2. Our main tool is a new variational expression for the measured Rényi relative entropy, which we further exploit to show that certain lower bounds on quantum conditional mutual information are superadditive.
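
    A numerical illustration of the gap being discussed: the Umegaki relative entropy versus the KL divergence of one particular projective measurement (in σ's eigenbasis). The paper's measured relative entropy optimises over all measurements; this fixed choice only gives a lower bound on it, and the example states are arbitrary illustrative numbers.

```python
import numpy as np

def mlog(a):
    """Matrix logarithm of a Hermitian positive-definite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(np.log(w)) @ v.conj().T

def umegaki(rho, sigma):
    """Umegaki relative entropy Tr rho (log rho - log sigma), in nats."""
    return float(np.real(np.trace(rho @ (mlog(rho) - mlog(sigma)))))

def pinched_kl(rho, sigma):
    """KL divergence of the outcome statistics of a projective measurement
    in sigma's eigenbasis -- one particular (not optimal) measurement."""
    _, u = np.linalg.eigh(sigma)
    p = np.real(np.diag(u.conj().T @ rho @ u))
    q = np.real(np.diag(u.conj().T @ sigma @ u))
    return float(np.sum(p * np.log(p / q)))

# Two non-commuting qubit density matrices (illustrative numbers).
rho = np.array([[0.8, 0.3], [0.3, 0.2]])
sigma = np.array([[0.5, 0.1], [0.1, 0.5]])
print(pinched_kl(rho, sigma) < umegaki(rho, sigma))  # measurement loses information
```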

  3. Self-organisation of symbolic information

    NASA Astrophysics Data System (ADS)

    Feistel, R.

    2017-01-01

    Information is encountered in two different appearances, in native form by arbitrary physical structures, or in symbolic form by coded sequences of letters or the like. The self-organised emergence of symbolic information from structural information is referred to as a ritualisation transition. Occurring at some stage in evolutionary history, ritualisation transitions have in common that after the crossover, arbitrary symbols are issued and recognised by information-processing devices, by transmitters and receivers in the sense of Shannon's communication theory. Symbolic information-processing systems exhibit the fundamental code symmetry whose key features, such as largely lossless copying or persistence under hostile conditions, may elucidate the reasons for the repeated successful occurrence of ritualisation phenomena in evolutionary history. Ritualisation examples are briefly reviewed such as the origin of life, the appearance of human languages, the establishment of emergent social categories such as money, or the development of digital computers. In addition to their role as carriers of symbolic information, symbols are physical structures which also represent structural information. For a thermodynamic description of symbols and their arrangements, it appears reasonable to distinguish between Boltzmann entropy, Clausius entropy and Pauling entropy. Thermodynamic properties of symbols imply that their lifetimes are limited by the 2nd law.

  4. Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?

    PubMed

    De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko

    2018-06-01

    To evaluate MRI-derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR images of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3 % (95 % CI: 61.2-94.5) and 61.1 % (95 % CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were: 85.7/74.3 % (95 % CI: 42-99.2/56.4-86.9), 36.8/96.5 % (95 % CI: 17.2-61.4/76-99.8) and 100/62.8 % (95 % CI: 56.1-100/44.9-78.1). No significant differences between groups were found for other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.
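
    The histogram parameters named above are standard first-order statistics. A sketch of how they might be computed from voxel values (the synthetic data and bin count below are illustrative choices, not the study's protocol):

```python
import numpy as np

def histogram_features(values, bins=64):
    """First-order histogram features: skewness, excess kurtosis, entropy (bits).
    Formulas are standard; the bin count here is an arbitrary choice."""
    z = (values - values.mean()) / values.std()
    skewness = float((z ** 3).mean())
    kurtosis = float((z ** 4).mean() - 3.0)
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    entropy = float(-(p * np.log2(p)).sum())
    return skewness, kurtosis, entropy

# Hypothetical ADC values (x 10^-3 mm^2/s) for a lesion ROI -- synthetic data.
rng = np.random.default_rng(1)
adc = rng.normal(1.2, 0.25, size=5000)
print(histogram_features(adc))
```

    Higher entropy reflects a broader, more heterogeneous spread of voxel values, which is the heterogeneity signal the study exploits.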

  5. Vestibular Activation Differentially Modulates Human Early Visual Cortex and V5/MT Excitability and Response Entropy

    PubMed Central

    Guzman-Lopez, Jessica; Arshad, Qadeer; Schultz, Simon R; Walsh, Vincent; Yousif, Nada

    2013-01-01

    Head movement imposes on the visual system the additional burdens of maintaining visual acuity and determining the origin of retinal image motion (i.e., self-motion vs. object-motion). Although maintaining visual acuity during self-motion is effected by minimizing retinal slip via the brainstem vestibular-ocular reflex, higher order visuovestibular mechanisms also contribute. Disambiguating self-motion versus object-motion also invokes higher order mechanisms, and a cortical visuovestibular reciprocal antagonism is propounded. Hence, one prediction is of a vestibular modulation of visual cortical excitability and indirect measures have variously suggested none, focal or global effects of activation or suppression in human visual cortex. Using transcranial magnetic stimulation-induced phosphenes to probe cortical excitability, we observed decreased V5/MT excitability versus increased early visual cortex (EVC) excitability, during vestibular activation. In order to exclude nonspecific effects (e.g., arousal) on cortical excitability, response specificity was assessed using information theory, specifically response entropy. Vestibular activation significantly modulated phosphene response entropy for V5/MT but not EVC, implying a specific vestibular effect on V5/MT responses. This is the first demonstration that vestibular activation modulates human visual cortex excitability. Furthermore, using information theory, not previously used in phosphene response analysis, we could distinguish between a specific vestibular modulation of V5/MT excitability from a nonspecific effect at EVC. PMID:22291031

  6. [Wavelet packet extraction and entropy analysis of telemetry EEG from the prelimbic cortex of medial prefrontal cortex in morphine-induced CPP rats].

    PubMed

    Bai, Yu; Bai, Jia-Ming; Li, Jing; Li, Min; Yu, Ran; Pan, Qun-Wan

    2014-12-25

    The purpose of the present study is to analyze the relationship between the telemetry electroencephalogram (EEG) changes of the prelimbic (PL) cortex and the drug-seeking behavior of morphine-induced conditioned place preference (CPP) rats by using wavelet packet extraction and entropy measurement. The recording electrode was stereotactically implanted into the PL cortex of rats. The animals were then divided randomly into operation-only control and morphine-induced CPP groups, respectively. A CPP video system in combination with an EEG wireless telemetry device was used for recording EEG of the PL cortex when the rats shuttled between black-white or white-black chambers. The telemetry recorded EEGs were analyzed by wavelet packet extraction, Welch power spectrum estimation, normalized amplitude and the Shannon entropy algorithm. The results showed that, compared with the operation-only control group, the left PL cortex's EEG of the morphine-induced CPP group during black-white chamber shuttling exhibited the following changes: (1) the amplitude of average EEG for each frequency band extracted by wavelet packet was reduced; (2) the Welch power intensity was increased significantly in the 10-50 Hz EEG band (P < 0.01 or P < 0.05); (3) Shannon entropy was increased in β, γ₁, and γ₂ waves of the EEG (P < 0.01 or P < 0.05); and (4) the average information entropy was reduced (P < 0.01). The results suggest that the above mentioned EEG changes in the morphine-induced CPP group may be related to the animals' drug-seeking motivation and behavior initiation.

  7. A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na

    2013-01-01

    We propose a new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system. In the process of generating the key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance its security. The algorithm is detailed in terms of security analyses, including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has the advantages of a large key space and high security for practical image encryption.
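
    Of the security analyses listed, information entropy analysis is the most easily sketched: the gray-level entropy of an ideally encrypted 8-bit image should approach 8 bits/pixel. A minimal illustration (random data stands in for actual ciphertext here):

```python
import numpy as np

def image_entropy(img):
    """Gray-level information entropy (bits/pixel) of an 8-bit image."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(42)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in for ciphertext
flat = np.full((256, 256), 128, dtype=np.uint8)                      # worst case: constant image
print(round(image_entropy(cipher_like), 3), image_entropy(flat))
```

    Values well below 8 for a ciphertext image would indicate residual statistical structure an attacker could exploit.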

  8. Unifying three perspectives on information processing in stochastic thermodynamics.

    PubMed

    Barato, A C; Seifert, U

    2014-03-07

    So far, feedback-driven systems have been discussed using (i) measurement and control, (ii) a tape interacting with a system, or (iii) by identifying an implicit Maxwell demon in steady-state transport. We derive the corresponding second laws from one master fluctuation theorem and discuss their relationship. In particular, we show that both the entropy production involving mutual information between system and controller and the one involving a Shannon entropy difference of an information reservoir like a tape carry an extra term different from the usual current times affinity. We, thus, generalize stochastic thermodynamics to the presence of an information reservoir.

  9. Permutation entropy analysis of financial time series based on Hill's diversity number

    NASA Astrophysics Data System (ADS)

    Zhang, Yali; Shang, Pengjian

    2017-12-01

    In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as a stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing-price series drawn from six indices, three US and three Chinese, over different periods; Nn,r can quantify the changes in complexity of stock market data. Moreover, Nn,r yields richer information and reveals some properties of the differences between the US and Chinese stock indices.
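
    The ordinal-pattern machinery underlying the paper's measure can be sketched with plain Bandt-Pompe permutation entropy; the Hill-number variant Nn,r would be built from the same ordinal-pattern probabilities, so this shows only the base measure.

```python
import math
import random

def permutation_entropy(series, order=3, delay=1):
    """Bandt-Pompe permutation entropy, normalised to [0, 1]."""
    counts = {}
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = tuple(series[i + j * delay] for j in range(order))
        pattern = tuple(sorted(range(order), key=window.__getitem__))  # ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum(c / n * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))

monotone = list(range(100))                      # fully ordered series
random.seed(0)
noisy = [random.random() for _ in range(1000)]   # white noise
print(permutation_entropy(monotone), round(permutation_entropy(noisy), 3))
```

    A fully ordered series uses a single ordinal pattern (entropy 0), while white noise spreads mass over all patterns (entropy near 1).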

  10. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    NASA Astrophysics Data System (ADS)

    Fowlie, Andrew

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.

  11. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
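
    The subpocket entropy itself is a plain Shannon entropy over amino-acid frequencies; summing it across subpockets gives the total cleavage entropy. A toy sketch (the residue data below is invented for illustration, not drawn from MEROPS):

```python
import math
from collections import Counter

def subpocket_entropy(residues):
    """Shannon entropy (bits) of the amino-acid distribution in one subpocket."""
    counts = Counter(residues)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Hypothetical P1-position residues: a trypsin-like protease cleaving only
# after K/R versus an unspecific digestive enzyme (toy data, not MEROPS).
specific_p1 = list("KKKKRKKRKRKKKKRR")
unspecific_p1 = list("ACDEFGHIKLMNPQRSTVWY")
print(round(subpocket_entropy(specific_p1), 3),
      round(subpocket_entropy(unspecific_p1), 3))
```

    Summing such values over all subpockets would give the total cleavage entropy used to rank proteases from specific (low) to promiscuous (high).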

  12. Clarifying the link between von Neumann and thermodynamic entropies

    NASA Astrophysics Data System (ADS)

    Deville, Alain; Deville, Yannick

    2013-01-01

    The state of a quantum system being described by a density operator ρ, quantum statistical mechanics calls the quantity -k Tr(ρ ln ρ), introduced by von Neumann, its von Neumann or statistical entropy. A 1999 paper by Shenker initiated a debate about its link with the entropy of phenomenological thermodynamics. Referring to Gibbs's and von Neumann's founding texts, we place von Neumann's 1932 contribution in its historical context, after Gibbs's 1902 treatise and before the creation of the information entropy concept, which places boundaries on the debate. Reexamining von Neumann's reasoning, we stress that the part of his reasoning implicated in the debate mainly uses thermodynamics, not quantum mechanics, and identify two implicit postulates. We thoroughly examine Shenker's and ensuing papers, insisting upon the presence of open thermodynamical subsystems, which imposes the use of the chemical potential concept. We briefly mention Landau's approach to the quantum entropy. On the whole, it is shown that von Neumann's viewpoint is right, and why Shenker's claim that von Neumann entropy "is not the quantum-mechanical correlate of thermodynamic entropy" cannot be retained.
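
    The quantity at issue, -k Tr(ρ ln ρ), is straightforward to evaluate numerically via the eigenvalues of ρ; a minimal numpy sketch (k = 1, i.e. entropy in nats):

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """-k Tr(rho ln rho), evaluated via the eigenvalues of rho (nats for k = 1)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                      # the 0 ln 0 terms are taken as 0
    return float(-k * (w * np.log(w)).sum())

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                       # maximally mixed qubit: ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```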

  13. Entropy production in a box: Analysis of instabilities in confined hydrothermal systems

    NASA Astrophysics Data System (ADS)

    Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.

    2017-09-01

    We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.

  14. How to find what you don't know: Visualising variability in 3D geological models

    NASA Astrophysics Data System (ADS)

    Lindsay, Mark; Wellmann, Florian; Jessell, Mark; Ailleres, Laurent

    2014-05-01

    Uncertainties in input data can have compounding effects on the predictive reliability of three-dimensional (3D) geological models. Resource exploration, tectonic studies and environmental modelling can be compromised by using 3D models that misrepresent the target geology, and drilling campaigns that attempt to intersect particular geological units guided by 3D models are at risk of failure if the exploration geologist is unaware of inherent uncertainties. In addition, the visual inspection of 3D models is often the first contact decision makers have with the geology, thus visually communicating the presence and magnitude of uncertainties contained within geological 3D models is critical. Unless uncertainties are presented early in the relationship between decision maker and model, the model will be considered more truthful than the uncertainties allow with each subsequent viewing. We present a selection of visualisation techniques that provide the viewer with an insight into the location and amount of uncertainty contained within a model, and the geological characteristics that are most affected. A model of the Gippsland Basin, southeastern Australia is used as a case study to demonstrate the concepts of information entropy, stratigraphic variability and geodiversity. Central to the techniques shown here is the creation of a model suite, performed by creating similar (but not identical) versions of the original model through perturbation of the input data. Specifically, structural data in the form of strike and dip measurements are perturbed in the creation of the model suite. The visualisation techniques presented are: (i) information entropy; (ii) stratigraphic variability and (iii) geodiversity. Information entropy is used to analyse uncertainty in a spatial context, combining the empirical probability distributions of multiple outcomes into a single quantitative measure. Stratigraphic variability displays the number of possible lithologies that may exist at a given point within the model volume. Geodiversity analyses various model characteristics (or 'geodiversity metrics'), including the depth, volume of a unit, the curvature of an interface, the geological complexity of a contact and the contact relationships units have with each other. Principal component analysis, a multivariate statistical technique, is used to simultaneously examine each of the geodiversity metrics to determine the boundaries of model space, and identify which metrics contribute most to model uncertainty. The combination of information entropy, stratigraphic variability and geodiversity analysis provides a descriptive and thorough representation of uncertainty with effective visualisation techniques that clearly communicate the geological uncertainty contained within the geological model.
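
    The information entropy technique described above, applied per model cell over a suite of perturbed models, can be sketched as follows (toy lithology labels; a real suite would come from perturbing strike and dip data as in the paper):

```python
import numpy as np

def cell_information_entropy(model_suite):
    """Per-cell information entropy (bits) over lithology outcomes in a suite
    of perturbed models; model_suite has shape (n_models, n_cells)."""
    n_models, n_cells = model_suite.shape
    h = np.zeros(n_cells)
    for c in range(n_cells):
        _, counts = np.unique(model_suite[:, c], return_counts=True)
        p = counts / n_models
        h[c] = -(p * np.log2(p)).sum()
    return h

# Toy suite of 4 perturbed models over 3 cells: cell 0 is certain,
# cells 1 and 2 are maximally uncertain between lithologies A and B.
suite = np.array([["A", "A", "A"],
                  ["A", "A", "B"],
                  ["A", "B", "A"],
                  ["A", "B", "B"]])
print(cell_information_entropy(suite))  # entropies of 0, 1 and 1 bit
```

    Cells where every perturbed model agrees get zero entropy; cells where the suite disagrees light up, which is exactly what the visualisations display.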

  15. Detecting Spatio-Temporal Modes in Multivariate Data by Entropy Field Decomposition

    PubMed Central

    Frank, Lawrence R.; Galinsky, Vitaly L.

    2016-01-01

    A new data analysis method that addresses a general problem of detecting spatio-temporal variations in multivariate data is presented. The method utilizes two recent and complementary general approaches to data analysis, information field theory (IFT) and entropy spectrum pathways (ESP). Both methods reformulate and incorporate Bayesian theory, thus using prior information to uncover the underlying structure of the unknown signal. Unification of ESP and IFT creates an approach that is non-Gaussian and non-linear by construction and is found to produce unique spatio-temporal modes of signal behavior that can be ranked according to their significance, from which space-time trajectories of parameter variations can be constructed and quantified. Two brief examples of real-world applications of the theory to the analysis of data of completely different, unrelated nature, lacking any underlying similarity, are also presented. The first example provides an analysis of resting state functional magnetic resonance imaging (rsFMRI) data that allowed us to create an efficient and accurate computational method for assessing and categorizing brain activity. The second example demonstrates the potential of the method in application to the analysis of a strong atmospheric storm circulation system during the complicated stage of tornado development and formation, using data recorded by a mobile Doppler radar. A reference implementation of the method will be made available as part of the QUEST toolkit that is currently under development at the Center for Scientific Computation in Imaging. PMID:27695512

  16. Entropy of the information retrieved from black holes

    NASA Astrophysics Data System (ADS)

    Mersini-Houghton, Laura

    2016-07-01

    The retrieval of black hole information was recently presented in two interesting proposals at the ‘Hawking Radiation’ conference: a revised version by ’t Hooft of a proposal he initially suggested 20 years ago, and a new proposal by Hawking. Both proposals address the problem of black hole information loss at the classical level and derive an expression for the scattering matrix. The former uses the gravitational back-reaction of incoming particles that imprints its information on the outgoing modes. The latter uses the supertranslation symmetry of horizons to relate a phase delay of the outgoing wave packets compared to their incoming wave partners. The difficulty in both proposals is that the entropy obtained from them appears to be infinite. By including quantum effects into the Hawking and ’t Hooft proposals, I show that a subtlety arising from the inescapable measurement process, the quantum Zeno effect, not only tames divergences but actually recovers the correct 1/4-of-the-area Bekenstein-Hawking entropy law of black holes.

  17. Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area

    NASA Astrophysics Data System (ADS)

    Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua

    2018-04-01

    GF-2 is the highest-spatial-resolution remote sensing satellite in the history of China's satellite development. In this study, three traditional fusion methods, Brovey, Gram-Schmidt and Color Normalized (CN), were compared with a newer fusion method, NNDiffuse, using qualitative assessment and quantitative fusion quality indices, including information entropy, variance, mean gradient, deviation index and spectral correlation coefficient. Analysis results show that the NNDiffuse method performed best in both qualitative and quantitative analysis. It was more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.

  18. Entropy production and nonlinear Fokker-Planck equations.

    PubMed

    Casas, G A; Nobre, F D; Curado, E M F

    2012-12-01

    The entropy time rate of systems described by nonlinear Fokker-Planck equations--which are directly related to generalized entropic forms--is analyzed. Both entropy production, associated with irreversible processes, and entropy flux from the system to its surroundings are studied. Some examples of known generalized entropic forms are considered, and particularly, the flux and production of the Boltzmann-Gibbs entropy, obtained from the linear Fokker-Planck equation, are recovered as particular cases. Since nonlinear Fokker-Planck equations are appropriate for the dynamical behavior of several physical phenomena in nature, like many within the realm of complex systems, the present analysis should be applicable to irreversible processes in a large class of nonlinear systems, such as those described by Tsallis and Kaniadakis entropies.

  19. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses... the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function... In addition, the SD-based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.

  20. Apparent diffusion coefficient histogram shape analysis for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2016-10-22

    To explore the role of apparent diffusion coefficient (ADC) histogram shape-related parameters in the early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion-weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of the 2nd and 4th weeks during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape-related parameters including skewness, kurtosis, s-sDav, width, and standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed a high early decline rate (43.10 %, 48.29 %) by the end of the 2nd week of CCRT. All entropies kept decreasing significantly from 2 weeks after CCRT initiation onward. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.
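    The shape parameters named above are standard first-order descriptors of a histogram. A sketch of how they might be computed (numpy only; the bin count and synthetic values are illustrative, not from the study):

```python
import numpy as np

def histogram_shape(values, bins=64):
    """Skewness, excess kurtosis, and first-order Shannon entropy of a
    sample, as used to summarize whole-lesion ADC histograms."""
    v = np.asarray(values, dtype=float)
    m, s = v.mean(), v.std()
    skewness = float(np.mean(((v - m) / s) ** 3))
    kurtosis = float(np.mean(((v - m) / s) ** 4) - 3.0)
    counts, _ = np.histogram(v, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = float(-np.sum(p * np.log2(p)))
    return skewness, kurtosis, entropy

rng = np.random.default_rng(1)
adc = rng.normal(1.0e-3, 2.0e-4, size=5000)  # synthetic ADC values, mm^2/s
sk, ku, H = histogram_shape(adc)             # skewness and kurtosis near 0 for a Gaussian
```
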

  1. Entropy Conservation of Linear Dilaton Black Holes in Quantum Corrected Hawking Radiation

    NASA Astrophysics Data System (ADS)

    Sakalli, I.; Halilsoy, M.; Pasaoglu, H.

    2011-10-01

    It has been shown recently that information is lost in the Hawking radiation of linear dilaton black holes in various theories when the tunneling formalism of Parikh and Wilczek is applied without considering quantum gravity effects. In this paper, we recalculate the emission probability by taking into account the log-area correction to the Bekenstein-Hawking entropy and the statistical correlation between emitted quanta. The crucial role of quantum gravity effects in the information leakage and the black hole remnant is highlighted. The entropy conservation of the linear dilaton black holes is discussed in detail. We also model the remnant as an extreme linear dilaton black hole with a pointlike horizon in order to show that such a remnant cannot radiate and its temperature becomes zero. In summary, we show that information can leak out of linear dilaton black holes while preserving unitarity in quantum mechanics.

  2. Credit market Jitters in the course of the financial crisis: A permutation entropy approach in measuring informational efficiency in financial assets

    NASA Astrophysics Data System (ADS)

    Siokis, Fotios M.

    2018-06-01

    We explore the evolution of informational efficiency for specific instruments of the U.S. money, bond, and stock exchange markets before and after the outbreak of the Great Recession. We utilize the permutation entropy and the complexity-entropy causality plane to rank the time series and measure the degree of informational efficiency. We find that after the credit crunch and the collapse of Lehman Brothers, the efficiency level of specific money market instruments' yields fell considerably. This is evidence of less uncertainty in predicting the related yields throughout the financial disarray. A similar trend is seen in the stock exchange indices, but their efficiency remains at much higher levels. On the other hand, bond market instruments maintained their efficiency levels even after the outbreak of the crisis, which can be interpreted as greater randomness and less predictability of their yields.
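    Permutation entropy, the measure used above, quantifies the diversity of ordinal patterns in a series. A minimal Bandt-Pompe sketch (the order and delay defaults are illustrative):

```python
import math
from itertools import permutations

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: 1 for a fully random ordering of
    patterns (an informationally efficient series), 0 for a perfectly
    predictable one."""
    x = np.asarray(x, dtype=float)
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - (order - 1) * delay):
        counts[tuple(np.argsort(x[i:i + order * delay:delay]))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)) / math.log2(math.factorial(order)))

rng = np.random.default_rng(2)
pe_noise = permutation_entropy(rng.standard_normal(5000))  # close to 1 for white noise
pe_trend = permutation_entropy(np.arange(100))             # 0 for a monotone trend
```

    In the complexity-entropy causality plane, this entropy is paired with a statistical complexity measure to rank instruments by informational efficiency.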

  3. Entropy is conserved in Hawking radiation as tunneling: A revisit of the black hole information loss paradox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Baocheng; Graduate University of Chinese Academy of Sciences, Beijing 100049; Cai Qingyu, E-mail: qycai@wipm.ac.cn

    2011-02-15

    Research Highlights: > Information is found to be encoded and carried away by Hawking radiations. > Entropy is conserved in Hawking radiation. > We thus conclude no information is lost. > The dynamics of black hole may be unitary. - Abstract: We revisit in detail the paradox of black hole information loss due to Hawking radiation as tunneling. We compute the amount of information encoded in correlations among Hawking radiations for a variety of black holes, including the Schwarzschild black hole, the Reissner-Nordström black hole, the Kerr black hole, and the Kerr-Newman black hole. The special case of tunneling through a quantum horizon is also considered. Within a phenomenological treatment based on the accepted emission probability spectrum from a black hole, we find that information leaks out hidden in the correlations of Hawking radiation. The recovery of this previously unaccounted-for information helps to conserve the total entropy of a system composed of a black hole plus its radiations. We thus conclude that, irrespective of the microscopic picture of black hole collapse, the associated radiation process, Hawking radiation as tunneling, is consistent with unitarity as required by quantum mechanics.
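    The correlation that carries the information can be made explicit. With the Parikh-Wilczek spectrum for a Schwarzschild black hole of mass $M$ (units $G=c=\hbar=k_B=1$), the emission probability of a quantum of energy $E$ is $\Gamma(E) \propto e^{-8\pi E (M - E/2)}$, and the correlation between two successive emissions is

```latex
\ln \Gamma(E_1 + E_2) - \ln \Gamma(E_1) - \ln \Gamma(E_2) = 8\pi E_1 E_2 \neq 0,
```

    which is nonzero precisely because the second emission depends on the mass left behind by the first; summing the conditional entropies over a complete evaporation then reproduces the initial Bekenstein-Hawking entropy.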

  4. [Metrological analysis of measuring systems in testing an anticipatory reaction to the position of a moving object].

    PubMed

    Aksiuta, E F; Ostashev, A V; Sergeev, E V; Aksiuta, V E

    1997-01-01

    The methods of the information (entropy) error theory were used to make a metrological analysis of well-known commercial measuring systems for timing an anticipatory reaction (AR) to the position of a moving object, which are based on electromechanical, gas-discharge, and electronic principles. The required accuracy of measurement was found to be achieved only by systems based on the electronic principle of moving-object simulation and AR measurement.

  5. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals using advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three measurement protocols were set to better characterize the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapping 5 s segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapping 1 s segments. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Moreover, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed to track this difference, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
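    A minimal sketch of DistEn under common parameter choices (the embedding dimension m=2 and 64 histogram bins are illustrative defaults, not necessarily those of the study):

```python
import numpy as np

def dist_en(x, m=2, bins=64):
    """Distribution entropy: Shannon entropy (normalized to [0, 1]) of
    the histogram of all pairwise Chebyshev distances between embedded
    vectors, with no tolerance threshold to tune."""
    x = np.asarray(x, dtype=float)
    n = len(x) - m + 1
    emb = np.array([x[i:i + m] for i in range(n)])
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    d = d[np.triu_indices(n, k=1)]          # upper triangle: no self-matches
    counts, _ = np.histogram(d, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(bins))

rng = np.random.default_rng(4)
de_noise = dist_en(rng.standard_normal(200))  # high for white noise, even when short
de_flat = dist_en(np.zeros(50))               # zero for a constant signal
```
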

  6. Group entropies, correlation laws, and zeta functions.

    PubMed

    Tempesta, Piergiulio

    2011-08-01

    The notion of group entropy is proposed. It enables the unification and generalization of many different definitions of entropy known in the literature, such as those of Boltzmann-Gibbs, Tsallis, Abe, and Kaniadakis. Other entropic functionals are introduced, related to nontrivial correlation laws characterizing universality classes of systems out of equilibrium when the dynamics is weakly chaotic. The associated thermostatistics are discussed. The mathematical structure underlying our construction is that of formal group theory, which provides the general structure of the correlations among particles and dictates the associated entropic functionals. As an example of application, the role of group entropies in information theory is illustrated and generalizations of the Kullback-Leibler divergence are proposed. A new connection between statistical mechanics and zeta functions is established. In particular, Tsallis entropy is related to the classical Riemann zeta function.
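    The Kullback-Leibler divergence that the abstract proposes to generalize has a familiar discrete form; a small reference sketch (assuming q > 0 wherever p > 0):

```python
import numpy as np

def kl_divergence(p, q):
    """D(p||q) in bits: the information lost when q is used to model p.
    Group entropies yield parametric generalizations of this quantity."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

d_same = kl_divergence([0.5, 0.5], [0.5, 0.5])    # 0 for identical distributions
d_diff = kl_divergence([0.5, 0.5], [0.25, 0.75])  # strictly positive otherwise
```
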

  7. The third order correction on Hawking radiation and entropy conservation during black hole evaporation process

    NASA Astrophysics Data System (ADS)

    Yan, Hao-Peng; Liu, Wen-Biao

    2016-08-01

    Using the Parikh-Wilczek tunneling framework, we calculate the tunneling rate from a Schwarzschild black hole under the third-order WKB approximation, and then obtain the expressions for the emission spectrum and black hole entropy to the third-order correction. The entropy contains four terms: the Bekenstein-Hawking entropy, the logarithmic term, the inverse area term, and the square of the inverse area term. In addition, we analyse the correlation between sequential emissions under this approximation. It is shown that the entropy is conserved during the process of black hole evaporation, which is consistent with the requirements of quantum mechanics and implies that information is conserved during this process. We also compare the above result with that of the pure thermal spectrum case, and find that the non-thermal correction plays an important role.
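    The four-term entropy described above has the schematic form (coefficients $a_1, a_2, a_3$ fixed by the third-order WKB expansion; $A$ is the horizon area in Planck units):

```latex
S = \frac{A}{4} + a_1 \ln\frac{A}{4} + a_2 \left(\frac{A}{4}\right)^{-1} + a_3 \left(\frac{A}{4}\right)^{-2} + \text{const},
```

    where the leading term is the Bekenstein-Hawking entropy and the remaining terms are the logarithmic, inverse-area, and squared-inverse-area corrections.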

  8. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    PubMed

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physiochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.

  9. Consistent maximum entropy representations of pipe flow networks

    NASA Astrophysics Data System (ADS)

    Waldrip, Steven H.; Niven, Robert K.; Abel, Markus; Schlegel, Michael

    2017-06-01

    The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016) Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters that the entropy is defined over ensures consistency between different representations of the same network. The performance of the proposed reduced parameter method is demonstrated with a one-loop network case study.

  10. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement

    PubMed Central

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-01-01

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which is to reduce the time phase difference of image data and enhance the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and expanding non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average entropy gain of up to 0.42 dB and a significant gain in the enhancement-measure evaluation for an up-scaling factor of 2. The experimental results show that the performance of the AMDE-SR method is better than existing super-resolution reconstruction methods in terms of visual and accuracy improvements. PMID:29414893

  11. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement.

    PubMed

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-02-07

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which is to reduce the time phase difference of image data and enhance the complementarity of information. The multi-scale image information is then decomposed using the L₀ gradient minimization model, and the non-redundant information is processed by difference calculation and expanding non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average entropy gain of up to 0.42 dB and a significant gain in the enhancement-measure evaluation for an up-scaling factor of 2. The experimental results show that the performance of the AMDE-SR method is better than existing super-resolution reconstruction methods in terms of visual and accuracy improvements.

  12. Entropy change of biological dynamics in COPD

    PubMed Central

    Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and analyze large amounts of data from human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the entropy change depends on the type of physiological signal under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, the entropy increases with the severity of COPD. These results should give valuable guidance for the use of entropy on physiological signals measured by wearable medical devices, as well as for further research on entropy in COPD. PMID:29066881

  13. On the Application of Information Theory to Sustainability

    EPA Science Inventory

    According to the 2nd Law of Thermodynamics, entropy must be an increasing function of time for the whole universe, system plus surroundings. This gives rise to conjectures regarding the loss of work with entropy generation in general processes. It can be shown that under cond...

  14. How long the singular value decomposed entropy predicts the stock market? - Evidence from the Dow Jones Industrial Average Index

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao; Shao, Yanmin

    2016-07-01

    In this paper, a new concept of multi-scale singular value decomposition entropy based on DCCA cross-correlation analysis is proposed, and its predictive power for the Dow Jones Industrial Average Index is studied. Using Granger causality analysis with different time scales, it is found that the singular value decomposition entropy has predictive power for the Dow Jones Industrial Average Index for periods of less than one month, but not for longer periods. This shows how long the singular value decomposition entropy predicts the stock market, extending the result of Caraiani (2014). On the other hand, the result also reveals an essential characteristic of the stock market as a chaotic dynamical system.
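    Singular value decomposition entropy is typically computed from the spectrum of a delay-embedded trajectory matrix. A sketch of the single-scale version (the embedding length m=10 is an illustrative choice):

```python
import numpy as np

def svd_entropy(x, m=10):
    """Normalized Shannon entropy of the singular value spectrum of the
    delay-embedding matrix: near 1 for noise-like series, low when a few
    modes dominate (e.g. periodic structure)."""
    x = np.asarray(x, dtype=float)
    emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    s = np.linalg.svd(emb, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(m))

rng = np.random.default_rng(6)
h_noise = svd_entropy(rng.standard_normal(500))
h_sine = svd_entropy(np.sin(np.linspace(0, 20 * np.pi, 500)))  # rank ~2, low entropy
```
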

  15. Connecting complexity with spectral entropy using the Laplace transformed solution to the fractional diffusion equation

    NASA Astrophysics Data System (ADS)

    Liang, Yingjie; Chen, Wen; Magin, Richard L.

    2016-07-01

    Analytical solutions to the fractional diffusion equation are often obtained by using Laplace and Fourier transforms, which conveniently encode the order of the time and the space derivatives (α and β) as non-integer powers of the conjugate transform variables (s and k) for the spectral and the spatial frequencies, respectively. This study presents a new solution to the fractional diffusion equation obtained using the Laplace transform and expressed as a Fox's H-function. This result clearly illustrates the kinetics of the underlying stochastic process in terms of the Laplace spectral frequency and entropy. The spectral entropy is numerically calculated by using the direct integration method and the adaptive Gauss-Kronrod quadrature algorithm. Here, the properties of spectral entropy are investigated for the cases of sub-diffusion and super-diffusion. We find that the overall spectral entropy decreases with increasing α and β, and that the normal or Gaussian case, with α = 1 and β = 2, has the lowest spectral entropy (i.e., less information is needed to describe the state of a Gaussian process). In addition, as the neighborhood over which the entropy is calculated increases, the spectral entropy decreases, which implies a spatial averaging or coarse graining of the material properties. Consequently, the spectral entropy is shown to provide a new way to characterize the temporal correlation of anomalous diffusion. Future studies should be designed to examine changes of spectral entropy in physical, chemical and biological systems undergoing phase changes, chemical reactions and tissue regeneration.
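    Spectral entropy in the Fourier (rather than Laplace) domain admits a compact numerical sketch, consistent with the idea above that a more ordered process needs less information to describe:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectral density:
    near 1 for a flat (white) spectrum, near 0 for a single tone."""
    psd = np.abs(np.fft.rfft(np.asarray(x, dtype=float))) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(len(psd)))

rng = np.random.default_rng(5)
t = np.arange(256)
se_white = spectral_entropy(rng.standard_normal(256))
se_tone = spectral_entropy(np.sin(2 * np.pi * 10 * t / 256))  # energy in one bin
```
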

  16. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

    In this work we analyzed the Kepler light curve data of HD 73045. The raw data were smoothed using standard filters. The power spectrum, obtained using a fast Fourier transform routine, shows the presence of more than one period. To account for any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify scale-invariant structure, the data were analyzed using multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g. sample entropy, spectral entropy, and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions in the two network models confirm such structures.
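    The horizontal visibility network mentioned above maps a time series onto a graph whose degree distribution reflects periodic structure. A plain O(n^2) sketch, not an optimized builder:

```python
import random

def hvg_degrees(x):
    """Degree sequence of the horizontal visibility graph: points i and j
    are linked if every point strictly between them lies below both."""
    n = len(x)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

random.seed(7)
series = [random.random() for _ in range(400)]
mean_degree = sum(hvg_degrees(series)) / len(series)  # tends to 4 for iid noise
path_degrees = hvg_degrees([0, 1, 2, 3, 4])           # a monotone trend gives a path
```
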

  17. Relating quantum coherence and correlations with entropy-based measures.

    PubMed

    Wang, Xiao-Li; Yue, Qiu-Ling; Yu, Chao-Hua; Gao, Fei; Qin, Su-Juan

    2017-09-21

    Quantum coherence and quantum correlations are important quantum resources for quantum computation and quantum information. In this paper, using entropy-based measures, we investigate the relationships between quantum correlated coherence, which is the coherence between subsystems, and two main kinds of quantum correlations as defined by quantum discord as well as quantum entanglement. In particular, we show that quantum discord and quantum entanglement can be well characterized by quantum correlated coherence. Moreover, we prove that the entanglement measure formulated by quantum correlated coherence is lower and upper bounded by the relative entropy of entanglement and the entanglement of formation, respectively, and equal to the relative entropy of entanglement for all the maximally correlated states.

  18. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
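    The Rényi entropy underlying these curves is a one-parameter family that recovers the Shannon entropy at α = 1; sweeping α traces the entropy coordinate of the curve. A sketch with a toy pattern distribution (the probabilities are illustrative):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (bits); alpha -> 1 gives Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

pattern_probs = np.array([0.5, 0.25, 0.125, 0.125])  # toy ordinal-pattern distribution
curve = [renyi_entropy(pattern_probs, a) for a in (0.5, 1.0, 2.0)]  # decreasing in alpha
```
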

  19. The increase of the functional entropy of the human brain with age.

    PubMed

    Yao, Y; Lu, W L; Xu, B; Li, C B; Lin, C P; Waxman, D; Feng, J F

    2013-10-09

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy.

  20. Increased resting-state brain entropy in Alzheimer's disease.

    PubMed

    Xue, Shao-Wei; Guo, Yonghu

    2018-03-07

    Entropy analysis of resting-state functional MRI (R-fMRI) is a novel approach to characterize brain temporal dynamics and facilitates the identification of abnormal brain activity caused by several disease conditions. However, Alzheimer's disease (AD)-related brain entropy mapping based on R-fMRI has not been assessed. Here, we measured the sample entropy and voxel-wise connectivity of the network degree centrality (DC) of the intrinsic brain activity acquired by R-fMRI in 26 patients with AD and 26 healthy controls. Compared with the controls, AD patients showed increased entropy in the middle temporal gyrus and the precentral gyrus and also showed decreased DC in the precuneus. Moreover, the magnitude of the negative correlation between local brain activity (entropy) and network connectivity (DC) was increased in AD patients in comparison with healthy controls. These findings provide new evidence on AD-related brain entropy alterations.
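    Sample entropy, the measure used here, can be sketched directly (m=2 and r=0.2 are conventional defaults; a plain O(n^2) reference, not an optimized implementation):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """-ln of the conditional probability that templates matching for m
    points (within r * std, Chebyshev distance) also match for m + 1."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(dim):
        n = len(x) - dim
        emb = np.array([x[i:i + dim] for i in range(n)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return np.sum(d[np.triu_indices(n, k=1)] <= tol)

    return float(-np.log(matches(m + 1) / matches(m)))

rng = np.random.default_rng(3)
se_noise = sample_entropy(rng.standard_normal(400))               # irregular: high
se_sine = sample_entropy(np.sin(np.linspace(0, 8 * np.pi, 400)))  # regular: low
```
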
